Sample records for computing implementation plan

  1. 76 FR 1578 - Approval and Promulgation of Implementation Plans; New Mexico; Federal Implementation Plan for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-11

... house and public hearing will be held at San Juan College, 4601 College Boulevard, Computer Science Building, Room 7103, Farmington, New Mexico 87402, (505) 326-3311. The open...

  2. Strategic Planning for Computer-Based Educational Technology.

    ERIC Educational Resources Information Center

    Bozeman, William C.

    1984-01-01

    Offers educational practitioners direction for the development of a master plan for the implementation and application of computer-based educational technology by briefly examining computers in education, discussing organizational change from a theoretical perspective, and presenting an overview of the planning strategy known as the planning and…

  3. Cloud computing strategic framework (FY13 - FY15).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arellano, Lawrence R.; Arroyo, Steven C.; Giese, Gerald J.

    This document presents an architectural framework (plan) and roadmap for the implementation of a robust Cloud Computing capability at Sandia National Laboratories. It is intended to be a living document and serve as the basis for detailed implementation plans, project proposals and strategic investment requests.

  4. Productivity enhancement planning using participative management concepts

    NASA Technical Reports Server (NTRS)

    White, M. E.; Kukla, J. C.

    1985-01-01

A productivity enhancement project which used participative management for both planning and implementation is described. The process and results associated with using participative management to plan and implement a computer terminal upgrade project where the computer terminals are used by research and development (R&D) personnel are reported. The upgrade improved the productivity of R&D personnel substantially, and their commitment to the implementation is high. Successful utilization of participative management for this project has laid a foundation for a continued style shift toward participation within the organization.

  5. Sonoma County Office of Education Computer Education Plan. County Level Plans.

    ERIC Educational Resources Information Center

    Malone, Greg

    1986-01-01

    This plan describes the educational computing and computer literacy program to be implemented by the schools in Sonoma County, California. Topics covered include the roles, responsibilities, and procedures of the county-wide computer committee; the goals of computer education in the county schools; the results of a needs assessment study; a 3-year…

  6. Planning Guide for Instructional Computing.

    ERIC Educational Resources Information Center

    League for Innovation in the Community Coll., Laguna Hills, CA.

    Designed to assist academic administrators at community colleges in developing strategies for the application of computers to teaching and learning, this guide provides background information and recommendations for the design and implementation of an instructional computing plan. Chapter 1 examines computers as a topic of instruction, as a medium…

  7. Implementing Computer-Based Training for Library Staff.

    ERIC Educational Resources Information Center

    Bayne, Pauline S.; And Others

    1994-01-01

    Describes a computer-based training program for library staff developed at the University of Tennessee, Knoxville, that used HyperCard stacks on Macintosh computers. Highlights include staff involvement; evaluation of modules; trainee participation and feedback; staff recognition; administrative support; implementation plan; supervisory…

  8. District Computer Concerns: Checklist for Monitoring Instructional Use of Computers.

    ERIC Educational Resources Information Center

    Coe, Merilyn

    Designed to assist those involved with planning, organizing, and implementing computer use in schools, this checklist can be applied to: (1) assess the present state of instructional computer use in the district; (2) assist with the development of plans or guidelines for computer use; (3) support a start-up phase; and (4) monitor the…

  9. Implementing Equal Access Computer Labs.

    ERIC Educational Resources Information Center

    Clinton, Janeen; And Others

    This paper discusses the philosophy followed in Palm Beach County to adapt computer literacy curriculum, hardware, and software to meet the needs of all children. The Department of Exceptional Student Education and the Department of Instructional Computing Services cooperated in planning strategies and coordinating efforts to implement equal…

  10. Computer Security: Governmentwide Planning Process Had Limited Impact. Report to the Chairman, Committee on Science, Space, and Technology, House of Representatives.

    ERIC Educational Resources Information Center

    General Accounting Office, Washington, DC. Information Management and Technology Div.

    As required by the Computer Security Act of 1987, federal agencies have to identify systems that contain sensitive information and develop plans to safeguard them. The planning process was assessed in 10 civilian agencies as well as the extent to which they had implemented planning controls described in 22 selected plans. The National Institute of…

  11. A Visual Tool for Computer Supported Learning: The Robot Motion Planning Example

    ERIC Educational Resources Information Center

    Elnagar, Ashraf; Lulu, Leena

    2007-01-01

    We introduce an effective computer aided learning visual tool (CALVT) to teach graph-based applications. We present the robot motion planning problem as an example of such applications. The proposed tool can be used to simulate and/or further to implement practical systems in different areas of computer science such as graphics, computational…

  12. Multi-GPU implementation of a VMAT treatment plan optimization algorithm.

    PubMed

    Tian, Zhen; Peng, Fei; Folkerts, Michael; Tan, Jun; Jia, Xun; Jiang, Steve B

    2015-06-01

    Volumetric modulated arc therapy (VMAT) optimization is a computationally challenging problem due to its large data size, high degrees of freedom, and many hardware constraints. High-performance graphics processing units (GPUs) have been used to speed up the computations. However, GPU's relatively small memory size cannot handle cases with a large dose-deposition coefficient (DDC) matrix in cases of, e.g., those with a large target size, multiple targets, multiple arcs, and/or small beamlet size. The main purpose of this paper is to report an implementation of a column-generation-based VMAT algorithm, previously developed in the authors' group, on a multi-GPU platform to solve the memory limitation problem. While the column-generation-based VMAT algorithm has been previously developed, the GPU implementation details have not been reported. Hence, another purpose is to present detailed techniques employed for GPU implementation. The authors also would like to utilize this particular problem as an example problem to study the feasibility of using a multi-GPU platform to solve large-scale problems in medical physics. The column-generation approach generates VMAT apertures sequentially by solving a pricing problem (PP) and a master problem (MP) iteratively. In the authors' method, the sparse DDC matrix is first stored on a CPU in coordinate list format (COO). On the GPU side, this matrix is split into four submatrices according to beam angles, which are stored on four GPUs in compressed sparse row format. Computation of beamlet price, the first step in PP, is accomplished using multi-GPUs. A fast inter-GPU data transfer scheme is accomplished using peer-to-peer access. The remaining steps of PP and MP problems are implemented on CPU or a single GPU due to their modest problem scale and computational loads. Barzilai and Borwein algorithm with a subspace step scheme is adopted here to solve the MP problem. 
A head and neck (H&N) cancer case is then used to validate the authors' method. The authors also compare their multi-GPU implementation with three different single GPU implementation strategies, i.e., truncating DDC matrix (S1), repeatedly transferring DDC matrix between CPU and GPU (S2), and porting computations involving DDC matrix to CPU (S3), in terms of both plan quality and computational efficiency. Two more H&N patient cases and three prostate cases are used to demonstrate the advantages of the authors' method. The authors' multi-GPU implementation can finish the optimization process within ∼1 min for the H&N patient case. S1 leads to an inferior plan quality although its total time was 10 s shorter than the multi-GPU implementation due to the reduced matrix size. S2 and S3 yield the same plan quality as the multi-GPU implementation but take ∼4 and ∼6 min, respectively. High computational efficiency was consistently achieved for the other five patient cases tested, with VMAT plans of clinically acceptable quality obtained within 23-46 s. Conversely, to obtain clinically comparable or acceptable plans for all six of these VMAT cases that the authors have tested in this paper, the optimization time needed in a commercial TPS system on CPU was found to be on the order of several minutes. The results demonstrate that the multi-GPU implementation of the authors' column-generation-based VMAT optimization can handle the large-scale VMAT optimization problem efficiently without sacrificing plan quality. The authors' study may serve as an example to shed some light on other large-scale medical physics problems that require multi-GPU techniques.
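The data-partitioning step this abstract describes (a sparse DDC matrix held on the CPU in COO format, then split by beam angle into per-GPU submatrices in compressed sparse row format) can be sketched with SciPy. The angle binning, matrix sizes, and values below are illustrative stand-ins, not the authors' actual data layout:

```python
import numpy as np
from scipy.sparse import coo_matrix

def split_ddc_by_beam_angle(ddc_coo, beamlet_angles, n_gpus=4):
    """Split a sparse DDC matrix (voxels x beamlets, COO format) into
    per-GPU CSR submatrices by grouping beamlet columns by beam angle."""
    # Assign each beamlet column to one of n_gpus contiguous angle bins.
    bins = np.linspace(beamlet_angles.min(), beamlet_angles.max(), n_gpus + 1)
    gpu_of_col = np.clip(np.digitize(beamlet_angles, bins) - 1, 0, n_gpus - 1)
    csr = ddc_coo.tocsr()
    # Each submatrix keeps only the columns destined for one GPU,
    # converted to CSR for fast row-wise products on-device.
    return [csr[:, gpu_of_col == g] for g in range(n_gpus)]

# Toy example: 6 voxels x 8 beamlets, beamlets spread over 4 beam angles.
rng = np.random.default_rng(0)
ddc = coo_matrix(rng.random((6, 8)) * (rng.random((6, 8)) > 0.7))
angles = np.array([0, 0, 90, 90, 180, 180, 270, 270], dtype=float)
parts = split_ddc_by_beam_angle(ddc, angles)
assert sum(p.shape[1] for p in parts) == 8  # columns partitioned, none lost
```

In the paper each submatrix would then be shipped to its own GPU; here the split itself is the point.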

  13. A Planning Guide for Instructional Networks, Part II.

    ERIC Educational Resources Information Center

    Daly, Kevin F.

    1994-01-01

    This second in a series of articles on planning for instructional computer networks focuses on site preparation, installation, service, and support. Highlights include an implementation schedule; classroom and computer lab layouts; electrical power needs; workstations; network cable; telephones; furniture; climate control; and security. (LRW)

  14. Computers in Medical Education: A Cooperative Approach to Planning and Implementation

    PubMed Central

    Ellis, Lynda B.M.; Fuller, Sherrilynne

    1988-01-01

    After years of ‘ad hoc’ growth in the use of computers in the curriculum, the University of Minnesota Medical School in cooperation with the Bio-Medical Library and Health Sciences Computing Services developed and began implementation of a plan for integration of medical informatics into all phases of medical education. Objectives were developed which focus on teaching skills related to: 1) accessing, retrieving, evaluating and managing medical information; 2) appropriate utilization of computer-assisted instruction lessons; 3) electronic communication with fellow students and medical faculty; and 4) fostering a lifelong commitment to effective use of computers to solve clinical problems. Surveys assessed the status of computer expertise among faculty and entering students. The results of these surveys, lessons learned from this experience, and implications for the future of computers in medical education are discussed.

  15. Design controls for large order systems

    NASA Technical Reports Server (NTRS)

    Doane, George B., III

    1991-01-01

    The output of this task will be a program plan which will delineate how MSFC will support and implement its portion of the Inter-Center Computational Controls Program Plan. Another output will be the results of looking at various multibody/multidegree of freedom computer programs in various environments.

  16. An allotment planning concept and related computer software for planning the fixed satellite service at the 1988 space WARC

    NASA Technical Reports Server (NTRS)

    Miller, Edward F.; Heyward, Ann O.; Ponchak, Denise S.; Spence, Rodney L.; Whyte, Wayne A., Jr.

    1987-01-01

The authors describe a two-phase approach to allotment planning suitable for use in planning the fixed satellite service at the 1988 Space World Administrative Radio Conference (ORB-88). The two phases are (1) the identification of predetermined geostationary arc segments common to groups of administrations and (2) the use of a synthesis program to identify example scenarios of space station placements. The planning approach is described in detail and is related to the objectives of the conference. Computer software has been developed to implement the concepts, and the logic and rationale for identifying predetermined arc segments are discussed. Example scenarios are evaluated to give guidance in the selection of the technical characteristics of space communications systems to be planned. The allotment planning concept described guarantees equitable access to the geostationary orbit, provides flexibility in implementation, and reduces the need for coordination among administrations.
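Phase (1) above, identifying a predetermined arc segment common to a group of administrations, is at heart an interval-intersection computation. A simplified sketch, with hypothetical visible-arc limits and ignoring longitude wraparound at ±180°:

```python
def common_arc_segment(arcs):
    """Intersect the visible geostationary arc intervals (degrees East)
    of a group of administrations; returns the shared segment or None."""
    lo = max(a for a, _ in arcs)   # easternmost western limit
    hi = min(b for _, b in arcs)   # westernmost eastern limit
    return (lo, hi) if lo <= hi else None

# Hypothetical visible arcs for three administrations (degrees East).
group = [(-30.0, 40.0), (-10.0, 60.0), (-20.0, 25.0)]
print(common_arc_segment(group))  # (-10.0, 25.0)
```

The actual ORB-88 planning software also weighed service-area coverage and interference criteria; this shows only the geometric core.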

  17. PC-based Multiple Information System Interface (PC/MISI) detailed design and implementation plan

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Hall, Philip P.

    1985-01-01

    The design plan for the personal computer multiple information system interface (PC/MISI) project is discussed. The document is intended to be used as a blueprint for the implementation of the system. Each component is described in the detail necessary to allow programmers to implement the system. A description of the system data flow and system file structures is given.

  18. An allotment planning concept and related computer software for planning the fixed satellite service at the 1988 space WARC

    NASA Technical Reports Server (NTRS)

    Miller, Edward F.; Heyward, Ann O.; Ponchak, Denise S.; Spence, Rodney L.; Whyte, Wayne A., Jr.; Zuzek, John E.

    1987-01-01

Described is a two-phase approach to allotment planning suitable for use in establishing the fixed satellite service at the 1988 Space World Administrative Radio Conference (ORB-88). The two phases are (1) the identification of predetermined geostationary arc segments common to groups of administrations, and (2) the use of a synthesis program to identify example scenarios of space station placements. The planning approach is described in detail and is related to the objectives of the conference. Computer software has been developed to implement the concepts, and a complete discussion of the logic and rationale for identifying predetermined arc segments is given. Example scenarios are evaluated to give guidance in the selection of the technical characteristics of space communications systems to be planned. The allotment planning concept described guarantees, in practice, equitable access to the geostationary orbit, provides flexibility in implementation, and reduces the need for coordination among administrations.

  19. Educational Technology, E.C.I.A. Chapter 2. Final Evaluation Report.

    ERIC Educational Resources Information Center

    District of Columbia Public Schools, Washington, DC. Div. of Quality Assurance.

    The Planning, Monitoring and Implementing (PMI) Evaluation Model for Decision-Making was used by the District of Columbia Public Schools to monitor their Office of Educational Technology in its efforts to provide direction and coordination for computer related activities, and to plan and implement educational television projects in math and…

  20. Information Technology for the Twenty-First Century: A Bold Investment in America's Future. Implementation Plan.

    ERIC Educational Resources Information Center

    Office of Science and Technology Policy, Washington, DC. National Science and Technology Council.

This document is the implementation plan for the Information Technology for the 21st Century (IT²) initiative. With this initiative, the Federal Government is making an important re-commitment to fundamental research in information technology. The initiative proposes $366 million in increased investments in computing, information, and…

  1. Adopting best practices: "Agility" moves from software development to healthcare project management.

    PubMed

    Kitzmiller, Rebecca; Hunt, Eleanor; Sproat, Sara Breckenridge

    2006-01-01

    It is time for a change in mindset in how nurses operationalize system implementations and manage projects. Computers and systems have evolved over time from unwieldy mysterious machines of the past to ubiquitous computer use in every aspect of daily lives and work sites. Yet, disconcertingly, the process used to implement these systems has not evolved. Technology implementation does not need to be a struggle. It is time to adapt traditional plan-driven implementation methods to incorporate agile techniques. Agility is a concept borrowed from software development and is presented here because it encourages flexibility, adaptation, and continuous learning as part of the implementation process. Agility values communication and harnesses change to an advantage, which facilitates the natural evolution of an adaptable implementation process. Specific examples of agility in an implementation are described, and plan-driven implementation stages are adapted to incorporate relevant agile techniques. This comparison demonstrates how an agile approach enhances traditional implementation techniques to meet the demands of today's complex healthcare environments.

  2. A Four-Stage Model for Planning Computer-Based Instruction.

    ERIC Educational Resources Information Center

    Morrison, Gary R.; Ross, Steven M.

    1988-01-01

    Describes a flexible planning process for developing computer based instruction (CBI) in which the CBI design is implemented on paper between the lesson design and the program production. A four-stage model is explained, including (1) an initial flowchart, (2) storyboards, (3) a detailed flowchart, and (4) an evaluation. (16 references)…

  3. Multi-GPU configuration of 4D intensity modulated radiation therapy inverse planning using global optimization

    NASA Astrophysics Data System (ADS)

    Hagan, Aaron; Sawant, Amit; Folkerts, Michael; Modiri, Arezoo

    2018-01-01

    We report on the design, implementation and characterization of a multi-graphic processing unit (GPU) computational platform for higher-order optimization in radiotherapy treatment planning. In collaboration with a commercial vendor (Varian Medical Systems, Palo Alto, CA), a research prototype GPU-enabled Eclipse (V13.6) workstation was configured. The hardware consisted of dual 8-core Xeon processors, 256 GB RAM and four NVIDIA Tesla K80 general purpose GPUs. We demonstrate the utility of this platform for large radiotherapy optimization problems through the development and characterization of a parallelized particle swarm optimization (PSO) four dimensional (4D) intensity modulated radiation therapy (IMRT) technique. The PSO engine was coupled to the Eclipse treatment planning system via a vendor-provided scripting interface. Specific challenges addressed in this implementation were (i) data management and (ii) non-uniform memory access (NUMA). For the former, we alternated between parameters over which the computation process was parallelized. For the latter, we reduced the amount of data required to be transferred over the NUMA bridge. The datasets examined in this study were approximately 300 GB in size, including 4D computed tomography images, anatomical structure contours and dose deposition matrices. For evaluation, we created a 4D-IMRT treatment plan for one lung cancer patient and analyzed computation speed while varying several parameters (number of respiratory phases, GPUs, PSO particles, and data matrix sizes). The optimized 4D-IMRT plan enhanced sparing of organs at risk by an average reduction of 26% in maximum dose, compared to the clinical optimized IMRT plan, where the internal target volume was used. We validated our computation time analyses in two additional cases. The computation speed in our implementation did not monotonically increase with the number of GPUs. 
The optimal number of GPUs (five, in our study) is directly related to the hardware specifications. The optimization process took 35 min using 50 PSO particles, 25 iterations and 5 GPUs.
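The study's PSO engine runs inside Eclipse through a vendor scripting interface and is parallelized across GPUs, but the particle swarm update it distributes is compact. A minimal serial sketch; the inertia/cognitive/social weights, bounds, and the sphere test objective are illustrative assumptions, not the study's cost function:

```python
import numpy as np

def pso_minimize(f, dim, n_particles=50, iters=25, seed=0,
                 w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Minimal particle swarm optimizer: each particle tracks its personal
    best; the swarm shares a global best that steers all velocities."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    g = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Velocity blends inertia, pull toward personal best, pull toward global best.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

# Toy objective standing in for the plan cost: a shifted sphere function.
best_x, best_val = pso_minimize(lambda p: np.sum((p - 1.0) ** 2), dim=3)
```

The 50 particles and 25 iterations mirror the figures quoted in the abstract; in the study each fitness evaluation is a full dose computation, which is what makes the GPU parallelization worthwhile.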

  4. Multi-GPU configuration of 4D intensity modulated radiation therapy inverse planning using global optimization.

    PubMed

    Hagan, Aaron; Sawant, Amit; Folkerts, Michael; Modiri, Arezoo

    2018-01-16

We report on the design, implementation and characterization of a multi-graphic processing unit (GPU) computational platform for higher-order optimization in radiotherapy treatment planning. In collaboration with a commercial vendor (Varian Medical Systems, Palo Alto, CA), a research prototype GPU-enabled Eclipse (V13.6) workstation was configured. The hardware consisted of dual 8-core Xeon processors, 256 GB RAM and four NVIDIA Tesla K80 general purpose GPUs. We demonstrate the utility of this platform for large radiotherapy optimization problems through the development and characterization of a parallelized particle swarm optimization (PSO) four dimensional (4D) intensity modulated radiation therapy (IMRT) technique. The PSO engine was coupled to the Eclipse treatment planning system via a vendor-provided scripting interface. Specific challenges addressed in this implementation were (i) data management and (ii) non-uniform memory access (NUMA). For the former, we alternated between parameters over which the computation process was parallelized. For the latter, we reduced the amount of data required to be transferred over the NUMA bridge. The datasets examined in this study were approximately 300 GB in size, including 4D computed tomography images, anatomical structure contours and dose deposition matrices. For evaluation, we created a 4D-IMRT treatment plan for one lung cancer patient and analyzed computation speed while varying several parameters (number of respiratory phases, GPUs, PSO particles, and data matrix sizes). The optimized 4D-IMRT plan enhanced sparing of organs at risk by an average reduction of 26% in maximum dose, compared to the clinical optimized IMRT plan, where the internal target volume was used. We validated our computation time analyses in two additional cases. The computation speed in our implementation did not monotonically increase with the number of GPUs. 
The optimal number of GPUs (five, in our study) is directly related to the hardware specifications. The optimization process took 35 min using 50 PSO particles, 25 iterations and 5 GPUs.

  5. Application of computer-aided dispatch in law enforcement: An introductory planning guide

    NASA Technical Reports Server (NTRS)

    Sohn, R. L.; Gurfield, R. M.; Garcia, E. A.; Fielding, J. E.

    1975-01-01

    A set of planning guidelines for the application of computer-aided dispatching (CAD) to law enforcement is presented. Some essential characteristics and applications of CAD are outlined; the results of a survey of systems in the operational or planning phases are summarized. Requirements analysis, system concept design, implementation planning, and performance and cost modeling are described and demonstrated with numerous examples. Detailed descriptions of typical law enforcement CAD systems, and a list of vendor sources, are given in appendixes.

  6. Items Supporting the Hanford Internal Dosimetry Program Implementation of the IMBA Computer Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carbaugh, Eugene H.; Bihl, Donald E.

    2008-01-07

The Hanford Internal Dosimetry Program has adopted the computer code IMBA (Integrated Modules for Bioassay Analysis) as its primary code for bioassay data evaluation and dose assessment using methodologies of ICRP Publications 60, 66, 67, 68, and 78. The adoption of this code was part of the implementation plan for the June 8, 2007 amendments to 10 CFR 835. This information release includes action items unique to IMBA that were required by PNNL quality assurance standards for implementation of safety software. Copies of the IMBA software verification test plan and the outline of the briefing given to new users are also included.

  7. High performance computing and communications: FY 1997 implementation plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-12-01

The High Performance Computing and Communications (HPCC) Program was formally authorized by passage, with bipartisan support, of the High-Performance Computing Act of 1991, signed on December 9, 1991. The original Program, in which eight Federal agencies participated, has now grown to twelve agencies. This Plan provides a detailed description of the agencies' FY 1996 HPCC accomplishments and FY 1997 HPCC plans. Section 3 of this Plan provides an overview of the HPCC Program. Section 4 contains more detailed definitions of the Program Component Areas, with an emphasis on the overall directions and milestones planned for each PCA. Appendix A provides a detailed look at HPCC Program activities within each agency.

  8. Multi-GPU implementation of a VMAT treatment plan optimization algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tian, Zhen, E-mail: Zhen.Tian@UTSouthwestern.edu, E-mail: Xun.Jia@UTSouthwestern.edu, E-mail: Steve.Jiang@UTSouthwestern.edu; Folkerts, Michael; Tan, Jun

Purpose: Volumetric modulated arc therapy (VMAT) optimization is a computationally challenging problem due to its large data size, high degrees of freedom, and many hardware constraints. High-performance graphics processing units (GPUs) have been used to speed up the computations. However, GPU’s relatively small memory size cannot handle cases with a large dose-deposition coefficient (DDC) matrix in cases of, e.g., those with a large target size, multiple targets, multiple arcs, and/or small beamlet size. The main purpose of this paper is to report an implementation of a column-generation-based VMAT algorithm, previously developed in the authors’ group, on a multi-GPU platform to solve the memory limitation problem. While the column-generation-based VMAT algorithm has been previously developed, the GPU implementation details have not been reported. Hence, another purpose is to present detailed techniques employed for GPU implementation. The authors also would like to utilize this particular problem as an example problem to study the feasibility of using a multi-GPU platform to solve large-scale problems in medical physics. Methods: The column-generation approach generates VMAT apertures sequentially by solving a pricing problem (PP) and a master problem (MP) iteratively. In the authors’ method, the sparse DDC matrix is first stored on a CPU in coordinate list format (COO). On the GPU side, this matrix is split into four submatrices according to beam angles, which are stored on four GPUs in compressed sparse row format. Computation of beamlet price, the first step in PP, is accomplished using multi-GPUs. A fast inter-GPU data transfer scheme is accomplished using peer-to-peer access. The remaining steps of PP and MP problems are implemented on CPU or a single GPU due to their modest problem scale and computational loads. Barzilai and Borwein algorithm with a subspace step scheme is adopted here to solve the MP problem. 
A head and neck (H&N) cancer case is then used to validate the authors’ method. The authors also compare their multi-GPU implementation with three different single GPU implementation strategies, i.e., truncating DDC matrix (S1), repeatedly transferring DDC matrix between CPU and GPU (S2), and porting computations involving DDC matrix to CPU (S3), in terms of both plan quality and computational efficiency. Two more H&N patient cases and three prostate cases are used to demonstrate the advantages of the authors’ method. Results: The authors’ multi-GPU implementation can finish the optimization process within ∼1 min for the H&N patient case. S1 leads to an inferior plan quality although its total time was 10 s shorter than the multi-GPU implementation due to the reduced matrix size. S2 and S3 yield the same plan quality as the multi-GPU implementation but take ∼4 and ∼6 min, respectively. High computational efficiency was consistently achieved for the other five patient cases tested, with VMAT plans of clinically acceptable quality obtained within 23–46 s. Conversely, to obtain clinically comparable or acceptable plans for all six of these VMAT cases that the authors have tested in this paper, the optimization time needed in a commercial TPS system on CPU was found to be on the order of several minutes. Conclusions: The results demonstrate that the multi-GPU implementation of the authors’ column-generation-based VMAT optimization can handle the large-scale VMAT optimization problem efficiently without sacrificing plan quality. The authors’ study may serve as an example to shed some light on other large-scale medical physics problems that require multi-GPU techniques.

  9. Information Technology: Making It All Fit. Track VIII: Academic Computing Strategy.

    ERIC Educational Resources Information Center

    CAUSE, Boulder, CO.

    Six papers from the 1988 CAUSE conference's Track VIII, Academic Computing Strategy, are presented. They include: "Achieving Institution-Wide Computer Fluency: A Five-Year Retrospective" (Paul J. Plourde); "A Methodology and a Policy for Building and Implementing a Strategic Computer Plan" (Frank B. Thomas); "Aligning…

  10. Opportunity costs of implementing forest plans

    NASA Astrophysics Data System (ADS)

    Fox, Bruce; Keller, Mary Anne; Schlosberg, Andrew J.; Vlahovich, James E.

    1989-01-01

Intellectual concern with the National Forest Management Act of 1976 has followed a course emphasizing the planning aspects of the legislation associated with the development of forest plans. Once approved, however, forest plans must be implemented. Due to the complex nature of the ecological systems of interest, and the multiple and often conflicting desires of user clientele groups, the feasibility and costs of implementing forest plans require immediate investigation. For one timber sale on the Coconino National Forest in Arizona, forest plan constraints were applied and resulting resource outputs predicted using the terrestrial ecosystem analysis and modeling system (TEAMS), a computer-based decision support system developed at the School of Forestry, Northern Arizona University. With forest plan constraints for wildlife habitat, visual diversity, riparian area protection, and soil and slope harvesting restrictions, the maximum timber harvest obtainable was reduced 58% from the maximum obtainable without plan constraints.
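The pattern this study quantifies (plan constraints lowering the maximum obtainable harvest relative to an unconstrained run) is the standard constrained linear program. A hypothetical two-stand illustration; all volumes and limits are invented, not Coconino data:

```python
from scipy.optimize import linprog

# Hypothetical per-stand timber volumes; maximize volumes @ x, where x_i is
# the fraction of stand i harvested. linprog minimizes, so negate the objective.
volumes = [120.0, 80.0]          # thousand board feet per stand (made up)

# Unconstrained-plan run: only the 0..1 harvest-fraction bounds apply.
free = linprog(c=[-v for v in volumes], bounds=[(0, 1)] * 2)

# Plan-constrained run: e.g. a wildlife-habitat limit caps total harvested
# area, and riparian protection closes 40% of stand 2.
constrained = linprog(
    c=[-v for v in volumes],
    A_ub=[[1.0, 1.0]], b_ub=[1.2],   # habitat: at most 1.2 stand-equivalents
    bounds=[(0, 1), (0, 0.6)],       # riparian: stand 2 capped at 60%
)
print(-free.fun, -constrained.fun)   # constrained harvest is strictly lower
```

TEAMS layers many such constraints over a spatial forest model; the two-line LP shows only why the constrained optimum must sit at or below the unconstrained one.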

  11. The Application of Computer Technology to the Development of a Native American Planning and Information System.

    ERIC Educational Resources Information Center

    McKinley, Kenneth H.; Self, Burl E., Jr.

    A study was conducted to determine the feasibility of using the computer-based Synagraphic Mapping Program (SYMAP) and the Statistical Package for the Social Sciences (SPSS) in formulating an efficient and accurate information system which Creek Nation tribal staff could implement and use in planning for more effective and precise delivery of…

  12. The Status of Ubiquitous Computing.

    ERIC Educational Resources Information Center

    Brown, David G.; Petitto, Karen R.

    2003-01-01

    Explains the prevalence and rationale of ubiquitous computing on college campuses--teaching with the assumption or expectation that all faculty and students have access to the Internet--and offers lessons learned by pioneering institutions. Lessons learned involve planning, technology, implementation and management, adoption of computer-enhanced…

  13. Designing and Implementing a Faculty Internet Workshop: A Collaborative Effort of Academic Computing Services and the University Library.

    ERIC Educational Resources Information Center

    Bradford, Jane T.; And Others

    1996-01-01

    Academic Computing Services staff and University librarians at Stetson University (DeLand, Florida) designed and implemented a three-day Internet workshop for interested faculty. The workshop included both hands-on lab sessions and discussions covering e-mail, telnet, ftp, Gopher, and World Wide Web. The planning, preparation of the lab and…

  14. Optimized planning methodologies of ASON implementation

    NASA Astrophysics Data System (ADS)

    Zhou, Michael M.; Tamil, Lakshman S.

    2005-02-01

    Advanced network planning concerns effective network-resource allocation for a dynamic and open business environment. Planning methodologies for ASON implementation based on qualitative analysis and mathematical modeling are presented in this paper. The methodology includes methods for rationalizing technology and architecture, building network and nodal models, and developing dynamic programming for multi-period deployment. The multi-layered nodal architecture proposed here can accommodate various nodal configurations for a multi-plane optical network, and the network modeling presented here computes the required network elements for optimizing resource allocation.
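
    The multi-period deployment idea in this abstract can be sketched as a small dynamic program. The demand figures, unit costs, and fixed deployment charge below are toy numbers invented for illustration, not the paper's model: the state is installed capacity, the decision is how many units to add each period, and declining equipment prices create the classic trade-off between building early and waiting.

```python
# Toy multi-period capacity-deployment planner (illustrative only).
from functools import lru_cache

DEMAND = [2, 3, 5, 6]          # required capacity per period (hypothetical)
UNIT_COST = [10, 8, 6, 5]      # per-unit cost declines as technology matures
FIXED_COST = 4                 # commissioning charge in any period with a build
MAX_CAP = 8

def plan(periods=len(DEMAND)):
    @lru_cache(maxsize=None)
    def best(t, cap):
        # Returns (minimum cost from period t onward, units added per period).
        if t == periods:
            return 0, ()
        needed = max(0, DEMAND[t] - cap)
        best_cost, best_adds = float("inf"), None
        for add in range(needed, MAX_CAP - cap + 1):
            build = FIXED_COST + add * UNIT_COST[t] if add else 0
            rest, tail = best(t + 1, cap + add)
            if build + rest < best_cost:
                best_cost, best_adds = build + rest, (add,) + tail
        return best_cost, best_adds
    return best(0, 0)
```

    Because later units are cheaper but every build period pays the fixed charge, the optimizer batches purchases, which is the qualitative behavior multi-period deployment models are meant to capture.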

  15. Research in advanced formal theorem-proving techniques. [design and implementation of computer languages

    NASA Technical Reports Server (NTRS)

    Raphael, B.; Fikes, R.; Waldinger, R.

    1973-01-01

    The results of a project aimed at the design and implementation of computer languages to aid in expressing problem solving procedures in several areas of artificial intelligence, including automatic programming, theorem proving, and robot planning, are summarized. The principal results of the project were the design and implementation of two complete systems, QA4 and QLISP, and their preliminary experimental use. The various applications of both QA4 and QLISP are given.

  16. Educational Leadership and Planning for Technology. Third Edition.

    ERIC Educational Resources Information Center

    Picciano, Anthony G.

    The purpose of this book is to provide educators with both the theoretical and the practical considerations for planning and implementing technology, particularly computer applications, in schools. Section I provides the basic concepts and foundation material for an overall understanding of the themes and major issues related to planning for…

  17. A Research Program in Computer Technology. 1982 Annual Technical Report

    DTIC Science & Technology

    1983-03-01

    for the Defense Advanced Research Projects Agency. The research applies computer science and technology to areas of high DoD/military impact. The ISI...implement the plan; New Computing Environment - investigation and adaptation of developing computer technologies to serve the research and military user communities; and Computer

  18. Computer Electronics. Florida Vocational Program Guide.

    ERIC Educational Resources Information Center

    University of South Florida, Tampa. Dept. of Adult and Vocational Education.

    This packet contains a program guide and Career Merit Achievement Plan (Career MAP) for the implementation of a computer electronics technology (computer service technician) program in Florida secondary and postsecondary schools. The program guide describes the program content and structure, provides a program description, lists job titles under…

  19. Comparative Implementation of High Performance Computing for Power System Dynamic Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Shuangshuang; Huang, Zhenyu; Diao, Ruisheng

    Dynamic simulation for transient stability assessment is one of the most important, but intensive, computations for power system planning and operation. Present commercial software is mainly designed for sequential computation to run a single simulation, which is very time consuming with a single processor. The application of High Performance Computing (HPC) to dynamic simulations is very promising in accelerating the computing process by parallelizing its kernel algorithms while maintaining the same level of computation accuracy. This paper describes the comparative implementation of four parallel dynamic simulation schemes in two state-of-the-art HPC environments: Message Passing Interface (MPI) and Open Multi-Processing (OpenMP). These implementations serve to match the application with dedicated multi-processor computing hardware and maximize the utilization and benefits of HPC during the development process.
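
    The kernel being parallelized here is a per-timestep map over machine states. The sketch below is a deliberately simplified stand-in, not the paper's code: the machines follow a toy swing-equation step and are decoupled (a real network couples them through the admittance matrix), so each step can be farmed out to a shared-memory thread pool in the spirit of the OpenMP scheme.

```python
import math
from concurrent.futures import ThreadPoolExecutor

def deriv(state, pm=1.0, pmax=1.5, damping=0.1, h=0.01):
    """One explicit-Euler update for a single machine's (delta, omega) state."""
    delta, omega = state
    return (delta + h * omega,
            omega + h * (pm - pmax * math.sin(delta) - damping * omega))

def simulate(states, steps, executor=None):
    # The per-step kernel is an independent map over machines, so the same
    # code runs sequentially or on a thread pool (the shared-memory case).
    step = executor.map if executor else map
    for _ in range(steps):
        states = list(step(deriv, states))
    return states
```

    Running the same `simulate` with and without an executor yields bit-identical trajectories, which mirrors the paper's point that parallelization should preserve computation accuracy.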

  20. Feasibility study of an Integrated Program for Aerospace-vehicle Design (IPAD) system. Volume 6: Implementation schedule, development costs, operational costs, benefit assessment, impact on company organization, spin-off assessment, phase 1, tasks 3 to 8

    NASA Technical Reports Server (NTRS)

    Garrocq, C. A.; Hurley, M. J.; Dublin, M.

    1973-01-01

    A baseline implementation plan, including alternative implementation approaches for critical software elements and variants to the plan, was developed. The basic philosophy was aimed at: (1) a progressive release of capability for three major computing systems, (2) an end product that was a working tool, (3) giving participation to industry, government agencies, and universities, and (4) emphasizing the development of critical elements of the IPAD framework software. The results of these tasks indicate an IPAD first release capability 45 months after go-ahead, a five year total implementation schedule, and a total developmental cost of 2027 man-months and 1074 computer hours. Several areas of operational cost increases were identified mainly due to the impact of additional equipment needed and additional computer overhead. The benefits of an IPAD system were related mainly to potential savings in engineering man-hours, reduction of design-cycle calendar time, and indirect upgrading of product quality and performance.

  1. User-Centered Authentication: LDAP, WRAP, X.509, XML (SIG LAN: Library Automation and Networks).

    ERIC Educational Resources Information Center

    Coble, Jim

    2000-01-01

    Presents an abstract for a planned panel session on technologies for user-centered authentication and authorization currently deployed in pilot or production implementations in academic computing. Presentations included: "Implementing LDAP for Single-Password Access to Campus Resources" (Layne Nordgren); "Implementing a Scalable…

  2. Sensor planning for moving targets

    NASA Astrophysics Data System (ADS)

    Musman, Scott A.; Lehner, Paul; Elsaesser, Chris

    1994-10-01

    Planning a search for moving ground targets is difficult for humans and computationally intractable. This paper describes a technique to solve such problems. The main idea is to combine probability of detection assessments with computational search heuristics to generate sensor plans which approximately maximize either the probability of detection or a user-specified knowledge function (e.g., determining the target's probable destination; locating the enemy tanks). In contrast to supercomputer-based moving target search planning, our technique has been implemented using workstation technology. The data structures generated by sensor planning can be used to evaluate sensor reports during plan execution. Our system revises its objective function with each sensor report, allowing the user to assess both the current situation and the expected value of future information. This capability is particularly useful in situations involving a high rate of sensor reporting, helping the user focus his attention on the sensor reports most pertinent to current needs. Our planning approach is implemented in a three-layer architecture. The layers are: mobility analysis, followed by sensor coverage analysis, and concluding with sensor plan analysis. Using these layers, it is possible to describe the physical, spatial, and temporal characteristics of a scenario in the first two layers and to customize the final analysis to specific intelligence objectives. The architecture also allows a user to customize operational parameters in each of the three major components of the system. As examples of these performance options, we briefly describe the mobility analysis and discuss issues affecting sensor plan analysis.
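
    The "approximately maximize the probability of detection" step can be sketched as a greedy heuristic over a gridded target prior. Everything here is a hypothetical toy (cell priors, per-sensor detection probabilities, the greedy rule), far simpler than the paper's three-layer analysis; it only illustrates how independent sensor looks combine via 1 minus the product of miss probabilities.

```python
# Greedy sensor-plan sketch (toy numbers; not the paper's algorithm).
def detection_prob(plan, prior, coverage):
    # P(detect) = sum over cells of prior * (1 - product of miss probabilities)
    total = 0.0
    for cell, p_target in prior.items():
        miss = 1.0
        for s in plan:
            miss *= 1.0 - coverage[s].get(cell, 0.0)
        total += p_target * (1.0 - miss)
    return total

def greedy_plan(prior, coverage, k):
    # Repeatedly add the sensor task giving the largest marginal gain.
    plan = []
    for _ in range(k):
        best = max((s for s in coverage if s not in plan),
                   key=lambda s: detection_prob(plan + [s], prior, coverage))
        plan.append(best)
    return plan
```

    Swapping `detection_prob` for a user-specified knowledge function would reproduce the paper's alternative objective without changing the greedy loop.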

  3. Clinical implementation of a GPU-based simplified Monte Carlo method for a treatment planning system of proton beam therapy.

    PubMed

    Kohno, R; Hotta, K; Nishioka, S; Matsubara, K; Tansho, R; Suzuki, T

    2011-11-21

    We implemented the simplified Monte Carlo (SMC) method on graphics processing unit (GPU) architecture under the compute unified device architecture (CUDA) platform developed by NVIDIA. The GPU-based SMC was clinically applied for four patients with head and neck, lung, or prostate cancer. The results were compared to those obtained by a traditional CPU-based SMC with respect to computation time and discrepancy. In the CPU- and GPU-based SMC calculations, the estimated mean statistical errors of the calculated doses in the planning target volume region were within 0.5% rms. The dose distributions calculated by the GPU- and CPU-based SMCs were similar, within statistical errors. The GPU-based SMC showed 12.30-16.00 times faster performance than the CPU-based SMC. The computation time per beam arrangement using the GPU-based SMC for the clinical cases ranged from 9 to 67 s. The results demonstrate the successful application of the GPU-based SMC to clinical proton treatment planning.

  4. A non-voxel-based broad-beam (NVBB) framework for IMRT treatment planning.

    PubMed

    Lu, Weiguo

    2010-12-07

    We present a novel framework that enables very large scale intensity-modulated radiation therapy (IMRT) planning on limited computation resources with improvements in cost, plan quality and planning throughput. Current IMRT optimization uses a voxel-based beamlet superposition (VBS) framework that requires pre-calculation and storage of a large amount of beamlet data, resulting in large temporal and spatial complexity. We developed a non-voxel-based broad-beam (NVBB) framework for IMRT capable of direct treatment parameter optimization (DTPO). In this framework, both objective function and derivative are evaluated based on the continuous viewpoint, abandoning 'voxel' and 'beamlet' representations. Thus pre-calculation and storage of beamlets are no longer needed. The NVBB framework has linear complexity, O(N^3), in both space and time. The low memory, full computation and data parallelization nature of the framework render its efficient implementation on the graphics processing unit (GPU). We implemented the NVBB framework and incorporated it with the TomoTherapy treatment planning system (TPS). The new TPS runs on a single workstation with one GPU card (NVBB-GPU). Extensive verification/validation tests were performed in house and via third parties. Benchmarks on dose accuracy, plan quality and throughput were compared with the commercial TomoTherapy TPS that is based on the VBS framework and uses a computer cluster with 14 nodes (VBS-cluster). For all tests, the dose accuracy of these two TPSs is comparable (within 1%). Plan qualities were comparable, with no clinically significant difference for most cases, except that superior target uniformity was seen in the NVBB-GPU for some cases. However, the planning time using the NVBB-GPU was reduced many-fold over the VBS-cluster. In conclusion, we developed a novel NVBB framework for IMRT optimization. 
The continuous viewpoint and DTPO nature of the algorithm eliminate the need for beamlets and lead to better plan quality. The computation parallelization on a GPU instead of a computer cluster significantly reduces hardware and service costs. Compared with using the current VBS framework on a computer cluster, the planning time is significantly reduced using the NVBB framework on a single workstation with a GPU card.
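
    The "evaluate objective function and derivative each iteration" idea common to IMRT optimizers can be sketched with a generic projected-gradient least-squares toy. This is not the NVBB algorithm (it even uses an explicit dose matrix, which NVBB avoids); the matrix `D`, prescription `d`, learning rate, and iteration count below are all illustrative assumptions.

```python
# Projected-gradient sketch of dose optimization (generic toy, not NVBB):
# minimize ||D w - d||^2 over nonnegative intensities w, recomputing the
# objective's gradient directly at every iteration.
def optimize(D, d, iters=500, lr=0.05):
    n = len(D[0])
    w = [0.0] * n
    for _ in range(iters):
        # residual r = D w - d, gradient g = 2 D^T r
        r = [sum(D[i][j] * w[j] for j in range(n)) - d[i] for i in range(len(d))]
        g = [2.0 * sum(D[i][j] * r[i] for i in range(len(d))) for j in range(n)]
        # gradient step, projected onto w >= 0 (intensities are nonnegative)
        w = [max(0.0, w[j] - lr * g[j]) for j in range(n)]
    return w
```

    The point of contrast with VBS is where `D` comes from: a beamlet framework precomputes and stores it, while a broad-beam framework evaluates the equivalent dose and gradient on the fly.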

  5. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan : ASC software quality engineering practices Version 3.0.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia

    2009-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Agency (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to the SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. This plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitments to improving software products by applying cost-effective SQE practices. This plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.

  6. The software analysis project for the Office of Human Resources

    NASA Technical Reports Server (NTRS)

    Tureman, Robert L., Jr.

    1994-01-01

    There were two major sections of the project for the Office of Human Resources (OHR). The first section was to conduct a planning study to analyze software use with the goal of recommending software purchases and determining whether the need exists for a file server. The second section was analysis and distribution planning for a retirement planning computer program entitled VISION provided by NASA Headquarters. The software planning study was developed to help OHR analyze the current administrative desktop computing environment and make decisions regarding software acquisition and implementation. There were three major areas addressed by the study: the current environment, new software requirements, and strategies regarding the implementation of a server in the Office. To gather data on the current environment, employees were surveyed and an inventory of computers was produced. The surveys were compiled and analyzed by the ASEE fellow with interpretation help by OHR staff. New software requirements represented a compilation and analysis of the surveyed requests of OHR personnel. Finally, the information on the use of a server represents research done by the ASEE fellow and analysis of survey data to determine software requirements for a server. This included selection of a methodology to estimate the number of copies of each software program required given current use and estimated growth. The report presents the results of the computing survey, a description of the current computing environment, recommendations for changes in the computing environment, current software needs, management advantages of using a server, and management considerations in the implementation of a server. In addition, detailed specifications were presented for the hardware and software recommendations to offer a complete picture to OHR management. The retirement planning computer program available to NASA employees will aid in long-range retirement planning. 
The intended audience is the NASA civil service employee with several years until retirement. The employee enters current salary and savings information as well as goals concerning salary at retirement, assumptions on inflation, and the return on investments. The program produces a picture of the employee's retirement income from all sources based on the assumptions entered. A session showing features of the program was conducted for key personnel at the Center. After analysis, it was decided to offer the program through the Learning Center starting in August 1994.

  7. An emulator for minimizing finite element analysis implementation resources

    NASA Technical Reports Server (NTRS)

    Melosh, R. J.; Utku, S.; Salama, M.; Islam, M.

    1982-01-01

    A finite element analysis emulator providing a basis for efficiently establishing an optimum computer implementation strategy when many calculations are involved is described. The SCOPE emulator determines computer resources required as a function of the structural model, structural load-deflection equation characteristics, the storage allocation plan, and computer hardware capabilities. Thereby, it provides data for trading off analysis implementation options to arrive at a best strategy. The models contained in SCOPE lead to micro-operation computer counts of each finite element operation as well as overall computer resource cost estimates. Application of SCOPE to the Memphis-Arkansas bridge analysis provides measures of the accuracy of resource assessments. Data indicate that predictions are within 17.3 percent for calculation times and within 3.2 percent for peripheral storage resources for the ELAS code.

  8. AgRISTARS: Renewable resources inventory. Land information support system implementation plan and schedule. [San Juan National Forest pilot test

    NASA Technical Reports Server (NTRS)

    Yao, S. S. (Principal Investigator)

    1981-01-01

    The planning and scheduling of the use of remote sensing and computer technology to support the land management planning effort at the national forests level are outlined. The task planning and system capability development were reviewed. A user evaluation is presented along with technological transfer methodology. A land management planning pilot test of the San Juan National Forest is discussed.

  9. Computer-aided acquisition and logistics support (CALS): Concept of Operations for Depot Maintenance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bourgeois, N.C.; Greer, D.K.

    1993-04-01

    This CALS Concept of Operations for Depot Maintenance provides the foundation strategy and the near-term tactical plan for CALS implementation in the depot maintenance environment. The user requirements enumerated and the overarching architecture outlined serve as the primary framework for implementation planning. The seamless integration of depot maintenance business processes and supporting information systems with the emerging global CALS environment will be critical to the efficient realization of depot users' information requirements and, as such, will be a fundamental theme in depot implementations.

  10. Manufacturing engineering: Principles for optimization

    NASA Astrophysics Data System (ADS)

    Koenig, Daniel T.

    Various subjects in the area of manufacturing engineering are addressed. The topics considered include: manufacturing engineering organization concepts and management techniques, factory capacity and loading techniques, capital equipment programs, machine tool and equipment selection and implementation, producibility engineering, methods, planning and work management, and process control engineering in job shops. Also discussed are: maintenance engineering, numerical control of machine tools, fundamentals of computer-aided design/computer-aided manufacture, computer-aided process planning and data collection, group technology basis for plant layout, environmental control and safety, and the Integrated Productivity Improvement Program.

  11. Plan Recognition using Statistical Relational Models

    DTIC Science & Technology

    2014-08-25

    arguments. Section 4 describes several variants of MLNs for plan recognition. All MLN models were implemented using Alchemy (Kok et al., 2010), an...For both MLN approaches, we used MC-SAT (Poon and Domingos, 2006) as implemented in the Alchemy system on both Monroe and Linux. Evaluation Metric We...Singla P, Poon H, Lowd D, Wang J, Nath A, Domingos P. The Alchemy System for Statistical Relational AI. Technical Report; Department of Computer Science

  12. Defragging Computer/Videogame Implementation and Assessment in the Social Studies

    ERIC Educational Resources Information Center

    McBride, Holly

    2014-01-01

    Students in this post-industrial technological age require opportunities for the acquisition of new skills, especially in the marketplace of innovation. A pedagogical strategy that is becoming more and more popular within social studies classrooms is the use of computer and video games as enhancements to everyday lesson plans. Computer/video games…

  13. Adaptive Allocation of Decision Making Responsibility Between Human and Computer in Multi-Task Situations. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chu, Y. Y.

    1978-01-01

    A unified formulation of computer-aided, multi-task decision making is presented. A strategy for the allocation of decision making responsibility between human and computer is developed. The plans of a flight management system are studied. A model based on queueing theory was implemented.
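
    A queueing-theoretic allocation question of this kind can be illustrated with a back-of-envelope M/M/1 sketch. The rates and the grid search below are hypothetical stand-ins, far cruder than the thesis's model: each server (human or computer) is an M/M/1 queue with expected response time 1/(mu - lambda), and we search over the fraction of tasks routed to the computer.

```python
# M/M/1 sketch of human/computer task allocation (toy rates, not the thesis model).
def mm1_response_time(arrival, service):
    assert arrival < service, "queue must be stable"
    return 1.0 / (service - arrival)

def best_split(total_arrival, human_rate, computer_rate, grid=100):
    # Route a fraction f of tasks to the computer; keep the stable split
    # minimizing the arrival-weighted average response time.
    best = None
    for i in range(grid + 1):
        f = i / grid
        lam_c, lam_h = f * total_arrival, (1 - f) * total_arrival
        if lam_c >= computer_rate or lam_h >= human_rate:
            continue  # unstable split, skip
        w = f * mm1_response_time(lam_c, computer_rate) \
            + (1 - f) * mm1_response_time(lam_h, human_rate)
        if best is None or w < best[1]:
            best = (f, w)
    return best
```

    An adaptive allocator would re-solve this kind of trade-off as workloads change, which is the dynamic element the thesis studies.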

  14. Improving TOGAF ADM 9.1 Migration Planning Phase by ITIL V3 Service Transition

    NASA Astrophysics Data System (ADS)

    Hanum Harani, Nisa; Akhmad Arman, Arry; Maulana Awangga, Rolly

    2018-04-01

    Planning business transformation that involves new technology requires a transition and system migration planning process, and planning the system migration activity is the most important part. The migration process includes complex elements such as business re-engineering, transition scheme mapping, data transformation, application development, and individual involvement through computer and trial interaction. TOGAF ADM is a framework and method for enterprise architecture implementation, and it provides a manual for architecture and migration planning. The planning includes an implementation solution, in this case an IT solution, but once the solution becomes IT operational planning, TOGAF cannot handle it. This paper presents a new framework model that details the transition process by integrating TOGAF and ITIL. We evaluated our model in a field study at a private university.

  15. Computer aided diagnosis and treatment planning for developmental dysplasia of the hip

    NASA Astrophysics Data System (ADS)

    Li, Bin; Lu, Hongbing; Cai, Wenli; Li, Xiang; Meng, Jie; Liang, Zhengrong

    2005-04-01

    The developmental dysplasia of the hip (DDH) is a congenital malformation affecting the proximal femurs and acetabulum, which are subluxatable, dislocatable, and dislocated. Early diagnosis and treatment are important because failure to diagnose and improper treatment can result in significant morbidity. In this paper, we designed and implemented a computer-aided system for the diagnosis and treatment planning of this disease. Under this design, the patient first receives a CT (computed tomography) or MRI (magnetic resonance imaging) scan. A mixture-based partial-volume (PV) algorithm was applied to perform bone segmentation on the CT images, followed by three-dimensional (3D) reconstruction and display of the segmented image, demonstrating the spatial relationship between the acetabulum and femurs for visual judgment. Several standard procedures, such as the Salter procedure, the Pemberton procedure, and femoral shortening osteotomy, were simulated on the screen to rehearse a virtual treatment plan. Quantitative measurements of the Acetabular Index (AI) and Femoral Neck Anteversion (FNA) were performed on the 3D image for evaluation of DDH and treatment plans. The PC graphics-card GPU architecture was exploited to accelerate the 3D rendering and geometric manipulation. The prototype system was implemented in a PC/Windows environment and is currently under clinical trial on patient datasets.

  16. Automated Procurement System (APS): Project management plan (DS-03), version 1.2

    NASA Technical Reports Server (NTRS)

    Murphy, Diane R.

    1994-01-01

    The National Aeronautics and Space Administration (NASA) Marshall Space Flight Center (MSFC) is implementing an Automated Procurement System (APS) to streamline its business activities that are used to procure goods and services. This Project Management Plan (PMP) is the governing document throughout the implementation process and is identified as the APS Project Management Plan (DS-03). At this point in time, the project plan includes the schedules and tasks necessary to proceed through implementation. Since the basis of APS is an existing COTS system, the implementation process is revised from the standard SDLC. The purpose of the PMP is to provide the framework for the implementation process. It discusses the roles and responsibilities of the NASA project staff, the functions to be performed by the APS Development Contractor (PAI), and the support required of the NASA computer support contractor (CSC). To be successful, these three organizations must work together as a team, working towards the goals established in this Project Plan. The Project Plan includes a description of the proposed system, describes the work to be done, establishes a schedule of deliverables, and discusses the major standards and procedures to be followed.

  17. Stochastic Evolutionary Algorithms for Planning Robot Paths

    NASA Technical Reports Server (NTRS)

    Fink, Wolfgang; Aghazarian, Hrand; Huntsberger, Terrance; Terrile, Richard

    2006-01-01

    A computer program implements stochastic evolutionary algorithms for planning and optimizing collision-free paths for robots and their jointed limbs. Stochastic evolutionary algorithms can be made to produce acceptably close approximations to exact, optimal solutions for path-planning problems while often demanding much less computation than do exhaustive-search and deterministic inverse-kinematics algorithms that have been used previously for this purpose. Hence, the present software is better suited for application aboard robots having limited computing capabilities (see figure). The stochastic aspect lies in the use of simulated annealing to (1) prevent trapping of an optimization algorithm in local minima of an energy-like error measure by which the fitness of a trial solution is evaluated while (2) ensuring that the entire multidimensional configuration and parameter space of the path-planning problem is sampled efficiently with respect to both robot joint angles and computation time. Simulated annealing is an established technique for avoiding local minima in multidimensional optimization problems, but has not, until now, been applied to planning collision-free robot paths by use of low-power computers.
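
    The simulated-annealing idea described here can be sketched in a 2-D toy: waypoints between fixed endpoints are perturbed at random, a cost combining path length and a collision penalty is re-evaluated, and worse moves are occasionally accepted at high temperature. The obstacle, penalty weight, and cooling schedule below are illustrative choices, nothing like the flight software's jointed-limb configuration spaces.

```python
# Simulated-annealing sketch of collision-free path planning (2-D toy).
import math, random

OBSTACLE = ((5.0, 0.0), 2.0)   # circular obstacle: center, radius

def cost(path):
    c = 0.0
    for (x1, y1), (x2, y2) in zip(path, path[1:]):
        c += math.hypot(x2 - x1, y2 - y1)        # total path length
    (cx, cy), r = OBSTACLE
    for x, y in path:
        d = math.hypot(x - cx, y - cy)
        if d < r:                                 # penalty for entering obstacle
            c += 50.0 * (r - d)
    return c

def anneal(start, goal, n=8, iters=4000, seed=1):
    rng = random.Random(seed)
    # straight-line initial guess (runs through the obstacle)
    path = [(start[0] + (goal[0] - start[0]) * i / (n - 1),
             start[1] + (goal[1] - start[1]) * i / (n - 1)) for i in range(n)]
    best = cur = path
    best_c = cur_c = cost(path)
    temp = 5.0
    for _ in range(iters):
        cand = list(cur)
        i = rng.randrange(1, n - 1)               # endpoints stay fixed
        x, y = cand[i]
        cand[i] = (x + rng.gauss(0, 0.3), y + rng.gauss(0, 0.3))
        cand_c = cost(cand)
        # accept improvements always; worse moves with Boltzmann probability
        if cand_c < cur_c or rng.random() < math.exp((cur_c - cand_c) / temp):
            cur, cur_c = cand, cand_c
        if cur_c < best_c:
            best, best_c = cur, cur_c
        temp *= 0.999                             # geometric cooling
    return best, best_c
```

    The occasional acceptance of worse candidates is what lets the search escape local minima of the energy-like cost, the property the abstract emphasizes.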

  18. Fitting Computers into the Curriculum.

    ERIC Educational Resources Information Center

    Rodgers, Robert J.; And Others

    This paper provides strategies and insights that should be weighed and perhaps included in any proposal for integrating computers into a comprehensive school curriculum. The strategies include six basic stages: Initiation, Needs Assessment, Master Plan, Logistic-Specifics, Implementation, and Evaluation. The New Brunswick (New Jersey) Public…

  19. Fast CPU-based Monte Carlo simulation for radiotherapy dose calculation.

    PubMed

    Ziegenhein, Peter; Pirner, Sven; Ph Kamerling, Cornelis; Oelfke, Uwe

    2015-08-07

    Monte-Carlo (MC) simulations are considered to be the most accurate method for calculating dose distributions in radiotherapy. Its clinical application, however, is still limited by the long runtimes conventional implementations of MC algorithms require to deliver sufficiently accurate results on high resolution imaging data. In order to overcome this obstacle we developed the software package PhiMC, which is capable of computing precise dose distributions in a sub-minute time frame by leveraging the potential of modern many- and multi-core CPU-based computers. PhiMC is based on the well verified dose planning method (DPM). We could demonstrate that PhiMC delivers dose distributions which are in excellent agreement with DPM. The multi-core implementation of PhiMC scales well between different computer architectures and achieves a speed-up of up to 37× compared to the original DPM code executed on a modern system. Furthermore, we could show that our CPU-based implementation on a modern workstation is between 1.25× and 1.95× faster than a well-known GPU implementation of the same simulation method on a NVIDIA Tesla C2050. Since CPUs work with several hundred GB of RAM, the typical GPU memory limitation does not apply to our implementation and high resolution clinical plans can be calculated.
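
    The structural property a multi-core MC dose engine exploits is that particle histories are independent, so they can be split into chunks with separate random streams and the tallies merged. The sketch below is a deliberately minimal toy (exponential first-interaction depth in a homogeneous slab, with a made-up attenuation coefficient), nothing like the full DPM physics:

```python
# Minimal Monte Carlo "dose" sketch: independent history chunks with
# separate RNG streams, merged at the end (the structure a multi-core
# implementation parallelizes). Toy physics, illustrative only.
import random

MU = 0.2   # attenuation coefficient per cm (hypothetical)

def run_chunk(n, seed):
    rng = random.Random(seed)                          # per-chunk random stream
    depths = [rng.expovariate(MU) for _ in range(n)]   # first-interaction depths
    return sum(depths), n

def mean_interaction_depth(histories=20000, chunks=4):
    totals = [run_chunk(histories // chunks, seed) for seed in range(chunks)]
    s = sum(t for t, _ in totals)
    n = sum(c for _, c in totals)
    return s / n                                       # should approach 1/MU
```

    Because each chunk owns its RNG and its partial tally, the list comprehension over chunks could be handed to a thread or process pool without changing the merged result's statistics.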

  20. Software for Probabilistic Risk Reduction

    NASA Technical Reports Server (NTRS)

    Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto

    2004-01-01

    A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) that of probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and the software are able to accommodate both process knowledge (notably of the efficacy of development practices) and product knowledge (notably of the logical structure of a system, the development of which one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trades made among them. It becomes possible to optimize the planning process in the sense that it becomes possible to select the best suite of process steps and design choices to maximize the expectation of success while remaining within budget.
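
    The "select the best suite of process steps while remaining within budget" optimization can be sketched as a 0/1 knapsack over candidate mitigations. The action names, costs, and expected risk-reduction values below are invented for illustration; DDP's real model couples requirements, risks, and mitigations far more richly.

```python
# Knapsack sketch of budget-constrained mitigation selection (toy numbers).
def best_suite(actions, budget):
    # actions: list of (name, integer cost, expected risk reduction)
    # dp[b] = (best total reduction achievable with budget b, chosen names)
    dp = [(0.0, [])] * (budget + 1)
    for name, cost, gain in actions:
        new = list(dp)
        for b in range(cost, budget + 1):
            val, chosen = dp[b - cost]       # reuse pre-item table: 0/1 choice
            if val + gain > new[b][0]:
                new[b] = (val + gain, chosen + [name])
        dp = new
    return dp[budget]
```

    Replacing the additive `gain` with a proper probabilistic success model is where a tool like DDP departs from this toy, but the budget-constrained selection structure is the same.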

  1. Scrap computer recycling in Taiwan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, C.H.; Chang, S.L.; Wang, K.M.

    1999-07-01

    It is estimated that approximately 700,000 scrap personal computers will be generated each year in Taiwan. The disposal of such a huge amount of scrap computers presents a difficult task for the island due to the scarcity of landfills and incineration facilities available locally. Also, the hazardous materials contained in the scrap computers may cause serious pollution to the environment if they are not properly disposed of. Thus, the EPA of Taiwan declared scrap personal computers a producer-responsibility recycling product in July 1997, mandating that the manufacturers, importers and sellers of personal computers recover and recycle their scrap computers properly. Beginning on June 1, 1998, a scrap computer recycling plan was officially implemented on the island. Under this plan, consumers can deliver their unwanted personal computers to designated collection points to receive reward money. Currently, only six items are mandated to be recycled in this recycling plan. They are notebooks, monitors, and the hard disk, power supply, printed circuit board and shell of the main frame of the personal computer. This paper presents the current scrap computer recycling system in Taiwan.

  2. Examining Behavioral Consultation plus Computer-Based Implementation Planning on Teachers' Intervention Implementation in an Alternative School

    ERIC Educational Resources Information Center

    Long, Anna C. J.; Sanetti, Lisa M. Hagermoser; Lark, Catherine R.; Connolly, Jennifer J. G.

    2018-01-01

    Students who demonstrate the most challenging behaviors are at risk of school failure and are often placed in alternative schools, in which a primary goal is remediating behavioral and academic concerns to facilitate students' return to their community school. Consistently implemented evidence-based classroom management is necessary toward this…

  3. Design and implementation of a compliant robot with force feedback and strategy planning software

    NASA Technical Reports Server (NTRS)

    Premack, T.; Strempek, F. M.; Solis, L. A.; Brodd, S. S.; Cutler, E. P.; Purves, L. R.

    1984-01-01

    Force-feedback robotics techniques are being developed for automated precision assembly and servicing of NASA space flight equipment. Design and implementation of a prototype robot that provides compliance and monitors forces is in progress. Computer software to specify assembly steps and make force-feedback adjustments during assembly has been coded and tested for three generically different precision mating problems. A model program demonstrates that a suitably autonomous robot can plan its own strategy.

  4. Implementation of Audio Computer-Assisted Interviewing Software in HIV/AIDS Research

    PubMed Central

    Pluhar, Erika; Yeager, Katherine A.; Corkran, Carol; McCarty, Frances; Holstad, Marcia McDonnell; Denzmore-Nwagbara, Pamela; Fielder, Bridget; DiIorio, Colleen

    2007-01-01

    Computer assisted interviewing (CAI) has begun to play a more prominent role in HIV/AIDS prevention research. Despite the increased popularity of CAI, particularly audio computer assisted self-interviewing (ACASI), some research teams are still reluctant to implement ACASI technology due to lack of familiarity with the practical issues related to using these software packages. The purpose of this paper is to describe the implementation of one particular ACASI software package, the Questionnaire Development System™ (QDS™), in several nursing and HIV/AIDS prevention research settings. We present acceptability and satisfaction data from two large-scale public health studies in which we have used QDS with diverse populations. We also address issues related to developing and programming a questionnaire, discuss practical strategies related to planning for and implementing ACASI in the field, including selecting equipment, training staff, and collecting and transferring data, and summarize advantages and disadvantages of computer assisted research methods. PMID:17662924

  5. Cost-Benefit Analysis of the Officer Career Information and Planning System

    DTIC Science & Technology

    1980-08-01

    information without the use of a computer. ... COST-BENEFIT ANALYSIS OF THE OFFICER CAREER INFORMATION AND PLANNING SYSTEM CONTENTS Page INTRODUCTION ...OF THE OFFICER CAREER INFORMATION AND PLANNING SYSTEM INTRODUCTION The implementation of the Officer Personnel Management System (OPMS) has...Research Report 1256 / COST-BENEFIT ANALYSIS OF THE OFFICER CAREER INFORMATION AND PLANNING SYSTEM Roger A. Myers, Peter C. Cairo, Jon A

  6. 75 FR 65527 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-25

    ... Education Program Evaluation. OMB Number: 3145-0211. Expiration Date of Approval: March 31, 2013. Title of collection: Revitalizing Computing Pathways (CPATH) in Undergraduate Education Program Evaluation. Type of... organizations) to formulate and implement plans to revitalize undergraduate computing education in the United...

  7. The Nature-Computer Camp. Final Evaluation Report, 1984-1985. E.C.I.A. Chapter 2.

    ERIC Educational Resources Information Center

    District of Columbia Public Schools, Washington, DC. Div. of Quality Assurance.

    This report presents a description and evaluation of the Nature-Computer Camp (NCC), an environmental and computer science program designed for sixth grade students in the District of Columbia public schools. Inputs, processes and outcomes based on a Planning, Monitoring and Implementing (PMI) Evaluation Model are reviewed for each of the four…

  8. Nature-Computer Camp. Final Evaluation Report. E.C.I.A. Chapter 2.

    ERIC Educational Resources Information Center

    District of Columbia Public Schools, Washington, DC. Div. of Quality Assurance.

    This report presents a description and evaluation of the Nature-Computer Camp (NCC), an environmental and computer science program designed for sixth grade students in the District of Columbia public schools. Inputs, processes and outcomes based on a Planning, Monitoring and Implementing (PMI) Evaluation Model are reviewed for each of the four…

  9. Status of the CP-PACS Project

    NASA Astrophysics Data System (ADS)

    Iwasaki, Y.

    1997-02-01

    The CP-PACS computer, with a peak speed of 300 Gflops, was completed in March 1996 and has started to operate. We describe the final specification and hardware implementation of the CP-PACS computer, and its performance on QCD codes. A plan for the upgrade of the computer, scheduled for the fall of 1996, is also given.

  10. Plans for wind energy system simulation

    NASA Technical Reports Server (NTRS)

    Dreier, M. E.

    1978-01-01

    A digital computer code and a special-purpose hybrid computer were introduced. The digital computer program, the Root Perturbation Method (RPM), is an implementation of the classic Floquet procedure that circumvents numerical problems associated with the extraction of Floquet roots. The hybrid computer, the Wind Energy System Time-domain simulator (WEST), yields real-time loads and deformation information essential to design and system stability investigations.

  11. An empirical evaluation of computerized tools to aid in enroute flight planning

    NASA Technical Reports Server (NTRS)

    Smith, Philip J.; Mccoy, C. Elaine; Layton, Charles

    1993-01-01

    The paper describes an experiment using the Flight Planning Testbed (FPT) in which 27 airline dispatchers were studied. Five general questions were addressed: Under what circumstances does the introduction of computer-generated suggestions (flight plans) influence the planning behavior of dispatchers? What is the nature of such influences? How beneficial are the general design concepts underlying FPT? How effective are the specific implementation decisions made in realizing these general design concepts? How effectively do dispatchers evaluate situations requiring replanning, and how effectively do they identify appropriate solutions to these situations? The study leaves little doubt that the introduction of computer-generated suggestions for solving a flight planning problem can have a marked impact on the cognitive processes of the user and on the ultimate plan selected.

  12. Implementing a Help Desk at a Small Liberal Arts College.

    ERIC Educational Resources Information Center

    Actis, Bev

    1993-01-01

    Planning for a computer use "help desk" at Kenyon College (Ohio) was constrained by very limited resources. However, careful and thorough planning resulted in a low-budget, homegrown, but highly effective facility. Staffing, training, staff communication, and marketing the service were essential elements in its success. (MSE)

  13. Designing Your Computer Curriculum: A Process Approach.

    ERIC Educational Resources Information Center

    Wepner, Shelley; Kramer, Steven

    Four essential steps for integrating computer technology into a school district's reading curriculum--needs assessment, planning, implementation, and evaluation--are described in terms of what educators can do at the district and building level to facilitate optimal instructional conditions for students. With regard to needs assessment,…

  14. Current implementation and future plans on new code architecture, programming language and user interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brun, B.

    1997-07-01

    Computer technology has improved tremendously in recent years, with larger storage capacity, more memory, and more computational power. Visual computing, with high-performance graphic interfaces and desktop computational power, has changed the way engineers accomplish everyday tasks, development work, and safety-study analyses. The emergence of parallel computing will permit simulation over a larger domain. In addition, new development methods, languages, and tools have appeared in the last several years.

  15. Research and applications: Artificial intelligence

    NASA Technical Reports Server (NTRS)

    Raphael, B.; Duda, R. O.; Fikes, R. E.; Hart, P. E.; Nilsson, N. J.; Thorndyke, P. W.; Wilber, B. M.

    1971-01-01

    Research in the field of artificial intelligence is discussed. The focus of recent work has been the design, implementation, and integration of a completely new system for the control of a robot that plans, learns, and carries out tasks autonomously in a real laboratory environment. The computer implementation of low-level and intermediate-level actions; routines for automated vision; and the planning, generalization, and execution mechanisms are reported. A scenario that demonstrates the approximate capabilities of the current version of the entire robot system is presented.

  16. Instructional Transformation: How Do Teaching and Learning Change with a One-to-One Laptop Implementation?

    ERIC Educational Resources Information Center

    Branch, John C., IV.

    2014-01-01

    A one-to-one computing initiative was implemented at a high school in rural southern West Virginia in 2011. The program was implemented with limited time for planning and professional development. Teachers had to anticipate the implications the innovation had for instructional transformation and the impact the laptops would have on the learners. The…

  17. Introducing students to ocean modeling via a web-based implementation for the Regional Ocean Modeling System (ROMS) river plume case study

    NASA Astrophysics Data System (ADS)

    Harris, C. K.; Overeem, I.; Hutton, E.; Moriarty, J.; Wiberg, P.

    2016-12-01

    Numerical models are increasingly used for both research and applied sciences, and it is important that we train students to run models and analyze model data. This is especially true within oceanographic sciences, many of which use hydrodynamic models to address oceanographic transport problems. These models, however, often require a fair amount of training and computer skills before a student can run the models and analyze the large data sets produced by the models. One example is the Regional Ocean Modeling System (ROMS), an open source, three-dimensional primitive equation hydrodynamic ocean model that uses a structured curvilinear horizontal grid. It currently has thousands of users worldwide, and the full model includes modules for sediment transport and biogeochemistry, and several options for turbulence closures and numerical schemes. Implementing ROMS can be challenging to students, however, in part because the code was designed to provide flexibility for the choice of model parameterizations and processes, and to run on a variety of High Performance Computing (HPC) platforms. To provide a more accessible tool for classroom use, we have modified an existing idealized ROMS implementation to be run on a High Performance Computer (HPC) via the WMT (Web Modeling Toolkit), and developed a series of lesson plans that explore sediment transport within the idealized model domain. This has addressed our goal to provide a relatively easy introduction to the numerical modeling process that can be used within upper level undergraduate and graduate classes to explore sediment transport on continental shelves. The model implementation includes wave forcing, along-shelf currents, a riverine source, and suspended sediment transport. The model calculates suspended transport and deposition of sediment delivered to the continental shelf by a riverine flood. Lesson plans lead the students through running the model on a remote HPC, modifying the standard model. 
The lesson plans also include instruction for visualizing the model output within Matlab and Panoply. The lesson plans have been used within graduate and undergraduate classrooms, as well as in clinics aimed at educators. Feedback from these exercises has been used to improve the lesson plans and model implementation.

  18. Predicting Cloud Computing Technology Adoption by Organizations: An Empirical Integration of Technology Acceptance Model and Theory of Planned Behavior

    ERIC Educational Resources Information Center

    Ekufu, ThankGod K.

    2012-01-01

    Organizations are finding it difficult in today's economy to implement the vast information technology infrastructure required to effectively conduct their business operations. Despite the fact that some of these organizations are leveraging on the computational powers and the cost-saving benefits of computing on the Internet cloud, others…

  19. Administrative Computing in the USA and The Netherlands: Implications for Other Countries.

    ERIC Educational Resources Information Center

    Bluhm, Harry P.

    1990-01-01

    Examines the planning, policy, and organizational approaches taken by the United States and the Netherlands to use the computer as an administrative tool. Discusses applications in these countries to manage school finances, personnel data, administrative offices, plant operations, support services, and student data and implementation suggestions.…

  20. Standard practices for the implementation of computer software

    NASA Technical Reports Server (NTRS)

    Irvine, A. P. (Editor)

    1978-01-01

    A standard approach to the development of computer programs is provided that covers the life cycle of software development from the planning and requirements phase through the software acceptance testing phase. All documents necessary to provide the required visibility into the software life cycle process are discussed in detail.

  1. Computer Integrated Manufacturing. Florida Vocational Program Guide.

    ERIC Educational Resources Information Center

    University of South Florida, Tampa. Dept. of Adult and Vocational Education.

    This packet contains a program guide and Career Merit Achievement Plan (Career MAP) for the implementation of a computer-integrated manufacturing program in Florida secondary and postsecondary schools. The program guide describes the program content and structure, provides a program description, lists job titles under the program, and includes a…

  2. Computer Engineering Technology. Florida Vocational Program Guide.

    ERIC Educational Resources Information Center

    University of South Florida, Tampa. Dept. of Adult and Vocational Education.

    This packet contains a program guide and Career Merit Achievement Plan (Career MAP) for the implementation of a computer engineering technology program in Florida secondary and postsecondary schools. The program guide describes the program content and structure, provides a program description, lists job titles under the program, and includes a…

  3. 32 CFR Appendix J to Part 154 - ADP Position Categories and Criteria for Designating Positions

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., and implementation of a computer security program; major responsibility for the direction, planning... agency computer security programs, and also including direction and control of risk analysis and/or... OF DEFENSE SECURITY DEPARTMENT OF DEFENSE PERSONNEL SECURITY PROGRAM REGULATION Pt. 154, App. J...

  4. 32 CFR Appendix J to Part 154 - ADP Position Categories and Criteria for Designating Positions

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., and implementation of a computer security program; major responsibility for the direction, planning... agency computer security programs, and also including direction and control of risk analysis and/or... OF DEFENSE SECURITY DEPARTMENT OF DEFENSE PERSONNEL SECURITY PROGRAM REGULATION Pt. 154, App. J...

  5. 32 CFR Appendix J to Part 154 - ADP Position Categories and Criteria for Designating Positions

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., and implementation of a computer security program; major responsibility for the direction, planning... agency computer security programs, and also including direction and control of risk analysis and/or... OF DEFENSE SECURITY DEPARTMENT OF DEFENSE PERSONNEL SECURITY PROGRAM REGULATION Pt. 154, App. J...

  6. 32 CFR Appendix J to Part 154 - ADP Position Categories and Criteria for Designating Positions

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., and implementation of a computer security program; major responsibility for the direction, planning... agency computer security programs, and also including direction and control of risk analysis and/or... OF DEFENSE SECURITY DEPARTMENT OF DEFENSE PERSONNEL SECURITY PROGRAM REGULATION Pt. 154, App. J...

  7. 32 CFR Appendix J to Part 154 - ADP Position Categories and Criteria for Designating Positions

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., and implementation of a computer security program; major responsibility for the direction, planning... agency computer security programs, and also including direction and control of risk analysis and/or... OF DEFENSE SECURITY DEPARTMENT OF DEFENSE PERSONNEL SECURITY PROGRAM REGULATION Pt. 154, App. J...

  8. Implementing an ROI Measurement Process at Dell Computer.

    ERIC Educational Resources Information Center

    Tesoro, Ferdinand

    1998-01-01

    This return-on-investment (ROI) evaluation study determined the business impact of the sales negotiation training course to Dell Computer Corporation. A five-step ROI measurement process was used: Plan-Develop-Analyze-Communicate-Leverage. The corporate sales information database was used to compare pre- and post-training metrics for both training…

  9. Advanced Free Flight Planner and Dispatcher's Workstation: Preliminary Design Specification

    NASA Technical Reports Server (NTRS)

    Wilson, J.; Wright, C.; Couluris, G. J.

    1997-01-01

    The National Aeronautics and Space Administration (NASA) has implemented the Advanced Air Transportation Technology (AATT) program to investigate future improvements to the national and international air traffic management systems. This research, as part of the AATT program, developed preliminary design requirements for an advanced Airline Operations Control (AOC) dispatcher's workstation, with emphasis on flight planning. This design will support the implementation of an experimental workstation in NASA laboratories that would emulate AOC dispatch operations. The work developed an airline flight plan database and specified requirements for: a computer tool for generation and evaluation of free-flight, user-preferred trajectories (UPT); the kernel of an advanced flight planning system to be incorporated into the UPT-generation tool; and an AOC workstation to house the UPT-generation tool and to provide a real-time testing environment. A prototype for the advanced flight plan optimization kernel was developed and demonstrated. The flight planner uses dynamic programming to search a four-dimensional wind and temperature grid to identify the optimal route, altitude, and speed for successive segments of a flight. An iterative process is employed in which a series of trajectories are successively refined until the UPT is identified. The flight planner is designed to function in the current operational environment as well as in free flight. The free flight environment would enable greater flexibility in UPT selection based on alleviation of current procedural constraints. The prototype also takes advantage of advanced computer processing capabilities to implement more powerful optimization routines than would be possible with older computer systems.
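    The dynamic-programming idea described in the abstract can be sketched in simplified form: for each successive along-route segment, keep the cheapest cost of reaching each altitude level, subject to a climb/descent limit, then backtrack the optimal profile. This is an illustrative toy, not NASA's kernel; the cost function stands in for the wind and temperature grid, and the climb-rate limit is an assumption of the sketch.

    ```python
    def plan_profile(cost, n_segments, n_levels, max_step=1):
        """Minimize total cost over an altitude profile.

        cost(seg, level) stands in for fuel burn given winds/temperature on
        that segment; the aircraft may climb or descend at most `max_step`
        levels between consecutive segments."""
        # best[a] = minimum cost to finish the current segment at level a
        best = [cost(0, a) for a in range(n_levels)]
        back = []  # back[seg-1][a] = best predecessor level for segment seg
        for seg in range(1, n_segments):
            prev, best, bp = best, [], []
            for a in range(n_levels):
                lo, hi = max(0, a - max_step), min(n_levels, a + max_step + 1)
                p = min(range(lo, hi), key=prev.__getitem__)  # cheapest reachable predecessor
                best.append(prev[p] + cost(seg, a))
                bp.append(p)
            back.append(bp)
        # backtrack from the cheapest final level to recover the profile
        a = min(range(n_levels), key=best.__getitem__)
        profile = [a]
        for bp in reversed(back):
            a = bp[a]
            profile.append(a)
        return min(best), profile[::-1]
    ```

    The real planner searches route and speed as well as altitude, but the recurrence is the same: each stage's optimum is built only from the previous stage's optima, which is what makes the four-dimensional grid search tractable.
    
    
    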

  10. A Computer Simulation Modeling Tool to Assist Colleges in Long-Range Planning. Final Report.

    ERIC Educational Resources Information Center

    Salmon, Richard; And Others

    Long-range planning involves the establishment of educational objectives within a rational philosophy, the design of activities and programs to meet stated objectives, the organization and allocation of resources to implement programs, and the analysis of results in terms of the objectives. Current trends of educational growth and complexity…

  11. WastePlan model implementation for New York State. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Visalli, J.R.; Blackman, D.A.

    1995-07-01

    WastePlan is a computer software tool that models solid waste quantities, costs, and other parameters on a regional basis. The software was developed by the Tellus Institute, a nonprofit research and consulting firm. The project's objective was to provide local solid waste management planners in New York State, who are responsible for developing and implementing the comprehensive solid waste management plans authorized by the Solid Waste Management Act of 1988, with a WastePlan model specifically tailored to the demographic and other characteristics of New York State, and to provide training and technical support to the users. Two-day workshops were held in 1992 to introduce planners to the existing versions; subsequently, extensive changes were made to the model, and a second set of two-day workshops was held in 1993 to introduce planners to the enhanced version of WastePlan. Following user evaluations, WastePlan was further modified to allow users to model systems using a simplified version and to incorporate report forms required by New York State. A post-project survey of trainees revealed limited regular use of the software. Possible reasons include lack of synchronicity with the NYSDEC planning process; lack of computer literacy and aptitude among trainees; hardware limitations; software user-friendliness; and the work environment of the trainees. A number of recommendations are made to encourage use of WastePlan by local solid waste management planners.

  12. Internet-based computer technology on radiotherapy.

    PubMed

    Chow, James C L

    2017-01-01

    Recent rapid development of Internet-based computer technologies has made possible many novel applications in radiation dose delivery. However, translation of these new technologies into radiotherapy can hardly keep pace, due to the complex commissioning process and quality assurance protocols. Implementing novel Internet-based technology in radiotherapy requires corresponding design of the application's algorithms and infrastructure, setting up of related clinical policies, purchase and development of software and hardware, computer programming and debugging, and national-to-international collaboration. Although such implementation processes are time consuming, some recent computer advancements in radiation dose delivery are still noticeable. In this review, we present the background and concepts of some recent Internet-based computer technologies, such as cloud computing, big data processing, and machine learning, followed by their potential applications in radiotherapy, such as treatment planning and dose delivery. We also discuss the current progress of these applications and their impacts on radiotherapy, and explore and evaluate the expected benefits and challenges in implementation.

  13. A web-based computer aided system for liver surgery planning: initial implementation on RayPlus

    NASA Astrophysics Data System (ADS)

    Luo, Ming; Yuan, Rong; Sun, Zhi; Li, Tianhong; Xie, Qingguo

    2016-03-01

    At present, computer-aided systems for liver surgery design and risk evaluation are widely used in clinical practice all over the world. However, most are local applications that run on high-performance workstations, and the images have to be processed offline. Compared with local applications, a web-based system is accessible anywhere and on a range of devices, regardless of processing power or operating system. RayPlus (http://rayplus.life.hust.edu.cn), a B/S platform for medical image processing, was developed to give a jump start on web-based medical image processing. In this paper, we implement a computer-aided system for liver surgery planning on the architecture of RayPlus. The system consists of a series of processing steps applied to CT images, including filtering, segmentation, visualization, and analysis. Each step is packaged into an executable program and runs on the server side. CT images in DICOM format are processed step by step, supporting interactive modeling in the browser with zero installation and server-side computing. The system supports users in semi-automatically segmenting the liver, intrahepatic vessels, and tumor from the pre-processed images. Then, surface and volume models are built to analyze the vessel structure and the relative position of adjacent organs. The results show that the initial implementation meets its first-order objectives satisfactorily and provides an accurate 3D delineation of the liver anatomy. Vessel labeling and resection simulation are planned as future additions. The system is available on the Internet at the link mentioned above, and an open username is offered for testing.

  14. Facilitating Ambulatory Electronic Health Record System Implementation: Evidence from a Qualitative Study

    PubMed Central

    Hefner, Jennifer; Robbins, Julie; Huerta, Timothy R.

    2013-01-01

    Background. Ambulatory care practices have increasing interest in leveraging the capabilities of electronic health record (EHR) systems, but little information is available documenting how organizations have successfully implemented these systems. Objective. To characterize elements of successful electronic health record (EHR) system implementation and to synthesize the key informants' perspectives about successful implementation practices. Methods. Key informant interviews and focus groups were conducted with a purposive sample of individuals from US healthcare organizations identified for their success with ambulatory EHR implementation. Rigorous qualitative data analyses used both deductive and inductive methods. Results. Participants identified personal and system-related barriers, at both the individual and organization levels, including poor computer skills, productivity losses, resistance to change, and EHR system failure. Implementation success was reportedly facilitated by careful planning and consistent communication throughout distinct stages of the implementation process. A significant element of successful implementation was an emphasis on optimization, both during “go-live” and, subsequently, when users had more experience with the system. Conclusion. Successful EHR implementation requires both detailed planning and clear mechanisms to deal with unforeseen or unintended consequences. Focusing on user buy-in early and including plans for optimization can facilitate greater success. PMID:24228257

  15. "Discover New Worlds with Technology". Proceedings of the Annual College and University Computer Users Conference (37th, Miami, Florida, May 3-6, 1992).

    ERIC Educational Resources Information Center

    Miami-Dade Community Coll., FL.

    This book contains 37 papers on computer use in higher education originally presented at a May, 1992, conference of college and university computer users. Most of the papers describe programs or systems implemented at particular institutions and cover the following: systems for career planning, automating purchasing and financial commitments,…

  16. Implementation of audio computer-assisted interviewing software in HIV/AIDS research.

    PubMed

    Pluhar, Erika; McDonnell Holstad, Marcia; Yeager, Katherine A; Denzmore-Nwagbara, Pamela; Corkran, Carol; Fielder, Bridget; McCarty, Frances; Diiorio, Colleen

    2007-01-01

    Computer-assisted interviewing (CAI) has begun to play a more prominent role in HIV/AIDS prevention research. Despite the increased popularity of CAI, particularly audio computer-assisted self-interviewing (ACASI), some research teams are still reluctant to implement ACASI technology because of lack of familiarity with the practical issues related to using these software packages. The purpose of this report is to describe the implementation of one particular ACASI software package, the Questionnaire Development System (QDS; Nova Research Company, Bethesda, MD), in several nursing and HIV/AIDS prevention research settings. The authors present acceptability and satisfaction data from two large-scale public health studies in which they have used QDS with diverse populations. They also address issues related to developing and programming a questionnaire; discuss practical strategies related to planning for and implementing ACASI in the field, including selecting equipment, training staff, and collecting and transferring data; and summarize advantages and disadvantages of computer-assisted research methods.

  17. A Study of the Efficacy of Project-Based Learning Integrated with Computer-Based Simulation--STELLA

    ERIC Educational Resources Information Center

    Eskrootchi, Rogheyeh; Oskrochi, G. Reza

    2010-01-01

    Incorporating computer-simulation modelling into project-based learning may be effective but requires careful planning and implementation. Teachers, especially, need pedagogical content knowledge which refers to knowledge about how students learn from materials infused with technology. This study suggests that students learn best by actively…

  18. Planning for Computer-Based Distance Education: A Review of Administrative Issues.

    ERIC Educational Resources Information Center

    Lever-Duffy, Judy C.

    The Homestead Campus of Miami-Dade Community College, in Florida, serves a sparsely populated area with a culturally diverse population including migrant farm workers, prison inmates, and U.S. Air Force personnel. To increase access to college services, the campus focused on implementing a computer-based distance education program as its primary…

  19. RE-PLAN: An Extensible Software Architecture to Facilitate Disaster Response Planning

    PubMed Central

    O’Neill, Martin; Mikler, Armin R.; Indrakanti, Saratchandra; Tiwari, Chetan; Jimenez, Tamara

    2014-01-01

    Computational tools are needed to make data-driven disaster mitigation planning accessible to planners and policymakers without the need for programming or GIS expertise. To address this problem, we have created modules to facilitate quantitative analyses pertinent to a variety of different disaster scenarios. These modules, which comprise the REsponse PLan ANalyzer (RE-PLAN) framework, may be used to create tools for specific disaster scenarios that allow planners to harness large amounts of disparate data and execute computational models through a point-and-click interface. Bio-E, a user-friendly tool built using this framework, was designed to develop and analyze the feasibility of ad hoc clinics for treating populations following a biological emergency event. In this article, the design and implementation of the RE-PLAN framework are described, and the functionality of the modules used in the Bio-E biological emergency mitigation tool are demonstrated. PMID:25419503

  20. Managing Information Technology as a Catalyst of Change. Track V: Optimizing the Infrastructure.

    ERIC Educational Resources Information Center

    CAUSE, Boulder, CO.

    This track of the 1993 CAUSE Conference presents eight papers on developments in computer network infrastructure and the challenges for those who plan for, implement, and manage it in colleges and universities. Papers include: (1) "Where Do We Go from Here: Summative Assessment of a Five-Year Strategic Plan for Linking and Integrating…

  1. Computer-aided resource planning and scheduling for radiological services

    NASA Astrophysics Data System (ADS)

    Garcia, Hong-Mei C.; Yun, David Y.; Ge, Yiqun; Khan, Javed I.

    1996-05-01

    There exists tremendous opportunity in hospital-wide resource optimization based on system integration. This paper defines the resource planning and scheduling requirements integral to PACS, RIS and HIS integration. A multi-site case study is conducted to define the requirements. A well-tested planning and scheduling methodology, called the Constrained Resource Planning model, has been applied to the chosen problem of radiological service optimization. This investigation focuses on resource optimization issues for minimizing the turnaround time to increase clinical efficiency and customer satisfaction, particularly in cases where the scheduling of multiple exams is required for a patient. How best to combine information-system efficiency and human intelligence to improve radiological services is described. Finally, an architecture for interfacing a computer-aided resource planning and scheduling tool with the existing PACS, HIS and RIS implementation is presented.
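
    The abstract does not reproduce the Constrained Resource Planning algorithm itself. As a rough, hedged illustration of the scheduling subproblem it targets (booking a patient's multiple exams with minimal turnaround), here is a minimal earliest-fit greedy sketch in Python; the resource names, slot representation, and greedy rule are all assumptions for the example, not the published model:

```python
def schedule_exams(exams, availability):
    """Greedy sketch: schedule a patient's exams one after another,
    placing each exam in the earliest slot of its required resource
    that can hold it at or after the previous exam's end.

    exams: list of (resource, duration)
    availability: resource -> sorted list of (slot_start, slot_end)
    Returns a list of (resource, start, end) bookings."""
    t = 0
    plan = []
    for resource, duration in exams:
        for start, end in availability[resource]:
            begin = max(t, start)       # cannot start before prior exam ends
            if begin + duration <= end:
                plan.append((resource, begin, begin + duration))
                t = begin + duration
                break
        else:
            raise ValueError(f"no slot available for {resource}")
    return plan
```

    A real scheduler must also handle contention between patients and resource setup constraints, which this sketch ignores.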

  2. Airline flight planning - The weather connection

    NASA Technical Reports Server (NTRS)

    Steinberg, R.

    1981-01-01

    The history of airline flight planning is briefly reviewed. Over half a century ago, when scheduled airline services began, weather data were almost nonexistent. By the early 1950's a reliable synoptic network provided upper air reports. The next 15 years saw a rapid growth in commercial aviation, and airlines introduced computer techniques to flight planning. The 1970's saw the development of weather satellites. The current state of flight planning activities is analyzed. It is found that accurate flight planning will require meteorological information on a finer scale than can be provided by a synoptic forecast. Opportunities for a new approach are examined, giving attention to the available options, a mesoscale numerical weather prediction model, limited area fine mesh models, man-computer interactive display systems, the use of interactive techniques with the present upper air data base, and the implementation of interactive techniques.

  3. Computer Programs For Automated Welding System

    NASA Technical Reports Server (NTRS)

    Agapakis, John E.

    1993-01-01

    Computer programs developed for use in controlling automated welding system described in MFS-28578. Together with control computer, computer input and output devices and control sensors and actuators, provide flexible capability for planning and implementation of schemes for automated welding of specific workpieces. Developed according to macro- and task-level programming schemes, which increases productivity and consistency by reducing amount of "teaching" of system by technician. System provides for three-dimensional mathematical modeling of workpieces, work cells, robots, and positioners.

  4. Long Range Plan for Embedded Computer Systems Support. Volume II

    DTIC Science & Technology

    1981-10-01

    interface (pilot displays and controls plus visual system), and data collection (CMAC data, bus data and simulation data). Non-real time functions include...unless adequate upfront planning is implemented, the command will be controlled by the dynamics rather than controlling them. The upfront planning should...or should they be called manually? What amount and type of data should the various tools pass between each other? Under what conditions and controls

  5. Contact Graph Routing

    NASA Technical Reports Server (NTRS)

    Burleigh, Scott C.

    2011-01-01

    Contact Graph Routing (CGR) is a dynamic routing system that computes routes through a time-varying topology of scheduled communication contacts in a network based on the DTN (Delay-Tolerant Networking) architecture. It is designed to enable dynamic selection of data transmission routes in a space network based on DTN. This dynamic responsiveness in route computation should be significantly more effective and less expensive than static routing, increasing total data return while at the same time reducing mission operations cost and risk. The basic strategy of CGR is to take advantage of the fact that, since flight mission communication operations are planned in detail, the communication routes between any pair of bundle agents in a population of nodes that have all been informed of one another's plans can be inferred from those plans rather than discovered via dialogue (which is impractical over long one-way-light-time space links). Messages that convey this planning information are used to construct contact graphs (time-varying models of network connectivity) from which CGR automatically computes efficient routes for bundles. Automatic route selection increases the flexibility and resilience of the space network, simplifying cross-support and reducing mission management costs. Note that there are no routing tables in Contact Graph Routing. The best route for a bundle destined for a given node may routinely be different from the best route for a different bundle destined for the same node, depending on bundle priority, bundle expiration time, and changes in the current lengths of transmission queues for neighboring nodes; routes must be computed individually for each bundle, from the Bundle Protocol agent's current network connectivity model for the bundle's destination node (the contact graph). Clearly this places a premium on optimizing the implementation of the route computation algorithm.
The scalability of CGR to very large networks remains a research topic. The information carried by CGR contact plan messages is useful not only for dynamic route computation, but also for the implementation of rate control, congestion forecasting, transmission episode initiation and termination, timeout interval computation, and retransmission timer suspension and resumption.
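
    As a conceptual sketch only (real CGR also models transmission rates, queueing delays, and bundle expiry, none of which appear here), the core idea of an earliest-arrival search over a contact plan rather than a routing table can be illustrated in a few lines of Python; the contact tuple format is an assumption for the example:

```python
import heapq

def earliest_arrival_route(contacts, source, dest, t0):
    """Dijkstra-like search over a contact plan. Each contact is
    (from_node, to_node, window_start, window_end). Returns
    (arrival_time, path) for the earliest possible delivery,
    or None if the destination is unreachable."""
    best = {source: t0}             # best known arrival time per node
    heap = [(t0, source, [source])]
    while heap:
        t, node, path = heapq.heappop(heap)
        if node == dest:
            return t, path
        if t > best.get(node, float("inf")):
            continue                # stale queue entry
        for u, v, start, end in contacts:
            if u != node or end < t:
                continue            # contact not from here, or already over
            arrival = max(t, start) # wait for the contact window to open
            if arrival < best.get(v, float("inf")):
                best[v] = arrival
                heapq.heappush(heap, (arrival, v, path + [v]))
    return None
```

    With a direct but late contact and an earlier two-hop relay in the plan, the search prefers the relay, mirroring CGR's per-bundle route computation from the contact graph.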

  6. Patient-Specific Modeling of Hemodynamics: Supporting Surgical Planning in a Fontan Circulation Correction.

    PubMed

    van Bakel, Theodorus M J; Lau, Kevin D; Hirsch-Romano, Jennifer; Trimarchi, Santi; Dorfman, Adam L; Figueroa, C Alberto

    2018-04-01

    Computational fluid dynamics (CFD) is a modeling technique that enables calculation of the behavior of fluid flows in complex geometries. In cardiovascular medicine, CFD methods are being used to calculate patient-specific hemodynamics for a variety of applications, such as disease research, noninvasive diagnostics, medical device evaluation, and surgical planning. This paper provides a concise overview of the methods to perform patient-specific computational analyses using clinical data, followed by a case study where CFD-supported surgical planning is presented in a patient with Fontan circulation complicated by unilateral pulmonary arteriovenous malformations. In closing, the challenges for implementation and adoption of CFD modeling in clinical practice are discussed.

  7. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1: ASC software quality engineering practices, Version 2.0.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines. This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  8. Implementation and Testing of VLBI Software Correlation at the USNO

    NASA Technical Reports Server (NTRS)

    Fey, Alan; Ojha, Roopesh; Boboltz, Dave; Geiger, Nicole; Kingham, Kerry; Hall, David; Gaume, Ralph; Johnston, Ken

    2010-01-01

    The Washington Correlator (WACO) at the U.S. Naval Observatory (USNO) is a dedicated VLBI processor built on custom ASIC hardware. The WACO is currently over 10 years old and is nearing the end of its expected lifetime. Plans for implementation and testing of software correlation at the USNO are currently being considered. The VLBI correlation process is, by its very nature, well suited to a parallelized computing environment. Commercial off-the-shelf computer hardware has advanced in processing power to the point where software correlation is now both economically and technologically feasible. The advantages of software correlation are manifold but include flexibility, scalability, and easy adaptability to changing environments and requirements. We discuss our experience with and plans for use of software correlation at USNO with emphasis on the use of the DiFX software correlator.

  9. Conjugate-Gradient Algorithms For Dynamics Of Manipulators

    NASA Technical Reports Server (NTRS)

    Fijany, Amir; Scheid, Robert E.

    1993-01-01

    Algorithms for serial and parallel computation of forward dynamics of multiple-link robotic manipulators by the conjugate-gradient method were developed. Parallel algorithms have potential for speedup of computations on multiple linked, specialized processors implemented in very-large-scale integrated circuits. Such processors could be used to simulate dynamics, possibly faster than in real time, for purposes of planning and control.
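
    The abstract does not reproduce the algorithms, but the conjugate-gradient method at their core is standard. A minimal dense-matrix version, solving the symmetric positive-definite linear system A x = b that arises in forward dynamics (mass matrix times joint accelerations equals generalized forces), might look like the following sketch; the example matrix is invented:

```python
def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
    """Solve A x = b for symmetric positive-definite A (a list of
    row lists) with the classic conjugate-gradient iteration."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                               # residual b - A x (x = 0 initially)
    p = r[:]                               # first search direction
    rs_old = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs_old / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:                   # residual small enough: done
            break
        p = [r[i] + (rs_new / rs_old) * p[i] for i in range(n)]
        rs_old = rs_new
    return x
```

    In exact arithmetic the iteration converges in at most n steps, which is one reason it parallelizes well across link-structured processors.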

  10. [Computerization and robotics in medical practice].

    PubMed

    Dervaderics, J

    1997-10-26

    The article outlines the principles used in computing, including non-electrical and analog computers and artificial intelligence, and cites examples of each. The principles and medical uses of virtual reality are also covered. Topics discussed include surgical planning, image-guided surgery, robotic surgery, telepresence and telesurgery, and telemedicine implemented partly via the Internet.

  11. 40 CFR 96.207 - Computation of time.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... BUDGET TRADING PROGRAM AND CAIR NOX AND SO2 TRADING PROGRAMS FOR STATE IMPLEMENTATION PLANS CAIR SO2... period scheduled, under the CAIR SO2 Trading Program, to begin on the occurrence of an act or event shall... the CAIR SO2 Trading Program, to begin before the occurrence of an act or event shall be computed so...

  12. Critiquing: A Different Approach to Expert Computer Advice in Medicine

    PubMed Central

    Miller, Perry L.

    1984-01-01

    The traditional approach to computer-based advice in medicine has been to design systems which simulate a physician's decision process. This paper describes a different approach to computer advice in medicine: a critiquing approach. A critiquing system first asks how the physician is planning to manage his patient and then critiques that plan, discussing the advantages and disadvantages of the proposed approach, compared to other approaches which might be reasonable or preferred. Several critiquing systems are currently in different stages of implementation. The paper describes these systems and discusses the characteristics which make each domain suitable for critiquing. The critiquing approach may prove especially well-suited in domains where decisions involve a great deal of subjective judgement.

  13. New World Vistas: New Models of Computation Lattice Based Quantum Computation

    DTIC Science & Technology

    1996-07-25

    Eniac (18,000 vacuum tubes) UNIVAC II (core memory) magnetostrictive delay line Intel 1103 integrated circuit IBM 3340 disk...in areal size of a bit for the last fifty years since the 1946 Eniac computer. Planned Research: I propose to consider the feasibility of implement...technology. Figure 1 is a log-linear plot of data for the areal size of a bit over the last fifty years (from 18,000 bits in the 1946 Eniac computer

  14. Multi-community command and control systems in law enforcement: An introductory planning guide

    NASA Technical Reports Server (NTRS)

    Sohn, R. L.; Garcia, E. A.; Kennedy, R. D.

    1976-01-01

    A set of planning guidelines for multi-community command and control systems in law enforcement is presented. Essential characteristics and applications of these systems are outlined. Requirements analysis, system concept design, implementation planning, and performance and cost modeling are described and demonstrated with numerous examples. Program management techniques and joint powers agreements for multicommunity programs are discussed in detail. A description of a typical multi-community computer-aided dispatch system is appended.

  15. 3D treatment planning systems.

    PubMed

    Saw, Cheng B; Li, Sicong

    2018-01-01

    Three-dimensional (3D) treatment planning systems have evolved and become crucial components of modern radiation therapy. The systems are computer-aided designing or planning softwares that speed up the treatment planning processes to arrive at the best dose plans for the patients undergoing radiation therapy. Furthermore, the systems provide new technology to solve problems that would not have been considered without the use of computers such as conformal radiation therapy (CRT), intensity-modulated radiation therapy (IMRT), and volumetric modulated arc therapy (VMAT). The 3D treatment planning systems vary amongst the vendors and also the dose delivery systems they are designed to support. As such these systems have different planning tools to generate the treatment plans and convert the treatment plans into executable instructions that can be implemented by the dose delivery systems. The rapid advancements in computer technology and accelerators have facilitated constant upgrades and the introduction of different and unique dose delivery systems than the traditional C-arm type medical linear accelerators. The focus of this special issue is to gather relevant 3D treatment planning systems for the radiation oncology community to keep abreast of technology advancement by assessing the planning tools available as well as those unique "tricks or tips" used to support the different dose delivery systems. Copyright © 2018 American Association of Medical Dosimetrists. Published by Elsevier Inc. All rights reserved.

  16. Renewable energy in electric utility capacity planning: a decomposition approach with application to a Mexican utility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Staschus, K.

    1985-01-01

    In this dissertation, efficient algorithms for electric-utility capacity expansion planning with renewable energy are developed. The algorithms include a deterministic phase that quickly finds a near-optimal expansion plan using derating and a linearized approximation to the time-dependent availability of nondispatchable energy sources. A probabilistic second phase needs comparatively few computer-time consuming probabilistic simulation iterations to modify this solution towards the optimal expansion plan. For the deterministic first phase, two algorithms, based on a Lagrangian Dual decomposition and a Generalized Benders Decomposition, are developed. The probabilistic second phase uses a Generalized Benders Decomposition approach. Extensive computational tests of the algorithms are reported. Among the deterministic algorithms, the one based on Lagrangian Duality proves fastest. The two-phase approach is shown to save up to 80% in computing time as compared to a purely probabilistic algorithm. The algorithms are applied to determine the optimal expansion plan for the Tijuana-Mexicali subsystem of the Mexican electric utility system. A strong recommendation to push conservation programs in the desert city of Mexicali results from this implementation.
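
    The dissertation's algorithms are not reproduced in the abstract. To convey only the flavor of Lagrangian relaxation in capacity planning, here is a toy subgradient sketch for a single relaxed demand constraint; the two-plant instance, linear costs, and step rule are illustrative assumptions, not the actual decomposition:

```python
def lagrangian_dual(costs, cap, demand, iters=500):
    """Toy Lagrangian relaxation: minimize sum(c_i * x_i) subject to
    sum(x_i) >= demand and 0 <= x_i <= cap. The demand constraint is
    relaxed with a multiplier lam; the relaxed problem then separates
    by plant, and lam is updated by a diminishing-step subgradient."""
    lam, best = 0.0, float("-inf")
    for k in range(1, iters + 1):
        # Separable relaxed subproblem: build a plant fully if c_i < lam.
        x = [cap if c < lam else 0.0 for c in costs]
        dual = lam * demand + sum((c - lam) * xi for c, xi in zip(costs, x))
        best = max(best, dual)        # every dual value is a cost lower bound
        g = demand - sum(x)           # subgradient of the dual at lam
        lam = max(0.0, lam + g / k)   # diminishing step, keep lam >= 0
    return lam, best
```

    For two plants with costs 1 and 3, capacity 5, and demand 7, the optimal cost is 11 and the multiplier converges toward the marginal cost of 3; the best dual value approaches 11 from below, which is the lower bound a Benders or Lagrangian master problem would exploit.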

  17. Plan for the Characterization of HIRF Effects on a Fault-Tolerant Computer Communication System

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo; Malekpour, Mahyar R.; Miner, Paul S.; Koppen, Sandra V.

    2008-01-01

    This report presents the plan for the characterization of the effects of high intensity radiated fields on a prototype implementation of a fault-tolerant data communication system. Various configurations of the communication system will be tested. The prototype system is implemented using off-the-shelf devices. The system will be tested in a closed-loop configuration with extensive real-time monitoring. This test is intended to generate data suitable for the design of avionics health management systems, as well as redundancy management mechanisms and policies for robust distributed processing architectures.

  18. An Annotated Bibliography of the Literature Dealing with Teacher Training in the Uses of the Computer in Education.

    ERIC Educational Resources Information Center

    Burkholder, Jana N.

    The 32 citations in this annotated bibliography were identified by a review of the literature which addressed three questions: (1) is there a need for teacher training in the educational uses of computers? (9 citations); (2) what ideas and competencies should be considered when planning and implementing teacher training programs? (14 citations);…

  19. The application of automated operations at the Institutional Processing Center

    NASA Technical Reports Server (NTRS)

    Barr, Thomas H.

    1993-01-01

    The JPL Institutional and Mission Computing Division, Communications, Computing and Network Services Section, with its mission contractor, OAO Corporation, have for some time been applying automation to the operation of JPL's Information Processing Center (IPC). Automation does not come in one easy-to-use package. Automation for a data processing center is made up of many different software and hardware products supported by trained personnel. The IPC automation effort formally began with console automation, and has since spiraled out to include production scheduling, data entry, report distribution, online reporting, failure reporting and resolution, documentation, library storage, and operator and user education, while requiring the interaction of multi-vendor and locally developed software. To begin the process, automation goals are determined. Then a team including operations personnel is formed to research and evaluate available options. By acquiring knowledge of current products and those in development, taking an active role in industry organizations, and learning of other data centers' experiences, a forecast can be developed as to what direction technology is moving. With IPC management's approval, an implementation plan is developed and resources identified to test or implement new systems. As an example, IPC's new automated data entry system was researched by Data Entry, Production Control, and Advance Planning personnel. A proposal was then submitted to management for review. A determination to implement the new system was made and elements/personnel involved with the initial planning performed the implementation. The final steps of the implementation were educating data entry personnel in the areas affected and procedural changes necessary to the successful operation of the new system.

  20. Interactive Computers: The Development of an Urban Leisure Information Service.

    ERIC Educational Resources Information Center

    Hayward, Jeff; Fairey, Kenyon

    1984-01-01

    This article describes the Leisure Match Project, which allows the public to solicit information about recreation activities by using a microcomputer. Guidelines for planning and implementation of this urban recreation information service are discussed. (DF)

  1. Network Aggregation in Transportation Planning : Volume I : Summary and Survey

    DOT National Transportation Integrated Search

    1978-04-01

    Volume 1 summarizes research on network aggregation in transportation models. It includes a survey of network aggregation practices, definition of an extraction aggregation model, computational results on a heuristic implementation of the model, and ...

  2. Complex Osteotomies of Tibial Plateau Malunions Using Computer-Assisted Planning and Patient-Specific Surgical Guides.

    PubMed

    Fürnstahl, Philipp; Vlachopoulos, Lazaros; Schweizer, Andreas; Fucentese, Sandro F; Koch, Peter P

    2015-08-01

    The accurate reduction of tibial plateau malunions can be challenging without guidance. In this work, we report on a novel technique that combines 3-dimensional computer-assisted planning with patient-specific surgical guides for improving reliability and accuracy of complex intraarticular corrective osteotomies. Preoperative planning based on 3-dimensional bone models was performed to simulate fragment mobilization and reduction in 3 cases. Surgical implementation of the preoperative plan using patient-specific cutting and reduction guides was evaluated; benefits and limitations of the approach were identified and discussed. The preliminary results are encouraging and show that complex, intraarticular corrective osteotomies can be accurately performed with this technique. For selected patients with complex malunions around the tibial plateau, this method might be an attractive option, with the potential to facilitate achieving the most accurate correction possible.

  3. 3D Boolean operations in virtual surgical planning.

    PubMed

    Charton, Jerome; Laurentjoye, Mathieu; Kim, Youngjun

    2017-10-01

    Boolean operations in computer-aided design or computer graphics are a set of operations (e.g. intersection, union, subtraction) between two objects (e.g. a patient model and an implant model) that are important in performing accurate and reproducible virtual surgical planning. This requires accurate and robust techniques that can handle various types of data, such as a surface extracted from volumetric data, synthetic models, and 3D scan data. This article compares the performance of the proposed method (Boolean operations by a robust, exact, and simple method between two colliding shells (BORES)) and an existing method based on the Visualization Toolkit (VTK). In all tests presented in this article, BORES could handle complex configurations as well as report impossible configurations of the input. In contrast, the VTK implementations were unstable, did not handle singular edges and coplanar collisions, and created several defects. The proposed method of Boolean operations, BORES, is efficient and appropriate for virtual surgical planning. Moreover, it is simple and easy to implement. In future work, we will extend the proposed method to handle non-colliding components.
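
    Robust surface-mesh Booleans such as BORES are substantial algorithms (exactness, singular edges, and coplanar faces are precisely the hard cases). To illustrate only the semantics of the three operations involved, here is a deliberately naive voxel-set sketch in Python; the sphere "patient" and "implant" stand-ins are invented for the example:

```python
def voxelize_sphere(center, radius, n):
    """Voxels (i, j, k) on an n^3 grid whose centers lie inside a sphere."""
    cx, cy, cz = center
    return {(i, j, k)
            for i in range(n) for j in range(n) for k in range(n)
            if (i - cx) ** 2 + (j - cy) ** 2 + (k - cz) ** 2 <= radius ** 2}

# On voxelized solids, Boolean operations reduce to plain set operations:
a = voxelize_sphere((4, 4, 4), 3, 10)   # "patient" stand-in
b = voxelize_sphere((6, 4, 4), 3, 10)   # "implant" stand-in
union        = a | b
intersection = a & b
subtraction  = a - b
```

    The point of methods like BORES is to obtain these same semantics directly on boundary surfaces, without the resolution loss and memory cost of voxelization.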

  4. VLSI neuroprocessors

    NASA Technical Reports Server (NTRS)

    Kemeny, Sabrina E.

    1994-01-01

    Electronic and optoelectronic hardware implementations of highly parallel computing architectures address several ill-defined and/or computation-intensive problems not easily solved by conventional computing techniques. The concurrent processing architectures developed are derived from a variety of advanced computing paradigms including neural network models, fuzzy logic, and cellular automata. Hardware implementation technologies range from state-of-the-art digital/analog custom-VLSI to advanced optoelectronic devices such as computer-generated holograms and e-beam fabricated Dammann gratings. JPL's concurrent processing devices group has developed a broad technology base in hardware implementable parallel algorithms, low-power and high-speed VLSI designs and building block VLSI chips, leading to application-specific high-performance embeddable processors. Application areas include high throughput map-data classification using feedforward neural networks, terrain based tactical movement planner using cellular automata, resource optimization (weapon-target assignment) using a multidimensional feedback network with lateral inhibition, and classification of rocks using an inner-product scheme on thematic mapper data. In addition to addressing specific functional needs of DOD and NASA, the JPL-developed concurrent processing device technology is also being customized for a variety of commercial applications (in collaboration with industrial partners), and is being transferred to U.S. industries. This viewgraph presentation focuses on two application-specific processors which solve the computation intensive tasks of resource allocation (weapon-target assignment) and terrain based tactical movement planning using two extremely different topologies. Resource allocation is implemented as an asynchronous analog competitive assignment architecture inspired by the Hopfield network.
Hardware realization leads to a two to four order of magnitude speed-up over conventional techniques and enables multiple assignments, (many to many), not achievable with standard statistical approaches. Tactical movement planning (finding the best path from A to B) is accomplished with a digital two-dimensional concurrent processor array. By exploiting the natural parallel decomposition of the problem in silicon, a four order of magnitude speed-up over optimized software approaches has been demonstrated.

  5. The implementation of AI technologies in computer wargames

    NASA Astrophysics Data System (ADS)

    Tiller, John A.

    2004-08-01

    Computer wargames involve the most in-depth analysis of general game theory. The enumerated turns of a game like chess are dwarfed by the exponentially larger possibilities of even a simple computer wargame. Implementing challenging AI in computer wargames is an important goal in both the commercial and military environments. In the commercial marketplace, customers demand a challenging AI opponent when they play a computer wargame and are frustrated by a lack of competence on the part of the AI. In the military environment, challenging AI opponents are important for several reasons. A challenging AI opponent will force the military professional to avoid routine or set-piece approaches to situations and cause them to think much deeper about military situations before taking action. A good AI opponent would also include national characteristics of the opponent being simulated, thus providing the military professional with even more of a challenge in planning and approach. Implementing current AI technologies in computer wargames is a technological challenge. The goal is to join the needs of AI in computer wargames with the solutions of current AI technologies. This talk will address several of those issues, possible solutions, and currently unsolved problems.

  6. Leaf position optimization for step-and-shoot IMRT.

    PubMed

    De Gersem, W; Claus, F; De Wagter, C; Van Duyse, B; De Neve, W

    2001-12-01

    To describe the theoretical basis, the algorithm, and implementation of a tool that optimizes segment shapes and weights for step-and-shoot intensity-modulated radiation therapy delivered by multileaf collimators. The tool, called SOWAT (Segment Outline and Weight Adapting Tool) is applied to a set of segments, segment weights, and corresponding dose distribution, computed by an external dose computation engine. SOWAT evaluates the effects of changing the position of each collimating leaf of each segment on an objective function, as follows. Changing a leaf position causes a change in the segment-specific dose matrix, which is calculated by a fast dose computation algorithm. A weighted sum of all segment-specific dose matrices provides the dose distribution and allows computation of the value of the objective function. Only leaf position changes that comply with the multileaf collimator constraints are evaluated. Leaf position changes that tend to decrease the value of the objective function are retained. After several possible positions have been evaluated for all collimating leaves of all segments, an external dose engine recomputes the dose distribution, based on the adapted leaf positions and weights. The plan is evaluated. If the plan is accepted, a segment sequencer is used to make the prescription files for the treatment machine. Otherwise, the user can restart SOWAT using the new set of segments, segment weights, and corresponding dose distribution. The implementation was illustrated using two example cases. The first example is a T1N0M0 supraglottic cancer case that was distributed as a multicenter planning exercise by investigators from Rotterdam, The Netherlands. The exercise involved a two-phase plan. Phase 1 involved the delivery of 46 Gy to a concave-shaped planning target volume (PTV) consisting of the primary tumor volume and the elective lymph nodal regions II-IV on both sides of the neck. 
Phase 2 involved a boost of 24 Gy to the primary tumor region only. SOWAT was applied to the Phase 1 plan. Parotid sparing was a planning goal. The second implementation example is an ethmoid sinus cancer case, planned with the intent of bilateral visus sparing. The median PTV prescription dose was 70 Gy with a maximum dose constraint to the optic pathway structures of 60 Gy. The initial set of segments, segment weights, and corresponding dose distribution were obtained, respectively, by an anatomy-based segmentation tool, a segment weight optimization tool, and a differential scatter-air ratio dose computation algorithm as external dose engine. For the supraglottic case, this resulted in a plan that proved to be comparable to the plans obtained at the other institutes by forward or inverse planning techniques. After using SOWAT, the minimum PTV dose and PTV dose homogeneity increased; the maximum dose to the spinal cord decreased from 38 Gy to 32 Gy. The left parotid mean dose decreased from 22 Gy to 19 Gy and the right parotid mean dose from 20 to 18 Gy. For the ethmoid sinus case, the target homogeneity increased by leaf position optimization, together with a better sparing of the optical tracts. By using SOWAT, the plans improved with respect to all plan evaluation end points. Compliance with the multileaf collimator constraints is guaranteed. The treatment delivery time remains almost unchanged, because no additional segments are created.
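
    The SOWAT procedure itself is not given in the abstract. The following toy 1-D sketch conveys only the general idea of greedy leaf-position local search against an objective function; the segment and dose representations are invented simplifications, not the clinical algorithm:

```python
def dose(points, segments, weights):
    """1-D toy dose model: each segment is a (left, right) leaf pair;
    dose at a point is the weighted sum of segments covering it."""
    return [sum(w for (l, r), w in zip(segments, weights) if l <= x < r)
            for x in points]

def objective(points, segments, weights, prescription):
    """Quadratic deviation of delivered dose from the prescription."""
    d = dose(points, segments, weights)
    return sum((di - pi) ** 2 for di, pi in zip(d, prescription))

def optimize_leaves(points, segments, weights, prescription, sweeps=10):
    """Greedy local search in the spirit of SOWAT: try moving each leaf
    by +/- 1 and keep any move that lowers the objective and keeps the
    leaf pair open (a stand-in for collimator constraints)."""
    segs = [list(s) for s in segments]
    best = objective(points, segs, weights, prescription)
    for _ in range(sweeps):
        improved = False
        for s in segs:
            for leaf in (0, 1):
                for delta in (-1, 1):
                    s[leaf] += delta
                    val = objective(points, segs, weights, prescription)
                    if val < best and s[0] < s[1]:
                        best, improved = val, True
                    else:
                        s[leaf] -= delta  # revert the trial move
        if not improved:
            break
    return segs, best
```

    In the real tool the trial move updates only one segment-specific dose matrix, evaluated by a fast dose engine, which is what makes per-leaf search affordable.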

  7. Model compilation for real-time planning and diagnosis with feedback

    NASA Technical Reports Server (NTRS)

    Barrett, Anthony

    2005-01-01

    This paper describes MEXEC, an implemented micro executive that compiles a device model that can have feedback into a structure for subsequent evaluation. This system computes both the most likely current device mode from n sets of sensor measurements and the n-1 step reconfiguration plan that is most likely to result in reaching a target mode - if such a plan exists. A user tunes the system by increasing n to improve system capability at the cost of real-time performance.
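
    MEXEC's compiled-model evaluation is not described in enough detail here to reproduce. As a minimal hedged illustration of the mode-estimation half (choosing the most likely device mode from n sets of sensor measurements), assuming independent observations and a caller-supplied likelihood function:

```python
def most_likely_mode(modes, likelihood, observations):
    """Pick the device mode maximizing the joint likelihood of n sets
    of sensor measurements, assuming observations are independent.
    `likelihood(mode, obs)` returns P(obs | mode)."""
    def joint(mode):
        p = 1.0
        for obs in observations:
            p *= likelihood(mode, obs)
        return p
    return max(modes, key=joint)
```

    The reconfiguration-planning half would then search for an n-1 step action sequence from the estimated mode toward the target mode, which this sketch does not attempt.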

  8. Designing systematic conservation assessments that promote effective implementation: best practice from South Africa.

    PubMed

    Knight, Andrew T; Driver, Amanda; Cowling, Richard M; Maze, Kristal; Desmet, Philip G; Lombard, Amanda T; Rouget, Mathieu; Botha, Mark A; Boshoff, Andre F; Castley, J Guy; Goodman, Peter S; Mackinnon, Kathy; Pierce, Shirley M; Sims-Castley, Rebecca; Stewart, Warrick I; von Hase, Amrei

    2006-06-01

Systematic conservation assessment and conservation planning are two distinct fields of conservation science often confused as one and the same. Systematic conservation assessment is the technical, often computer-based, identification of priority areas for conservation. Conservation planning is composed of a systematic conservation assessment coupled with processes for development of an implementation strategy and stakeholder collaboration. The peer-reviewed conservation biology literature abounds with studies analyzing the performance of assessments (e.g., area-selection techniques). This information alone, however, can never deliver effective conservation action; it informs conservation planning. Examples of how to translate systematic assessment outputs into knowledge and then use them for "doing" conservation are rare. South Africa has received generous international and domestic funding for regional conservation planning since the mid-1990s. We reviewed eight South African conservation planning processes and identified key ingredients of best practice for undertaking systematic conservation assessments in a way that facilitates implementing conservation action. These key ingredients include the design of conservation planning processes, skills for conservation assessment teams, collaboration with stakeholders, and interpretation and mainstreaming of products (e.g., maps) for stakeholders. Social learning institutions are critical to the successful operationalization of assessments within broader conservation planning processes and should include not only conservation planners but also diverse interest groups, including rural landowners, politicians, and government employees.

  9. Development and implementation of a low cost micro computer system for LANDSAT analysis and geographic data base applications

    NASA Technical Reports Server (NTRS)

    Faust, N.; Jordon, L.

    1981-01-01

Since the implementation of the GRID and IMGRID computer programs for multivariate spatial analysis in the early 1970s, geographic data analysis has moved from large computers to minicomputers and now to microcomputers, with a radical reduction in the costs associated with planning analyses. Once NIMGRID (new IMGRID), a raster-oriented geographic information system, was implemented on the microcomputer, programs designed to process LANDSAT data for use as one element in a geographic data base were put to use. Programs for training field selection, supervised and unsupervised classification, and image enhancement were added. Enhancements to the color graphics capabilities of the microsystem allow display of three channels of LANDSAT data in color infrared format. The basic microcomputer hardware needed to perform NIMGRID and most LANDSAT analyses is listed, as well as the software available for LANDSAT processing.

  10. On the design of computer-based models for integrated environmental science.

    PubMed

    McIntosh, Brian S; Jeffrey, Paul; Lemon, Mark; Winder, Nick

    2005-06-01

The current research agenda in environmental science is dominated by calls to integrate science and policy to better understand and manage links between social (human) and natural (nonhuman) processes. Freshwater resource management is one area where such calls can be heard. Designing computer-based models for integrated environmental science poses special challenges to the research community. At present it is not clear whether such tools, or their outputs, receive much practical policy or planning application. It is argued that this is a result of (1) a lack of appreciation within the research modeling community of the characteristics of different decision-making processes, including policy, planning, and participation, (2) a lack of appreciation of the characteristics of different decision-making contexts, (3) the technical difficulties in implementing the necessary support tool functionality, and (4) the socio-technical demands of designing tools to be of practical use. This article presents a critical synthesis of ideas from each of these areas and interprets them in terms of design requirements for computer-based models being developed to provide scientific information support for policy and planning. Illustrative examples are given from the field of freshwater resources management. Although computer-based diagramming and modeling tools can facilitate processes of dialogue, they lack adequate simulation capabilities. Component-based models and modeling frameworks provide such functionality and may be suited to supporting problematic or messy decision contexts. However, significant technical (implementation) and socio-technical (use) challenges need to be addressed before such ambition can be realized.

  11. EON: a component-based approach to automation of protocol-directed therapy.

    PubMed Central

    Musen, M A; Tu, S W; Das, A K; Shahar, Y

    1996-01-01

    Provision of automated support for planning protocol-directed therapy requires a computer program to take as input clinical data stored in an electronic patient-record system and to generate as output recommendations for therapeutic interventions and laboratory testing that are defined by applicable protocols. This paper presents a synthesis of research carried out at Stanford University to model the therapy-planning task and to demonstrate a component-based architecture for building protocol-based decision-support systems. We have constructed general-purpose software components that (1) interpret abstract protocol specifications to construct appropriate patient-specific treatment plans; (2) infer from time-stamped patient data higher-level, interval-based, abstract concepts; (3) perform time-oriented queries on a time-oriented patient database; and (4) allow acquisition and maintenance of protocol knowledge in a manner that facilitates efficient processing both by humans and by computers. We have implemented these components in a computer system known as EON. Each of the components has been developed, evaluated, and reported independently. We have evaluated the integration of the components as a composite architecture by implementing T-HELPER, a computer-based patient-record system that uses EON to offer advice regarding the management of patients who are following clinical trial protocols for AIDS or HIV infection. A test of the reuse of the software components in a different clinical domain demonstrated rapid development of a prototype application to support protocol-based care of patients who have breast cancer. PMID:8930854
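Component (2), inferring interval-based abstractions from time-stamped patient data, can be illustrated with a minimal sketch (an assumed simplification of EON's temporal abstraction; the lab values and threshold are invented): consecutive samples that classify to the same qualitative state are collapsed into one interval.

```python
# Illustrative temporal abstraction: collapse time-stamped lab values
# into maximal intervals sharing the same qualitative state.

def abstract_intervals(samples, classify):
    """samples: [(t, value)] sorted by t -> [(t_start, t_end, state)]."""
    intervals = []
    for t, v in samples:
        state = classify(v)
        if intervals and intervals[-1][2] == state:
            intervals[-1] = (intervals[-1][0], t, state)  # extend current interval
        else:
            intervals.append((t, t, state))
    return intervals

wbc_state = lambda v: "low" if v < 4.0 else "normal"   # hypothetical threshold
samples = [(0, 5.1), (7, 4.4), (14, 3.2), (21, 3.5), (28, 4.6)]
result = abstract_intervals(samples, wbc_state)
# result == [(0, 7, 'normal'), (14, 21, 'low'), (28, 28, 'normal')]
```

A protocol rule can then query these intervals (e.g., "WBC low for at least two weeks") instead of raw time points.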

  12. Computer-assisted preoperative simulation for positioning and fixation of plate in 2-stage procedure combining maxillary advancement by distraction technique and mandibular setback surgery.

    PubMed

    Suenaga, Hideyuki; Taniguchi, Asako; Yonenaga, Kazumichi; Hoshi, Kazuto; Takato, Tsuyoshi

    2016-01-01

Computer-assisted preoperative simulation surgery is employed to plan and interact with the 3D images during the orthognathic procedure. It is useful for the positioning and fixation of the maxilla with a plate. We report a case of maxillary retrusion due to a bilateral cleft lip and palate, in which a 2-stage orthognathic procedure (maxillary advancement by distraction technique and mandibular setback surgery) was performed following computer-assisted preoperative simulation planning to achieve the positioning and fixation of the plate. A high accuracy was achieved in the present case. A 21-year-old male patient presented to our department with a complaint of maxillary retrusion following bilateral cleft lip and palate. Computer-assisted preoperative simulation with a 2-stage orthognathic procedure using the distraction technique and mandibular setback surgery was planned. The preoperative planning of the procedure resulted in good aesthetic outcomes. The error of the maxillary position was less than 1 mm. The implementation of computer-assisted preoperative simulation for the positioning and fixation of the plate in a 2-stage orthognathic procedure using the distraction technique and mandibular setback surgery yielded good results. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.

  13. A Computer for Every Student and Teacher: Lessons Learned about Planning and Implementing a Successful 1:1 Learning Initiative in Schools

    ERIC Educational Resources Information Center

    Corn, Jenifer O.; Oliver, Kevin M.; Hess, Clara E.; Halstead, Elizabeth O.; Argueta, Rodolfo; Patel, Ruchi K.; Tingen, Jennifer; Huff, Jessica D.

    2010-01-01

    Twelve high schools in North Carolina piloted a 1:1 learning initiative, where every student and teacher received a laptop computer with wireless Internet access provided throughout the school. The overall goals of the initiative were to improve teaching practices; increase student achievement; and better prepare students for work, citizenship,…

  14. Learning COMCAT (Computer Output Microform Catalog): Library Training Program for Foreign Students at New York Institute of Technology.

    ERIC Educational Resources Information Center

    Chiang, Ching-hsin

    This thesis reports on the designer's plans and experiences in carrying out the design, development, implementation, and evaluation of a project, the purpose of which was to develop a training program that would enable foreign students at the New York Institute of Technology (NYIT) to use the Computer Output Microform Catalog (COMCAT) and to…

  15. Model-based Executive Control through Reactive Planning for Autonomous Rovers

    NASA Technical Reports Server (NTRS)

    Finzi, Alberto; Ingrand, Felix; Muscettola, Nicola

    2004-01-01

    This paper reports on the design and implementation of a real-time executive for a mobile rover that uses a model-based, declarative approach. The control system is based on the Intelligent Distributed Execution Architecture (IDEA), an approach to planning and execution that provides a unified representational and computational framework for an autonomous agent. The basic hypothesis of IDEA is that a large control system can be structured as a collection of interacting agents, each with the same fundamental structure. We show that planning and real-time response are compatible if the executive minimizes the size of the planning problem. We detail the implementation of this approach on an exploration rover (Gromit an RWI ATRV Junior at NASA Ames) presenting different IDEA controllers of the same domain and comparing them with more classical approaches. We demonstrate that the approach is scalable to complex coordination of functional modules needed for autonomous navigation and exploration.

  16. Computer Supported Argument Visualisation: Modelling in Consultative Democracy Around Wicked Problems

    NASA Astrophysics Data System (ADS)

    Ohl, Ricky

In this case study, computer supported argument visualisation has been applied to the analysis and representation of the draft South East Queensland Regional Plan Consultation discourse, demonstrating how argument mapping can help deliver the transparency and accountability required in participatory democracy. Consultative democracy for regional planning falls into a category of problems known as “wicked problems”. Inherent in this environment are heterogeneous viewpoints, agendas, and voices, built on disparate and often contradictory logic. An argument ontology and notation designed specifically to deal with consultative urban planning around wicked problems is the Issue Based Information System (IBIS) and IBIS notation (Rittel & Webber, 1984). The software used for argument visualisation in this case was Compendium, a derivative of IBIS. The high volume of stakeholders and the discourse heterogeneity in this environment call for a unique approach to argument mapping. The map design model developed from this research has been titled a “Consultation Map”. The design incorporates the IBIS ontology within a hybrid of mapping approaches, amalgamating elements from concept, dialogue, argument, debate, thematic, and tree-mapping. The consultation maps developed from the draft South East Queensland Regional Plan Consultation provide a transparent visual record giving evidence of the themes of citizen issues within the consultation discourse. The consultation maps also link the elicited discourse themes to related policies from the SEQ Regional Plan, providing explicit evidence of SEQ Regional Plan policy decisions matching citizen concerns. The final consultation map in the series provides explicit links between SEQ Regional Plan policy items and monitoring activities reporting on the ongoing implementation of the SEQ Regional Plan. This map provides updatable evidence of, and accountability for, SEQ Regional Plan policy implementation and developments.

  17. Prescription for trouble: Medicare Part D and patterns of computer and internet access among the elderly.

    PubMed

    Wright, David W; Hill, Twyla J

    2009-01-01

    The Medicare Prescription Drug, Improvement, and Modernization Act of 2003 specifically encourages Medicare enrollees to use the Internet to obtain information regarding the new prescription drug insurance plans and to enroll in a plan. This reliance on computer technology and the Internet leads to practical questions regarding implementation of the insurance coverage. For example, it seems unlikely that all Medicare enrollees have access to computers and the Internet or that they are all computer literate. This study uses the 2003 Current Population Survey to examine the effects of disability and income on computer access and Internet use among the elderly. Internet access declines with age and is exacerbated by disabilities. Also, decreases in income lead to decreases in computer ownership and use. Therefore, providing prescription drug coverage primarily through the Internet seems likely to maintain or increase stratification of access to health care, especially for low-income, disabled elderly, who are also a group most in need of health care access.

  18. The flight planning - flight management connection

    NASA Technical Reports Server (NTRS)

    Sorensen, J. A.

    1984-01-01

    Airborne flight management systems are currently being implemented to minimize direct operating costs when flying over a fixed route between a given city pair. Inherent in the design of these systems is that the horizontal flight path and wind and temperature models be defined and input into the airborne computer before flight. The wind/temperature model and horizontal path are products of the flight planning process. Flight planning consists of generating 3-D reference trajectories through a forecast wind field subject to certain ATC and transport operator constraints. The interrelationships between flight management and flight planning are reviewed, and the steps taken during the flight planning process are summarized.

  19. Evaluation of data requirements for computerized constructability analysis of pavement rehabilitation projects.

    DOT National Transportation Integrated Search

    2013-08-01

This research aimed to evaluate the data requirements for computer-assisted construction planning and staging methods that can be implemented in pavement rehabilitation projects in the state of Georgia. Results showed that two main issues for the...

  20. Report of the LSPI/NASA Workshop on Lunar Base Methodology Development

    NASA Technical Reports Server (NTRS)

    Nozette, Stewart; Roberts, Barney

    1985-01-01

    Groundwork was laid for computer models which will assist in the design of a manned lunar base. The models, herein described, will provide the following functions for the successful conclusion of that task: strategic planning; sensitivity analyses; impact analyses; and documentation. Topics addressed include: upper level model description; interrelationship matrix; user community; model features; model descriptions; system implementation; model management; and plans for future action.

  1. Implementation of a best management practice (BMP) system for a clay mining facility in Taiwan.

    PubMed

    Lin, Jen-Yang; Chen, Yen-Chang; Chen, Walter; Lee, Tsu-Chuan; Yu, Shaw L

    2006-01-01

The present paper describes the planning and implementation of a best management practice (BMP) system for a clay mining facility in Northern Taiwan. It is a challenge to plan and design BMPs for mitigating the impact of clay mining operations, because clay mining drainage typically contains very high concentrations of suspended solids (SS) and Fe ions, and a high [H+] concentration. In the present study, a field monitoring effort was conducted to collect runoff quality and quantity data from a clay mining area in Northern Taiwan. A BMP system including holding ponds connected in series was designed and implemented, and its pollutant removal performance was assessed. The assessment was based on mass balance computations and an analysis of the relationship between BMP design parameters, such as pond depth, detention time, and surface loading rate, and the pollutant removal efficiency. Field sampling results showed that the surface loading rate is exponentially related to the removal rate. The results provide the basis for a more comprehensive and efficient BMP implementation plan for clay mining operations.
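The reported exponential relationship between surface loading rate and removal rate can be illustrated with a log-linear least-squares fit (a sketch with invented data points, not the study's measurements; the model form removal = a·exp(−b·loading) is an assumption for illustration):

```python
# Fit an exponential model  removal = a * exp(-b * loading)  by ordinary
# least squares on log(removal), i.e. a straight-line fit in log space.

import math

def fit_exponential(loading, removal):
    x, y = loading, [math.log(r) for r in removal]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    intercept = my - slope * mx
    return math.exp(intercept), -slope   # a, b

loading = [1.0, 2.0, 3.0, 4.0]       # surface loading rate (invented units)
removal = [0.74, 0.55, 0.41, 0.30]   # SS removal fraction (invented values)
a, b = fit_exponential(loading, removal)
```

With such a fit in hand, a pond's surface loading rate can be chosen to hit a target removal efficiency.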

  2. GPU-accelerated Monte Carlo convolution/superposition implementation for dose calculation.

    PubMed

    Zhou, Bo; Yu, Cedric X; Chen, Danny Z; Hu, X Sharon

    2010-11-01

Dose calculation is a key component in radiation treatment planning systems. Its performance and accuracy are crucial to the quality of treatment plans as emerging advanced radiation therapy technologies are exerting ever tighter constraints on dose calculation. A common practice is to choose either a deterministic method such as the convolution/superposition (CS) method for speed or a Monte Carlo (MC) method for accuracy. The goal of this work is to boost the performance of a hybrid Monte Carlo convolution/superposition (MCCS) method by devising a graphics processing unit (GPU) implementation so as to make the method practical for day-to-day usage. Although the MCCS algorithm combines the merits of MC fluence generation and CS fluence transport, it is still not fast enough to be used as a day-to-day planning tool. To alleviate the speed issue of MC algorithms, the authors adopted MCCS as their target method and implemented a GPU-based version. In order to fully utilize the GPU computing power, the MCCS algorithm is modified to match the GPU hardware architecture. The performance of the authors' GPU-based implementation on an Nvidia GTX260 card is compared to a multithreaded software implementation on a quad-core system. A speedup in the range of 6.7-11.4x is observed for the clinical cases used. The less than 2% statistical fluctuation also indicates that the accuracy of the authors' GPU-based implementation is in good agreement with the results from the quad-core CPU implementation. This work shows that the GPU is a feasible and cost-efficient solution, compared to alternatives such as cluster machines or field-programmable gate arrays, for satisfying the increasing demands on computation speed and accuracy of dose calculation. There are, however, inherent limitations to using GPUs to accelerate MC-type applications, which are also analyzed in detail in this article.

  3. Air Force Global Weather Central System Architecture Study. Final System/Subsystem Summary Report. Volume 7. Implementation and Development Plans

    DTIC Science & Technology

    1976-03-01

special access; PS2 will be for the variable perimeter; and PS3, PS4, and PS5 will make up the normal access area. This added computer power will be...implementation of PS1 and PS4 will continue as new communications consoles are actively established for possible side-by-side operation of the

  4. Advanced Simulation and Computing Fiscal Year 2016 Implementation Plan, Version 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCoy, M.; Archer, B.; Hendrickson, B.

    2015-08-27

The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The purpose of this IP is to outline key work requirements to be performed and to control individual work activities within the scope of work. Contractors may not deviate from this plan without a revised WA or subsequent IP.

  5. Computer-supported implant planning and guided surgery: a narrative review.

    PubMed

    Vercruyssen, Marjolein; Laleman, Isabelle; Jacobs, Reinhilde; Quirynen, Marc

    2015-09-01

To give an overview of the workflow from examination to planning and execution, including possible errors and pitfalls, in order to justify the indications for guided surgery. An electronic literature search of the PubMed database was performed with the intention of collecting relevant information on computer-supported implant planning and guided surgery. Currently, different computer-supported systems are available to optimize and facilitate implant surgery. The transfer of the implant planning (in a software program) to the operative field remains, however, the most difficult part. Guided implant surgery clearly reduces the inaccuracy, defined as the deviation between the planned and the final position of the implant in the mouth. It might be recommended for the following clinical indications: need for minimally invasive surgery, optimization of implant planning and positioning (i.e. aesthetic cases), and immediate restoration. The digital technology evolves rapidly, and new developments have resulted in further improvement of the accuracy. Future developments include the reduction of the number of steps needed from the preoperative examination of the patient to the actual execution of the guided surgery. The latter will become easier with the implementation of optical scans and 3D-printing. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Y. M., E-mail: ymingy@gmail.com; Bednarz, B.; Svatos, M.

Purpose: The future of radiation therapy will require advanced inverse planning solutions to support single-arc, multiple-arc, and “4π” delivery modes, which present unique challenges in finding an optimal treatment plan over a vast search space, while still preserving dosimetric accuracy. The successful clinical implementation of such methods would benefit from Monte Carlo (MC) based dose calculation methods, which can offer improvements in dosimetric accuracy when compared to deterministic methods. The standard method for MC based treatment planning optimization leverages the accuracy of the MC dose calculation and efficiency of well-developed optimization methods, by precalculating the fluence to dose relationship within a patient with MC methods and subsequently optimizing the fluence weights. However, the sequential nature of this implementation is computationally time consuming and memory intensive. Methods to reduce the overhead of the MC precalculation have been explored in the past, demonstrating promising reductions of computational time overhead, but with limited impact on the memory overhead due to the sequential nature of the dose calculation and fluence optimization. The authors propose an entirely new form of “concurrent” Monte Carlo treatment plan optimization: a platform which optimizes the fluence during the dose calculation, reduces wasted computation time being spent on beamlets that weakly contribute to the final dose distribution, and requires only a low memory footprint to function. In this initial investigation, the authors explore the key theoretical and practical considerations of optimizing fluence in such a manner. Methods: The authors present a novel derivation and implementation of a gradient descent algorithm that allows for optimization during MC particle transport, based on highly stochastic information generated through particle transport of very few histories. 
A gradient rescaling and renormalization algorithm, and the concept of momentum from stochastic gradient descent were used to address obstacles unique to performing gradient descent fluence optimization during MC particle transport. The authors have applied their method to two simple geometrical phantoms, and one clinical patient geometry to examine the capability of this platform to generate conformal plans as well as assess its computational scaling and efficiency, respectively. Results: The authors obtain a reduction of at least 50% in total histories transported in their investigation compared to a theoretical unweighted beamlet calculation and subsequent fluence optimization method, and observe a roughly fixed optimization time overhead consisting of ∼10% of the total computation time in all cases. Finally, the authors demonstrate a negligible increase in memory overhead of ∼7–8 MB to allow for optimization of a clinical patient geometry surrounded by 36 beams using their platform. Conclusions: This study demonstrates a fluence optimization approach, which could significantly improve the development of next generation radiation therapy solutions while incurring minimal additional computational overhead.« less

  7. Concurrent Monte Carlo transport and fluence optimization with fluence adjusting scalable transport Monte Carlo

    PubMed Central

    Svatos, M.; Zankowski, C.; Bednarz, B.

    2016-01-01

    Purpose: The future of radiation therapy will require advanced inverse planning solutions to support single-arc, multiple-arc, and “4π” delivery modes, which present unique challenges in finding an optimal treatment plan over a vast search space, while still preserving dosimetric accuracy. The successful clinical implementation of such methods would benefit from Monte Carlo (MC) based dose calculation methods, which can offer improvements in dosimetric accuracy when compared to deterministic methods. The standard method for MC based treatment planning optimization leverages the accuracy of the MC dose calculation and efficiency of well-developed optimization methods, by precalculating the fluence to dose relationship within a patient with MC methods and subsequently optimizing the fluence weights. However, the sequential nature of this implementation is computationally time consuming and memory intensive. Methods to reduce the overhead of the MC precalculation have been explored in the past, demonstrating promising reductions of computational time overhead, but with limited impact on the memory overhead due to the sequential nature of the dose calculation and fluence optimization. The authors propose an entirely new form of “concurrent” Monte Carlo treat plan optimization: a platform which optimizes the fluence during the dose calculation, reduces wasted computation time being spent on beamlets that weakly contribute to the final dose distribution, and requires only a low memory footprint to function. In this initial investigation, the authors explore the key theoretical and practical considerations of optimizing fluence in such a manner. Methods: The authors present a novel derivation and implementation of a gradient descent algorithm that allows for optimization during MC particle transport, based on highly stochastic information generated through particle transport of very few histories. 
A gradient rescaling and renormalization algorithm, and the concept of momentum from stochastic gradient descent were used to address obstacles unique to performing gradient descent fluence optimization during MC particle transport. The authors have applied their method to two simple geometrical phantoms, and one clinical patient geometry to examine the capability of this platform to generate conformal plans as well as assess its computational scaling and efficiency, respectively. Results: The authors obtain a reduction of at least 50% in total histories transported in their investigation compared to a theoretical unweighted beamlet calculation and subsequent fluence optimization method, and observe a roughly fixed optimization time overhead consisting of ∼10% of the total computation time in all cases. Finally, the authors demonstrate a negligible increase in memory overhead of ∼7–8 MB to allow for optimization of a clinical patient geometry surrounded by 36 beams using their platform. Conclusions: This study demonstrates a fluence optimization approach, which could significantly improve the development of next generation radiation therapy solutions while incurring minimal additional computational overhead. PMID:27277051

  8. The Interactive Electronic Technical Manual: Requirements, Current Status, and Implementation. Strategy Considerations.

    DTIC Science & Technology

    1991-07-01

authoring systems. Concurrently, great strides in computer-aided design and computer-aided maintenance have contributed to this capability. 12 Junod, J.; William A. Nugent; and L. John Junod. Plan for the Navy/Air Force Test of the Interactive Electronic Technical Manual (IETM) at Cecil Field...AFHRL Logistics and Human Factors Division, WPAFB. Aug 1990. 12. Junod, John L. PY90 Interactive Electronic Technical Manual (IETM) Portable Delivery

  9. Adiabatic Quantum Computing with Neutral Atoms

    NASA Astrophysics Data System (ADS)

    Hankin, Aaron; Biedermann, Grant; Burns, George; Jau, Yuan-Yu; Johnson, Cort; Kemme, Shanalyn; Landahl, Andrew; Mangan, Michael; Parazzoli, L. Paul; Schwindt, Peter; Armstrong, Darrell

    2012-06-01

    We are developing, both theoretically and experimentally, a neutral atom qubit approach to adiabatic quantum computation. Using our microfabricated diffractive optical elements, we plan to implement an array of optical traps for cesium atoms and use Rydberg-dressed ground states to provide a controlled atom-atom interaction. We will develop this experimental capability to generate a two-qubit adiabatic evolution aimed specifically toward demonstrating the two-qubit quadratic unconstrained binary optimization (QUBO) routine.
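The QUBO routine mentioned above minimizes x^T Q x over binary variables x; for two qubits the search space is small enough to enumerate (a sketch with an invented Q matrix, the adiabatic hardware would instead reach this minimum through slow Hamiltonian evolution):

```python
# Minimal QUBO sketch: minimize x^T Q x over binary vectors x by
# exhaustive search (feasible here because n is tiny).

from itertools import product

def solve_qubo(Q):
    n = len(Q)
    def energy(x):
        return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))
    return min(product((0, 1), repeat=n), key=energy)

Q = [[-1.0, 0.5],   # diagonal entries act as linear biases, since x_i^2 = x_i
     [0.0, -1.0]]
best = solve_qubo(Q)
```

For this Q, the unique minimum sets both variables to 1, with energy −1.5; a two-qubit demonstration would be judged by whether the hardware reliably returns the same assignment.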

  10. Study 2.5 final report. DORCA computer program. Volume 4: Executive summary report

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The functions and capabilities of the Dynamic Operational Requirements and Cost Analysis Program are explained. The existence and purpose of the program are presented to provide an evaluation of program applicability to areas of responsibility for potential users. The implementation of the program on the Univac 1108 computer is discussed. The application of the program for mission planning and project management is described.

  11. Putting It All Together.

    ERIC Educational Resources Information Center

    McNamara, Elizabeth T.; Grant, Cathy Miles; Wasser, Judith Davidson

    1998-01-01

    Discusses the parallel between the rapid increase in the acquisition of computer technology and electronic networks by schools and systemic reform movements. Provides some insight on building a school and the community planning process to support technology implementation, connecting content to technology, professional development, and training…

  12. Technology, Students and Faculty...How To Make It Happen!

    ERIC Educational Resources Information Center

    Cummings, Debra; Buzzard, Connie

    2002-01-01

    A plan was developed by Fort Scott Community College (Kansas) to implement continuous improvement with technology integration. In addition to ensuring that college employees acquired and used high-tech computer skills, faculty had to use and incorporate technology in the classroom. (JOW)

  13. Comparisons of Physicians' and Nurses' Attitudes towards Computers.

    PubMed

    Brumini, Gordana; Ković, Ivor; Zombori, Dejvid; Lulić, Ileana; Bilic-Zulle, Lidija; Petrovecki, Mladen

    2005-01-01

    Before starting the implementation of integrated hospital information systems, physicians' and nurses' attitudes towards computers were measured by means of a questionnaire. The study was conducted at Dubrava University Hospital in Zagreb, Croatia. Of 194 randomly selected respondents, 141 were nurses and 53 were physicians. They were surveyed with an anonymous questionnaire consisting of 8 closed questions about demographic data, computer science education, and computer usage, and 30 statements on attitudes towards computers. The statements were adapted to a Likert-type scale. Differences in attitudes towards computers between groups were compared using the Kruskal-Wallis test, with the Mann-Whitney test for post-hoc analysis. The total score represented attitude towards computers. Physicians' total score was 130 (97-144), while nurses' total score was 123 (88-141), indicating that the average answer to all statements fell between "agree" and "strongly agree"; these high total scores reflect positive attitudes. Age, computer science education, and computer usage were factors that enhanced the total score: younger physicians and nurses with computer science education and previous computer experience had more positive attitudes towards computers than others. These results are important for the planning and implementation of integrated hospital information systems in Croatia.

  14. A knowledge-based approach to improving optimization techniques in system planning

    NASA Technical Reports Server (NTRS)

    Momoh, J. A.; Zhang, Z. Z.

    1990-01-01

    A knowledge-based (KB) approach to improving the mathematical programming techniques used in the system planning environment is presented. The KB system assists in selecting appropriate optimization algorithms, objective functions, constraints, and parameters. The scheme is implemented by integrating symbolic computation of rules, derived from operator and planner experience, with generalized optimization packages. The KB optimization software package is capable of improving the overall planning process, including the correction of given violations. The method was demonstrated on a large-scale power system discussed in the paper.

  15. Tools of the Future: How Decision Tree Analysis Will Impact Mission Planning

    NASA Technical Reports Server (NTRS)

    Otterstatter, Matthew R.

    2005-01-01

    The universe is infinitely complex; however, the human mind has a finite capacity. The multitude of possible variables, metrics, and procedures in mission planning are far too many to address exhaustively. This is unfortunate because, in general, considering more possibilities leads to more accurate and more powerful results. To compensate, we can get more insightful results by employing our greatest tool, the computer. The power of the computer will be utilized through a technology that considers every possibility, decision tree analysis. Although decision trees have been used in many other fields, this is innovative for space mission planning. Because this is a new strategy, no existing software is able to completely accommodate all of the requirements. This was determined through extensive research and testing of current technologies. It was necessary to create original software, for which a short-term model was finished this summer. The model was built into Microsoft Excel to take advantage of the familiar graphical interface for user input, computation, and viewing output. Macros were written to automate the process of tree construction, optimization, and presentation. The results are useful and promising. If this tool is successfully implemented in mission planning, our reliance on old-fashioned heuristics, an error-prone shortcut for handling complexity, will be reduced. The computer algorithms involved in decision trees will revolutionize mission planning. The planning will be faster and smarter, leading to optimized missions with the potential for more valuable data.
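    For context, the core computation behind such a tool is the standard expected-value "rollback" of a decision tree: chance nodes average their branches by probability, and decision nodes pick the branch with the best expected value. The sketch below is a hypothetical illustration (the node structure and payoff values are invented, not taken from the Excel model described).

```python
# Hypothetical sketch of decision-tree rollback (expected-value) analysis.
def rollback(node):
    """Return the expected value of a decision-tree node, recursively."""
    kind = node["type"]
    if kind == "leaf":
        return node["value"]
    if kind == "chance":
        # probability-weighted average over chance branches
        return sum(p * rollback(child) for p, child in node["branches"])
    if kind == "decision":
        # the planner picks the option with the highest expected value
        return max(rollback(child) for child in node["options"])
    raise ValueError(f"unknown node type: {kind}")

# Toy mission-planning choice: a risky option vs. a safe but limited one.
tree = {
    "type": "decision",
    "options": [
        {"type": "chance", "branches": [
            (0.7, {"type": "leaf", "value": 100}),  # nominal science return
            (0.3, {"type": "leaf", "value": 20}),   # degraded return
        ]},
        {"type": "leaf", "value": 60},              # safe fallback option
    ],
}
print(rollback(tree))  # 0.7*100 + 0.3*20 = 76, which beats 60
```

A spreadsheet implementation like the one described performs exactly this recursion over rows and columns rather than nested dictionaries.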

  16. Towards a framework for geospatial tangible user interfaces in collaborative urban planning

    NASA Astrophysics Data System (ADS)

    Maquil, Valérie; Leopold, Ulrich; De Sousa, Luís Moreira; Schwartz, Lou; Tobias, Eric

    2018-04-01

    The increasing complexity of urban planning projects today requires new approaches to better integrate stakeholders with different professional backgrounds throughout a city. Traditional tools used in urban planning are designed for experts and offer little opportunity for participation and collaborative design. This paper introduces the concept of geospatial tangible user interfaces (GTUI) and reports on the design and implementation, as well as the usability, of such a GTUI to support stakeholder participation in collaborative urban planning. The proposed system uses physical objects to interact with large digital maps and geospatial data projected onto a tabletop. It is implemented using a PostGIS database, a web map server providing OGC web services, the computer vision framework reacTIVision, a Java-based TUIO client, and GeoTools. We describe how a GTUI has been instantiated and evaluated within the scope of two case studies related to real-world collaborative urban planning scenarios. Our results confirm the feasibility of our proposed GTUI solutions to (a) instantiate different urban planning scenarios, (b) support collaboration, and (c) ensure acceptable usability.

  18. Review of the Water Resources Information System of Argentina

    USGS Publications Warehouse

    Hutchison, N.E.

    1987-01-01

    A representative of the U.S. Geological Survey traveled to Buenos Aires, Argentina, in November 1986, to discuss water information systems and data bank implementation in the Argentine Government Center for Water Resources Information. Software has been written by Center personnel for a minicomputer to be used to manage inventory (index) data and water quality data. Additional hardware and software have been ordered to upgrade the existing computer. Four microcomputers, statistical and data base management software, and network hardware and software for linking the computers have also been ordered. The Center plans to develop a nationwide distributed data base for Argentina that will include the major regional offices as nodes. Needs for continued development of the water resources information system for Argentina were reviewed. Identified needs include: (1) conducting a requirements analysis to define the content of the data base and ensure that all user requirements are met, (2) preparing a plan for the development, implementation, and operation of the data base, and (3) developing a conceptual design to inform all development personnel and users of the basic functionality planned for the system. A quality assurance and configuration management program to provide oversight to the development process was also discussed. (USGS)

  19. Surviving OR computerization.

    PubMed

    Beach, Myra Jo; Sions, Jacqueline A

    2011-02-01

    In 2007, a steering committee at West Virginia University Hospitals, Morgantown, began a three-year, accelerated design, computer implementation project to institute an automated perioperative record. The process included budgeting, selecting a vendor, designing and building the system, educating perioperative staff members, implementing the system, and re-evaluating the system for upgrades. Important steps in designing and building the system included mapping patient care and documentation processes, assessing software and hardware needs, and creating a new preference card system and surgical scheduling system. Staff members were educated to use the new computer applications via contests, inservice programs, hands-on learning modules, and a preimplementation rehearsal. Role-based security ensures that staff members are granted access to the computer applications they need to perform the work defined by their scope of practice. Planning ensures that the computer system will be maintained and enhanced over time. Copyright © 2011 AORN, Inc. Published by Elsevier Inc. All rights reserved.

  20. Autonomous Navigation by a Mobile Robot

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terrance; Aghazarian, Hrand

    2005-01-01

    ROAMAN is a computer program for autonomous navigation of a mobile robot on a long (as much as hundreds of meters) traversal of terrain. Developed for use aboard a robotic vehicle (rover) exploring the surface of a remote planet, ROAMAN could also be adapted to similar use on terrestrial mobile robots. ROAMAN implements a combination of algorithms for (1) long-range path planning based on images acquired by mast-mounted, wide-baseline stereoscopic cameras, and (2) local path planning based on images acquired by body-mounted, narrow-baseline stereoscopic cameras. The long-range path-planning algorithm autonomously generates a series of waypoints that are passed to the local path-planning algorithm, which plans obstacle-avoiding legs between the waypoints. Both the long- and short-range algorithms use an occupancy-grid representation in computations to detect obstacles and plan paths. Maps that are maintained by the long- and short-range portions of the software are not shared because substantial localization errors can accumulate during any long traverse. ROAMAN is not guaranteed to generate an optimal shortest path, but does maintain the safety of the rover.
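    The occupancy-grid idea described above can be illustrated with a minimal sketch (hypothetical, not the actual ROAMAN code): cells detected as obstacles are marked occupied, and a breadth-first search plans a shortest obstacle-avoiding leg between two waypoints.

```python
# Minimal occupancy-grid path planner: BFS between two waypoints.
from collections import deque

def plan_leg(grid, start, goal):
    """Return a list of (row, col) cells from start to goal avoiding occupied (1) cells."""
    rows, cols = len(grid), len(grid[0])
    parent = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []                       # reconstruct by walking parents back
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in parent:
                parent[(nr, nc)] = cell
                queue.append((nr, nc))
    return None                             # goal unreachable

grid = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],   # 1 = obstacle cells, e.g. flagged by stereo ranging
    [0, 0, 0, 0],
]
path = plan_leg(grid, (0, 0), (2, 3))
```

A real rover planner adds cost weighting, vehicle footprint, and localization error handling, but the grid-plus-search structure is the same.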

  1. Navigation of military and space unmanned ground vehicles in unstructured terrains

    NASA Technical Reports Server (NTRS)

    Lescoe, Paul; Lavery, David; Bedard, Roger

    1991-01-01

    Development of unmanned vehicles for local navigation in terrains unstructured by humans is reviewed. Modes of navigation include teleoperation or remote control, computer assisted remote driving (CARD), and semiautonomous navigation (SAN). A first implementation of a CARD system was successfully tested using the Robotic Technology Test Vehicle developed by Jet Propulsion Laboratory. Stereo pictures were transmitted to a remotely located human operator, who performed the sensing, perception, and planning functions of navigation. A computer provided range and angle measurements and the path plan was transmitted to the vehicle which autonomously executed the path. This implementation is to be enhanced by providing passive stereo vision and a reflex control system for autonomously stopping the vehicle if blocked by an obstacle. SAN achievements include implementation of a navigation testbed on a six wheel, three-body articulated rover vehicle, development of SAN algorithms and code, integration of SAN software onto the vehicle, and a successful feasibility demonstration that represents a step forward towards the technology required for long-range exploration of the lunar or Martian surface. The vehicle includes a passive stereo vision system with real-time area-based stereo image correlation, a terrain matcher, a path planner, and a path execution planner.

  2. Report of the AAPM Task Group No. 105: Issues associated with clinical implementation of Monte Carlo-based photon and electron external beam treatment planning.

    PubMed

    Chetty, Indrin J; Curran, Bruce; Cygler, Joanna E; DeMarco, John J; Ezzell, Gary; Faddegon, Bruce A; Kawrakow, Iwan; Keall, Paul J; Liu, Helen; Ma, C M Charlie; Rogers, D W O; Seuntjens, Jan; Sheikh-Bagheri, Daryoush; Siebers, Jeffrey V

    2007-12-01

    The Monte Carlo (MC) method has been shown through many research studies to calculate accurate dose distributions for clinical radiotherapy, particularly in heterogeneous patient tissues where the effects of electron transport cannot be accurately handled with conventional, deterministic dose algorithms. Despite its proven accuracy and the potential for improved dose distributions to influence treatment outcomes, the long calculation times previously associated with MC simulation rendered this method impractical for routine clinical treatment planning. However, the development of faster codes optimized for radiotherapy calculations and improvements in computer processor technology have substantially reduced calculation times to, in some instances, within minutes on a single processor. These advances have motivated several major treatment planning system vendors to embark upon the path of MC techniques. Several commercial vendors have already released or are currently in the process of releasing MC algorithms for photon and/or electron beam treatment planning. Consequently, the accessibility and use of MC treatment planning algorithms may well become widespread in the radiotherapy community. With MC simulation, dose is computed stochastically using first principles; this method is therefore quite different from conventional dose algorithms. Issues such as statistical uncertainties, the use of variance reduction techniques, the ability to account for geometric details in the accelerator treatment head simulation, and other features, are all unique components of a MC treatment planning algorithm. Successful implementation by the clinical physicist of such a system will require an understanding of the basic principles of MC techniques. 
The purpose of this report, while providing education and review on the use of MC simulation in radiotherapy planning, is to set out, for both users and developers, the salient issues associated with clinical implementation and experimental verification of MC dose algorithms. As the MC method is an emerging technology, this report is not meant to be prescriptive. Rather, it is intended as a preliminary report to review the tenets of the MC method and to provide the framework upon which to build a comprehensive program for commissioning and routine quality assurance of MC-based treatment planning systems.

  3. Concept of Smart Cyberspace for Smart Grid Implementation

    NASA Astrophysics Data System (ADS)

    Zhukovskiy, Y.; Malov, D.

    2018-05-01

    The concept of Smart Cyberspace for Smart Grid (SG) implementation is presented in the paper. Three classifications are proposed: a classification of electromechanical units based on the amount of data analysed; a classification of electromechanical units based on data processing speed; and a classification of computational network organization based on required resources. The combination of these classifications is formalized and can be further used in the organization and planning of an SG.

  4. Research on Building Education & Workforce Capacity in Systems Engineering

    DTIC Science & Technology

    2011-10-31

    product or prototype that addresses a real DoD need. Implemented as pilot courses in eight civilian and six military universities affiliated with...Engineering 1 1.1 Computer Engineering 1 1.1 Operations Research 1 1.1 Product Architecture 1 1.1 Total 93 100.0 Table 7: Breakdown of Student... product specifications, inattention to budget limits and safety issues, inattention to product life cycle, poor implementation of risk management plans

  5. Process Integrated Mechanism for Human-Computer Collaboration and Coordination

    DTIC Science & Technology

    2012-09-12

    system we implemented the TAFLib library that provides the communication with TAF. The data received from the TAF server is collected in a data structure...send new commands and flight plans for the UAVs to the TAF server. Test scenarios: Several scenarios have been implemented to test and prove our...areas. Shooting Enemies: The basic scenario proved the successful integration of PIM and the TAF simulation environment. Subsequently we improved the CP

  6. Two criteria for the selection of assembly plans - Maximizing the flexibility of sequencing the assembly tasks and minimizing the assembly time through parallel execution of assembly tasks

    NASA Technical Reports Server (NTRS)

    Homem De Mello, Luiz S.; Sanderson, Arthur C.

    1991-01-01

    The authors introduce two criteria for the evaluation and selection of assembly plans. The first criterion is to maximize the number of different sequences in which the assembly tasks can be executed. The second criterion is to minimize the total assembly time through simultaneous execution of assembly tasks. An algorithm that performs a heuristic search for the best assembly plan over the AND/OR graph representation of assembly plans is discussed. Admissible heuristics for each of the two criteria introduced are presented. Some implementation issues that affect the computational efficiency are addressed.

  7. Computers in the examination room and the electronic health record: physicians' perceived impact on clinical encounters before and after full installation and implementation.

    PubMed

    Doyle, Richard J; Wang, Nina; Anthony, David; Borkan, Jeffrey; Shield, Renee R; Goldman, Roberta E

    2012-10-01

    We compared physicians' self-reported attitudes and behaviours regarding electronic health record (EHR) use before and after installation of computers in patient examination rooms and the transition to full implementation of an EHR in a family medicine training practice, to identify the anticipated and observed effects of these changes on physicians' practices and clinical encounters. We conducted two individual qualitative interviews with family physicians: the first before, and the second 8 months after, full implementation of the EHR and installation of computers in the examination rooms. Data were analysed through project team discussions and subsequent coding with qualitative analysis software. In the first interviews, physicians frequently expressed concerns about the potential negative effect of the EHR on quality of care and physician-patient interaction, the adequacy of their skills in EHR use, and privacy and confidentiality. Nevertheless, most physicians also anticipated multiple benefits, including improved accessibility of patient data and online health information. In the second interviews, physicians reported that their concerns had not persisted. Many anticipated benefits were realized, appearing to facilitate collaborative physician-patient relationships. Physicians reported a greater teaching role with patients, sharing online medical information and treatment plan decisions. Before computer installation and full EHR implementation, physicians expressed concerns about the impact of computer use on patient care. After installation and implementation, however, many concerns were mitigated. Using computers in the examination rooms to document and access patients' records, along with online medical information and decision-making tools, appears to contribute to improved physician-patient communication and collaboration.

  8. Analysis of reference transactions using packaged computer programs.

    PubMed

    Calabretta, N; Ross, R

    1984-01-01

    Motivated by a continuing education class attended by the authors on the measurement of reference desk activities, the reference department at Scott Memorial Library initiated a project to gather data on reference desk transactions and to analyze the data by using packaged computer programs. The programs utilized for the project were SPSS (Statistical Package for the Social Sciences) and SAS (Statistical Analysis System). The planning, implementation and development of the project are described.

  9. 75 FR 16026 - Approval and Promulgation of State Implementation Plan Revisions; State of North Dakota; Air...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-31

    ... values for two monitors in Missoula County were 11.1, 11.4 and 11.8 [mu]g/m\\3\\. Computed from AQS... domain * * *''). \\16\\ The 400 mile distance to the nonattainment area is calculated from St. Cloud, and...

  10. Providing Access to Library Automation Systems for Students with Disabilities.

    ERIC Educational Resources Information Center

    California Community Colleges, Sacramento. High-Tech Center for the Disabled.

    This document provides information on the integration of assistive computer technologies and library automation systems at California Community Colleges in order to ensure access for students with disabilities. Topics covered include planning, upgrading, purchasing, implementing and using these technologies with library systems. As information…

  11. Multi-Protocol LAN Design and Implementation: A Case Study.

    ERIC Educational Resources Information Center

    Hazari, Sunil

    1995-01-01

    Reports on the installation of a local area network (LAN) at East Carolina University. Topics include designing the network; computer labs and electronic mail; Internet connectivity; LAN expenses; and recommendations on planning, equipment, administration, and training. A glossary of networking terms is also provided. (AEF)

  12. Personnel Requirements Consideration in Major Weapon System Acquisition. Research Planning Report

    DTIC Science & Technology

    1980-03-01

    of scientific study of human attributes associated with job performance. Sir Francis Galton (1822-1911) is considered the father of engineering...analysis of nonrelevant factors. Implementation of this plan will result in construction and pilot testing of a computer based branching decision...logic in 1980-81 and field testing and evaluation in 1981-82. ... Finch, F.L., Rigg, K.E. and

  13. Implementation and evaluation of various demons deformable image registration algorithms on a GPU.

    PubMed

    Gu, Xuejun; Pan, Hubert; Liang, Yun; Castillo, Richard; Yang, Deshan; Choi, Dongju; Castillo, Edward; Majumdar, Amitava; Guerrero, Thomas; Jiang, Steve B

    2010-01-07

    Online adaptive radiation therapy (ART) promises the ability to deliver an optimal treatment in response to daily patient anatomic variation. A major technical barrier to the clinical implementation of online ART is the requirement of rapid image segmentation. Deformable image registration (DIR) has been used as an automated segmentation method to transfer tumor/organ contours from the planning image to daily images. However, the current computation time of DIR is too long for online ART. In this work, this issue is addressed by using computer graphics processing units (GPUs). A gray-scale-based DIR algorithm called demons and five of its variants were implemented on GPUs using the compute unified device architecture (CUDA) programming environment. The spatial accuracy of these algorithms was evaluated over five sets of pulmonary 4D CT images, each with an average size of 256 x 256 x 100 and more than 1100 expert-determined landmark point pairs. For all the testing scenarios presented in this paper, the GPU-based DIR computation required around 7 to 11 s to yield an average 3D error ranging from 1.5 to 1.8 mm. Interestingly, the original passive-force demons algorithm outperforms the subsequently proposed variants on the combination of accuracy, efficiency, and ease of implementation.
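    As background, one iteration of the classic passive-force demons update evaluated in the paper can be sketched in a few lines of NumPy. This is a toy 2D illustration only; the GPU implementations compared in the study perform the same arithmetic in CUDA over full 3D volumes, with Gaussian smoothing between iterations.

```python
# Toy 2D sketch of one passive-force demons displacement update:
#   u = (m - f) * grad(f) / (|grad(f)|^2 + (m - f)^2)
import numpy as np

def demons_update(fixed, moving, eps=1e-12):
    """One demons displacement update (Thirion's passive-force form) in 2D."""
    gy, gx = np.gradient(fixed)             # gradients of the fixed image (rows, cols)
    diff = moving - fixed                   # intensity mismatch drives the force
    denom = gx**2 + gy**2 + diff**2 + eps   # regularizing denominator
    ux = diff * gx / denom                  # displacement, x (column) component
    uy = diff * gy / denom                  # displacement, y (row) component
    return ux, uy

fixed = np.zeros((8, 8))
fixed[2:6, 2:6] = 1.0                       # a bright square
moving = np.roll(fixed, 1, axis=1)          # the same square shifted one pixel right
ux, uy = demons_update(fixed, moving)
```

In a full registration loop this update is applied, smoothed, and composed repeatedly until the moving image is warped onto the fixed image.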

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, C

    Purpose: To implement a novel, automatic, institutionally customizable DVH quantities evaluation and PDF report tool on the Philips Pinnacle treatment planning system (TPS). Methods: We developed an add-on program (P3DVHStats) to enable automatic evaluation of DVH quantities (including both volume- and dose-based quantities, such as V98, V100, and D2) and automatic PDF report generation, for EMR convenience. The implementation is based on a combination of the Philips Pinnacle scripting tool and the Java language pre-installed on each Pinnacle Sun Solaris workstation. A single Pinnacle script provides users convenient access to the program when needed. The activated script first exports DVH data for user-selected ROIs from the current Pinnacle plan trial; a Java program then provides a simple GUI, uses the data to compute any user-requested DVH quantities, and compares them with preset institutional DVH planning goals; if the user accepts the results, the program also generates a PDF report and exports it from Pinnacle to the EMR import folder via FTP. Results: The program was tested thoroughly and has been released for clinical use at our institution (Pinnacle Enterprise server with both thin-client and P3PC access) for all dosimetry and physics staff, with excellent feedback. It used to take a few minutes with an MS-Excel worksheet to calculate these DVH quantities for IMRT/VMAT plans and manually save them as a PDF report; with the new program, the same tasks take a few mouse clicks and less than 30 seconds. Conclusion: A Pinnacle scripting and Java language based program was successfully implemented and customized to our institutional needs. It dramatically reduces the time and effort needed for DVH quantity computation and EMR reporting.
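    For illustration, quantities such as V100 and D2 are simple interpolations on a cumulative DVH curve. The sketch below is hypothetical (the function names and the toy DVH are invented, not part of P3DVHStats): Vx is the percent volume receiving at least x Gy, and Dx is the minimum dose received by the hottest x% of the volume.

```python
# Hypothetical DVH-quantity sketch on a toy cumulative DVH.
import numpy as np

def v_at_dose(dose_bins, cum_volume_pct, dose):
    """Vx: percent of the structure volume receiving at least `dose` Gy."""
    return float(np.interp(dose, dose_bins, cum_volume_pct))

def d_at_volume(dose_bins, cum_volume_pct, volume_pct):
    """Dx: minimum dose received by the hottest `volume_pct` of the volume."""
    # cumulative volume decreases with dose, so reverse both arrays so that
    # np.interp sees an increasing x-axis
    return float(np.interp(volume_pct, cum_volume_pct[::-1], dose_bins[::-1]))

# Toy cumulative DVH: 100% of volume above 0 Gy, falling linearly to 0% at 70 Gy
dose_bins = np.linspace(0.0, 70.0, 71)
cum_volume_pct = np.clip(100.0 * (1 - (dose_bins - 50.0) / 20.0), 0.0, 100.0)

v50 = v_at_dose(dose_bins, cum_volume_pct, 50.0)   # volume at 50 Gy -> 100%
d2 = d_at_volume(dose_bins, cum_volume_pct, 2.0)   # dose to hottest 2% -> 69.6 Gy
```

Comparing such values against per-structure planning goals is then a dictionary lookup and a threshold check, which is essentially what the reported script automates.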

  15. SU-D-BRD-02: A Web-Based Image Processing and Plan Evaluation Platform (WIPPEP) for Future Cloud-Based Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chai, X; Liu, L; Xing, L

    Purpose: Visualization and processing of medical images and radiation treatment plan evaluation have traditionally been constrained to local workstations with limited computation power and limited ability to share data and update software. We present a web-based image processing and plan evaluation platform (WIPPEP) for radiotherapy applications with high efficiency, ubiquitous web access, and real-time data sharing. Methods: This software platform consists of three parts: a web server, an image server, and a computation server. Each independent server communicates with the others through HTTP requests. The web server is the key component: it provides visualization and the user interface through front-end web browsers and relays information to the back end to process user requests. The image server serves as a PACS system. The computation server performs the actual image processing and dose calculation. The web server back end is developed using Java Servlets and the front end using HTML5, JavaScript, and jQuery. The image server is based on the open-source DCM4CHEE PACS system. The computation server can be written in any programming language as long as it can send/receive HTTP requests; our computation servers were implemented in Delphi, Python, and PHP, which can process data directly or via a C++ DLL. Results: This software platform runs on a 32-core CPU server virtually hosting the web server, image server, and computation servers separately. Users can visit our internal website with the Chrome browser, select a specific patient, visualize the images and RT structures belonging to that patient, and perform image segmentation on the Delphi computation server and Monte Carlo dose calculation on the Python or PHP computation server. Conclusion: We have developed a web-based image processing and plan evaluation platform prototype for radiotherapy. This system has clearly demonstrated the feasibility of performing image processing and plan evaluation through a web browser and exhibited potential for future cloud-based radiotherapy.

  16. Funding for Library Automation.

    ERIC Educational Resources Information Center

    Thompson, Ronelle K. H.

    This paper provides a brief overview of planning and implementing a project to fund library automation. It is suggested that: (1) proposal budgets should include all costs of a project, such as furniture needed for computer terminals, costs for modifying library procedures, initial supplies, or ongoing maintenance; (2) automation does not save…

  17. Long-Range Budget Planning in Private Colleges and Universities

    ERIC Educational Resources Information Center

    Hopkins, David S. P.; Massy, William F.

    1977-01-01

    Computer models have greatly assisted budget planners in privately financed institutions to identify and analyze major financial problems. The implementation of such a model at Stanford University is described that considers student aid expenses, indirect cost recovery, endowments, price elasticity of enrollment, and student/faculty ratios.…

  18. A Security Checklist for ERP Implementations

    ERIC Educational Resources Information Center

    Hughes, Joy R.; Beer, Robert

    2007-01-01

    The EDUCAUSE/Internet2 Computer and Network Security Task Force consulted with IT security professionals on campus about concerns with the current state of security in enterprise resource planning (ERP) systems. From these conversations, it was clear that security issues generally fell into one of two areas: (1) It has become extremely difficult…

  19. Application of AI techniques to a voice-actuated computer system for reconstructing and displaying magnetic resonance imaging data

    NASA Astrophysics Data System (ADS)

    Sherley, Patrick L.; Pujol, Alfonso, Jr.; Meadow, John S.

    1990-07-01

    To provide a means of rendering complex computer architectures, languages, and input/output modalities transparent to experienced and inexperienced users, research is being conducted to develop a voice-driven/voice-response computer graphics imaging system. The system will be used for reconstructing and displaying computed tomography and magnetic resonance imaging scan data. In conjunction with this study, an artificial intelligence (AI) control strategy was developed to interface the voice components and support software to the computer graphics functions implemented on the Sun Microsystems 4/280 color graphics workstation. Based on generated text and converted renditions of verbal utterances by the user, the AI control strategy determines the user's intent and develops and validates a plan. The program type and parameters within the plan are used as input to the graphics system for reconstructing and displaying medical image data corresponding to the perceived intent. If the plan is not valid, the control strategy queries the user for additional information. The control strategy operates in a conversation mode and vocally provides system status reports. A detailed examination of the various AI techniques is presented, with major emphasis placed on their specific roles within the total control strategy structure.

  20. Concepts, requirements, and design approaches for building successful planning and scheduling systems

    NASA Technical Reports Server (NTRS)

    Hornstein, Rhoda Shaller; Willoughby, John K.

    1991-01-01

Traditional practice of systems engineering management assumes that requirements can be precisely determined and unambiguously defined prior to system design and implementation; practice further assumes that requirements are held static during implementation. Human-computer decision support systems for service planning and scheduling applications do not conform well to these assumptions. Adaptations to the traditional practice of systems engineering management are required. Basic technology exists to support these adaptations. Additional innovations must be encouraged and nurtured. Continued partnership between the programmatic and technical perspectives assures proper balance of the impossible with the possible. Past problems have the following origins: not recognizing the unusual and perverse nature of the requirements for planning and scheduling; not recognizing the best starting-point assumptions for the design; not understanding the type of system being built; and not understanding the design consequences of the operations concept selected.

  1. Automating ATLAS Computing Operations using the Site Status Board

    NASA Astrophysics Data System (ADS)

    Andreeva, J.; Borrego Iglesias, C.; Campana, S.; Di Girolamo, A.; Dzhunov, I.; Espinal Curull, X.; Gayazov, S.; Magradze, E.; Nowotka, M.; Rinaldi, L.; Saiz, P.; Schovancova, J.; Stewart, G. A.; Wright, M.

    2012-12-01

    The automation of operations is essential to reduce manpower costs and improve the reliability of the system. The Site Status Board (SSB) is a framework which allows Virtual Organizations to monitor their computing activities at distributed sites and to evaluate site performance. The ATLAS experiment intensively uses the SSB for the distributed computing shifts, for estimating data processing and data transfer efficiencies at a particular site, and for implementing automatic exclusion of sites from computing activities, in case of potential problems. The ATLAS SSB provides a real-time aggregated monitoring view and keeps the history of the monitoring metrics. Based on this history, usability of a site from the perspective of ATLAS is calculated. The paper will describe how the SSB is integrated in the ATLAS operations and computing infrastructure and will cover implementation details of the ATLAS SSB sensors and alarm system, based on the information in the SSB. It will demonstrate the positive impact of the use of the SSB on the overall performance of ATLAS computing activities and will overview future plans.
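The site-usability calculation described above can be sketched in code: from a history of monitoring samples, compute the fraction of time a site spent in an OK state, and flag it for automatic exclusion below a threshold. The status names and the 90% threshold are illustrative assumptions, not the actual ATLAS SSB metrics.

```python
# Hypothetical sketch of SSB-style site usability: the fraction of monitored
# time a site spent in an "OK" state, with an auto-exclusion check.
# Status labels and the 0.9 threshold are illustrative, not ATLAS's real values.

def usability(history):
    """history: list of (duration_hours, status) tuples, status in {"OK", "ERROR"}."""
    total = sum(d for d, _ in history)
    ok = sum(d for d, s in history if s == "OK")
    return ok / total if total else 0.0

def should_exclude(history, threshold=0.9):
    """Exclude a site from computing activities if its usability is too low."""
    return usability(history) < threshold

site_history = [(20, "OK"), (2, "ERROR"), (2, "OK")]
print(round(usability(site_history), 3))  # fraction of time in OK state
print(should_exclude(site_history))
```

In the real system this calculation is driven by the aggregated metric history the SSB keeps; here a flat list of samples stands in for it.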

  2. Enlist micros: Training science teachers to use microcomputers

    NASA Astrophysics Data System (ADS)

    Baird, William E.; Ellis, James D.; Kuerbis, Paul J.

    A National Science Foundation grant to the Biological Sciences Curriculum Study (BSCS) at The Colorado College supported the design and production of training materials to encourage literacy of science teachers in the use of microcomputers. ENLIST Micros is based on results of a national needs assessment that identified 22 competencies needed by K-12 science teachers to use microcomputers for instruction. A writing team developed the 16-hour training program in the summer of 1985, and field-test coordinators tested it with 18 preservice or in-service groups during the 1985-86 academic year at 15 sites within the United States. The training materials consist of video programs, interactive computer disks for the Apple II series microcomputer, a training manual for participants, and a guide for the group leader. The experimental materials address major areas of educational computing: awareness, applications, implementation, evaluation, and resources. Each chapter contains activities developed for this program, such as viewing video segments of science teachers who are using computers effectively and running commercial science and training courseware. Role playing and small-group interaction help the teachers overcome their reluctance to use computers and plan for effective implementation of microcomputers in the school. This study examines the implementation of educational computing among 47 science teachers who completed the ENLIST Micros training at a southern university. We present results of formative evaluation for that site. Results indicate that both elementary and secondary teachers benefit from the training program and demonstrate gains in attitudes toward computer use. Participating teachers said that the program met its stated objectives and helped them obtain needed skills. Only 33 percent of these teachers, however, reported using computers one year after the training.
In June 1986, the BSCS initiated a follow-up to the ENLIST Micros curriculum to develop, evaluate, and disseminate a complete model of teacher enhancement for educational computing in the sciences. In that project, we use the ENLIST Micros curriculum as the first step in a training process. The project includes seminars that introduce additional skills: it contains provisions for sharing among participants, monitors use of computers in participants' classrooms, provides structured coaching of participants' use of computers in their classrooms, and offers planned observations of peers using computers in their science teaching.

  3. IAIMS development at Harvard Medical School.

    PubMed Central

    Barnett, G O; Greenes, R A; Zielstorff, R D

    1988-01-01

    The long-range goal of this IAIMS development project is to achieve an Integrated Academic Information Management System for the Harvard Medical School, the Francis A. Countway Library of Medicine, and Harvard's affiliated institutions and their respective libraries. An "opportunistic, incremental" approach to planning has been devised. The projects selected for the initial phase are to implement an increasingly powerful electronic communications network, to encourage the use of a variety of bibliographic and information access techniques, and to begin an ambitious program of faculty and student education in computer science and its applications to medical education, medical care, and research. In addition, we will explore means to promote better collaboration among the separate computer science units in the various schools and hospitals. We believe that our planning approach will have relevance to other educational institutions where lack of strong central organizational control prevents a "top-down" approach to planning. PMID:3416098

  4. Cost-Benefit Arbitration Between Multiple Reinforcement-Learning Systems.

    PubMed

    Kool, Wouter; Gershman, Samuel J; Cushman, Fiery A

    2017-09-01

    Human behavior is sometimes determined by habit and other times by goal-directed planning. Modern reinforcement-learning theories formalize this distinction as a competition between a computationally cheap but inaccurate model-free system that gives rise to habits and a computationally expensive but accurate model-based system that implements planning. It is unclear, however, how people choose to allocate control between these systems. Here, we propose that arbitration occurs by comparing each system's task-specific costs and benefits. To investigate this proposal, we conducted two experiments showing that people increase model-based control when it achieves greater accuracy than model-free control, and especially when the rewards of accurate performance are amplified. In contrast, they are insensitive to reward amplification when model-based and model-free control yield equivalent accuracy. This suggests that humans adaptively balance habitual and planned action through on-line cost-benefit analysis.
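The arbitration proposal in this abstract lends itself to a one-line decision rule: pick whichever system maximizes expected reward minus its control cost. The sketch below is an illustrative rendering of that idea; all numbers (accuracies, costs, reward scale) are hypothetical, not parameters from the study.

```python
# Illustrative cost-benefit arbitration between a cheap model-free controller
# and an expensive model-based one: choose the system whose expected reward,
# net of its control cost, is higher. All numeric values are hypothetical.

def arbitrate(acc_mb, acc_mf, reward, cost_mb, cost_mf=0.0):
    """Return which controller wins the task-specific cost-benefit comparison."""
    value_mb = acc_mb * reward - cost_mb  # accurate but costly planning
    value_mf = acc_mf * reward - cost_mf  # cheap but less accurate habit
    return "model-based" if value_mb > value_mf else "model-free"

# When model-based control is more accurate and rewards are amplified,
# its benefit outweighs its cost:
print(arbitrate(acc_mb=0.9, acc_mf=0.6, reward=10.0, cost_mb=1.0))  # model-based
# With equivalent accuracies, the extra cost makes planning not worthwhile:
print(arbitrate(acc_mb=0.6, acc_mf=0.6, reward=10.0, cost_mb=1.0))  # model-free
```

This mirrors the two experimental findings: reward amplification shifts control toward the model-based system only when it buys extra accuracy.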

  5. Computer literacy and attitudes of dental students and staff at the University of the West Indies Dental School.

    PubMed

    Smith, W; Bedayse, S; Lalwah, S L; Paryag, A

    2009-08-01

    The University of the West Indies (UWI) Dental School is planning to implement computer-based information systems to manage student and patient data. In order to measure the acceptance of the proposed implementation and to determine the degree of training that would be required, a survey was undertaken of the computer literacy and attitude of all staff and students. Data were collected via 230 questionnaires from all staff and students. A 78% response rate was obtained. The computer literacy of the majority of respondents was ranked as 'more than adequate' compared to other European Dental Schools. Respondents < 50 years had significantly higher computer literacy scores than older age groups (P < 0.05). Similarly, respondents who owned an email address, a computer, or were members of online social networking sites had significantly higher computer literacy scores than those who did not (P < 0.05). Sex, nationality and whether the respondent was student/staff were not significant factors. Most respondents felt that computer literacy should be a part of every modern undergraduate curriculum; that computer assisted learning applications and web-based learning activity could effectively supplement the traditional undergraduate curriculum and that a suitable information system would improve the efficiency in the school's management of students, teaching and clinics. The implementation of a computer-based information system is likely to have widespread acceptance among students and staff at the UWI Dental School. The computer literacy of the students and staff is on par with that of schools in the US and Europe.

  6. Treatment Planning and Image Guidance for Radiofrequency Ablations of Large Tumors

    PubMed Central

    Ren, Hongliang; Campos-Nanez, Enrique; Yaniv, Ziv; Banovac, Filip; Abeledo, Hernan; Hata, Nobuhiko; Cleary, Kevin

    2014-01-01

    This article addresses the two key challenges in computer-assisted percutaneous tumor ablation: planning multiple overlapping ablations for large tumors while avoiding critical structures, and executing the prescribed plan. Towards semi-automatic treatment planning for image-guided surgical interventions, we develop a systematic approach to the needle-based ablation placement task, ranging from pre-operative planning algorithms to an intra-operative execution platform. The planning system incorporates clinical constraints on ablations and trajectories using a multiple objective optimization formulation, which consists of optimal path selection and ablation coverage optimization based on integer programming. The system implementation is presented and validated in phantom studies and on an animal model. The presented system can potentially be further extended for other ablation techniques such as cryotherapy. PMID:24235279
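The coverage-optimization idea behind planning multiple overlapping ablations can be illustrated with a toy version: choose ablation centers so that spheres of a fixed radius cover every tumor voxel. The paper formulates this with integer programming; the sketch below substitutes a simple greedy set-cover heuristic, and all geometry is hypothetical.

```python
# Toy greedy stand-in for ablation coverage optimization: pick ablation
# centers (spheres of fixed radius) until every tumor voxel is covered.
# The actual system uses an integer-programming formulation; this greedy
# heuristic only illustrates the set-cover structure of the problem.

def greedy_ablation_cover(tumor_voxels, candidate_centers, radius):
    r2 = radius ** 2
    # Precompute which voxels each candidate ablation sphere would cover.
    covered_by = {
        c: {v for v in tumor_voxels
            if sum((a - b) ** 2 for a, b in zip(v, c)) <= r2}
        for c in candidate_centers
    }
    uncovered, plan = set(tumor_voxels), []
    while uncovered:
        best = max(candidate_centers, key=lambda c: len(covered_by[c] & uncovered))
        if not covered_by[best] & uncovered:
            raise ValueError("remaining voxels cannot be covered")
        plan.append(best)
        uncovered -= covered_by[best]
    return plan

tumor = [(x, 0, 0) for x in range(6)]        # a line of tumor voxels
centers = [(1, 0, 0), (4, 0, 0), (2, 0, 0)]  # hypothetical candidate centers
print(greedy_ablation_cover(tumor, centers, radius=2))
```

A real planner would additionally penalize trajectories through critical structures, which is where the multiple-objective formulation in the paper comes in.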

  7. An overview of the evaluation plan for PC/MISI: PC-based Multiple Information System Interface

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Lim, Bee Lee; Hall, Philip P.

    1985-01-01

    An initial evaluation plan for the personal computer multiple information system interface (PC/MISI) project is discussed. The document is intended to be used as a blueprint for the evaluation of this system. Each objective of the design project is discussed, along with the evaluation parameters and methodology to be used in evaluating the implementation's achievement of those objectives. The potential of the system for research activities related to more general aspects of information retrieval is also discussed.

  8. System cost performance analysis (study 2.3). Volume 1: Executive summary. [unmanned automated payload programs and program planning

    NASA Technical Reports Server (NTRS)

    Campbell, B. H.

    1974-01-01

    A study is described which was initiated to identify and quantify the interrelationships between and within the performance, safety, cost, and schedule parameters for unmanned, automated payload programs. The result of the investigation was a systems cost/performance model which was implemented as a digital computer program and could be used to perform initial program planning, cost/performance tradeoffs, and sensitivity analyses for mission model and advanced payload studies. Program objectives and results are described briefly.

  9. SU-E-T-500: Initial Implementation of GPU-Based Particle Swarm Optimization for 4D IMRT Planning in Lung SBRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Modiri, A; Hagan, A; Gu, X

    Purpose: 4D-IMRT planning, combined with dynamic MLC tracking delivery, utilizes the temporal dimension as an additional degree of freedom to achieve improved OAR-sparing. The computational complexity of such optimization increases exponentially with dimensionality. In order to accomplish this task in a clinically feasible time frame, we present an initial implementation of GPU-based 4D-IMRT planning based on particle swarm optimization (PSO). Methods: The target and normal structures were manually contoured on ten phases of a 4DCT scan of a NSCLC patient with a 54cm3 right-lower-lobe tumor (1.5cm motion). Corresponding ten 3D-IMRT plans were created in the Eclipse treatment planning system (Ver-13.6). A vendor-provided scripting interface was used to export 3D dose matrices corresponding to each control point (10 phases × 9 beams × 166 control points = 14,940), which served as input to PSO. The optimization task was to iteratively adjust the weights of each control point and scale the corresponding dose matrices. In order to handle the large amount of data in GPU memory, dose matrices were sparsified and placed in contiguous memory blocks with the 14,940 weight variables. PSO was implemented on CPU (dual-Xeon, 3.1GHz) and GPU (dual-K20 Tesla, 2496 cores, 3.52Tflops each) platforms. NiftyReg, an open-source deformable image registration package, was used to calculate the summed dose. Results: The 4D-PSO plan yielded PTV coverage comparable to the clinical ITV-based plan and significantly higher OAR-sparing, as follows: lung Dmean=33%; lung V20=27%; spinal cord Dmax=26%; esophagus Dmax=42%; heart Dmax=0%; heart Dmean=47%. The GPU-PSO processing time for 14,940 variables and 7 PSO particles was 41% that of CPU-PSO (199 vs. 488 minutes). Conclusion: Truly 4D-IMRT planning can yield significant OAR dose-sparing while preserving PTV coverage. The corresponding optimization problem is large-scale, non-convex and computationally rigorous. Our initial results indicate that GPU-based PSO with further software optimization can make such planning clinically feasible. This work was supported through funding from the National Institutes of Health and Varian Medical Systems.
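The PSO machinery referenced above can be sketched in a few dozen lines: each particle holds a candidate vector of control-point weights and the swarm minimizes an objective function. The toy quadratic objective and the textbook PSO coefficients below are illustrative assumptions, not the plan-quality cost or parameters used in the study.

```python
import random

# Minimal particle swarm optimization (PSO) sketch: particles are candidate
# weight vectors, attracted toward their personal best and the global best.
# Coefficients (w, c1, c2) are common textbook values, purely illustrative.

def pso(objective, dim, n_particles=7, iters=200, seed=0):
    rng = random.Random(seed)
    pos = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                # each particle's best position
    gbest = min(pbest, key=objective)[:]       # swarm-wide best position
    w, c1, c2 = 0.7, 1.5, 1.5                  # inertia, cognitive, social terms
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if objective(pos[i]) < objective(pbest[i]):
                pbest[i] = pos[i][:]
                if objective(pbest[i]) < objective(gbest):
                    gbest = pbest[i][:]
    return gbest

# Toy stand-in for the plan-quality cost: optimum at weights = (0.5, 0.5, 0.5).
cost = lambda ws: sum((w - 0.5) ** 2 for w in ws)
best = pso(cost, dim=3)
print(round(cost(best), 6))
```

In the study the per-iteration cost evaluation (summing 14,940 scaled dose matrices) is what the GPU accelerates; the swarm logic itself is unchanged.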

  10. Multidirectional Scanning Model, MUSCLE, to Vectorize Raster Images with Straight Lines

    PubMed Central

    Karas, Ismail Rakip; Bayram, Bulent; Batuk, Fatmagul; Akay, Abdullah Emin; Baz, Ibrahim

    2008-01-01

    This paper presents a new model, MUSCLE (Multidirectional Scanning for Line Extraction), for automatic vectorization of raster images with straight lines. The algorithm of the model implements line thinning and simple neighborhood methods to perform vectorization. The model allows users to define criteria which are crucial to the vectorization process. In this model, various raster images can be vectorized, such as township plans, maps, architectural drawings, and machine plans. The algorithm of the model was implemented as a computer program and tested on a basic application. Results, verified by using two well-known vectorization programs (WinTopo and Scan2CAD), indicated that the model can successfully vectorize the specified raster data quickly and accurately. PMID:27879843

  11. Multidirectional Scanning Model, MUSCLE, to Vectorize Raster Images with Straight Lines.

    PubMed

    Karas, Ismail Rakip; Bayram, Bulent; Batuk, Fatmagul; Akay, Abdullah Emin; Baz, Ibrahim

    2008-04-15

    This paper presents a new model, MUSCLE (Multidirectional Scanning for Line Extraction), for automatic vectorization of raster images with straight lines. The algorithm of the model implements line thinning and simple neighborhood methods to perform vectorization. The model allows users to define criteria which are crucial to the vectorization process. In this model, various raster images can be vectorized, such as township plans, maps, architectural drawings, and machine plans. The algorithm of the model was implemented as a computer program and tested on a basic application. Results, verified by using two well-known vectorization programs (WinTopo and Scan2CAD), indicated that the model can successfully vectorize the specified raster data quickly and accurately.
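The core of any scanning-based vectorizer is turning runs of foreground pixels into line segments. The toy sketch below shows only single-direction (horizontal) run extraction on a binary raster; the actual MUSCLE model combines multidirectional scanning with line thinning, which is not reproduced here.

```python
# Toy raster-to-vector step: scan a binary raster row by row and merge each
# run of foreground pixels (1s) into a horizontal line segment.
# The full MUSCLE model scans in multiple directions and thins lines first.

def extract_horizontal_segments(raster):
    """Return (row, col_start, col_end) for each run of 1s in each row."""
    segments = []
    for r, row in enumerate(raster):
        start = None
        for c, pixel in enumerate(row + [0]):  # sentinel 0 closes a trailing run
            if pixel and start is None:
                start = c
            elif not pixel and start is not None:
                segments.append((r, start, c - 1))
                start = None
    return segments

raster = [
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
    [1, 1, 0, 1, 1],
]
print(extract_horizontal_segments(raster))
# [(0, 1, 3), (2, 0, 1), (2, 3, 4)]
```

Running the same pass over the transposed raster (and over diagonals) and merging the results is, in spirit, what "multidirectional scanning" adds.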

  12. Computer validation in toxicology: historical review for FDA and EPA good laboratory practice.

    PubMed

    Brodish, D L

    1998-01-01

    The application of computer validation principles to Good Laboratory Practice is a fairly recent phenomenon. As automated data collection systems have become more common in toxicology facilities, the U.S. Food and Drug Administration and the U.S. Environmental Protection Agency have begun to focus inspections in this area. This historical review documents the development of regulatory guidance on computer validation in toxicology over the past several decades. An overview of the components of a computer life cycle is presented, including the development of systems descriptions, validation plans, validation testing, system maintenance, SOPs, change control, security considerations, and system retirement. Examples are provided for implementation of computer validation principles on laboratory computer systems in a toxicology facility.

  13. Back to the future: virtualization of the computing environment at the W. M. Keck Observatory

    NASA Astrophysics Data System (ADS)

    McCann, Kevin L.; Birch, Denny A.; Holt, Jennifer M.; Randolph, William B.; Ward, Josephine A.

    2014-07-01

    Over its two decades of science operations, the W.M. Keck Observatory computing environment has evolved to contain a distributed hybrid mix of hundreds of servers, desktops and laptops of multiple different hardware platforms, O/S versions and vintages. Supporting the growing computing capabilities to meet the observatory's diverse, evolving computing demands within fixed budget constraints presents many challenges. This paper describes the significant role that virtualization is playing in addressing these challenges while improving the level and quality of service as well as realizing significant savings across many cost areas. Starting in December 2012, the observatory embarked on an ambitious plan to incrementally test and deploy a migration to virtualized platforms to address a broad range of specific opportunities. Implementation to date has been surprisingly glitch free, progressing well and yielding tangible benefits much faster than many expected. We describe here the general approach, starting with the initial identification of some low-hanging fruit, which also provided opportunity to gain experience and build confidence among both the implementation team and the user community. We describe the range of challenges, opportunities and cost savings potential. Very significant among these was the substantial power savings, which resulted in strong broad support for moving forward. We go on to describe the phasing plan, the evolving scalable architecture, some of the specific technical choices, as well as some of the individual technical issues encountered along the way. The phased implementation spans Windows and Unix servers for scientific, engineering and business operations, virtualized desktops for typical office users as well as the more demanding graphics-intensive CAD users. Other areas discussed in this paper include staff training, load balancing, redundancy, scalability, remote access, disaster readiness and recovery.

  14. A 2D ion chamber array audit of wedged and asymmetric fields in an inhomogeneous lung phantom.

    PubMed

    Lye, Jessica; Kenny, John; Lehmann, Joerg; Dunn, Leon; Kron, Tomas; Alves, Andrew; Cole, Andrew; Williams, Ivan

    2014-10-01

    The Australian Clinical Dosimetry Service (ACDS) has implemented a new method of a nonreference condition Level II type dosimetric audit of radiotherapy services to increase measurement accuracy and patient safety within Australia. The aim of this work is to describe the methodology, tolerances, and outcomes from the new audit. The ACDS Level II audit measures the dose delivered in 2D planes using an ionization chamber based array positioned at multiple depths. Measurements are made in rectilinear homogeneous and inhomogeneous phantoms composed of slabs of solid water and lung. Computer generated computed tomography data sets of the rectilinear phantoms are supplied to the facility prior to audit for planning of a range of cases including reference fields, asymmetric fields, and wedged fields. The audit assesses 3D planning with 6 MV photons with a static (zero degree) gantry. Scoring is performed using local dose differences between the planned and measured dose within 80% of the field width. The overall audit result is determined by the maximum dose difference over all scoring points, cases, and planes. Pass (Optimal Level) is defined as maximum dose difference ≤3.3%, Pass (Action Level) is ≤5.0%, and Fail (Out of Tolerance) is >5.0%. At close of 2013, the ACDS had performed 24 Level II audits. 63% of the audits passed, 33% failed, and the remaining audit was not assessable. Of the 15 audits that passed, 3 were at Pass (Action Level). The high fail rate is largely due to a systemic issue with modeling asymmetric 60° wedges which caused a delivered overdose of 5%-8%. The ACDS has implemented a nonreference condition Level II type audit, based on ion chamber 2D array measurements in an inhomogeneous slab phantom. The powerful diagnostic ability of this audit has allowed the ACDS to rigorously test the treatment planning systems implemented in Australian radiotherapy facilities. 
Recommendations from audits have led to facilities modifying clinical practice and changing planning protocols.
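The audit's scoring logic is stated explicitly above and can be rendered directly in code. The thresholds (≤3.3%, ≤5.0%, >5.0%) come from the abstract; the function name and interface are merely illustrative.

```python
# The ACDS Level II tolerance logic as stated in the abstract: the overall
# result is determined by the maximum local dose difference (in percent)
# over all scoring points, cases, and planes.

def audit_result(max_dose_diff_pct):
    if max_dose_diff_pct <= 3.3:
        return "Pass (Optimal Level)"
    if max_dose_diff_pct <= 5.0:
        return "Pass (Action Level)"
    return "Fail (Out of Tolerance)"

print(audit_result(2.1))  # Pass (Optimal Level)
print(audit_result(4.0))  # Pass (Action Level)
print(audit_result(6.5))  # Fail (Out of Tolerance)
```

The reported 5%-8% overdose from the asymmetric-wedge modeling issue thus lands squarely in the fail band, consistent with the 33% fail rate.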

  15. Microcomputers and the future of epidemiology.

    PubMed Central

    Dean, A G

    1994-01-01

    The Workshop on Microcomputers and the Future of Epidemiology was held March 8-9, 1993, at the Turner Conference Center, Atlanta, GA, with 130 public health professionals participating. The purpose of the workshop was to define microcomputer needs in epidemiology and to propose future initiatives. Thirteen groups representing public health disciplines defined their needs for better and more useful data, development of computer technology appropriate to epidemiology, user support and human infrastructure development, and global communication and planning. Initiatives proposed were demonstration of health surveillance systems, new software and hardware, computer-based training, projects to establish or improve data bases and community access to data bases, improved international communication, conferences on microcomputer use in particular disciplines, a suggestion to encourage competition in the production of public-domain software, and long-range global planning for epidemiologic computing and data management. Other interested groups are urged to study, modify, and implement those ideas. PMID:7910692

  16. Robotic Online Path Planning on Point Cloud.

    PubMed

    Liu, Ming

    2016-05-01

    This paper deals with the path-planning problem for mobile wheeled or tracked robots which drive in 2.5-D environments, where the traversable surface is usually considered as a 2-D manifold embedded in a 3-D ambient space. Specifically, we aim at solving the 2.5-D navigation problem using raw point clouds as input. The proposed method is independent of traditional surface parametrization or reconstruction methods, such as a meshing process, which generally has high computational complexity. Instead, we utilize the output of a 3-D tensor voting framework on the raw point clouds. The computation of tensor voting is accelerated by an optimized implementation on a graphics processing unit. Based on the tensor voting results, a novel local Riemannian metric is defined using the saliency components, which helps the modeling of the latent traversable surface. Using the proposed metric, we show experimentally that the geodesic in the 3-D tensor space leads to rational path-planning results. Compared to traditional methods, the results reveal the advantages of the proposed method in terms of smoothing the robot maneuver while considering the minimum travel distance.
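Once a metric is defined over the point cloud, finding the geodesic reduces to shortest-path search on a neighbor graph. The sketch below runs Dijkstra with edge costs weighted by a per-point "roughness" score; this score is a hypothetical stand-in for the saliency-based Riemannian metric in the paper, whose tensor-voting computation is not reproduced here.

```python
import heapq

# Shortest-path (Dijkstra) over a point-cloud neighbor graph, with edge costs
# inflated by per-node roughness. The roughness values are hypothetical
# stand-ins for the saliency-derived metric described in the paper.

def dijkstra(neighbors, cost, start, goal):
    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v in neighbors[u]:
            nd = d + cost(u, v)
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1], dist[goal]

# Tiny graph: node -> neighbors; roughness penalizes traversing node B.
neighbors = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"], "D": ["B", "C"]}
roughness = {"A": 1.0, "B": 5.0, "C": 1.0, "D": 1.0}
cost = lambda u, v: 0.5 * (roughness[u] + roughness[v])
path, total = dijkstra(neighbors, cost, "A", "D")
print(path, total)  # ['A', 'C', 'D'] 2.0 -- the planner detours around rough B
```

Swapping the toy roughness for a saliency-derived metric is what turns this generic search into the surface-aware planning the paper describes.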

  17. Computer-aided documentation. Quality, productivity, coding, and enhanced reimbursement.

    PubMed

    Foxlee, R H

    1993-10-01

    Physicians currently use technology, where appropriate, to improve patient care, for example, MRI and three-dimensional radiotherapy dose planning. One area that has seen limited benefit from current technology is the documentation of medical information. Review of related literature and directed interviews. Technology is available to assist in documenting the initial patient encounter. Patient care, quality of practice, and reimbursement may be improved with careful implementation. It will be worthwhile for practices to examine how to implement this technology to obtain the potential benefits.

  18. Systems cost/performance analysis (study 2.3). Volume 3: Programmer's manual and user's guide. [for unmanned spacecraft

    NASA Technical Reports Server (NTRS)

    Janz, R. F.

    1974-01-01

    The systems cost/performance model was implemented as a digital computer program to perform initial program planning, cost/performance tradeoffs, and sensitivity analyses. The computer program is described along with the operating environment in which it was written and checked, the program specifications such as discussions of logic and computational flow, the different subsystem models involved in the design of the spacecraft, and routines involved in the nondesign area such as costing and scheduling of the design. Preliminary results for the DSCS-II design are also included.

  19. Full Monte Carlo-Based Biologic Treatment Plan Optimization System for Intensity Modulated Carbon Ion Therapy on Graphics Processing Unit.

    PubMed

    Qin, Nan; Shen, Chenyang; Tsai, Min-Yu; Pinto, Marco; Tian, Zhen; Dedes, Georgios; Pompos, Arnold; Jiang, Steve B; Parodi, Katia; Jia, Xun

    2018-01-01

    One of the major benefits of carbon ion therapy is enhanced biological effectiveness at the Bragg peak region. For intensity modulated carbon ion therapy (IMCT), it is desirable to use Monte Carlo (MC) methods to compute the properties of each pencil beam spot for treatment planning, because of their accuracy in modeling physics processes and estimating biological effects. We previously developed goCMC, a graphics processing unit (GPU)-oriented MC engine for carbon ion therapy. The purpose of the present study was to build a biological treatment plan optimization system using goCMC. The repair-misrepair-fixation model was implemented to compute the spatial distribution of linear-quadratic model parameters for each spot. A treatment plan optimization module was developed to minimize the difference between the prescribed and actual biological effect. We used a gradient-based algorithm to solve the optimization problem. The system was embedded in the Varian Eclipse treatment planning system under a client-server architecture to achieve a user-friendly planning environment. We tested the system with a 1-dimensional homogeneous water case and 3 3-dimensional patient cases. Our system generated treatment plans with biological spread-out Bragg peaks covering the targeted regions and sparing critical structures. Using 4 NVidia GTX 1080 GPUs, the total computation time, including spot simulation, optimization, and final dose calculation, was 0.6 hour for the prostate case (8282 spots), 0.2 hour for the pancreas case (3795 spots), and 0.3 hour for the brain case (6724 spots). The computation time was dominated by MC spot simulation. We built a biological treatment plan optimization system for IMCT that performs simulations using a fast MC engine, goCMC. To the best of our knowledge, this is the first time that full MC-based IMCT inverse planning has been achieved in a clinically viable time frame. Copyright © 2017 Elsevier Inc. All rights reserved.
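The biological-effect optimization described above can be caricatured in a few lines: given per-spot linear-quadratic (LQ) parameters, adjust spot weights by gradient descent so the delivered effect E = αd + βd² matches a prescription in each voxel. The one-voxel-per-spot geometry, the α/β values, and the learning rate below are all hypothetical simplifications of the study's full MC-driven formulation.

```python
# Minimal sketch of gradient-based biological-effect optimization using the
# linear-quadratic (LQ) model E(d) = alpha*d + beta*d^2. Geometry is reduced
# to one voxel per spot; all numeric values are hypothetical.

ALPHA, BETA = 0.2, 0.02  # illustrative LQ parameters

def effect(dose):
    return ALPHA * dose + BETA * dose ** 2

def optimize_weights(doses_per_unit, prescribed, steps=2000, lr=0.05):
    """doses_per_unit[i]: dose in voxel i per unit weight of spot i."""
    w = [1.0] * len(doses_per_unit)
    for _ in range(steps):
        for i, dpu in enumerate(doses_per_unit):
            d = w[i] * dpu
            resid = effect(d) - prescribed[i]
            # chain rule: d(resid^2)/dw = 2*resid * (alpha + 2*beta*d) * dpu
            grad = 2 * resid * (ALPHA + 2 * BETA * d) * dpu
            w[i] = max(0.0, w[i] - lr * grad)  # weights stay non-negative
    return w

w = optimize_weights(doses_per_unit=[2.0, 1.0], prescribed=[1.0, 0.5])
achieved = [round(effect(wi * dpu), 3) for wi, dpu in zip(w, [2.0, 1.0])]
print(achieved)
```

In the real system the LQ parameters vary spatially (via the repair-misrepair-fixation model) and each spot contributes to many voxels, but the objective, a squared mismatch between prescribed and actual biological effect, has the same shape.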

  20. Towards constructing multi-bit binary adder based on Belousov-Zhabotinsky reaction

    NASA Astrophysics Data System (ADS)

    Zhang, Guo-Mao; Wong, Ieong; Chou, Meng-Ta; Zhao, Xin

    2012-04-01

    It has been proposed that spatial excitable media can perform a wide range of computational operations, from image processing, to path planning, to logical and arithmetic computations. Experimental realizations of chemical logical and arithmetic computation have mainly been concerned with single, simple logical functions. In this study, based on the Belousov-Zhabotinsky reaction, we performed simulations toward the realization of a more complex operation, the binary adder. Combining some of the existing functional structures that have been verified experimentally, we designed a planar geometrical binary adder chemical device. Through numerical simulations, we first demonstrated that the device can implement the function of a single-bit full binary adder. We then show that the binary adder units can be further extended in the plane and coupled together to realize a two-bit, or even multi-bit, binary adder. The realization of chemical adders can guide the construction of other sophisticated arithmetic functions, ultimately leading to the implementation of a chemical computer and other intelligent systems.
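The logical behavior the chemical device implements is standard digital-logic fare, sketched here in ordinary code: a single-bit full adder, chained into a multi-bit ripple-carry adder in the same way the paper couples adder units in the plane. Only the logic is shown; the excitable-media dynamics that realize it chemically are of course not modeled.

```python
# Reference logic for the chemical device: a single-bit full adder, chained
# into a ripple-carry adder by feeding each stage's carry into the next.

def full_adder(a, b, carry_in):
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def ripple_carry_add(bits_a, bits_b):
    """Add two little-endian bit lists of equal length."""
    result, carry = [], 0
    for a, b in zip(bits_a, bits_b):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    result.append(carry)  # final carry-out becomes the top bit
    return result

print(ripple_carry_add([1, 1, 0], [0, 1, 1]))  # 3 + 6 = 9 -> [1, 0, 0, 1]
```

Verifying a simulated chemical adder amounts to checking that wave arrivals at the output channels reproduce exactly this truth table for every input combination.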

  1. Computational Methods to Assess the Production Potential of Bio-Based Chemicals.

    PubMed

    Campodonico, Miguel A; Sukumara, Sumesh; Feist, Adam M; Herrgård, Markus J

    2018-01-01

    Elevated costs and long implementation times of bio-based processes for producing chemicals represent a bottleneck for moving to a bio-based economy. A prospective analysis able to elucidate economically and technically feasible product targets at early research phases is mandatory. Computational tools can be implemented to explore the biological and technical spectrum of feasibility, while constraining the operational space for desired chemicals. In this chapter, two different computational tools for assessing potential for bio-based production of chemicals from different perspectives are described in detail. The first tool is GEM-Path: an algorithm to compute all structurally possible pathways from one target molecule to the host metabolome. The second tool is a framework for Modeling Sustainable Industrial Chemicals production (MuSIC), which integrates modeling approaches for cellular metabolism, bioreactor design, upstream/downstream processes, and economic impact assessment. Integrating GEM-Path and MuSIC will play a vital role in supporting early phases of research efforts and guide the policy makers with decisions, as we progress toward planning a sustainable chemical industry.

  2. EarthCube: A Community-Driven Cyberinfrastructure for the Geosciences

    NASA Astrophysics Data System (ADS)

    Koskela, Rebecca; Ramamurthy, Mohan; Pearlman, Jay; Lehnert, Kerstin; Ahern, Tim; Fredericks, Janet; Goring, Simon; Peckham, Scott; Powers, Lindsay; Kamalabdi, Farzad; Rubin, Ken; Yarmey, Lynn

    2017-04-01

    EarthCube is creating a dynamic, System of Systems (SoS) infrastructure and data tools to collect, access, analyze, share, and visualize all forms of geoscience data and resources, using advanced collaboration, technological, and computational capabilities. EarthCube, as a joint effort between the U.S. National Science Foundation Directorate for Geosciences and the Division of Advanced Cyberinfrastructure, is a quickly growing community of scientists across all geoscience domains, as well as geoinformatics researchers and data scientists. EarthCube has attracted an evolving, dynamic virtual community of more than 2,500 contributors, including earth, ocean, polar, planetary, atmospheric, geospace, computer and social scientists, educators, and data and information professionals. During 2017, EarthCube will transition to the implementation phase. The implementation will balance "innovation" and "production" to advance cross-disciplinary science goals as well as the development of future data scientists. This presentation will describe the current architecture design for the EarthCube cyberinfrastructure and implementation plan.

  3. NASA/ESA CV-990 Spacelab simulation. Appendixes: C, data-handling: Planning and implementation; D, communications; E, mission documentation

    NASA Technical Reports Server (NTRS)

    Reller, J. O., Jr.

    1976-01-01

    Data handling, communications, and documentation aspects of the ASSESS mission are described. Most experiments provided their own data handling equipment, although some used the airborne computer for backup, and one experiment required real-time computations. Communications facilities were set up to simulate those to be provided between Spacelab and the ground, including a downlink TV system. Mission documentation was kept to a minimum and proved sufficient. Examples are given of the basic documents of the mission.

  4. Cardiology office computer use: primer, pointers, pitfalls.

    PubMed

    Shepard, R B; Blum, R I

    1986-10-01

    An office computer is a utility, like an automobile, with direct and hidden benefits and costs, and with the potential for disaster. For the cardiologist or cardiovascular surgeon, the increasing power and decreasing costs of computer hardware and the availability of software make use of an office computer system an increasingly attractive possibility. Management of office business functions is common; handling and scientific analysis of practice medical information are less common. The cardiologist can also access national medical information systems for literature searches and for interactive further education. Selection and testing of programs and the entire computer system before purchase of computer hardware will reduce the chances of disappointment or serious problems. Personnel pretraining and planning for office information flow and medical information security are necessary. Some cardiologists design their own office systems, buy hardware and software as needed, write programs for themselves and carry out the implementation themselves. For most cardiologists, the better course will be to take advantage of the professional experience of expert advisors. This article provides a starting point from which the practicing cardiologist can approach considering, specifying or implementing an office computer system for business functions and for scientific analysis of practice results.

  5. Concepts and algorithms for terminal-area traffic management

    NASA Technical Reports Server (NTRS)

    Erzberger, H.; Chapel, J. D.

    1984-01-01

    The nation's air-traffic-control system is the subject of an extensive modernization program, including the planned introduction of advanced automation techniques. This paper gives an overview of a concept for automating terminal-area traffic management. Four-dimensional (4D) guidance techniques, which play an essential role in the automated system, are reviewed. One technique, intended for on-board computer implementation, is based on application of optimal control theory. The second technique is a simplified approach to 4D guidance intended for ground computer implementation. It generates advisory messages to help the controller maintain scheduled landing times of aircraft not equipped with on-board 4D guidance systems. An operational system for the second technique, recently evaluated in a simulation, is also described.

  6. ICT in the Education of Students with SEN: Perceptions of Stakeholders

    NASA Astrophysics Data System (ADS)

    Ribeiro, Jaime; Moreira, António; Almeida, Ana Margarida

    Portugal is experiencing a technological reform in education. Technological refurbishing of schools and training of students and teachers is a reality on the rise, enhanced by the implementation of the Education Technological Plan, which also aims at computer skills certification, by 2010, of 90% of teachers. In a school that must adjust to all pupils, Special Educational Needs cannot be neglected, and the nature and constitution of a school's computer resources should provide for the support of these students. ICT training is essential if all students are to benefit from its use. In the case of SEN, this training is of paramount importance if ICT is to establish itself as a facilitator for these students. ICT Coordinators are the visible face of ICT implementation in schools; their functions include managing the school's computer facilities and overseeing the ICT training of fellow teachers.

  7. Queuing theory models for computer networks

    NASA Technical Reports Server (NTRS)

    Galant, David C.

    1989-01-01

    A set of simple queuing theory models that can predict the average response of a network of computers to a given traffic load has been implemented using a spreadsheet. Because the models omit fine detail about network traffic rates, traffic patterns, and the hardware used to implement the networks, they make it easy to assess the impact of variations in traffic patterns and intensities, channel capacities, and message protocols. A sample use of the models applied to a realistic problem is included in appendix A. Appendix B provides a glossary of terms used in this paper. The Ames Research Center computer communication network is an evolving network of local area networks (LANs) connected via gateways and high-speed backbone communication channels. Intelligent planning of expansion and improvement requires understanding the behavior of the individual LANs as well as the collection of networks as a whole.
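    The spreadsheet models described above are, in essence, closed-form queuing formulas. As a minimal sketch of the kind of calculation involved (assuming the simplest M/M/1 single-channel model, which the paper's models may refine), the average response time of a channel follows directly from its arrival and service rates:

```python
def mm1_response_time(arrival_rate, service_rate):
    """Average response time (waiting + service) of an M/M/1 queue.

    arrival_rate and service_rate are in messages per second.
    Valid only while the channel is not saturated (utilization < 1);
    response time grows without bound as utilization approaches 1.
    """
    utilization = arrival_rate / service_rate
    if utilization >= 1.0:
        raise ValueError("queue is unstable: utilization >= 1")
    return 1.0 / (service_rate - arrival_rate)

# Example: 80 messages/s offered to a channel that can serve 100 messages/s
# -> 1 / (100 - 80) = 0.05 s average response time
print(mm1_response_time(80, 100))
```

    Varying the inputs reproduces the kind of what-if analysis the paper performs: doubling channel capacity or halving traffic intensity shows up immediately in the predicted response time.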

  8. Documenting historical data and accessing it on the World Wide Web

    Treesearch

    Malchus B. Baker; Daniel P. Huebner; Peter F. Ffolliott

    2000-01-01

    New computer technologies facilitate the storage, retrieval, and summarization of watershed-based data sets on the World Wide Web. These data sets are used by researchers when testing and validating predictive models, managers when planning and implementing watershed management practices, educators when learning about hydrologic processes, and decisionmakers when...

  9. Computer Output Microform (COM) Catalog Requirements for the Virginia Commonwealth University Libraries.

    ERIC Educational Resources Information Center

    White, Robert L.

    Generated as the result of the deliberations of the COM Catalog Advisory Work Group, which depended heavily on a review of the available literature, individual analysis, and group discussion, this report is intended as a general planning document for Virginia Commonwealth university libraries concerning their possible implementation of a COM…

  10. Implementing Wireless Mobile Instructional Labs: Planning Issues and Case Study

    ERIC Educational Resources Information Center

    McKimmy, Paul B.

    2005-01-01

    In April 2002, the Technology Advisory Committee of the University of Hawaii-Manoa College of Education (COE) prioritized the upgrade of existing instructional computer labs. Following several weeks of research and discussion, a decision was made to support wireless and mobile technologies during the upgrade. In June 2002, the first of three…

  11. Computers and Mental Health Care Delivery. A Resource Guide to Federal Information.

    ERIC Educational Resources Information Center

    Levy, Louise

    Prepared for the mental health professional or administrator who is involved in the planning, developing, or implementation of an automated information system in a mental health environment, this guide is limited to the electronic processing and storage of information for management and clinical functions. Management application areas include…

  12. 10 CFR 73.54 - Protection of digital computer and communication systems and networks.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ....90 of this chapter, a cyber security plan that satisfies the requirements of this section for.... Implementation of the licensee's cyber security program must be consistent with the approved schedule. Current... Commission prior to the effective date of this rule must amend their applications to include a cyber security...

  13. 10 CFR 73.54 - Protection of digital computer and communication systems and networks.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ....90 of this chapter, a cyber security plan that satisfies the requirements of this section for.... Implementation of the licensee's cyber security program must be consistent with the approved schedule. Current... Commission prior to the effective date of this rule must amend their applications to include a cyber security...

  14. 10 CFR 73.54 - Protection of digital computer and communication systems and networks.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ....90 of this chapter, a cyber security plan that satisfies the requirements of this section for.... Implementation of the licensee's cyber security program must be consistent with the approved schedule. Current... Commission prior to the effective date of this rule must amend their applications to include a cyber security...

  15. 10 CFR 73.54 - Protection of digital computer and communication systems and networks.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ....90 of this chapter, a cyber security plan that satisfies the requirements of this section for.... Implementation of the licensee's cyber security program must be consistent with the approved schedule. Current... Commission prior to the effective date of this rule must amend their applications to include a cyber security...

  16. 10 CFR 73.54 - Protection of digital computer and communication systems and networks.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ....90 of this chapter, a cyber security plan that satisfies the requirements of this section for.... Implementation of the licensee's cyber security program must be consistent with the approved schedule. Current... Commission prior to the effective date of this rule must amend their applications to include a cyber security...

  17. THE DESIGN OF A MAN-MACHINE COUNSELING SYSTEM. A PROFESSIONAL PAPER.

    ERIC Educational Resources Information Center

    COGSWELL, J.F.; AND OTHERS

    TWO PROJECTS ON THE DESIGN, DEVELOPMENT, IMPLEMENTATION, AND EVALUATION OF A MAN-MACHINE SYSTEM FOR COUNSELING IN THE PALO ALTO AND LOS ANGELES SCHOOL DISTRICTS ARE REPORTED. THE EARLIER PHILCO 2000 COMPUTER PROGRAMS SIMULATED A COUNSELOR'S WORK IN THE EDUCATIONAL PLANNING INTERVIEW BY ACCEPTING INPUTS SUCH AS SCHOOL GRADES, TEST SCORES, AND…

  18. The Cognitive Basis for Sentence Planning Difficulties in Discourse after Traumatic Brain Injury

    ERIC Educational Resources Information Center

    Peach, Richard K.

    2013-01-01

    Purpose: Analyses of language production of individuals with traumatic brain injury (TBI) place increasing emphasis on microlinguistic (i.e., within-sentence) patterns. It is unknown whether the observed problems involve implementation of well-formed sentence frames or represent a fundamental linguistic disturbance in computing sentence structure.…

  19. Beyond grid security

    NASA Astrophysics Data System (ADS)

    Hoeft, B.; Epting, U.; Koenig, T.

    2008-07-01

    While many fields relevant to Grid security are already covered by existing working groups, their remit rarely goes beyond the scope of the Grid infrastructure itself. However, security issues pertaining to the internal set-up of compute centres have at least as much impact on Grid security. Thus, this talk briefly presents the EU ISSeG project (Integrated Site Security for Grids). In contrast to groups such as OSCT (Operational Security Coordination Team) and JSPG (Joint Security Policy Group), the purpose of ISSeG is to provide a holistic approach to security for Grid computer centres, from strategic considerations to an implementation plan and its deployment. The generalised methodology of Integrated Site Security (ISS) is based on the knowledge gained during its implementation at several sites as well as through security audits, and this will be briefly discussed. Several examples of ISS implementation tasks at the Forschungszentrum Karlsruhe will be presented, including segregation of the network for administration and maintenance and the implementation of Application Gateways. Furthermore, the web-based ISSeG training material will be introduced. This aims to offer ISS implementation guidance to other Grid installations in order to help avoid common pitfalls.

  20. Redefining Tactical Operations for MER Using Cloud Computing

    NASA Technical Reports Server (NTRS)

    Joswig, Joseph C.; Shams, Khawaja S.

    2011-01-01

    The Mars Exploration Rover Mission (MER) includes the twin rovers, Spirit and Opportunity, which have been performing geological research and surface exploration since early 2004. The rovers' durability well beyond their original prime mission (90 sols, or Martian days) has allowed them to be a valuable platform for scientific research for well over 2000 sols, but as a by-product it has produced new challenges in providing efficient and cost-effective tactical operational planning. An early-stage process adaptation was the move to distributed operations as mission scientists returned to their places of work in the summer of 2004, but they still came together via teleconference and connected software to plan rover activities a few times a week. This distributed model has worked well since, but it requires the purchase, operation, and maintenance of a dedicated infrastructure at the Jet Propulsion Laboratory. This server infrastructure is costly to operate, and the periodic nature of its usage (typically heavy usage for 8 hours every 2 days) has made moving to a cloud-based tactical infrastructure an extremely tempting proposition. In this paper we review both past and current implementations of the tactical planning application, focusing on remote plan saving, and discuss the unique challenges present with long-latency, distributed operations. We then detail the motivations behind our move to cloud-based computing services, as well as our system design and implementation. We discuss security and reliability concerns and how they were addressed.

  1. Diverter AI based decision aid, phases 1 and 2

    NASA Technical Reports Server (NTRS)

    Sexton, George A.; Bayles, Scott J.; Patterson, Robert W.; Schulke, Duane A.; Williams, Deborah C.

    1989-01-01

    It was determined that a system to incorporate artificial intelligence (AI) into airborne flight management computers is feasible. The AI functions that would be most useful to the pilot are to perform situational assessment, evaluate outside influences on the contemplated rerouting, perform flight planning/replanning, and perform maneuver planning. A study of the software architecture and software tools capable of demonstrating Diverter was also made. A skeletal planner known as the Knowledge Acquisition Development Tool (KADET), which is a combination script-based and rule-based system, was used to implement the system. A prototype system was developed which demonstrates advanced in-flight planning/replanning capabilities.

  2. Renewable Energy Opportunity Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hancock, Ed; Mas, Carl

    1998-11-13

    Presently, the US EPA is constructing a new complex at Research Triangle Park, North Carolina to consolidate its research operations in the Raleigh-Durham area. The National Computer Center (NCC) is currently in the design process and is planned for construction as part of this complex. Implementation of the new technologies can be planned as part of the normal construction process, and full credit for elimination of the conventional technologies can be taken. Several renewable technologies are specified in the current plans for the buildings. The objective of this study is to identify measures that are likely to be both technically and economically feasible.

  3. Roth 401(k): asking the right questions.

    PubMed

    Joyner, James F

    2006-01-01

    Roth 401(k) provisions are a newly available feature of 401(k) plans. Roth 401(k) provisions are after-tax savings that generally are tax-free at the time of distribution. Questions arise for plan sponsors about whether the new feature is beneficial, and to whom, and what needs to be done if the plan sponsor decides to offer this provision to its employees. This article tries to answer some of those common questions, including a simple computational analysis to try to answer the important question of how much an employee-participant genuinely benefits from this savings approach. Some practical issues of implementation are touched on, and some unanswered questions are identified.
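    The "simple computational analysis" the article alludes to reduces, in its most basic form, to comparing after-tax outcomes of the two savings routes. A sketch of that comparison (function names, rates, and the 25-year horizon are illustrative, not taken from the article):

```python
def traditional_401k_after_tax(contribution, growth, tax_retirement):
    # Pre-tax contribution grows untaxed; the full distribution is
    # taxed at the retirement-era rate.
    return contribution * growth * (1 - tax_retirement)

def roth_401k_after_tax(contribution, growth, tax_now):
    # Contribution is taxed up front; qualified distributions are tax-free.
    return contribution * (1 - tax_now) * growth

g = 1.07 ** 25  # assume 7% annual growth over 25 years

# With equal tax rates at contribution and distribution, the two
# routes print the same amount (the tax factor commutes with growth).
print(traditional_401k_after_tax(10_000, g, 0.25))
print(roth_401k_after_tax(10_000, g, 0.25))
```

    The comparison makes the key sensitivity explicit: the Roth route pulls ahead only when the participant's tax rate at retirement exceeds today's rate (and vice versa), which is why the article frames the benefit as a question rather than a given.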

  4. The locus of serial processing in reading aloud: orthography-to-phonology computation or speech planning?

    PubMed

    Mousikou, Petroula; Rastle, Kathleen; Besner, Derek; Coltheart, Max

    2015-07-01

    Dual-route theories of reading posit that a sublexical reading mechanism that operates serially and from left to right is involved in the orthography-to-phonology computation. These theories attribute the masked onset priming effect (MOPE) and the phonological Stroop effect (PSE) to the serial left-to-right operation of this mechanism. However, both effects may arise during speech planning, in the phonological encoding process, which also occurs serially and from left to right. In the present paper, we sought to determine the locus of serial processing in reading aloud by testing the contrasting predictions that the dual-route and speech planning accounts make in relation to the MOPE and the PSE. The results from three experiments that used the MOPE and the PSE paradigms in English are inconsistent with the idea that these effects arise during speech planning, and consistent with the claim that a sublexical serially operating reading mechanism is involved in the print-to-sound translation. Simulations of the empirical data on the MOPE with the dual route cascaded (DRC) and connectionist dual process (CDP++) models, which are computational implementations of the dual-route theory of reading, provide further support for the dual-route account.

  5. GPU-based ultra-fast dose calculation using a finite size pencil beam model.

    PubMed

    Gu, Xuejun; Choi, Dongju; Men, Chunhua; Pan, Hubert; Majumdar, Amitava; Jiang, Steve B

    2009-10-21

    Online adaptive radiation therapy (ART) is an attractive concept that promises the ability to deliver an optimal treatment in response to the inter-fraction variability in patient anatomy. However, it has yet to be realized due to technical limitations. Fast dose deposit coefficient calculation is a critical component of the online planning process that is required for plan optimization of intensity-modulated radiation therapy (IMRT). Computer graphics processing units (GPUs) are well suited to provide the requisite fast performance for the data-parallel nature of dose calculation. In this work, we develop a dose calculation engine based on a finite-size pencil beam (FSPB) algorithm and a GPU parallel computing framework. The developed framework can accommodate any FSPB model. We test our implementation in the case of a water phantom and the case of a prostate cancer patient with varying beamlet and voxel sizes. All testing scenarios achieved speedup ranging from 200 to 400 times when using a NVIDIA Tesla C1060 card in comparison with a 2.27 GHz Intel Xeon CPU. The computational time for calculating dose deposition coefficients for a nine-field prostate IMRT plan with this new framework is less than 1 s. This indicates that the GPU-based FSPB algorithm is well suited for online re-planning for adaptive radiotherapy.
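    The data-parallel structure the authors exploit is easy to see: each voxel's dose is an independent weighted sum of beamlet contributions, dose[i] = sum_j D[i][j] * w[j]. A pure-Python sketch of the serial computation (the coefficient values are made up for illustration; a real FSPB model would fill them from the pencil-beam kernel, and on a GPU each voxel's sum would run in its own thread):

```python
def compute_dose(D, w):
    """Dose per voxel: dose[i] = sum_j D[i][j] * w[j].

    D: dose deposition coefficients, one row per voxel, one column per
       beamlet (dose to the voxel per unit beamlet intensity).
    w: beamlet intensities, e.g. from the IMRT plan optimizer.
    Each voxel's sum is independent of every other voxel's, which is
    why the computation maps naturally onto one GPU thread per voxel.
    """
    return [sum(dij * wj for dij, wj in zip(row, w)) for row in D]

# Toy 3-voxel, 2-beamlet example (coefficients are hypothetical):
D = [[0.5, 0.1],
     [0.2, 0.4],
     [0.0, 0.3]]
w = [10.0, 5.0]
print(compute_dose(D, w))  # -> [5.5, 4.0, 1.5]
```

    The 200-400x speedups reported come precisely from distributing these independent per-voxel sums across thousands of GPU threads rather than iterating over voxels on the CPU.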

  6. Space Station Mission Planning Study (MPS) development study. Volume 3: Software development plan

    NASA Technical Reports Server (NTRS)

    Klus, W. L.

    1987-01-01

    A software development plan is presented for the definition, design, and implementation of the Space Station (SS) Payload Mission Planning System (MPS). This plan is an evolving document and must be updated periodically as the SS design and operations concepts as well as the SS MPS concept evolve. The major segments of this plan are as follows: an overview of the SS MPS and a description of its required capabilities including the computer programs identified as configurable items with an explanation of the place and function of each within the system; an overview of the project plan and a detailed description of each development project activity breaking each into lower level tasks where applicable; identification of the resources required and recommendations for the manner in which they should be utilized including recommended schedules and estimated manpower requirements; and a description of the practices, standards, and techniques recommended for the SS MPS Software (SW) development.

  7. TU-FG-201-03: Automatic Pre-Delivery Verification Using Statistical Analysis of Consistencies in Treatment Plan Parameters by the Treatment Site and Modality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, S; Wu, Y; Chang, X

    Purpose: A novel computer software system, namely APDV (Automatic Pre-Delivery Verification), has been developed for verifying patient treatment plan parameters immediately prior to treatment delivery in order to automatically detect and prevent catastrophic errors. Methods: APDV is designed to continuously monitor new DICOM plan files on the TMS computer at the treatment console. When new plans to be delivered are detected, APDV checks the consistency of plan parameters and high-level plan statistics using underlying rules and statistical properties based on the given treatment site, technique, and modality. These rules were quantitatively derived by retrospectively analyzing all the EBRT treatment plans of the past 8 years at the authors' institution. Therapists and physicists will be notified with a warning message displayed on the TMS computer if any critical errors are detected, and check results, confirmations, and dismissal actions will be saved into a database for further review. Results: APDV was implemented as a stand-alone program using C# to ensure the required real-time performance. Mean values and standard deviations were quantitatively derived for various plan parameters including MLC usage, MU/cGy ratio, beam SSD, beam weighting, and beam gantry angles (only for lateral targets) per treatment site, technique, and modality. 2D rules combining the MU/cGy ratio and averaged SSD values were also derived using joint probabilities of confidence error ellipses. The statistics of these major treatment plan parameters quantitatively evaluate the consistency of any treatment plan, which facilitates the automatic APDV checking procedure. Conclusion: APDV could be useful in detecting and preventing catastrophic errors immediately before treatment delivery. Future plans include automatic patient identity and patient setup checks after daily patient images are acquired by the machine and become available on the TMS computer.
This project is supported by the Agency for Healthcare Research and Quality (AHRQ) under award 1R01HS0222888. The senior author received research grants from ViewRay Inc. and Varian Medical System.
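    A hedged sketch of the kind of 2-D consistency check the abstract describes: test whether a plan's (MU/cGy ratio, mean SSD) pair falls inside a confidence error ellipse derived from historical plans for the same site and modality. The site statistics and the 95% chi-square threshold here are illustrative assumptions, not the authors' clinical values:

```python
def mahalanobis2(x, mean, cov):
    """Squared Mahalanobis distance of 2-D point x from a distribution
    with the given mean and 2x2 covariance matrix."""
    dx, dy = x[0] - mean[0], x[1] - mean[1]
    (a, b), (c, d) = cov
    det = a * d - b * c
    # Inverse of the 2x2 covariance, written out explicitly.
    ia, ib, ic, id_ = d / det, -b / det, -c / det, a / det
    return dx * (ia * dx + ib * dy) + dy * (ic * dx + id_ * dy)

CHI2_95_2DOF = 5.991  # 95% confidence bound, chi-square with 2 dof

def plan_is_consistent(mu_per_cgy, mean_ssd, site_mean, site_cov):
    """True if the plan lies inside the site's 95% confidence ellipse."""
    d2 = mahalanobis2((mu_per_cgy, mean_ssd), site_mean, site_cov)
    return d2 <= CHI2_95_2DOF

# Hypothetical site statistics (illustrative only, not clinical data):
site_mean = (1.1, 92.0)                 # (MU/cGy, mean SSD in cm)
site_cov = ((0.01, 0.0), (0.0, 4.0))
print(plan_is_consistent(1.15, 93.0, site_mean, site_cov))  # typical plan
print(plan_is_consistent(2.0, 80.0, site_mean, site_cov))   # flagged outlier
```

    Using the joint ellipse rather than two independent per-parameter thresholds catches plans whose parameters are individually plausible but jointly inconsistent, which matches the abstract's motivation for 2D rules.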

  8. Planning for robust reserve networks using uncertainty analysis

    USGS Publications Warehouse

    Moilanen, A.; Runge, M.C.; Elith, Jane; Tyre, A.; Carmel, Y.; Fegraus, E.; Wintle, B.A.; Burgman, M.; Ben-Haim, Y.

    2006-01-01

    Planning land-use for biodiversity conservation frequently involves computer-assisted reserve selection algorithms. Typically such algorithms operate on matrices of species presence-absence in sites, or on species-specific distributions of model predicted probabilities of occurrence in grid cells. There are practically always errors in input data: erroneous species presence-absence data, structural and parametric uncertainty in predictive habitat models, and lack of correspondence between temporal presence and long-run persistence. Despite these uncertainties, typical reserve selection methods proceed as if there is no uncertainty in the data or models. Having two conservation options of apparently equal biological value, one would prefer the option whose value is relatively insensitive to errors in planning inputs. In this work we show how uncertainty analysis for reserve planning can be implemented within a framework of information-gap decision theory, generating reserve designs that are robust to uncertainty. Consideration of uncertainty involves modifications to the typical objective functions used in reserve selection. Search for robust-optimal reserve structures can still be implemented via typical reserve selection optimization techniques, including stepwise heuristics, integer-programming and stochastic global search.
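    As a toy illustration of the core idea, preferring options whose value is insensitive to input errors, the sketch below ranks candidate sites by their worst-case value within an uncertainty bound rather than by the nominal model prediction. This is a simplification for illustration, not the authors' information-gap formulation (site names and values are made up):

```python
def robust_greedy_reserve(sites, budget):
    """Greedy site selection under uncertainty.

    sites: {name: (predicted_value, uncertainty_bound)}
    budget: number of sites that can be reserved
    Instead of ranking by the nominal predicted value, rank by the
    worst-case value within each site's uncertainty bound.
    """
    worst = {s: max(v - u, 0.0) for s, (v, u) in sites.items()}
    return sorted(worst, key=worst.get, reverse=True)[:budget]

sites = {
    "A": (0.90, 0.40),  # high predicted value, poorly supported model
    "B": (0.80, 0.10),  # slightly lower value, well supported
    "C": (0.70, 0.05),
    "D": (0.50, 0.30),
}
# A nominal ranking would pick A and B; the robust ranking prefers
# B and C, whose values are insensitive to errors in planning inputs.
print(robust_greedy_reserve(sites, budget=2))  # -> ['B', 'C']
```

    Exactly as the abstract notes, this change is a modification of the objective function only, so it slots into standard selection machinery (here a trivial greedy ranking; in practice stepwise heuristics, integer programming, or stochastic search).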

  9. Human motion planning based on recursive dynamics and optimal control techniques

    NASA Technical Reports Server (NTRS)

    Lo, Janzen; Huang, Gang; Metaxas, Dimitris

    2002-01-01

    This paper presents an efficient optimal control and recursive dynamics-based computer animation system for simulating and controlling the motion of articulated figures. A quasi-Newton nonlinear programming technique (super-linear convergence) is implemented to solve minimum torque-based human motion-planning problems. The explicit analytical gradients needed in the dynamics are derived using a matrix exponential formulation and Lie algebra. Cubic spline functions are used to make the search space for an optimal solution finite. Based on our formulations, our method is well conditioned and robust, in addition to being computationally efficient. To better illustrate the efficiency of our method, we present results of natural looking and physically correct human motions for a variety of human motion tasks involving open and closed loop kinematic chains.

  10. Computational Toxicology as Implemented by the US EPA ...

    EPA Pesticide Factsheets

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency (EPA) is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the T

  11. The Mexican National Programs on Teaching Mathematics and Science with Technology: The Legacy of a Decade of Experiences of Transformation of School Practices and Interactions

    NASA Astrophysics Data System (ADS)

    Sacristán, Ana Isabel; Rojano, Teresa

    Here we give an overview of the Mexican experience of a national program, begun in 1997, of gradual implementation of computational tools in the lower secondary-school classrooms (children 12-15 years old) for mathematics and science. This project illustrates, through the benefit of long-term hindsight, the successes and difficulties of large-scale massive implementation of technologies in schools. The key factors for success and for transforming school practices seem to be: adequate planning, gradual implementation, continuous training and support, and enough time (years) for assimilation and integration.

  12. Interactive Management and Updating of Spatial Data Bases

    NASA Technical Reports Server (NTRS)

    French, P.; Taylor, M.

    1982-01-01

    The decision making process, whether for power plant siting, load forecasting or energy resource planning, invariably involves a blend of analytical methods and judgement. Management decisions can be improved by the implementation of techniques which permit an increased comprehension of results from analytical models. Even where analytical procedures are not required, decisions can be aided by improving the methods used to examine spatially and temporally variant data. How the use of computer aided planning (CAP) programs and the selection of a predominant data structure, can improve the decision making process is discussed.

  13. Use of a hand-held computer observational tool to improve communication for care planning and psychosocial well-being

    PubMed Central

    Corazzini, Kirsten; Rapp, Carla Gene; McConnell, Eleanor S.; Anderson, Ruth A.

    2013-01-01

    Staff development nurses in long-term care are challenged to implement training programs that foster quality unlicensed assistive personnel (UAP) care and improve the transfer of their observations to licensed nursing staff for care planning. This study describes the outcomes of a program where UAP recorded behavioral problems of residents to inform care. Findings suggest staff development nurses who aim to improve UAP reporting without simultaneously targeting licensed nursing staff behaviors may worsen nursing staff relationships. PMID:19182546

  14. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1 : ASC software quality engineering practices version 1.0.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Minana, Molly A.; Sturtevant, Judith E.; Heaphy, Robert

    2005-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in DOE/AL Quality Criteria (QC-1) as conformance to customer requirements and expectations. This quality plan defines the ASC program software quality practices and provides mappings of these practices to the SNL Corporate Process Requirements (CPR 1.3.2 and CPR 1.3.6) and the Department of Energy (DOE) document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines (GP&G). This quality plan identifies ASC management and software project teams' responsibilities for cost-effective software engineering quality practices. The SNL ASC Software Quality Plan establishes the signatories' commitment to improving software products by applying cost-effective software engineering quality practices. This document explains the project teams' opportunities for tailoring and implementing the practices; enumerates the practices that compose the development of SNL ASC's software products; and includes a sample assessment checklist that was developed based upon the practices in this document.

  15. Using Computers in Relation to Learning Climate in CLIL Method

    ERIC Educational Resources Information Center

    Binterová, Helena; Komínková, Olga

    2013-01-01

    The main purpose of the work is to present a successful implementation of CLIL method in Mathematics lessons in elementary schools. Nowadays at all types of schools (elementary schools, high schools and universities) all over the world every school subject tends to be taught in a foreign language. In 2003, a document called Action plan for…

  16. Networking as a Strategic Tool, 1991

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This conference focuses on the technological advances, pitfalls, requirements, and trends involved in planning and implementing an effective computer network system. The basic theme of the conference is networking as a strategic tool. Tutorials and conference presentations explore the technology and methods involved in this rapidly changing field. Future directions are explored from a global, as well as local, perspective.

  17. 75 FR 2091 - Approval and Promulgation of Implementation Plans and Designation of Areas for Air Quality...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-14

    ... computed 3-year average ozone concentration of 0.085 ppm is the smallest value that is greater than 0.08..., April 30, 1992; 3. ``Contingency Measures for Ozone and Carbon Monoxide (CO) Redesignations... includes contingency measures to remedy future violations of the 1997 8-hour ozone NAAQS. The maintenance...

  18. Northeast Artificial Intelligence Consortium Annual Report. Volume 6. 1988 Building an Intelligent Assistant: The Acquisition, Integration, and Maintenance of Complex Distributed Tasks

    DTIC Science & Technology

    1989-10-01

    ...expertise (artificial intelligence, distributed AI, planning, robotics, computer vision). Implementation: (replace-values-in-constraint... by mechanical partners or advisors that customize the system's response to the idiosyncrasies of the student. This paper describes the initial

  19. Study and Demonstration of Planning and Scheduling Concepts for the Earth Observing System Data and Information System

    NASA Technical Reports Server (NTRS)

    Davis, Randal; Thalman, Nancy

    1993-01-01

    The University of Colorado's Laboratory for Atmospheric and Space Physics (CU/LASP), along with the Goddard Space Flight Center (GSFC) and the Jet Propulsion Laboratory (JPL), designed, implemented, tested, and demonstrated a prototype of the distributed, hierarchical planning and scheduling system contemplated for the Earth Observing System (EOS) project. The planning and scheduling prototype made use of existing systems: CU/LASP's Operations and Science Instrument Support Planning and Scheduling (OASIS-PS) software package; GSFC's Request Oriented Scheduling Engine (ROSE); and JPL's Plan Integrated Timeliner 2 (Plan-It-2). Using these tools, four scheduling nodes were implemented and tied together using a new communications protocol for scheduling applications called the Scheduling Applications Interface Language (SAIL). An extensive and realistic scenario of EOS satellite operations was then developed, and the prototype scheduling system was tested and demonstrated using the scenario. Two demonstrations of the system were given to NASA personnel and EOS core system (ECS) contractor personnel. A comprehensive volume of lessons learned was generated, and a meeting was held with NASA and ECS representatives to review these lessons learned. A paper and presentation on the project's final results were given at the American Institute of Aeronautics and Astronautics Computing in Aerospace 9 conference.

  20. Fault-free behavior of reliable multiprocessor systems: FTMP experiments in AIRLAB

    NASA Technical Reports Server (NTRS)

    Clune, E.; Segall, Z.; Siewiorek, D.

    1985-01-01

    This report describes a set of experiments which were implemented on the Fault-Tolerant Multi-Processor (FTMP) at NASA/Langley's AIRLAB facility. These experiments are part of an effort to formulate and evaluate validation methodologies for fault-tolerant computers. This report deals with the measurement of single parameters (baselines) of a fault-free system. The initial set of baseline experiments led to the following conclusions: (1) The system clock is constant and independent of workload in the tested cases; (2) the instruction execution times are constant; (3) the R4 frame size is 40 ms with some variation; (4) the frame stretching mechanism has some flaws in its implementation that allow the possibility of an infinite stretching of frame duration. Future experiments are planned. Some will broaden the results of these initial experiments. Others will measure the system more dynamically. The implementation of a synthetic workload generation mechanism for FTMP is planned to enhance the experimental environment of the system.

  1. Plan for the design, development, implementation, and operation of the National Water Information System

    USGS Publications Warehouse

    Edwards, M.D.

    1987-01-01

    The Water Resources Division of the U.S. Geological Survey is developing a National Water Information System (NWIS) that will integrate and replace its existing water data and information systems of the National Water Data Storage and Retrieval System, National Water Data Exchange, National Water-Use Information, and Water Resources Scientific Information Center programs. It will be a distributed data system operated as part of the Division's Distributed Information System, which is a network of computers linked together through a national telecommunication network known as GEONET. The NWIS is being developed as a series of prototypes that will be integrated as they are completed to allow the development and implementation of the system in a phased manner. It also is being developed in a distributed manner using personnel who work under the coordination of a central NWIS Project Office. Work on the development of the NWIS began in 1983 and it is scheduled for completion in 1990. This document presents an overall plan for the design, development, implementation, and operation of the system. Detailed discussions are presented on each of these phases of the NWIS life cycle. The planning, quality assurance, and configuration management phases of the life cycle also are discussed. The plan is intended to be a working document for use by NWIS management and participants in its design and development and to assist offices of the Division in planning and preparing for installation and operation of the system. (Author's abstract)

  2. Beyond input-output computings: error-driven emergence with parallel non-distributed slime mold computer.

    PubMed

    Aono, Masashi; Gunji, Yukio-Pegio

    2003-10-01

    The emergence derived from errors is of key importance for both novel computing and novel usage of the computer. In this paper, we propose an implementable experimental plan for biological computing that elicits the emergent property of complex systems. An individual plasmodium of the true slime mold Physarum polycephalum acts as the slime mold computer. Modifying the Elementary Cellular Automaton so that it entails the global synchronization problem of parallel computing yields the NP-complete problem solved by the slime mold computer. We discuss the possibility of solving the problem without supplying either all possible results or an explicit prescription for solution-seeking. In slime mold computing, the distributivity of the local computing logic can change dynamically, and its parallel non-distributed computing cannot be reduced to the spatial addition of multiple serial computations. The computing system, based on exhaustive absence of the super-system, may produce something more than filling the vacancy.
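The Elementary Cellular Automaton that the record describes modifying can be sketched in a few lines; the rule number and lattice below are illustrative choices, not values from the paper.

```python
# Hedged sketch: one globally synchronous update of an Elementary
# Cellular Automaton (ECA). The lockstep requirement of this update
# (every cell steps at once) is the global synchronization problem
# the abstract refers to; rule 110 is an arbitrary pick.

def eca_step(cells, rule=110):
    """Apply Wolfram rule `rule` to a 1-D binary lattice with wraparound."""
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        idx = (left << 2) | (center << 1) | right  # neighborhood encoded as 0..7
        out.append((rule >> idx) & 1)              # look up that bit of the rule
    return out

state = [0, 0, 0, 1, 0, 0, 0, 0]
state = eca_step(state)  # one synchronous step
```

On a serial machine the lockstep update is trivial; the interest of the parallel, non-distributed slime mold computer lies in realizing such behavior without a central clock.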

  3. Dual-Arm Generalized Compliant Motion With Shared Control

    NASA Technical Reports Server (NTRS)

    Backes, Paul G.

    1994-01-01

    Dual-Arm Generalized Compliant Motion (DAGCM) primitive computer program implementing improved unified control scheme for two manipulator arms cooperating in task in which both grasp same object. Provides capabilities for autonomous, teleoperation, and shared control of two robot arms. Unifies cooperative dual-arm control with multi-sensor-based task control and makes complete task-control capability available to higher-level task-planning computer system via large set of input parameters used to describe desired force and position trajectories followed by manipulator arms. Some concepts discussed in "A Generalized-Compliant-Motion Primitive" (NPO-18134).

  4. Kernel-based Linux emulation for Plan 9.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Minnich, Ronald G.

    2010-09-01

    CNKemu is a kernel-based system for the 9k variant of the Plan 9 kernel. It is designed to provide transparent binary support for programs compiled for IBM's Compute Node Kernel (CNK) on the Blue Gene series of supercomputers. This support allows users to build applications with the standard Blue Gene toolchain, including C++ and Fortran compilers. While the CNK is not Linux, IBM designed the CNK so that the user interface has much in common with the Linux 2.0 system call interface. The Plan 9 CNK emulator hence provides the foundation of kernel-based Linux system call support on Plan 9. In this paper we discuss CNKemu's implementation and some of its more interesting features, such as the ability to easily intermix Plan 9 and Linux system calls.

  5. Two arm robot path planning in a static environment using polytopes and string stretching. Thesis

    NASA Technical Reports Server (NTRS)

    Schima, Francis J., III

    1990-01-01

    The two arm robot path planning problem has been analyzed and reduced into components to be simplified. This thesis examines one component in which two Puma-560 robot arms are simultaneously holding a single object. The problem is to find a path between two points around obstacles which is relatively fast and minimizes the distance. The thesis involves creating a structure on which to form an advanced path planning algorithm which could ideally find the optimum path. An actual path planning method is implemented which is simple though effective in most common situations. Given the limits of computer technology, a 'good' path is currently found. Objects in the workspace are modeled with polytopes. These are used because they can be used for rapid collision detection and still provide a representation which is adequate for path planning.
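A generic illustration of path shortening of this kind (a greedy shortcutting sketch, not the thesis's string-stretching algorithm; `segment_is_free` is a hypothetical stand-in for the polytope-based collision test):

```python
# Illustrative sketch: "string-pulling" style path shortening. Given a
# collision-free waypoint path and a segment collision test, greedily
# skip to the farthest waypoint reachable in a straight line, removing
# unnecessary detours and so reducing total path length.

def shorten_path(path, segment_is_free):
    """path: list of waypoints; segment_is_free(a, b) -> bool."""
    result = [path[0]]
    i = 0
    while i < len(path) - 1:
        j = len(path) - 1
        while j > i + 1 and not segment_is_free(path[i], path[j]):
            j -= 1                 # fall back to a nearer waypoint
        result.append(path[j])
        i = j
    return result

# With no obstacles, any detour collapses to the two endpoints.
straight = shorten_path([(0, 0), (1, 5), (2, -3), (3, 0)], lambda a, b: True)
```

The rapid collision detection that polytopes afford is exactly what makes a shortcut test like `segment_is_free` cheap enough to call repeatedly.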

  6. Route planning in a four-dimensional environment

    NASA Technical Reports Server (NTRS)

    Slack, M. G.; Miller, D. P.

    1987-01-01

    Robots must be able to function in the real world. The real world involves processes and agents that move independently of the actions of the robot, sometimes in an unpredictable manner. A real-time integrated route planning and spatial representation system for planning routes through dynamic domains is presented. The system will find the safest, most efficient route through space-time as described by a set of user-defined evaluation functions. Because the route planning algorithm is highly parallel and can run on an SIMD machine in O(p) time (p is the length of a path), the system will find real-time paths through unpredictable domains when used in an incremental mode. Spatial representation, an SIMD algorithm for route planning in a dynamic domain, and results from an implementation on a traditional computer architecture are discussed.

  7. 78 FR 28776 - Approval and Promulgation of Implementation Plans; Georgia; State Implementation Plan...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... Section of this Federal Register, EPA is approving the State's implementation plan revision as a direct... Promulgation of Implementation Plans; Georgia; State Implementation Plan Miscellaneous Revisions AGENCY... State Implementation Plan (SIP) submitted by the Georgia Environmental Protection Division to EPA in...

  8. Software engineering and Ada (Trademark) training: An implementation model for NASA

    NASA Technical Reports Server (NTRS)

    Legrand, Sue; Freedman, Glenn

    1988-01-01

    The choice of Ada for software engineering for projects such as the Space Station has resulted in government and industrial groups considering training programs that help workers become familiar with both a software culture and the intricacies of a new computer language. The questions of how much time it takes to learn software engineering with Ada, how much an organization should invest in such training, and how the training should be structured are considered. Software engineering is an emerging, dynamic discipline. It is defined by the author as the establishment and application of sound engineering environments, tools, methods, models, principles, and concepts combined with appropriate standards, guidelines, and practices to support computing which is correct, modifiable, reliable and safe, efficient, and understandable throughout the life cycle of the application. Neither the training programs needed, nor the content of such programs, have been well established. This study addresses the requirements for training for NASA personnel and recommends an implementation plan. A curriculum and a means of delivery are recommended. It is further suggested that a knowledgeable programmer may be able to learn Ada in 5 days, but that it takes 6 to 9 months to evolve into a software engineer who uses the language correctly and effectively. The curriculum and implementation plan can be adapted for each NASA Center according to the needs dictated by each project.

  9. Forward and backward inference in spatial cognition.

    PubMed

    Penny, Will D; Zeidman, Peter; Burgess, Neil

    2013-01-01

    This paper shows that the various computations underlying spatial cognition can be implemented using statistical inference in a single probabilistic model. Inference is implemented using a common set of 'lower-level' computations involving forward and backward inference over time. For example, to estimate where you are in a known environment, forward inference is used to optimally combine location estimates from path integration with those from sensory input. To decide which way to turn to reach a goal, forward inference is used to compute the likelihood of reaching that goal under each option. To work out which environment you are in, forward inference is used to compute the likelihood of sensory observations under the different hypotheses. For reaching sensory goals that require a chaining together of decisions, forward inference can be used to compute a state trajectory that will lead to that goal, and backward inference to refine the route and estimate control signals that produce the required trajectory. We propose that these computations are reflected in recent findings of pattern replay in the mammalian brain. Specifically, that theta sequences reflect decision making, theta flickering reflects model selection, and remote replay reflects route and motor planning. We also propose a mapping of the above computational processes onto lateral and medial entorhinal cortex and hippocampus.
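The "optimally combine" step for location estimates is, in the Gaussian case, a precision-weighted average; a minimal sketch (the numbers are illustrative, not from the paper):

```python
# Sketch of forward inference fusing two noisy location estimates,
# e.g. path integration and sensory input, each modeled as a 1-D
# Gaussian. The product of the two Gaussians gives the Bayes-optimal
# combined estimate: precisions add, means are precision-weighted.

def fuse(mu1, var1, mu2, var2):
    """Combine two Gaussian estimates (mean, variance) -> (mean, variance)."""
    w1, w2 = 1.0 / var1, 1.0 / var2      # precisions
    var = 1.0 / (w1 + w2)                # combined estimate is sharper than either
    mu = var * (w1 * mu1 + w2 * mu2)
    return mu, var

# Path integration says x ~ 2.0 (uncertain); vision says x ~ 4.0 (sharper).
mu, var = fuse(2.0, 4.0, 4.0, 1.0)
```

The fused mean lands closer to the sharper (sensory) estimate, which is the behavior the abstract describes.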

  10. Forward and Backward Inference in Spatial Cognition

    PubMed Central

    Penny, Will D.; Zeidman, Peter; Burgess, Neil

    2013-01-01

    This paper shows that the various computations underlying spatial cognition can be implemented using statistical inference in a single probabilistic model. Inference is implemented using a common set of ‘lower-level’ computations involving forward and backward inference over time. For example, to estimate where you are in a known environment, forward inference is used to optimally combine location estimates from path integration with those from sensory input. To decide which way to turn to reach a goal, forward inference is used to compute the likelihood of reaching that goal under each option. To work out which environment you are in, forward inference is used to compute the likelihood of sensory observations under the different hypotheses. For reaching sensory goals that require a chaining together of decisions, forward inference can be used to compute a state trajectory that will lead to that goal, and backward inference to refine the route and estimate control signals that produce the required trajectory. We propose that these computations are reflected in recent findings of pattern replay in the mammalian brain. Specifically, that theta sequences reflect decision making, theta flickering reflects model selection, and remote replay reflects route and motor planning. We also propose a mapping of the above computational processes onto lateral and medial entorhinal cortex and hippocampus. PMID:24348230

  11. Clinical Implementation of an Online Adaptive Plan-of-the-Day Protocol for Nonrigid Target Motion Management in Locally Advanced Cervical Cancer IMRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heijkoop, Sabrina T., E-mail: s.heijkoop@erasmusmc.nl; Langerak, Thomas R.; Quint, Sandra

    Purpose: To evaluate the clinical implementation of an online adaptive plan-of-the-day protocol for nonrigid target motion management in locally advanced cervical cancer intensity modulated radiation therapy (IMRT). Methods and Materials: Each of the 64 patients had four markers implanted in the vaginal fornix to verify the position of the cervix during treatment. Full and empty bladder computed tomography (CT) scans were acquired prior to treatment to build a bladder volume-dependent cervix-uterus motion model for establishment of the plan library. In the first phase of clinical implementation, the library consisted of one IMRT plan based on a single model-predicted internal target volume (mpITV), covering the target for the whole pretreatment observed bladder volume range, and a 3D conformal radiation therapy (3DCRT) motion-robust backup plan based on the same mpITV. The planning target volume (PTV) combined the ITV and nodal clinical target volume (CTV), expanded with a 1-cm margin. In the second phase, for patients showing >2.5-cm bladder-induced cervix-uterus motion during planning, two IMRT plans were constructed, based on mpITVs for empty-to-half-full and half-full-to-full bladder. In both phases, a daily cone beam CT (CBCT) scan was acquired to first position the patient based on bony anatomy and nodal targets and then select the appropriate plan. Daily post-treatment CBCT was used to verify plan selection. Results: Twenty-four and 40 patients were included in the first and second phase, respectively. In the second phase, 11 patients had two IMRT plans. Overall, an IMRT plan was used in 82.4% of fractions. The main reasons for selecting the motion-robust backup plan were uterus outside the PTV (27.5%) and markers outside their margin (21.3%). In patients with two IMRT plans, the half-full-to-full bladder plan was selected on average in 45% of the first 12 fractions, which was reduced to 35% in the last treatment fractions. Conclusions: The implemented online adaptive plan-of-the-day protocol for locally advanced cervical cancer enables (almost) daily tissue-sparing IMRT.
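The daily plan selection described above can be caricatured as a lookup keyed by bladder volume; everything below (plan names, volume ranges, decision inputs) is invented for illustration and is not taken from the protocol.

```python
# Hypothetical sketch of plan-of-the-day selection: a plan library keyed
# by the bladder-volume range each IMRT plan covers, with a motion-robust
# backup plan used when the daily anatomy does not fit any library plan.

def select_plan(library, backup, bladder_volume_cc, target_inside_ptv):
    """library: list of (low_cc, high_cc, plan_name) entries."""
    if target_inside_ptv:
        for low, high, plan in library:
            if low <= bladder_volume_cc < high:
                return plan
    return backup  # e.g. uterus outside PTV, or volume outside the modeled range

library = [(0, 150, "IMRT_empty_to_half"), (150, 500, "IMRT_half_to_full")]
choice = select_plan(library, "3DCRT_backup", 220, target_inside_ptv=True)
```

The point of the library is that each IMRT plan is tight around one anatomical regime, so picking by the day's CBCT spares more tissue than a single all-covering plan.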

  12. Survey of image-guided radiotherapy use in Australia.

    PubMed

    Batumalai, Vikneswary; Holloway, Lois Charlotte; Kumar, Shivani; Dundas, Kylie; Jameson, Michael Geoffrey; Vinod, Shalini Kavita; Delaney, Geoff P

    2017-06-01

    This study aimed to evaluate the current use of imaging technologies for planning and delivery of radiotherapy (RT) in Australia. An online survey was emailed to all Australian RT centres in August 2015. The survey inquired about imaging practices during planning and treatment delivery processes. Participants were asked about the types of image-guided RT (IGRT) technologies and the disease sites they were used for, reasons for implementation, frequency of imaging and future plans for IGRT use in their department. The survey was completed by 71% of Australian RT centres. All respondents had access to computed tomography (CT) simulators and regularly co-registered the following scans to the RT: diagnostic CT (50%), diagnostic magnetic resonance imaging (MRI) (95%), planning MRI (34%), planning positron emission tomography (PET) (26%) and diagnostic PET (97%) to aid in tumour delineation. The main reason for in-room IGRT implementation was the use of highly conformal techniques, while the most common reason for under-utilisation was lack of equipment capability. The most commonly used IGRT modalities were kilovoltage (kV) cone-beam CT (CBCT) (97%), kV electronic portal image (EPI) (89%) and megavoltage (MV) EPI (75%). Overall, participants planned to increase IGRT use in planning (33%) and treatment delivery (36%). IGRT is widely used among Australian RT centres. On the basis of future plans of respondents, the installation of new imaging modalities is expected to increase for both planning and treatment. © 2016 The Royal Australian and New Zealand College of Radiologists.

  13. Motion Planning in a Society of Intelligent Mobile Agents

    NASA Technical Reports Server (NTRS)

    Esterline, Albert C.; Shafto, Michael (Technical Monitor)

    2002-01-01

    The majority of the work on this grant involved formal modeling of human-computer integration. We conceptualize computer resources as a multiagent system so that these resources and human collaborators may be modeled uniformly. In previous work we had used modal logic for this uniform modeling, and we had developed a process-algebraic agent abstraction. In this work, we applied this abstraction (using CSP) in uniformly modeling agents and users, which allowed us to use tools for investigating CSP models. This work revealed the power of process-algebraic handshakes in modeling face-to-face conversation. We also investigated specifications of human-computer systems in the style of algebraic specification. This involved specifying the common knowledge required for coordination and process-algebraic patterns of communication actions intended to establish the common knowledge. We investigated the conditions for agents endowed with perception to gain common knowledge and implemented a prototype neural-network system that allows agents to detect when such conditions hold. The literature on multiagent systems conceptualizes communication actions as speech acts. We implemented a prototype system that infers the deontic effects (obligations, permissions, prohibitions) of speech acts and detects violations of these effects. A prototype distributed system was developed that allows users to collaborate in moving proxy agents; it was designed to exploit handshakes and common knowledge. Finally, in work carried over from a previous NASA ARC grant, about fifteen undergraduates developed and presented projects on multiagent motion planning.

  14. Virtual reality simulation in neurosurgery: technologies and evolution.

    PubMed

    Chan, Sonny; Conti, François; Salisbury, Kenneth; Blevins, Nikolas H

    2013-01-01

    Neurosurgeons are faced with the challenge of learning, planning, and performing increasingly complex surgical procedures in which there is little room for error. With improvements in computational power and advances in visual and haptic display technologies, virtual surgical environments can now offer potential benefits for surgical training, planning, and rehearsal in a safe, simulated setting. This article introduces the various classes of surgical simulators and their respective purposes through a brief survey of representative simulation systems in the context of neurosurgery. Many technical challenges currently limit the application of virtual surgical environments. Although we cannot yet expect a digital patient to be indistinguishable from reality, new developments in computational methods and related technology bring us closer every day. We recognize that the design and implementation of an immersive virtual reality surgical simulator require expert knowledge from many disciplines. This article highlights a selection of recent developments in research areas related to virtual reality simulation, including anatomic modeling, computer graphics and visualization, haptics, and physics simulation, and discusses their implication for the simulation of neurosurgery.

  15. SU-E-T-785: Using Systems Engineering to Design HDR Skin Treatment Operation for Small Lesions to Enhance Patient Safety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saw, C; Baikadi, M; Peters, C

    2015-06-15

    Purpose: To use systems engineering to design an HDR skin treatment operation for small lesions using shielded applicators, to enhance patient safety. Methods: Systems engineering is an interdisciplinary field that offers formal methodologies to study, design, implement, and manage complex engineering systems as a whole over their life-cycles. The methodologies deal with human work-processes, coordination of different teams, optimization, and risk management. The V-model of systems engineering emphasizes two streams: the specification stream, consisting of user requirements, functional requirements, and design specifications, and the testing stream, covering installation, operational, and performance specifications. In applying systems engineering to this project, the user and functional requirements are that (a) HDR unit parameters be downloaded from the treatment planning system, (b) dwell times and positions be generated by the treatment planning system, (c) source decay be computer calculated, and (d) a double-check system of treatment parameters comply with NRC regulations. These requirements are intended to reduce human intervention and improve patient safety. Results: A formal investigation indicated that the user requirements can be satisfied. The treatment operation consists of using the treatment planning system to generate a pseudo plan that is adjusted for different shielded applicators to compute the dwell times. The dwell positions, channel numbers, and dwell times are verified by the medical physicist and downloaded into the HDR unit. The decayed source strength is transferred to a spreadsheet that computes the dwell times based on the type of applicator and prescribed dose used. Prior to treatment, the source strength, dwell times, dwell positions, and channel numbers are double-checked by the radiation oncologist. No dosimetric parameters are manually calculated. Conclusion: Systems engineering provides methodologies to effectively design an HDR treatment operation that minimizes human intervention and improves patient safety.
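The source-decay calculation named in requirement (c) is standard HDR bookkeeping; a sketch (not the facility's actual spreadsheet) using the Ir-192 half-life:

```python
# Sketch of computer-calculated source decay for an HDR Ir-192 source:
# as the source weakens, the dwell time needed to deliver the same dose
# grows by the inverse of the exponential decay factor.

import math

IR192_HALF_LIFE_DAYS = 73.83  # physical half-life of Ir-192

def decayed_dwell_time(t_ref_s, days_since_calibration):
    """Dwell time today that matches a time planned at calibration strength."""
    decay = math.exp(-math.log(2.0) * days_since_calibration / IR192_HALF_LIFE_DAYS)
    return t_ref_s / decay

t = decayed_dwell_time(100.0, 73.83)  # one half-life later: dwell time doubles
```

Automating this step removes one of the manual calculations that the double-check system would otherwise have to catch.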

  16. GPU-accelerated automatic identification of robust beam setups for proton and carbon-ion radiotherapy

    NASA Astrophysics Data System (ADS)

    Ammazzalorso, F.; Bednarz, T.; Jelen, U.

    2014-03-01

    We demonstrate acceleration on graphics processing units (GPUs) of the automatic identification of robust particle therapy beam setups, minimizing negative dosimetric effects of Bragg peak displacement caused by treatment-time patient positioning errors. Our particle therapy research toolkit, RobuR, was extended with OpenCL support and used to implement GPU calculation of the Port Homogeneity Index, a metric scoring irradiation port robustness through analysis of tissue density patterns prior to dose optimization and computation. Results were benchmarked against an independent native CPU implementation, and the numerical results of the two implementations agreed. For 10 skull base cases, the GPU-accelerated implementation was employed to select beam setups for proton and carbon ion treatment plans, which proved to be dosimetrically robust when recomputed in the presence of various simulated positioning errors. In terms of performance, average running time on the GPU decreased by at least one order of magnitude compared to the CPU, rendering the GPU-accelerated analysis a feasible step in an interactive clinical treatment planning session. In conclusion, selection of robust particle therapy beam setups can be effectively accelerated on a GPU and become an unintrusive part of the particle therapy treatment planning workflow. Additionally, the speed gain opens new usage scenarios, such as interactive analysis manipulation (e.g., constraining a setup) and re-execution. Finally, through OpenCL's portable parallelism, the new implementation is also suitable for CPU-only use, taking advantage of multiple cores, and can potentially exploit accelerators other than GPUs.

  17. Development of a plan for automating integrated circuit processing

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The operations analysis and equipment evaluations pertinent to the design of an automated production facility capable of manufacturing beam-lead CMOS integrated circuits are reported. The overall plan shows approximate cost of major equipment, production rate and performance capability, flexibility, and special maintenance requirements. Direct computer control is compared with supervisory-mode operations. The plan is limited to wafer processing operations from the starting wafer to the finished beam-lead die after separation etching. The work already accomplished in implementing various automation schemes, and the type of equipment which can be found for instant automation are described. The plan is general, so that small shops or large production units can perhaps benefit. Examples of major types of automated processing machines are shown to illustrate the general concepts of automated wafer processing.

  18. A 2D ion chamber array audit of wedged and asymmetric fields in an inhomogeneous lung phantom

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lye, Jessica; Dunn, Leon, E-mail: leon.dunn@arpansa.gov.au; Alves, Andrew

    Purpose: The Australian Clinical Dosimetry Service (ACDS) has implemented a new method of a nonreference-condition Level II type dosimetric audit of radiotherapy services to increase measurement accuracy and patient safety within Australia. The aim of this work is to describe the methodology, tolerances, and outcomes from the new audit. Methods: The ACDS Level II audit measures the dose delivered in 2D planes using an ionization chamber based array positioned at multiple depths. Measurements are made in rectilinear homogeneous and inhomogeneous phantoms composed of slabs of solid water and lung. Computer-generated computed tomography data sets of the rectilinear phantoms are supplied to the facility prior to audit for planning of a range of cases including reference fields, asymmetric fields, and wedged fields. The audit assesses 3D planning with 6 MV photons with a static (zero degree) gantry. Scoring is performed using local dose differences between the planned and measured dose within 80% of the field width. The overall audit result is determined by the maximum dose difference over all scoring points, cases, and planes. Pass (Optimal Level) is defined as maximum dose difference ≤3.3%, Pass (Action Level) is ≤5.0%, and Fail (Out of Tolerance) is >5.0%. Results: At the close of 2013, the ACDS had performed 24 Level II audits. 63% of the audits passed, 33% failed, and the remaining audit was not assessable. Of the 15 audits that passed, 3 were at Pass (Action Level). The high fail rate is largely due to a systemic issue with modeling asymmetric 60° wedges, which caused a delivered overdose of 5%–8%. Conclusions: The ACDS has implemented a nonreference-condition Level II type audit, based on ion chamber 2D array measurements in an inhomogeneous slab phantom. The powerful diagnostic ability of this audit has allowed the ACDS to rigorously test the treatment planning systems implemented in Australian radiotherapy facilities. Recommendations from audits have led to facilities modifying clinical practice and changing planning protocols.
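The scoring rule and tolerances stated above can be expressed directly; this sketch uses hypothetical planned and measured doses, with the stated 3.3% and 5.0% thresholds.

```python
# Sketch of the audit scoring: local (point-by-point) percent dose
# differences, with the overall result set by the worst point and
# classified against the stated ACDS tolerances.

def audit_result(planned, measured):
    """planned, measured: doses at the scored points (same units, nonzero)."""
    diffs = [abs(m - p) / p * 100.0 for p, m in zip(planned, measured)]
    worst = max(diffs)                     # maximum local dose difference, %
    if worst <= 3.3:
        return worst, "Pass (Optimal Level)"
    if worst <= 5.0:
        return worst, "Pass (Action Level)"
    return worst, "Fail (Out of Tolerance)"

# Hypothetical example: three scored points, worst difference 4%.
worst, outcome = audit_result([2.00, 1.50, 1.00], [2.04, 1.52, 1.04])
```

Scoring the local (rather than global) difference is what makes the audit sensitive to errors in low-dose regions such as under the wedge heel.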

  19. A unified approach for composite cost reporting and prediction in the ACT program

    NASA Technical Reports Server (NTRS)

    Freeman, W. Tom; Vosteen, Louis F.; Siddiqi, Shahid

    1991-01-01

    The Structures Technology Program Office (STPO) at NASA Langley Research Center has held two workshops with representatives from the commercial airframe companies to establish a plan for development of a standard cost reporting format and a cost prediction tool for conceptual and preliminary designers. This paper reviews the findings of the workshop representatives with a plan for implementation of their recommendations. The recommendations of the cost tracking and reporting committee will be implemented by reinstituting the collection of composite part fabrication data in a format similar to the DoD/NASA Structural Composites Fabrication Guide. The process of data collection will be automated by taking advantage of current technology with user-friendly computer interfaces and electronic data transmission. Development of a conceptual and preliminary designers' cost prediction model will be initiated. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design (CAD) programs is assessed.

  20. Reconfigurable Software for Controlling Formation Flying

    NASA Technical Reports Server (NTRS)

    Mueller, Joseph B.

    2006-01-01

    Software for a system to control the trajectories of multiple spacecraft flying in formation is being developed to reflect underlying concepts of (1) a decentralized approach to guidance and control and (2) reconfigurability of the control system, including reconfigurability of the software and of control laws. The software is organized as a modular network of software tasks. The computational load for both determining relative trajectories and planning maneuvers is shared equally among all spacecraft in a cluster. The flexibility and robustness of the software are apparent in the fact that tasks can be added, removed, or replaced during flight. In a computational simulation of a representative formation-flying scenario, it was demonstrated that the following are among the services performed by the software: Uploading of commands from a ground station and distribution of the commands among the spacecraft, Autonomous initiation and reconfiguration of formations, Autonomous formation of teams through negotiations among the spacecraft, Working out details of high-level commands (e.g., shapes and sizes of geometrically complex formations), Implementation of a distributed guidance law providing autonomous optimization and assignment of target states, and Implementation of a decentralized, fuel-optimal, impulsive control law for planning maneuvers.

  1. Reading as Active Sensing: A Computational Model of Gaze Planning in Word Recognition

    PubMed Central

    Ferro, Marcello; Ognibene, Dimitri; Pezzulo, Giovanni; Pirrelli, Vito

    2010-01-01

    We offer a computational model of gaze planning during reading that consists of two main components: a lexical representation network, acquiring lexical representations from input texts (a subset of the Italian CHILDES database), and a gaze planner, designed to recognize written words by mapping strings of characters onto lexical representations. The model implements an active sensing strategy that selects which characters of the input string are to be fixated, depending on the predictions dynamically made by the lexical representation network. We analyze the developmental trajectory of the system in performing the word recognition task as a function of both increasing lexical competence, and correspondingly increasing lexical prediction ability. We conclude by discussing how our approach can be scaled up in the context of an active sensing strategy applied to a robotic setting. PMID:20577589

  2. Reading as active sensing: a computational model of gaze planning in word recognition.

    PubMed

    Ferro, Marcello; Ognibene, Dimitri; Pezzulo, Giovanni; Pirrelli, Vito

    2010-01-01

    We offer a computational model of gaze planning during reading that consists of two main components: a lexical representation network, acquiring lexical representations from input texts (a subset of the Italian CHILDES database), and a gaze planner, designed to recognize written words by mapping strings of characters onto lexical representations. The model implements an active sensing strategy that selects which characters of the input string are to be fixated, depending on the predictions dynamically made by the lexical representation network. We analyze the developmental trajectory of the system in performing the word recognition task as a function of both increasing lexical competence, and correspondingly increasing lexical prediction ability. We conclude by discussing how our approach can be scaled up in the context of an active sensing strategy applied to a robotic setting.

  3. The design of an m-Health monitoring system based on a cloud computing platform

    NASA Astrophysics Data System (ADS)

    Xu, Boyi; Xu, Lida; Cai, Hongming; Jiang, Lihong; Luo, Yang; Gu, Yizhi

    2017-01-01

    Compared to traditional medical services provided within hospitals, m-Health monitoring systems (MHMSs) face more challenges in personalised health data processing. To achieve personalised and high-quality health monitoring by means of new technologies, such as mobile network and cloud computing, in this paper, a framework of an m-Health monitoring system based on a cloud computing platform (Cloud-MHMS) is designed to implement pervasive health monitoring. Furthermore, the modules of the framework, which are Cloud Storage and Multiple Tenants Access Control Layer, Healthcare Data Annotation Layer, and Healthcare Data Analysis Layer, are discussed. In the data storage layer, a multiple tenant access method is designed to protect patient privacy. In the data annotation layer, linked open data are adopted to augment health data interoperability semantically. In the data analysis layer, the process mining algorithm and similarity calculating method are implemented to support personalised treatment plan selection. These three modules cooperate to implement the core functions in the process of health monitoring, which are data storage, data processing, and data analysis. Finally, we study the application of our architecture in the monitoring of antimicrobial drug usage to demonstrate the usability of our method in personal healthcare analysis.

  4. 50 CFR 600.1008 - Implementation plan and implementation regulations.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 50 Wildlife and Fisheries 8 2010-10-01 2010-10-01 false Implementation plan and implementation... Capacity Reduction Framework § 600.1008 Implementation plan and implementation regulations. (a) As soon as... period, a proposed implementation plan and implementation regulations. During the public comment period...

  5. Ontology for assessment studies of human-computer-interaction in surgery.

    PubMed

    Machno, Andrej; Jannin, Pierre; Dameron, Olivier; Korb, Werner; Scheuermann, Gerik; Meixensberger, Jürgen

    2015-02-01

    New technologies improve modern medicine, but may result in unwanted consequences. Some occur due to inadequate human-computer-interactions (HCI). To assess these consequences, an investigation model was developed to facilitate the planning, implementation and documentation of studies for HCI in surgery. The investigation model was formalized in Unified Modeling Language and implemented as an ontology. Four different top-level ontologies were compared: Object-Centered High-level Reference, Basic Formal Ontology, General Formal Ontology (GFO) and Descriptive Ontology for Linguistic and Cognitive Engineering, according to the three major requirements of the investigation model: the domain-specific view, the experimental scenario and the representation of fundamental relations. Furthermore, this article emphasizes the distinction of "information model" and "model of meaning" and shows the advantages of implementing the model in an ontology rather than in a database. The results of the comparison show that GFO fits the defined requirements adequately: the domain-specific view and the fundamental relations can be implemented directly, only the representation of the experimental scenario requires minor extensions. The other candidates require wide-ranging extensions, concerning at least one of the major implementation requirements. Therefore, the GFO was selected to realize an appropriate implementation of the developed investigation model. The ensuing development considered the concrete implementation of further model aspects and entities: sub-domains, space and time, processes, properties, relations and functions. The investigation model and its ontological implementation provide a modular guideline for study planning, implementation and documentation within the area of HCI research in surgery. This guideline helps to navigate through the whole study process in the form of a kind of standard or good clinical practice, based on the involved foundational frameworks. 
    Furthermore, it allows the structured description of the assessment methods applied within a certain surgical domain to be acquired and considered in the design of one's own study, or used to compare different studies. The investigation model and the corresponding ontology can be used further to create new knowledge bases of HCI assessment in surgery. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. Multi Modal Anticipation in Fuzzy Space

    NASA Astrophysics Data System (ADS)

    Asproth, Viveca; Holmberg, Stig C.; Hâkansson, Anita

    2006-06-01

    We are all stakeholders in the geographical space, which makes up our common living and activity space. This means that careful, creative, and anticipatory planning, design, and management of that space will be of paramount importance for our sustained life on earth. Here it is shown that the quality of such planning could be significantly increased with the help of a computer-based modelling and simulation tool. Further, the design and implementation of such a tool ought to be guided by the conceptual integration of some core concepts like anticipation and retardation, multi modal system modelling, fuzzy space modelling, and multi actor interaction.

  7. Design of a modular digital computer system, CDRL no. D001, final design plan

    NASA Technical Reports Server (NTRS)

    Easton, R. A.

    1975-01-01

    The engineering breadboard implementation for the CDRL no. D001 modular digital computer system developed during design of the logic system was documented. This effort followed the architecture study completed and documented previously, and was intended to verify the concepts of a fault tolerant, automatically reconfigurable, modular version of the computer system conceived during the architecture study. The system has a microprogrammed 32 bit word length, general register architecture and an instruction set consisting of a subset of the IBM System 360 instruction set plus additional fault tolerance firmware. The following areas were covered: breadboard packaging, central control element, central processing element, memory, input/output processor, and maintenance/status panel and electronics.

  8. Image guided adaptive external beam radiation therapy for cervix cancer: Evaluation of a clinically implemented plan-of-the-day technique.

    PubMed

    Buschmann, Martin; Majercakova, Katarina; Sturdza, Alina; Smet, Stephanie; Najjari, Dina; Daniel, Michaela; Pötter, Richard; Georg, Dietmar; Seppenwoolde, Yvette

    2017-10-12

    Radiotherapy for cervix cancer is challenging in patients exhibiting large daily changes in the pelvic anatomy; therefore, adaptive treatments (ART) have been proposed. The aim of this study was the clinical implementation and subsequent evaluation of plan-of-the-day (POTD) ART for cervix cancer in supine positioning. The described workflow was based on standard commercial equipment and current quality assurance (QA) methods. A POTD strategy, which employs a VMAT plan library consisting of an empty bladder plan, a full bladder plan, and a motion-robust backup plan, was developed. Daily adaptation was guided by cone beam computed tomography (CBCT) imaging, after which the best plan from the library was selected. Sixteen patients were recruited in a clinical study on ART; for nine, POTD was applied due to their large organ motion derived from two computed tomography (CT) scans with variable bladder filling. All patients were treated to 45 Gy in 25 fractions. Plan selection frequencies over the treatment course were analyzed. Daily doses in the rectum, bladder, and cervix-uterus target (CTV-T) were derived and compared to a simulated non-adapted treatment (non-ART), which employed the robust plan for each fraction. Additionally, the adaptation consistency was determined by repeating the plan selection procedure one month after treatment by a group of experts. ART-specific QA methods are presented. 225 ART fractions with CBCTs were analyzed. The empty bladder plan was delivered in 49% of the fractions in the first treatment week and this number increased to 78% in the fifth week. The daily coverage of the CTV-T was equivalent between ART and the non-ART simulation, while the daily total irradiated volume V42.75Gy (95% of the prescription dose) was reduced by a median of 87 cm³. The median delivered V42.75Gy was 1782 cm³.
    Daily delivered doses (V42.75Gy, V40Gy, V30Gy) to the organs at risk were statistically significantly reduced by ART, with a median difference in daily V42.75Gy in rectum and bladder of 3.2% and 1.1%, respectively. The daily bladder V42.75Gy and V40Gy were decreased by more than 10 percentage points in 30% and 24% of all fractions, respectively, through ART. The agreement between delivered plans and retrospective expert-group plan selections was 84%. A POTD-ART technique for cervix cancer was successfully and safely implemented in the clinic and evaluated. Improved normal tissue sparing compared to a simulated non-ART treatment could be demonstrated. Future developments should focus on commercial automated software solutions to allow for a more widespread adoption and to keep the increased workload manageable. Copyright © 2017. Published by Elsevier GmbH.
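The plan-of-the-day idea (pick the library plan that best matches the daily CBCT anatomy, else fall back to the robust plan) can be sketched as a toy decision rule. This is purely illustrative; the bladder-volume criterion, reference volumes, and tolerance below are assumptions, not the study's actual selection procedure:

```python
def select_plan(cbct_bladder_cm3, empty_ref_cm3, full_ref_cm3,
                tolerance_cm3=100.0):
    """Pick the library plan whose reference bladder volume is closest
    to the daily CBCT estimate; fall back to the motion-robust backup
    plan when neither reference state matches within tolerance."""
    d_empty = abs(cbct_bladder_cm3 - empty_ref_cm3)
    d_full = abs(cbct_bladder_cm3 - full_ref_cm3)
    if min(d_empty, d_full) > tolerance_cm3:
        return "robust backup plan"
    return "empty bladder plan" if d_empty <= d_full else "full bladder plan"
```

In the clinical workflow described above the choice was made by trained staff from the CBCT, not by a single scalar rule.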

  9. Computers in mathematics: teacher-inservice training at a distance

    NASA Astrophysics Data System (ADS)

    Friedman, Edward A.; Jurkat, M. P.

    1993-01-01

    While research and experience show many advantages for incorporation of computer technology into secondary school mathematics instruction, less than 5 percent of the nation's teachers are actively using computers in their classrooms. This is the case even though mathematics teachers in grades 7-12 are often familiar with computer technology and have computers available to them in their schools. The implementation bottleneck is in-service teacher training, and there are few models of effective implementation available for teachers to emulate. Stevens Institute of Technology has been active since 1988 in research and development efforts to incorporate computers into classroom use. We have found that teachers need to see examples of classroom experience with hardware and software, and they need to have assistance as they experiment with applications of software and the development of lesson plans. High-bandwidth technology can greatly facilitate teacher training in this area through transmission of video documentaries, software discussions, teleconferencing, peer interactions, classroom observations, etc. We discuss the experience that Stevens has had with face-to-face teacher training as well as with satellite-based teleconferencing using one-way video and two-way audio. Included are reviews of analyses of this project by researchers from Educational Testing Service, Princeton University, and Bank Street School of Education.

  10. Optimization of equivalent uniform dose using the L-curve criterion.

    PubMed

    Chvetsov, Alexei V; Dempsey, James F; Palta, Jatinder R

    2007-10-07

    Optimization of equivalent uniform dose (EUD) in inverse planning for intensity-modulated radiation therapy (IMRT) prevents variation in radiobiological effect between different radiotherapy treatment plans, which is due to variation in the pattern of dose nonuniformity. For instance, the survival fraction of clonogens would be consistent with the prescription when the optimized EUD is equal to the prescribed EUD. One of the problems in the practical implementation of this approach is that the spatial dose distribution in EUD-based inverse planning would be underdetermined because an unlimited number of nonuniform dose distributions can be computed for a prescribed value of EUD. Together with ill-posedness of the underlying integral equation, this may significantly increase the dose nonuniformity. To optimize EUD and keep dose nonuniformity within reasonable limits, we implemented into an EUD-based objective function an additional criterion which ensures the smoothness of beam intensity functions. This approach is similar to the variational regularization technique which was previously studied for the dose-based least-squares optimization. We show that the variational regularization together with the L-curve criterion for the regularization parameter can significantly reduce dose nonuniformity in EUD-based inverse planning.
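The equivalent uniform dose constrained by such objective functions is commonly computed as the generalized EUD (gEUD), a power-law average over the voxel doses. A minimal sketch (the function name is illustrative, and the volume-effect parameter `a` is an assumed input; the paper's exact formulation may differ):

```python
def gEUD(doses, a):
    """Generalized equivalent uniform dose of a list of voxel doses:
    gEUD = ((1/N) * sum(d_i ** a)) ** (1/a).
    Large positive a emphasizes hot spots (serial organs); negative a
    emphasizes cold spots (targets); a = 1 gives the mean dose."""
    n = len(doses)
    return (sum(d ** a for d in doses) / n) ** (1.0 / a)
```

For a perfectly uniform dose distribution, gEUD equals that dose for any nonzero `a`, which is the sense in which it is "equivalent uniform".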

  11. Experimental study of trajectory planning and control of a high precision robot manipulator

    NASA Technical Reports Server (NTRS)

    Nguyen, Charles C.; Antrazi, Sami S.

    1991-01-01

    Kinematic analysis and trajectory planning are presented for a 6 DOF end-effector whose design was based on the Stewart Platform mechanism. The end-effector was used as a testbed for studying robotic assembly of NASA hardware with passive compliance. Vector analysis was employed to derive a closed-form solution for the end-effector inverse kinematic transformation. A computationally efficient numerical solution was obtained for the end-effector forward kinematic transformation using the Newton-Raphson method. Three trajectory planning schemes, two for fine motion and one for gross motion, were developed for the end-effector. Experiments conducted to evaluate the performance of the trajectory planning schemes showed excellent tracking quality with minimal errors. Current activities focus on implementing the developed trajectory planning schemes on mating and demating space-rated connectors and using the compliant platform to acquire forces/torques applied on the end-effector during the assembly task.

  12. Design of a cooperative problem-solving system for enroute flight planning: An empirical study of its use by airline dispatchers

    NASA Technical Reports Server (NTRS)

    Smith, Philip J.; Mccoy, C. Elaine; Layton, Charles; Orasanu, Judith; Chappel, Sherry; Palmer, EV; Corker, Kevin

    1993-01-01

    In a previous report, an empirical study of 30 pilots using the Flight Planning Testbed was reported. An identical experiment using the Flight Planning Testbed (FPT), except that 27 airline dispatchers were studied, is described. Five general questions were addressed in this study: (1) under what circumstances does the introduction of computer-generated suggestions (flight plans) influence the planning behavior of dispatchers (either in a beneficial or adverse manner); (2) what is the nature of such influences (i.e., how are the person's cognitive processes changed); (3) how beneficial are the general design concepts underlying FPT (use of a graphical interface, embedding graphics in a spreadsheet, etc.); (4) how effective are the specific implementation decisions made in realizing these general design concepts; and (5) how effectively do dispatchers evaluate situations requiring replanning, and how effectively do they identify appropriate solutions to these situations.

  13. Clinical implementation of target tracking by breathing synchronized delivery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tewatia, Dinesh; Zhang Tiezhi; Tome, Wolfgang

    2006-11-15

    Target-tracking techniques can be categorized based on the mechanism of the feedback loop. In real time tracking, breathing-delivery phase correlation is provided to the treatment delivery hardware. Clinical implementation of target tracking in real time requires major hardware modifications. In breathing synchronized delivery (BSD), the patient is guided to breathe in accordance with target motion derived from four-dimensional computed tomography (4D-CT). Violations of mechanical limitations of hardware are to be avoided at the treatment planning stage. Hardware modifications are not required. In this article, using sliding window IMRT delivery as an example, we have described step-by-step the implementation of target tracking by the BSD technique: (1) A breathing guide is developed from the patient's normal breathing pattern. The patient tries to reproduce this guiding cycle by following the display in the goggles; (2) 4D-CT scans are acquired at all the phases of the breathing cycle; (3) The average tumor trajectory is obtained by deformable image registration of 4D-CT datasets and is smoothed by Fourier filtering; (4) Conventional IMRT planning is performed using the images at the reference phase (full exhalation phase) and a leaf sequence based on the optimized fluence map is generated; (5) Assuming the patient breathes with a reproducible breathing pattern and the machine maintains a constant dose rate, the treatment process is correlated with the breathing phase; (6) The instantaneous average tumor displacement is overlaid on the dMLC position at the corresponding phase; and (7) dMLC leaf speed and acceleration are evaluated to ensure treatment delivery. A custom-built mobile phantom driven by a computer-controlled stepper motor was used in the dosimetry verification. A stepper motor was programmed such that the phantom moved according to the linear component of tumor motion used in BSD treatment planning.
    A conventional plan was delivered on the phantom with and without motion. The BSD plan was also delivered on the phantom, which moved with the prescheduled pattern, synchronized with the delivery of each beam. Film dosimetry showed underdose and overdose in the superior and inferior regions of the target, respectively, if the tumor motion is not compensated during the delivery. BSD delivery resulted in a dose distribution very similar to the planned treatments.
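Step (3) above, smoothing the registered tumor trajectory by Fourier filtering, can be sketched as a truncated-harmonic low-pass filter. This is a minimal illustration assuming a uniformly sampled trajectory over one breathing cycle; the harmonic cutoff is an assumed parameter, not a value from the article:

```python
import numpy as np

def fourier_smooth(trajectory, n_harmonics=3):
    """Low-pass filter a periodic tumor trajectory by keeping only the
    DC term and the first n_harmonics Fourier harmonics."""
    coeffs = np.fft.rfft(trajectory)
    coeffs[n_harmonics + 1:] = 0.0  # discard high-frequency content
    return np.fft.irfft(coeffs, n=len(trajectory))
```

Breathing motion is dominated by a few low harmonics of the breathing frequency, so a small cutoff removes registration noise while preserving the average cycle shape.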

  14. Nurses' attitudes towards computers: cross sectional questionnaire study.

    PubMed

    Brumini, Gordan; Kovic, Ivor; Zombori, Dejvid; Lulic, Ileana; Petrovecki, Mladen

    2005-02-01

    To estimate the attitudes of hospital nurses towards computers and the influence of gender, age, education, and computer usage on these attitudes. The study was conducted in two Croatian hospitals where an integrated hospital information system is being implemented. There were 1,081 nurses surveyed by an anonymous questionnaire consisting of 8 questions about demographic data, education, and computer usage, and 30 statements on attitudes towards computers. The statements were adapted to a Likert-type scale. Differences in attitudes towards computers were compared using one-way ANOVA and the Tukey-b post-hoc test. The total score was 120+/-15 (mean+/-standard deviation) out of a maximal 150. Nurses younger than 30 years had a higher total score than those older than 30 years (124+/-13 vs 119+/-16 for the 30-39 age group and 117+/-15 for the >39 age group, P<0.001). Nurses with a bachelor's degree (119+/-16 vs 122+/-14, P=0.002) and nurses who had attended computer science courses had a higher total score compared to the others (124+/-13 vs 118+/-16, P<0.001). Nurses using computers more than 5 hours per week had a higher total score than those who used computers less than 5 hours (127+/-13 vs 124+/-12 for 1-5 h and 119+/-14 for <1 hour per day, P<0.001, post-hoc test). Nurses in general have positive attitudes towards computers. These results are important for planning and implementing an integrated hospital information system.

  15. Automated radiotherapy treatment plan integrity verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang Deshan; Moore, Kevin L.

    2012-03-15

    Purpose: In our clinic, physicists spend from 15 to 60 min to verify the physical and dosimetric integrity of radiotherapy plans before presentation to radiation oncology physicians for approval. The purpose of this study was to design and implement a framework to automate as many elements of this quality control (QC) step as possible. Methods: A comprehensive computer application was developed to carry out a majority of these verification tasks in the Philips PINNACLE treatment planning system (TPS). This QC tool functions based on both PINNACLE scripting elements and PERL subroutines. The core of this technique is the method of dynamic scripting, which involves a PERL programming module that is flexible and powerful for treatment plan data handling. Run-time plan data are collected, saved into temporary files, and analyzed against standard values and predefined logical rules. The results were summarized in a hypertext markup language (HTML) report that is displayed to the user. Results: This tool has been in clinical use for over a year. The occurrence frequency of technical problems, which would cause delays and suboptimal plans, has been reduced since clinical implementation. Conclusions: In addition to drastically reducing the set of human-driven logical comparisons, this QC tool also accomplished some tasks that are otherwise either quite laborious or impractical for humans to verify, e.g., identifying conflicts amongst IMRT optimization objectives.
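The check-rules-then-report pattern described above (run-time plan data evaluated against predefined logical rules, summarized as HTML) can be sketched as follows. The actual tool is built on PINNACLE scripting and PERL; this Python sketch is only illustrative, and the rule names and thresholds are hypothetical:

```python
def run_qc(plan, rules):
    """Evaluate each predefined logical rule against run-time plan
    data; rules are (name, check, failure_message) triples."""
    results = []
    for name, check, message in rules:
        ok = check(plan)
        results.append((name, ok, "" if ok else message))
    return results

def html_report(results):
    """Summarize QC results as a simple HTML table for display."""
    rows = "".join(
        "<tr><td>%s</td><td>%s</td><td>%s</td></tr>"
        % (name, "PASS" if ok else "FAIL", msg)
        for name, ok, msg in results)
    return "<table>%s</table>" % rows
```

A usage sketch: `run_qc({"grid_mm": 2.5}, [("dose grid", lambda p: p["grid_mm"] <= 4.0, "dose grid too coarse")])` returns a single passing row, and `html_report` renders it for the physicist.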

  16. Computational materials science and engineering education: A survey of trends and needs

    NASA Astrophysics Data System (ADS)

    Thornton, K.; Nola, Samanthule; Edwin Garcia, R.; Asta, Mark; Olson, G. B.

    2009-10-01

    Results from a recent reassessment of the state of computational materials science and engineering (CMSE) education are reported. Surveys were distributed to the chairs and heads of materials programs, faculty members engaged in computational research, and employers of materials scientists and engineers, mainly in the United States. The data was compiled to assess current course offerings related to CMSE, the general climate for introducing computational methods in MSE curricula, and the requirements from the employers’ viewpoint. Furthermore, the available educational resources and their utilization by the community are examined. The surveys show a general support for integrating computational content into MSE education. However, they also reflect remaining issues with implementation, as well as a gap between the tools being taught in courses and those that are used by employers. Overall, the results suggest the necessity for a comprehensively developed vision and plans to further the integration of computational methods into MSE curricula.

  17. Verification and Planning Based on Coinductive Logic Programming

    NASA Technical Reports Server (NTRS)

    Bansal, Ajay; Min, Richard; Simon, Luke; Mallya, Ajay; Gupta, Gopal

    2008-01-01

    Coinduction is a powerful technique for reasoning about unfounded sets, unbounded structures, infinite automata, and interactive computations [6]. Where induction corresponds to least fixed point semantics, coinduction corresponds to greatest fixed point semantics. Recently coinduction has been incorporated into logic programming and an elegant operational semantics developed for it [11, 12]. This operational semantics is the greatest fixed point counterpart of SLD resolution (SLD resolution imparts operational semantics to least fixed point based computations) and is termed co-SLD resolution. In co-SLD resolution, a predicate goal p(t) succeeds if it unifies with one of its ancestor calls. In addition, rational infinite terms are allowed as arguments of predicates. Infinite terms are represented as solutions to unification equations and the occurs check is omitted during the unification process. Coinductive Logic Programming (Co-LP) and co-SLD resolution can be used to elegantly perform model checking and planning. A combined SLD and co-SLD resolution based LP system forms the common basis for planning, scheduling, verification, model checking, and constraint solving [9, 4]. This is achieved by amalgamating SLD resolution, co-SLD resolution, and constraint logic programming [13] in a single logic programming system. Given that parallelism in logic programs can be implicitly exploited [8], complex, compute-intensive applications (planning, scheduling, model checking, etc.) can be executed in parallel on multi-core machines. Parallel execution can result in speed-ups as well as in larger instances of the problems being solved. In the remainder we elaborate on (i) how planning can be elegantly and efficiently performed under real-time constraints, (ii) how real-time systems can be elegantly and efficiently model-checked, as well as (iii) how hybrid systems can be verified in a combined system with both co-SLD and SLD resolution.
    Implementations of co-SLD resolution as well as preliminary implementations of the planning and verification applications have been developed [4]. Co-LP and Model Checking: The vast majority of properties that are to be verified can be classified into safety properties and liveness properties. It is well known within model checking that safety properties can be verified by reachability analysis, i.e., if a counter-example to the property exists, it can be finitely determined by enumerating all the reachable states of the Kripke structure.
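The coinductive success rule (a goal succeeds when it unifies with one of its ancestor calls) can be illustrated for the propositional case, where unification reduces to equality. This toy interpreter is my own sketch of the idea, not the system of [4]:

```python
def co_sld(goal, rules, ancestors=()):
    """Propositional co-SLD resolution: a goal succeeds if it matches
    an ancestor call (coinductive hypothesis rule), or if some program
    clause for it has a body whose goals all succeed.
    rules is a list of (head, body) clauses with atoms as strings."""
    if goal in ancestors:          # infinite (cyclic) derivation: succeed
        return True
    for head, body in rules:
        if head == goal and all(
                co_sld(g, rules, ancestors + (goal,)) for g in body):
            return True
    return False
```

Under ordinary SLD resolution the program `p :- q. q :- p.` loops forever on the query `p`; under the greatest fixed point semantics sketched here, `p` succeeds via its cyclic derivation.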

  18. Computer integrated manufacturing/processing in the HPI. [Hydrocarbon Processing Industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoshimura, J.S.

    1993-05-01

    Hydrocarbon Processing and Systemhouse Inc. developed a comprehensive survey on the status of computer integrated manufacturing/processing (CIM/CIP) targeted specifically to the unique requirements of the hydrocarbon processing industry. These types of surveys and other benchmarking techniques can be invaluable in assisting companies to maximize business benefits from technology investments. The survey was organized into 5 major areas: CIM/CIP planning, management perspective, functional applications, integration, and technology infrastructure and trends. The CIM/CIP planning area dealt with the use and type of planning methods to plan, justify, and implement information technology projects. The management perspective section addressed management priorities, expenditure levels, and implementation barriers. The functional applications area covered virtually all functional areas of the organization and focused on the specific solutions and benefits in each of the functional areas. The integration section addressed the needs and integration status of the organization's functional areas. Finally, the technology infrastructure and trends section dealt with specific technologies in use as well as trends over the next three years. In February 1993, summary areas from preliminary results were presented at the 2nd International Conference on Productivity and Quality in the Hydrocarbon Processing Industry.

  19. NAS-current status and future plans

    NASA Technical Reports Server (NTRS)

    Bailey, F. R.

    1987-01-01

    The Numerical Aerodynamic Simulation (NAS) program has met its first major milestone, the NAS Processing System Network (NPSN) Initial Operating Configuration (IOC). The program has met its goal of providing a national supercomputer facility capable of greatly enhancing the Nation's research and development efforts. Furthermore, the program is fulfilling its pathfinder role by defining and implementing a paradigm for supercomputing system environments. The IOC is only the beginning, and the NAS Program will aggressively continue to develop and implement emerging supercomputer, communications, storage, and software technologies to strengthen computations as a critical element in supporting the Nation's leadership role in aeronautics.

  20. CASE tools and UML: state of the ART.

    PubMed

    Agarwal, S

    2001-05-01

    With the increasing need for automated tools to assist complex systems development, software design methods are becoming popular. This article analyzes the state of the art in computer-aided software engineering (CASE) tools and the unified modeling language (UML), focusing on their evolution, merits, and industry usage. It identifies managerial issues for the tools' adoption and recommends an action plan to select and implement them. While CASE and UML offer inherent advantages like cheaper, shorter, and more efficient development cycles, they suffer from poor user satisfaction. The critical success factors for their implementation include, among others, management and staff commitment, proper corporate infrastructure, and user training.

  1. Metaphorical motion in mathematical reasoning: further evidence for pre-motor implementation of structure mapping in abstract domains.

    PubMed

    Fields, Chris

    2013-08-01

    The theory of computation and category theory both employ arrow-based notations that suggest that the basic metaphor "state changes are like motions" plays a fundamental role in all mathematical reasoning involving formal manipulations. If this is correct, structure-mapping inferences implemented by the pre-motor action planning system can be expected to be involved in solving any mathematics problems not solvable by table lookups and number line manipulations alone. Available functional imaging studies of multi-digit arithmetic, algebra, geometry and calculus problem solving are consistent with this expectation.

  2. Signal analysis techniques for incipient failure detection in turbomachinery

    NASA Technical Reports Server (NTRS)

    Coffin, T.

    1985-01-01

    Signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery were developed, implemented and evaluated. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques were implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. Plans for further technique evaluation and data base development to characterize turbopump incipient failure modes from Space Shuttle main engine (SSME) hot firing measurements are outlined.

  3. Man-rated flight software for the F-8 DFBW program

    NASA Technical Reports Server (NTRS)

    Bairnsfather, R. R.

    1976-01-01

    The design, implementation, and verification of the flight control software used in the F-8 DFBW program are discussed. Since the DFBW utilizes an Apollo computer and hardware, the procedures, controls, and basic management techniques employed are based on those developed for the Apollo software system. Program assembly control, simulator configuration control, erasable-memory load generation, change procedures and anomaly reporting are discussed. The primary verification tools are described, as well as the program test plans and their implementation on the various simulators. Failure effects analysis and the creation of special failure generating software for testing purposes are described.

  4. CASPER Version 2.0

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Rabideau, Gregg; Tran, Daniel; Knight, Russell; Chouinard, Caroline; Estlin, Tara; Gaines, Daniel; Clement, Bradley; Barrett, Anthony

    2007-01-01

    CASPER is designed to perform automated planning of interdependent activities within a system subject to requirements, constraints, and limitations on resources. In contradistinction to the traditional concept of batch planning followed by execution, CASPER implements a concept of continuous planning and replanning in response to unanticipated changes (including failures), integrated with execution. Improvements over other, similar software that have been incorporated into CASPER version 2.0 include an enhanced executable interface to facilitate integration with a wide range of execution software systems and supporting software libraries; features to support execution while reasoning about urgency, importance, and impending deadlines; features that enable accommodation to a wide range of computing environments that include various central processing units and random-access-memory capacities; and improved generic time-server and time-control features.

  5. The CP-PACS project

    NASA Astrophysics Data System (ADS)

    Iwasaki, Y.; CP-PACS Collaboration

    1998-01-01

    The CP-PACS project was a five-year plan, formally started in April 1992 and completed in March 1997, to develop a massively parallel computer for carrying out research in computational physics, with primary emphasis on lattice QCD. The initial version of the CP-PACS computer, with a theoretical peak speed of 307 GFLOPS on 1024 processors, was completed in March 1996. The final version, with a peak speed of 614 GFLOPS on 2048 processors, was completed in September 1996 and has been in full operation since October 1996. We describe the architecture, the final specification, the hardware implementation, and the software of the CP-PACS computer. The CP-PACS has been used for hadron spectroscopy production runs since July 1996. Performance figures for lattice QCD applications and the LINPACK benchmark are given.

  6. Introduction to Computers & Introduction to Word Processing: Integrating Content Area Coursework into College Reading/Study Skills Curricula Using Microcomputers.

    ERIC Educational Resources Information Center

    Balajthy, Ernest; And Others

    A study examined the planning, implementation, and evaluation of a curriculum designed to teach 60 college level developmental reading students to use microcomputers (Apple) as learning tools and to improve their content area reading ability. The textbook from a biology course in which all but three of the subjects were enrolled was the source for…

  7. Fly-by-light technology development plan

    NASA Technical Reports Server (NTRS)

    Todd, J. R.; Williams, T.; Goldthorpe, S.; Hay, J.; Brennan, M.; Sherman, B.; Chen, J.; Yount, Larry J.; Hess, Richard F.; Kravetz, J.

    1990-01-01

    The driving factors and developments that make fly-by-light (FBL) aircraft viable are discussed. Documentation, analyses, and recommendations are provided on the major issues pertinent to facilitating the U.S. implementation of commercial FBL aircraft before the turn of the century. Areas of particular concern include ultra-reliable computing (hardware/software); the electromagnetic environment (EME); verification and validation; optical techniques; life-cycle maintenance; and the basis and procedures for certification.

  8. The Development and Implementation of an Integrated Career Education and Placement Program For the Washington State System of Community Colleges.

    ERIC Educational Resources Information Center

    Marble, James E.; And Others

    The community colleges in the state of Washington are committed to a Six Year Plan to provide computing and information systems support to all students. The system is intended to make available a broad range of career placement information to assist decision-making, thereby humanizing education by insuring fewer misguided students, counselors and…

  9. Risk analysis of computer system designs

    NASA Technical Reports Server (NTRS)

    Vallone, A.

    1981-01-01

    Adverse events during implementation can affect the final capabilities, schedule, and cost of a computer system even when the system was accurately designed and evaluated. Risk analysis enables the manager to forecast the impact of those events and to request design revisions or contingency plans in a timely manner, before making any decision. This paper presents a structured procedure for an effective risk analysis. The procedure identifies the required activities, separates subjective assessments from objective evaluations, and defines a risk measure to determine the analysis results. The procedure is consistent with the system design evaluation and enables a meaningful comparison among alternative designs.
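
    The abstract's notion of a risk measure can be illustrated with a small sketch. The event names, probabilities, and weights below are hypothetical, not taken from the paper; the point is only the separation of subjective probability assessments from objective impact evaluations, and the comparison of alternative designs.

```python
# Illustrative risk measure in the spirit of the abstract: each adverse
# event has a subjectively assessed probability and objectively evaluated
# impacts on capability, schedule, and cost. All names, weights, and
# numbers are hypothetical, not from the paper.

def risk_score(events, weights):
    """Expected weighted impact over all adverse events."""
    total = 0.0
    for event in events:
        impact = sum(weights[k] * event["impact"][k] for k in weights)
        total += event["prob"] * impact
    return total

weights = {"capability": 0.5, "schedule": 0.3, "cost": 0.2}

design_a = [
    {"prob": 0.2, "impact": {"capability": 0.4, "schedule": 0.6, "cost": 0.5}},
    {"prob": 0.1, "impact": {"capability": 0.9, "schedule": 0.2, "cost": 0.3}},
]
design_b = [
    {"prob": 0.3, "impact": {"capability": 0.2, "schedule": 0.8, "cost": 0.4}},
]

# The lower score identifies the less risky alternative under this measure.
print(risk_score(design_a, weights), risk_score(design_b, weights))
```

    A single scalar per design makes the comparison among alternatives direct, which is the property the procedure in the abstract relies on.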

  10. Applying a graphical user interface to group technology classification and coding at the Boeing Aerospace Company

    NASA Astrophysics Data System (ADS)

    Ness, P. H.; Jacobson, H.

    1984-10-01

    The thrust of 'group technology' is the exploitation of similarities in component design and manufacturing process plans to achieve assembly-line flow cost efficiencies for small-batch production. The systematic method devised for identifying similarities in component geometry and processing steps is a coding and classification scheme implemented on interactive CAD/CAM systems. Significant increases in computer processing power have made this scheme practical, allowing rapid searches and retrievals on the basis of a 30-digit code together with user-friendly computer graphics.

  11. The method of a joint intraday security check system based on cloud computing

    NASA Astrophysics Data System (ADS)

    Dong, Wei; Feng, Changyou; Zhou, Caiqi; Cai, Zhi; Dan, Xu; Dai, Sai; Zhang, Chuancheng

    2017-01-01

    The intraday security check is the core application in the dispatching control system. The existing security check calculation uses only the dispatch center's local model and data as the functional margin. This paper introduces the design of an all-grid intraday joint security check system based on cloud computing, and its implementation. To reduce the effect of subarea bad data on the all-grid security check, a new power flow algorithm based on comparison and adjustment with the inter-provincial tie-line plan is presented. A numerical example illustrates the effectiveness and feasibility of the proposed method.

  12. Reengineering the Project Design Process

    NASA Technical Reports Server (NTRS)

    Casani, E.; Metzger, R.

    1994-01-01

    In response to NASA's goal of working faster, better and cheaper, JPL has developed extensive plans to minimize cost, maximize customer and employee satisfaction, and implement small- and moderate-size missions. These plans include improved management structures and processes, enhanced technical design processes, the incorporation of new technology, and the development of more economical space- and ground-system designs. The Laboratory's new Flight Projects Implementation Office has been chartered to oversee these innovations and the reengineering of JPL's project design process, including establishment of the Project Design Center and the Flight System Testbed. Reengineering at JPL implies a cultural change whereby the character of its design process will change from sequential to concurrent and from hierarchical to parallel. The Project Design Center will support missions offering high science return, design to cost, demonstrations of new technology, and rapid development. Its computer-supported environment will foster high-fidelity project life-cycle development and cost estimating.

  13. Working Notes from the 1992 AAAI Spring Symposium on Practical Approaches to Scheduling and Planning

    NASA Technical Reports Server (NTRS)

    Drummond, Mark; Fox, Mark; Tate, Austin; Zweben, Monte

    1992-01-01

    The symposium presented issues involved in the development of scheduling systems that can deal with resource and time limitations. To qualify, a system must be implemented and tested to some degree on non-trivial problems (ideally, on real-world problems); however, a system need not be fully deployed to qualify. Systems that schedule actions in terms of metric time constraints typically represent and reason about an external numeric clock or calendar, and can be contrasted with systems that represent time purely symbolically. The following topics are discussed: integrating planning and scheduling; integrating symbolic goals and numerical utilities; managing uncertainty; incremental rescheduling; managing limited computation time; anytime scheduling and planning algorithms and systems; dependency analysis and schedule reuse; management of schedule and plan execution; and incorporation of discrete-event techniques.

  14. 77 FR 43205 - Notice of Data Availability for Approval, Disapproval and Promulgation of Implementation Plans...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-24

    ... Haze State Implementation Plan; Federal Implementation Plan for Regional Haze AGENCY: Environmental... (SIP) revision submitted by the State of Wyoming on January 12, 2011, that addresses regional haze...; Regional Haze State Implementation Plan; Federal Implementation Plan for Regional Haze; Proposed Rule (77...

  15. 78 FR 28775 - Approval and Promulgation of Implementation Plans; North Carolina; State Implementation Plan...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... Promulgation of Implementation Plans; North Carolina; State Implementation Plan Miscellaneous Revisions AGENCY... a revision to the North Carolina State Implementation Plan submitted on February 3, 2010, through... particulate matter found in the Code of Federal Regulations. In the Final Rules Section of this Federal...

  16. Computer-based planning of optimal donor sites for autologous osseous grafts

    NASA Astrophysics Data System (ADS)

    Krol, Zdzislaw; Chlebiej, Michal; Zerfass, Peter; Zeilhofer, Hans-Florian U.; Sader, Robert; Mikolajczak, Pawel; Keeve, Erwin

    2002-05-01

    Bone graft surgery is often necessary for the reconstruction of craniofacial defects after trauma, tumor, infection, or congenital malformation. In this operative technique the removed or missing bone segment is filled with a bone graft. The mainstay of craniofacial reconstruction rests with the replacement of the defective bone by autogenous bone grafts. To achieve sufficient incorporation of the autograft into the host bone, precise planning and simulation of the surgical intervention are required. The major problem is to determine as accurately as possible the donor site from which the graft should be dissected, and to define the shape of the desired transplant. A computer-aided method for semi-automatic selection of optimal donor sites for autografts in craniofacial reconstructive surgery has been developed. The non-automatic step of graft design and constraint setting is followed by a fully automatic procedure to find the best fitting position. In extension to preceding work, a new optimization approach based on the Levenberg-Marquardt method has been implemented and embedded into our computer-based surgical planning system. Once the pre-processing step has been performed, this new technique enables selection of the optimal donor site in less than one minute. The method has been applied during the surgical planning step in more than 20 cases. Postoperative observations have shown that functional results, such as speech and chewing ability as well as restoration of bony continuity, were clearly better compared with conventionally planned operations. Moreover, in most cases the duration of the surgical intervention was distinctly reduced.
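
    As a hedged illustration of the Levenberg-Marquardt optimization named in the abstract, here is a minimal damped Gauss-Newton loop fitting a two-parameter model; the model and data are hypothetical stand-ins for the real graft-to-donor-site fit.

```python
import math

# Minimal Levenberg-Marquardt sketch: a damped Gauss-Newton loop fitting
# y = a * exp(b * x) to noise-free synthetic data. The model and data are
# hypothetical; the paper's optimizer fits a graft shape to a donor site.

def lm_fit(xs, ys, a, b, iters=100, lam=1e-3):
    def sse(a, b):
        return sum((y - a * math.exp(b * x)) ** 2 for x, y in zip(xs, ys))

    err = sse(a, b)
    for _ in range(iters):
        # residuals and model Jacobian at the current parameters
        r = [y - a * math.exp(b * x) for x, y in zip(xs, ys)]
        J = [(math.exp(b * x), a * x * math.exp(b * x)) for x in xs]
        # damped normal equations (J^T J + lam*I) delta = J^T r, a 2x2 system
        g11 = sum(j[0] * j[0] for j in J) + lam
        g12 = sum(j[0] * j[1] for j in J)
        g22 = sum(j[1] * j[1] for j in J) + lam
        h1 = sum(ji[0] * ri for ji, ri in zip(J, r))
        h2 = sum(ji[1] * ri for ji, ri in zip(J, r))
        det = g11 * g22 - g12 * g12
        da = (g22 * h1 - g12 * h2) / det
        db = (g11 * h2 - g12 * h1) / det
        new_err = sse(a + da, b + db)
        if new_err < err:            # accept the step, relax damping
            a, b, err = a + da, b + db, new_err
            lam *= 0.5
        else:                        # reject the step, increase damping
            lam *= 2.0
    return a, b

xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [2.0 * math.exp(0.5 * x) for x in xs]   # true parameters: a=2, b=0.5
a_fit, b_fit = lm_fit(xs, ys, 1.0, 0.1)
print(a_fit, b_fit)   # converges toward (2.0, 0.5)
```

    The adaptive damping term `lam` is what distinguishes Levenberg-Marquardt from plain Gauss-Newton: large damping falls back toward gradient descent when a step overshoots, small damping recovers fast quadratic convergence near the optimum.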

  17. High performance computing for deformable image registration: towards a new paradigm in adaptive radiotherapy.

    PubMed

    Samant, Sanjiv S; Xia, Junyi; Muyan-Ozcelik, Pinar; Owens, John D

    2008-08-01

    The advent of readily available temporal imaging or time series volumetric (4D) imaging has become an indispensable component of treatment planning and adaptive radiotherapy (ART) at many radiotherapy centers. Deformable image registration (DIR) is also used in other areas of medical imaging, including motion-corrected image reconstruction. Due to long computation times, clinical applications of DIR in radiation therapy and elsewhere have been limited and consequently relegated to offline analysis. With recent advances in hardware and software, graphics processing unit (GPU) based computing is an emerging technology for general-purpose computation, including DIR, and is suitable for highly parallelized computing. However, traditional general-purpose computation on the GPU has been limited by the constraints of the available programming platforms. Moreover, compared with the CPU, the GPU currently has less dedicated processor memory, which can limit the useful working data set for parallelized processing. We present an implementation of the demons algorithm using the NVIDIA 8800 GTX GPU and the CUDA programming language. The GPU performance is compared with single-threading and multithreading CPU implementations on an Intel dual-core 2.4 GHz CPU using the C programming language. CUDA provides a C-like programming interface and allows direct access to the highly parallel compute units in the GPU. Comparisons for volumetric clinical lung images acquired using 4DCT were carried out. Computation times for 100 iterations in the range of 1.8-13.5 s were observed for the GPU, with image sizes ranging from 2.0 x 10^6 to 14.2 x 10^6 pixels. The GPU registration was 55-61 times faster than the CPU for the single-threading implementation, and 34-39 times faster for the multithreading implementation. For CPU-based computing, the computational time generally has a linear dependence on image size for medical imaging data. Computational efficiency is characterized in terms of time per megapixel per iteration (TPMI), with units of seconds per megapixel per iteration (spmi). For the demons algorithm, our CPU implementation yielded largely invariant values of TPMI: the mean TPMIs were 0.527 spmi and 0.335 spmi for the single-threading and multithreading cases, respectively, with <2% variation over the considered image data range. For GPU computing, we achieved TPMI = 0.00916 spmi with 3.7% variation, indicating optimized memory handling under CUDA. The paradigm of GPU-based real-time DIR opens up a host of clinical applications for medical imaging.
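
    The TPMI metric defined in the abstract is simple to apply; the sketch below uses the reported mean TPMI values to estimate run times for a hypothetical 8-megapixel, 100-iteration registration.

```python
# The efficiency metric from the abstract: time per megapixel per iteration
# (TPMI, in spmi). The mean TPMI values are those reported; the 8-megapixel,
# 100-iteration run below is a hypothetical example.

def tpmi(seconds, megapixels, iterations):
    """Seconds per megapixel per iteration (spmi)."""
    return seconds / (megapixels * iterations)

def runtime(tpmi_value, megapixels, iterations):
    """Invert the metric to predict wall time."""
    return tpmi_value * megapixels * iterations

gpu_tpmi = 0.00916   # reported mean, CUDA demons implementation
cpu_tpmi = 0.335     # reported mean, multithreaded CPU implementation

t_gpu = runtime(gpu_tpmi, 8.0, 100)   # ~7.3 s
t_cpu = runtime(cpu_tpmi, 8.0, 100)   # ~268 s
print(t_cpu / t_gpu)                  # ~37, inside the reported 34-39x range
```

    Because TPMI is largely invariant with image size here, a single measured constant predicts the run time of any registration in the studied range.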

  18. Analytical modeling and feasibility study of a multi-GPU cloud-based server (MGCS) framework for non-voxel-based dose calculations.

    PubMed

    Neylon, J; Min, Y; Kupelian, P; Low, D A; Santhanam, A

    2017-04-01

    In this paper, a multi-GPU cloud-based server (MGCS) framework is presented for dose calculations, exploring the feasibility of using remote computing power to parallelize and accelerate computationally intensive radiotherapy tasks in moving toward online adaptive therapies. An analytical model was developed to estimate theoretical MGCS performance acceleration and to intelligently determine workload distribution. Numerical studies were performed with a computing setup of 14 GPUs distributed over 4 servers interconnected by a 1 gigabit per second (Gbps) network. Inter-process communication methods were optimized to facilitate resource distribution and minimize data transfers over the server interconnect. Analytically predicted computation times matched experimental observations within 1-5%. MGCS performance approached a theoretical limit of acceleration proportional to the number of GPUs utilized when computational tasks far outweighed memory operations. By distributing the work among several processes and implementing optimization strategies, the MGCS implementation reproduced ground-truth dose computations with negligible differences. The results showed that a cloud-based computation engine is a feasible solution for enabling clinics to make use of fast dose calculations for advanced treatment planning and adaptive radiotherapy. The cloud-based system was able to exceed the performance of a local machine even for optimized calculations, and provided significant acceleration for computationally intensive tasks. Such a framework can give many clinics access to advanced technology and computational methods, providing an avenue for standardization across institutions without the requirements of purchasing, maintaining, and continually updating hardware.
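
    The analytical model itself is not given in the abstract, but a minimal Amdahl-style sketch captures the reported behavior (acceleration approaching the GPU count when compute dominates overhead); the constants are illustrative, not the paper's measurements.

```python
# A hedged sketch of the kind of analytical performance model the abstract
# describes: wall time = per-GPU compute share + fixed overhead for memory
# operations and transfers. The constants are illustrative, not measured.

def mgcs_time(work_s, overhead_s, n_gpus):
    """Predicted wall time when compute divides evenly across n_gpus."""
    return work_s / n_gpus + overhead_s

def speedup(work_s, overhead_s, n_gpus):
    return mgcs_time(work_s, overhead_s, 1) / mgcs_time(work_s, overhead_s, n_gpus)

# Compute-dominated task: acceleration approaches the GPU count (14).
print(speedup(1400.0, 1.0, 14))   # ~13.9
# Overhead-dominated task: extra GPUs help little.
print(speedup(10.0, 20.0, 14))    # ~1.4
```

    A model of this shape is also enough to decide workload distribution: it tells you when adding servers stops paying for the extra interconnect traffic.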

  19. Multiagent Flight Control in Dynamic Environments with Cooperative Coevolutionary Algorithms

    NASA Technical Reports Server (NTRS)

    Knudson, Matthew D.; Colby, Mitchell; Tumer, Kagan

    2014-01-01

    Dynamic flight environments, in which objectives and environmental features change with respect to time, pose a difficult path-planning problem. Path planning methods are typically computationally expensive and are often difficult to run in real time if system objectives change. This computational problem is compounded when multiple agents are present in the system, as the state and action space grows exponentially. In this work, we use cooperative coevolutionary algorithms to develop policies that control agent motion in a dynamic multiagent unmanned aerial system environment in which goals and perceptions change, while ensuring safety constraints are not violated. Rather than replanning new paths when the environment changes, we develop a policy that maps the new environmental features to a trajectory for the agent, ensuring safe and reliable operation while providing 92% of the theoretically optimal performance.

  20. Multiagent Flight Control in Dynamic Environments with Cooperative Coevolutionary Algorithms

    NASA Technical Reports Server (NTRS)

    Colby, Mitchell; Knudson, Matthew D.; Tumer, Kagan

    2014-01-01

    Dynamic environments, in which objectives and environmental features change with respect to time, pose a difficult problem for planning optimal paths. Path planning methods are typically computationally expensive and are often difficult to run in real time if system objectives change. This computational problem is compounded when multiple agents are present, as the state and action space grows exponentially with the number of agents in the system. In this work, we use cooperative coevolutionary algorithms to develop policies that control agent motion in a dynamic multiagent unmanned aerial system environment in which goals and perceptions change, while ensuring safety constraints are not violated. Rather than replanning new paths when the environment changes, we develop a policy that maps the new environmental features to a trajectory for the agent, ensuring safe and reliable operation while providing 92% of the theoretically optimal performance.
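
    A toy sketch of cooperative coevolution, the technique both records describe: two populations each evolve one variable of a joint objective, and individuals are evaluated together with the best collaborator from the other population. The objective and parameters are hypothetical stand-ins for a policy-evaluation rollout.

```python
import random

# Toy cooperative coevolution: each population owns one decision variable,
# and an individual's fitness is its joint score with the best collaborator
# from the other population. The objective is a hypothetical stand-in for
# evaluating a control policy by simulation.

def objective(x, y):
    """Joint cost to minimize; optimum 0 at (3, -1)."""
    return (x - 3.0) ** 2 + (y + 1.0) ** 2

def evolve(pop, collaborator, first, rng):
    """One generation of mutate-and-keep-better for one population."""
    new_pop = []
    for v in pop:
        child = v + rng.gauss(0.0, 0.3)
        pair_new = (child, collaborator) if first else (collaborator, child)
        pair_old = (v, collaborator) if first else (collaborator, v)
        new_pop.append(child if objective(*pair_new) < objective(*pair_old) else v)
    return new_pop

rng = random.Random(42)
xs = [rng.uniform(-5.0, 5.0) for _ in range(10)]
ys = [rng.uniform(-5.0, 5.0) for _ in range(10)]
best_x, best_y = xs[0], ys[0]
for _ in range(200):
    xs = evolve(xs, best_y, True, rng)
    best_x = min(xs, key=lambda x: objective(x, best_y))
    ys = evolve(ys, best_x, False, rng)
    best_y = min(ys, key=lambda y: objective(best_x, y))

print(best_x, best_y)   # converges toward (3, -1)
```

    Decomposing the search this way is what keeps the method tractable as agents are added: each population searches its own variable rather than the exponentially larger joint space.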

  1. System architecture for asynchronous multi-processor robotic control system

    NASA Technical Reports Server (NTRS)

    Steele, Robert D.; Long, Mark; Backes, Paul

    1993-01-01

    The architecture for the Modular Telerobot Task Execution System (MOTES) as implemented in the Supervisory Telerobotics (STELER) Laboratory is described. MOTES is the software component of the remote site of a local-remote telerobotic system which is being developed for NASA for space applications, in particular Space Station Freedom applications. The system is being developed to provide control and supervised autonomous control to support both space based operation and ground-remote control with time delay. The local-remote architecture places task planning responsibilities at the local site and task execution responsibilities at the remote site. This separation allows the remote site to be designed to optimize task execution capability within a limited computational environment such as is expected in flight systems. The local site task planning system could be placed on the ground where few computational limitations are expected. MOTES is written in the Ada programming language for a multiprocessor environment.

  2. Algorithms for the optimization of RBE-weighted dose in particle therapy.

    PubMed

    Horcicka, M; Meyer, C; Buschbacher, A; Durante, M; Krämer, M

    2013-01-21

    We report on various algorithms used for the nonlinear optimization of RBE-weighted dose in particle therapy. For the dose calculation, carbon ions are considered and biological effects are calculated by the Local Effect Model. Taking biological effects fully into account requires iterative methods to solve the optimization problem. We implemented several additional algorithms in GSI's treatment planning system TRiP98, such as the BFGS algorithm and the method of conjugate gradients, in order to investigate their computational performance. We modified textbook iteration procedures to improve convergence speed. The performance of the algorithms is presented as convergence in terms of iterations and computation time. We found that the Fletcher-Reeves variant of the method of conjugate gradients has the best computational performance. With this algorithm we could speed up computation times by a factor of 4 compared to the method of steepest descent, which was used before. With our new methods it is possible to optimize complex treatment plans in a few minutes, leading to good dose distributions. Finally, we discuss future goals concerning dose optimization in particle therapy that might benefit from fast optimization solvers.
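
    A minimal Fletcher-Reeves conjugate-gradient loop, the variant the abstract reports as fastest, can be sketched on a toy quadratic objective; the objective and starting point are hypothetical (TRiP98 optimizes RBE-weighted dose, not this function).

```python
# Minimal Fletcher-Reeves nonlinear conjugate gradient on a toy quadratic,
# with an Armijo backtracking line search and a steepest-descent restart
# safeguard. The objective is illustrative only.

def f(p):
    x, y = p
    return (x - 1.0) ** 2 + 10.0 * (y - 2.0) ** 2

def grad(p):
    x, y = p
    return [2.0 * (x - 1.0), 20.0 * (y - 2.0)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def line_search(p, d, f0, slope):
    """Backtracking (Armijo) search along descent direction d."""
    alpha = 1.0
    while f([p[i] + alpha * d[i] for i in range(2)]) > f0 + 1e-4 * alpha * slope:
        alpha *= 0.5
    return alpha

p = [0.0, 0.0]
g = grad(p)
d = [-gi for gi in g]
for _ in range(100):
    if dot(g, d) >= 0.0:              # safeguard: restart on non-descent direction
        d = [-gi for gi in g]
    alpha = line_search(p, d, f(p), dot(g, d))
    p = [p[i] + alpha * d[i] for i in range(2)]
    g_new = grad(p)
    if dot(g_new, g_new) == 0.0:      # exact optimum reached
        break
    beta = dot(g_new, g_new) / dot(g, g)   # the Fletcher-Reeves formula
    d = [-g_new[i] + beta * d[i] for i in range(2)]
    g = g_new

print(p)   # approaches the minimum at (1, 2)
```

    The only difference from steepest descent is the `beta`-weighted reuse of the previous direction, which is what buys the reported speedup on ill-conditioned problems.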

  3. Algorithms for the optimization of RBE-weighted dose in particle therapy

    NASA Astrophysics Data System (ADS)

    Horcicka, M.; Meyer, C.; Buschbacher, A.; Durante, M.; Krämer, M.

    2013-01-01

    We report on various algorithms used for the nonlinear optimization of RBE-weighted dose in particle therapy. For the dose calculation, carbon ions are considered and biological effects are calculated by the Local Effect Model. Taking biological effects fully into account requires iterative methods to solve the optimization problem. We implemented several additional algorithms in GSI's treatment planning system TRiP98, such as the BFGS algorithm and the method of conjugate gradients, in order to investigate their computational performance. We modified textbook iteration procedures to improve convergence speed. The performance of the algorithms is presented as convergence in terms of iterations and computation time. We found that the Fletcher-Reeves variant of the method of conjugate gradients has the best computational performance. With this algorithm we could speed up computation times by a factor of 4 compared to the method of steepest descent, which was used before. With our new methods it is possible to optimize complex treatment plans in a few minutes, leading to good dose distributions. Finally, we discuss future goals concerning dose optimization in particle therapy that might benefit from fast optimization solvers.

  4. Complex facial deformity reconstruction with a surgical guide incorporating a built-in occlusal stent as the positioning reference.

    PubMed

    Fang, Jing-Jing; Liu, Jia-Kuang; Wu, Tzu-Chieh; Lee, Jing-Wei; Kuo, Tai-Hong

    2013-05-01

    Computer-aided design has gained increasing popularity in clinical practice, and the advent of rapid prototyping technology has further enhanced the quality and predictability of surgical outcomes. It provides target guides for complex bony reconstruction during surgery, so surgeons can efficiently and precisely target fracture restorations. Based on three-dimensional models generated from a computed tomographic scan, precise preoperative planning simulation on a computer is possible. Combining the interdisciplinary knowledge of surgeons and engineers, this study proposes a novel surgical guidance method that incorporates a built-in occlusal wafer serving as the positioning reference. Two patients with complex facial deformity suffering from severe facial asymmetry were recruited. In vitro facial reconstruction was first rehearsed on physical models, where a customized surgical guide incorporating a built-in occlusal stent as the positioning reference was designed to implement the surgical plan. This study presents the authors' preliminary experience with a complex facial reconstruction procedure. It suggests that in less well-resourced settings, where intraoperative computed tomographic scans or navigation systems are not available, this approach can be an effective, expedient, and straightforward aid to enhancing surgical outcomes in complex facial repair.

  5. Approved Air Quality Implementation Plans in Region 10

    EPA Pesticide Factsheets

    Landing page for information about EPA-approved air quality State Implementation Plans (SIPs), Tribal Implementation Plans (TIPs), and Federal Implementation Plans (FIPs) in Alaska, Idaho, Oregon, Washington.

  6. 78 FR 69625 - Approval and Promulgation of Implementation Plans; New York State Ozone Implementation Plan Revision

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-20

    ...] Approval and Promulgation of Implementation Plans; New York State Ozone Implementation Plan Revision AGENCY...) is proposing to approve a revision to the New York State Implementation Plan (SIP) for ozone... air quality standards for ozone. DATES: Comments must be received on or before December 20, 2013...

  7. 77 FR 13974 - Approval and Promulgation of Implementation Plans; New York State Ozone Implementation Plan Revision

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-08

    ... Promulgation of Implementation Plans; New York State Ozone Implementation Plan Revision AGENCY: Environmental... a proposed revision to the New York State Implementation Plan (SIP) for ozone concerning the control... national ambient air quality standards for ozone. DATES: Effective Date: This rule will be effective April...

  8. Primary Care Physicians' Experience with Electronic Medical Records: Barriers to Implementation in a Fee-for-Service Environment

    PubMed Central

    Ludwick, D. A.; Doucette, John

    2009-01-01

    Our aging population has exacerbated strong and divergent trends between health human resource supply and demand. One way to mitigate future inequities is through the adoption of health information technology (HIT). Our previous research showed a number of risks and mitigating factors that affected HIT implementation success. We confirmed these findings through semistructured interviews with nine Alberta clinics. Sociotechnical factors significantly affected physicians' implementation success. Physicians reported that time constraints limited their willingness to investigate, procure, and implement an EMR. The combination of antiquated exam-room design, complex HIT user interfaces, insufficient physician computer skills, and the urgency in patient encounters precipitated by a fee-for-service remuneration model and long waitlists compromised the quantity, if not the quality, of the information exchange. Alternative remuneration and access-to-services plans might be considered to drive prudent behavior during physician office system implementation. PMID:19081787

  9. LBM-EP: Lattice-Boltzmann method for fast cardiac electrophysiology simulation from 3D images.

    PubMed

    Rapaka, S; Mansi, T; Georgescu, B; Pop, M; Wright, G A; Kamen, A; Comaniciu, Dorin

    2012-01-01

    Current treatments of heart rhythm disorders require careful planning and guidance for optimal outcomes. Computational models of cardiac electrophysiology are being proposed for therapy planning, but current approaches are either too simplified or too computationally intensive for patient-specific simulations in clinical practice. This paper presents a novel approach, LBM-EP, to solve any type of mono-domain cardiac electrophysiology model in near real time, tailored especially for patient-specific simulations. The domain is discretized on a Cartesian grid with a level-set representation of the patient's heart geometry, previously estimated from images automatically. The cell model is calculated node-wise, while the transmembrane potential is diffused using the Lattice-Boltzmann method within the domain defined by the level-set. Experiments on synthetic cases, on a data set from CESC'10, and on one patient with myocardium scar showed that LBM-EP provides results comparable to a finite-element (FEM) implementation while being 10-45 times faster. Fast, accurate, scalable, and requiring no specific meshing, LBM-EP paves the way to efficient and detailed models of cardiac electrophysiology for therapy planning.
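
    The collide-and-stream update at the heart of the Lattice-Boltzmann method can be sketched in one dimension; this toy D1Q2 diffusion solver is only illustrative (LBM-EP runs a 3D mono-domain model on a level-set heart geometry, which this does not attempt to reproduce).

```python
# Toy D1Q2 Lattice-Boltzmann diffusion: two populations (right- and
# left-moving) relax toward equilibrium (collide), then shift one cell
# (stream). Grid size, relaxation time, and initial pulse are illustrative.

N, TAU = 64, 0.8
rho = [0.0] * N
rho[N // 2] = 1.0                      # initial unit pulse
f1 = [0.5 * r for r in rho]            # right-moving population
f2 = [0.5 * r for r in rho]            # left-moving population

for _ in range(100):
    # collide: relax each population toward equilibrium feq_i = rho / 2
    rho = [a + b for a, b in zip(f1, f2)]
    f1 = [f - (f - 0.5 * r) / TAU for f, r in zip(f1, rho)]
    f2 = [f - (f - 0.5 * r) / TAU for f, r in zip(f2, rho)]
    # stream: shift populations one cell (periodic boundary)
    f1 = [f1[-1]] + f1[:-1]
    f2 = f2[1:] + [f2[0]]

rho = [a + b for a, b in zip(f1, f2)]
print(sum(rho))          # total mass is conserved
print(max(rho))          # the pulse spreads and its peak decays
```

    The update is purely local (each cell touches only its neighbors), which is why the method parallelizes so well and needs no mesh, the two properties the abstract highlights.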

  10. A six-degree-of-freedom passive arm with dynamic constraints (PADyC) for cardiac surgery application: preliminary experiments.

    PubMed

    Schneider, O; Troccaz, J

    2001-01-01

    The purpose of Computer-Assisted Surgery (CAS) is to help physicians and surgeons plan and execute optimal strategies from multimodal image data. The execution of such planned strategies may be assisted by guidance systems. Some of these systems, called synergistic systems, are based on the cooperation of a robotic device with a human operator. We have developed such a synergistic device: PADyC (Passive Arm with Dynamic Constraints). The basic principle of PADyC is to have a manually actuated arm that dynamically constrains the authorized motions of the surgical tool held by the human operator during a planned task. Dynamic constraints are computed from the task definition, and are implemented by a patented mechanical system. In this paper, we first introduce synergistic systems and then focus on modeling and algorithmic issues related to the dynamic constraints. Finally, we describe a 6-degree-of-freedom prototype robot designed for a clinical application (cardiac surgery) and report on preliminary experiments to date. The experimental results are then discussed, and future work is proposed. Copyright 2002 Wiley-Liss, Inc.

  11. VirSSPA- a virtual reality tool for surgical planning workflow.

    PubMed

    Suárez, C; Acha, B; Serrano, C; Parra, C; Gómez, T

    2009-03-01

    A virtual reality tool, called VirSSPA, was developed to optimize the planning of surgical processes. Two segmentation algorithms for Computed Tomography (CT) images were implemented: a region growing procedure for soft tissues and a thresholding algorithm for bones. The algorithms operate semiautomatically, since they only need a seed selected with the mouse by the user on each tissue to be segmented. The novelty of the paper is the adaptation of an enhancement method based on histogram thresholding applied to CT images for surgical planning, which simplifies subsequent segmentation. A substantial improvement of the virtual reality tool VirSSPA was obtained with these algorithms. VirSSPA was used to optimize surgical planning, to decrease the time spent on surgical planning, and to improve operative results. The success rate increases because surgeons can see the exact extent of the patient's ailment. This tool can decrease operating room time, thus resulting in reduced costs. Virtual simulation was effective for optimizing surgical planning, which could, consequently, result in improved outcomes with reduced costs.
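    The two segmentation approaches named above can be sketched in a few lines. This is a generic illustration, not the VirSSPA code; the tolerance, connectivity, and threshold value are assumptions.

```python
from collections import deque
import numpy as np

def region_grow(img, seed, tol):
    """Seeded region growing: flood-fill 4-connected pixels whose
    intensity is within `tol` of the seed value."""
    h, w = img.shape
    seed_val = float(img[seed])
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    q = deque([seed])
    while q:
        y, x = q.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if (0 <= ny < h and 0 <= nx < w and not mask[ny, nx]
                    and abs(float(img[ny, nx]) - seed_val) <= tol):
                mask[ny, nx] = True
                q.append((ny, nx))
    return mask

def segment_bone(ct, threshold=300):
    """Simple intensity thresholding for bone (the threshold is illustrative)."""
    return np.asarray(ct) >= threshold
```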

  12. Foundational Report Series: Advanced Distribution Management Systems for Grid Modernization, Implementation Strategy for a Distribution Management System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Ravindra; Reilly, James T.; Wang, Jianhui

    Electric distribution utilities encounter many challenges to successful deployment of Distribution Management Systems (DMSs). The key challenges are documented in this report, along with suggestions for overcoming them. This report offers a recommended list of activities for implementing a DMS. It takes a strategic approach to implementing DMS from a project management perspective. The project management strategy covers DMS planning, procurement, design, building, testing, installation, commissioning, and system integration issues and solutions. It identifies the risks that are associated with implementation and suggests strategies for utilities to use to mitigate them or avoid them altogether. Attention is given to common barriers to successful DMS implementation. This report begins with an overview of the implementation strategy for a DMS and proceeds to put forward a basic approach for procuring hardware and software for a DMS; designing the interfaces with external corporate computing systems such as EMS, GIS, OMS, and AMI; and implementing a complete solution.

  13. Implementing the President's Vision: JPL and NASA's Exploration Systems Mission Directorate

    NASA Technical Reports Server (NTRS)

    Sander, Michael J.

    2006-01-01

    As part of the NASA team, the Jet Propulsion Laboratory is involved in the Exploration Systems Mission Directorate (ESMD) work to implement the President's Vision for Space Exploration. In this slide presentation, the roles that are assigned to the various NASA centers to implement the vision are reviewed. The plan for JPL is to use the Constellation program to advance the combination of science and Constellation program objectives. JPL's current participation is to contribute systems engineering support; Command, Control, Computing and Information (C3I) architecture; Crew Exploration Vehicle (CEV) Thermal Protection System (TPS) project support and CEV landing assist support; ground support systems support at JSC and KSC; the Exploration Communication and Navigation System (ECANS); and flight prototypes for cabin atmosphere instruments.

  14. Propagation of registration uncertainty during multi-fraction cervical cancer brachytherapy

    NASA Astrophysics Data System (ADS)

    Amir-Khalili, A.; Hamarneh, G.; Zakariaee, R.; Spadinger, I.; Abugharbieh, R.

    2017-10-01

    Multi-fraction cervical cancer brachytherapy is a form of image-guided radiotherapy that heavily relies on 3D imaging during treatment planning, delivery, and quality control. In this context, deformable image registration can increase the accuracy of dosimetric evaluations, provided that one can account for the uncertainties associated with the registration process. To enable such capability, we propose a mathematical framework that first estimates the registration uncertainty and subsequently propagates the effects of the computed uncertainties from the registration stage through to the visualizations, organ segmentations, and dosimetric evaluations. To ensure the practicality of our proposed framework in real world image-guided radiotherapy contexts, we implemented our technique via a computationally efficient and generalizable algorithm that is compatible with existing deformable image registration software. In our clinical context of fractionated cervical cancer brachytherapy, we perform a retrospective analysis on 37 patients and present evidence that our proposed methodology for computing and propagating registration uncertainties may be beneficial during therapy planning and quality control. Specifically, we quantify and visualize the influence of registration uncertainty on dosimetric analysis during the computation of the total accumulated radiation dose on the bladder wall. We further show how registration uncertainty may be leveraged into enhanced visualizations that depict the quality of the registration and highlight potential deviations from the treatment plan prior to the delivery of radiation treatment. Finally, we show that we can improve the transfer of delineated volumetric organ segmentation labels from one fraction to the next by encoding the computed registration uncertainties into the segmentation labels.
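    One simple way to propagate registration uncertainty into a dosimetric quantity is Monte Carlo sampling of the registration error. The sketch below assumes an isotropic Gaussian error model and a 2D dose map; it illustrates the propagation idea only and is not the authors' mathematical framework.

```python
import numpy as np

def propagate_dose_uncertainty(dose_map, point, sigma, n=2000, rng=None):
    """Sample registration errors ~ N(0, sigma^2) around a mapped point
    and report the mean and spread of the dose values hit by the samples."""
    rng = rng or np.random.default_rng(0)
    h, w = dose_map.shape
    pts = np.round(point + rng.normal(0.0, sigma, size=(n, 2))).astype(int)
    pts[:, 0] = np.clip(pts[:, 0], 0, h - 1)   # stay inside the dose grid
    pts[:, 1] = np.clip(pts[:, 1], 0, w - 1)
    samples = dose_map[pts[:, 0], pts[:, 1]]
    return float(samples.mean()), float(samples.std())
```

    A spatially uniform dose map yields zero propagated uncertainty, which is a useful sanity check: registration error only matters where the dose varies.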

  15. Beyond the computer-based patient record: re-engineering with a vision.

    PubMed

    Genn, B; Geukers, L

    1995-01-01

    In order to achieve real benefit from the potential offered by a Computer-Based Patient Record, the capabilities of the technology must be applied along with true re-engineering of healthcare delivery processes. University Hospital recognizes this and is using systems implementation projects as the catalyst for transforming the way we care for our patients. Integration is fundamental to the success of these initiatives, and it must be explicitly planned against an organized systems architecture whose standards are market-driven. University Hospital also recognizes that Community Health Information Networks will offer improved quality of patient care at a reduced overall cost to the system. All of these implementation factors are considered up front as the hospital makes its initial decisions on how to computerize its patient records. This improves our chances for success and will provide a consistent vision to guide the hospital's development of new and better patient care.

  16. Approved Air Quality Implementation Plans in The Virgin Islands

    EPA Pesticide Factsheets

    This site contains information about air quality regulations called State Implementation Plans (SIPs), Federal Implementation Plans (FIPs), and Tribal Implementation Plans (TIPs) approved by EPA within the U.S. Virgin Islands.

  17. Kinematics, controls, and path planning results for a redundant manipulator

    NASA Technical Reports Server (NTRS)

    Gretz, Bruce; Tilley, Scott W.

    1989-01-01

    The inverse kinematics solution, a modal position control algorithm, and path planning results for a 7 degree of freedom manipulator are presented. The redundant arm consists of two links with shoulder and elbow joints and a spherical wrist. The inverse kinematics problem for tip position is solved and the redundant joint is identified. It is also shown that a locus of tip positions exists in which there are kinematic limitations on self-motion. A computationally simple modal position control algorithm has been developed which guarantees a nearly constant closed-loop dynamic response throughout the workspace. If all closed-loop poles are assigned to the same location, the algorithm can be implemented with very little computation. To further reduce the required computation, the modal gains are updated only at discrete time intervals. Criteria are developed for the frequency of these updates. For commanding manipulator movements, a 5th-order spline which minimizes jerk provides a smooth tip-space path. Schemes for deriving a corresponding joint-space trajectory are discussed. Modifying the trajectory to avoid joint torque saturation when a tip payload is added is also considered. Simulation results are presented.
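    The minimum-jerk 5th-order spline mentioned for tip-space paths has a standard closed form, x(s) = x0 + (xf - x0)(10s^3 - 15s^4 + 6s^5) with s = t/T, which gives zero velocity and acceleration at both endpoints. A minimal sketch (not the paper's implementation):

```python
def min_jerk(x0, xf, T, t):
    """Minimum-jerk quintic from x0 to xf over duration T, evaluated at t.
    Velocity and acceleration are zero at both endpoints."""
    s = t / T
    return x0 + (xf - x0) * (10 * s**3 - 15 * s**4 + 6 * s**5)
```

    Applying this per tip coordinate yields the smooth tip-space path; the joint-space trajectory is then derived through the inverse kinematics, as the abstract describes.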

  18. Computer graphics for management: An abstract of capabilities and applications of the EIS system

    NASA Technical Reports Server (NTRS)

    Solem, B. J.

    1975-01-01

    The Executive Information Services (EIS) system, developed as a computer-based, time-sharing tool for making and implementing management decisions and including computer graphics capabilities, is described. The following resources are available through the EIS languages: a centralized corporate/government data base, customized and working data bases, report writing, general computational capability, specialized routines, modeling/programming capability, and graphics. Nearly all EIS graphs can be created by a single, on-line instruction. A large number of options are available, such as selection of graphic form, line control, shading, placement on the page, multiple images on a page, control of scaling and labeling, plotting of cumulative data sets, optional grid lines, and stack charts. The following are examples of areas in which the EIS system may be used: research, estimating services, planning, budgeting, performance measurement, and national computer hook-up negotiations.

  19. DORMAN computer program (study 2.5). Volume 1: Executive summary. [development of data bank for computerized information storage of NASA programs

    NASA Technical Reports Server (NTRS)

    Stricker, L. T.

    1973-01-01

    The DORCA Applications study has been directed at development of a data bank management computer program identified as DORMAN. Because of the size of the DORCA data files and the manipulations required on that data to support analyses with the DORCA program, automated data techniques to replace time-consuming manual input generation are required. The Dynamic Operations Requirements and Cost Analysis (DORCA) program was developed for use by NASA in planning future space programs. Both programs are designed for implementation on the UNIVAC 1108 computing system. The purpose of this Executive Summary Report is to define for the NASA management the basic functions of the DORMAN program and its capabilities.

  20. Feasibility study of an Integrated Program for Aerospace vehicle Design (IPAD). Volume 1B: Concise review

    NASA Technical Reports Server (NTRS)

    Miller, R. E., Jr.; Southall, J. W.; Kawaguchi, A. S.; Redhed, D. D.

    1973-01-01

    Reports on the design process, support of the design process, IPAD System design catalog of IPAD technical program elements, IPAD System development and operation, and IPAD benefits and impact are concisely reviewed. The approach used to define the design is described. Major activities performed during the product development cycle are identified. The computer system requirements necessary to support the design process are given as computational requirements of the host system, technical program elements and system features. The IPAD computer system design is presented as concepts, a functional description and an organizational diagram of its major components. The cost and schedules and a three phase plan for IPAD implementation are presented. The benefits and impact of IPAD technology are discussed.

  1. APGEN Version 5.0

    NASA Technical Reports Server (NTRS)

    Maldague, Pierre; Page, Dennis; Chase, Adam

    2005-01-01

    Activity Plan Generator (APGEN), now at version 5.0, is a computer program that assists in generating an integrated plan of activities for a spacecraft mission that does not oversubscribe spacecraft and ground resources. APGEN generates an interactive display, through which the user can easily create or modify the plan. The display summarizes the plan by means of a time line, whereon each activity is represented by a bar stretched between its beginning and ending times. Activities can be added, deleted, and modified via simple mouse and keyboard actions. The use of resources can be viewed on resource graphs. Resource and activity constraints can be checked. Types of activities, resources, and constraints are defined by simple text files, which the user can modify. In one of two modes of operation, APGEN acts as a planning expert assistant, displaying the plan and identifying problems in the plan. The user is in charge of creating and modifying the plan. In the other mode, APGEN automatically creates a plan that does not oversubscribe resources. The user can then manually modify the plan. APGEN is designed to interact with other software that generates sequences of timed commands for implementing details of planned activities.
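    The resource-constraint check that APGEN performs can be sketched as a sweep-line over activity start/end events; this is an illustrative reimplementation of the idea, not APGEN's own algorithm, and the `(start, end, usage)` tuple representation is an assumption.

```python
def oversubscribed(activities, capacity):
    """Return the times at which interval activities exceed a resource capacity.
    activities: list of (start, end, usage) tuples for one resource."""
    events = []
    for start, end, usage in activities:
        events.append((start, usage))   # resource claimed at start
        events.append((end, -usage))    # resource released at end
    # Sort by time; at equal times, releases (negative deltas) apply first.
    load, violation_times = 0, []
    for t, delta in sorted(events, key=lambda ev: (ev[0], ev[1])):
        load += delta
        if load > capacity:
            violation_times.append(t)
    return violation_times
```

    An empty result means the plan does not oversubscribe the resource, which is the condition APGEN's automatic mode maintains while placing activities.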

  2. Automation for "Direct-to" Clearances in Air-Traffic Control

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz; McNally, David

    2006-01-01

    A method of automation, and a system of computer hardware and software to implement the method, have been invented to assist en-route air-traffic controllers in the issuance of clearances to fly directly to specified waypoints or navigation fixes along straight paths that deviate from previously filed flight plans. Such clearances, called "direct-to" clearances, have been in use since before the invention of this method and system.

  3. Automatic Data Processing Equipment (ADPE) acquisition plan for the medical sciences

    NASA Technical Reports Server (NTRS)

    1979-01-01

    An effective mechanism for meeting the SLSD/MSD data handling/processing requirements for Shuttle is discussed. The ability to meet these requirements depends upon the availability of a general purpose high speed digital computer system. This system is expected to implement those data base management and processing functions required across all SLSD/MSD programs during training, laboratory operations/analysis, simulations, mission operations, and post mission analysis/reporting.

  4. Robust, Optimal Water Infrastructure Planning Under Deep Uncertainty Using Metamodels

    NASA Astrophysics Data System (ADS)

    Maier, H. R.; Beh, E. H. Y.; Zheng, F.; Dandy, G. C.; Kapelan, Z.

    2015-12-01

    Optimal long-term planning plays an important role in many water infrastructure problems. However, this task is complicated by deep uncertainty about future conditions, such as the impact of population dynamics and climate change. One way to deal with this uncertainty is by means of robustness, which aims to ensure that water infrastructure performs adequately under a range of plausible future conditions. However, as robustness calculations require computationally expensive system models to be run for a large number of scenarios, it is generally computationally intractable to include robustness as an objective in the development of optimal long-term infrastructure plans. In order to overcome this shortcoming, an approach is developed that uses metamodels instead of computationally expensive simulation models in robustness calculations. The approach is demonstrated for the optimal sequencing of water supply augmentation options for the southern portion of the water supply for Adelaide, South Australia. A 100-year planning horizon is subdivided into ten equal decision stages for the purpose of sequencing various water supply augmentation options, including desalination, stormwater harvesting and household rainwater tanks. The objectives include the minimization of average present value of supply augmentation costs, the minimization of average present value of greenhouse gas emissions and the maximization of supply robustness. The uncertain variables are rainfall, per capita water consumption and population. Decision variables are the implementation stages of the different water supply augmentation options. Artificial neural networks are used as metamodels to enable all objectives to be calculated in a computationally efficient manner at each of the decision stages. The results illustrate the importance of identifying optimal staged solutions to ensure robustness and sustainability of water supply into an uncertain long-term future.
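    The metamodel idea above, replacing an expensive simulation with a cheap surrogate trained on a few expensive runs, can be sketched as follows. The `expensive_model` function is hypothetical, and a polynomial stands in for the artificial neural networks used in the paper; the robustness measure shown (fraction of scenarios meeting a threshold) is one common choice, not necessarily the authors'.

```python
import numpy as np

# Hypothetical stand-in for a costly water-supply simulation; in practice
# the real model is run only to generate the training data.
def expensive_model(x):
    return np.sin(3 * x) + 0.5 * x

# Fit a cheap surrogate from a handful of expensive runs.
x_train = np.linspace(0.0, 2.0, 25)
metamodel = np.poly1d(np.polyfit(x_train, expensive_model(x_train), deg=9))

def robustness(model, scenarios, threshold=0.0):
    """Fraction of plausible future scenarios meeting a performance threshold."""
    return float(np.mean(model(np.asarray(scenarios)) >= threshold))

# Many scenarios become affordable because only the surrogate is evaluated.
scenarios = np.random.default_rng(42).uniform(0.0, 2.0, 10_000)
```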

  5. 77 FR 48061 - Approval and Promulgation of Air Quality Implementation Plans; Pennsylvania; Regional Haze State...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-13

    ... Promulgation of Air Quality Implementation Plans; Pennsylvania; Regional Haze State Implementation Plan...'s limited approval of Pennsylvania's Regional Haze State Implementation Plan (SIP). DATES: Effective... announcing our limited approval of Pennsylvania's Regional Haze SIP. In this document, we inadvertently...

  6. Iterations of computer- and template assisted mandibular or maxillary reconstruction with free flaps containing the lateral scapular border--Evolution of a biplanar plug-on cutting guide.

    PubMed

    Cornelius, Carl-Peter; Giessler, Goetz Andreas; Wilde, Frank; Metzger, Marc Christian; Mast, Gerson; Probst, Florian Andreas

    2016-03-01

    Computer-assisted planning and intraoperative implementation using templates have become appreciated modalities in craniofacial reconstruction with fibula and DCIA flaps due to saving in operation time, improved accuracy of osteotomies and easy insetting. Up to now, a similar development for flaps from the subscapular vascular system, namely the lateral scapular border and tip, has not been addressed in the literature. A cohort of 12 patients who underwent mandibular (n = 10) or maxillary (n = 2) reconstruction with free flaps containing the lateral scapular border and tip using computer-assisted planning, stereolithography (STL) models and selective laser sintered (SLS) templates for bone contouring and sub-segmentation osteotomies was reviewed focussing on iterations in the design of computer generated tools and templates. The technical evolution migrated from hybrid STL models over SLS templates for cut out as well as sub-segmentation with a uniplanar framework to plug-on tandem template assemblies providing a biplanar access for the in toto cut out from the posterior aspect in succession with contouring into sub-segments from the medial side. The latest design version is the proof of concept that virtual planning of bone flaps from the lateral scapular border can be successfully transferred into surgery by appropriate templates. Copyright © 2015 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  7. ART/Ada design project, phase 1. Task 3 report: Test plan

    NASA Technical Reports Server (NTRS)

    Allen, Bradley P.

    1988-01-01

    The plan is described for the integrated testing and benchmarking of the Phase 1 Ada-based ESBT Design Research Project. The integration testing is divided into two phases: (1) the modules that do not rely on the Ada code generated by the Ada Generator are tested before the Ada Generator is implemented; and (2) all modules are integrated and tested with the Ada code generated by the Ada Generator. Its performance and size, as well as its functionality, are verified in this phase. The target platform is a DEC Ada compiler on VAX minicomputers and VAXstations running the VMS operating system.

  8. Automatic vehicle monitoring systems study. Report of phase O. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A set of planning guidelines is presented to help law enforcement agencies and vehicle fleet operators decide which automatic vehicle monitoring (AVM) system could best meet their performance requirements. Improvements in emergency response times, and the resultant cost benefits obtainable with various operational and planned AVM systems, may be synthesized and simulated by means of special computer programs for model city parameters applicable to small, medium, and large urban areas. Design characteristics of various AVM systems and their implementation requirements are illustrated, and costs are estimated for the vehicles, the fixed sites, and the base equipment. Vehicle location accuracies for different RF links and polling intervals are analyzed.

  9. Exploratory Research and Development Fund, FY 1990

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1992-05-01

    The Lawrence Berkeley Laboratory Exploratory R&D Fund FY 1990 report is compiled from annual reports submitted by principal investigators following the close of the fiscal year. This report describes the projects supported and summarizes their accomplishments. It constitutes a part of an Exploratory R&D Fund (ERF) planning and documentation process that includes an annual planning cycle, project selection, implementation, and review. The research areas covered in this report are: accelerator and fusion research; applied science; cell and molecular biology; chemical biodynamics; chemical sciences; earth sciences; engineering; information and computing sciences; materials sciences; nuclear science; physics; and research medicine and radiation biophysics.

  10. Quality-assurance plan for water-resources activities of the U.S. Geological Survey in Idaho

    USGS Publications Warehouse

    Packard, F.A.

    1996-01-01

    To ensure continued confidence in its products, the Water Resources Division of the U.S. Geological Survey implemented a policy that all its scientific work be performed in accordance with a centrally managed quality-assurance program. This report establishes and documents a formal policy for current (1995) quality assurance within the Idaho District of the U.S. Geological Survey. Quality assurance is formalized by describing district organization and operational responsibilities, documenting the district quality-assurance policies, and describing district functions. The district conducts its work through offices in Boise, Idaho Falls, Twin Falls, Sandpoint, and at the Idaho National Engineering Laboratory. Data-collection programs and interpretive studies are conducted by two operating units, and operational and technical assistance is provided by three support units: (1) Administrative Services advisors provide guidance on various personnel issues and budget functions, (2) computer and reports advisors provide guidance in their fields, and (3) discipline specialists provide technical advice and assistance to the district and to chiefs of various projects. The district's quality-assurance plan is based on an overall policy that provides a framework for defining the precision and accuracy of collected data. The plan is supported by a series of quality-assurance policy statements that describe responsibilities for specific operations in the district's program. The operations are program planning; project planning; project implementation; review and remediation; data collection; equipment calibration and maintenance; data processing and storage; data analysis, synthesis, and interpretation; report preparation and processing; and training. Activities of the district are systematically conducted under a hierarchy of supervision and management that is designed to ensure conformance with Water Resources Division goals for quality assurance.
The district quality-assurance plan does not describe detailed technical activities that are commonly termed "quality-control procedures." Instead, it focuses on current policies, operations, and responsibilities that are implemented at the management level. Contents of the plan will be reviewed annually and updated as programs and operations change.

  11. Software Infrastructure for Computer-aided Drug Discovery and Development, a Practical Example with Guidelines.

    PubMed

    Moretti, Loris; Sartori, Luca

    2016-09-01

    In the field of Computer-Aided Drug Discovery and Development (CADDD) the proper software infrastructure is essential for everyday investigations. The creation of such an environment should be carefully planned and implemented with certain features in order to be productive and efficient. Here we describe a solution to integrate standard computational services into a functional unit that empowers modelling applications for drug discovery. This system allows users with various levels of expertise to run in silico experiments automatically and without the burden of file formatting for different software, managing the actual computation, keeping track of the activities and graphical rendering of the structural outcomes. To showcase the potential of this approach, the performances of five different docking programs on an HIV-1 protease test set are presented. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Planning intensive care unit design using computer simulation modeling: optimizing integration of clinical, operational, and architectural requirements.

    PubMed

    OʼHara, Susan

    2014-01-01

    Nurses have increasingly been regarded as critical members of the planning team as architects recognize their knowledge and value. But the nurses' role as knowledge experts can be expanded to leading efforts to integrate the clinical, operational, and architectural expertise through simulation modeling. Simulation modeling allows for the optimal merge of multifactorial data to understand the current state of the intensive care unit and predict future states. Nurses can champion the simulation modeling process and reap the benefits of a cost-effective way to test new designs, processes, staffing models, and future programming trends prior to implementation. Simulation modeling is an evidence-based planning approach, a standard, for integrating the sciences with real client data, to offer solutions for improving patient care.

  13. Traffic routing in a switched regenerative satellite. Volume 1, task 3: Traffic assignment

    NASA Astrophysics Data System (ADS)

    1982-12-01

    Time plan assignment in a multibeam SS-TDMA system is discussed. System features fixed by the designer, such as the number and speed of ground terminals installed in each station and the number and speed of satellite transponders working in each spot, are described. Linkage among terminals and transponders is also discussed, including the case of more than one transponder linked to one terminal. A procedure is proposed to achieve a switching plan with high efficiency, taking into account all system constraints, such as no burst breaking and the harmonization of two transmission rates. The algorithms to be implemented are: the Hungarian method; branch and bound; the INSERT heuristic; and the HOLE heuristic. Computer programs were developed, and a time plan for a European satellite system is produced.
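    The Hungarian method named above solves the assignment problem: pick one entry per row and column of a cost (or traffic) matrix so the total is optimal. For small matrices the same optimum can be found by exhaustive search, which this illustrative sketch uses in place of the polynomial-time Hungarian algorithm:

```python
from itertools import permutations

def best_assignment(cost):
    """Exhaustively solve the assignment problem on an n x n cost matrix.
    Returns (minimal total cost, column index assigned to each row).
    The Hungarian method finds the same optimum in polynomial time."""
    n = len(cost)
    best_cost, best_perm = float("inf"), None
    for perm in permutations(range(n)):
        total = sum(cost[i][perm[i]] for i in range(n))
        if total < best_cost:
            best_cost, best_perm = total, perm
    return best_cost, best_perm
```

    In the SS-TDMA setting the matrix entries would be beam-to-beam traffic demands and the assignments form the switching modes of the time plan.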

  14. A Spacelab Expert System for Remote Engineering and Science

    NASA Technical Reports Server (NTRS)

    Groleau, Nick; Colombano, Silvano; Friedland, Peter (Technical Monitor)

    1994-01-01

    NASA's space science program is based on strictly pre-planned activities. This approach does not always result in the best science. We describe an existing computer system that enables space science to be conducted in a more reactive manner through advanced automation techniques, recently used on the SLS-2 space shuttle flight of October 1993. Advanced computing techniques, usually developed in the field of Artificial Intelligence, allow large portions of the scientific investigator's knowledge to be "packaged" in a portable computer to present advice to the astronaut operator. We strongly believe that this technology has wide applicability to other forms of remote science/engineering. In this brief article, we present the technology of remote science/engineering assistance as implemented for the SLS-2 space shuttle flight. We begin with a logical overview of the system (paying particular attention to the implementation details relevant to the use of the embedded knowledge for system reasoning), then describe its use and success in space, and conclude with ideas about possible earth uses of the technology in the life and medical sciences.

  15. Development of digital interactive processing system for NOAA satellites AVHRR data

    NASA Astrophysics Data System (ADS)

    Gupta, R. K.; Murthy, N. N.

    The paper discusses a digital image processing system for NOAA/AVHRR data, including land applications, configured around a VAX 11/750 host computer supported by an FPS 100 Array Processor, a Comtal graphic display, and HP plotting devices. The system software accomplishes a relational data base with query and editing facilities; a man-machine interface using form, menu, and prompt inputs, including validation of user entries for data type and range; and preprocessing software for data calibration, Sun-angle correction, geometric corrections for the Earth curvature effect and Earth rotation offsets, and Earth location of the AVHRR image. The implemented image enhancement techniques, such as grey level stretching, histogram equalization, and convolution, are discussed. The software implementation details for the computation of the vegetative index and normalized vegetative index using NOAA/AVHRR channels 1 and 2 data are presented together with output; the scientific background for such computations and the obtainability of similar indices from Landsat/MSS data are also included. The paper concludes by specifying the further software developments planned and the progress envisaged in the field of vegetation index studies.
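    The normalized vegetation index computed from AVHRR channels 1 and 2 is the standard NDVI, (NIR - red)/(NIR + red), with channel 1 covering the visible red band and channel 2 the near-infrared. A minimal sketch (the `eps` guard against division by zero is an implementation detail added here):

```python
import numpy as np

def ndvi(ch1_red, ch2_nir, eps=1e-12):
    """Normalized Difference Vegetation Index from AVHRR channel 1 (red)
    and channel 2 (near-infrared) reflectances."""
    ch1 = np.asarray(ch1_red, dtype=float)
    ch2 = np.asarray(ch2_nir, dtype=float)
    return (ch2 - ch1) / (ch2 + ch1 + eps)
```

    Healthy vegetation reflects strongly in the near-infrared and absorbs red light, so NDVI approaches 1 over dense vegetation and 0 or below over bare ground and water.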

  16. Software components for medical image visualization and surgical planning

    NASA Astrophysics Data System (ADS)

    Starreveld, Yves P.; Gobbi, David G.; Finnis, Kirk; Peters, Terence M.

    2001-05-01

    Purpose: The development of new applications in medical image visualization and surgical planning requires the completion of many common tasks such as image reading and re-sampling, segmentation, volume rendering, and surface display. Intra-operative use requires an interface to a tracking system and image registration, and the application requires basic, easy-to-understand user interface components. Rapid changes in computer and end-application hardware, as well as in operating systems and network environments, make it desirable to have a hardware- and operating-system-independent collection of reusable software components that can be assembled rapidly to prototype new applications. Methods: Using the OpenGL-based Visualization Toolkit as a base, we have developed a set of components that implement the above-mentioned tasks. The components are written in both C++ and Python, but all are accessible from Python, a byte-compiled scripting language. The components have been used on the Red Hat Linux, Silicon Graphics Iris, Microsoft Windows, and Apple OS X platforms. Rigorous object-oriented software design methods have been applied to ensure hardware independence and a standard application programming interface (API). There are components to acquire, display, and register images from MRI, MRA, CT, Computed Rotational Angiography (CRA), Digital Subtraction Angiography (DSA), 2D and 3D ultrasound, video and physiological recordings. Interfaces to various tracking systems for intra-operative use have also been implemented. Results: The described components have been implemented and tested. To date they have been used to create image manipulation and viewing tools, a deep brain functional atlas, a 3D ultrasound acquisition and display platform, a prototype minimally invasive robotic coronary artery bypass graft planning system, a tracked neuro-endoscope guidance system and a frame-based stereotaxy neurosurgery planning tool.
The frame-based stereotaxy module has been licensed and certified for use in a commercial image guidance system. Conclusions: It is feasible to encapsulate image manipulation and surgical guidance tasks in individual, reusable software modules. These modules allow for faster development of new applications. The strict application of object oriented software design methods allows individual components of such a system to make the transition from the research environment to a commercial one.
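    The component-based design described above can be illustrated with a minimal sketch in plain Python (no VTK dependency): components share a common interface and are chained into a pipeline, which is the property that makes rapid prototyping possible. All class and method names here are invented for illustration, not the actual API.

```python
# Minimal sketch of a component pipeline with a shared interface.
# Class names, method names, and the metadata fields are hypothetical.

class Component:
    """Common interface: each component transforms its input and passes it on."""
    def process(self, data):
        raise NotImplementedError

class ImageReader(Component):
    def process(self, data):
        # Stand-in for reading an image volume from disk.
        return {"voxels": data, "spacing": 1.0}

class Resampler(Component):
    def __init__(self, factor):
        self.factor = factor
    def process(self, image):
        # Stand-in for re-sampling: here we only adjust the spacing metadata.
        image["spacing"] *= self.factor
        return image

def run_pipeline(components, data):
    """Assemble components rapidly by chaining their process() calls."""
    for component in components:
        data = component.process(data)
    return data

image = run_pipeline([ImageReader(), Resampler(2.0)], [0, 1, 2])
print(image["spacing"])  # 2.0
```

    Because every component honors the same interface, swapping a reader or adding a registration step is a one-line change to the component list, which is the sense in which such modules can be "assembled rapidly".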

  17. The "big bang" implementation: not for the faint of heart.

    PubMed

    Anderson, Linda K; Stafford, Cynthia J

    2002-01-01

    Replacing a hospital's obsolete mainframe computer system with a modern integrated clinical and administrative information system presents multiple challenges. When the new system is activated in one weekend, in "big bang" fashion, the challenges are magnified. Careful planning is essential to ensure that all hospital staff are fully prepared for this transition, knowing this conversion will involve system downtime, procedural changes, and the resulting stress that naturally accompanies change. Implementation concerns include staff preparation and training, process changes, continuity of patient care, and technical and administrative support. This article outlines how the University of Missouri Health Care addressed these operational concerns during this dramatic information system conversion.

  18. An active monitoring method for flood events

    NASA Astrophysics Data System (ADS)

    Chen, Zeqiang; Chen, Nengcheng; Du, Wenying; Gong, Jianya

    2018-07-01

    Timely, active detection and monitoring of a flood event are critical for a quick response, effective decision-making, and disaster reduction. To achieve this purpose, this paper proposes an active service framework for flood monitoring based on Sensor Web services, together with an active model for the concrete implementation of the framework. The framework consists of two core components: active warning and active planning. The active warning component is based on a publish-subscribe mechanism implemented by the Sensor Event Service. The active planning component employs the Sensor Planning Service to control the execution of the schemes and models and to plan the model input data. The active model, called SMDSA, defines the quantitative calculation method for five elements (scheme, model, data, sensor, and auxiliary information) as well as their associations. Experimental monitoring of the Liangzi Lake flood in the summer of 2010 was conducted to test the proposed framework and model. The results show that: 1) the proposed active service framework is efficient for timely and automated flood monitoring; 2) the active model, SMDSA, provides a quantitative calculation method that moves flood monitoring from manual intervention to automatic computation; and 3) as much preliminary work as possible should be done in advance to take full advantage of the active service framework and the active model.
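    The publish-subscribe mechanism at the heart of the active warning component can be sketched in a few lines of plain Python. This illustrates only the pattern, not the Sensor Event Service itself; the station names and the 5.0 m threshold are invented.

```python
# Minimal publish-subscribe sketch of the active-warning idea.
# Station names and the water-level threshold are hypothetical.

class EventBus:
    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def publish(self, event):
        # Deliver each published observation to every subscriber.
        for callback in self._subscribers:
            callback(event)

triggered_plans = []

def flood_warning(event):
    # Active planning would be triggered here (e.g. tasking sensors).
    if event["water_level_m"] > 5.0:
        triggered_plans.append(event["station"])

bus = EventBus()
bus.subscribe(flood_warning)
bus.publish({"station": "Liangzi-01", "water_level_m": 4.2})  # below threshold
bus.publish({"station": "Liangzi-02", "water_level_m": 6.8})  # triggers planning
print(triggered_plans)  # ['Liangzi-02']
```

    The key design point is that the monitoring logic reacts to published observations rather than polling sensors, which is what makes the framework "active".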

  19. 77 FR 5191 - Approval and Promulgation of Air Quality Implementation Plans; District of Columbia; Regional...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-02

    ... Promulgation of Air Quality Implementation Plans; District of Columbia; Regional Haze State Implementation Plan... of Columbia Regional Haze Plan, a revision to the District of Columbia State Implementation Plan (SIP... existing anthropogenic impairment of visibility in mandatory Class I areas through a regional haze program...

  20. Computed Tomographic Angiographic Perforator Localization for Virtual Surgical Planning of Osteocutaneous Fibular Free Flaps in Head and Neck Reconstruction.

    PubMed

    Ettinger, Kyle S; Alexander, Amy E; Arce, Kevin

    2018-04-10

    Virtual surgical planning (VSP), computer-aided design and computer-aided modeling, and 3-dimensional printing are 3 distinct technologies that have become increasingly used in head and neck oncology and microvascular reconstruction. Although each of these technologies has long been used for treatment planning in other surgical disciplines, such as craniofacial surgery, trauma surgery, temporomandibular joint surgery, and orthognathic surgery, its widespread use in head and neck reconstructive surgery remains a much more recent event. In response to the growing trend of VSP being used for the planning of fibular free flaps in head and neck reconstruction, some surgeons have questioned the technology's implementation based on its inadequacy in addressing other reconstructive considerations beyond hard tissue anatomy. Detractors of VSP for head and neck reconstruction highlight its lack of capability in accounting for multiple reconstructive factors, such as recipient vessel selection, vascular pedicle reach, need for dead space obliteration, and skin paddle perforator location. It is with this premise in mind that the authors report on a straightforward technique for anatomically localizing peroneal artery perforators during VSP for osteocutaneous fibular free flaps in which bone and a soft tissue skin paddle are required for ablative reconstruction. The technique allows for anatomic perforator localization during the VSP session based solely on data existent at preoperative computed tomographic angiography (CTA); it does not require any modifications to preoperative clinical workflows. It is the authors' presumption that many surgeons in the field are unaware of this planning capability within the context of modern VSP for head and neck reconstruction. 
The primary purpose of this report is to introduce and further familiarize surgeons with the technique of CTA perforator localization as a method of improving intraoperative fidelity for VSP of osteocutaneous fibular free flaps. Copyright © 2018. Published by Elsevier Inc.

  1. Asymmetric collimation: Dosimetric characteristics, treatment planning algorithm, and clinical applications

    NASA Astrophysics Data System (ADS)

    Kwa, William

    1998-11-01

    In this thesis the dosimetric characteristics of asymmetric fields are investigated and a new computation method for the dosimetry of asymmetric fields is described and implemented into an existing treatment planning algorithm. Based on this asymmetric field treatment planning algorithm, the clinical use of asymmetric fields in cancer treatment is investigated, and new treatment techniques for conformal therapy are developed. Dose calculation is verified with thermoluminescent dosimeters in a body phantom. In this thesis, an analytical approach is proposed to account for the dose reduction when a corresponding symmetric field is collimated asymmetrically to a smaller asymmetric field. This is represented by a correction factor that uses the ratio of the equivalent field dose contributions between the asymmetric and symmetric fields. The same equation used in the expression of the correction factor can be used for a wide range of asymmetric field sizes, photon energies and linear accelerators. This correction factor will account for the reduction in scatter contributions within an asymmetric field, resulting in the dose profile of an asymmetric field resembling that of a wedged field. The output factors of some linear accelerators are dependent on the collimator settings and whether the upper or lower collimators are used to set the narrower dimension of a radiation field. In addition to this collimator exchange effect for symmetric fields, asymmetric fields are also found to exhibit some asymmetric collimator backscatter effect. The proposed correction factor is extended to account for these effects. A set of correction factors determined semi-empirically to account for the dose reduction in the penumbral region and outside the radiated field is established. Since these correction factors rely only on the output factors and the tissue maximum ratios, they can easily be implemented into an existing treatment planning system. 
    There is no need to store either additional sets of asymmetric field profiles or databases for the implementation of these correction factors into an existing in-house treatment planning system. With this asymmetric field algorithm, computation is found to be 20 times faster than with a commercial system. This computation method can also be generalized to the dose representation of a two-fold asymmetric field, whereby both the field width and length are set asymmetrically, and the calculations are not limited to points lying on one of the principal planes. The dosimetric consequences of asymmetric fields on dose delivery in clinical situations are investigated. Examples of the clinical use of asymmetric fields are given, and the potential use of asymmetric fields in conformal therapy is demonstrated. An alternative head and neck conformal therapy technique is described, and the treatment plan is compared to the conventional technique. The dose distributions calculated for the standard and alternative techniques are confirmed with thermoluminescent dosimeters in a body phantom at selected dose points. (Abstract shortened by UMI.)
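    The correction-factor idea above, a ratio of equivalent-field scatter contributions between the asymmetric and symmetric fields, can be illustrated with a toy calculation. The equivalent-square formula is the standard 2wl/(w+l) approximation; the linear scatter model below is invented for illustration and is not the thesis's actual parameterization.

```python
# Toy illustration of a scatter correction factor for asymmetric collimation.
# equivalent_square uses the standard approximation; the scatter model is
# hypothetical and chosen only to be monotonic in field size.

def equivalent_square(width, length):
    """Equivalent-square side of a rectangular field: 2wl / (w + l)."""
    return 2.0 * width * length / (width + length)

def scatter_contribution(eq_square_side):
    # Hypothetical monotonic scatter model: larger fields scatter more.
    return 1.0 + 0.01 * eq_square_side

def correction_factor(asym_w, asym_l, sym_w, sym_l):
    """Ratio of equivalent-field scatter: asymmetric over symmetric."""
    return (scatter_contribution(equivalent_square(asym_w, asym_l)) /
            scatter_contribution(equivalent_square(sym_w, sym_l)))

# A 10x10 cm field collimated down to a 6x10 cm asymmetric field:
cf = correction_factor(6, 10, 10, 10)
print(round(cf, 4))  # below 1: scatter is reduced in the smaller field
```

    Because the factor depends only on quantities a planning system already stores (output factors and tissue maximum ratios in the thesis; field sizes in this toy version), no extra profile databases are required, which is the point made above.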

  2. Attention Demands of Spoken Word Planning: A Review

    PubMed Central

    Roelofs, Ardi; Piai, Vitória

    2011-01-01

    Attention and language are among the most intensively researched abilities in the cognitive neurosciences, but the relation between these abilities has largely been neglected. There is increasing evidence, however, that linguistic processes, such as those underlying the planning of words, cannot proceed without paying some form of attention. Here, we review evidence that word planning requires some but not full attention. The evidence comes from chronometric studies of word planning in picture naming and word reading under divided attention conditions. It is generally assumed that the central attention demands of a process are indexed by the extent that the process delays the performance of a concurrent unrelated task. The studies measured the speed and accuracy of linguistic and non-linguistic responding as well as eye gaze durations reflecting the allocation of attention. First, empirical evidence indicates that in several task situations, processes up to and including phonological encoding in word planning delay, or are delayed by, the performance of concurrent unrelated non-linguistic tasks. These findings suggest that word planning requires central attention. Second, empirical evidence indicates that conflicts in word planning may be resolved while concurrently performing an unrelated non-linguistic task, making a task decision, or making a go/no-go decision. These findings suggest that word planning does not require full central attention. We outline a computationally implemented theory of attention and word planning, and describe at various points the outcomes of computer simulations that demonstrate the utility of the theory in accounting for the key findings. Finally, we indicate how attention deficits may contribute to impaired language performance, such as in individuals with specific language impairment. PMID:22069393

  3. Poster — Thur Eve — 76: Dosimetric Comparison of Pinnacle and iPlan Algorithms with an Anthropomorphic Lung Phantom

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lopez, P.; Tambasco, M.; LaFontaine, R.

    2014-08-15

    Our goal is to compare the dosimetric accuracy of the Pinnacle-3 9.2 Collapsed Cone Convolution Superposition (CCCS) and the iPlan 4.1 Monte Carlo (MC) and Pencil Beam (PB) algorithms in an anthropomorphic lung phantom using measurement as the gold standard. Ion chamber measurements were taken for 6, 10, and 18 MV beams in a CIRS E2E SBRT Anthropomorphic Lung Phantom, which mimics lung, spine, ribs, and tissue. The plan implemented six beams with a 5×5 cm² field size, delivering a total dose of 48 Gy. Data from the planning systems were computed at the treatment isocenter in the left lung, and at two off-axis points, the spinal cord and the right lung. The measurements were taken using a pinpoint chamber. The best agreement between data from the algorithms and our measurements occurs at the treatment isocenter. For the 6, 10, and 18 MV beams, iPlan 4.1 MC software performs the best with 0.3%, 0.2%, and 4.2% absolute percent difference from measurement, respectively. Differences between our measurements and algorithm data are much greater for the off-axis points. The best agreement seen for the right lung and spinal cord is 11.4% absolute percent difference, with 6 MV iPlan 4.1 PB and 18 MV iPlan 4.1 MC, respectively. As energy increases, the absolute percent difference from measured data increases, up to 54.8% for the 18 MV CCCS algorithm. This study suggests that iPlan 4.1 MC computes peripheral dose and target dose in the lung more accurately than the iPlan 4.1 PB and Pinnacle CCCS algorithms.
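    The absolute percent differences quoted above are straightforward to reproduce; a minimal helper (the example dose values are invented, not from the study):

```python
def absolute_percent_difference(computed, measured):
    """Absolute percent difference of a computed dose from measurement."""
    return abs(computed - measured) / measured * 100.0

# Example: a computed isocenter dose of 48.1 Gy against a measured 48 Gy
# gives roughly a 0.21% difference.
print(round(absolute_percent_difference(48.1, 48.0), 2))
```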

  4. Virtual Planning, Control, and Machining for a Modular-Based Automated Factory Operation in an Augmented Reality Environment

    PubMed Central

    Pai, Yun Suen; Yap, Hwa Jen; Md Dawal, Siti Zawiah; Ramesh, S.; Phoon, Sin Ye

    2016-01-01

    This study presents a modular-based implementation of augmented reality to provide an immersive experience in learning or teaching the planning phase, control system, and machining parameters of a fully automated work cell. The architecture of the system consists of three code modules that can operate independently or combined to create a complete system that is able to guide engineers from the layout planning phase to the prototyping of the final product. The layout planning module determines the best possible arrangement in a layout for the placement of various machines, in this case a conveyor belt for transportation, a robot arm for pick-and-place operations, and a computer numerical control milling machine to generate the final prototype. The robotic arm module simulates the pick-and-place operation offline from the conveyor belt to a computer numerical control (CNC) machine utilising collision detection and inverse kinematics. Finally, the CNC module performs virtual machining based on the Uniform Space Decomposition method and axis aligned bounding box collision detection. The conducted case study revealed that given the situation, a semi-circle shaped arrangement is desirable, whereas the pick-and-place system and the final generated G-code produced the highest deviation of 3.83 mm and 5.8 mm respectively. PMID:27271840
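    The axis-aligned bounding box (AABB) test used by the CNC module for virtual machining is a standard per-axis interval-overlap check; a minimal 3D sketch (the box coordinates are invented):

```python
# Standard 3D AABB overlap test: two boxes collide iff their intervals
# overlap on every axis. A box is ((min_x, min_y, min_z), (max_x, max_y, max_z)).

def aabb_collide(box_a, box_b):
    (a_min, a_max), (b_min, b_max) = box_a, box_b
    return all(a_min[i] <= b_max[i] and b_min[i] <= a_max[i] for i in range(3))

tool  = ((0, 0, 0), (2, 2, 2))
stock = ((1, 1, 1), (3, 3, 3))   # overlaps the tool
table = ((5, 5, 0), (8, 8, 1))   # clear of the tool

print(aabb_collide(tool, stock))  # True
print(aabb_collide(tool, table))  # False
```

    The test is cheap (six comparisons per pair), which is why AABBs are commonly used as a broad-phase check before more exact collision geometry such as the Uniform Space Decomposition mentioned above.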

  5. Virtual Planning, Control, and Machining for a Modular-Based Automated Factory Operation in an Augmented Reality Environment.

    PubMed

    Pai, Yun Suen; Yap, Hwa Jen; Md Dawal, Siti Zawiah; Ramesh, S; Phoon, Sin Ye

    2016-06-07

    This study presents a modular-based implementation of augmented reality to provide an immersive experience in learning or teaching the planning phase, control system, and machining parameters of a fully automated work cell. The architecture of the system consists of three code modules that can operate independently or combined to create a complete system that is able to guide engineers from the layout planning phase to the prototyping of the final product. The layout planning module determines the best possible arrangement in a layout for the placement of various machines, in this case a conveyor belt for transportation, a robot arm for pick-and-place operations, and a computer numerical control milling machine to generate the final prototype. The robotic arm module simulates the pick-and-place operation offline from the conveyor belt to a computer numerical control (CNC) machine utilising collision detection and inverse kinematics. Finally, the CNC module performs virtual machining based on the Uniform Space Decomposition method and axis aligned bounding box collision detection. The conducted case study revealed that given the situation, a semi-circle shaped arrangement is desirable, whereas the pick-and-place system and the final generated G-code produced the highest deviation of 3.83 mm and 5.8 mm respectively.

  6. Virtual Planning, Control, and Machining for a Modular-Based Automated Factory Operation in an Augmented Reality Environment

    NASA Astrophysics Data System (ADS)

    Pai, Yun Suen; Yap, Hwa Jen; Md Dawal, Siti Zawiah; Ramesh, S.; Phoon, Sin Ye

    2016-06-01

    This study presents a modular-based implementation of augmented reality to provide an immersive experience in learning or teaching the planning phase, control system, and machining parameters of a fully automated work cell. The architecture of the system consists of three code modules that can operate independently or combined to create a complete system that is able to guide engineers from the layout planning phase to the prototyping of the final product. The layout planning module determines the best possible arrangement in a layout for the placement of various machines, in this case a conveyor belt for transportation, a robot arm for pick-and-place operations, and a computer numerical control milling machine to generate the final prototype. The robotic arm module simulates the pick-and-place operation offline from the conveyor belt to a computer numerical control (CNC) machine utilising collision detection and inverse kinematics. Finally, the CNC module performs virtual machining based on the Uniform Space Decomposition method and axis aligned bounding box collision detection. The conducted case study revealed that given the situation, a semi-circle shaped arrangement is desirable, whereas the pick-and-place system and the final generated G-code produced the highest deviation of 3.83 mm and 5.8 mm respectively.

  7. A new approach to integrate GPU-based Monte Carlo simulation into inverse treatment plan optimization for proton therapy.

    PubMed

    Li, Yongbao; Tian, Zhen; Song, Ting; Wu, Zhaoxia; Liu, Yaqiang; Jiang, Steve; Jia, Xun

    2017-01-07

    Monte Carlo (MC)-based spot dose calculation is highly desired for inverse treatment planning in proton therapy because of its accuracy. Recent studies on biological optimization have also indicated the use of MC methods to compute relevant quantities of interest, e.g. linear energy transfer. Although GPU-based MC engines have been developed to address inverse optimization problems, their efficiency still needs to be improved. Also, the use of a large number of GPUs in MC calculation is not favorable for clinical applications. The previously proposed adaptive particle sampling (APS) method can improve the efficiency of MC-based inverse optimization by using the computationally expensive MC simulation more effectively. This method is more efficient than the conventional approach that performs spot dose calculation and optimization in two sequential steps. In this paper, we propose a computational library to perform MC-based spot dose calculation on GPU with the APS scheme. The implemented APS method performs a non-uniform sampling of the particles from pencil beam spots during the optimization process, favoring those from the high intensity spots. The library also conducts two computationally intensive matrix-vector operations frequently used when solving an optimization problem. This library design allows a streamlined integration of the MC-based spot dose calculation into an existing proton therapy inverse planning process. We tested the developed library in a typical inverse optimization system with four patient cases. The library achieved the targeted functions by supporting inverse planning in various proton therapy schemes, e.g. single field uniform dose, 3D intensity modulated proton therapy, and distal edge tracking. The efficiency was 41.6  ±  15.3% higher than the use of a GPU-based MC package in a conventional calculation scheme. The total computation time ranged between 2 and 50 min on a single GPU card depending on the problem size.
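    The non-uniform sampling step of the APS scheme, drawing more MC particles from high-intensity pencil-beam spots, can be sketched with standard-library weighted sampling. The spot intensities are invented, and this shows only the sampling idea, not the paper's GPU implementation.

```python
import random

# Sketch of adaptive particle sampling: particles are drawn from pencil-beam
# spots with probability proportional to current spot intensity, so the
# expensive MC effort concentrates on the spots that contribute most dose.

random.seed(0)  # reproducible illustration

spot_intensities = [0.7, 0.2, 0.1]  # hypothetical relative spot weights
n_particles = 10000
samples = random.choices(range(len(spot_intensities)),
                         weights=spot_intensities, k=n_particles)

counts = [samples.count(i) for i in range(len(spot_intensities))]
print(counts)  # the high-intensity spot 0 receives the most particles
```

    As the optimizer updates spot intensities, re-drawing with the new weights automatically shifts simulation effort toward the spots that currently matter, which is what makes the scheme more efficient than a fixed two-step calculate-then-optimize approach.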

  8. A new approach to integrate GPU-based Monte Carlo simulation into inverse treatment plan optimization for proton therapy

    NASA Astrophysics Data System (ADS)

    Li, Yongbao; Tian, Zhen; Song, Ting; Wu, Zhaoxia; Liu, Yaqiang; Jiang, Steve; Jia, Xun

    2017-01-01

    Monte Carlo (MC)-based spot dose calculation is highly desired for inverse treatment planning in proton therapy because of its accuracy. Recent studies on biological optimization have also indicated the use of MC methods to compute relevant quantities of interest, e.g. linear energy transfer. Although GPU-based MC engines have been developed to address inverse optimization problems, their efficiency still needs to be improved. Also, the use of a large number of GPUs in MC calculation is not favorable for clinical applications. The previously proposed adaptive particle sampling (APS) method can improve the efficiency of MC-based inverse optimization by using the computationally expensive MC simulation more effectively. This method is more efficient than the conventional approach that performs spot dose calculation and optimization in two sequential steps. In this paper, we propose a computational library to perform MC-based spot dose calculation on GPU with the APS scheme. The implemented APS method performs a non-uniform sampling of the particles from pencil beam spots during the optimization process, favoring those from the high intensity spots. The library also conducts two computationally intensive matrix-vector operations frequently used when solving an optimization problem. This library design allows a streamlined integration of the MC-based spot dose calculation into an existing proton therapy inverse planning process. We tested the developed library in a typical inverse optimization system with four patient cases. The library achieved the targeted functions by supporting inverse planning in various proton therapy schemes, e.g. single field uniform dose, 3D intensity modulated proton therapy, and distal edge tracking. The efficiency was 41.6  ±  15.3% higher than the use of a GPU-based MC package in a conventional calculation scheme. The total computation time ranged between 2 and 50 min on a single GPU card depending on the problem size.

  9. A New Approach to Integrate GPU-based Monte Carlo Simulation into Inverse Treatment Plan Optimization for Proton Therapy

    PubMed Central

    Li, Yongbao; Tian, Zhen; Song, Ting; Wu, Zhaoxia; Liu, Yaqiang; Jiang, Steve; Jia, Xun

    2016-01-01

    Monte Carlo (MC)-based spot dose calculation is highly desired for inverse treatment planning in proton therapy because of its accuracy. Recent studies on biological optimization have also indicated the use of MC methods to compute relevant quantities of interest, e.g. linear energy transfer. Although GPU-based MC engines have been developed to address inverse optimization problems, their efficiency still needs to be improved. Also, the use of a large number of GPUs in MC calculation is not favorable for clinical applications. The previously proposed adaptive particle sampling (APS) method can improve the efficiency of MC-based inverse optimization by using the computationally expensive MC simulation more effectively. This method is more efficient than the conventional approach that performs spot dose calculation and optimization in two sequential steps. In this paper, we propose a computational library to perform MC-based spot dose calculation on GPU with the APS scheme. The implemented APS method performs a non-uniform sampling of the particles from pencil beam spots during the optimization process, favoring those from the high intensity spots. The library also conducts two computationally intensive matrix-vector operations frequently used when solving an optimization problem. This library design allows a streamlined integration of the MC-based spot dose calculation into an existing proton therapy inverse planning process. We tested the developed library in a typical inverse optimization system with four patient cases. The library achieved the targeted functions by supporting inverse planning in various proton therapy schemes, e.g. single field uniform dose, 3D intensity modulated proton therapy, and distal edge tracking. The efficiency was 41.6±15.3% higher than the use of a GPU-based MC package in a conventional calculation scheme. The total computation time ranged between 2 and 50 min on a single GPU card depending on the problem size. PMID:27991456

  10. SU-E-T-395: Multi-GPU-Based VMAT Treatment Plan Optimization Using a Column-Generation Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tian, Z; Shi, F; Jia, X

    Purpose: GPU has been employed to speed up VMAT optimizations from hours to minutes. However, its limited memory capacity makes it difficult to handle cases with a huge dose-deposition-coefficient (DDC) matrix, e.g. those with a large target size, multiple arcs, small beam angle intervals and/or small beamlet size. We propose multi-GPU-based VMAT optimization to solve this memory issue and make GPU-based VMAT more practical for clinical use. Methods: Our column-generation-based method generates apertures sequentially by iteratively searching for an optimal feasible aperture (referred to as the pricing problem, PP) and optimizing aperture intensities (referred to as the master problem, MP). The PP requires access to the large DDC matrix, which is implemented on a multi-GPU system. Each GPU stores a DDC sub-matrix corresponding to one fraction of the beam angles and is only responsible for calculations related to those angles. Broadcast and parallel reduction schemes are adopted for inter-GPU data transfer. The MP is a relatively small-scale problem and is implemented on one GPU. One head-and-neck cancer case was used for testing. Three different strategies for VMAT optimization on a single GPU were also implemented for comparison: (S1) truncating the DDC matrix to ignore its small-value entries during optimization; (S2) transferring the DDC matrix part by part to the GPU during optimization whenever needed; (S3) moving DDC-matrix-related calculations onto the CPU. Results: Our multi-GPU-based implementation reaches a good plan within 1 minute. Although S1 was 10 seconds faster than our method, the obtained plan quality is worse. Both S2 and S3 handle the full DDC matrix and hence yield the same plan as our method. However, the computation time is longer, namely 4 minutes and 30 minutes, respectively.
    Conclusion: Our multi-GPU-based VMAT optimization can effectively solve the limited-memory issue with good plan quality and high efficiency, making GPU-based ultra-fast VMAT planning practical for real clinical use.
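    The inter-GPU scheme described above, where each device holds the DDC sub-matrix for a fraction of the beam angles and a reduction sums the partial doses, reduces to a partitioned matrix-vector product. A pure-Python sketch (the matrix values and the two-way split are invented):

```python
# Sketch of the multi-device idea: split the dose-deposition-coefficient (DDC)
# matrix by beamlet columns, compute partial doses per device, then reduce.

def matvec(matrix, vector):
    """Reference single-device dose: full matrix times intensity vector."""
    return [sum(row[j] * vector[j] for j in range(len(vector))) for row in matrix]

def partitioned_matvec(matrix, vector, n_devices):
    n_cols = len(vector)
    chunk = (n_cols + n_devices - 1) // n_devices
    partials = []
    for d in range(n_devices):  # each "device" sees only its own columns
        cols = range(d * chunk, min((d + 1) * chunk, n_cols))
        partials.append([sum(row[j] * vector[j] for j in cols) for row in matrix])
    # Parallel-reduction stand-in: element-wise sum of the partial doses.
    return [sum(p[i] for p in partials) for i in range(len(matrix))]

ddc = [[1.0, 2.0, 0.5, 0.0],
       [0.0, 1.0, 1.0, 2.0]]          # hypothetical 2-voxel, 4-beamlet DDC
intensities = [1.0, 0.5, 2.0, 1.0]

# The partitioned result matches the full matrix-vector product.
assert partitioned_matvec(ddc, intensities, 2) == matvec(ddc, intensities)
print(partitioned_matvec(ddc, intensities, 2))
```

    Since addition is associative, the reduction recovers the exact full product while each device touches only its share of the matrix, which is why the split sidesteps the single-GPU memory limit without changing the plan.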

  11. Evaluating the impact of computer-generated rounding reports on physician workflow in the nursing home: a feasibility time-motion study.

    PubMed

    Thorpe-Jamison, Patrice T; Culley, Colleen M; Perera, Subashan; Handler, Steven M

    2013-05-01

    To determine the feasibility and impact of a computer-generated rounding report on physician rounding time and perceived barriers to providing clinical care in the nursing home (NH) setting. Three NHs located in Pittsburgh, PA. Ten attending NH physicians. Time-motion method to record the time taken to gather data (pre-rounding), to evaluate patients (rounding), and document their findings/develop an assessment and plan (post-rounding). Additionally, surveys were used to determine the physicians' perception of barriers to providing optimal clinical care, as well as physician satisfaction before and after the use of a computer-generated rounding report. Ten physicians were observed during half-day sessions both before and 4 weeks after they were introduced to a computer-generated rounding report. A total of 69 distinct patients were evaluated during the 20 physician observation sessions. Each physician evaluated, on average, four patients before implementation and three patients after implementation. The observations showed a significant increase (P = .03) in the pre-rounding time, and no significant difference in the rounding (P = .09) or post-rounding times (P = .29). Physicians reported that information was more accessible (P = .03) following the implementation of the computer-generated rounding report. Most (80%) physicians stated that they would prefer to use the computer-generated rounding report rather than the paper-based process. The present study provides preliminary data suggesting that the use of a computer-generated rounding report can decrease some perceived barriers to providing optimal care in the NH. Although the rounding report did not improve rounding time efficiency, most NH physicians would prefer to use the computer-generated report rather than the current paper-based process. 
Improving the accuracy and harmonization of medication information with the electronic medication administration record and rounding reports, as well as improving facility network speeds might improve the effectiveness of this technology. Copyright © 2013 American Medical Directors Association, Inc. Published by Elsevier Inc. All rights reserved.

  12. Online dose reconstruction for tracked volumetric arc therapy: Real-time implementation and offline quality assurance for prostate SBRT.

    PubMed

    Kamerling, Cornelis Ph; Fast, Martin F; Ziegenhein, Peter; Menten, Martin J; Nill, Simeon; Oelfke, Uwe

    2017-11-01

    Firstly, this study provides a real-time implementation of online dose reconstruction for tracked volumetric arc therapy (VMAT). Secondly, this study describes a novel offline quality assurance tool, based on commercial dose calculation algorithms. Online dose reconstruction for VMAT is a computationally challenging task in terms of computer memory usage and calculation speed. To potentially reduce the amount of memory used, we analyzed the impact of beam angle sampling for dose calculation on the accuracy of the dose distribution. To establish the performance of the method, we planned two single-arc VMAT prostate stereotactic body radiation therapy cases for delivery with dynamic MLC tracking. For quality assurance of our online dose reconstruction method we have also developed a stand-alone offline dose reconstruction tool, which utilizes the RayStation treatment planning system to calculate dose. For the online reconstructed dose distributions of the tracked deliveries, we could establish strong resemblance for 72 and 36 beam co-planar equidistant beam samples with less than 1.2% deviation for the assessed dose-volume indicators (clinical target volume D98 and D2, and rectum D2). We could achieve average runtimes of 28-31 ms per reported MLC aperture for both dose computation and accumulation, meeting our real-time requirement. To cross-validate the offline tool, we have compared the planned dose to the offline reconstructed dose for static deliveries and found excellent agreement (3%/3 mm global gamma passing rates of 99.8%-100%). Being able to reconstruct dose during delivery enables online quality assurance and online replanning strategies for VMAT. The offline quality assurance tool provides the means to validate novel online dose reconstruction applications using a commercial dose calculation engine. © 2017 The Authors. Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.

  13. 24 CFR 598.605 - Implementation plan.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 24 Housing and Urban Development 3 2014-04-01 2013-04-01 true Implementation plan. 598.605 Section 598.605 Housing and Urban Development Regulations Relating to Housing and Urban Development (Continued... Grants § 598.605 Implementation plan. (a) Implementation plan content. An EZ must submit an...

  14. 77 FR 41279 - Approval and Promulgation of Air Quality Implementation Plans; Pennsylvania; Regional Haze State...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-13

    ... Promulgation of Air Quality Implementation Plans; Pennsylvania; Regional Haze State Implementation Plan AGENCY... the Regional Haze State Implementation Plan (SIP) (hereafter RH SIP) revision submitted by the... anthropogenic impairment of visibility in mandatory Class I areas

  15. 76 FR 12945 - Instructions for Implementing Climate Change Adaptation Planning in Accordance With Executive...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-09

    ... COUNCIL ON ENVIRONMENTAL QUALITY Instructions for Implementing Climate Change Adaptation Planning... Availability of Climate Change Adaptation Planning Implementing Instructions. SUMMARY: The Chair of the Council... for Implementing Climate Change Adaptation Planning are now available at: http://www.whitehouse.gov...

  16. Flight software requirements and design support system

    NASA Technical Reports Server (NTRS)

    Riddle, W. E.; Edwards, B.

    1980-01-01

    The desirability and feasibility of computer-augmented support for the pre-implementation activities occurring during the development of flight control software was investigated. The specific topics to be investigated were the capabilities to be included in a pre-implementation support system for flight control software system development, and the specification of a preliminary design for such a system. Further, the pre-implementation support system was to be characterized and specified under the constraints that it: (1) support both description and assessment of flight control software requirements definitions and design specification; (2) account for known software description and assessment techniques; (3) be compatible with existing and planned NASA flight control software development support system; and (4) does not impose, but may encourage, specific development technologies. An overview of the results is given.

  17. Electronic Neural Networks

    NASA Technical Reports Server (NTRS)

    Thakoor, Anil

    1990-01-01

    Viewgraphs on electronic neural networks for space station are presented. Topics covered include: electronic neural networks; electronic implementations; VLSI/thin film hybrid hardware for neurocomputing; computations with analog parallel processing; features of neuroprocessors; applications of neuroprocessors; neural network hardware for terrain trafficability determination; a dedicated processor for path planning; neural network system interface; neural network for robotic control; error backpropagation algorithm for learning; resource allocation matrix; global optimization neuroprocessor; and electrically programmable read only thin-film synaptic array.

  18. Design standards. Cutting the costs you don't see.

    PubMed

    Jolley, K

    1995-07-01

    In an era of hospital cost cutting and reengineering at all levels, it is still important to implement interior design standards in planning, selecting, and arranging the products that affect hospital image and function. Design standards manuals include manufacturer and model of furniture, fixtures, and equipment from desks, chairs, and computer stands to window treatments. This article reviews how to save on interior design costs while keeping a professional healthcare image.

  19. Analysis of a Proposed Material Handling System Using a Computer Simulation Model.

    DTIC Science & Technology

    1981-06-01

    the proposed MMHS were identified to assist the managers of the system in implementation and future planning. Submitted by Darwin D. Harp, BSIE, in partial fulfillment of the requirements for the degree of Master of Science in Logistics Management, 17 June 1981. Approved for public release; distribution unlimited.

  20. A Computer Engineering Curriculum for the Air Force Academy: An Implementation Plan

    DTIC Science & Technology

    1985-04-01

    engineering is needed as a result of the findings? 5. What is the impact of this study's recommendation to pursue the Electrical Engineering degree with ...

  1. HAL/S-360 compiler test activity report

    NASA Technical Reports Server (NTRS)

    Helmers, C. T.

    1974-01-01

    The levels of testing employed in verifying the HAL/S-360 compiler were as follows: (1) typical applications program case testing; (2) functional testing of the compiler system and its generated code; and (3) machine oriented testing of compiler implementation on operational computers. Details of the initial test plan and subsequent adaptation are reported, along with complete test results for each phase which examined the production of object codes for every possible source statement.

  2. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure.

    PubMed

    Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei

    2011-09-07

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed.
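    The reported figures are internally consistent with a fixed-overhead parallel-scaling model: 2.58 h of single-threaded work divided over 100 nodes, plus roughly 1.8 min of cluster overhead, yields the observed 3.3 min. A quick arithmetic check; the constant-overhead model is our assumption, not a fit published by the authors.

```python
# Ideal-scaling estimate for distributing a Monte Carlo run over n worker
# nodes, using the figures reported in the abstract (2.58 h single-threaded,
# 3.3 min on 100 nodes). The constant-overhead model is an assumption.
def parallel_runtime_min(serial_hours, nodes, overhead_min=0.0):
    """Runtime in minutes if the work divides evenly across nodes."""
    return serial_hours * 60.0 / nodes + overhead_min

serial_h = 2.58
t100 = parallel_runtime_min(serial_h, 100)  # pure-compute share on 100 nodes
overhead = 3.3 - t100                       # implied per-run overhead, in min
speedup = serial_h * 60.0 / 3.3             # ~47x, matching the abstract
print(round(t100, 2), round(overhead, 2), round(speedup, 1))
```

    The implied overhead of under two minutes is consistent with the abstract's claim that parallelization overhead becomes negligible for large simulations.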

  3. 40 CFR 52.672 - Approval of plans.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS Idaho § 52.672 Approval of plans. (a) Carbon Monoxide. (1) EPA approves as a revision to the Idaho State Implementation Plan, the Limited Maintenance Plan for.... [Reserved] (e) Particulate Matter. (1) EPA approves as a revision to the Idaho State Implementation Plan...

  4. 78 FR 17835 - Approval and Promulgation of Federal Implementation Plan for Oil and Natural Gas Well Production...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-22

    ...The EPA is taking final action to promulgate a Reservation-specific Federal Implementation Plan in order to regulate emissions from oil and natural gas production facilities located on the Fort Berthold Indian Reservation in North Dakota. The Federal Implementation Plan includes basic air quality regulations for the protection of communities in and adjacent to the Fort Berthold Indian Reservation. The Federal Implementation Plan requires owners and operators of oil and natural gas production facilities to reduce emissions of volatile organic compounds emanating from well completions, recompletions, and production and storage operations. This Federal Implementation Plan will be implemented by the EPA, or a delegated tribal authority, until replaced by a Tribal Implementation Plan. The EPA proposed a Reservation-specific Federal Implementation Plan concurrently with an interim final rule on August 15, 2012. This final Federal Implementation Plan replaces the interim final rule for all intents and purposes on the effective date of the final rule. The EPA is taking this action pursuant to the Clean Air Act (CAA).

  5. 24 CFR 598.605 - Implementation plan.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 24 Housing and Urban Development 3 2012-04-01 2012-04-01 false Implementation plan. 598.605 Section 598.605 Housing and Urban Development Regulations Relating to Housing and Urban Development... Empowerment Zone Grants § 598.605 Implementation plan. (a) Implementation plan content. An EZ must submit an...

  6. 24 CFR 598.605 - Implementation plan.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 24 Housing and Urban Development 3 2013-04-01 2013-04-01 false Implementation plan. 598.605 Section 598.605 Housing and Urban Development Regulations Relating to Housing and Urban Development... Empowerment Zone Grants § 598.605 Implementation plan. (a) Implementation plan content. An EZ must submit an...

  7. 78 FR 68005 - Approval and Promulgation of Implementation Plans; Mississippi; Transportation Conformity SIP...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-13

    ... this Federal Register, EPA is approving the State's implementation plan revision as a direct final rule...] Approval and Promulgation of Implementation Plans; Mississippi; Transportation Conformity SIP--Memorandum... proposing to approve a State Implementation Plan revision submitted by the Mississippi Department of...

  8. Effective protection of open space: does planning matter?

    PubMed

    Steelman, Toddi A; Hess, George R

    2009-07-01

    High quality plans are considered a crucial part of good land use planning and often used as a proxy measure for success in plan implementation and goal attainment. We explored the relationship of open space plan quality to the implementation of open space plans and attainment of open space protection goals in Research Triangle, North Carolina, USA. To measure plan quality, we used a standard plan evaluation matrix that we modified to focus on open space plans. We evaluated all open space plans in the region that contained a natural resource protection element. To measure plan implementation and open space protection, we developed an online survey and administered it to open space planners charged with implementing the plans. The survey elicited each planner's perspective on aspects of open space protection in his or her organization. The empirical results (1) indicate that success in implementation and attaining goals are not related to plan quality, (2) highlight the importance of when and how stakeholders are involved in planning and implementation processes, and (3) raise questions about the relationship of planning to implementation. These results suggest that a technically excellent plan does not guarantee the long-term relationships among local land owners, political and appointed officials, and other organizations that are crucial to meeting land protection goals. A greater balance of attention to the entire decision process and building relationships might lead to more success in protecting open space.

  9. SU-F-T-100: Development and Implementation of a Treatment Planning Tracking System Into the Radiation Oncology Clinic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kabat, C; Cline, K; Li, Y

    Purpose: With increasing numbers of cancer patients being diagnosed and the complexity of radiotherapy treatments rising, it is paramount that patient plan development continues to stay fluid within the clinic. In order to maintain a high standard of care and clinical efficiency, the establishment of a tracking system for patient plan development allows healthcare providers to view real-time plan progression and drive clinical workflow. In addition, it provides statistical datasets which can further identify inefficiencies within the clinic. Methods: An application was developed utilizing Microsoft’s ODBC SQL database engine to track patient plan status throughout the treatment planning process while also managing key factors pertaining to the patient’s treatment. Pertinent information is accessible to staff in many locations, including tracking monitors within dosimetry, the clinic network for both computers and handheld devices, and through email notifications. Plans are initiated with a CT and continually tracked through planning stages until final approval by staff. The patient’s status is dynamically updated by the physicians, dosimetrists, and medical physicists based on the stage of the patient’s plan. Results: Our application has been running over a six-month period with all patients being processed through the system. Modifications have been made to allow for new features to be implemented along with additional tracking parameters. Based on in-house feedback, the application has been supportive in streamlining patient plans through the treatment planning process, and data have been accumulating to further improve procedures within the clinic. Conclusion: Over time the clinic will continue to track data with this application. As data accumulate, the clinic will be able to highlight inefficiencies within the workflow and adapt accordingly. We will add new features to help support the treatment planning process in the future.
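    The tracking workflow the abstract describes — a plan record advancing through named stages as staff update it — can be sketched with a small status table. SQLite stands in here for the Microsoft ODBC SQL engine the clinic used, and the stage names are illustrative assumptions, not the clinic's own.

```python
import sqlite3

# Minimal sketch of a plan-status tracking table. The stage names and
# schema are invented for illustration; the actual system described in
# the abstract used Microsoft's ODBC SQL database engine.
STAGES = ["CT acquired", "Contoured", "Planned", "Physics check", "Approved"]

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE plan_status (
    patient_id TEXT PRIMARY KEY,
    stage      TEXT NOT NULL,
    updated    TEXT DEFAULT CURRENT_TIMESTAMP)""")

def advance(patient_id):
    """Move a patient's plan to the next stage (enter at first stage if new)."""
    row = conn.execute("SELECT stage FROM plan_status WHERE patient_id=?",
                       (patient_id,)).fetchone()
    if row is None:
        conn.execute("INSERT INTO plan_status (patient_id, stage) VALUES (?, ?)",
                     (patient_id, STAGES[0]))
    else:
        nxt = STAGES[min(STAGES.index(row[0]) + 1, len(STAGES) - 1)]
        conn.execute("UPDATE plan_status SET stage=? WHERE patient_id=?",
                     (nxt, patient_id))

advance("pt001")
advance("pt001")
print(conn.execute("SELECT stage FROM plan_status").fetchone()[0])  # Contoured
```

    Keeping the stage as a single column makes real-time dashboards (the "tracking monitors" in the abstract) a simple SELECT, while the timestamp column supports the statistical workflow analyses mentioned.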

  10. White paper: A plan for cooperation between NASA and DARPA to establish a center for advanced architectures

    NASA Technical Reports Server (NTRS)

    Denning, P. J.; Adams, G. B., III; Brown, R. L.; Kanerva, P.; Leiner, B. M.; Raugh, M. R.

    1986-01-01

    Large, complex computer systems require many years of development. It is recognized that large scale systems are unlikely to be delivered in useful condition unless users are intimately involved throughout the design process. A mechanism is described that will involve users in the design of advanced computing systems and will accelerate the insertion of new systems into scientific research. This mechanism is embodied in a facility called the Center for Advanced Architectures (CAA). CAA would be a division of RIACS (Research Institute for Advanced Computer Science) and would receive its technical direction from a Scientific Advisory Board established by RIACS. The CAA described here is a possible implementation of a center envisaged in a proposed cooperation between NASA and DARPA.

  11. 78 FR 41731 - Source Specific Federal Implementation Plan for Implementing Best Available Retrofit Technology...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-11

    ... Federal Implementation Plan for Implementing Best Available Retrofit Technology for Four Corners Power... Implementation Plan (FIP) to implement the Best Available Retrofit Technology (BART) requirement of the Regional... given the uncertainties in the electrical market in Arizona, EPA is proposing to extend the date by...

  12. Task-based image quality assessment in radiation therapy: initial characterization and demonstration with CT simulation images

    NASA Astrophysics Data System (ADS)

    Dolly, Steven R.; Anastasio, Mark A.; Yu, Lifeng; Li, Hua

    2017-03-01

    In current radiation therapy practice, image quality is still assessed subjectively or by utilizing physically-based metrics. Recently, a methodology for objective task-based image quality (IQ) assessment in radiation therapy was proposed by Barrett et al. [1]. In this work, we present a comprehensive implementation and evaluation of this new IQ assessment methodology. A modular simulation framework was designed to perform an automated, computer-simulated end-to-end radiation therapy treatment. The fully simulated framework utilizes new learning-based stochastic object models (SOMs) to obtain known organ boundaries, generates a set of images directly from the numerical phantoms created with the SOMs, and automates the image segmentation and treatment planning steps of a radiation therapy workflow. By use of this computational framework, therapeutic operating characteristic (TOC) curves can be computed and the area under the TOC curve (AUTOC) can be employed as a figure-of-merit to guide optimization of different components of the treatment planning process. The developed computational framework is employed to optimize X-ray CT pre-treatment imaging. We demonstrate that use of the radiation therapy-based IQ measures leads to different imaging parameters than those obtained by use of physically-based measures.
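    The figure-of-merit named above, the area under the TOC curve (AUTOC), can be computed from sampled operating points with the trapezoidal rule. A minimal sketch; the (NTCP, TCP) sample points below are invented for illustration, not taken from the paper.

```python
# Trapezoidal AUTOC from matched samples of tumour control probability
# (TCP) and normal-tissue complication probability (NTCP). The sample
# values are illustrative assumptions.
def autoc(tcp, ntcp):
    """Trapezoidal area under a TOC curve given matched (NTCP, TCP) samples."""
    area = 0.0
    for i in range(1, len(ntcp)):
        area += 0.5 * (tcp[i] + tcp[i - 1]) * (ntcp[i] - ntcp[i - 1])
    return area

ntcp = [0.0, 0.1, 0.3, 0.6, 1.0]   # normal-tissue complication probability
tcp  = [0.0, 0.5, 0.8, 0.95, 1.0]  # tumour control probability
print(round(autoc(tcp, ntcp), 3))
```

    An ideal treatment (full tumour control at zero complication probability) gives AUTOC = 1, so larger areas indicate better imaging or planning components, which is what makes AUTOC usable as an optimization objective.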

  13. 78 FR 40654 - Approval, Disapproval and Promulgation of Implementation Plans; State of Wyoming; Regional Haze...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-08

    ..., Disapproval and Promulgation of Implementation Plans; State of Wyoming; Regional Haze State Implementation Plan; Federal Implementation Plan for Regional Haze; Notice of Public Hearings AGENCY: Environmental...) addressing regional haze under. We are making this change in response to letters submitted by the Governor of...

  14. 78 FR 78310 - Approval and Promulgation of Implementation Plans; North Carolina; Transportation Conformity...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-26

    ... Federal Register, EPA is approving the State's implementation plan revision as a direct final rule without...] Approval and Promulgation of Implementation Plans; North Carolina; Transportation Conformity Memorandum of... Implementation Plan submitted on July 12, 2013, through the North Carolina Department of Environment and Natural...

  15. 77 FR 58072 - Finding of Substantial Inadequacy of Implementation Plan; Call for California State...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-19

    ... Substantial Inadequacy of Implementation Plan; Call for California State Implementation Plan Revision; South Coast AGENCY: Environmental Protection Agency (EPA). ACTION: Proposed rule. SUMMARY: In response to a... that the California State Implementation Plan (SIP) for the Los Angeles-South Coast Air Basin (South...

  16. 78 FR 30770 - Approval and Promulgation of Air Quality Implementation Plans; Illinois; Air Quality Standards...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-23

    ... Promulgation of Air Quality Implementation Plans; Illinois; Air Quality Standards Revision AGENCY... Illinois state implementation plan (SIP) to reflect current National Ambient Air Quality Standards (NAAQS... Implementation Plan at 35 Illinois Administrative Code part 243, which updates National Ambient Air Quality...

  17. Exploratory benchtop study evaluating the use of surgical design and simulation in fibula free flap mandibular reconstruction

    PubMed Central

    2013-01-01

    Background Surgical design and simulation (SDS) is a useful tool to help surgeons visualize the anatomy of the patient and perform operative maneuvers on the computer before implementation in the operating room. While these technologies have many advantages, further evidence of their potential to improve outcomes is required. The present benchtop study was intended to identify if there is a difference in surgical outcome between free-hand surgery completed without virtual surgical planning (VSP) software and preoperatively planned surgery completed with the use of VSP software. Methods Five surgeons participated in the study. In Session A, participants were asked to do a free-hand reconstruction of a 3d printed mandible with a defect using a 3d printed fibula. Four weeks later, in Session B, the participants were asked to do the same reconstruction, but in this case using a preoperatively digitally designed surgical plan. Digital registration computer software, hard tissue measures and duration of the task were used to compare the outcome of the benchtop reconstructions. Results The study revealed that: (1) superimposed images produced in a computer aided design (CAD) software were effective in comparing pre and post-surgical outcomes, (2) there was a difference, based on hard tissue measures, in surgical outcome between the two scenarios and (3) there was no difference in the time it took to complete the sessions. Conclusion The study revealed that the participants were more consistent in the preoperatively digitally planned surgery than they were in the free hand surgery. PMID:23800209

  18. BUDEM: an urban growth simulation model using CA for Beijing metropolitan area

    NASA Astrophysics Data System (ADS)

    Long, Ying; Shen, Zhenjiang; Du, Liqun; Mao, Qizhi; Gao, Zhanping

    2008-10-01

    There is a pressing need to identify the future urban form of Beijing, which faces the challenge of rapid growth in urban development projects. We developed the Beijing Urban Developing Model (BUDEM) to support urban planning and the evaluation of corresponding policies. BUDEM is a spatio-temporal dynamic model for simulating urban growth in the Beijing metropolitan area, using cellular automata (CA) and multi-agent system (MAS) approaches. In this phase, a CA-based computer simulation of the Beijing metropolitan area is conducted, which attempts to provide a premise for urban activities, including different kinds of urban development projects such as industrial plants, shopping facilities, and housing. In this paper, the concept model of BUDEM, which is grounded in prevalent urban growth theories, is introduced. A method integrating logistic regression and MonoLoop is used to retrieve the weights of the transition rule by MCE. After a model sensitivity analysis, we apply BUDEM to three urban planning practices: (1) identifying the urban growth mechanism in the various historical phases since 1986; (2) identifying the urban growth policies needed to implement the desired urban form (BEIJING2020), namely the planned urban form; and (3) simulating urban growth scenarios for 2049 (BEIJING2049) based on the urban form and parameter set of BEIJING2020.
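    The logistic transition rule at the heart of such a CA model can be sketched in a few lines: each cell's conversion probability is a logistic function of its neighborhood state. The weights and the toy grid below are invented for illustration; BUDEM's actual coefficients were fitted from Beijing land-use data via logistic regression.

```python
import math

# Toy cellular-automata transition rule in the spirit of the
# logistic-regression rule described above. Weights and grid values
# (1 = developed, 0 = undeveloped) are illustrative assumptions.
W_NEIGH, W_BIAS = 4.0, -3.0   # assumed logistic weights

def develop_prob(grid, r, c):
    """Logistic probability that cell (r, c) converts to urban use,
    driven by the developed fraction of its 3x3 neighborhood."""
    rows, cols = len(grid), len(grid[0])
    neigh = [grid[i][j]
             for i in range(max(0, r - 1), min(rows, r + 2))
             for j in range(max(0, c - 1), min(cols, c + 2))
             if (i, j) != (r, c)]
    frac = sum(neigh) / len(neigh)
    z = W_BIAS + W_NEIGH * frac
    return 1.0 / (1.0 + math.exp(-z))

grid = [[0, 0, 0],
        [0, 0, 1],
        [0, 1, 1]]
p = develop_prob(grid, 1, 1)  # center cell, 3 of 8 neighbours developed
print(round(p, 3))
```

    A full model would add terms for accessibility, planning constraints, and policy variables to z, which is where the fitted weights and multi-criteria evaluation enter.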

  19. Heuristic and optimal policy computations in the human brain during sequential decision-making.

    PubMed

    Korn, Christoph W; Bach, Dominik R

    2018-01-23

    Optimal decisions across extended time horizons require value calculations over multiple probabilistic future states. Humans may circumvent such complex computations by resorting to easy-to-compute heuristics that approximate optimal solutions. To probe the potential interplay between heuristic and optimal computations, we develop a novel sequential decision-making task, framed as virtual foraging in which participants have to avoid virtual starvation. Rewards depend only on final outcomes over five-trial blocks, necessitating planning over five sequential decisions and probabilistic outcomes. Here, we report model comparisons demonstrating that participants primarily rely on the best available heuristic but also use the normatively optimal policy. FMRI signals in medial prefrontal cortex (MPFC) relate to heuristic and optimal policies and associated choice uncertainties. Crucially, reaction times and dorsal MPFC activity scale with discrepancies between heuristic and optimal policies. Thus, sequential decision-making in humans may emerge from integration between heuristic and optimal policies, implemented by controllers in MPFC.
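    The "normatively optimal policy" in a task like this — where only the final state after five probabilistic trials matters — can be computed by backward induction over the remaining trials. A toy sketch in which all task parameters (gains, probabilities, starvation threshold) are invented for illustration, not taken from the study.

```python
# Backward induction over a five-trial foraging block: at each trial the
# agent chooses between a safe and a risky option, and only whether the
# final energy state exceeds a starvation threshold matters. All numeric
# parameters are illustrative assumptions.
from functools import lru_cache

SAFE_GAIN = 1                    # deterministic energy gain
RISKY = [(0.5, 3), (0.5, -2)]    # (probability, energy gain)
TRIALS, START, THRESHOLD = 5, 2, 0

@lru_cache(maxsize=None)
def survival(energy, trials_left):
    """Probability of ending above the starvation threshold under the
    optimal choice at every remaining trial."""
    if trials_left == 0:
        return 1.0 if energy > THRESHOLD else 0.0
    safe = survival(energy + SAFE_GAIN, trials_left - 1)
    risky = sum(p * survival(energy + g, trials_left - 1) for p, g in RISKY)
    return max(safe, risky)

print(survival(START, TRIALS))  # 1.0: the safe option alone guarantees survival here
```

    The memoized recursion is exactly the value calculation over multiple probabilistic future states that the abstract contrasts with cheap heuristics; a heuristic would replace the recursive call with a one-step approximation.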

  20. Exploiting Quantum Resonance to Solve Combinatorial Problems

    NASA Technical Reports Server (NTRS)

    Zak, Michail; Fijany, Amir

    2006-01-01

    Quantum resonance would be exploited in a proposed quantum-computing approach to the solution of combinatorial optimization problems. In quantum computing in general, one takes advantage of the fact that an algorithm cannot be decoupled from the physical effects available to implement it. Prior approaches to quantum computing have involved exploitation of only a subset of known quantum physical effects, notably including parallelism and entanglement, but not including resonance. In the proposed approach, one would utilize the combinatorial properties of tensor-product decomposability of unitary evolution of many-particle quantum systems for physically simulating solutions to NP-complete problems (a class of problems that are intractable with respect to classical methods of computation). In this approach, reinforcement and selection of a desired solution would be executed by means of quantum resonance. Classes of NP-complete problems that are important in practice and could be solved by the proposed approach include planning, scheduling, search, and optimal design.

  1. Selection of a computer code for Hanford low-level waste engineered-system performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGrail, B.P.; Mahoney, L.A.

    Planned performance assessments for the proposed disposal of low-level waste (LLW) glass produced from remediation of wastes stored in underground tanks at Hanford, Washington, will require calculations of radionuclide release rates from the subsurface disposal facility. These calculations will be done with the aid of computer codes. Currently available computer codes were ranked in terms of the feature sets implemented in the code that match a set of physical, chemical, numerical, and functional capabilities needed to assess release rates from the engineered system. The needed capabilities were identified from an analysis of the important physical and chemical processes expected to affect LLW glass corrosion and the mobility of radionuclides. The highest-ranked computer code was found to be the ARES-CT code, developed at PNL for the US Department of Energy for evaluation of land disposal sites.

  2. Do's and Don'ts of Computer Models for Planning

    ERIC Educational Resources Information Center

    Hammond, John S., III

    1974-01-01

    Concentrates on the managerial issues involved in computer planning models. Describes what computer planning models are and the process by which managers can increase the likelihood of computer planning models being successful in their organizations. (Author/DN)

  3. DSMC (Defense Systems Management College) CALS (Computer-Aided Acquisition and Logistics Support) Briefing: Program Manager Course.

    DTIC Science & Technology

    1988-11-01

    library. o Air Force Tech Order Management System - Final Report, library. o DLA CALS 1988 Implementation Plan, library. Where to go for Additional... as well as wider application. The Air Force AFTOMS Automation Plan, a copy of which is in the library, has excellent discussions of the expected...

  4. Exploratory Research and Development Fund, FY 1990. Report on Lawrence Berkeley Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1992-05-01

    The Lawrence Berkeley Laboratory Exploratory R&D Fund FY 1990 report is compiled from annual reports submitted by principal investigators following the close of the fiscal year. This report describes the projects supported and summarizes their accomplishments. It constitutes part of an Exploratory R&D Fund (ERF) planning and documentation process that includes an annual planning cycle, project selection, implementation, and review. The research areas covered in this report are: accelerator and fusion research; applied science; cell and molecular biology; chemical biodynamics; chemical sciences; earth sciences; engineering; information and computing sciences; materials sciences; nuclear science; physics; and research medicine and radiation biophysics.

  5. Dose computation for therapeutic electron beams

    NASA Astrophysics Data System (ADS)

    Glegg, Martin Mackenzie

    The accuracy of electron dose calculations performed by two commercially available treatment planning computers, Varian Cadplan and Helax TMS, has been assessed. Measured values of absorbed dose delivered by a Varian 2100C linear accelerator, under a wide variety of irradiation conditions, were compared with doses calculated by the treatment planning computers. Much of the motivation for this work was provided by a requirement to verify the accuracy of calculated electron dose distributions in situations encountered clinically at Glasgow's Beatson Oncology Centre. Calculated dose distributions are required in a significant minority of electron treatments, usually in cases involving treatment to the head and neck. Here, therapeutic electron beams are subject to factors which may cause non-uniformity in the distribution of dose, and which may complicate the calculation of dose. The beam shape is often irregular, the beam may enter the patient at an oblique angle or at an extended source to skin distance (SSD), tissue inhomogeneities can alter the dose distribution, and tissue equivalent material (such as wax) may be added to reduce dose to critical organs. Technological advances have allowed the current generation of treatment planning computers to implement dose calculation algorithms with the ability to model electron beams in these complex situations. These calculations have, however, yet to be verified by measurement. This work has assessed the accuracy of calculations in a number of specific instances. Chapter two contains a comparison of measured and calculated planar electron isodose distributions. Three situations were considered: oblique incidence, incidence on an irregular surface (such as that which would be arise from the use of wax to reduce dose to spinal cord), and incidence on a phantom containing a small air cavity. Calculations were compared with measurements made by thermoluminescent dosimetry (TLD) in a WTe electron solid water phantom. 
Chapter three assesses the planning computers' ability to model electron beam penumbra at extended SSD. Calculations were compared with diode measurements in a water phantom. Further measurements assessed doses in the junction region produced by abutting an extended SSD electron field with opposed photon fields. Chapter four describes an investigation of the size and shape of the region enclosed by the 90% isodose line when produced by limiting the electron beam with square and elliptical apertures. The 90% isodose line was chosen because clinical treatments are often prescribed such that a given volume receives at least 90% dose. Calculated and measured dose distributions were compared in a plane normal to the beam central axis. Measurements were made by film dosimetry. While chapters two to four examine relative doses, chapter five assesses the accuracy of absolute dose (or output) calculations performed by the planning computers. Output variation with SSD and field size was examined. Two further situations already assessed for the distribution of relative dose were also considered: an obliquely incident field, and a field incident on an irregular surface. The accuracy of calculations was assessed against criteria stipulated by the International Commission on Radiation Units and Measurement (ICRU). The Varian Cadplan and Helax TMS treatment planning systems produce acceptable accuracy in the calculation of relative dose from therapeutic electron beams in most commonly encountered situations. When interpreting clinical dose distributions, however, knowledge of the limitations of the calculation algorithm employed by each system is required in order to identify the minority of situations where results are not accurate. The calculation of absolute dose is too inaccurate to implement in a clinical environment. (Abstract shortened by ProQuest.).

  6. Incipient fault detection study for advanced spacecraft systems

    NASA Technical Reports Server (NTRS)

    Milner, G. Martin; Black, Michael C.; Hovenga, J. Mike; Mcclure, Paul F.

    1986-01-01

    A feasibility study investigating the application of vibration monitoring to the rotating machinery of planned NASA advanced spacecraft components is described. Factors investigated include: (1) special problems associated with small, high-RPM machines; (2) application across multiple component types; (3) microgravity; (4) multiple fault types; (5) comparison of eight analysis techniques, including signature analysis, high-frequency demodulation, cepstrum, clustering, amplitude analysis, and pattern recognition; and (6) small-sample statistical analysis, used to compare performance by computing the probability of detection and false alarm for an ensemble of repeated baseline and faulted tests. Both detection and classification performance are quantified. Vibration monitoring is shown to be an effective means of detecting the most important problem types for small, high-RPM fans and pumps typical of those planned for the advanced spacecraft. A preliminary monitoring system design and implementation plan is presented.
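    The small-sample statistical analysis described above amounts to estimating detection and false-alarm rates from repeated baseline and faulted runs. A minimal sketch, with invented feature values and an invented threshold:

```python
# Sketch: estimating probability of detection (Pd) and false alarm (Pfa)
# from repeated baseline and faulted vibration tests. The feature values
# and the decision threshold are hypothetical, not data from the study.

def detection_rates(baseline, faulted, threshold):
    """Fraction of faulted runs flagged (Pd) and of baseline runs flagged (Pfa)."""
    pd = sum(x > threshold for x in faulted) / len(faulted)
    pfa = sum(x > threshold for x in baseline) / len(baseline)
    return pd, pfa

baseline = [0.8, 0.9, 1.0, 1.1, 0.95]   # healthy-machine feature values
faulted = [1.6, 1.9, 1.4, 2.1, 1.8]     # same feature with a seeded fault
pd, pfa = detection_rates(baseline, faulted, threshold=1.3)
print(pd, pfa)  # 1.0 0.0
```

    With only a handful of runs per condition, confidence bounds on such estimates are wide, which is why the study emphasizes small-sample statistics.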

  7. 40 CFR 52.1392 - Federal Implementation Plan for the Billings/Laurel Area.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 4 2013-07-01 2013-07-01 false Federal Implementation Plan for the Billings/Laurel Area. 52.1392 Section 52.1392 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS (CONTINUED) Montana § 52.1392 Federal Implementation Plan for...

  8. 40 CFR 52.1392 - Federal Implementation Plan for the Billings/Laurel Area.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 4 2012-07-01 2012-07-01 false Federal Implementation Plan for the Billings/Laurel Area. 52.1392 Section 52.1392 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS (CONTINUED) Montana § 52.1392 Federal Implementation Plan for...

  9. 40 CFR 52.1392 - Federal Implementation Plan for the Billings/Laurel Area.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 4 2014-07-01 2014-07-01 false Federal Implementation Plan for the Billings/Laurel Area. 52.1392 Section 52.1392 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS (CONTINUED) Montana § 52.1392 Federal Implementation Plan for...

  10. 40 CFR 52.1392 - Federal Implementation Plan for the Billings/Laurel Area.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 4 2011-07-01 2011-07-01 false Federal Implementation Plan for the Billings/Laurel Area. 52.1392 Section 52.1392 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS (CONTINUED) Montana § 52.1392 Federal Implementation Plan for...

  11. 77 FR 16937 - Approval and Promulgation of Air Quality Implementation Plans; West Virginia; Regional Haze State...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-23

    ... Promulgation of Air Quality Implementation Plans; West Virginia; Regional Haze State Implementation Plan AGENCY... limited disapproval of West Virginia's Regional Haze State Implementation Plan (SIP) revision. EPA is... mandatory Class I areas through a regional haze program. EPA is also approving this revision as meeting the...

  12. 77 FR 65151 - Finding of Substantial Inadequacy of Implementation Plan; Call for California State...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-25

    ... the Federal Register on September 19, 2012. In that action, in response to a remand by the Ninth... Substantial Inadequacy of Implementation Plan; Call for California State Implementation Plan Revision; South... State Implementation Plan (SIP) for the Los Angeles-South Coast Air Basin (South Coast) is substantially...

  13. 40 CFR 93.118 - Criteria and procedures: Motor vehicle emissions budget.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... PLANS Conformity to State or Federal Implementation Plans of Transportation Plans, Programs, and..., consultation among federal, State, and local agencies occurred; full implementation plan documentation was... response to comments that are required to be submitted with any implementation plan. EPA will document its...

  14. 40 CFR 93.118 - Criteria and procedures: Motor vehicle emissions budget.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PLANS Conformity to State or Federal Implementation Plans of Transportation Plans, Programs, and..., consultation among federal, State, and local agencies occurred; full implementation plan documentation was... response to comments that are required to be submitted with any implementation plan. EPA will document its...

  15. PIMS: Memristor-Based Processing-in-Memory-and-Storage.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Jeanine

    Continued progress in computing has augmented the quest for higher performance with a new quest for higher energy efficiency. This has led to the re-emergence of Processing-In-Memory (PIM) architectures that offer higher density and performance with some boost in energy efficiency. Past PIM work either integrated a standard CPU with a conventional DRAM to improve the CPU-memory link, or used a bit-level processor with Single Instruction Multiple Data (SIMD) control, but neither matched the energy consumption of the memory to the computation. We originally proposed to develop a new architecture derived from PIM that more effectively addressed energy efficiency for high performance scientific, data analytics, and neuromorphic applications. We also originally planned to implement a von Neumann architecture with arithmetic/logic units (ALUs) that matched the power consumption of an advanced storage array to maximize energy efficiency. Implementing this architecture in storage was our original idea, since by augmenting storage (instead of memory), the system could address both in-memory computation and applications that accessed larger data sets directly from storage, hence Processing-in-Memory-and-Storage (PIMS). However, as our research matured, we discovered several things that changed our original direction, the most important being that a PIM that implements a standard von Neumann-type architecture results in significant energy efficiency improvement, but only about a O(10) performance improvement. In addition to this, the emergence of new memory technologies moved us to proposing a non-von Neumann architecture, called Superstrider, implemented not in storage, but in a new DRAM technology called High Bandwidth Memory (HBM). HBM is a stacked DRAM technology that includes a logic layer where an architecture such as Superstrider could potentially be implemented.

  16. Simulation techniques in hyperthermia treatment planning

    PubMed Central

    Paulides, MM; Stauffer, PR; Neufeld, E; Maccarini, P; Kyriakou, A; Canters, RAM; Diederich, C; Bakker, JF; Van Rhoon, GC

    2013-01-01

    Clinical trials have shown that hyperthermia (HT), i.e. an increase of tissue temperature to 39-44°C, significantly enhances radiotherapy and chemotherapy effectiveness (1). Driven by the developments in computational techniques and computing power, personalized hyperthermia treatment planning (HTP) has matured and become a powerful tool for optimizing treatment quality. Electromagnetic, ultrasound, and thermal simulations using realistic clinical setups are now being performed to achieve patient-specific treatment optimization. In addition, extensive studies aimed at properly implementing novel HT tools and techniques, and at assessing the quality of HT, are becoming more common. In this paper, we review the simulation tools and techniques developed for clinical hyperthermia, and evaluate their current status on the path from “model” to “clinic”. In addition, we illustrate the major techniques employed for validation and optimization. HTP has become an essential tool for improvement, control, and assessment of HT treatment quality. As such, it plays a pivotal role in the quest to establish HT as an efficacious addition to multi-modality treatment of cancer. PMID:23672453

  17. The introduction of computer assisted learning in a school of midwifery using the Wessex Care Plan Program.

    PubMed

    Leong, W C

    1989-04-01

    This case study was the result of attending the Computer Assisted Learning (CAL) Course sponsored by the Wessex Regional CAL Project, the Region's initiative to prepare Nurse and Midwife Teachers to develop CAL in the curriculum. The small-scale qualitative classroom study was conducted in the School of Midwifery. The aim of the study was to evaluate the use of the content-free Wessex Care Plan Program (WCPP) in the Midwifery curriculum. For the evaluation of the study, a triangulation of data was obtained from the following sources: 1) classroom observation; 2) questionnaires and interviews of eight Student Midwives; and 3) colleagues' responses to the introduction of CAL, together with personal experience. The findings of this study showed that the content-free WCPP was easy to prepare and implement. The Student Midwives found the program easy to follow and a useful means of learning. At the same time it was enjoyable and fun; a dimension of learning that we could do with more often!

  18. 40 CFR 49.10046 - Contents of implementation plan.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Plan for the Cow Creek Band of Umpqua Indians of Oregon § 49.10046 Contents of implementation plan. The implementation plan for the Reservation of the Cow Creek Band of Umpqua Indians consists of the following rules...

  19. Implementing a Capital Plan.

    ERIC Educational Resources Information Center

    Daigneau, William A.

    2003-01-01

    Addresses four questions regarding implementation of a long-term capital plan to manage a college's facilities portfolio: When should the projects be implemented? How should the capital improvements be implemented? What will it actually cost in terms of project costs as well as operating costs? Who will implement the plan? (EV)

  20. SU-F-J-94: Development of a Plug-in Based Image Analysis Tool for Integration Into Treatment Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owen, D; Anderson, C; Mayo, C

    Purpose: To extend the functionality of a commercial treatment planning system (TPS) to support (i) direct use of quantitative image-based metrics within treatment plan optimization and (ii) evaluation of dose-functional volume relationships to assist in functional image adaptive radiotherapy. Methods: A script was written that interfaces with a commercial TPS via an Application Programming Interface (API). The script executes a program that performs dose-functional volume analyses. Written in C#, the script reads the dose grid and correlates it with image data on a voxel-by-voxel basis through API extensions that can access registration transforms. A user interface was designed through WinForms to input parameters and display results. To test the performance of this program, image- and dose-based metrics computed from perfusion SPECT images aligned to the treatment planning CT were generated, validated, and compared. Results: The integration of image analysis information was successfully implemented as a plug-in to a commercial TPS. Perfusion SPECT images were used to validate the calculation and display of image-based metrics as well as dose-intensity metrics and histograms for defined structures on the treatment planning CT. Various biological dose correction models, custom image-based metrics, dose-intensity computations, and dose-intensity histograms were applied to analyze the image-dose profile. Conclusion: It is possible to add image analysis features to commercial TPSs through custom scripting applications. A tool was developed to enable the evaluation of image-intensity-based metrics in the context of functional targeting and avoidance. In addition to providing dose-intensity metrics and histograms that can be easily extracted from a plan database and correlated with outcomes, the system can also be extended to a plug-in optimization system, which can directly use the computed metrics for optimization of post-treatment tumor or normal tissue response models. Supported by NIH - P01 - CA059827.
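    A voxel-wise dose-intensity metric of the kind the plug-in computes can be sketched with plain array operations. The arrays, the mask threshold, and the "functional mean dose" surrogate below are illustrative assumptions, not the tool's actual C# implementation:

```python
import numpy as np

# Sketch: correlate a dose grid with a co-registered functional image
# (e.g. perfusion SPECT) voxel by voxel. All arrays are synthetic; a real
# plug-in would read the TPS dose grid and apply registration transforms.

rng = np.random.default_rng(0)
dose = rng.uniform(0.0, 60.0, size=(4, 4, 4))        # Gy per voxel
intensity = rng.uniform(0.0, 100.0, size=(4, 4, 4))  # perfusion counts

# Hypothetical "structure": voxels receiving more than 20 Gy.
mask = dose > 20.0
mean_intensity_high_dose = intensity[mask].mean()

# Intensity-weighted mean dose over the grid, a simple surrogate for a
# "functional mean dose" metric:
f_mean_dose = (dose * intensity).sum() / intensity.sum()
print(float(f_mean_dose))
```

    A dose-intensity histogram would follow the same pattern, binning `dose` values weighted by `intensity` within the structure mask.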

  1. 40 CFR 52.1973 - Approval of plans.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS (CONTINUED) Oregon § 52.1973 Approval of plans. (a) Carbon monoxide. (1) EPA approves as a revision to the Oregon State Implementation Plan, the Second... December 27, 2004. (2) EPA approves as a revision to the Oregon State Implementation Plan, the Salem carbon...

  2. 40 CFR 52.1973 - Approval of plans.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS (CONTINUED) Oregon § 52.1973 Approval of plans. (a) Carbon monoxide. (1) EPA approves as a revision to the Oregon State Implementation Plan, the Second... December 27, 2004. (2) EPA approves as a revision to the Oregon State Implementation Plan, the Salem carbon...

  3. 40 CFR 52.1973 - Approval of plans.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS (CONTINUED) Oregon § 52.1973 Approval of plans. (a) Carbon monoxide. (1) EPA approves as a revision to the Oregon State Implementation Plan, the Second... December 27, 2004. (2) EPA approves as a revision to the Oregon State Implementation Plan, the Salem carbon...

  4. 40 CFR 52.1973 - Approval of plans.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS (CONTINUED) Oregon § 52.1973 Approval of plans. (a) Carbon monoxide. (1) EPA approves as a revision to the Oregon State Implementation Plan, the Second... December 27, 2004. (2) EPA approves as a revision to the Oregon State Implementation Plan, the Salem carbon...

  5. 49 CFR 633.27 - Implementation of a project management plan.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 7 2010-10-01 2010-10-01 false Implementation of a project management plan. 633... TRANSIT ADMINISTRATION, DEPARTMENT OF TRANSPORTATION PROJECT MANAGEMENT OVERSIGHT Project Management Plans § 633.27 Implementation of a project management plan. (a) Upon approval of a project management plan by...

  6. 49 CFR 633.27 - Implementation of a project management plan.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 7 2011-10-01 2011-10-01 false Implementation of a project management plan. 633... TRANSIT ADMINISTRATION, DEPARTMENT OF TRANSPORTATION PROJECT MANAGEMENT OVERSIGHT Project Management Plans § 633.27 Implementation of a project management plan. (a) Upon approval of a project management plan by...

  7. 49 CFR 633.27 - Implementation of a project management plan.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 7 2012-10-01 2012-10-01 false Implementation of a project management plan. 633... TRANSIT ADMINISTRATION, DEPARTMENT OF TRANSPORTATION PROJECT MANAGEMENT OVERSIGHT Project Management Plans § 633.27 Implementation of a project management plan. (a) Upon approval of a project management plan by...

  8. 49 CFR 633.27 - Implementation of a project management plan.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 7 2014-10-01 2014-10-01 false Implementation of a project management plan. 633... TRANSIT ADMINISTRATION, DEPARTMENT OF TRANSPORTATION PROJECT MANAGEMENT OVERSIGHT Project Management Plans § 633.27 Implementation of a project management plan. (a) Upon approval of a project management plan by...

  9. 49 CFR 633.27 - Implementation of a project management plan.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 7 2013-10-01 2013-10-01 false Implementation of a project management plan. 633... TRANSIT ADMINISTRATION, DEPARTMENT OF TRANSPORTATION PROJECT MANAGEMENT OVERSIGHT Project Management Plans § 633.27 Implementation of a project management plan. (a) Upon approval of a project management plan by...

  10. 76 FR 9706 - Finding of Substantial Inadequacy of Implementation Plan; Call for Iowa State Implementation Plan...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-22

    ... advance and available for prompt implementation once triggered. Section 110(k)(5) of the CAA provides that... Environmental protection, Air pollution control, Iowa, Particulate matter, State Implementation Plan. Dated...

  11. Efficient Parallel Kernel Solvers for Computational Fluid Dynamics Applications

    NASA Technical Reports Server (NTRS)

    Sun, Xian-He

    1997-01-01

    Distributed-memory parallel computers dominate today's parallel computing arena. These machines, such as the Intel Paragon, IBM SP2, and Cray Origin2000, have successfully delivered high performance computing power for solving some of the so-called "grand-challenge" problems. Despite initial success, parallel machines have not been widely accepted in production engineering environments due to the complexity of parallel programming. On a parallel computing system, a task has to be partitioned and distributed appropriately among processors to reduce communication cost and to attain load balance. More importantly, even with careful partitioning and mapping, the performance of an algorithm may still be unsatisfactory, since conventional sequential algorithms may be serial in nature and may not be implemented efficiently on parallel machines. In many cases, new algorithms have to be introduced to increase parallel performance. In order to achieve optimal performance, in addition to partitioning and mapping, a careful performance study should be conducted for a given application to find a good algorithm-machine combination. This process, however, is usually painful and elusive. The goal of this project is to design and develop efficient parallel algorithms for highly accurate Computational Fluid Dynamics (CFD) simulations and other engineering applications. The work plan was to: 1) develop highly accurate parallel numerical algorithms; 2) conduct preliminary testing to verify the effectiveness and potential of these algorithms; and 3) incorporate newly developed algorithms into actual simulation packages. This work plan has been achieved. Two highly accurate, efficient Poisson solvers have been developed and tested based on two different approaches: (1) adopting a mathematical geometry which has a better capacity to describe the fluid, and (2) using a compact scheme to gain high-order accuracy in the numerical discretization.
The previously developed Parallel Diagonal Dominant (PDD) algorithm and Reduced Parallel Diagonal Dominant (RPDD) algorithm have been carefully studied on different parallel platforms for different applications, and a NASA simulation code developed by Man M. Rai and his colleagues has been parallelized and implemented based on data dependency analysis. These achievements are addressed in detail in the paper.
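    The second approach, using a compact scheme for high-order accuracy, can be illustrated in one dimension with a Numerov-type compact discretization of the Poisson equation. This is a generic textbook sketch, not the solvers developed in the project:

```python
import numpy as np

def numerov_poisson(f, n):
    """Fourth-order compact (Numerov) solve of u'' = f on [0,1], u(0)=u(1)=0.
    Interior stencil: u_{i-1} - 2u_i + u_{i+1} = h^2/12 (f_{i-1} + 10 f_i + f_{i+1})."""
    h = 1.0 / n
    x = np.linspace(0.0, 1.0, n + 1)
    fv = f(x)
    m = n - 1                                   # interior unknowns
    A = (np.diag(-2.0 * np.ones(m)) +
         np.diag(np.ones(m - 1), 1) +
         np.diag(np.ones(m - 1), -1))
    rhs = (h * h / 12.0) * (fv[:-2] + 10.0 * fv[1:-1] + fv[2:])
    u = np.zeros(n + 1)
    u[1:-1] = np.linalg.solve(A, rhs)
    return x, u

# Manufactured solution u = sin(pi x), so f = u'' = -pi^2 sin(pi x).
f = lambda x: -np.pi ** 2 * np.sin(np.pi * x)
exact = lambda x: np.sin(np.pi * x)

errs = []
for n in (16, 32):
    x, u = numerov_poisson(f, n)
    errs.append(float(np.max(np.abs(u - exact(x)))))
print(errs)  # error drops by roughly 16x when h is halved (fourth order)
```

    The same idea, extended to higher dimensions and solved with distributed tridiagonal algorithms such as PDD, underlies compact-scheme parallel Poisson solvers.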

  12. Generation and reduction of the data for the Ulysses gravitational wave experiment

    NASA Technical Reports Server (NTRS)

    Agresti, R.; Bonifazi, P.; Iess, L.; Trager, G. B.

    1987-01-01

    A procedure for the generation and reduction of the radiometric data known as REGRES is described. The software is implemented on a HP-1000F computer and was tested on REGRES data relative to the Voyager I spacecraft. The REGRES data are a current output of NASA's Orbit Determination Program. The software package was developed in view of the data analysis of the gravitational wave experiment planned for the European spacecraft Ulysses.

  13. Verification of component mode techniques for flexible multibody systems

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1990-01-01

    Investigations were conducted into the modeling aspects of flexible multibodies undergoing large angular displacements. Models were to be generated and analyzed through application of computer simulation packages employing 'component mode synthesis' techniques. The Multibody Modeling, Verification and Control (MMVC) Laboratory plan was implemented, which included running experimental tests on flexible multibody test articles. From these tests, data were to be collected for later correlation and verification of the theoretical results predicted by the modeling and simulation process.

  14. GLOBECOM '92 - IEEE Global Telecommunications Conference, Orlando, FL, Dec. 6-9, 1992, Conference Record. Vols. 1-3

    NASA Astrophysics Data System (ADS)

    Papers are presented on such topics as the wireless data network in PCS, advances in digital mobile networks, ATM switching experiments, broadband applications, network planning, and advances in SONET/SDH implementations. Consideration is also given to gigabit computer networks, techniques for modeling large high-speed networks, coding and modulation, the next-generation lightwave system, signaling systems for broadband ISDN, satellite technologies, and advances in standardization of low-rate signal processing.

  15. Use of a wireless local area network in an orthodontic clinic.

    PubMed

    Mupparapu, Muralidhar; Binder, Robert E; Cummins, John M

    2005-06-01

    Radiographic images and other patient records, including medical histories, demographics, and health insurance information, can now be stored digitally and accessed via patient management programs. However, digital image acquisition and diagnosis and treatment planning are independent tasks, and each is time consuming, especially when performed at different computer workstations. Networking or linking the computers in an office enhances access to imaging and treatment planning tools. Access can be further enhanced if the entire network is wireless. Thanks to wireless technology, stand-alone, desk-bound personal computers have been replaced with mobile, hand-held devices that can communicate with each other and the rest of the world via the Internet. As with any emerging technology, some issues should be kept in mind when adapting to the wireless environment. Foremost is network security. Second is the choice of mobile hardware devices that are used by the orthodontist, office staff, and patients. This article details the standards and choices in wireless technology that can be implemented in an orthodontic clinic and suggests how to select suitable mobile hardware for accessing or adding data to a preexisting network. The network security protocols discussed comply with HIPAA regulations and boost the efficiency of a modern orthodontic clinic.

  16. MIDAS - A microcomputer-based image display and analysis system with full Landsat frame processing capabilities

    NASA Technical Reports Server (NTRS)

    Hofman, L. B.; Erickson, W. K.; Donovan, W. E.

    1984-01-01

    The Microcomputer-based Image Display and Analysis System (MIDAS), developed at NASA/Ames for the analysis of Landsat MSS images, is described. The MIDAS computer power and memory, graphics, resource-sharing, expansion and upgrade, environment and maintenance, and software/user-interface requirements are outlined; the implementation hardware (including a 32-bit microprocessor, 512K error-correcting RAM, a 70- or 140-Mbyte formatted disk drive, a 512 x 512 x 24 color frame buffer, and a local-area-network transceiver) and applications software (ELAS, CIE, and P-EDITOR) are characterized; and implementation problems, performance data, and costs are examined. Planned improvements in MIDAS hardware and design goals and areas of exploration for MIDAS software are discussed.

  17. Developing and Implementing a Citywide Asthma Action Plan: A Community Collaborative Partnership.

    PubMed

    Staudt, Amanda Marie; Alamgir, Hasanat; Long, Debra Lynn; Inscore, Stephen Curtis; Wood, Pamela Runge

    2015-12-01

    Asthma affects 1 in 10 children in the United States, with higher prevalence among children living in poverty. Organizations in San Antonio, Texas, partnered to design and implement a uniform, citywide asthma action plan to improve asthma management capacity in schools. The asthma action plan template was modified from that of the Global Initiative for Asthma. School personnel were trained in symptom recognition, actions to take, and use of equipment before the asthma action plan implementation. The annual Asthma Action Plan Summit was organized as a forum for school nurses, healthcare providers, and members of the community to exchange ideas and strategies on implementation, as well as to revise the plan. The asthma action plan was implemented in all 16 local school districts. Feedback received from school nurses suggests that the citywide asthma action plan resulted in improved asthma management and student health at schools. The evidence in this study suggests that community organizations can successfully collaborate to implement a citywide health initiative similar to the asthma action plan.

  18. The Neural Basis of Aversive Pavlovian Guidance during Planning

    PubMed Central

    Faulkner, Paul

    2017-01-01

    Important real-world decisions are often arduous as they frequently involve sequences of choices, with initial selections affecting future options. Evaluating every possible combination of choices is computationally intractable, particularly for longer multistep decisions. Therefore, humans frequently use heuristics to reduce the complexity of decisions. We recently used a goal-directed planning task to demonstrate the profound behavioral influence and ubiquity of one such shortcut, namely aversive pruning, a reflexive Pavlovian process that involves neglecting parts of the decision space residing beyond salient negative outcomes. However, how the brain implements this important decision heuristic and what underlies individual differences have hitherto remained unanswered. Therefore, we administered an adapted version of the same planning task to healthy male and female volunteers undergoing functional magnetic resonance imaging (fMRI) to determine the neural basis of aversive pruning. Through both computational and standard categorical fMRI analyses, we show that when planning was influenced by aversive pruning, the subgenual cingulate cortex was robustly recruited. This neural signature was distinct from those associated with general planning and valuation, two fundamental cognitive components elicited by our task but which are complementary to aversive pruning. Furthermore, we found that individual variation in levels of aversive pruning was associated with the responses of insula and dorsolateral prefrontal cortices to the receipt of large monetary losses, and also with subclinical levels of anxiety. In summary, our data reveal the neural signatures of an important reflexive Pavlovian process that shapes goal-directed evaluations and thereby determines the outcome of high-level sequential cognitive processes. SIGNIFICANCE STATEMENT Multistep decisions are complex because initial choices constrain future options. 
Evaluating every path for long decision sequences is often impractical; thus, cognitive shortcuts are often essential. One pervasive and powerful heuristic is aversive pruning, in which potential decision-making avenues are curtailed at immediate negative outcomes. We used neuroimaging to examine how humans implement such pruning. We found it to be associated with activity in the subgenual cingulate cortex, with neural signatures that were distinguishable from those covarying with planning and valuation. Individual variations in aversive pruning levels related to subclinical anxiety levels and insular cortex activation. These findings reveal the neural mechanisms by which basic negative Pavlovian influences guide decision-making during planning, with implications for disrupted decision-making in psychiatric disorders. PMID:28924006
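    The aversive-pruning heuristic studied here can be illustrated with a toy decision tree: full evaluation finds the best path even when it passes through a large immediate loss, while pruning discards any branch whose immediate outcome falls at or below a large-loss threshold. All rewards and the threshold are invented for illustration:

```python
# Toy multistep decision tree: each node is (immediate_reward, children).
# Aversive pruning cuts any child branch whose immediate reward is at or
# below `prune_below`, even if that branch leads to the best total.

tree = (0, [
    (-70, [(140, [])]),   # large loss, then big recovery: best overall (+70)
    (20, [(30, [])]),     # modest path (+50)
])

def plan(node, prune_below=None):
    """Best achievable total reward from `node`, optionally with pruning."""
    reward, children = node
    futures = [plan(c, prune_below) for c in children
               if prune_below is None or c[0] > prune_below]
    return reward + (max(futures) if futures else 0)

print(plan(tree))                   # 70: full evaluation takes the -70 branch
print(plan(tree, prune_below=-60))  # 50: pruning removes the -70 branch
```

    The gap between the two totals is the behavioral cost of the heuristic, which is what the task used in this study quantifies per participant.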

  19. The Neural Basis of Aversive Pavlovian Guidance during Planning.

    PubMed

    Lally, Níall; Huys, Quentin J M; Eshel, Neir; Faulkner, Paul; Dayan, Peter; Roiser, Jonathan P

    2017-10-18

    Important real-world decisions are often arduous as they frequently involve sequences of choices, with initial selections affecting future options. Evaluating every possible combination of choices is computationally intractable, particularly for longer multistep decisions. Therefore, humans frequently use heuristics to reduce the complexity of decisions. We recently used a goal-directed planning task to demonstrate the profound behavioral influence and ubiquity of one such shortcut, namely aversive pruning, a reflexive Pavlovian process that involves neglecting parts of the decision space residing beyond salient negative outcomes. However, how the brain implements this important decision heuristic and what underlies individual differences have hitherto remained unanswered. Therefore, we administered an adapted version of the same planning task to healthy male and female volunteers undergoing functional magnetic resonance imaging (fMRI) to determine the neural basis of aversive pruning. Through both computational and standard categorical fMRI analyses, we show that when planning was influenced by aversive pruning, the subgenual cingulate cortex was robustly recruited. This neural signature was distinct from those associated with general planning and valuation, two fundamental cognitive components elicited by our task but which are complementary to aversive pruning. Furthermore, we found that individual variation in levels of aversive pruning was associated with the responses of insula and dorsolateral prefrontal cortices to the receipt of large monetary losses, and also with subclinical levels of anxiety. In summary, our data reveal the neural signatures of an important reflexive Pavlovian process that shapes goal-directed evaluations and thereby determines the outcome of high-level sequential cognitive processes. SIGNIFICANCE STATEMENT Multistep decisions are complex because initial choices constrain future options. 
Evaluating every path for long decision sequences is often impractical; thus, cognitive shortcuts are often essential. One pervasive and powerful heuristic is aversive pruning, in which potential decision-making avenues are curtailed at immediate negative outcomes. We used neuroimaging to examine how humans implement such pruning. We found it to be associated with activity in the subgenual cingulate cortex, with neural signatures that were distinguishable from those covarying with planning and valuation. Individual variations in aversive pruning levels related to subclinical anxiety levels and insular cortex activation. These findings reveal the neural mechanisms by which basic negative Pavlovian influences guide decision-making during planning, with implications for disrupted decision-making in psychiatric disorders. Copyright © 2017 the authors 0270-6474/17/3710216-15$15.00/0.

  20. Participatory System Dynamics Modeling: Increasing Stakeholder Engagement and Precision to Improve Implementation Planning in Systems.

    PubMed

    Zimmerman, Lindsey; Lounsbury, David W; Rosen, Craig S; Kimerling, Rachel; Trafton, Jodie A; Lindley, Steven E

    2016-11-01

    Implementation planning typically incorporates stakeholder input. Quality improvement efforts provide data-based feedback regarding progress. Participatory system dynamics modeling (PSD) triangulates stakeholder expertise, data and simulation of implementation plans prior to attempting change. Frontline staff in one VA outpatient mental health system used PSD to examine policy and procedural "mechanisms" they believe underlie local capacity to implement evidence-based psychotherapies (EBPs) for PTSD and depression. We piloted the PSD process, simulating implementation plans to improve EBP reach. Findings indicate PSD is a feasible, useful strategy for building stakeholder consensus, and may save time and effort as compared to trial-and-error EBP implementation planning.
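    The stock-and-flow style of model that PSD participants build can be sketched as a simple Euler-integrated simulation. The stocks, flows, and rates below are invented placeholders, not the VA team's actual model:

```python
# Minimal stock-and-flow sketch of the kind of simulation PSD supports:
# EBP "reach" grows as clinicians are trained and deliver sessions.
# All rates are hypothetical placeholders.

def simulate(months=24, dt=1.0):
    trained = 2.0                   # stock: clinicians delivering EBPs
    reach = 0.0                     # stock: cumulative patients starting an EBP
    training_rate = 0.5             # clinicians trained per month (assumed)
    patients_per_clinician = 4.0    # EBP starts per clinician per month (assumed)
    history = []
    for _ in range(int(months / dt)):
        reach += trained * patients_per_clinician * dt   # flow into reach
        trained += training_rate * dt                    # flow into trained
        history.append(reach)
    return history

h = simulate()
print(h[-1])  # 744.0 cumulative EBP starts after 24 months
```

    Stakeholders can then vary a policy lever such as `training_rate` and compare simulated reach trajectories before committing to an implementation plan.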

  1. A systems approach to computer-based training

    NASA Technical Reports Server (NTRS)

    Drape, Gaylen W.

    1994-01-01

    This paper describes the hardware and software systems approach used in the Automated Recertification Training System (ARTS), a Phase 2 Small Business Innovation Research (SBIR) project for NASA Kennedy Space Center (KSC). The goal of this project is to optimize recertification training of technicians who process the Space Shuttle before launch by providing computer-based training courseware. The objectives of ARTS are to implement more effective CBT applications identified through a needs assessment process and to provide an enhanced courseware production system. The system's capabilities are demonstrated by using five different pilot applications to convert existing classroom courses into interactive courseware. When the system is fully implemented at NASA/KSC, trainee job performance will improve and the cost of courseware development will be lower. Commercialization of the technology developed as part of this SBIR project is planned for Phase 3. Anticipated spin-off products include custom courseware for technical skills training and courseware production software for use by corporate training organizations of aerospace and other industrial companies.

  2. Study to define an approach for developing a computer-based system capable of automatic, unattended assembly/disassembly of spacecraft, phase 1

    NASA Technical Reports Server (NTRS)

    Nevins, J. L.; Defazio, T. L.; Seltzer, D. S.; Whitney, D. E.

    1981-01-01

    The initial set of requirements was developed for the additional studies necessary to implement a space-borne, computer-based work system capable of achieving assembly, disassembly, repair, or maintenance in space. The specific functions required of a work system to perform repair and maintenance were discussed. Tasks and relevant technologies were identified and delineated. The interaction of spacecraft design and technology options, including a consideration of the strategic issues of repair versus retrieval-replacement or destruction by removal, was considered, along with the design tradeoffs for accomplishing each of the options. A concept system design and its accompanying experiment or test plan were discussed.

  3. Analysis and methodology for aeronautical systems technology program planning

    NASA Technical Reports Server (NTRS)

    White, M. J.; Gershkoff, I.; Lamkin, S.

    1983-01-01

    A structured methodology was developed that allows the generation, analysis, and rank-ordering of system concepts by their benefits and costs, indicating the preferred order of implementation. The methodology is supported by a base of data on civil transport aircraft fleet growth projections and data on aircraft performance relating the contribution of each element of the aircraft to overall performance. The performance data are used to assess the benefits of proposed concepts. The methodology includes a computer program for performing the calculations needed to rank-order the concepts and compute their cumulative benefit-to-cost ratio. The use of the methodology and supporting data is illustrated through the analysis of actual system concepts from various sources.

  4. Dose tracking and dose auditing in a comprehensive computed tomography dose-reduction program.

    PubMed

    Duong, Phuong-Anh; Little, Brent P

    2014-08-01

    Implementation of a comprehensive computed tomography (CT) radiation dose-reduction program is a complex undertaking, requiring an assessment of baseline doses, an understanding of dose-saving techniques, and an ongoing appraisal of results. We describe the role of dose tracking in planning and executing a dose-reduction program and discuss the use of the American College of Radiology CT Dose Index Registry at our institution. We review the basics of dose-related CT scan parameters, the components of the dose report, and the dose-reduction techniques, showing how an understanding of each technique is important in effective auditing of "outlier" doses identified by dose tracking. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. Machine intelligence and robotics: Report of the NASA study group. Executive summary

    NASA Technical Reports Server (NTRS)

    1979-01-01

    A brief overview of applications of machine intelligence and robotics in the space program is given. These include space exploration robots; global service robots that collect data from Earth orbit for public service use on soil conditions, sea states, global crop conditions, weather, geology, disasters, etc.; space industrialization and processing technologies; and the construction of large structures in space. Program options for research, advanced development, and implementation of machine intelligence and robot technology for use in program planning are discussed. A vigorous and long-range program to incorporate and keep pace with state-of-the-art developments in computer technology, in both spaceborne and ground-based computer systems, is recommended.

  6. ARPA surveillance technology for detection of targets hidden in foliage

    NASA Astrophysics Data System (ADS)

    Hoff, Lawrence E.; Stotts, Larry B.

    1994-02-01

    The processing of large quantities of synthetic aperture radar data in real time is a complex problem. Even the image formation process taxes today's most advanced computers. The use of complex algorithms with multiple channels adds another dimension to the computational problem. The Advanced Research Projects Agency (ARPA) is currently planning to use the Paragon parallel processor for this task. The Paragon is small enough to allow its use in a sensor aircraft. Candidate algorithms will be implemented on the Paragon for evaluation for real-time processing. In this paper, ARPA technology developments for detecting targets hidden in foliage are reviewed and examples of signal processing techniques on field-collected data are presented.

  7. 75 FR 54805 - Approval and Promulgation of Air Quality Implementation Plans; Minnesota; Carbon Monoxide (CO...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-09

    ... Promulgation of Air Quality Implementation Plans; Minnesota; Carbon Monoxide (CO) Limited Maintenance Plan for... June 16, 2010, to revise the Minnesota State Implementation Plan (SIP) for carbon monoxide (CO) under the Clean Air Act (CAA). The State has submitted a limited maintenance plan for CO showing continued...

  8. 40 CFR 49.9861 - Identification of plan.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... TRIBAL CLEAN AIR ACT AUTHORITY Implementation Plans for Tribes-Region X Implementation Plan for the Burns Paiute Tribe of the Burns Paiute Indian Colony of Oregon § 49.9861 Identification of plan. This section and §§ 49.9862 through 49.9890 contain the implementation plan for the Burns Paiute Tribe of the Burns...

  9. 75 FR 63139 - Approval and Promulgation of Implementation Plans-Maricopa County (Phoenix) PM-10 Nonattainment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-14

    ... Promulgation of Implementation Plans--Maricopa County (Phoenix) PM-10 Nonattainment Area; Serious Area Plan for... implementation plan (SIP) revisions submitted by the State of Arizona to meet, among other requirements, section... (Maricopa area). Specifically, EPA proposed to disapprove provisions of the 189(d) plan because they do not...

  10. Helios: a Multi-Purpose LIDAR Simulation Framework for Research, Planning and Training of Laser Scanning Operations with Airborne, Ground-Based Mobile and Stationary Platforms

    NASA Astrophysics Data System (ADS)

    Bechtold, S.; Höfle, B.

    2016-06-01

    In many technical domains of modern society, there is a growing demand for fast, precise and automatic acquisition of digital 3D models of a wide variety of physical objects and environments. Laser scanning is a popular and widely used technology to cover this demand, but it is also expensive and complex to use to its full potential. However, there might exist scenarios where the operation of a real laser scanner could be replaced by a computer simulation, in order to save time and costs. This includes scenarios like teaching and training of laser scanning, development of new scanner hardware and scanning methods, or generation of artificial scan data sets to support the development of point cloud processing and analysis algorithms. To test the feasibility of this idea, we have developed a highly flexible laser scanning simulation framework named Heidelberg LiDAR Operations Simulator (HELIOS). HELIOS is implemented as a Java library and split up into a core component and multiple extension modules. Extensible Markup Language (XML) is used to define scanner, platform and scene models and to configure the behaviour of modules. Modules were developed and implemented for (1) loading of simulation assets and configuration (i.e. 3D scene models, scanner definitions, survey descriptions etc.), (2) playback of XML survey descriptions, (3) TLS survey planning (i.e. automatic computation of recommended scanning positions) and (4) interactive real-time 3D visualization of simulated surveys. As a proof of concept, we show the results of two experiments: First, a survey planning test in a scene that was specifically created to evaluate the quality of the survey planning algorithm. Second, a simulated TLS scan of a crop field in a precision farming scenario. The results show that HELIOS fulfills its design goals.

  11. Inverse-consistent rigid registration of CT and MR for MR-based planning and adaptive prostate radiation therapy

    NASA Astrophysics Data System (ADS)

    Rivest-Hénault, David; Dowson, Nicholas; Greer, Peter; Dowling, Jason

    2014-03-01

    MRI-alone treatment planning and adaptive MRI-based prostate radiation therapy are two promising techniques that could significantly increase the accuracy of the curative dose delivery processes while reducing the total radiation dose. State-of-the-art methods rely on the registration of a patient MRI with an MR-CT atlas for the estimation of pseudo-CT [5]. This atlas itself is generally created by registering many CT and MRI pairs. Most registration methods are not symmetric, so the order of the images influences the result [8]. The computed transformation is therefore biased, introducing unwanted variability. This work examines how much a symmetric algorithm improves the registration. Methods: A robust symmetric registration algorithm is proposed that simultaneously optimises a half-space transform and its inverse. During the registration process, the two input volumetric images are transformed to a common position in space, thereby minimising any computational bias. An asymmetrical implementation of the same algorithm was used for comparison purposes. Results: Whole-pelvis MRI and CT scans from 15 prostate patients were registered, as in the creation of MR-CT atlases. In each case, two registrations were performed, with different input image orders, and the transformation error quantified. Mean residuals of 0.63 ± 0.26 mm (translation) and (8.7 ± 7.3) × 10^-3 rad (rotation) were found for the asymmetrical implementation, with corresponding values of 0.038 ± 0.039 mm and (1.6 ± 1.3) × 10^-3 rad for the proposed symmetric algorithm, a substantial improvement. Conclusions: The increased registration precision will enhance the generation of pseudo-CT from MRI for atlas-based MR planning methods.
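    The order-dependent bias this abstract measures can be illustrated with a small sketch (not the authors' registration algorithm): compose the forward transform with the transform obtained when the image order is swapped; for a perfectly symmetric method the composition is the identity, and any residual translation or rotation quantifies the inverse-consistency error. The 2-D transforms and numbers below are hypothetical.

```python
import math

def rigid(theta, tx, ty):
    """3x3 homogeneous 2-D rigid transform (rotation theta, translation tx, ty)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, tx], [s, c, ty], [0.0, 0.0, 1.0]]

def matmul(a, b):
    """Multiply two 3x3 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def consistency_error(t_ab, t_ba):
    """Compose A->B with B->A; a perfectly inverse-consistent registration
    yields the identity. Return the residual translation and rotation."""
    m = matmul(t_ba, t_ab)
    trans = math.hypot(m[0][2], m[1][2])        # leftover translation
    rot = abs(math.atan2(m[1][0], m[0][0]))     # leftover rotation (rad)
    return trans, rot

# Hypothetical asymmetric result: the reverse-order registration does not
# exactly invert the forward one.
t_ab = rigid(0.050, 10.0, -4.0)
t_ba = rigid(-0.048, -9.4, 4.3)   # imperfect inverse
trans_err, rot_err = consistency_error(t_ab, t_ba)
```

    Averaging over many image pairs, this residual is precisely the kind of statistic the paper reports for the asymmetric versus symmetric implementations.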

  12. Evolving the Technical Infrastructure of the Planetary Data System for the 21st Century

    NASA Technical Reports Server (NTRS)

    Beebe, Reta F.; Crichton, D.; Hughes, S.; Grayzeck, E.

    2010-01-01

    The Planetary Data System (PDS) was established in 1989 as a distributed system to assure scientific oversight. Initially the PDS followed guidelines recommended by the National Academies Committee on Data Management and Computation (CODMAC, 1982) and placed emphasis on archiving validated datasets. But over time, users, supported by increased computing capabilities and communication methods, have placed increasing demands on the PDS. The PDS must add additional services to better enable scientific analysis within distributed environments and to ensure that those services integrate with existing systems and data. To face these challenges, the PDS must modernize its architecture and technical implementation. The PDS 2010 project addresses these challenges. As part of this project, the PDS has three fundamental goals: (1) providing more efficient delivery of data by data providers to the PDS; (2) enabling a stable, long-term usable planetary science data archive; and (3) enabling services for the data consumer to find, access and use the data they require in contemporary data formats. In order to achieve these goals, the PDS 2010 project is upgrading both the technical infrastructure and the data standards to support increased efficiency in data delivery as well as usability of the PDS. Efforts are underway to interface with missions as early as possible and to streamline the preparation and delivery of data to the PDS. Likewise, the PDS is working to define and plan for data services that will help researchers to perform analysis in cost-constrained environments. This presentation will cover the PDS 2010 project, including the goals, data standards and technical implementation plans that are underway within the Planetary Data System. It will discuss the plans for moving from the current system, version PDS 3, to version PDS 4.

  13. Resource Allocation Planning Helper (RALPH): Lessons learned

    NASA Technical Reports Server (NTRS)

    Durham, Ralph; Reilly, Norman B.; Springer, Joe B.

    1990-01-01

    The current task of the Resource Allocation Process includes the planning and apportionment of JPL's Ground Data System, composed of the Deep Space Network and the Mission Control and Computing Center facilities. The addition of the data-driven, rule-based planning system RALPH has expanded the planning horizon from 8 weeks to 10 years and has resulted in large labor savings. Use of the system has also resulted in important improvements in science return through enhanced resource utilization. In addition, RALPH has been instrumental in supporting rapid turnaround for an increased volume of special what-if studies. The status of RALPH is briefly reviewed, and important lessons learned are drawn from the creation of a highly functional design team, from an evolutionary design and implementation period in which an AI shell was selected, prototyped, and ultimately abandoned, and from the fundamental changes to the very process that spawned the tool kit. Principal topics include proper integration of software tools within the planning environment, transition from prototype to delivered software, changes in the planning methodology as a result of evolving software capabilities, and creation of the ability to develop and process generic requirements to allow planning flexibility.

  14. Design requirements for SRB production control system. Volume 4: Implementation

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The implementation plan which is presented was developed to provide the means for the successful implementation of the automated production control system. There are three factors which the implementation plan encompasses: detailed planning; phased implementation; and user involvement. The plan is detailed to the task level in terms of necessary activities as the system is developed, refined, installed, and tested. These tasks are scheduled, on a preliminary basis, over a two-and-one-half-year time frame.

  15. CAMELOT: Computational-Analytical Multi-fidElity Low-thrust Optimisation Toolbox

    NASA Astrophysics Data System (ADS)

    Di Carlo, Marilena; Romero Martin, Juan Manuel; Vasile, Massimiliano

    2018-03-01

    Computational-Analytical Multi-fidElity Low-thrust Optimisation Toolbox (CAMELOT) is a toolbox for the fast preliminary design and optimisation of low-thrust trajectories. It solves highly complex combinatorial problems to plan multi-target missions characterised by long spirals including different perturbations. To do so, CAMELOT implements a novel multi-fidelity approach combining analytical surrogate modelling and accurate computational estimations of the mission cost. Decisions are then made using two optimisation engines included in the toolbox, a single-objective global optimiser, and a combinatorial optimisation algorithm. CAMELOT has been applied to a variety of case studies: from the design of interplanetary trajectories to the optimal de-orbiting of space debris and from the deployment of constellations to on-orbit servicing. In this paper, the main elements of CAMELOT are described and two examples, solved using the toolbox, are presented.

  16. Exploiting multicore compute resources in the CMS experiment

    NASA Astrophysics Data System (ADS)

    Ramírez, J. E.; Pérez-Calero Yzquierdo, A.; Hernández, J. M.; CMS Collaboration

    2016-10-01

    CMS has developed a strategy to efficiently exploit the multicore architecture of the compute resources accessible to the experiment. A coherent use of the multiple cores available in a compute node yields substantial gains in terms of resource utilization. The implemented approach makes use of the multithreading support of the event processing framework and the multicore scheduling capabilities of the resource provisioning system. Multicore slots are acquired and provisioned by means of multicore pilot agents which internally schedule and execute single and multicore payloads. Multicore scheduling and multithreaded processing are currently used in production for online event selection and prompt data reconstruction. More workflows are being adapted to run in multicore mode. This paper presents a review of the experience gained in the deployment and operation of the multicore scheduling and processing system, the current status and future plans.

  17. 40 CFR 256.42 - Recommendations for assuring facility development.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... WASTES GUIDELINES FOR DEVELOPMENT AND IMPLEMENTATION OF STATE SOLID WASTE MANAGEMENT PLANS Facility Planning and Implementation § 256.42 Recommendations for assuring facility development. (a) The State plan... facilities, and (4) Development of schedules of implementation. (d) The State plan should encourage private...

  18. 40 CFR 256.42 - Recommendations for assuring facility development.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... WASTES GUIDELINES FOR DEVELOPMENT AND IMPLEMENTATION OF STATE SOLID WASTE MANAGEMENT PLANS Facility Planning and Implementation § 256.42 Recommendations for assuring facility development. (a) The State plan... facilities, and (4) Development of schedules of implementation. (d) The State plan should encourage private...

  19. 40 CFR 256.42 - Recommendations for assuring facility development.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... WASTES GUIDELINES FOR DEVELOPMENT AND IMPLEMENTATION OF STATE SOLID WASTE MANAGEMENT PLANS Facility Planning and Implementation § 256.42 Recommendations for assuring facility development. (a) The State plan... facilities, and (4) Development of schedules of implementation. (d) The State plan should encourage private...

  20. 75 FR 3680 - Revisions to the California State Implementation Plan

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-22

    ... the California State Implementation Plan AGENCY: Environmental Protection Agency (EPA). ACTION... Pollution Control District (SJVAPCD) portion of the California State Implementation Plan (SIP). These... are taking comments on this proposal and plan to follow with a final action. DATES: Any comments must...

  1. 49 CFR 130.33 - Response plan implementation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 2 2014-10-01 2014-10-01 false Response plan implementation. 130.33 Section 130... SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION OIL TRANSPORTATION OIL SPILL PREVENTION AND RESPONSE PLANS § 130.33 Response plan implementation. If, during transportation of oil subject to this part, a...

  2. 49 CFR 130.33 - Response plan implementation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 2 2010-10-01 2010-10-01 false Response plan implementation. 130.33 Section 130... SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION OIL TRANSPORTATION OIL SPILL PREVENTION AND RESPONSE PLANS § 130.33 Response plan implementation. If, during transportation of oil subject to this part, a...

  3. 49 CFR 130.33 - Response plan implementation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 2 2011-10-01 2011-10-01 false Response plan implementation. 130.33 Section 130... SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION OIL TRANSPORTATION OIL SPILL PREVENTION AND RESPONSE PLANS § 130.33 Response plan implementation. If, during transportation of oil subject to this part, a...

  4. 49 CFR 130.33 - Response plan implementation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 2 2012-10-01 2012-10-01 false Response plan implementation. 130.33 Section 130... SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION OIL TRANSPORTATION OIL SPILL PREVENTION AND RESPONSE PLANS § 130.33 Response plan implementation. If, during transportation of oil subject to this part, a...

  5. 49 CFR 130.33 - Response plan implementation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 2 2013-10-01 2013-10-01 false Response plan implementation. 130.33 Section 130... SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION OIL TRANSPORTATION OIL SPILL PREVENTION AND RESPONSE PLANS § 130.33 Response plan implementation. If, during transportation of oil subject to this part, a...

  6. 76 FR 28195 - Approval and Promulgation of Air Quality Implementation Plans; New Mexico; Sunland Park 1-Hour...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-16

    ... Promulgation of Air Quality Implementation Plans; New Mexico; Sunland Park 1-Hour Ozone Maintenance Plan AGENCY... the New Mexico State Implementation Plan (SIP). The revision consists of a maintenance plan for Sunland Park, New Mexico developed to ensure continued attainment of the 8-hour ozone National Ambient Air...

  7. Practical application of game theory based production flow planning method in virtual manufacturing networks

    NASA Astrophysics Data System (ADS)

    Olender, M.; Krenczyk, D.

    2016-08-01

    Modern enterprises have to react quickly to dynamic changes in the market, due to changing customer requirements and expectations. One of the key areas of production management that must continuously evolve, by searching for new methods and tools for increasing the efficiency of manufacturing systems, is production flow planning and control. These aspects are closely connected with the ability to implement the concepts of Virtual Enterprises (VE) and Virtual Manufacturing Networks (VMN), in which integrated infrastructures of flexible resources are created. In the proposed approach, the role of the players is performed by objects associated with the objective functions, allowing multiobjective production flow planning problems to be solved using game theory, which builds on the theory of strategic situations. For defined production system and production order models, ways of solving the production route planning problem in a VMN are presented on computational examples for different variants of production flow. A possible decision strategy, together with an analysis of the calculation results, is shown.

  8. Road Risk Modeling and Cloud-Aided Safety-Based Route Planning.

    PubMed

    Li, Zhaojian; Kolmanovsky, Ilya; Atkins, Ella; Lu, Jianbo; Filev, Dimitar P; Michelini, John

    2016-11-01

    This paper presents a safety-based route planner that exploits vehicle-to-cloud-to-vehicle (V2C2V) connectivity. Time and road risk index (RRI) are considered as metrics to be balanced based on user preference. To evaluate road segment risk, a road and accident database from the highway safety information system is mined with a hybrid neural network model to predict RRI. Real-time factors such as time of day, day of the week, and weather are included as correction factors to the static RRI prediction. With real-time RRI and expected travel time, route planning is formulated as a multiobjective network flow problem and further reduced to a mixed-integer programming problem. A V2C2V implementation of our safety-based route planning approach is proposed to facilitate access to real-time information and computing resources. A real-world case study, route planning through the city of Columbus, Ohio, is presented. Several scenarios illustrate how the "best" route can be adjusted to favor time versus safety metrics.
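    The balance between travel time and road risk described in this abstract can be sketched with a simple scalarized shortest-path search (a toy stand-in for the paper's mixed-integer formulation; the network, weights, and the `alpha` preference parameter below are hypothetical):

```python
import heapq

def safest_fastest(graph, start, goal, alpha):
    """Dijkstra search on the scalarized cost alpha*time + (1-alpha)*risk.

    graph: {node: [(neighbor, travel_time, road_risk_index), ...]}
    alpha=1.0 minimizes pure travel time; alpha=0.0 minimizes pure risk.
    Returns (cost, path).
    """
    frontier = [(0.0, start, [start])]
    seen = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, t, r in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(frontier,
                               (cost + alpha * t + (1 - alpha) * r,
                                nxt, path + [nxt]))
    return float("inf"), []

# Toy network: the fast road A-B-D carries a higher risk index than A-C-D.
net = {"A": [("B", 5, 8), ("C", 7, 2)],
       "B": [("D", 5, 8)],
       "C": [("D", 7, 2)]}
fast = safest_fastest(net, "A", "D", alpha=1.0)   # time-optimal: A-B-D
safe = safest_fastest(net, "A", "D", alpha=0.0)   # risk-optimal: A-C-D
```

    Sweeping `alpha` between 0 and 1 exposes the same time-versus-safety trade-off the paper's Columbus, Ohio case study illustrates.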

  9. Disaster recovery plan for HANDI 2000 business management system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, D.E.

    The BMS production implementation will be complete by October 1, 1998, and the server environment will comprise two types of platforms. The PassPort Supply and the PeopleSoft Financials will reside on UNIX servers, and the PeopleSoft Human Resources and Payroll will reside on Microsoft NT servers. Because of the wide scope and the requirements of the COTS products to run in various environments, backup and recovery responsibilities are divided between two groups in Technical Operations. The Central Computer Systems Management group provides support for the UNIX/NT Backup Data Center, and the Network Infrastructure Systems group provides support for the NT Application Server Backup outside the Data Center. The disaster recovery process is dependent on a good backup and recovery process. Information and integrated system data for determining the disaster recovery process is identified from the Fluor Daniel Hanford (FDH) Risk Assessment Plan, Contingency Plan, Backup and Recovery Plan, and Backup Form for HANDI 2000 BMS.

  10. Design and Implementation of the PMS Module for ’Argos’

    DTIC Science & Technology

    1989-12-01

    designing, and implementing a fully workable Planned Maintenance System (PMS). This implementation demonstrates both the capabilities and benefits of such a ... design and implementation. PMS is the system developed by the Navy to provide each ship, department, and supervisor with the tools needed to plan

  11. Implementation of Discharge Plans for Chronically Ill Elders Discharged Home.

    ERIC Educational Resources Information Center

    Proctor, Enola K.; And Others

    1996-01-01

    Addresses the extent to which discharge plans for elderly patients with congestive heart failure were implemented as planned, tested the consequences of implementation problems, and identified factors associated with implementation problems. Implications for hospital discharge planners and home health care are discussed. (KW)

  12. Using simulated historical time series to prioritize fuel treatments on landscapes across the United States: The LANDFIRE prototype project

    USGS Publications Warehouse

    Keane, Robert E.; Rollins, Matthew; Zhu, Zhi-Liang

    2007-01-01

    Canopy and surface fuels in many fire-prone forests of the United States have increased over the last 70 years as a result of modern fire exclusion policies, grazing, and other land management activities. The Healthy Forest Restoration Act and National Fire Plan establish a national commitment to reduce fire hazard and restore fire-adapted ecosystems across the USA. The primary index used to prioritize treatment areas across the nation is Fire Regime Condition Class (FRCC) computed as departures of current conditions from the historical fire and landscape conditions. This paper describes a process that uses an extensive set of ecological models to map FRCC from a departure statistic computed from simulated time series of historical landscape composition. This mapping process uses a data-driven, biophysical approach where georeferenced field data, biogeochemical simulation models, and spatial data libraries are integrated using spatial statistical modeling to map environmental gradients that are then used to predict vegetation and fuels characteristics over space. These characteristics are then fed into a landscape fire and succession simulation model to simulate a time series of historical landscape compositions that are then compared to the composition of current landscapes to compute departure, and the FRCC values. Intermediate products from this process are then used to create ancillary vegetation, fuels, and fire regime layers that are useful in the eventual planning and implementation of fuel and restoration treatments at local scales. The complex integration of varied ecological models at different scales is described and problems encountered during the implementation of this process in the LANDFIRE prototype project are addressed.
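    The departure statistic at the heart of the FRCC mapping described above can be sketched in a few lines (a hedged illustration of the general convention, not the LANDFIRE prototype's code; the vegetation class names and proportions are hypothetical):

```python
def departure(current, historical_series):
    """Percent departure of the current landscape composition from the
    mean composition of a simulated historical time series.

    current: {class_name: proportion of landscape area}
    historical_series: list of such dicts, one per simulated time step.
    Similarity is the summed per-class minimum of the two distributions;
    departure = 100 * (1 - similarity).
    """
    classes = set(current)
    for snapshot in historical_series:
        classes |= set(snapshot)
    n = len(historical_series)
    hist_mean = {c: sum(s.get(c, 0.0) for s in historical_series) / n
                 for c in classes}
    similarity = sum(min(current.get(c, 0.0), hist_mean[c]) for c in classes)
    return 100.0 * (1.0 - similarity)

def condition_class(dep):
    """Map a 0-100 departure to Fire Regime Condition Class 1, 2, or 3."""
    return 1 if dep <= 33 else 2 if dep <= 66 else 3

# Two simulated historical snapshots of succession-class proportions.
hist = [{"early": 0.5, "mid": 0.3, "late": 0.2},
        {"early": 0.4, "mid": 0.4, "late": 0.2}]
dep = departure({"early": 0.1, "mid": 0.2, "late": 0.7}, hist)  # ~50, class 2
```

    The per-class minimum makes the statistic symmetric and bounded, so a landscape matching its simulated historical composition scores 0 departure (FRCC 1) while one entirely outside it scores 100 (FRCC 3).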

  13. Implementation of The World Starts With Me, a comprehensive rights-based sex education programme in Uganda.

    PubMed

    Rijsdijk, Liesbeth E; Bos, Arjan E R; Lie, Rico; Leerlooijer, Joanne N; Eiling, Ellen; Atema, Vera; Gebhardt, Winifred A; Ruiter, Robert A C

    2014-04-01

    This article presents a process evaluation of the implementation of the sex education programme the World Starts With Me (WSWM) for secondary school students in Uganda. The purpose of this mixed-methods study was to examine factors associated with dose delivered (number of lessons implemented) and fidelity of implementation (implementation according to the manual), as well as to identify the main barriers and facilitators of implementation. Teachers' confidence in teaching WSWM was negatively associated with dose delivered. Confidence in educating and discussing sexuality issues in class was positively associated with fidelity of implementation, whereas the importance teachers attached to open sex education showed a negative association with fidelity. Main barriers to implementing WSWM were lack of time, unavailability of computers, lack of student manuals and lack of financial support and rewards. Other barriers to successful implementation were related to high turnover of staff and insufficient training and guidance of teachers. Teachers' beliefs/attitudes towards sexuality of adolescents, condom use and sex education were found to be important socio-cognitive factors interfering with full fidelity of implementation. These findings can be used to improve the intervention implementation and to better plan for large-scale dissemination of school-based sex education programmes in sub-Saharan Africa.

  14. Decommissioning strategy for liquid low-level radioactive waste surface storage water reservoir.

    PubMed

    Utkin, S S; Linge, I I

    2016-11-22

    The Techa Cascade of water reservoirs (TCR) is one of the most environmentally challenging facilities resulting from FSUE "PA "Mayak" operations. Its reservoirs hold over 360 million m^3 of liquid radioactive waste with a total activity of some 5 × 10^15 Bq. A set of actions implemented under a special State program, involving the development of a strategic plan aimed at the complete elimination of TCR challenges (the Strategic Master Plan for the Techa Cascade of water reservoirs), has resulted in a considerable reduction of the potential hazards associated with this facility. The paper summarizes the key elements of this master plan: defining the TCR final state, a feasibility study of the main strategies aimed at its attainment, evaluation of a relevant long-term decommissioning strategy, and development of computational tools enabling long-term forecasts of TCR behavior depending on various engineering solutions and different weather conditions. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Medical Applications of the PHITS Code (3): User Assistance Program for Medical Physics Computation.

    PubMed

    Furuta, Takuya; Hashimoto, Shintaro; Sato, Tatsuhiko

    2016-01-01

    DICOM2PHITS and PSFC4PHITS are user assistance programs for medical physics applications of PHITS. DICOM2PHITS constructs a voxel PHITS simulation geometry from patient CT DICOM image data using a conversion table from CT number to material composition. PSFC4PHITS converts IAEA phase-space file data to the PHITS format so that they can be used as a simulation source in PHITS. Both programs are useful for users who want to apply PHITS simulation to the verification of radiation therapy treatment planning. We are now developing a program to convert dose distributions obtained by PHITS to the DICOM RT-dose format, and we plan to develop a program that can incorporate the treatment information included in other DICOM files (RT-plan and RT-structure).
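
    The CT-number-to-material-composition conversion mentioned above can be sketched as a simple threshold lookup. The table and thresholds below are hypothetical illustrations, not the table shipped with DICOM2PHITS:

    ```python
    # Hypothetical sketch of a CT-number-to-material lookup such as a
    # DICOM-to-voxel-geometry converter might apply (labels/thresholds assumed).

    # Conversion table: (upper HU bound, material label) pairs, ascending.
    HU_TO_MATERIAL = [
        (-900, "air"),
        (-100, "lung"),
        (100, "soft_tissue"),
        (3000, "bone"),
    ]

    def material_for_hu(hu):
        """Return the material label for a given CT number (Hounsfield units)."""
        for upper_bound, material in HU_TO_MATERIAL:
            if hu <= upper_bound:
                return material
        return "metal"  # fallback for very high CT numbers

    def voxelize(ct_slice):
        """Map a 2-D array of CT numbers to a 2-D array of material labels."""
        return [[material_for_hu(hu) for hu in row] for row in ct_slice]
    ```

    A real converter would also assign a mass density per voxel; this sketch only shows the thresholding step.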

  16. Distributed intelligence for supervisory control

    NASA Technical Reports Server (NTRS)

    Wolfe, W. J.; Raney, S. D.

    1987-01-01

    Supervisory control systems must deal with various types of intelligence distributed throughout the layers of control. Typical layers are real-time servo control, off-line planning and reasoning subsystems, and, finally, the human operator. Design methodologies must account for the fact that the majority of the intelligence will reside with the human operator. Hierarchical decompositions and feedback loops are discussed as conceptual building blocks that provide a common ground for man-machine interaction. Examples of types of parallelism and of parallel implementation on several classes of computer architecture are also discussed.

  17. Implementation Plan for Worldwide Airborne Command Post (WWABNCP) Operator Computer-Based Training (PLATO): Decision Paper.

    DTIC Science & Technology

    1985-10-03

    Electrospace Systems, Inc. (ESI). ESI conducted a market search for training systems that would enhance unit level training, minimize cost-prohibitive...can be reprogrammed to simulate the UGC-129 keyboard. This keyboard is the standard keyboard used for data transmission on board the EC-135 and E-4B...with the appropriate technical order, and the functions and operation of the AN/UGC-129 (ASR) terminals used with the AN/ASC-21 AFSATCOM system. In

  18. Galileo Teacher Training Program - MoonDays

    NASA Astrophysics Data System (ADS)

    Heenatigala, T.; Doran, R.

    2012-09-01

    The Moon is an excellent tool for classroom education. Many teachers nevertheless fail to bring lunar science into the classroom, for several reasons: lack of guidance, difficulty finding the right materials, and the challenge of integrating lessons into the school curriculum, to name a few. To address this need, the Galileo Teacher Training Program (GTTP) [1] presents MoonDays, a resource guide for teachers worldwide that can be used both in and out of the classroom. GTTP MoonDays includes scientific knowledge, hands-on activities, computing skills, creativity and disability-based lesson plans.

  19. F100 multivariable control synthesis program: Evaluation of a multivariable control using a real-time engine simulation

    NASA Technical Reports Server (NTRS)

    Szuch, J. R.; Soeder, J. F.; Seldner, K.; Cwynar, D. S.

    1977-01-01

    The design, evaluation, and testing of a practical, multivariable, linear quadratic regulator control for the F100 turbofan engine were accomplished. The NASA evaluation of the multivariable control logic and its implementation is covered. The evaluation utilized a real-time, hybrid computer simulation of the engine. Results of the evaluation are presented, and recommendations concerning future engine testing of the control are made. Results indicated that the engine testing of the control should be conducted as planned.

  20. 40 CFR 49.10042 - Approval status.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... TRIBAL CLEAN AIR ACT AUTHORITY Implementation Plans for Tribes-Region X Implementation Plan for the Cow... Tribal rules or measures in the implementation plan for the Reservation of the Cow Creek Band of Umpqua...

  1. Ames Research Center FY 2000 Implementation Plan: Leading Technology into the New Millennium

    NASA Technical Reports Server (NTRS)

    2000-01-01

    This document presents the implementation plan for Ames Research Center (ARC) within the overall framework of the NASA Strategic Plan. It describes how ARC intends to implement its Center of Excellence responsibilities, Agency assigned missions, Agency and Enterprise lead programs, and other roles in support of NASA's vision and mission. All Federal agencies are required by the 1993 Government Performance and Results Act to implement a long-term strategic planning process that includes measurable outcomes and strict accountability. At NASA, this planning process is shaped by the Space Act of 1958, annual appropriations, and other external mandates, as well as by customer requirements. The resulting Strategic Plan sets the overall architecture for what we do, identifies who our customers are, and directs where we are going and why. The Strategic Plan is the basis upon which decisions regarding program implementation and resource deployment are made. Whereas the strategic planning process examines the long-term direction of the organization and identifies a specific set of goals, the implementation planning process examines the detailed performance of the organization and allocates resources toward meeting these goals. It is the purpose of this implementation document to provide the connection between the NASA Strategic Plan and the specific programs and support functions that ARC employees perform. This connection flows from the NASA Strategic Plan, through the various Strategic Enterprise plans to the ARC Center of Excellence, primary missions, Lead Center programs, program support responsibilities, and ultimately, to the role of the individual ARC employee.

  2. 77 FR 60661 - Approval and Promulgation of Air Quality Implementation Plans; Indiana; South Bend-Elkhart...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-04

    ... Promulgation of Air Quality Implementation Plans; Indiana; South Bend-Elkhart, Indiana Ozone Maintenance Plan..., Indiana 1997 8-hour ozone maintenance air quality State Implementation Plan (SIP) by replacing the... Vehicle Emissions Simulator (MOVES) 2010a emissions model. Indiana submitted this request to EPA for...

  3. 77 FR 59356 - Approval and Promulgation of Implementation Plans; North Carolina: Approval of Rocky Mount...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-27

    ... Promulgation of Implementation Plans; North Carolina: Approval of Rocky Mount Supplemental Motor Vehicle... is proposing to approve a revision to the North Carolina State Implementation Plan (SIP), submitted... supplements the original redesignation request and maintenance plan for Rocky Mount 1997 8-hour ozone area...

  4. 77 FR 30212 - Approval and Promulgation of Air Quality Implementation Plans; Vermont; Regional Haze

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-22

    ... Promulgation of Air Quality Implementation Plans; Vermont; Regional Haze AGENCY: Environmental Protection... Implementation Plan (SIP) that addresses regional haze for the first planning period from 2008 through 2018. The... numerous sources located over a wide geographic area (also referred to as the ``regional haze program...

  5. 77 FR 47530 - Approval and Promulgation of State Implementation Plans; Hawaii; Infrastructure Requirements for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-09

    ... Promulgation of State Implementation Plans; Hawaii; Infrastructure Requirements for the 1997 8-Hour Ozone and... State Implementation Plan (SIP) revision submitted by the state of Hawaii pursuant to the requirements... FURTHER INFORMATION CONTACT: Dawn Richmond, Air Planning Office (AIR-2), U.S. Environmental Protection...

  6. Compilation and development of K-6 aerospace materials for implementation in NASA spacelink electronic information system

    NASA Technical Reports Server (NTRS)

    Blake, Jean A.

    1987-01-01

    Spacelink is an electronic information service to be operated by the Marshall Space Flight Center. It will provide NASA news and educational resources including software programs that can be accessed by anyone with a computer and modem. Spacelink is currently being installed and will soon begin service. It will provide daily updates of NASA programs, information about NASA educational services, manned space flight, unmanned space flight, aeronautics, NASA itself, lesson plans and activities, and space program spinoffs. Lesson plans and activities were extracted from existing NASA publications on aerospace activities for the elementary school. These materials were arranged into 206 documents which have been entered into the Spacelink program for use in grades K-6.

  7. Representation and Use of Temporal Information in ONCOCIN

    PubMed Central

    Kahn, Michael G.; Ferguson, Jay C.; Shortliffe, Edward H.; Fagan, Lawrence M.

    1985-01-01

    The past medical history of a patient is a complex collection of events, yet an understanding of these past events is critical for effective medical diagnostic and therapeutic decisions. Although computers can store vast quantities of patient data, diagnostic and therapeutic computer programs have had difficulty in accessing and analyzing the collections of patient information that are clinically pertinent to a specific decision facing a particular patient at a given moment in the course of a disease. Without some model of the patient's past, the computer cannot fully interpret the meaning of the available patient data. We present some of the difficulties that were encountered in ONCOCIN, a cancer chemotherapy planning program. This program must be able to reason about the patient's past treatment history in order to generate a therapy plan that is responsive to the problems he or she may have encountered in the past. A design is presented that supports a more intuitive approach to capturing and analyzing important temporal relationships in a patient's computer record. In order to represent the time course of a patient, we have implemented a structure called the temporal network and a temporal syntax for data storage and retrieval. Using this system, ONCOCIN is able to quickly obtain data that are patient-specific and context-sensitive. Adding the temporal network to the ONCOCIN system has markedly improved the program's handling of complex temporal issues.

  8. Planning Targets for Phase II Watershed Implementation Plans

    EPA Pesticide Factsheets

    On August 1, 2011, EPA provided planning targets for nitrogen, phosphorus and sediment for the Phase II Watershed Implementation Plans (WIPs) of the Chesapeake Bay TMDL. This page provides the letters containing those planning targets.

  9. A conceptual and computational model of moral decision making in human and artificial agents.

    PubMed

    Wallach, Wendell; Franklin, Stan; Allen, Colin

    2010-07-01

    Recently, there has been a resurgence of interest in general, comprehensive models of human cognition. Such models aim to explain higher-order cognitive faculties, such as deliberation and planning. Given a computational representation, the validity of these models can be tested in computer simulations such as software agents or embodied robots. The push to implement computational models of this kind has created the field of artificial general intelligence (AGI). Moral decision making is arguably one of the most challenging tasks for computational approaches to higher-order cognition. The need for increasingly autonomous artificial agents to factor moral considerations into their choices and actions has given rise to another new field of inquiry variously known as Machine Morality, Machine Ethics, Roboethics, or Friendly AI. In this study, we discuss how LIDA, an AGI model of human cognition, can be adapted to model both affective and rational features of moral decision making. Using the LIDA model, we will demonstrate how moral decisions can be made in many domains using the same mechanisms that enable general decision making. Comprehensive models of human cognition typically aim for compatibility with recent research in the cognitive and neural sciences. Global workspace theory, proposed by the neuropsychologist Bernard Baars (1988), is a highly regarded model of human cognition that is currently being computationally instantiated in several software implementations. LIDA (Franklin, Baars, Ramamurthy, & Ventura, 2005) is one such computational implementation. LIDA is both a set of computational tools and an underlying model of human cognition, which provides mechanisms that are capable of explaining how an agent's selection of its next action arises from bottom-up collection of sensory data and top-down processes for making sense of its current situation. 
We will describe how the LIDA model helps integrate emotions into the human decision-making process, and we will elucidate a process whereby an agent can work through an ethical problem to reach a solution that takes account of ethically relevant factors. Copyright © 2010 Cognitive Science Society, Inc.

  10. Building robust conservation plans.

    PubMed

    Visconti, Piero; Joppa, Lucas

    2015-04-01

    Systematic conservation planning optimizes trade-offs between biodiversity conservation and human activities by accounting for socioeconomic costs while aiming to achieve prescribed conservation objectives. However, the most cost-efficient conservation plan can be very dissimilar to any other plan achieving the set of conservation objectives. This is problematic under conditions of implementation uncertainty (e.g., if all or part of the plan becomes unattainable). Through simulations of the parallel implementation of conservation plans and of habitat loss, we determined the conditions under which optimal plans have limited chances of implementation and under which implementation attempts would fail to meet objectives. We then devised a new, flexible method for identifying conservation priorities and scheduling conservation actions. This method entails generating a number of alternative plans, calculating the similarity in site composition among all plans, and selecting the plan with the highest density of neighboring plans in similarity space. We compared our method with the classic method that maximizes cost efficiency on synthetic and real data sets. When implementation was uncertain, a common reality, our method provided a higher likelihood of achieving conservation targets. We found that χ, a measure of the shortfall in objectives achieved by a conservation plan if the plan could not be implemented entirely, was the main factor determining the relative performance of a flexibility-enhanced approach to conservation prioritization. Our findings should help planning authorities prioritize conservation efforts in the face of uncertainty about the future condition and availability of sites. © 2014 Society for Conservation Biology.
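
    The selection step described above (pick the plan with the highest density of neighboring plans in similarity space) can be sketched as follows. The Jaccard similarity measure, the neighborhood threshold, and the toy plans are illustrative assumptions, not the study's actual data or metric:

    ```python
    # Sketch of density-based plan selection among alternative conservation plans.
    # Similarity metric (Jaccard on site sets) and threshold are assumptions.

    def jaccard(a, b):
        """Jaccard similarity of two site sets."""
        a, b = set(a), set(b)
        return len(a & b) / len(a | b)

    def most_central_plan(plans, threshold=0.5):
        """Index of the plan with the most neighbours whose similarity
        exceeds `threshold` (i.e. highest density in similarity space)."""
        def density(i):
            return sum(1 for j, other in enumerate(plans)
                       if j != i and jaccard(plans[i], other) >= threshold)
        return max(range(len(plans)), key=density)

    plans = [
        {"site1", "site2", "site3"},
        {"site1", "site2", "site4"},
        {"site1", "site2", "site5"},
        {"site7", "site8", "site9"},  # cost-efficient but isolated plan
    ]
    ```

    The isolated fourth plan would never be chosen even if it were cheapest, which is the point of the flexibility-enhanced approach under implementation uncertainty.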

  11. 76 FR 15 - Approval and Promulgation of Implementation Plans; Texas; Emissions Banking and Trading of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-03

    ... Promulgation of Implementation Plans; Texas; Emissions Banking and Trading of Allowances Program AGENCY... to the Texas State Implementation Plan (SIP) that create and amend the Emissions Banking and Trading... hard copy at the Air Planning Section (6PD-L), Environmental Protection Agency, 1445 Ross Avenue, Suite...

  12. 78 FR 33784 - Approval and Promulgation of Implementation Plans; Kentucky: Kentucky Portion of Cincinnati...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-05

    ... Promulgation of Implementation Plans; Kentucky: Kentucky Portion of Cincinnati-Hamilton, Supplement Motor.... SUMMARY: EPA is proposing to approve a revision to the Kentucky State Implementation Plan (SIP), submitted... maintenance plan for the Kentucky portion of the Cincinnati-Hamilton, OH-KY-IN, maintenance area for the 1997...

  13. 78 FR 34903 - Approval and Promulgation of Air Quality Implementation Plans; Ohio; 1997 8-Hour Ozone...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-11

    ... Promulgation of Air Quality Implementation Plans; Ohio; 1997 8-Hour Ozone Maintenance Plan Revision; Motor... request by Ohio to revise the 1997 8-hour ozone maintenance air quality State Implementation Plan (SIP) to... area with budgets developed using EPA's Motor Vehicle Emissions Simulator (MOVES) emissions model. Ohio...

  14. 78 FR 16826 - Approval and Promulgation of Air Quality Implementation Plans; Ohio; Cleveland-Akron-Lorain and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-19

    ...-9790-1] Approval and Promulgation of Air Quality Implementation Plans; Ohio; Cleveland-Akron-Lorain and Columbus 1997 8-Hour Ozone Maintenance Plan Revisions to Approved Motor Vehicle Emissions Budgets AGENCY... quality State Implementation Plans (SIPs) under the Clean Air Act to replace the previously approved motor...

  15. 78 FR 28497 - Approval and Promulgation of Air Quality Implementation Plans; Ohio; Canton-Massillon 1997 8-Hour...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-15

    ... Promulgation of Air Quality Implementation Plans; Ohio; Canton-Massillon 1997 8-Hour Ozone Maintenance Plan... the Canton-Massillon, Ohio 1997 8-hour ozone maintenance air quality State Implementation Plan (SIP... using EPA's Motor Vehicle Emissions Simulator (MOVES) emissions model. Ohio submitted the SIP revision...

  16. 78 FR 28551 - Approval and Promulgation of Air Quality Implementation Plans; Ohio; Canton-Massillon 1997 8-Hour...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-15

    ... Promulgation of Air Quality Implementation Plans; Ohio; Canton-Massillon 1997 8-Hour Ozone Maintenance Plan..., Ohio, 1997 8-hour ozone maintenance air quality State Implementation Plan (SIP) under the Clean Air Act... Motor Vehicle Emissions Simulator (MOVES) emissions model. Ohio submitted the SIP revision request to...

  17. 77 FR 30214 - Approval and Promulgation of Air Quality Implementation Plans; Rhode Island; Regional Haze

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-22

    ... Promulgation of Air Quality Implementation Plans; Rhode Island; Regional Haze AGENCY: Environmental Protection... Implementation Plan (SIP) that addresses regional haze for the first planning period from 2008 through 2018. The... geographic area (also referred to as the ``regional haze program''). DATES: Effective Date: This rule is...

  18. 76 FR 43149 - Approval and Promulgation of Air Quality Implementation Plans; State of California; Regional Haze...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-20

    ... ENVIRONMENTAL PROTECTION AGENCY 40 CFR Part 52 [EPA-R09-OAR-2011-0131, FRL-9317-9] Approval and Promulgation of Air Quality Implementation Plans; State of California; Regional Haze State Implementation Plan and Interstate Transport Plan; Interference With Visibility Requirement Correction In rule document...

  19. 76 FR 57013 - Approval and Promulgation of Air Quality Implementation Plans; West Virginia; Revised Motor...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-15

    ... Promulgation of Air Quality Implementation Plans; West Virginia; Revised Motor Vehicle Emission Budgets for the... Implementation Plan (SIP) revision submitted by the State of West Virginia for the purpose of amending the 8-hour ozone maintenance plan for the Charleston, Huntington, Parkersburg, Weirton, and Wheeling 8-hour ozone...

  20. Use of cone beam computed tomography in implant dentistry: current concepts, indications and limitations for clinical practice and research.

    PubMed

    Bornstein, Michael M; Horner, Keith; Jacobs, Reinhilde

    2017-02-01

    Diagnostic radiology is an essential component of treatment planning in the field of implant dentistry. This narrative review presents current concepts for the use of cone beam computed tomography imaging, before and after implant placement, in daily clinical practice and research. Guidelines for the selection of three-dimensional imaging are discussed, and limitations are highlighted. Current concepts of radiation dose optimization, including novel imaging modalities using low-dose protocols, are presented. For preoperative cross-sectional imaging, data are still not available which demonstrate that cone beam computed tomography results in fewer intraoperative complications such as nerve damage or bleeding incidents, or that implants inserted using preoperative cone beam computed tomography data sets for planning purposes will exhibit higher survival or success rates. The use of cone beam computed tomography following the insertion of dental implants should be restricted to specific postoperative complications, such as damage to neurovascular structures or postoperative infections in relation to the maxillary sinus. Regarding peri-implantitis, the diagnosis and severity of the disease should be evaluated primarily on the basis of clinical parameters and of radiological findings from periapical (two-dimensional) radiographs. The use of cone beam computed tomography scans in clinical research might not yield any evident beneficial effect for the patients included. As many of the cone beam computed tomography scans performed for research have no direct therapeutic consequence, dose optimization measures should be implemented by using appropriate exposure parameters and by reducing the field of view to the actual region of interest. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  1. SU-F-P-07: Applying Failure Modes and Effects Analysis to Treatment Planning System QA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mathew, D; Alaei, P

    2016-06-15

    Purpose: A small-scale implementation of Failure Modes and Effects Analysis (FMEA) for treatment planning system QA, utilizing the methodology of the AAPM TG-100 report. Methods: FMEA requires numerical values for the severity (S), occurrence (O) and detectability (D) of each mode of failure. The product of these three values gives a risk priority number (RPN). We have implemented FMEA for treatment planning system (TPS) QA in two clinics, which use the Pinnacle and Eclipse TPSs. Quantitative monthly QA data dating back 4 years for Pinnacle and 1 year for Eclipse have been used to determine values for severity (deviations from predetermined doses at points or volumes) and for the occurrence of such deviations. The TPS QA protocol includes a phantom containing solid water and lung- and bone-equivalent heterogeneities. Photon and electron plans have been evaluated in both systems. The dose values at multiple distinct points of interest (POI) within the solid water, lung, and bone-equivalent slabs, as well as mean doses to several volumes of interest (VOI), have been re-calculated monthly using the available algorithms. Results: The computed doses vary slightly month-over-month. There have been more significant deviations following software upgrades, especially if the upgrade involved re-modeling of the beams. TG-100 guidance and the data presented here suggest an occurrence (O) of 2, depending on the frequency of re-commissioning the beams, a severity (S) of 3, and a detectability (D) of 2, giving an RPN of 12. Conclusion: Computerized treatment planning systems could pose a risk due to dosimetric errors and suboptimal treatment plans. The FMEA analysis presented here suggests that TPS QA should immediately follow software upgrades, but does not need to be performed every month.
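
    The risk priority number used above is simply the product of the three rankings. A minimal sketch with the values quoted in the abstract:

    ```python
    # FMEA risk priority number per AAPM TG-100: RPN = S x O x D.

    def risk_priority_number(severity, occurrence, detectability):
        """Product of the severity, occurrence, and detectability rankings."""
        return severity * occurrence * detectability

    # Values suggested for TPS QA in the abstract: S=3, O=2, D=2.
    rpn = risk_priority_number(severity=3, occurrence=2, detectability=2)
    ```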

  2. 40 CFR 109.5 - Development and implementation criteria for State, local and regional oil removal contingency plans.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... OIL REMOVAL CONTINGENCY PLANS § 109.5 Development and implementation criteria for State, local and regional oil removal contingency plans. Criteria for the development and implementation of State, local and... 40 Protection of Environment 23 2012-07-01 2012-07-01 false Development and implementation...

  3. 77 FR 65125 - Approval and Promulgation of Implementation Plans; Georgia 110(a)(1) and (2) Infrastructure...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-25

    ... controls are enforced through the associated SIP rules or Federal Implementation Plans (FIPs). Any purchase... Promulgation of Implementation Plans; Georgia 110(a)(1) and (2) Infrastructure Requirements for the 1997 and... Agency (EPA). ACTION: Final rule. SUMMARY: EPA is taking final action to approve the State Implementation...

  4. Project W-211, initial tank retrieval systems, retrieval control system software configuration management plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    RIECK, C.A.

    1999-02-23

    This Software Configuration Management Plan (SCMP) provides the instructions for change control of the W-211 Project, Retrieval Control System (RCS) software after initial approval/release but prior to the transfer of custody to the waste tank operations contractor. This plan applies to the W-211 system software developed by the project, consisting of the computer human-machine interface (HMI) and programmable logic controller (PLC) software source and executable code, for production use by the waste tank operations contractor. The plan encompasses that portion of the W-211 RCS software represented on project-specific AUTOCAD drawings that are released as part of the C1 definitive design package (these drawings are identified on the drawing list associated with each C-1 package), and the associated software code. Implementation of the plan is required for formal acceptance testing and production release. The software configuration management plan does not apply to reports and data generated by the software except where specifically identified. Control of information produced by the software once it has been transferred for operation is the responsibility of the receiving organization.

  5. Implementing corporate wellness programs: a business approach to program planning.

    PubMed

    Helmer, D C; Dunn, L M; Eaton, K; Macedonio, C; Lubritz, L

    1995-11-01

    1. Support of key decision makers is critical to the successful implementation of a corporate wellness program. Therefore, the program implementation plan must be communicated in a format and language readily understood by business people. 2. A business approach to corporate wellness program planning provides a standardized way to communicate the implementation plan. 3. A business approach incorporates the program planning components in a format that ranges from general to specific. This approach allows for flexibility and responsiveness to changes in program planning. 4. Components of the business approach are the executive summary, purpose, background, ground rules, approach, requirements, scope of work, schedule, and financials.

  6. DSP Implementation of the Retinex Image Enhancement Algorithm

    NASA Technical Reports Server (NTRS)

    Hines, Glenn; Rahman, Zia-Ur; Jobson, Daniel; Woodell, Glenn

    2004-01-01

    The Retinex is a general-purpose image enhancement algorithm that is used to produce good visual representations of scenes. It performs a non-linear spatial/spectral transform that synthesizes strong local contrast enhancement and color constancy. A real-time, video frame rate implementation of the Retinex is required to meet the needs of various potential users. Retinex processing contains a relatively large number of complex computations, thus to achieve real-time performance using current technologies requires specialized hardware and software. In this paper we discuss the design and development of a digital signal processor (DSP) implementation of the Retinex. The target processor is a Texas Instruments TMS320C6711 floating point DSP. NTSC video is captured using a dedicated frame-grabber card, Retinex processed, and displayed on a standard monitor. We discuss the optimizations used to achieve real-time performance of the Retinex and also describe our future plans on using alternative architectures.
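
    The core Retinex operation, the logarithm of the input minus the logarithm of a local surround average, can be illustrated on a 1-D signal. The box-filter surround and the kernel radius below are simplifying assumptions for illustration, not the published surround function or the DSP implementation:

    ```python
    # Illustrative single-scale Retinex on a 1-D signal: log of the input
    # minus log of a local average (box filter standing in for the surround).
    import math

    def box_average(signal, radius):
        """Sliding-window mean, a crude stand-in for the surround function."""
        out = []
        for i in range(len(signal)):
            lo, hi = max(0, i - radius), min(len(signal), i + radius + 1)
            out.append(sum(signal[lo:hi]) / (hi - lo))
        return out

    def single_scale_retinex(signal, radius=2):
        """R(x) = log I(x) - log (I * F)(x), the local-contrast operation."""
        surround = box_average(signal, radius)
        return [math.log(s) - math.log(v) for s, v in zip(signal, surround)]
    ```

    A flat region maps to zero output regardless of its brightness, which is the color-constancy/contrast-enhancement behavior the abstract describes; the real-time DSP version applies a multi-scale 2-D form of this per color channel.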

  7. Calculation of Lung Cancer Volume of Target Based on Thorax Computed Tomography Images using Active Contour Segmentation Method for Treatment Planning System

    NASA Astrophysics Data System (ADS)

    Patra Yosandha, Fiet; Adi, Kusworo; Edi Widodo, Catur

    2017-06-01

    In this research, the lung cancer target volume was calculated from computed tomography (CT) thorax images. The target volume was calculated for use in a radiotherapy treatment planning system. The target volume calculation comprises the gross tumor volume (GTV), clinical target volume (CTV), planning target volume (PTV) and organs at risk (OAR). Each target volume was computed by summing the target area on every slice and multiplying the result by the slice thickness. The areas were calculated with digital image processing techniques, using the active contour segmentation method for contouring to obtain the target volume. The calculated volumes are 577.2 cm³ for the GTV, 769.9 cm³ for the CTV, 877.8 cm³ for the PTV, 618.7 cm³ for OAR 1, 1,162 cm³ for OAR 2 right, and 1,597 cm³ for OAR 2 left. These values indicate that the image processing techniques developed can be implemented to calculate the lung cancer target volume from CT thorax images. This research is expected to help doctors and medical physicists determine and contour the target volume quickly and precisely.
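
    The slice-summation formula described above (volume = sum of per-slice segmented areas × slice thickness) can be sketched in a few lines. The areas and thickness used here are illustrative, not the study's data:

    ```python
    # Target volume from segmented CT slices: sum the per-slice areas,
    # then multiply by the slice thickness (illustrative numbers only).

    def target_volume(slice_areas_cm2, slice_thickness_cm):
        """Volume (cm^3) = (sum of per-slice areas) x slice thickness."""
        return sum(slice_areas_cm2) * slice_thickness_cm

    # Three hypothetical contoured slices, 0.5 cm apart.
    volume = target_volume([10.0, 12.5, 11.5], slice_thickness_cm=0.5)
    ```

    In the study, the per-slice areas come from active contour segmentation; any segmentation that yields an area per slice plugs into the same formula.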

  8. 40 CFR 49.10050 - Federally-promulgated regulations and Federal implementation plans.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...-Region X Implementation Plan for the Cow Creek Band of Umpqua Indians of Oregon § 49.10050 Federally... part of the implementation plan for the Reservation of the Cow Creek Band of Umpqua Indians: (a...

  9. Knowledge-based zonal grid generation for computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Andrews, Alison E.

    1988-01-01

    Automation of flow field zoning in two dimensions is an important step towards reducing the difficulty of three-dimensional grid generation in computational fluid dynamics. Using a knowledge-based approach makes sense, but problems arise which are caused by aspects of zoning involving perception, lack of expert consensus, and design processes. These obstacles are overcome by means of a simple shape and configuration language, a tunable zoning archetype, and a method of assembling plans from selected, predefined subplans. A demonstration system for knowledge-based two-dimensional flow field zoning has been successfully implemented and tested on representative aerodynamic configurations. The results show that this approach can produce flow field zonings that are acceptable to experts with differing evaluation criteria.

  10. Computer aided fixture design - A case based approach

    NASA Astrophysics Data System (ADS)

    Tanji, Shekhar; Raiker, Saiesh; Mathew, Arun Tom

    2017-11-01

    Automated fixture design plays an important role in process planning and in the integration of CAD and CAM. An automated fixture setup design system is developed in which, once the fixturing surfaces and points are described, modular fixture components are automatically selected to generate fixture units and are placed into position satisfying the assembly conditions. In the past, various knowledge-based systems have been developed to implement CAFD in practice. In this paper, to obtain an acceptable automated machining fixture design, a case-based reasoning method with a purpose-built retrieval system is proposed. The Visual Basic (VB) programming language is used, integrated with the SolidWorks API (Application Programming Interface) module, for a better retrieval procedure that reduces computational time. These properties are incorporated in a numerical simulation to determine the best fit for practical use.
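
    The case-retrieval step at the heart of case-based reasoning can be sketched as a nearest-neighbour lookup over stored fixture cases. The feature names and the toy case base below are invented for illustration and are not from the paper (which uses VB with the SolidWorks API):

    ```python
    # Sketch of case-based retrieval: score stored fixture-design cases by
    # how many query features they match, then return the best match.
    # Feature names and cases are hypothetical.

    def similarity(case_features, query_features):
        """Fraction of query features matched exactly by a stored case."""
        matches = sum(1 for k, v in query_features.items()
                      if case_features.get(k) == v)
        return matches / len(query_features)

    def retrieve(case_base, query_features):
        """Return the (name, features) pair of the most similar stored case."""
        return max(case_base.items(),
                   key=lambda item: similarity(item[1], query_features))

    case_base = {
        "caseA": {"shape": "prismatic", "material": "steel", "ops": "milling"},
        "caseB": {"shape": "cylindrical", "material": "steel", "ops": "turning"},
    }
    best = retrieve(case_base, {"shape": "prismatic", "ops": "milling"})
    ```

    A production CAFD system would then adapt the retrieved fixture setup to the new part rather than reuse it verbatim.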

  11. Improving Cognitive Abilities and e-Inclusion in Children with Cerebral Palsy

    NASA Astrophysics Data System (ADS)

    Martinengo, Chiara; Curatelli, Francesco

Besides overcoming the motor barriers to accessing computers and the Internet, ICT tools can provide very useful, and often necessary, support for the cognitive development of motor-impaired children with cerebral palsy. In fact, software tools for computation and communication allow teachers to put into effect, in a more complete and efficient way, the learning methods and educational plans designed for the child. In the present article, after a brief analysis of the general objectives to be pursued in supporting learning for children with cerebral palsy, we consider some specific difficulties in the logical-linguistic and logical-mathematical fields, and we show how they can be overcome using general ICT tools and specifically implemented software programs.

  12. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure

    NASA Astrophysics Data System (ADS)

    Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei

    2011-09-01

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. This work was presented in part at the 2010 Annual Meeting of the American Association of Physicists in Medicine (AAPM), Philadelphia, PA.
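The master/worker decomposition described above can be sketched with standard-library tools. This toy example is hypothetical, not the authors' EGS5 deployment: it splits a simple photon-attenuation Monte Carlo across local worker threads with distinct seeds and aggregates the partial tallies, mirroring the scatter/gather pattern of the cloud implementation (where workers would be remote nodes reached via MPI).

```python
import random
from concurrent.futures import ThreadPoolExecutor

def simulate_histories(task):
    """Worker task: transport n photons through a 5 cm water-like slab
    (attenuation mu = 0.2/cm) and count how many interact in the slab.
    Stands in for one worker node running the full MC package."""
    n, seed = task
    rng = random.Random(seed)
    mu, thickness = 0.2, 5.0
    return sum(1 for _ in range(n) if rng.expovariate(mu) < thickness)

def run_simulation(total_histories, n_workers):
    """Master role: split histories evenly across workers with distinct
    seeds, scatter the tasks, then gather and sum the partial tallies."""
    per_worker = total_histories // n_workers
    tasks = [(per_worker, seed) for seed in range(n_workers)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return sum(pool.map(simulate_histories, tasks))

if __name__ == "__main__":
    interacted = run_simulation(100_000, 4)
    # Analytic interaction probability: 1 - exp(-0.2 * 5) ≈ 0.632
    print(interacted / 100_000)
```

Because each worker owns an independent seeded generator, the aggregated result is independent of how the workload is partitioned, which is what makes the cloud result identical to the single-threaded one.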

  13. Planning for strategic change? A participative planning approach for community hospitals.

    PubMed

    MacDonald, S K; Beange, J E; Blachford, P C

    1992-01-01

    Strategic planning is becoming to hospitals what business case analysis is to private corporations. In fact, this type of planning is becoming essential for the professional management of Ontario hospitals. The participative strategic planning process at Toronto East General Hospital (TEGH) is an example of how a professionally structured and implemented strategic planning process can be successfully developed and implemented in a community hospital. In this article, the environmental factors driving planning are reviewed and the critical success factors for the development and implementation of a strategic plan are examined in the context of TEGH's experience.

  14. A Distributed Laboratory for Event-Driven Coastal Prediction and Hazard Planning

    NASA Astrophysics Data System (ADS)

    Bogden, P.; Allen, G.; MacLaren, J.; Creager, G. J.; Flournoy, L.; Sheng, Y. P.; Graber, H.; Graves, S.; Conover, H.; Luettich, R.; Perrie, W.; Ramakrishnan, L.; Reed, D. A.; Wang, H. V.

    2006-12-01

The 2005 Atlantic hurricane season was the most active in recorded history. Collectively, 2005 hurricanes caused more than 2,280 deaths and record damages of over 100 billion dollars. Of the storms that made landfall, Dennis, Emily, Katrina, Rita, and Wilma caused most of the destruction. Accurate predictions of storm-driven surge, wave height, and inundation can save lives and help keep recovery costs down, provided the information gets to emergency response managers in time. The information must be available well in advance of landfall so that responders can weigh the costs of unnecessary evacuation against the costs of inadequate preparation. The SURA Coastal Ocean Observing and Prediction (SCOOP) Program is a multi-institution collaboration implementing a modular, distributed service-oriented architecture for real time prediction and visualization of the impacts of extreme atmospheric events. The modular infrastructure enables real-time prediction of multi-scale, multi-model, dynamic, data-driven applications. SURA institutions are working together to create a virtual and distributed laboratory integrating coastal models, simulation data, and observations with computational resources and high speed networks. The loosely coupled architecture allows teams of computer and coastal scientists at multiple institutions to innovate complex system components that are interconnected with relatively stable interfaces. The operational system standardizes at the interface level to enable substantial innovation by complementary communities of coastal and computer scientists. This architectural philosophy solves a long-standing problem associated with the transition from research to operations. The SCOOP Program thereby implements a prototype laboratory consistent with the vision of a national, multi-agency initiative called the Integrated Ocean Observing System (IOOS). Several service-oriented components of the SCOOP enterprise architecture have already been designed and implemented, including data archive and transport services, metadata registry and retrieval (catalog), resource management, and portal interfaces. SCOOP partners are integrating these at the service level and implementing reconfigurable workflows for several kinds of user scenarios, and are working with resource providers to prototype new policies and technologies for on-demand computing.

  15. SU-E-T-628: A Cloud Computing Based Multi-Objective Optimization Method for Inverse Treatment Planning.

    PubMed

    Na, Y; Suh, T; Xing, L

    2012-06-01

Multi-objective (MO) plan optimization entails generation of an enormous number of IMRT or VMAT plans constituting the Pareto surface, which presents a computationally challenging task. The purpose of this work is to overcome the hurdle by developing an efficient MO method using the emerging cloud computing platform. As the backbone of cloud computing for optimizing inverse treatment planning, Amazon Elastic Compute Cloud with a master node (17.1 GB memory, 2 virtual cores, 420 GB instance storage, 64-bit platform) is used. The master node is able to seamlessly scale a number of working instances, called workers, based on user-defined settings that account for the MO functions in a clinical setting. Each worker solves the objective function with an efficient sparse decomposition method, and workers are automatically terminated when their tasks finish. The optimized plans are archived to the master node to generate the Pareto solution set. Three clinical cases have been planned using the developed MO IMRT and VMAT planning tools to demonstrate the advantages of the proposed method. The target dose coverage and critical structure sparing of plans obtained using the cloud computing platform are identical to those obtained using a desktop PC (Intel Xeon® CPU, 2.33 GHz, 8 GB memory). It is found that MO planning on the cloud substantially speeds up generation of the Pareto set for both types of plans. The speedup scales approximately linearly with the number of nodes used for computing: with N nodes, the computational time (in hours) follows the fitted model 0.2 + 2.3/N, with r^2 > 0.99 averaged over the cases, making real-time MO planning possible. A cloud computing infrastructure is developed for MO optimization. The algorithm substantially improves the speed of inverse plan optimization. The platform is valuable for both MO planning and future off-line or on-line adaptive re-planning. © 2012 American Association of Physicists in Medicine.
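The reported timing fit lends itself to a quick back-of-envelope calculation. The only input below is the abstract's fitted model, T(N) = 0.2 + 2.3/N hours; the function and variable names are this sketch's own. It shows how the speedup saturates as the fixed overhead term comes to dominate.

```python
def plan_time_hours(n_nodes, overhead=0.2, parallel_work=2.3):
    """Fitted timing model from the abstract: T(N) = 0.2 + 2.3/N hours.
    'overhead' is the non-parallelizable part; 'parallel_work' shrinks as 1/N."""
    return overhead + parallel_work / n_nodes

def speedup(n_nodes):
    """Speedup relative to a single node: T(1) / T(N)."""
    return plan_time_hours(1) / plan_time_hours(n_nodes)

for n in (1, 10, 100):
    print(n, round(plan_time_hours(n), 3), round(speedup(n), 2))
```

In Amdahl's-law terms, the asymptotic speedup is bounded by T(1)/overhead = 2.5/0.2 = 12.5, so the roughly 11× gain at 100 nodes is already near the limit of this model.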

  16. Automated microwave ablation therapy planning with single and multiple entry points

    NASA Astrophysics Data System (ADS)

    Liu, Sheena X.; Dalal, Sandeep; Kruecker, Jochen

    2012-02-01

    Microwave ablation (MWA) has become a recommended treatment modality for interventional cancer treatment. Compared with radiofrequency ablation (RFA), MWA provides more rapid and larger-volume tissue heating. It allows simultaneous ablation from different entry points and allows users to change the ablation size by controlling the power/time parameters. Ablation planning systems have been proposed in the past, mainly addressing the needs for RFA procedures. Thus a planning system addressing MWA-specific parameters and workflows is highly desirable to help physicians achieve better microwave ablation results. In this paper, we design and implement an automated MWA planning system that provides precise probe locations for complete coverage of tumor and margin. We model the thermal ablation lesion as an ellipsoidal object with three known radii varying with the duration of the ablation and the power supplied to the probe. The search for the best ablation coverage can be seen as an iterative optimization problem. The ablation centers are steered toward the location which minimizes both un-ablated tumor tissue and the collateral damage caused to the healthy tissue. We assess the performance of our algorithm using simulated lesions with known "ground truth" optimal coverage. The Mean Localization Error (MLE) between the computed ablation center in 3D and the ground truth ablation center achieves 1.75mm (Standard deviation of the mean (STD): 0.69mm). The Mean Radial Error (MRE) which is estimated by comparing the computed ablation radii with the ground truth radii reaches 0.64mm (STD: 0.43mm). These preliminary results demonstrate the accuracy and robustness of the described planning algorithm.
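The coverage objective described above can be made concrete. This sketch uses made-up geometry and is not the authors' optimizer: it scores a candidate ellipsoidal ablation against tumor and healthy voxels, counting un-ablated tumor and weighted collateral damage, which is the quantity the iterative search over probe placements would minimize.

```python
import itertools

def inside_ellipsoid(p, center, radii):
    """Axis-aligned ellipsoid membership test: sum((p_i - c_i)^2 / r_i^2) <= 1."""
    return sum((pi - ci) ** 2 / ri ** 2
               for pi, ci, ri in zip(p, center, radii)) <= 1.0

def coverage_cost(tumor_voxels, healthy_voxels, center, radii, w_healthy=0.5):
    """Cost = un-ablated tumor voxels + weighted collateral healthy voxels."""
    missed = sum(1 for v in tumor_voxels
                 if not inside_ellipsoid(v, center, radii))
    collateral = sum(1 for v in healthy_voxels
                     if inside_ellipsoid(v, center, radii))
    return missed + w_healthy * collateral

# Toy example: spherical tumor of radius 8 mm at the origin, 2 mm voxel grid.
grid = [v for v in itertools.product(range(-10, 11, 2), repeat=3)]
tumor = [v for v in grid if sum(c * c for c in v) <= 64]
healthy = [v for v in grid if sum(c * c for c in v) > 64]

# A well-centered 10 x 10 x 12 mm ablation covers the tumor completely,
# so the remaining cost is purely collateral damage.
print(coverage_cost(tumor, healthy, (0, 0, 0), (10, 10, 12)))
```

Steering the ablation center, as in the paper, would amount to minimizing `coverage_cost` over `center` (and over power/time, which sets `radii`); the margin requirement can be folded in by dilating the tumor voxel set.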

  17. 78 FR 34965 - Approval and Promulgation of Air Quality Implementation Plans; Ohio; Lima 1997 8-Hour Ozone...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-11

    ... Promulgation of Air Quality Implementation Plans; Ohio; Lima 1997 8-Hour Ozone Maintenance Plan Revision; Motor... 1997 8-hour ozone maintenance air quality State Implementation Plan (SIP) to replace the previously... Simulator (MOVES) emissions model. Ohio submitted the SIP revision request to EPA on January 11, 2013. DATES...

  18. 78 FR 34965 - Approval and Promulgation of Air Quality Implementation Plans; Ohio; 1997 8-Hour Ozone...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-11

    ... Promulgation of Air Quality Implementation Plans; Ohio; 1997 8-Hour Ozone Maintenance Plan Revision; Motor... request by Ohio to revise the 1997 8-hour ozone maintenance air quality State Implementation Plan (SIP...) emissions model. Ohio submitted the SIP revision request to EPA on December 7, 2012. DATES: Comments must be...

  19. 78 FR 34906 - Approval and Promulgation of Air Quality Implementation Plans; Ohio; Lima 1997 8-Hour Ozone...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-11

    ... Promulgation of Air Quality Implementation Plans; Ohio; Lima 1997 8-Hour Ozone Maintenance Plan Revision to... Lima, Ohio 1997 8-hour ozone maintenance air quality State Implementation Plan (SIP) to replace motor... (MOVES) emissions model. Ohio submitted the SIP revision request to EPA on January 11, 2013. DATES: This...

  20. ASC FY17 Implementation Plan, Rev. 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamilton, P. G.

The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions.
