Sample records for "save computation time"

  1. Computer multitasking with Desqview 386 in a family practice.

    PubMed Central

    Davis, A E

    1990-01-01

    Computers are now widely used in medical practice for accounting and secretarial tasks. However, it has been much more difficult to use computers in more physician-related activities of daily practice. I investigated the Desqview multitasking system on a 386 computer as a solution to this problem. Physician-directed tasks of management of patient charts, retrieval of reference information, word processing, appointment scheduling and office organization were each managed by separate programs. Desqview allowed instantaneous switching back and forth between the various programs. I compared the time and cost savings and the need for physician input between Desqview 386, a 386 computer alone and an older XT computer. Desqview significantly simplified the use of computer programs for medical information management and minimized the necessity for physician intervention. The time saved was 15 minutes per day; the costs saved were estimated to be $5000 annually. PMID:2383848

  2. Paperless Payroll: Implementation of a Paperless Payroll Certification.

    ERIC Educational Resources Information Center

    Reese, Larry D.

    1991-01-01

    The University of Florida has implemented an online payroll certification system that exemplifies how computer applications can result in higher quality information and provide real cost savings. In this case, the combined personnel savings exceeded 6.5 full-time-equivalent positions, more than twice the computing costs incurred. (MSE)

  3. Clinical impact and value of workstation single sign-on.

    PubMed

    Gellert, George A; Crouch, John F; Gibson, Lynn A; Conklin, George S; Webster, S Luke; Gillean, John A

    2017-05-01

    CHRISTUS Health began implementation of computer workstation single sign-on (SSO) in 2015. SSO technology utilizes a badge reader placed at each workstation where clinicians swipe or "tap" their identification badges. The objectives were to assess the impact of SSO implementation in reducing the time clinicians spend logging in to various clinical software programs, and the financial savings from migrating to a thin client that enabled replacement of traditional hard-drive computer workstations. Following implementation of SSO, a total of 65,202 logins were sampled systematically during a 7-day period among 2256 active clinical end users in 6 facilities, and time saved was compared to pre-implementation. Dollar values were assigned to the time saved by 3 groups of clinical end users: physicians, nurses and ancillary service providers. The reduction of total clinician login time over the 7-day period showed a net gain of 168.3 h per week of clinician time - 28.1 h (2.3 shifts) per facility per week. Annualized, 1461.2 h of mixed physician and nursing time is liberated per facility per annum (121.8 shifts of 12 h per year). The annual dollar cost savings of this reduction in login time is $92,146 per hospital per annum and $1,658,745 per annum in the first-phase implementation of 18 hospitals. Computer hardware equipment savings due to desktop virtualization increase annual savings to $2,333,745. Qualitative value contributions to clinician satisfaction, reduction in staff turnover, facilitation of adoption of EHR applications, and other benefits of SSO are discussed. SSO had a positive impact on clinician efficiency and productivity in the 6 hospitals evaluated, and is an effective and cost-effective method to liberate clinician time from repetitive and time-consuming logins to clinical software applications.
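
    A back-of-the-envelope check of the reported figures (a sketch only; the weekly hours and dollar values come from the abstract, while the 52-week annualization and rounding are assumptions):

```python
# Sanity check of the reported SSO savings; inputs are from the abstract,
# the annualization convention is assumed.
hours_per_week_total = 168.3   # net clinician hours saved per week, 6 facilities
facilities = 6

per_facility_week = hours_per_week_total / facilities
print(f"per facility per week: {per_facility_week:.2f} h")    # ~28.1 h (2.3 12-h shifts)

per_facility_year = per_facility_week * 52
# prints ~1458.6 h; the abstract's 1461.2 h annualizes the rounded 28.1 h/week
print(f"per facility per year: {per_facility_year:.1f} h")
print(f"12-h shifts per year:  {per_facility_year / 12:.1f}")

savings_per_hospital = 92_146
# prints $1,658,628, close to the reported $1,658,745
print(f"18 hospitals: ${savings_per_hospital * 18:,}")
```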

  4. Efficient Fourier-based algorithms for time-periodic unsteady problems

    NASA Astrophysics Data System (ADS)

    Gopinath, Arathi Kamath

    2007-12-01

    This dissertation work proposes two algorithms for the simulation of time-periodic unsteady problems via the solution of Unsteady Reynolds-Averaged Navier-Stokes (URANS) equations. These algorithms use a Fourier representation in time and hence solve for the periodic state directly without resolving transients (which consume most of the resources in a time-accurate scheme). In contrast to conventional Fourier-based techniques which solve the governing equations in frequency space, the new algorithms perform all the calculations in the time domain, and hence require minimal modifications to an existing solver. The complete space-time solution is obtained by iterating in a fifth pseudo-time dimension. Various time-periodic problems such as helicopter rotors, wind turbines, turbomachinery and flapping wings can be simulated using the Time Spectral method. The algorithm is first validated using pitching airfoil/wing test cases. The method is further extended to turbomachinery problems, and computational results are verified by comparison with a time-accurate calculation. The technique can be very memory-intensive for large problems, since the solution is computed (and hence stored) simultaneously at all time levels. Often, the blade counts of a turbomachine are rescaled such that a periodic fraction of the annulus can be solved. This approximation enables the solution to be obtained at a fraction of the cost of a full-scale time-accurate solution. For a viscous computation over a three-dimensional single-stage rescaled compressor, an order-of-magnitude savings is achieved. The second algorithm, the reduced-order Harmonic Balance method, is applicable only to turbomachinery flows, and offers even larger computational savings than the Time Spectral method. It simulates the true geometry of the turbomachine using only one blade passage per blade row as the computational domain. In each blade row of the turbomachine, only the dominant frequencies are resolved, namely, combinations of the neighboring rows' blade-passing frequencies. An appropriate set of frequencies can be chosen by the analyst/designer based on a trade-off between accuracy and the computational resources available. A cost comparison with a time-accurate computation for an Euler calculation on a two-dimensional multi-stage compressor obtained an order-of-magnitude savings, and a RANS calculation on a three-dimensional single-stage compressor achieved two orders of magnitude savings, with comparable accuracy.
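
    The core operator in a Time Spectral method is a periodic spectral differentiation matrix applied at N equispaced time instances. The sketch below is not the dissertation's code; it uses the standard odd-N cosecant formula from the spectral-methods literature and checks it against a known periodic function:

```python
import numpy as np

def time_spectral_D(N, T=2*np.pi):
    """Periodic spectral differentiation matrix for odd N: approximates
    d/dt at N equispaced time instances over one period T."""
    assert N % 2 == 1, "this formula assumes an odd number of time instances"
    h = 2*np.pi / N
    D = np.zeros((N, N))
    for j in range(N):
        for k in range(N):
            if j != k:
                # odd-N entries: 0.5 * (-1)^(j-k) * csc((j-k) h / 2)
                D[j, k] = 0.5 * (-1)**(j - k) / np.sin((j - k) * h / 2)
    return D * (2*np.pi / T)   # rescale from [0, 2*pi) to period T

# Check: the derivative of sin(t) should be cos(t) to machine precision.
N = 9
t = 2*np.pi * np.arange(N) / N
D = time_spectral_D(N)
print(np.max(np.abs(D @ np.sin(t) - np.cos(t))))   # ~1e-14
```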

  5. The Computer Revolution and Physical Chemistry.

    ERIC Educational Resources Information Center

    O'Brien, James F.

    1989-01-01

    Describes laboratory-oriented software programs that are short and time-saving, eliminate computational errors, and are not found in public domain courseware. Program availability for IBM and Apple microcomputers is included. (RT)

  6. Computer simulated building energy consumption for verification of energy conservation measures in network facilities

    NASA Technical Reports Server (NTRS)

    Plankey, B.

    1981-01-01

    A computer program called ECPVER (Energy Consumption Program - Verification) was developed to simulate all energy loads for any number of buildings. The program computes simulated daily, monthly, and yearly energy consumption which can be compared with actual meter readings for the same time period. Such comparison can lead to validation of the model under a variety of conditions, which allows it to be used to predict future energy saving due to energy conservation measures. Predicted energy saving can then be compared with actual saving to verify the effectiveness of those energy conservation changes. This verification procedure is planned to be an important advancement in the Deep Space Network Energy Project, which seeks to reduce energy cost and consumption at all DSN Deep Space Stations.

  7. 39 CFR 966.6 - Filing, docketing and serving documents; computation of time; representation of parties.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... are between 8:45 a.m. and 4:45 p.m., eastern standard or daylight saving time as appropriate during...; computation of time; representation of parties. 966.6 Section 966.6 Postal Service UNITED STATES POSTAL... time; representation of parties. (a) Filing. All documents required under this part must be filed by...

  8. 39 CFR 966.6 - Filing, docketing and serving documents; computation of time; representation of parties.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... are between 8:45 a.m. and 4:45 p.m., eastern standard or daylight saving time as appropriate during...; computation of time; representation of parties. 966.6 Section 966.6 Postal Service UNITED STATES POSTAL... time; representation of parties. (a) Filing. All documents required under this part must be filed by...

  9. Quantitative Microbial Risk Assessment Tutorial: Publishing a Microbial Density Time Series as a Txt File

    EPA Science Inventory

    A SARA Timeseries Utility supports analysis and management of time-varying environmental data including listing, graphing, computing statistics, computing meteorological data and saving in a WDM or text file. File formats supported include WDM, HSPF Binary (.hbn), USGS RDB, and T...

  10. Troubleshooting Computer Problems--a Teachers' Guide.

    ERIC Educational Resources Information Center

    Zeitz, Leigh

    1995-01-01

    Presents a troubleshooting flow chart for teachers and others to use when trying to figure out why their computers do not work correctly. Written mainly for Macintosh computers, the purpose of this guide is to save school technology coordinators time and to help educate teachers. (Author/LRW)

  11. 39 CFR 966.6 - Filing, docketing and serving documents; computation of time; representation of parties.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... business hours are between 8:15 a.m. and 4:45 p.m., eastern standard or daylight saving time as appropriate...; computation of time; representation of parties. 966.6 Section 966.6 Postal Service UNITED STATES POSTAL... time; representation of parties. (a) Filing. All documents required under this part must be filed by...

  12. 39 CFR 966.6 - Filing, docketing and serving documents; computation of time; representation of parties.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... business hours are between 8:15 a.m. and 4:45 p.m., eastern standard or daylight saving time as appropriate...; computation of time; representation of parties. 966.6 Section 966.6 Postal Service UNITED STATES POSTAL... time; representation of parties. (a) Filing. All documents required under this part must be filed by...

  13. 39 CFR 966.6 - Filing, docketing and serving documents; computation of time; representation of parties.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... business hours are between 8:15 a.m. and 4:45 p.m., eastern standard or daylight saving time as appropriate...; computation of time; representation of parties. 966.6 Section 966.6 Postal Service UNITED STATES POSTAL... time; representation of parties. (a) Filing. All documents required under this part must be filed by...

  14. Maximizing Energy Savings for Small Business (Text Version)

    Science.gov Websites

    [Fragmentary text version of a video. Recoverable content: small-business owners have a big opportunity to save money and energy while cutting greenhouse gas emissions, but often have neither the money nor the time to pursue it. The narration accompanies drawings of a computer screen showing NREL resources and of money, buildings, machinery, and furniture.]

  15. Energy Savings in Cellular Networks Based on Space-Time Structure of Traffic Loads

    NASA Astrophysics Data System (ADS)

    Sun, Jingbo; Wang, Yue; Yuan, Jian; Shan, Xiuming

    Since most of the energy consumed by the telecommunication infrastructure is due to the Base Transceiver Stations (BTSs), switching off BTSs when traffic load is low has been recognized as an effective way of saving energy. In this letter, an energy-saving scheme is proposed to minimize the number of active BTSs based on the space-time structure of traffic loads as determined by principal component analysis. Compared to existing methods, our approach models traffic loads more accurately and has a much smaller input size. As it is implemented in an off-line manner, our scheme also avoids excessive communication and computing overheads. Simulation results show that the proposed method has comparable performance in energy savings.
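
    A minimal illustration of the idea, not the letter's algorithm: approximate a BTS-by-hour traffic matrix by its leading principal components, then flag low-load BTS-hours as switch-off candidates. The data, component count, and threshold below are all invented:

```python
import numpy as np

# Synthetic BTS-by-hour traffic matrix: 50 cells, one week of hourly loads
# with a shared diurnal pattern plus noise.
rng = np.random.default_rng(0)
hours = np.arange(168)
daily = 0.5 + 0.5*np.sin(2*np.pi*(hours % 24)/24 - np.pi/2)
loads = np.outer(rng.uniform(0.3, 1.0, 50), daily)
loads += 0.05 * rng.standard_normal(loads.shape)

# Principal components via SVD of the mean-centered matrix.
mean = loads.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(loads - mean, full_matrices=False)
k = 3                                      # keep only the dominant structure
approx = mean + (U[:, :k] * s[:k]) @ Vt[:k]

off = approx < 0.2                         # low-load threshold (assumed)
print(f"BTS-hours that could sleep: {off.sum()} of {off.size}")
```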

  16. Measured energy savings and performance of power-managed personal computers and monitors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nordman, B.; Piette, M.A.; Kinney, K.

    1996-08-01

    Personal computers and monitors are estimated to use 14 billion kWh/year of electricity, with power management potentially saving $600 million/year by the year 2000. The effort to capture these savings is led by the US Environmental Protection Agency's Energy Star program, which specifies a 30 W maximum demand for the computer and for the monitor when in a "sleep" or idle mode. In this paper the authors discuss measured energy use and estimated savings for power-managed (Energy Star compliant) PCs and monitors. They collected electricity use measurements of six power-managed PCs and monitors in their office and five from two other research projects. The devices are diverse in machine type, use patterns, and context. The analysis method estimates the time spent in each system operating mode (off, low-, and full-power) and combines these with real power measurements to derive hours of use per mode, energy use, and energy savings. Three schedules are explored in the "As-operated," "Standardized," and "Maximum" savings estimates. Energy savings are established by comparing the measurements to a baseline with power management disabled. As-operated energy savings for the eleven PCs and monitors ranged from zero to 75 kWh/year. Under the standard operating schedule (on 20% of nights and weekends), the savings are about 200 kWh/year. An audit of power management features and configurations for several dozen Energy Star machines found only 11% of CPUs fully enabled, and about two thirds of monitors successfully power managed. The highest priority for greater power management savings is to enable monitors, as opposed to CPUs, since they are generally easier to configure, less likely to interfere with system operation, and have greater savings. The difficulty of properly configuring PCs and monitors is the largest current barrier to achieving the savings potential from power management.
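
    The analysis method reduces to mode accounting: hours per operating mode multiplied by measured power per mode, compared against a baseline with power management disabled. A sketch with placeholder numbers, not the study's measurements:

```python
# Mode-based energy estimate: hours/year in each mode x watts in that mode.
# All values below are illustrative stand-ins for measured data.
modes = {              # mode: (hours/year, watts) for a monitor, assumed
    "full": (2000, 70.0),
    "low":  (1000, 25.0),
    "off":  (5760,  1.0),
}
# With power management disabled, the low-power hours become full-power hours.
baseline_full_hours = 3000

managed = sum(h * w for h, w in modes.values()) / 1000.0            # kWh/yr
baseline = (baseline_full_hours * modes["full"][1]
            + modes["off"][0] * modes["off"][1]) / 1000.0           # kWh/yr
print(f"managed: {managed:.0f} kWh/yr, baseline: {baseline:.0f} kWh/yr, "
      f"savings: {baseline - managed:.0f} kWh/yr")
```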

  17. Computers in the Gym: Friends and Assistants.

    ERIC Educational Resources Information Center

    Hurwitz, Dick

    Designed to assist physical education teachers in realizing the benefits of microcomputer usage, this paper presents the case study of a hypothetical middle school teacher who utilizes Apple computers for record-keeping, planning, teaching, and coaching. The case study shows how the computers save time, assist in individualizing instruction, help…

  18. A SIEPON based transmitter sleep mode energy-efficient mechanism in EPON

    NASA Astrophysics Data System (ADS)

    Nikoukar, AliAkbar; Hwang, I.-Shyan; Wang, Chien-Jung; Ab-Rahman, Mohammad Syuhaimi; Liem, Andrew Tanny

    2015-06-01

    The main energy consumers in computer networks are the access networks. The passive optical network (PON) has the lowest energy consumption among access network technologies, and the time division multiplexing (TDM) Ethernet PON (EPON) is one of the best candidates for improving energy consumption through time utilization. The optical network unit (ONU) can utilize idle time and save energy in the EPON by turning off its transmitter/receiver when there is no upstream/downstream traffic. The ITU-T and IEEE organizations have published standards for energy saving in the TDM-PON. Although these standards provide the framework to accomplish energy saving, the algorithms and criteria for generating events under various operational policies, wake-up times, and timer parameter values are outside the standards' scope. Many studies have proposed schemes for the TDM-PON that achieve maximum energy saving; even so, these schemes increase the mean packet delay and consequently reduce the quality of service (QoS). In this paper, we first review the state of the art in PON energy saving. We then propose a mechanism based on the SIEPON standard in EPON, with new components in the ONUs and the optical line terminal (OLT), to save transmitter energy and guarantee QoS. The proposed mechanism follows the SIEPON standard, considers QoS first, and then saves as much energy as possible. The ONU sleep controller unit (OSC) and green dynamic bandwidth allocation (GDBA) are used to calculate the ONU transmitter (Tx) sleep duration and grant the proper time to the ONUs. Simulation results show that the proposed energy-saving mechanism not only preserves QoS performance in terms of mean packet delay, packet loss, throughput, and jitter, but also saves energy under different maximum cycle times.

  19. Orientation/Time Management Skill Training Lesson: Development and Evaluation

    DTIC Science & Technology

    1979-07-01

    instructional environment. This Orientation/Time Management lesson provides students with appropriate role models for increasing acceptance of their...time savings can be obtained by a combination of this type of orientation and time management skill training with a computer-based progress targeting

  20. A computer-managed undergraduate physics laboratory

    NASA Astrophysics Data System (ADS)

    Kalman, C. S.

    1987-01-01

    Seventeen one-semester undergraduate laboratory courses are managed by a microcomputer system at Concordia University. Students may perform experiments at any time during operating hours. The computer administers pre- and post-tests. Considerable savings in manpower costs are achieved. The system also provides many pedagogical advantages.

  1. Computers Help Technicians Become Managers.

    ERIC Educational Resources Information Center

    Instructional Innovator, 1984

    1984-01-01

    Briefly describes the Academy of Advanced Traffic's use of the Numerax electronic tariff library in financial management, business logistics management, and warehousing courses to familiarize future traffic managers with time-saving computer-based information systems that will free them to become integral members of their company's decision-making…

  2. Pan Am gets big savings at no cost

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tanz, D.

    Pan American World Airways' contract with an energy management control systems distributor enabled the company's terminal and maintenance facilities at JFK airport in New York to shift from housekeeping to major savings without additional cost. Energy savings from a pneumatic control system were split almost equally between Pan Am and Thomas S. Brown Associates (TSBA) Inc., and further savings are expected from a planned computer-controlled system. A full-time energy manager, able to give top priority to energy-consumption problems, was considered crucial to the program's success. Early efforts in light-level reduction and equipment scheduling required extensive persuasion and policing, but successful energy savings allowed the manager to progress to the more-extensive plants with TSBA.

  3. Far-field radiation patterns of aperture antennas by the Winograd Fourier transform algorithm

    NASA Technical Reports Server (NTRS)

    Heisler, R.

    1978-01-01

    A more time-efficient algorithm for computing the discrete Fourier transform, the Winograd Fourier transform (WFT), is described. The WFT algorithm is compared with other transform algorithms. Results indicate that the WFT algorithm is a very successful application in antenna analysis. Significant savings in CPU time will improve computer turnaround time and circumvent the need to resort to weekend runs.

  4. Comparability of Computer Delivered versus Traditional Paper and Pencil Testing

    ERIC Educational Resources Information Center

    Strader, Douglas A.

    2012-01-01

    There are many advantages supporting the use of computers as an alternate mode of delivery for high stakes testing: cost savings, increased test security, flexibility in test administrations, innovations in items, and reduced scoring time. The purpose of this study was to determine if the use of computers as the mode of delivery had any…

  5. 31 CFR 351.66 - What book-entry Series EE savings bonds are included in the computation?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false What book-entry Series EE savings... DEBT OFFERING OF UNITED STATES SAVINGS BONDS, SERIES EE Book-Entry Series EE Savings Bonds § 351.66 What book-entry Series EE savings bonds are included in the computation? (a) We include all bonds that...

  6. 31 CFR 351.66 - What book-entry Series EE savings bonds are included in the computation?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 31 Money and Finance:Treasury 2 2011-07-01 2011-07-01 false What book-entry Series EE savings... DEBT OFFERING OF UNITED STATES SAVINGS BONDS, SERIES EE Book-Entry Series EE Savings Bonds § 351.66 What book-entry Series EE savings bonds are included in the computation? (a) We include all bonds that...

  7. Dynamic Voltage Frequency Scaling Simulator for Real Workflows Energy-Aware Management in Green Cloud Computing

    PubMed Central

    Cotes-Ruiz, Iván Tomás; Prado, Rocío P.; García-Galán, Sebastián; Muñoz-Expósito, José Enrique; Ruiz-Reyes, Nicolás

    2017-01-01

    Nowadays, the growing computational capabilities of Cloud systems rely on reducing the power consumed by their data centers to make them sustainable and economically profitable. The efficient management of computing resources is at the heart of any energy-aware data center, and of special relevance is the adaptation of its performance to the workload. Intensive computing applications in diverse areas of science generate complex workloads, called workflows, whose successful management in terms of energy saving is still in its early stages. WorkflowSim is currently one of the most advanced simulators for research on workflow processing, offering advanced features such as task clustering and failure policies. In this work, a power-aware extension of WorkflowSim is presented. This new tool integrates a power model based on a computing-plus-communication design to allow the optimization of new energy-saving management strategies that consider computing, reconfiguration and network costs as well as quality of service, and it incorporates the preeminent strategy for on-host energy saving: Dynamic Voltage Frequency Scaling (DVFS). The simulator is designed to be consistent across different real scenarios and to include a wide repertoire of DVFS governors. Results showing the validity of the simulator in terms of resource utilization, frequency and voltage scaling, and power, energy and time savings are presented. Also, results achieved by the intra-host DVFS strategy with different governors are compared to those of a data center using a recent and successful DVFS-based inter-host scheduling strategy overlapped with the intra-host DVFS technique. PMID:28085932
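
    For intuition about why DVFS saves energy on a fixed workload, a toy model is sketched below, assuming dynamic power scales as C·V²·f. The frequency/voltage pairs and capacitance are invented, not values from the WorkflowSim extension:

```python
# Toy DVFS energy model: finishing a fixed cycle count at a lower
# frequency/voltage pair lengthens the runtime but can cut total energy,
# since dynamic power falls roughly as V^2 * f. All numbers are assumed.
states = [            # (frequency in GHz, voltage in V)
    (2.0, 1.10),
    (1.5, 0.95),
    (1.0, 0.80),
]
cycles = 3.0e9        # workload in CPU cycles, assumed
C = 1.0e-9            # effective switched capacitance, assumed

for f_ghz, v in states:
    f = f_ghz * 1e9
    p_dyn = C * v**2 * f            # dynamic power, watts
    t = cycles / f                  # runtime, seconds
    print(f"{f_ghz} GHz @ {v} V: {p_dyn:4.2f} W x {t:4.2f} s = {p_dyn*t:4.2f} J")
```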

  8. Dynamic Voltage Frequency Scaling Simulator for Real Workflows Energy-Aware Management in Green Cloud Computing.

    PubMed

    Cotes-Ruiz, Iván Tomás; Prado, Rocío P; García-Galán, Sebastián; Muñoz-Expósito, José Enrique; Ruiz-Reyes, Nicolás

    2017-01-01

    Nowadays, the growing computational capabilities of Cloud systems rely on reducing the power consumed by their data centers to make them sustainable and economically profitable. The efficient management of computing resources is at the heart of any energy-aware data center, and of special relevance is the adaptation of its performance to the workload. Intensive computing applications in diverse areas of science generate complex workloads, called workflows, whose successful management in terms of energy saving is still in its early stages. WorkflowSim is currently one of the most advanced simulators for research on workflow processing, offering advanced features such as task clustering and failure policies. In this work, a power-aware extension of WorkflowSim is presented. This new tool integrates a power model based on a computing-plus-communication design to allow the optimization of new energy-saving management strategies that consider computing, reconfiguration and network costs as well as quality of service, and it incorporates the preeminent strategy for on-host energy saving: Dynamic Voltage Frequency Scaling (DVFS). The simulator is designed to be consistent across different real scenarios and to include a wide repertoire of DVFS governors. Results showing the validity of the simulator in terms of resource utilization, frequency and voltage scaling, and power, energy and time savings are presented. Also, results achieved by the intra-host DVFS strategy with different governors are compared to those of a data center using a recent and successful DVFS-based inter-host scheduling strategy overlapped with the intra-host DVFS technique.

  9. 31 CFR 359.51 - What book-entry Series I savings bonds are included in the computation?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 31 Money and Finance:Treasury 2 2011-07-01 2011-07-01 false What book-entry Series I savings bonds... DEBT OFFERING OF UNITED STATES SAVINGS BONDS, SERIES I Book-Entry Series I Savings Bonds § 359.51 What book-entry Series I savings bonds are included in the computation? (a) We include all bonds that you...

  10. Short-term versus long-term rainfall time series in the assessment of potable water savings by using rainwater in houses.

    PubMed

    Ghisi, Enedir; Cardoso, Karla Albino; Rupp, Ricardo Forgiarini

    2012-06-15

    The main objective of this article is to assess the possibility of using short-term instead of long-term rainfall time series to evaluate the potential for potable water savings by using rainwater in houses. The analysis was performed considering rainfall data from 1960 to 1995 for the city of Santa Bárbara do Oeste, located in the state of São Paulo, southeastern Brazil. The influence of the rainfall time series, roof area, potable water demand and percentage rainwater demand on the potential for potable water savings was evaluated. The potential for potable water savings was estimated using computer simulations considering a set of long-term rainfall time series and different sets of short-term rainfall time series. The ideal rainwater tank capacity was also assessed for some cases. It was observed that the higher the percentage rainwater demand and the shorter the rainfall time series, the larger the difference between the potential for potable water savings and the greater the variation in the ideal rainwater tank size. The sets of short-term rainfall time series considered adequate for different scenarios ranged from 1 to 13 years depending on the roof area, percentage rainwater demand and potable water demand. The main finding of the research is that sets of short-term rainfall time series can be used to assess the potential for potable water savings by using rainwater, as the results obtained are similar to those obtained from the long-term rainfall time series.
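
    Simulations of this kind are typically daily water-balance (behavioural) models of a rainwater tank. A minimal sketch under assumed parameters; the roof area, runoff coefficient, tank size, and demand are placeholders, and the rainfall series is synthetic rather than the Santa Bárbara do Oeste record:

```python
import random

# Yield-after-spillage daily water balance for a rainwater tank.
random.seed(1)
rain_mm = [random.expovariate(1/4) if random.random() < 0.3 else 0.0
           for _ in range(365)]                 # synthetic daily rainfall, mm

roof_m2, runoff_coeff = 100.0, 0.8              # assumed
tank_l, demand_l = 5000.0, 200.0                # tank size, daily rainwater demand

storage, supplied, total_demand = 0.0, 0.0, 0.0
for mm in rain_mm:
    inflow = mm * roof_m2 * runoff_coeff        # 1 mm on 1 m^2 = 1 litre
    storage = min(tank_l, storage + inflow)     # overflow spills above capacity
    use = min(storage, demand_l)                # draw down to meet demand
    storage -= use
    supplied += use
    total_demand += demand_l

print(f"rainwater met {100*supplied/total_demand:.1f}% of the rainwater demand")
```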

  11. Deciding when It's Time to Buy a New PC

    ERIC Educational Resources Information Center

    Goldsborough, Reid

    2004-01-01

    How to best decide when it's time to replace your PC, whether at home or at work, is always tricky. Spending on computers can make you more productive, but it's money you otherwise cannot spend, invest or save, and faster systems always await you in the future. What is clear is that the computer industry really wants you to buy, and the computer…

  12. Teach Efficient Production with Modular Fixturing Pallets

    ERIC Educational Resources Information Center

    Creger, Don W.; Payne, Brent A.

    2010-01-01

    Advances in technology have yielded computer numerical control (CNC) machines and computer-aided manufacturing (CAM) software that saves time and increases productivity in today's industrial world. Training students to understand and use these technologies has become a key ingredient in preparing them for work in industry. Teachers of machining…

  13. Benefits Analysis of Multi-Center Dynamic Weather Routes

    NASA Technical Reports Server (NTRS)

    Sheth, Kapil; McNally, David; Morando, Alexander; Clymer, Alexis; Lock, Jennifer; Petersen, Julien

    2014-01-01

    Dynamic weather routes are flight plan corrections that can provide airborne flights more than a user-specified number of minutes of flying-time savings compared to their current flight plans. These routes are computed from the aircraft's current location to a flight plan fix downstream (within a predefined limit region), while avoiding forecasted convective weather regions. The Dynamic Weather Routes automation has been continuously running with live air traffic data for a field evaluation at the American Airlines Integrated Operations Center in Fort Worth, TX since July 31, 2012, where flights within the Fort Worth Air Route Traffic Control Center are evaluated for time savings. This paper extends the methodology to all Centers in the United States and presents a benefits analysis of Dynamic Weather Routes automation as if it were implemented in multiple airspace Centers individually and concurrently. The current computation of dynamic weather routes requires a limit rectangle so that a downstream capture fix can be selected, preventing very large route changes spanning several Centers. In this paper, first, a method of computing a limit polygon (as opposed to the rectangle used for Fort Worth Center) is described for each of the 20 Centers in the National Airspace System. The Future ATM Concepts Evaluation Tool, a nationwide simulation and analysis tool, is used for this purpose. After a comparison of results with the Center-based Dynamic Weather Routes automation in Fort Worth Center, results are presented for 11 Centers in the contiguous United States. These Centers are generally the most impacted by convective weather. A breakdown of individual Center and airline savings is presented, and the results indicate an overall average savings of about 10 minutes of flying time per flight.

  14. 17 CFR 260.7a-4 - Calculation of time.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... eastern daylight-saving time, whichever is in effect at the principal office of the Commission on the date... 17 Commodity and Securities Exchanges 3 2010-04-01 2010-04-01 false Calculation of time. 260.7a-4... time. Saturdays, Sundays and holidays shall be counted in computing the effective date of applications...

  15. 17 CFR 260.7a-4 - Calculation of time.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... eastern daylight-saving time, whichever is in effect at the principal office of the Commission on the date... 17 Commodity and Securities Exchanges 4 2014-04-01 2014-04-01 false Calculation of time. 260.7a-4... time. Saturdays, Sundays and holidays shall be counted in computing the effective date of applications...

  16. 17 CFR 260.7a-4 - Calculation of time.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... eastern daylight-saving time, whichever is in effect at the principal office of the Commission on the date... 17 Commodity and Securities Exchanges 3 2013-04-01 2013-04-01 false Calculation of time. 260.7a-4... time. Saturdays, Sundays and holidays shall be counted in computing the effective date of applications...

  17. 17 CFR 260.7a-4 - Calculation of time.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... eastern daylight-saving time, whichever is in effect at the principal office of the Commission on the date... 17 Commodity and Securities Exchanges 3 2011-04-01 2011-04-01 false Calculation of time. 260.7a-4... time. Saturdays, Sundays and holidays shall be counted in computing the effective date of applications...

  18. 17 CFR 260.7a-4 - Calculation of time.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... eastern daylight-saving time, whichever is in effect at the principal office of the Commission on the date... 17 Commodity and Securities Exchanges 3 2012-04-01 2012-04-01 false Calculation of time. 260.7a-4... time. Saturdays, Sundays and holidays shall be counted in computing the effective date of applications...

  19. Computed Tomography For Internal Inspection Of Castings

    NASA Technical Reports Server (NTRS)

    Hanna, Timothy L.

    1995-01-01

    Computed tomography is used to detect internal flaws in metal castings before machining and otherwise processing them into finished parts. It saves time and money otherwise wasted on machining and other processing of castings eventually rejected because of internal defects. Knowledge of internal defects gained by use of computed tomography also provides guidance for changes in foundry techniques, procedures, and equipment to minimize defects and reduce costs.

  20. Don't Gamble with Y2K Compliance.

    ERIC Educational Resources Information Center

    Sturgeon, Julie

    1999-01-01

    Examines one school district's (Clark County, Nevada) response to the Y2K computer problem and provides tips on time-saving Y2K preventive measures other school districts can use. Explains how the district de-bugged its computer system including mainframe considerations and client-server applications. Highlights office equipment and teaching…

  1. An economic prefeasibility study of geothermal energy development at Platanares, Honduras. Estudio economico de prefactibilidad del desarrollo de energia geotermica en Platanares, Honduras

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trocki, L.K.

    1989-09-01

    The expected economic benefits from development of a geothermal power plant at Platanares in the Department of Copan, Honduras are evaluated in this report. The economic benefits of geothermal plants ranging in size from a 10-MW plant in the shallow reservoir to a 20-, 30-, 55-, or 110-MW plant in the assumed deeper reservoir were measured by computing optimal expansion plans for each size of geothermal plant. Savings are computed as the difference in present value cost between a plan that contains no geothermal plant and one that does. Present value savings in millions of 1987 dollars range from $25 million for the 10-MW plant to $110 million for the 110-MW plant -- savings of 6% to 25% over the time period 1988 through 2008. 8 refs., 9 figs., 6 tabs.
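
    The savings measure is a standard present-value difference between expansion plans. A sketch with invented cost streams (the report's actual plans are not reproduced here):

```python
# Present-value savings: discounted cost of the plan without the geothermal
# plant minus discounted cost of the plan with it. Cost streams are invented.
def present_value(costs, rate):
    """Discount an annual cost stream (index 0 = base year) at a fixed rate."""
    return sum(c / (1 + rate)**t for t, c in enumerate(costs))

rate = 0.04                                   # real discount rate, assumed
plan_without = [40.0] * 21                    # annual system cost 1988-2008, $M
plan_with    = [42.0] * 3 + [35.0] * 18       # build first, then cheaper power

savings = present_value(plan_without, rate) - present_value(plan_with, rate)
print(f"present-value savings: ${savings:.0f}M")
```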

  2. Efficient Memory Access with NumPy Global Arrays using Local Memory Access

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daily, Jeffrey A.; Berghofer, Dan C.

    This paper discusses work on improving the performance of Global Arrays of data on distributed multi-computer systems. The work was completed at Pacific Northwest National Laboratory in the Science Undergraduate Laboratory Internship program in the summer of 2013, for the Data Intensive Computing Group in the Fundamental and Computational Sciences Directorate, on the Global Arrays Toolkit developed by that group. This toolkit is an interface that lets programmers more easily create arrays of data on networks of computers. This is useful because scientific computation is often done on amounts of data so large that individual computers cannot hold all of it. This data is held in array form and is best processed on supercomputers, which often consist of a network of individual computers doing their computation in parallel. One major challenge for this sort of programming is that operations on arrays spread across multiple computers are very complex, so an interface is needed to make these arrays seem as if they were on a single computer. This is what Global Arrays does. The work described here uses more efficient operations that require less copying of data, which saves substantial time because copying data across many different computers is time-intensive. The approach: when the operands of a binary operation are on the same computer, neither is copied when accessed; when they are on separate computers, only one set is copied. This saves time through reduced copying, although more data-access operations are performed.

  3. Will the Measurement Robots Take Our Jobs? An Update on the State of Automated M&V for Energy Efficiency Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Granderson, Jessica; Touzani, Samir; Taylor, Cody

    Trustworthy savings calculations are critical to convincing regulators of both the cost-effectiveness of energy efficiency program investments and their ability to defer supply-side capital investments. Today’s methods for measurement and verification (M&V) of energy savings constitute a significant portion of the total costs of energy efficiency programs. They also require time-consuming data acquisition. A spectrum of savings calculation approaches is used, with some relying more heavily on measured data and others relying more heavily on estimated, modeled, or stipulated data. The rising availability of “smart” meters and devices that report near-real-time data, combined with new analytical approaches to quantifying savings, offers potential to conduct M&V more quickly and at lower cost, with comparable or improved accuracy. Commercial energy management and information systems (EMIS) technologies are beginning to offer M&V capabilities, and program administrators want to understand how they might assist programs in quickly and accurately measuring energy savings. This paper presents the results of recent testing of the ability to use automation to streamline some parts of M&V. We detail metrics to assess the performance of these new M&V approaches, and a framework to compute the metrics. We also discuss the accuracy, cost, and time trade-offs between more traditional M&V and these emerging streamlined methods that use high-resolution energy data and automated computational intelligence. Finally, we discuss the potential evolution of M&V and early results of pilots currently underway to incorporate M&V automation into ratepayer-funded programs and professional implementation and evaluation practice.

  4. Nonlinear Analysis of a Bolted Marine Riser Connector Using NASTRAN Substructuring

    NASA Technical Reports Server (NTRS)

    Fox, G. L.

    1984-01-01

    Results of an investigation of the behavior of a bolted, flange-type marine riser connector are reported. The method used to account for the nonlinear effect of connector separation due to bolt preload and axial tension load is described. The automated multilevel substructuring capability of COSMIC/NASTRAN was employed at considerable savings in computer run time. Simplified formulas for computer resources, i.e., computer run times for modules SDCOMP, FBS, and MPYAD, as well as disk storage space, are presented. Actual run time data on a VAX-11/780 is compared with the formulas presented.

  5. Automated Ordering System.

    ERIC Educational Resources Information Center

    Jones, Richard M.

    1981-01-01

    A computer program that utilizes an optical scanning machine is used for ordering supplies in a Louisiana school system. The program provides savings in time and labor, more accurate data, and easy-to-use reports. (Author/MLF)

  6. Marine Jet

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The marine turbine pump pictured is the Jacuzzi 12YJ, a jet propulsion system for pleasure or commercial boating. Its development was aided by a NASA computer program made available by the Computer Software Management and Information Center (COSMIC) at the University of Georgia. The manufacturer, Jacuzzi Brothers, Incorporated, Little Rock, Arkansas, used COSMIC's Computer Program for Predicting Turbopump Inducer Loading, which enabled substantial savings in development time and money through reduction of repetitive testing.

  7. A Framework to Improve Energy Efficient Behaviour at Home through Activity and Context Monitoring

    PubMed Central

    García, Óscar; Alonso, Ricardo S.; Corchado, Juan M.

    2017-01-01

    Real-time localization systems have been postulated as one of the most appropriate technologies for the development of applications that provide customized services. These systems provide the ability to locate and trace users and, among other features, help identify behavioural patterns and habits. Moreover, the implementation of policies that foster energy saving in homes is a complex task that involves the use of this type of system. Although there are multiple proposals in this area, frameworks that combine technologies and use Social Computing to influence user behaviour have not yet achieved significant savings in terms of energy. In this work, the CAFCLA framework (Context-Aware Framework for Collaborative Learning Applications) is used to develop a recommendation system for home users. The proposed system integrates a real-time localization system and wireless sensor networks, making it possible to develop applications that work under the umbrella of Social Computing. The implementation of an experimental use case aided efficient energy use, achieving savings of 17%. Moreover, the case study pointed to the possibility of attaining good energy consumption habits in the long term, thanks to the system’s real-time and historical localization, tracking and contextual data, based on which customized recommendations are generated. PMID:28758987

  8. Improved Surgery Planning Using 3-D Printing: a Case Study.

    PubMed

    Singhal, A J; Shetty, V; Bhagavan, K R; Ragothaman, Ananthan; Shetty, V; Koneru, Ganesh; Agarwala, M

    2016-04-01

    The role of 3-D printing in improved patient-specific surgery planning is presented. The key benefits are time saved and improved surgical outcomes. Two hard-tissue surgery models were 3-D printed, for orthopedic pelvic surgery and for craniofacial surgery. We discuss the conversion of computed tomography (CT)/magnetic resonance (MR) medical image data for 3-D printing. 3-D printed models save time in surgery planning and help visualize complex pre-operative anatomy. The time saved in surgery planning can be as much as two thirds. In addition to improved surgical accuracy, 3-D printing presents opportunities in materials research. Other hard-tissue and soft-tissue cases in maxillofacial, abdominal, thoracic, cardiac, orthodontic, and neurosurgical applications are considered. We recommend using 3-D printing as a standard protocol for surgery planning and for teaching surgical practice. A quick turnaround time for a 3-D printed surgery model, together with improved accuracy in surgery planning, is helpful for the surgery team. It is recommended that these costs be kept within 20% of the total surgery budget.

  9. Improved Algorithms Speed It Up for Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hazi, A

    2005-09-20

    Huge computers, huge codes, complex problems to solve. The longer it takes to run a code, the more it costs. One way to speed things up and save time and money is through hardware improvements--faster processors, different system designs, bigger computers. But another side of supercomputing can reap savings in time and speed: software improvements to make codes--particularly the mathematical algorithms that form them--run faster and more efficiently. Speed up math? Is that really possible? According to Livermore physicist Eugene Brooks, the answer is a resounding yes. "Sure, you get great speed-ups by improving hardware," says Brooks, the deputy leader for Computational Physics in N Division, which is part of Livermore's Physics and Advanced Technologies (PAT) Directorate. "But the real bonus comes on the software side, where improvements in software can lead to orders of magnitude improvement in run times." Brooks knows whereof he speaks. Working with Laboratory physicist Abraham Szoeke and others, he has been instrumental in devising ways to shrink the running time of what has, historically, been a tough computational nut to crack: radiation transport codes based on the statistical or Monte Carlo method of calculation. And Brooks is not the only one. Others around the Laboratory, including physicists Andrew Williamson, Randolph Hood, and Jeff Grossman, have come up with innovative ways to speed up Monte Carlo calculations using pure mathematics.

  10. Pressure Analysis

    NASA Technical Reports Server (NTRS)

    1990-01-01

    FluiDyne Engineering Corporation, Minneapolis, MN is one of the world's leading companies in design and construction of wind tunnels. In its designing work, FluiDyne uses a computer program called GTRAN. With GTRAN, engineers create a design and test its performance on the computer before actually building a model; should the design fail to meet criteria, the system or any component part can be redesigned and retested on the computer, saving a great deal of time and money.

  11. Program Helps Generate And Manage Graphics

    NASA Technical Reports Server (NTRS)

    Truong, L. V.

    1994-01-01

    Living Color Frame Maker (LCFM) is a computer program that generates computer-graphics frames. Graphical frames are saved as text files in a readable, disclosed format, easily retrieved and manipulated by user programs for a wide range of real-time visual-information applications. LCFM is implemented in a frame-based expert system for visual aids in the management of systems (monitoring, diagnosis, and/or control). Diagrams of circuits or systems are brought to "life" by use of designated video colors and intensities to symbolize the status of hardware components (via real-time feedback from sensors). The status of systems can be displayed. Written in C++ using the Borland C++ 2.0 compiler for IBM PC-series computers and compatible computers running MS-DOS.

  12. Using Testbanking To Implement Classroom Management/Extension through the Use of Computers.

    ERIC Educational Resources Information Center

    Thommen, John D.

    Testbanking provides teachers with an effective, low-cost, time-saving opportunity to improve the testing aspect of their classes. Testbanking, which involves the use of a testbank program and a computer, allows teachers to develop and generate tests and test-forms with a minimum of effort. Teachers who test using true and false, multiple choice,…

  13. Computer Health Score

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The algorithm develops a single health score for office computers, today just Windows, but we plan to extend this to Apple computers. The score is derived from various parameters, including: CPU utilization; memory utilization; various error logs; disk problems; and disk write queue length. It then uses a weighting scheme to balance these parameters and provide an overall health score. By using these parameters, we are not just assessing the theoretical performance of the components of the computer; rather, we are using actual performance metrics selected to be a more realistic representation of the experience of the person using the computer. This includes compensating for the nature of their use. If there are two identical computers and the user of one places heavy demands on their computer compared with the user of the second computer, the former will have a lower health score. This allows us to provide a 'fit for purpose' score tailored to the assigned user. This is very helpful data to inform the managers when individual computers need to be replaced. Additionally, it provides specific information that can facilitate fixing the computer, to extend its useful lifetime. This presents direct financial savings, time savings for users transferring from one computer to the next, and better environmental stewardship.
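
    A plausible shape for such a weighting scheme, sketched with invented weights and pre-normalized metrics (the actual parameter set and weights are not given in the record):

```python
# Weighted health score sketch: each metric is assumed to be normalized
# so that 0 = worst and 1 = best before weighting. Values are stand-ins.
def health_score(metrics, weights):
    total = sum(weights.values())
    return 100 * sum(weights[k] * metrics[k] for k in weights) / total

weights = {"cpu": 0.25, "memory": 0.25, "errors": 0.20,
           "disk": 0.15, "write_queue": 0.15}
metrics = {"cpu": 0.6, "memory": 0.4, "errors": 0.9,
           "disk": 0.8, "write_queue": 0.7}
print(f"health score: {health_score(metrics, weights):.0f}/100")
```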

  14. A cost/benefit analysis of commercial fusion-fission hybrid reactor development

    NASA Astrophysics Data System (ADS)

    Kostoff, Ronald N.

    1983-04-01

    A simple algorithm was developed that allows rapid computation of the ratio, R, of the present worth of benefits to the present worth of hybrid R&D program costs as a function of potential hybrid unit electricity cost savings, discount rate, electricity demand growth rate, total hybrid R&D program cost, and time to complete a demonstration reactor. In the sensitivity study, these variables were assigned nominal values (unit electricity cost savings of 4 mills/kW-hr, discount rate of 4%/year, growth rate of 2.25%/year, total R&D program cost of $20 billion, and time to complete a demonstration reactor of 30 years), and the variable of interest was varied about its nominal value. Results show that R increases with decreasing discount rate and increasing unit electricity savings, ranging from 4 to 94 as the discount rate ranges from 5 to 3%/year and unit electricity savings range from 2 to 6 mills/kW-hr. R increases with increasing growth rate, ranging from 3 to 187 as the growth rate ranges from 1 to 3.5%/year and unit electricity cost savings range from 2 to 6 mills/kW-hr. R attains a maximum value when plotted against time to complete a demonstration reactor. This maximum occurs at shorter completion times as the discount rate increases; the optimal completion time ranges from 20 years for a discount rate of 4%/year to 45 years for a discount rate of 3%/year.
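
    One plausible reading of the benefit/cost computation is sketched below; the discounting structure, demand base, and horizon are assumptions for illustration, not the paper's algorithm:

```python
# Sketch: discounted benefits are the unit electricity savings applied to
# demand that grows geometrically, starting once the demo reactor is done,
# divided by discounted R&D spend. Base demand and horizon are assumed.
def benefit_cost_ratio(savings_mills_per_kwh, discount, growth,
                       rd_cost_billions, years_to_demo,
                       base_twh=2500.0, horizon=100):
    benefits = 0.0
    for t in range(years_to_demo, horizon):
        demand_kwh = base_twh * 1e9 * (1 + growth)**t      # TWh -> kWh
        benefits += (savings_mills_per_kwh / 1000) * demand_kwh / (1 + discount)**t
    rd = sum(rd_cost_billions * 1e9 / years_to_demo / (1 + discount)**t
             for t in range(years_to_demo))                # spend spread evenly
    return benefits / rd

# Nominal case from the abstract: 4 mills/kW-hr, 4%/yr, 2.25%/yr, $20B, 30 yr.
print(f"R = {benefit_cost_ratio(4.0, 0.04, 0.0225, 20.0, 30):.1f}")
```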

  15. Practical use of a word processor in a histopathology laboratory.

    PubMed Central

    Briggs, J C; Ibrahim, N B; Mackintosh, I; Norris, D

    1982-01-01

    Some of the facilities available with a commercially purchased word processing program, linked to a DEC PDP 11/23 computer, are described, together with an account of their practical histopathological use. The system is based on a share of the computer with a Clinical Chemistry Department. Development was time-consuming and required the constant availability of the Department of Physics. However, once working, considerable saving in secretarial time has resulted, and a number of projects have been started which would not have been contemplated without the use of the word processor and its linked computer. PMID:7068906

  16. Harmonic analysis of spacecraft power systems using a personal computer

    NASA Technical Reports Server (NTRS)

    Williamson, Frank; Sheble, Gerald B.

    1989-01-01

    The effects that nonlinear devices such as ac/dc converters, HVDC transmission links, and motor drives have on spacecraft power systems are discussed. The nonsinusoidal currents, along with the corresponding voltages, are calculated by a harmonic power flow which decouples and solves for each harmonic component individually using an iterative Newton-Raphson algorithm. The sparsity of the harmonic equations and the overall Jacobian matrix is used to advantage, both to save computer memory space and to reduce computation time. The algorithm could also be modified to analyze each harmonic separately instead of all at the same time.
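
    The per-harmonic solve is a standard Newton-Raphson iteration on mismatch equations F(x) = 0. A generic sketch on a stand-in 2x2 system (a real harmonic power flow would assemble F and the Jacobian from network admittances and nonlinear device models):

```python
import numpy as np

def newton(F, J, x0, tol=1e-10, max_iter=50):
    """Solve F(x) = 0 by Newton-Raphson with an explicit Jacobian."""
    x = x0.astype(float)
    for _ in range(max_iter):
        dx = np.linalg.solve(J(x), -F(x))   # Newton step: J dx = -F
        x += dx
        if np.linalg.norm(dx) < tol:
            break
    return x

# Stand-in mismatch equations: x0^2 + x1 = 2 and x0 = x1 (solution x = [1, 1]).
F = lambda x: np.array([x[0]**2 + x[1] - 2.0, x[0] - x[1]])
J = lambda x: np.array([[2*x[0], 1.0], [1.0, -1.0]])
print(newton(F, J, np.array([1.5, 0.5])))   # -> [1. 1.]
```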

  17. The research and application of green computer room environmental monitoring system based on internet of things technology

    NASA Astrophysics Data System (ADS)

    Wei, Wang; Chongchao, Pan; Yikai, Liang; Gang, Li

    2017-11-01

    With the rapid development of information technology, data centers are growing quickly, and the energy consumption of computer rooms is rising with them; air-conditioning cooling accounts for a large proportion of that consumption. How to apply new technology to reduce computer room energy consumption has therefore become an important research topic in energy saving. This paper studies Internet of Things technology and designs a green computer room environmental monitoring system. The system collects real-time environmental data through a wireless sensor network and presents it as a three-dimensional visualization, including views of computer room assets, temperature and humidity cloud maps, and the microenvironment. Based on the microenvironment conditions, the air volume, temperature, and humidity of the air conditioning can be adjusted for individual equipment cabinets to achieve precise cooling. This reduces air-conditioning energy consumption and, as a result, greatly reduces the overall energy consumption of the green computer room. We deployed this system in the computer center of Weihai; after a year of testing and operation, it achieved a good energy-saving effect, fully verifying the project's effectiveness for computer room energy conservation.

  18. A coarse-grid projection method for accelerating incompressible flow computations

    NASA Astrophysics Data System (ADS)

    San, Omer; Staples, Anne E.

    2013-01-01

    We present a coarse-grid projection (CGP) method for accelerating incompressible flow computations, which is applicable to methods involving Poisson equations as incompressibility constraints. The CGP methodology is a modular approach that facilitates data transfer with simple interpolations and uses black-box solvers for the Poisson and advection-diffusion equations in the flow solver. After solving the Poisson equation on a coarsened grid, an interpolation scheme is used to obtain the fine data for subsequent time stepping on the full grid. A particular version of the method is applied here to the vorticity-stream function, primitive variable, and vorticity-velocity formulations of incompressible Navier-Stokes equations. We compute several benchmark flow problems on two-dimensional Cartesian and non-Cartesian grids, as well as a three-dimensional flow problem. The method is found to accelerate these computations while retaining a level of accuracy close to that of the fine resolution field, which is significantly better than the accuracy obtained for a similar computation performed solely using a coarse grid. A linear acceleration rate is obtained for all the cases we consider due to the linear-cost elliptic Poisson solver used, with reduction factors in computational time between 2 and 42. The computational savings are larger when a suboptimal Poisson solver is used. We also find that the computational savings increase with increasing distortion ratio on non-Cartesian grids, making the CGP method a useful tool for accelerating generalized curvilinear incompressible flow solvers.
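
    A minimal sketch of the CGP idea under simplifying assumptions: restrict the Poisson right-hand side to a grid coarsened by a factor of 2, solve there, and prolong the solution back to the fine grid. The Jacobi solver and piecewise-constant prolongation below are crude placeholders for the black-box solvers and interpolations the abstract describes:

```python
import numpy as np

def poisson_jacobi(f_scaled, iters=2000):
    """Jacobi iteration for the 5-point Laplacian; f_scaled already
    includes the h^2 factor, so u = 0.25*(neighbors - f_scaled)."""
    u = np.zeros_like(f_scaled)
    for _ in range(iters):
        u[1:-1, 1:-1] = 0.25 * (u[2:, 1:-1] + u[:-2, 1:-1]
                                + u[1:-1, 2:] + u[1:-1, :-2]
                                - f_scaled[1:-1, 1:-1])
    return u

n = 65
x = np.linspace(0, 1, n)
X, Y = np.meshgrid(x, x)
h = x[1] - x[0]
f = np.sin(np.pi * X) * np.sin(np.pi * Y) * h**2   # RHS pre-scaled by h^2

# Restrict by injection; the coarse spacing doubles, so h^2 scales by 4.
f_coarse = f[::2, ::2] * 4
u_coarse = poisson_jacobi(f_coarse)

# Piecewise-constant prolongation back to the fine grid (a real CGP solver
# would use a smoother interpolation).
u_fine = np.kron(u_coarse, np.ones((2, 2)))[:n, :n]
print(u_fine.shape, float(u_fine.max()))
```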

  19. Point-Process Models of Social Network Interactions: Parameter Estimation and Missing Data Recovery

    DTIC Science & Technology

    2014-08-01

    treating them as zero will have a de minimis impact on the results, but avoiding computing them (and computing with them) saves tremendous time. Set a... test the methods on simulated time series on artificial social networks, including some toy networks and some meant to resemble IkeNet. We conclude...the section by discussing the results in detail. In each of our tests we begin with a complete data set, whether it is real (IkeNet) or simulated. Then

  20. Application of automated measurement and verification to utility energy efficiency program data

    DOE PAGES

    Granderson, Jessica; Touzani, Samir; Fernandes, Samuel; ...

    2017-02-17

    Trustworthy savings calculations are critical to convincing regulators of both the cost-effectiveness of energy efficiency program investments and their ability to defer supply-side capital investments. Today’s methods for measurement and verification (M&V) of energy savings constitute a significant portion of the total costs of energy efficiency programs. They also require time-consuming data acquisition. A spectrum of savings calculation approaches is used, with some relying more heavily on measured data and others relying more heavily on estimated, modeled, or stipulated data. The increasing availability of “smart” meters and devices that report near-real-time data, combined with new analytical approaches to quantify savings, offers the potential to conduct M&V more quickly and at lower cost, with comparable or improved accuracy. Commercial energy management and information systems (EMIS) technologies are beginning to offer these ‘M&V 2.0’ capabilities, and program administrators want to understand how they might assist programs in quickly and accurately measuring energy savings. This paper presents the results of recent testing of the ability to use automation to streamline the M&V process. We apply an automated whole-building M&V tool to historic data sets from energy efficiency programs to begin to explore the accuracy, cost, and time trade-offs between more traditional M&V and these emerging streamlined methods that use high-resolution energy data and automated computational intelligence. For the data sets studied, we evaluate the fraction of buildings that are well suited to automated baseline characterization, the uncertainty in gross savings that is due to M&V 2.0 tools’ model error, indications of labor time savings, and how the automated savings results compare to prior, traditionally determined savings results. The results show that 70% of the buildings were well suited to the automated approach. In a majority of the cases (80%), savings and uncertainties for each individual building were quantified to levels above the criteria in ASHRAE Guideline 14. In addition, the findings suggest that M&V 2.0 methods may also offer time savings relative to traditional approaches. Lastly, we discuss the implications of these findings relative to the potential evolution of M&V, and pilots currently being launched to test how M&V automation can be integrated into ratepayer-funded programs and professional implementation and evaluation practice.
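
    A whole-building baseline model of the kind such automated M&V tools fit can be sketched as a regression on weather and schedule features; avoided energy is then prediction minus metered use after the intervention. Everything below is synthetic and illustrative (the features and the 15% "retrofit" are assumptions, not the paper's model):

```python
import numpy as np

# Synthetic pre-retrofit data: 8 weeks of hourly temperature and metered load.
rng = np.random.default_rng(7)
hours = np.arange(24 * 7 * 8)
temp = 15 + 10*np.sin(2*np.pi*hours/24) + rng.normal(0, 2, hours.size)
how = hours % (24 * 7)                              # hour-of-week index

true_kwh = 50 + 2.0*np.maximum(temp - 18, 0) + 10*(how % 24 > 8)
meter = true_kwh + rng.normal(0, 3, hours.size)

# Baseline regression: intercept, cooling proxy, occupied-hours dummy.
X = np.column_stack([np.ones_like(temp),
                     np.maximum(temp - 18, 0),
                     (how % 24 > 8).astype(float)])
beta, *_ = np.linalg.lstsq(X, meter, rcond=None)

# Pretend the same period recurs post-retrofit with 15% lower consumption;
# savings = baseline prediction minus post-retrofit metered use.
post_meter = meter * 0.85
savings = (X @ beta - post_meter).sum()
print(f"estimated savings over the period: {savings:.0f} kWh")
```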

  2. A new ChainMail approach for real-time soft tissue simulation.

    PubMed

    Zhang, Jinao; Zhong, Yongmin; Smith, Julian; Gu, Chengfan

    2016-07-03

    This paper presents a new ChainMail method for real-time soft tissue simulation. This method enables the use of different material properties for chain elements to accommodate various materials. Based on the ChainMail bounding region, a new time-saving scheme is developed to improve computational efficiency for isotropic materials. The proposed method also conserves volume and strain energy. Experimental results demonstrate that the proposed ChainMail method can not only accommodate isotropic, anisotropic and heterogeneous materials but also model incompressibility and relaxation behaviors of soft tissues. Further, the proposed method can achieve real-time computational performance.

  3. Method of up-front load balancing for local memory parallel processors

    NASA Technical Reports Server (NTRS)

    Baffes, Paul Thomas (Inventor)

    1990-01-01

    In a parallel processing computer system with multiple processing units and shared memory, a method is disclosed for uniformly balancing the aggregate computational load in, and utilizing minimal memory by, a network having identical computations to be executed at each connection therein. Read-only and read-write memory are subdivided into a plurality of process sets, which function like artificial processing units. Said plurality of process sets is iteratively merged and reduced to the number of processing units without exceeding the balanced load. Said merger is based upon the value of a partition threshold, which is a measure of the memory utilization. The turnaround time and memory savings of the present method are functions of the number of processing units available and the number of partitions into which the memory is subdivided. Typical results of the preferred embodiment yielded memory savings of from sixty to seventy-five percent.
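
    A rough sketch of the merge idea (a simplified greedy variant, not the patented procedure; the scalar weights and the omission of the partition-threshold test are assumptions):

```python
import heapq

def merge_process_sets(costs, n_processors):
    """Greedily merge 'process sets' until one set remains per processing
    unit. costs[i] is a scalar weight standing in for the load/memory of
    artificial process set i."""
    heap = [(c, [i]) for i, c in enumerate(costs)]
    heapq.heapify(heap)
    while len(heap) > n_processors:
        c1, s1 = heapq.heappop(heap)              # the two lightest sets...
        c2, s2 = heapq.heappop(heap)
        heapq.heappush(heap, (c1 + c2, s1 + s2))  # ...are merged into one
    return sorted(heap, reverse=True)

# merge_process_sets([3, 1, 4, 1, 5, 9, 2, 6], 3) -> loads 13, 9, 9 against
# an ideal of 31/3; the patented partition threshold bounds this imbalance.
```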

  4. RDBMS Applications as Online Based Data Archive: A Case of Harbour Medical Center in Pekanbaru

    NASA Astrophysics Data System (ADS)

    Febriadi, Bayu; Zamsuri, Ahmad

    2017-12-01

    Kantor Kesehatan Pelabuhan Kelas II Pekanbaru is a government office concerned with health, especially environmental health. The office has a problem storing electronic data and analyzing daily data, both internal and external. Although the office has computers and other tools useful for saving electronic data, the data are still kept in cupboards, which is inefficient for important data that must be analyzed more than once and unsuitable for data that needs to be analyzed continuously. The Relational Database Management System (RDBMS) application presented here is an online data archive, developed using the System Development Life Cycle (SDLC) method. The application is expected to be very useful for the employees of Kantor Kesehatan Pelabuhan Pekanbaru in managing their work.

  5. Architectural Analysis of Complex Evolving Systems of Systems

    NASA Technical Reports Server (NTRS)

    Lindvall, Mikael; Stratton, William C.; Sibol, Deane E.; Ray, Arnab; Ackemann, Chris; Yonkwa, Lyly; Ganesan, Dharma

    2009-01-01

    The goal of this collaborative project between FC-MD, APL, and GSFC, supported by the NASA IV&V Software Assurance Research Program (SARP), was to develop a tool, Dynamic SAVE (Dyn-SAVE for short), for analyzing architectures of systems of systems. The project team comprised the principal investigator (PI) from FC-MD and four other FC-MD scientists (part time) and several FC-MD students (full time), as well as two APL software architects (part time) and one NASA POC (part time). The PI and FC-MD scientists, together with the APL architects, were responsible for requirements analysis and for applying and evaluating the Dyn-SAVE tool and method. The PI and a group of FC-MD scientists were responsible for improving the method and conducting outreach activities, while another group of FC-MD scientists was responsible for development and improvement of the tool. Oversight and reporting were conducted by the PI and the NASA POC. The project team produced many results, including several prototypes of the Dyn-SAVE tool and method, several case studies documenting how the tool and method were applied to APL's software systems, and several papers published in highly respected conferences and journals. Dyn-SAVE, as developed and enhanced throughout this research period, is a software tool intended for software developers and architects, software integration testers, and anyone who needs to analyze software systems from the point of view of how they communicate with other systems. Using the tool, the user specifies the planned communication behavior of the system, modeled as a sequence diagram. The user then captures and imports the actual communication behavior of the system, which is converted and visualized as a sequence diagram by Dyn-SAVE. After mapping the planned behavior to the actual and specifying parameter and timing constraints, Dyn-SAVE detects and highlights deviations between the planned and the actual behavior. Requirements were specified based on the need to analyze two inter-system communication protocols that are representative of protocols used in the aerospace industry. The protocols are related: APL's Common Ground System (CGS) as used in the MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) and the Radiation Belt Storm Probes (RBSP) missions. The analyzed communications were implementations of the Telemetry protocol and the CCSDS File Delivery Protocol (CFDP). Based on these requirements, three prototypes of Dyn-SAVE were developed and applied to these protocols, resulting in the detection of several issues. Dyn-SAVE was also applied to several testbeds that had previously been used for experimentation on this project, as well as to other protocols and logs, to test its broader applicability. For example, Dyn-SAVE was used to analyze 1) the communication pattern between a web browser and a web server, 2) the system log of a computer in order to detect off-nominal computer shut-down behavior, and 3) the actual test cases of NASA Goddard's Core Flight System (CFS) and automatically generated test cases, in order to determine the overlap between the two sets of test cases. In all cases, Dyn-SAVE provided insightful conclusions about each of the cases identified above.

  6. Preoperative computed tomography angiography for planning DIEP flap breast reconstruction reduces operative time and overall complications

    PubMed Central

    Rozen, Warren Matthew; Chowdhry, Muhammad; Band, Bassam; Ramakrishnan, Venkat V.; Griffiths, Matthew

    2016-01-01

    Background: The approach and operative techniques associated with breast reconstruction have steadily been refined since its inception, with abdominal perforator-based flaps becoming the gold standard reconstructive option for women undergoing breast cancer surgery. The current study comprises a cohort of 632 patients, in whom specific operative times were recorded by a blinded observer, and aims to address the potential benefits of preoperative computed tomography (CT) scanning on operative outcomes, complications and surgical times. Methods: A prospectively recorded, retrospective review was undertaken of patients undergoing autologous breast reconstruction with a DIEP flap at the St Andrews Centre over a 4-year period from 2010 to 2014. Computed tomography angiography (CTA) scanning of patients began in September 2012 and thus 2 time periods were compared: the 2 years prior to the use of CTA scans and the 2 years afterwards. For all patients, key variables were collected including patient demographics, operative times, flap harvest time, pedicle length, surgeon experience and complications. Results: In group 1, comprising patients within the period prior to CTA scans, 265 patients underwent 312 flaps; whilst in group 2, the immediately following 2 years, 275 patients had 320 flaps. The use of preoperative CTA scans demonstrated a significant reduction in flap harvest time of 13 minutes (P<0.013). This significant time saving was seen in all flap modifications: unilateral, bilateral and bipedicled DIEP flaps. The greatest time saving was seen in bipedicled flaps, with a 35-minute time saving. The return to theatre rate significantly dropped from 11.2% to 6.9% following the use of CTA scans, but there was no difference in the total failure rate. Conclusions: The study has demonstrated a benefit to flap harvest time as well as overall operative times when using preoperative CTA. The use of CTA was associated with a significant reduction in complications requiring a return to theatre in the immediate postoperative period. Modern scanners and techniques can reduce the level of ionising radiation, enabling patients to benefit from the advantages that this preoperative planning can convey. PMID:27047777

  7. Mathematical model of marine diesel engine simulator for a new methodology of self propulsion tests

    NASA Astrophysics Data System (ADS)

    Izzuddin, Nur; Sunarsih, Priyanto, Agoes

    2015-05-01

    As a vessel operates in the open seas, a marine diesel engine simulator whose engine rotation is controlled and transmitted through the propeller shaft offers a new methodology for self-propulsion tests that tracks fuel savings in real time. This paper presents a real-time marine diesel engine simulator system that tracks the real performance of a ship through a computer-simulated model. Mathematical models of the marine diesel engine and the propeller are used in the simulation to estimate fuel rate, engine rotating speed, and propeller thrust and torque, and thus achieve the target vessel speed. The inputs and outputs form a real-time control system for fuel saving rate and propeller rotating speed, representing the marine diesel engine characteristics. Self-propulsion tests in calm waters were conducted using a vessel model to validate the marine diesel engine simulator. The simulator was then used to evaluate fuel savings by employing a new mathematical model of turbochargers for the marine diesel engine simulator. The control system developed will be beneficial for users in analyzing different vessel speed conditions to obtain better characteristics and hence optimize the fuel saving rate.

  8. Lanczos eigensolution method for high-performance computers

    NASA Technical Reports Server (NTRS)

    Bostic, Susan W.

    1991-01-01

    The theory, computational analysis, and applications of a Lanczos algorithm on high performance computers are presented. The computationally intensive steps of the algorithm are identified as the matrix factorization, the forward/backward equation solution, and the matrix-vector multiplications. These computational steps are optimized to exploit the vector and parallel capabilities of high performance computers. The savings in computational time from applying optimization techniques such as variable-band and sparse data storage and access, loop unrolling, use of local memory, and compiler directives are presented. Two large-scale structural analysis applications are described: the buckling of a composite blade-stiffened panel with a cutout, and the vibration analysis of a high speed civil transport. The sequential computation time for the panel problem, 181.6 seconds on a CONVEX computer, was decreased to 14.1 seconds with the optimized vector algorithm. The best computation time of 23 seconds for the transport problem, with 17,000 degrees of freedom, was on the Cray Y-MP using an average of 3.63 processors.
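
    The core iteration is short enough to sketch; a plain NumPy version follows, with a dense matrix-vector product standing in for the paper's factorized, sparse-storage operators (for buckling and vibration problems the operator is typically a shift-inverted stiffness matrix):

```python
import numpy as np

def lanczos_ritz(A, k, seed=0):
    """k-step Lanczos tridiagonalization of a symmetric matrix A; returns the
    Ritz values. The matrix-vector product (A @ q) is the step the paper
    vectorizes and parallelizes."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    q = rng.standard_normal(n)
    q /= np.linalg.norm(q)
    q_prev = np.zeros(n)
    alpha, beta = np.zeros(k), np.zeros(k - 1)
    for j in range(k):
        w = A @ q                      # dominant cost per iteration
        alpha[j] = q @ w
        w -= alpha[j] * q
        if j > 0:
            w -= beta[j - 1] * q_prev
        if j < k - 1:
            beta[j] = np.linalg.norm(w)
            q_prev, q = q, w / beta[j]
    # Eigenvalues of the small tridiagonal matrix approximate the extreme
    # eigenvalues of A (Ritz values).
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    return np.linalg.eigvalsh(T)
```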

  9. Simplifying Facility and Event Scheduling: Saving Time and Money.

    ERIC Educational Resources Information Center

    Raasch, Kevin

    2003-01-01

    Describes a product called the Event Management System (EMS), a computer software program to manage facility and event scheduling. Provides example of the school district and university uses of EMS. Describes steps in selecting a scheduling-management system. (PKP)

  10. High-Resiliency and Auto-Scaling of Large-Scale Cloud Computing for OCO-2 L2 Full Physics Processing

    NASA Astrophysics Data System (ADS)

    Hua, H.; Manipon, G.; Starch, M.; Dang, L. B.; Southam, P.; Wilson, B. D.; Avis, C.; Chang, A.; Cheng, C.; Smyth, M.; McDuffie, J. L.; Ramirez, P.

    2015-12-01

    Next generation science data systems are needed to address the incoming flood of data from new missions such as SWOT and NISAR, where data volumes and data throughput rates are an order of magnitude larger than in present day missions. Additionally, traditional means of procuring hardware on-premise are already limited due to facilities capacity constraints for these new missions. Existing missions, such as OCO-2, may also require rapid turnaround when processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the high processing needs. We present our experiences deploying a hybrid-cloud computing science data system (HySDS) for the OCO-2 Science Computing Facility to support large-scale processing of their Level-2 full physics data products. We explore optimization approaches to getting the best performance out of hybrid-cloud computing, as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer ~10X cost savings but with an unpredictable computing environment driven by market forces. We present how we enabled highly fault-tolerant computing in order to achieve large-scale computing as well as operational cost savings.

  11. Using Hand-Held Computers When Conducting National Security Background Interviews: Utility Test Results

    DTIC Science & Technology

    2010-05-01

    Tablet computers resemble ordinary notebook computers but can be set up as a flat display for handwriting by means of a stylus (digital pen). When used...PC accessories, and often strongly resemble notebook computers. However, all tablets can be set up as a flat display for handwriting by means of a...P3: “Depending on how the tablet handles the post-interview process, it would save time over paper.”  P4: “I hoped you were going to say that this

  12. Cloud Computing: A model Construct of Real-Time Monitoring for Big Dataset Analytics Using Apache Spark

    NASA Astrophysics Data System (ADS)

    Alkasem, Ameen; Liu, Hongwei; Zuo, Decheng; Algarash, Basheer

    2018-01-01

    The volume of data being collected, analyzed, and stored has exploded in recent years, particularly in relation to activity on the cloud. While large-scale data processing, analysis, and storage platforms such as cloud computing were previously available, their use continues to grow. Today, the major challenge is how to monitor and control these massive amounts of data and perform analysis in real time at scale. Traditional methods and model systems are unable to cope with these quantities of data in real time. Here we present a new methodology for constructing a model that optimizes the performance of real-time monitoring of big datasets, combining machine learning algorithms with Apache Spark Streaming to accomplish fine-grained fault diagnosis and repair of big datasets. As a case study, we use the failure of Virtual Machines (VMs) to start up. The proposed methodology ensures that the most sensible action is carried out during fine-grained monitoring and yields the most effective and cost-saving fault repair, through three control steps: (I) data collection; (II) an analysis engine; and (III) a decision engine. We found that running this novel methodology can save a considerable amount of time compared to the Hadoop model, without sacrificing classification accuracy or performance. The accuracy of the proposed method (92.13%) is an improvement on traditional approaches.
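
    A minimal PySpark sketch of the three control steps (the socket source, the threshold "classifier", and the repair table are hypothetical stand-ins for the paper's trained model and data pipeline):

```python
from pyspark import SparkContext
from pyspark.streaming import StreamingContext

def classify(metrics):
    # (II) analysis engine: a stand-in for the trained ML model; flags a VM
    # whose CPU is idle while memory is saturated as failing to start.
    cpu, mem, io = metrics
    return "vm_start_failure" if cpu < 0.05 and mem > 0.90 else "ok"

def decide(label):
    # (III) decision engine: map each diagnosis to the cheapest repair action.
    return {"vm_start_failure": "restart_vm", "ok": "no_action"}[label]

sc = SparkContext(appName="vm-monitor")
ssc = StreamingContext(sc, 1)  # 1-second micro-batches

# (I) data collection: "cpu,mem,io" samples arriving on a socket.
samples = ssc.socketTextStream("localhost", 9999)
actions = (samples
           .map(lambda line: tuple(float(x) for x in line.split(",")))
           .map(classify)
           .map(decide))
actions.pprint()

ssc.start()
ssc.awaitTermination()
```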

  13. New reflective symmetry design capability in the JPL-IDEAS Structure Optimization Program

    NASA Technical Reports Server (NTRS)

    Strain, D.; Levy, R.

    1986-01-01

    The JPL-IDEAS antenna structure analysis and design optimization computer program was modified to process half structure models of symmetric structures subjected to arbitrary external static loads, synthesize the performance, and optimize the design of the full structure. Significant savings in computation time and cost (more than 50%) were achieved compared to the cost of full model computer runs. The addition of the new reflective symmetry analysis design capabilities to the IDEAS program allows processing of structure models whose size would otherwise prevent automated design optimization. The new program produced synthesized full model iterative design results identical to those of actual full model program executions at substantially reduced cost, time, and computer storage.

  14. Wealth distribution, Pareto law, and stretched exponential decay of money: Computer simulations analysis of agent-based models

    NASA Astrophysics Data System (ADS)

    Aydiner, Ekrem; Cherstvy, Andrey G.; Metzler, Ralf

    2018-01-01

    We study by Monte Carlo simulations a kinetic exchange trading model for both fixed and distributed saving propensities of the agents and rationalize the person and wealth distributions. We show that the newly introduced wealth distribution - that may be more amenable in certain situations - features a different power-law exponent, particularly for distributed saving propensities of the agents. For open agent-based systems, we analyze the person and wealth distributions and find that the presence of trap agents alters their amplitude, leaving however the scaling exponents nearly unaffected. For an open system, we show that the total wealth - for different trap agent densities and saving propensities of the agents - decreases in time according to the classical Kohlrausch-Williams-Watts stretched exponential law. Interestingly, this decay does not depend on the trap agent density, but rather on saving propensities. The system relaxation for fixed and distributed saving schemes are found to be different.
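
    The trading rule itself is compact; below is a Monte Carlo sketch for a fixed saving propensity lam (a standard kinetic-exchange formulation; the paper's trap agents and distributed propensities are omitted):

```python
import numpy as np

def kinetic_exchange(n_agents=1000, steps=10**6, lam=0.5, seed=1):
    """Kinetic exchange trading model with fixed saving propensity lam: each
    trade conserves money; each agent saves a fraction lam and the remainder
    is randomly redistributed between the trading pair."""
    rng = np.random.default_rng(seed)
    m = np.ones(n_agents)                   # everyone starts with unit money
    for _ in range(steps):
        i, j = rng.integers(n_agents, size=2)
        if i == j:
            continue
        eps = rng.random()
        pool = (1.0 - lam) * (m[i] + m[j])  # the traded (non-saved) amount
        m[i] = lam * m[i] + eps * pool
        m[j] = lam * m[j] + (1.0 - eps) * pool
    return m

# np.histogram(kinetic_exchange(), bins=50) approximates the stationary wealth
# distribution (gamma-like for fixed lam; power-law tails emerge for
# distributed saving propensities).
```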

  15. 31 CFR 363.52 - What amount of book-entry Series EE and Series I savings bonds may I purchase in one year?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Section 363.52 (Money and Finance): What amount of book-entry Series EE and Series I savings bonds may I purchase in one year? ... for Series I savings bonds. (b) Computation of amount for gifts. Bonds purchased or transferred as gifts will be included in the computation of the purchase limitation for the account of the recipient...

  16. New core-reflector boundary conditions for transient nodal reactor calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, E.K.; Kim, C.H.; Joo, H.K.

    1995-09-01

    New core-reflector boundary conditions designed for the exclusion of the reflector region in transient nodal reactor calculations are formulated. Spatially flat frequency approximations for the temporal neutron behavior and two types of transverse leakage approximations in the reflector region are introduced to solve the transverse-integrated time-dependent one-dimensional diffusion equation and then to obtain relationships between net current and flux at the core-reflector interfaces. To examine the effectiveness of new core-reflector boundary conditions in transient nodal reactor computations, nodal expansion method (NEM) computations with and without explicit representation of the reflector are performed for Laboratorium fuer Reaktorregelung und Anlagen (LRA) boiling water reactor (BWR) and Nuclear Energy Agency Committee on Reactor Physics (NEACRP) pressurized water reactor (PWR) rod ejection kinetics benchmark problems. Good agreement between two NEM computations is demonstrated in all the important transient parameters of two benchmark problems. A significant amount of CPU time saving is also demonstrated with the boundary condition model with transverse leakage (BCMTL) approximations in the reflector region. In the three-dimensional LRA BWR, the BCMTL and the explicit reflector model computations differ by approximately 4% in transient peak power density while the BCMTL results in >40% of CPU time saving by excluding both the axial and the radial reflector regions from explicit computational nodes. In the NEACRP PWR problem, which includes six different transient cases, the largest difference is 24.4% in the transient maximum power in the one-node-per-assembly B1 transient results. This difference in the transient maximum power of the B1 case is shown to reduce to 11.7% in the four-node-per-assembly computations. As for the computing time, BCMTL is shown to reduce the CPU time >20% in all six transient cases of the NEACRP PWR.

  17. Computerized Fleet Maintenance.

    ERIC Educational Resources Information Center

    Cataldo, John J.

    The computerization of school bus maintenance records by the Niskayuna (New York) Central School District enabled the district's transportation department to engage in management practices resulting in significant savings. The district obtains computer analyses of the work performed on all vehicles, including time spent, parts, labor, costs,…

  18. "Grinding" cavities in polyurethane foam

    NASA Technical Reports Server (NTRS)

    Brower, J. R.; Davey, R. E.; Dixon, W. F.; Robb, P. H.; Zebus, P. P.

    1980-01-01

    Grinding tool installed on conventional milling machine cuts precise cavities in foam blocks. Method is well suited for prototype or midsize production runs and can be adapted to computer control for mass production. Method saves time and materials compared to bonding or hot wire techniques.

  19. Nondestructive Methods for Detecting Defects in Softwood Logs

    Treesearch

    Kristin C. Schad; Daniel L. Schmoldt; Robert J. Ross

    1996-01-01

    Wood degradation and defects, such as voids and knots, affect the quality and processing time of lumber. The ability to detect internal defects in the log can save mills time and processing costs. In this study, we investigated three nondestructive evaluation techniques for detecting internal wood defects. Sound wave transmission, x-ray computed tomography, and impulse...

  20. Efficient genetic algorithms using discretization scheduling.

    PubMed

    McLay, Laura A; Goldberg, David E

    2005-01-01

    In many applications of genetic algorithms, there is a tradeoff between speed and accuracy in fitness evaluations when evaluations use numerical methods with varying discretization. In these types of applications, the cost and accuracy vary with the discretization error when implicit or explicit quadrature is used to estimate the function evaluations. This paper examines discretization scheduling, or how to vary the discretization within the genetic algorithm in order to use the least amount of computation time for a solution of a desired quality. The effectiveness of discretization scheduling can be determined by comparing its computation time to the computation time of a GA using a constant discretization. There are three ingredients for the discretization scheduling: population sizing, estimated time for each function evaluation, and predicted convergence time analysis. Idealized one- and two-dimensional experiments and an inverse groundwater application illustrate the computational savings to be achieved from using discretization scheduling.
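
    A toy sketch of the idea follows; the quadrature-based fitness, the doubling schedule, and the simple GA operators are all illustrative assumptions, not the paper's sizing and convergence analysis:

```python
import numpy as np

def fitness(x, n_points):
    """Hypothetical expensive fitness: quadrature estimate of an integral;
    more nodes means slower but more accurate evaluation."""
    t = np.linspace(0.0, 1.0, n_points)
    return -np.trapz((np.sin(8.0 * t) - x) ** 2, t)   # maximize negated error

def ga_with_discretization_schedule(pop_size=40, gens=30, seed=0):
    """Toy GA where the quadrature resolution grows over generations: early
    generations use cheap, coarse evaluations; later ones refine."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-1.0, 1.0, pop_size)
    for g in range(gens):
        n_points = 16 * 2 ** (g * 4 // gens)          # discretization schedule
        fit = np.array([fitness(x, n_points) for x in pop])
        parents = pop[np.argsort(fit)][-pop_size // 2:]       # keep best half
        children = parents + rng.normal(0.0, 0.05, parents.size)  # mutation
        pop = np.concatenate([parents, children])
    return pop[np.argmax([fitness(x, 1024) for x in pop])]
```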

  1. Atlas-Based Segmentation Improves Consistency and Decreases Time Required for Contouring Postoperative Endometrial Cancer Nodal Volumes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, Amy V.; Department of Radiation Oncology, St. Luke's-Roosevelt Hospital, New York, NY; Wortham, Angela

    2011-03-01

    Purpose: Accurate target delineation of the nodal volumes is essential for three-dimensional conformal and intensity-modulated radiotherapy planning for endometrial cancer adjuvant therapy. We hypothesized that atlas-based segmentation ('autocontouring') would lead to time savings and more consistent contours among physicians. Methods and Materials: A reference anatomy atlas was constructed using the data from 15 postoperative endometrial cancer patients by contouring the pelvic nodal clinical target volume on the simulation computed tomography scan according to the Radiation Therapy Oncology Group 0418 trial using commercially available software. On the simulation computed tomography scans from 10 additional endometrial cancer patients, the nodal clinical target volume autocontours were generated. Three radiation oncologists corrected the autocontours and delineated the manual nodal contours under timed conditions while unaware of the other contours. The time difference was determined, and the overlap of the contours was calculated using Dice's coefficient. Results: For all physicians, manual contouring of the pelvic nodal target volumes and editing the autocontours required a mean ± standard deviation of 32 ± 9 vs. 23 ± 7 minutes, respectively (p = .000001), a 26% time savings. For each physician, the time required to delineate the manual contours vs. correcting the autocontours was 30 ± 3 vs. 21 ± 5 min (p = .003), 39 ± 12 vs. 30 ± 5 min (p = .055), and 29 ± 5 vs. 20 ± 5 min (p = .0002). The mean overlap increased from manual contouring (0.77) to correcting the autocontours (0.79; p = .038). Conclusion: The results of our study have shown that autocontouring leads to increased consistency and time savings when contouring the nodal target volumes for adjuvant treatment of endometrial cancer, although the autocontours still required careful editing to ensure that the lymph nodes at risk of recurrence are properly included in the target volume.

  2. Efficient coarse simulation of a growing avascular tumor

    PubMed Central

    Kavousanakis, Michail E.; Liu, Ping; Boudouvis, Andreas G.; Lowengrub, John; Kevrekidis, Ioannis G.

    2013-01-01

    The subject of this work is the development and implementation of algorithms which accelerate the simulation of early stage tumor growth models. Among the different computational approaches used for the simulation of tumor progression, discrete stochastic models (e.g., cellular automata) have been widely used to describe processes occurring at the cell and subcell scales (e.g., cell-cell interactions and signaling processes). To describe macroscopic characteristics (e.g., morphology) of growing tumors, large numbers of interacting cells must be simulated. However, the high computational demands of stochastic models make the simulation of large-scale systems impractical. Alternatively, continuum models, which can describe behavior at the tumor scale, often rely on phenomenological assumptions in place of rigorous upscaling of microscopic models. This limits their predictive power. In this work, we circumvent the derivation of closed macroscopic equations for the growing cancer cell populations; instead, we construct, based on the so-called “equation-free” framework, a computational superstructure, which wraps around the individual-based cell-level simulator and accelerates the computations required for the study of the long-time behavior of systems involving many interacting cells. The microscopic model, e.g., a cellular automaton, which simulates the evolution of cancer cell populations, is executed for relatively short time intervals, at the end of which coarse-scale information is obtained. These coarse variables evolve on slower time scales than each individual cell in the population, enabling the application of forward projection schemes, which extrapolate their values at later times. This technique is referred to as coarse projective integration. Increasing the ratio of projection times to microscopic simulator execution times enhances the computational savings. Crucial accuracy issues arising for growing tumors with radial symmetry are addressed by applying the coarse projective integration scheme in a cotraveling (cogrowing) frame. As a proof of principle, we demonstrate that the application of this scheme yields highly accurate solutions, while preserving the computational savings of coarse projective integration. PMID:22587128
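
    In outline, coarse projective integration alternates short bursts of the microscopic simulator with extrapolation of the coarse variables. A generic sketch is below; the function names (micro_step, lift, restrict) and the forward-Euler projection are assumptions, and the paper additionally applies the scheme in a cotraveling (cogrowing) frame:

```python
import numpy as np

def coarse_projective_integration(micro_step, lift, restrict, U0,
                                  n_micro=20, dt=1e-3, dt_proj=1e-2, t_end=1.0):
    """Equation-free coarse projective integration: run the fine-scale
    simulator for short bursts, estimate the time derivative of the coarse
    variables at the end of each burst, and leap ('project') forward."""
    U, t = np.asarray(U0, dtype=float), 0.0
    while t < t_end:
        u = lift(U)                   # coarse state -> consistent micro state
        for _ in range(n_micro - 1):  # burst of microscopic steps
            u = micro_step(u, dt)
        U_prev = restrict(u)          # coarse observables one step apart,
        u = micro_step(u, dt)         # sampled after fast transients heal
        U_now = restrict(u)
        dUdt = (U_now - U_prev) / dt
        U = U_now + dt_proj * dUdt    # projective forward-Euler leap
        t += n_micro * dt + dt_proj
    return U
```

    Increasing dt_proj relative to the burst length n_micro * dt is exactly the ratio the abstract identifies as the source of the computational savings.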

  3. Neural Network Design on the SRC-6 Reconfigurable Computer

    DTIC Science & Technology

    2006-12-01

    fingerprint identification. In this field, automatic identification methods are used to save time, especially for the purpose of fingerprint matching in...grid widths and lengths and therefore was useful in producing an accurate canvas with which to create sample training images. The added benefit of...tools available free of charge and readily accessible on the computer, it was simple to design bitmap data files visually on a canvas and then

  4. An algorithm for the simultaneous reconstruction of faults and slip fields

    NASA Astrophysics Data System (ADS)

    Volkov, D.

    2017-12-01

    We introduce an algorithm for the simultaneous reconstruction of faults and slip fields on those faults. We define a regularized functional to be minimized for the reconstruction. We prove that the minimum of that functional converges to the unique solution of the related fault inverse problem. Due to inherent uncertainties in measurements, rather than seeking a deterministic solution to the fault inverse problem, we consider a Bayesian approach. The advantage of such an approach is that we obtain a way of quantifying uncertainties as part of our final answer. On the downside, this Bayesian approach leads to a very large computation. To contend with the size of this computation we developed an algorithm for the numerical solution to the stochastic minimization problem which can be easily implemented on a parallel multi-core platform and we discuss techniques to save on computational time. After showing how this algorithm performs on simulated data and assessing the effect of noise, we apply it to measured data. The data was recorded during a slow slip event in Guerrero, Mexico.
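
    For the Bayesian stage, a random-walk Metropolis sketch shows the shape of the computation that makes a parallel multi-core platform attractive (log_post is assumed to wrap the regularized misfit of the fault/slip model; the authors' actual sampler and parallelization scheme are not specified here):

```python
import numpy as np

def metropolis(log_post, m0, n_samples=5000, step=0.1, seed=0):
    """Random-walk Metropolis over fault/slip parameters m."""
    rng = np.random.default_rng(seed)
    m = np.asarray(m0, dtype=float)
    lp = log_post(m)
    samples = []
    for _ in range(n_samples):
        m_new = m + step * rng.standard_normal(m.size)
        lp_new = log_post(m_new)
        if np.log(rng.random()) < lp_new - lp:   # accept w.p. min(1, ratio)
            m, lp = m_new, lp_new
        samples.append(m.copy())
    return np.asarray(samples)

# Independent chains are embarrassingly parallel across cores; each log_post
# call hides an expensive elastostatic forward model, which is why saving on
# per-evaluation time matters so much here.
```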

  5. Convergence acceleration of the Proteus computer code with multigrid methods

    NASA Technical Reports Server (NTRS)

    Demuren, A. O.; Ibraheem, S. O.

    1995-01-01

    This report presents the results of a study to implement convergence acceleration techniques based on the multigrid concept in the two-dimensional and three-dimensional versions of the Proteus computer code. The first section presents a review of the relevant literature on the implementation of the multigrid methods in computer codes for compressible flow analysis. The next two sections present detailed stability analysis of numerical schemes for solving the Euler and Navier-Stokes equations, based on conventional von Neumann analysis and the bi-grid analysis, respectively. The next section presents details of the computational method used in the Proteus computer code. Finally, the multigrid implementation and applications to several two-dimensional and three-dimensional test problems are presented. The results of the present study show that the multigrid method always leads to a reduction in the number of iterations (or time steps) required for convergence. However, there is an overhead associated with the use of multigrid acceleration. The overhead is higher in 2-D problems than in 3-D problems, thus overall multigrid savings in CPU time are in general better in the latter. Savings of about 40-50 percent are typical in 3-D problems, but they are about 20-30 percent in large 2-D problems. The present multigrid method is applicable to steady-state problems and is therefore ineffective in problems with inherently unstable solutions.
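
    For reference, the recursive structure that produces those savings can be shown in a few lines for the 1D Poisson problem (a generic V-cycle, not the Proteus implementation, which applies multigrid to the Euler and Navier-Stokes equations):

```python
import numpy as np

def v_cycle(u, f, h, n_pre=3, n_post=3):
    """One multigrid V-cycle for u'' = f with zero Dirichlet boundaries
    (grid sizes of 2^k + 1 points assumed)."""
    def smooth(u, f, h, iters):
        for _ in range(iters):  # damped Jacobi smoothing
            u[1:-1] += 0.5 * (0.5 * (u[2:] + u[:-2] - h * h * f[1:-1]) - u[1:-1])
        return u

    u = smooth(u, f, h, n_pre)
    if u.size > 3:
        r = np.zeros_like(u)    # residual r = f - u''
        r[1:-1] = f[1:-1] - (u[2:] - 2.0 * u[1:-1] + u[:-2]) / (h * h)
        e = v_cycle(np.zeros(u[::2].size), r[::2], 2.0 * h, n_pre, n_post)
        u += np.interp(np.arange(u.size), np.arange(u.size)[::2], e)  # correct
    return smooth(u, f, h, n_post)
```

    The coarse-grid correction removes the smooth error components that relaxation alone attacks very slowly, which is why the iteration (or time step) counts drop in every case the report considers.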

  6. Evaluating Thin Client Computers for Use by the Polish Army

    DTIC Science & Technology

    2006-06-01

    43 Figure 15. Annual Electricity Cost and Savings for 5 to 100 Users (source: Thin Client Computing...50 percent in hard costs in the first year of thin client network deployment.20 However, the greatest savings come from the reduction in soft costs ...resources from both the classrooms and home. The thin client solution increased the reliability of the IT infrastructure and resulted in cost savings

  7. Comsat Antenna

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The antenna shown is the new, multiple-beam, Unattended Earth Terminal, located at COMSAT Laboratories in Clarksburg, Maryland. Seemingly simple, it is actually a complex structure capable of maintaining contact with several satellites simultaneously (conventional Earth station antennas communicate with only one satellite at a time). In developing the antenna, COMSAT Laboratories used NASTRAN, NASA's structural analysis computer program, together with BANDIT, a companion program. The computer programs were used to model several structural configurations and determine the most suitable one. The speed and accuracy of the computerized design analysis afforded appreciable savings in time and money.

  8. Personal Computers and Laser Printers Are Becoming Popular Tools for Creating Documents on Campuses.

    ERIC Educational Resources Information Center

    DeLoughry, Thomas J.

    1987-01-01

    Desktop publishing techniques are bringing control over institutional newsletters, catalogues, brochures, and many other print materials directly to the author's office. The technology also has the potential for integrating campus information systems and saving much time and money. (MSE)

  9. Saving Energy and Money: A Lesson in Computer Power Management

    ERIC Educational Resources Information Center

    Lazaros, Edward J.; Hua, David

    2012-01-01

    In this activity, students will develop an understanding of the economic impact of technology by estimating the cost savings of power management strategies in the classroom. Students will learn how to adjust computer display settings to influence the impact that the computer has on the financial burden to the school. They will use mathematics to…

  10. Oh No, I Lost All of My Work!

    ERIC Educational Resources Information Center

    Zimerman, Martin

    2009-01-01

    Having been in the computer industry for many years, the author is reminded of one of the earliest tenets of word processing: save one's work, and save it often. It's encouraging to see that people trust computers not to lose their work. Unfortunately, due to budget cuts, aging computer hardware, a possibly questionable electrical supply, and…

  11. Graphic artist in computerland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dolberg, K.M.

    1983-01-01

    The field of computer graphics is rapidly opening up to the graphic artist. It is not necessary to be a programming expert to enter this fascinating world. The capabilities of the medium are astounding: neon and metallic effects, translucent plastic and clear glass effects, sensitive 3-D shadings, limitless textures, and above all color. As with any medium, computer graphics has its advantages, such as speed, ease of form manipulation, and a variety of type fonts and alphabets. It also has its limitations, such as data input time, final output turnaround time, and not necessarily being the right medium for the job at hand. And finally, it is the time- and cost-saving characteristics of computer-generated visuals, opposed to original artwork, that make computer graphics a viable alternative. This paper focuses on parts of the computer graphics system in use at the Los Alamos National Laboratory to provide specific examples.

  12. Application of the perfectly matched layer in 2.5D marine controlled-source electromagnetic modeling

    NASA Astrophysics Data System (ADS)

    Li, Gang; Han, Bo

    2017-09-01

    For the traditional framework of EM modeling algorithms, the Dirichlet boundary condition is usually used, which assumes the field values are zero at the boundaries. This crude condition requires that the boundaries be sufficiently far away from the area of interest. Although cell sizes can grow toward the boundaries because the electromagnetic field propagates diffusively, a large modeling area may still be necessary to mitigate boundary artifacts. In this paper, the complex frequency-shifted perfectly matched layer (CFS-PML) in stretching Cartesian coordinates is successfully applied to 2.5D frequency-domain marine controlled-source electromagnetic (CSEM) field modeling. By using this PML boundary, one can restrict the modeling area of interest to the target region. Only a few absorbing layers surrounding the computational area can effectively suppress the artificial boundary effect without losing numerical accuracy. A 2.5D marine CSEM modeling scheme with the CFS-PML is developed using a staggered finite-difference discretization. This modeling algorithm using the CFS-PML is highly accurate and offers savings in computation time and memory compared with the Dirichlet boundary. For 3D problems, these savings in computation time and memory should be even more significant.
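
    For context, the complex frequency-shifted stretching factor typically takes the following form (notation and normalization assumed here, not quoted from the paper):

```latex
% Inside the absorbing layers, x-derivatives are evaluated in stretched
% coordinates, \partial_x \rightarrow (1/s_x)\,\partial_x, with
\[
  s_x(x) \;=\; \kappa_x(x) \;+\; \frac{\sigma_x(x)}{\alpha_x(x) + \mathrm{i}\omega},
\]
% where \sigma_x grades from zero at the interior interface to a maximum at
% the outer boundary, \kappa_x \ge 1 stretches the real coordinate, and
% \alpha_x > 0 is the frequency shift that moves the pole off the real axis,
% improving absorption of low-frequency (diffusive) fields.
```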

  13. On the use of inexact, pruned hardware in atmospheric modelling

    PubMed Central

    Düben, Peter D.; Joven, Jaume; Lingamneni, Avinash; McNamara, Hugh; De Micheli, Giovanni; Palem, Krishna V.; Palmer, T. N.

    2014-01-01

    Inexact hardware design, which advocates trading the accuracy of computations in exchange for significant savings in area, power and/or performance of computing hardware, has received increasing prominence in several error-tolerant application domains, particularly those involving perceptual or statistical end-users. In this paper, we evaluate inexact hardware for its applicability in weather and climate modelling. We expand previous studies on inexact techniques, in particular probabilistic pruning, to floating point arithmetic units and derive several simulated set-ups of pruned hardware with reasonable levels of error for applications in atmospheric modelling. The set-up is tested on the Lorenz ‘96 model, a toy model for atmospheric dynamics, using software emulation for the proposed hardware. The results show that large parts of the computation tolerate the use of pruned hardware blocks without major changes in the quality of short- and long-time diagnostics, such as forecast errors and probability density functions. This could open the door to significant savings in computational cost and to higher resolution simulations with weather and climate models. PMID:24842031
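
    Software emulation of inexact arithmetic can be as simple as masking mantissa bits. The NumPy sketch below is a crude stand-in for probabilistic pruning, which removes hardware blocks rather than bits, but it conveys how such emulation is wired into a model:

```python
import numpy as np

def prune_mantissa(x, keep_bits):
    """Zero out the low-order (52 - keep_bits) mantissa bits of float64
    values, emulating a reduced-precision arithmetic unit while keeping the
    full exponent range."""
    x = np.ascontiguousarray(x, dtype=np.float64)
    bits = x.view(np.uint64)
    mask = np.uint64((0xFFFFFFFFFFFFFFFF << (52 - keep_bits))
                     & 0xFFFFFFFFFFFFFFFF)
    return (bits & mask).view(np.float64)

# Running a toy model (e.g., Lorenz '96) with prune_mantissa applied after
# each arithmetic step shows how many mantissa bits the forecast diagnostics
# actually need before their quality degrades.
```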

  14. High-speed multiple sequence alignment on a reconfigurable platform.

    PubMed

    Oliver, Tim; Schmidt, Bertil; Maskell, Douglas; Nathan, Darran; Clemens, Ralf

    2006-01-01

    Progressive alignment is a widely used approach to compute multiple sequence alignments (MSAs). However, aligning several hundred sequences with popular progressive alignment tools requires hours on sequential computers. Due to the rapid growth of sequence databases, biologists have to compute MSAs in a far shorter time. In this paper we present a new approach to MSA on reconfigurable hardware platforms to gain high performance at low cost. We have constructed a linear systolic array to perform pairwise sequence distance computations using dynamic programming. This results in an implementation with significant runtime savings on a standard FPGA.
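
    The per-pair computation is ordinary dynamic programming; in the systolic array each processing element holds one column of this table and the anti-diagonal advances once per clock cycle. A plain software rendering, with an assumed unit-cost scoring scheme, looks like this:

```python
def pairwise_distance(a, b, match=0, mismatch=1, gap=1):
    """Levenshtein-style sequence distance via dynamic programming; the FPGA
    pipelines the same recurrence across a linear array of cells."""
    prev = list(range(len(b) + 1))        # first row: all-gap prefix costs
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cost = match if ca == cb else mismatch
            cur.append(min(prev[j] + gap,        # gap in b
                           cur[j - 1] + gap,     # gap in a
                           prev[j - 1] + cost))  # align ca with cb
        prev = cur
    return prev[-1]
```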

  16. Optimizing Aircraft Trajectories with Multiple Cruise Altitudes in the Presence of Winds

    NASA Technical Reports Server (NTRS)

    Ng, Hok K.; Sridhar, Banavar; Grabbe, Shon

    2014-01-01

    This study develops a trajectory optimization algorithm for approximately minimizing aircraft travel time and fuel burn by combining a method for computing minimum-time routes in winds on multiple horizontal planes, and an aircraft fuel burn model for generating fuel-optimal vertical profiles. It is applied to assess the potential benefits of flying user-preferred routes for commercial cargo flights operating between Anchorage, Alaska and major airports in Asia and the contiguous United States. Flying wind optimal trajectories with a fuel-optimal vertical profile reduces average fuel burn of international flights cruising at a single altitude by 1-3 percent. The potential fuel savings of performing en-route step climbs are not significant for many shorter domestic cargo flights that have only one step climb. Wind-optimal trajectories reduce fuel burn and travel time relative to the flight plan route by up to 3 percent for the domestic cargo flights. However, for trans-oceanic traffic, the fuel burn savings could be as much as 10 percent. The actual savings in operations will vary from the simulation results due to differences in the aircraft models and user defined cost indices. In general, the savings are proportional to trip length, and depend on the en-route wind conditions and aircraft types.

  17. Automated System Tests High-Power MOSFET's

    NASA Technical Reports Server (NTRS)

    Huston, Steven W.; Wendt, Isabel O.

    1994-01-01

    Computer-controlled system tests metal-oxide/semiconductor field-effect transistors (MOSFET's) at high voltages and currents. Measures seven parameters characterizing performance of MOSFET, with a view toward obtaining an early indication that the MOSFET is defective. Use of test system prior to installation of power MOSFET in high-power circuit saves time and money.

  18. Automated Training Evaluation (ATE). Final Report.

    ERIC Educational Resources Information Center

    Charles, John P.; Johnson, Robert M.

    The automation of weapons system training presents the potential for significant savings in training costs in terms of manpower, time, and money. The demonstration of the technical feasibility of automated training through the application of advanced digital computer techniques and advanced training techniques is essential before the application…

  19. Computer Simulation For Design Of TWT's

    NASA Technical Reports Server (NTRS)

    Bartos, Karen F.; Fite, E. Brian; Shalkhauser, Kurt A.; Sharp, G. Richard

    1992-01-01

    A three-dimensional finite-element analytical technique facilitates design and fabrication of traveling-wave-tube (TWT) slow-wave structures. Used to perform thermal and mechanical analyses of TWT designed with variety of configurations, geometries, and materials. Using three-dimensional computer analysis, designer able to simulate building and testing of TWT, with consequent substantial saving of time and money. Technique enables detailed look into operation of traveling-wave tubes to help improve performance for future communications systems.

  20. Laser Research

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Eastman Kodak Company, Rochester, New York is a broad-based firm which produces photographic apparatus and supplies, fibers, chemicals and vitamin concentrates. Much of the company's research and development effort is devoted to photographic science and imaging technology, including laser technology. Eastman Kodak is using a COSMIC computer program called LACOMA in the analysis of laser optical systems and camera design studies. The company reports that use of the program has provided development time savings and reduced computer service fees.

  1. Computer simulation for optimizing windbreak placement to save energy for heating and cooling buildings

    Treesearch

    Gordon M. Heisler

    1991-01-01

    Saving energy has recently acquired new importance because of increased concern for dwindling fossil fuel supplies and for the problem of carbon dioxide contributions to global climate change. Many studies have indicated that windbreaks have the ability to save energy for heating buildings. Suggested savings have ranged up to 40 percent, though more commonly savings of...

  2. ACON: a multipurpose production controller for plasma physics codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snell, C.

    1983-01-01

    ACON is a BCON controller designed to run large production codes on the CTSS Cray-1 or the LTSS 7600 computers. ACON can also be operated interactively, with input from the user's terminal. The controller can run one code or a sequence of up to ten codes during the same job. Options are available to get and save Mass storage files, to perform Historian file updating operations, to compile and load source files, and to send out print and film files. Special features include ability to retry after Mass failures, backup options for saving files, startup messages for the various codes, and ability to reserve specified amounts of computer time after successive code runs. ACON's flexibility and power make it useful for running a number of different production codes.

  3. Low-cost autonomous perceptron neural network inspired by quantum computation

    NASA Astrophysics Data System (ADS)

    Zidan, Mohammed; Abdel-Aty, Abdel-Haleem; El-Sadek, Alaa; Zanaty, E. A.; Abdel-Aty, Mahmoud

    2017-11-01

    Achieving low-cost learning with reliable accuracy is an important goal for intelligent machines: it saves time and energy and allows the learning process to run on machines with limited computational resources. In this paper, we propose an efficient algorithm for a perceptron neural network inspired by quantum computing, composed of a single neuron, that classifies linearly separable applications after a single training iteration, O(1). The algorithm is applied to a real-world data set, and the results outperform other state-of-the-art algorithms.

  4. Computer aided drug design

    NASA Astrophysics Data System (ADS)

    Jain, A.

    2017-08-01

    Computer-based methods can help in the discovery of leads and can potentially eliminate the chemical synthesis and screening of many irrelevant compounds, saving time as well as cost. Molecular modeling systems are powerful tools for building, visualizing, analyzing, and storing models of complex molecular structures that can help interpret structure-activity relationships. The use of molecular mechanics and dynamics techniques and software in computer-aided drug design, along with statistical analysis, is a powerful aid for medicinal chemists seeking to synthesize effective therapeutic drugs with minimum side effects.

  5. Computation of unsteady transonic aerodynamics with steady state fixed by truncation error injection

    NASA Technical Reports Server (NTRS)

    Fung, K.-Y.; Fu, J.-K.

    1985-01-01

    A novel technique is introduced for efficient computations of unsteady transonic aerodynamics. The steady flow corresponding to the body shape is maintained by truncation error injection while the perturbed unsteady flows corresponding to unsteady body motions are being computed. This allows the use of different grids comparable to the characteristic length scales of the steady and unsteady flows and, hence, allows efficient computation of the unsteady perturbations. A typical unsteady computation of flow over a supercritical airfoil shows that substantial savings in computation time and storage can easily be achieved without loss of solution accuracy. This technique is easy to apply and requires very few changes to existing codes.
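
    The mechanics of the technique fit in a few lines: evaluate the steady residual once on the working grid and inject its negation as a constant source, so the steady state becomes an exact fixed point and only the unsteady perturbation must be resolved. In this sketch the explicit Euler march and the function names are illustrative assumptions:

```python
import numpy as np

def time_march_with_te_injection(residual, u_steady, dt, n_steps, forcing=None):
    """March du/dt = R(u) + s, where s = -R(u_steady) is the injected
    truncation error: u_steady is then an exact fixed point on this grid, and
    only the perturbation driven by `forcing` (unsteady body motion) evolves."""
    s = -residual(u_steady)
    u = np.asarray(u_steady, dtype=float).copy()
    for n in range(n_steps):
        f = forcing(n * dt) if forcing is not None else 0.0
        u = u + dt * (residual(u) + s + f)   # explicit Euler for clarity
    return u
```

    With forcing absent, the state never drifts from u_steady, which is exactly the property that lets the unsteady grid be much coarser than the steady one.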

  6. PyEEG: an open source Python module for EEG/MEG feature extraction.

    PubMed

    Bao, Forrest Sheng; Liu, Xin; Zhang, Christina

    2011-01-01

    Computer-aided diagnosis of neural diseases from EEG signals (or other physiological signals that can be treated as time series, e.g., MEG) is an emerging field that has gained much attention in past years. Extracting features is a key component in the analysis of EEG signals. In our previous works, we have implemented many EEG feature extraction functions in the Python programming language. As Python is gaining more ground in scientific computing, an open source Python module for extracting EEG features has the potential to save much time for computational neuroscientists. In this paper, we introduce PyEEG, an open source Python module for EEG feature extraction.
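
    A short usage sketch (the function names follow PyEEG's documentation, but treat the exact signatures as assumptions):

```python
import numpy as np
import pyeeg  # https://github.com/forrestbao/pyeeg

# A synthetic one-channel "EEG" trace: 10 s at 256 Hz.
fs = 256
t = np.arange(0, 10, 1.0 / fs)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)

# Band powers over classical EEG bands (delta/theta/alpha/beta edges in Hz).
bands = [0.5, 4, 7, 12, 30]
power, power_ratio = pyeeg.bin_power(x, bands, fs)

print("band powers:", power)
print("Petrosian fractal dimension:", pyeeg.pfd(x))
print("Hurst exponent:", pyeeg.hurst(x))
```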

  8. Instructional Design Considerations in Converting Non-CBT Materials into CBT Courses.

    ERIC Educational Resources Information Center

    Ng, Raymond

    Instructional designers who are asked to convert existing training materials into computer-based training (CBT) must take special precautions to avoid making the product into a sophisticated page turner. Although conversion may save considerable time on subject research and analysis, courses to be delivered through microcomputers may require…

  9. Program Tracks Cost Of Travel

    NASA Technical Reports Server (NTRS)

    Mauldin, Lemuel E., III

    1993-01-01

    Travel Forecaster is menu-driven, easy-to-use computer program that plans, forecasts cost, and tracks actual vs. planned cost of business-related travel of division or branch of organization and compiles information into data base to aid travel planner. Ability of program to handle multiple trip entries makes it valuable time-saving device.

  10. Reducing power usage on demand

    NASA Astrophysics Data System (ADS)

    Corbett, G.; Dewhurst, A.

    2016-10-01

    The Science and Technology Facilities Council (STFC) datacentre provides large-scale High Performance Computing facilities for the scientific community. It currently consumes approximately 1.5MW and this has risen by 25% in the past two years. STFC has been investigating leveraging preemption in the Tier 1 batch farm to save power. HEP experiments are increasingly using jobs that can be killed to take advantage of opportunistic CPU resources or novel cost models such as Amazon's spot pricing. Additionally, schemes from energy providers are available that offer financial incentives to reduce power consumption at peak times. Under normal operating conditions, 3% of the batch farm capacity is wasted due to draining machines. By using preemptable jobs, nodes can be rapidly made available to run multicore jobs without this wasted resource. The use of preemptable jobs has been extended so that at peak times machines can be hibernated quickly to save energy. This paper describes the implementation of the above and demonstrates that STFC could in future take advantage of such energy saving schemes.

  11. Higher-Order Adaptive Finite-Element Methods for Kohn-Sham Density Functional Theory

    DTIC Science & Technology

    2012-07-03

    systems studied, we observe diminishing returns in computational savings beyond the sixth-order for accuracies commensurate with chemical accuracy...calculations. Further, we demonstrate the capability of the proposed approach to compute the electronic structure of materials systems containing a...

  12. Development of an efficient procedure for calculating the aerodynamic effects of planform variation

    NASA Technical Reports Server (NTRS)

    Mercer, J. E.; Geller, E. W.

    1981-01-01

    Numerical procedures to compute gradients in aerodynamic loading due to planform shape changes using panel method codes were studied. Two procedures were investigated: one computed the aerodynamic perturbation directly; the other computed the aerodynamic loading on the perturbed planform and on the base planform and then differenced these values to obtain the perturbation in loading. It is indicated that computing the perturbed values directly cannot be done satisfactorily without proper aerodynamic representation of the pressure singularity at the leading edge of a thin wing. For the alternative procedure, a technique was developed which saves most of the time-consuming computations from a panel method calculation for the base planform. Using this procedure the perturbed loading can be calculated in about one-tenth of the time required for the base solution.
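
    The saving described here, paying for the expensive base-planform computation once and reusing it across perturbations, is the classic factor-once/solve-many structure. A minimal sketch of that structure, assuming (purely for illustration, this is not the authors' code) that the influence matrix is unchanged and only the right-hand side varies between perturbations:

        import numpy as np
        from scipy.linalg import lu_factor, lu_solve

        n = 500
        rng = np.random.default_rng(1)
        A = rng.standard_normal((n, n)) + n * np.eye(n)  # stand-in influence matrix (base planform)
        lu, piv = lu_factor(A)                           # expensive O(n^3) step, done once

        for k in range(10):                              # each perturbed planform: new RHS only
            b = rng.standard_normal(n)                   # stand-in boundary conditions
            x = lu_solve((lu, piv), b)                   # cheap O(n^2) back-substitution

    In the paper's actual procedure the planform change perturbs more than the right-hand side, and the saving comes from retaining the unaffected parts of the base panel calculation.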

  13. Computer-Based Mathematics Instructions for Engineering Students

    NASA Technical Reports Server (NTRS)

    Khan, Mustaq A.; Wall, Curtiss E.

    1996-01-01

    Almost every engineering course involves mathematics in one form or another. The analytical process of developing mathematical models is very important for engineering students. However, the computational process involved in the solution of some mathematical problems may be very tedious and time consuming. There is a significant amount of mathematical software such as Mathematica, Mathcad, and Maple designed to aid in the solution of these instructional problems. The use of these packages in classroom teaching can greatly enhance understanding, and save time. Integration of computer technology in mathematics classes, without de-emphasizing the traditional analytical aspects of teaching, has proven very successful and is becoming almost essential. Sample computer laboratory modules are developed for presentation in the classroom setting. This is accomplished through the use of overhead projectors linked to graphing calculators and computers. Model problems are carefully selected from different areas.

  14. Immunity-Based Optimal Estimation Approach for a New Real Time Group Elevator Dynamic Control Application for Energy and Time Saving

    PubMed Central

    Baygin, Mehmet; Karakose, Mehmet

    2013-01-01

    Nowadays, the increasing use of group elevator control systems owing to increasing building heights makes the development of high-performance algorithms necessary in terms of time and energy saving. Although there are many studies in the literature on this topic, they are still not effective enough because they cannot evaluate all features of the system. In this paper, a new immune-system-based optimal estimation approach for the dynamic control of group elevator systems is studied. The method is mainly based on estimating the optimal route by optimizing all calls with genetic, immune system, and DNA computing algorithms, and the result is evaluated with a fuzzy system. The system is dynamic with respect to the state of the calls and the choice of the most appropriate algorithm, and it also adapts to parameters such as the number of floors and cabins. This new approach, which provides both time and energy savings, was carried out in real time. The experimental results comparatively demonstrate the effects of the method. With the dynamic and adaptive control approach developed in this study, significant progress over traditional group elevator control methods has been achieved in terms of time and energy efficiency. PMID:23935433

  15. Thermal radiation view factor: Methods, accuracy and computer-aided procedures

    NASA Technical Reports Server (NTRS)

    Kadaba, P. V.

    1982-01-01

    Computer-aided thermal analysis programs that predict whether orbiting equipment will remain within a predetermined acceptable temperature range, in various attitudes with respect to the Sun and the Earth, are examined. The complexity of the surface geometries suggests the use of numerical schemes for the determination of these view factors. Basic definitions and standard methods that form the basis for various digital computer methods, together with various numerical methods, are presented. The physical model and the mathematical methods on which a number of available programs are built are summarized. The strengths and weaknesses of the methods employed, the accuracy of the calculations, and the time required for computations are evaluated. The situations where accuracies are important for energy calculations are identified, and methods to save computational time are proposed. A guide to the best use of the available programs at several centers and future choices for efficient use of digital computers are included in the recommendations.
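
    A minimal sketch of one standard numerical scheme for view factors, Monte Carlo evaluation of the double-area integral F12 = (1/A1) ∫∫ cos(theta1) cos(theta2)/(pi r^2) dA2 dA1, here for two coaxial parallel unit squares (the geometry and sample count are illustrative):

        import numpy as np

        def viewfactor_parallel_squares(h, n=200_000, seed=2):
            """Monte Carlo estimate of F12 between two coaxial parallel unit
            squares separated by distance h."""
            rng = np.random.default_rng(seed)
            p1 = rng.random((n, 2))                 # sample points on square 1 (z = 0)
            p2 = rng.random((n, 2))                 # sample points on square 2 (z = h)
            d = p2 - p1
            r2 = d[:, 0]**2 + d[:, 1]**2 + h**2     # squared distance between point pairs
            cos1 = cos2 = h / np.sqrt(r2)           # both normals along z for parallel plates
            return np.mean(cos1 * cos2 / (np.pi * r2))  # A2 = 1, so the estimator is the mean

        print(viewfactor_parallel_squares(1.0))     # ~0.1998, the tabulated value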

  16. QIKAIM, a fast seminumerical algorithm for the generation of minute-of-arc accuracy satellite predictions

    NASA Astrophysics Data System (ADS)

    Vermeer, M.

    1981-07-01

    A program was designed to replace AIMLASER for the generation of aiming predictions, to achieve a major saving in computing time, and to keep the program small enough for use even on small systems. An approach was adopted that incorporates numerical integration of the orbit through a pass, limiting the computation of osculating elements to only one point per pass. The numerical integration method, which is fourth order in delta t in the cumulative error after a given time lapse, is presented. Algorithms are explained, and a flowchart and listing of the program are provided.
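
    QIKAIM's seminumerical algorithm is specific to the program, but the core ingredient, fourth-order numerical integration of the orbit through a pass, looks schematically like the classical Runge-Kutta sketch below (two-body gravity only; all values are illustrative, not QIKAIM's algorithm):

        import numpy as np

        MU = 398600.4418  # km^3/s^2, Earth's gravitational parameter

        def accel(r):
            return -MU * r / np.linalg.norm(r)**3   # two-body point-mass acceleration

        def rk4_step(r, v, dt):
            """One classical fourth-order Runge-Kutta step of the two-body problem."""
            k1r, k1v = v, accel(r)
            k2r, k2v = v + 0.5*dt*k1v, accel(r + 0.5*dt*k1r)
            k3r, k3v = v + 0.5*dt*k2v, accel(r + 0.5*dt*k2r)
            k4r, k4v = v + dt*k3v, accel(r + dt*k3r)
            r_new = r + dt/6 * (k1r + 2*k2r + 2*k3r + k4r)
            v_new = v + dt/6 * (k1v + 2*k2v + 2*k3v + k4v)
            return r_new, v_new

        r = np.array([7000.0, 0.0, 0.0])   # km, stand-in state at the start of a pass
        v = np.array([0.0, 7.546, 0.0])    # km/s, near-circular LEO speed
        for _ in range(600):               # integrate 10 minutes of the pass at 1 s steps
            r, v = rk4_step(r, v, 1.0)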

  17. Drone Control System

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Drones, subscale vehicles like the Firebees, and full-scale retired military aircraft are used to test air defense missile systems. The DFCS (Drone Formation Control System) computer, developed by IBM (International Business Machines) Federal Systems Division, can track ten drones at once. A program called ORACLS, originally developed by Langley and supplied by COSMIC (Computer Software Management and Information Center), is used to generate software to track and control the drones. The program saved the company both time and money.

  18. Plugging into Energy Savings.

    ERIC Educational Resources Information Center

    Harrigan, Merrilee

    1999-01-01

    The nonprofit Alliance to Save Energy has been helping schools reduce energy consumption through a combination of retrofits, classroom instruction, and behavior. Lists eight small steps to big energy savings, among them: involve the whole school, stop leaks, turn off computers, and recycle. (MLF)

  19. 75 FR 20111 - Energy Conservation Program: Energy Conservation Standards for Residential Water Heaters, Direct...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-16

    ... the ``three heating products'') must be designed to ``achieve the maximum improvement in energy... and CO 2 savings are performed with different computer models, leading to different time frames for... of EPCA sets forth a variety of provisions designed to improve energy efficiency. Part A\\1\\ of Title...

  20. Electronic Mail Is One High-Tech Management Tool that Really Delivers.

    ERIC Educational Resources Information Center

    Parker, Donald C.

    1987-01-01

    Describes an electronic mail system used by the Horseheads (New York) Central School District's eight schools and central office that saves time and enhances productivity. This software calls up information from the district's computer network and sends it to other users' special files--electronic "mailboxes" set aside for messages and…

  1. Meeting the needs of an ever-demanding market.

    PubMed

    Rigby, Richard

    2002-04-01

    Balancing cost and performance in packaging is critical. This article outlines techniques to assist in this whilst delivering added value and product differentiation. The techniques include a rigorous statistical process capable of delivering cost reduction and improved quality and a computer modelling process that can save time when validating new packaging options.

  2. 31 CFR 332.5 - Limitation on holdings.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., DEPARTMENT OF THE TREASURY BUREAU OF THE FISCAL SERVICE OFFERING OF UNITED STATES SAVINGS BONDS, SERIES H § 332.5 Limitation on holdings. The amount of Series H bonds, originally issued during any one calendar year, that could be held by any one person, at any one time, computed in accordance with the governing...

  3. 31 CFR 332.5 - Limitation on holdings.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., DEPARTMENT OF THE TREASURY BUREAU OF THE PUBLIC DEBT OFFERING OF UNITED STATES SAVINGS BONDS, SERIES H § 332.5 Limitation on holdings. The amount of Series H bonds, originally issued during any one calendar year, that could be held by any one person, at any one time, computed in accordance with the governing...

  4. 31 CFR 332.5 - Limitation on holdings.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., DEPARTMENT OF THE TREASURY BUREAU OF THE PUBLIC DEBT OFFERING OF UNITED STATES SAVINGS BONDS, SERIES H § 332.5 Limitation on holdings. The amount of Series H bonds, originally issued during any one calendar year, that could be held by any one person, at any one time, computed in accordance with the governing...

  5. 31 CFR 332.5 - Limitation on holdings.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., DEPARTMENT OF THE TREASURY BUREAU OF THE PUBLIC DEBT OFFERING OF UNITED STATES SAVINGS BONDS, SERIES H § 332.5 Limitation on holdings. The amount of Series H bonds, originally issued during any one calendar year, that could be held by any one person, at any one time, computed in accordance with the governing...

  6. 31 CFR 332.5 - Limitation on holdings.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., DEPARTMENT OF THE TREASURY BUREAU OF THE PUBLIC DEBT OFFERING OF UNITED STATES SAVINGS BONDS, SERIES H § 332.5 Limitation on holdings. The amount of Series H bonds, originally issued during any one calendar year, that could be held by any one person, at any one time, computed in accordance with the governing...

  7. Design of a fault tolerant airborne digital computer. Volume 2: Computational requirements and technology

    NASA Technical Reports Server (NTRS)

    Ratner, R. S.; Shapiro, E. B.; Zeidler, H. M.; Wahlstrom, S. E.; Clark, C. B.; Goldberg, J.

    1973-01-01

    This final report summarizes the work on the design of a fault tolerant digital computer for aircraft. Volume 2 is composed of two parts. Part 1 is concerned with the computational requirements associated with an advanced commercial aircraft. Part 2 reviews the technology that will be available for the implementation of the computer in the 1975-1985 period. With regard to the computational task, 26 computations have been categorized according to computational load, memory requirements, criticality, permitted down-time, and the need to save data in order to effect a roll-back. The technology part stresses the impact of large scale integration (LSI) on the realization of logic and memory. Module interconnection possibilities were also considered, so as to minimize fault propagation.

  8. Analysis of Application Power and Schedule Composition in a High Performance Computing Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elmore, Ryan; Gruchalla, Kenny; Phillips, Caleb

    As the capacity of high performance computing (HPC) systems continues to grow, small changes in energy management have the potential to produce significant energy savings. In this paper, we employ an extensive informatics system for aggregating and analyzing real-time performance and power use data to evaluate energy footprints of jobs running in an HPC data center. We look at the effects of algorithmic choices for a given job on the resulting energy footprints, analyze application-specific power consumption, and summarize average power use in the aggregate. All of these views reveal meaningful power variance between classes of applications as well as chosen methods for a given job. Using these data, we discuss energy-aware cost-saving strategies based on reordering the HPC job schedule. Using historical job and power data, we present a hypothetical job schedule reordering that: (1) reduces the facility's peak power draw and (2) manages power in conjunction with a large-scale photovoltaic array. Lastly, we leverage this data to understand the practical limits on predicting key power use metrics at the time of submission.
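
    A toy sketch of the schedule-reordering idea: given jobs with known average power draws, a greedy heuristic places the most power-hungry job into the currently lightest time slot to flatten the facility's peak draw. Job names and numbers are hypothetical, and the paper's reordering additionally uses historical data and photovoltaic output.

        # Hypothetical job list: (name, avg_power_kW); slots are equal-length windows.
        jobs = [("lattice-qcd", 310), ("cfd", 220), ("genomics", 180),
                ("ml-train", 260), ("post-proc", 90), ("viz", 60)]
        n_slots = 3
        load = [0.0] * n_slots
        schedule = {s: [] for s in range(n_slots)}

        # Greedy heuristic: place the most power-hungry remaining job into the
        # currently lightest slot, flattening the peak facility draw.
        for name, p in sorted(jobs, key=lambda j: -j[1]):
            s = min(range(n_slots), key=load.__getitem__)
            load[s] += p
            schedule[s].append(name)

        print("per-slot power (kW):", load, "-> peak:", max(load))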

  9. Cogeneration Technology Alternatives Study (CTAS). Volume 6: Computer data. Part 1: Coal-fired nocogeneration process boiler, section A

    NASA Technical Reports Server (NTRS)

    Knightly, W. F.

    1980-01-01

    Various advanced energy conversion systems (ECS) are compared with each other and with current technology systems for their savings in fuel energy, costs, and emissions in individual plants and on a national level. About fifty industrial processes from the largest energy consuming sectors were used as a basis for matching a similar number of energy conversion systems that are considered as candidates which can be made available by the 1985 to 2000 time period. The sectors considered included food, textiles, lumber, paper, chemicals, petroleum, glass, and primary metals. The energy conversion systems included steam and gas turbines, diesels, thermionics, stirling, closed cycle and steam injected gas turbines, and fuel cells. Fuels considered were coal, both coal and petroleum based residual and distillate liquid fuels, and low Btu gas obtained through the on-site gasification of coal. Computer generated reports of the fuel consumption and savings, capital costs, economics and emissions of the cogeneration energy conversion systems (ECS's) heat and power matched to the individual industrial processes are presented for coal fired process boilers. National fuel and emissions savings are also reported for each ECS assuming it alone is implemented.

  10. Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Tengfang; Flapper, Joris; Ke, Jing

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry – including four dairy processes – cheese, fluid milk, butter, and milk powder.

  11. Cost-Benefit Analysis of a Support Program for Nursing Staff.

    PubMed

    Moran, Dane; Wu, Albert W; Connors, Cheryl; Chappidi, Meera R; Sreedhara, Sushama K; Selter, Jessica H; Padula, William V

    2017-04-27

    A peer-support program called Resilience In Stressful Events (RISE) was designed to help hospital staff cope with stressful patient-related events. The aim of this study was to evaluate the impact of the RISE program by conducting an economic evaluation of its cost benefit. A Markov model with a 1-year time horizon was developed to compare the cost benefit with and without the RISE program from a provider (hospital) perspective. Nursing staff who used the RISE program between 2015 and 2016 at a 1000-bed, private hospital in the United States were included in the analysis. The cost of running the RISE program, nurse turnover, and nurse time off were modeled. Data on costs were obtained from literature review and hospital data. Probabilities of quitting or taking time off with or without the RISE program were estimated using survey data. Net monetary benefit (NMB) and budget impact of having the RISE program were computed to determine cost benefit to the hospital. Expected model results of the RISE program found a net monetary benefit savings of US $22,576.05 per nurse who initiated a RISE call. These savings were determined to be 99.9% consistent on the basis of a probabilistic sensitivity analysis. The budget impact analysis revealed that a hospital could save US $1.81 million each year because of the RISE program. The RISE program resulted in substantial cost savings to the hospital. Hospitals should be encouraged by these findings to implement institution-wide support programs for medical staff, based on a high demand for this type of service and the potential for cost savings.
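
    Once the Markov model has produced the per-call net monetary benefit, the budget impact is simple arithmetic. In the sketch below, only the per-call NMB and the resulting annual figure come from the study; the annual call volume is a hypothetical value chosen to be consistent with them.

        nmb_per_call = 22_576.05   # US$ NMB per nurse initiating a RISE call (from the study)
        calls_per_year = 80        # hypothetical call volume consistent with the reported budget impact
        budget_impact = nmb_per_call * calls_per_year
        print(f"annual savings ~ US${budget_impact:,.0f}")   # ~US$1.81 million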

  12. A high-order strong stability preserving Runge-Kutta method for three-dimensional full waveform modeling and inversion of anelastic models

    NASA Astrophysics Data System (ADS)

    Wang, N.; Shen, Y.; Yang, D.; Bao, X.; Li, J.; Zhang, W.

    2017-12-01

    Accurate and efficient forward modeling methods are important for high resolution full waveform inversion. Compared with the elastic case, solving the anelastic wave equation requires more computational time because of the need to compute additional material-independent anelastic functions. A numerical scheme with a large Courant-Friedrichs-Lewy (CFL) condition number enables us to use a large time step to simulate wave propagation, which improves computational efficiency. In this work, we apply the fourth-order strong stability preserving Runge-Kutta method with an optimal CFL coefficient to solve the anelastic wave equation. We use a fourth order DRP/opt MacCormack scheme for the spatial discretization, and we approximate the rheological behavior of the Earth by using the generalized Maxwell body model. With a larger CFL condition number, we find that the computational efficiency is significantly improved compared with the traditional fourth-order Runge-Kutta method. Then, we apply the scattering-integral method for calculating travel time and amplitude sensitivity kernels with respect to velocity and attenuation structures. For each source, we carry out one forward simulation and save the time-dependent strain tensor. For each station, we carry out three `backward' simulations for the three components and save the corresponding strain tensors. The sensitivity kernels at each point in the medium are the convolution of the two sets of strain tensors. Finally, we show several synthetic tests to verify the effectiveness of the strong stability preserving Runge-Kutta method in generating accurate synthetics in full waveform modeling, and in generating accurate strain tensors for calculating sensitivity kernels at regional and global scales.
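
    The paper's time integrator is a fourth-order strong stability preserving Runge-Kutta method; as a compact stand-in, the widely used third-order SSP scheme of Shu and Osher shows the convex-combination-of-Euler-steps structure that gives this family its large stable time steps (the advection test problem is illustrative):

        import numpy as np

        def ssprk3_step(u, dt, L):
            """One third-order strong-stability-preserving RK step (Shu-Osher form)
            for du/dt = L(u); each stage is a convex combination of Euler steps."""
            u1 = u + dt * L(u)
            u2 = 0.75 * u + 0.25 * (u1 + dt * L(u1))
            return u / 3.0 + 2.0 / 3.0 * (u2 + dt * L(u2))

        # Example: linear advection u_t = -c u_x on a periodic grid, upwind differences.
        n, dx, c = 200, 1.0 / 200, 1.0
        L = lambda u: -c * (u - np.roll(u, 1)) / dx
        u = np.exp(-100 * (np.linspace(0, 1, n, endpoint=False) - 0.5) ** 2)
        for _ in range(200):
            u = ssprk3_step(u, 0.9 * dx / c, L)    # CFL number 0.9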

  13. Cloud computing task scheduling strategy based on improved differential evolution algorithm

    NASA Astrophysics Data System (ADS)

    Ge, Junwei; He, Qian; Fang, Yiqiu

    2017-04-01

    In order to optimize the cloud computing task scheduling scheme, an improved differential evolution algorithm for cloud computing task scheduling is proposed. First, a cloud computing task scheduling model is established and a fitness function is derived from it; the improved differential evolution algorithm then optimizes this fitness function, using a generation-dependent dynamic selection strategy and a dynamic mutation strategy to ensure both global and local search ability. A performance test experiment was carried out on the CloudSim simulation platform, and the experimental results show that the improved differential evolution algorithm can reduce cloud computing task execution time, save user cost, and achieve good optimal scheduling of cloud computing tasks.
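
    For reference, a minimal sketch of standard differential evolution (DE/rand/1/bin) minimizing makespan for a task-to-VM assignment; the paper's specific improvements, the generation-dependent selection and mutation strategies, are not reproduced, and all problem sizes are hypothetical.

        import numpy as np

        rng = np.random.default_rng(3)
        n_tasks, n_vms = 30, 5
        length = rng.uniform(1e3, 1e4, n_tasks)   # hypothetical task lengths (MI)
        speed = rng.uniform(500, 2000, n_vms)     # hypothetical VM speeds (MIPS)

        def makespan(x):
            """Decode a continuous vector into a task->VM assignment and score it."""
            vm = (np.clip(x, 0, 0.9999) * n_vms).astype(int)
            return max(length[vm == j].sum() / speed[j] for j in range(n_vms))

        NP, F, CR, gens = 40, 0.5, 0.9, 200
        pop = rng.random((NP, n_tasks))
        fit = np.array([makespan(p) for p in pop])

        for _ in range(gens):
            for i in range(NP):
                a, b, c = pop[rng.choice([k for k in range(NP) if k != i], 3, replace=False)]
                mutant = a + F * (b - c)                 # DE/rand/1 mutation
                cross = rng.random(n_tasks) < CR
                cross[rng.integers(n_tasks)] = True      # binomial crossover, >= 1 gene
                trial = np.where(cross, mutant, pop[i])
                f = makespan(trial)
                if f <= fit[i]:                          # greedy one-to-one selection
                    pop[i], fit[i] = trial, f

        print("best makespan (s):", fit.min())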

  14. [Use of cyber library and digital tools are crucial for academic surgeons].

    PubMed

    Tomizawa, Yasuko

    2010-10-01

    In addition to busy clinical work, an academic surgeon has to spend a lot of time and effort writing and submitting articles to scientific journals, teaching young surgical trainees to write articles, organizing and updating his or her academic record in the curriculum vitae, and writing research grant applications. The use of a cyber library and commercially available computer software is useful for saving time and effort.

  15. System Level Applications of Adaptive Computing (SLAAC)

    DTIC Science & Technology

    2003-11-01

    …saved clock cycles, as the computation cycle time was directly proportional to the number of bitplanes in the image. The simulation was undertaken in… [Figure 3: PPI algorithm architecture.] …parallel processing of data. The total throughput in these extended architectures is directly proportional to the amount of resources (CLB slices…

  16. Software for simulation of a computed tomography imaging spectrometer using optical design software

    NASA Astrophysics Data System (ADS)

    Spuhler, Peter T.; Willer, Mark R.; Volin, Curtis E.; Descour, Michael R.; Dereniak, Eustace L.

    2000-11-01

    Our Imaging Spectrometer Simulation Software known under the name Eikon should improve and speed up the design of a Computed Tomography Imaging Spectrometer (CTIS). Eikon uses existing raytracing software to simulate a virtual instrument. Eikon enables designers to virtually run through the design, calibration and data acquisition, saving significant cost and time when designing an instrument. We anticipate that Eikon simulations will improve future designs of CTIS by allowing engineers to explore more instrument options.

  17. Thermoelectric property measurements with computer controlled systems

    NASA Technical Reports Server (NTRS)

    Chmielewski, A. B.; Wood, C.

    1984-01-01

    A joint JPL-NASA program to develop an automated system to measure the thermoelectric properties of newly developed materials is described. Consideration is given to the difficulties created by signal drift in measurements of Hall voltage and the Large Delta T Seebeck coefficient. The benefits of a computerized system were examined with respect to error reduction and time savings for human operators. It is shown that the time required to measure Hall voltage can be reduced by a factor of 10 when a computer is used to fit a curve to the ratio of the measured signal and its standard deviation. The accuracy of measurements of the Large Delta T Seebeck coefficient and thermal diffusivity was also enhanced by the use of computers.

  18. Dynamic Computation Offloading for Low-Power Wearable Health Monitoring Systems.

    PubMed

    Kalantarian, Haik; Sideris, Costas; Mortazavi, Bobak; Alshurafa, Nabil; Sarrafzadeh, Majid

    2017-03-01

    The objective of this paper is to describe and evaluate an algorithm to reduce power usage and increase battery lifetime for wearable health-monitoring devices. We describe a novel dynamic computation offloading scheme for real-time wearable health monitoring devices that adjusts the partitioning of data processing between the wearable device and mobile application as a function of desired classification accuracy. By making the correct offloading decision based on current system parameters, we show that we are able to reduce system power by as much as 20%. We demonstrate that computation offloading can be applied to real-time monitoring systems, and yields significant power savings. Making correct offloading decisions for health monitoring devices can extend battery life and improve adherence.
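
    The offloading decision can be framed as comparing energy estimates subject to an accuracy floor. The sketch below is a hedged illustration of that framing with hypothetical parameters, not the authors' model:

        def should_offload(n_samples, e_cpu_per_sample, e_radio_per_byte,
                           bytes_per_sample, acc_local, acc_remote, acc_required):
            """Return True if transmitting raw data for remote classification costs
            less energy than classifying on the wearable, subject to an accuracy floor."""
            e_local = n_samples * e_cpu_per_sample
            e_offload = n_samples * bytes_per_sample * e_radio_per_byte
            if acc_local >= acc_required and acc_remote >= acc_required:
                return e_offload < e_local       # both meet accuracy: pick the cheaper
            return acc_remote >= acc_required    # otherwise take the one that qualifies

        # Hypothetical numbers: 100 Hz sensor, 1 s window, 4-byte samples.
        print(should_offload(100, 120e-6, 2e-6, 4, 0.92, 0.95, 0.9))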

  19. Computational problems and signal processing in SETI

    NASA Technical Reports Server (NTRS)

    Deans, Stanley R.; Cullers, D. K.; Stauduhar, Richard

    1991-01-01

    The Search for Extraterrestrial Intelligence (SETI), currently being planned at NASA, will require that an enormous amount of data (on the order of 10^11 distinct signal paths for a typical observation) be analyzed in real time by special-purpose hardware. Even though the SETI system design is not based on maximum entropy and Bayesian methods (partly due to the real-time processing constraint), it is expected that enough data will be saved to be able to apply these and other methods off line where computational complexity is not an overriding issue. Interesting computational problems that relate directly to the system design for processing such an enormous amount of data have emerged. Some of these problems are discussed, along with the current status on their solution.

  20. SPAN: Ocean science

    NASA Technical Reports Server (NTRS)

    Thomas, Valerie L.; Koblinsky, Chester J.; Webster, Ferris; Zlotnicki, Victor; Green, James L.

    1987-01-01

    The Space Physics Analysis Network (SPAN) is a multi-mission, correlative data comparison network which links space and Earth science research and data analysis computers. It provides a common working environment for sharing computer resources, sharing computer peripherals, solving proprietary problems, and providing the potential for significant time and cost savings for correlative data analysis. This is one of a series of discipline-specific SPAN documents which are intended to complement the SPAN primer and SPAN Management documents. Their purpose is to provide the discipline scientists with a comprehensive set of documents to assist in the use of SPAN for discipline specific scientific research.

  1. 12 CFR 502.28 - How does OTS determine the organizational form component for a savings and loan holding company?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... component for a savings and loan holding company? 502.28 Section 502.28 Banks and Banking OFFICE OF THRIFT SUPERVISION, DEPARTMENT OF THE TREASURY ASSESSMENTS AND FEES Assessments Savings and Loan Holding Companies... savings and loan holding company that OTS regulates under section 10(l) of the HOLA. OTS will compute your...

  2. Computing with a single qubit faster than the computation quantum speed limit

    NASA Astrophysics Data System (ADS)

    Sinitsyn, Nikolai A.

    2018-02-01

    The possibility to save and process information in fundamentally indistinguishable states is the quantum mechanical resource that is not encountered in classical computing. I demonstrate that, if energy constraints are imposed, this resource can be used to accelerate information-processing without relying on entanglement or any other type of quantum correlations. In fact, there are computational problems that can be solved much faster, in comparison to currently used classical schemes, by saving intermediate information in nonorthogonal states of just a single qubit. There are also error correction strategies that protect such computations.

  3. Neural Network Training by Integration of Adjoint Systems of Equations Forward in Time

    NASA Technical Reports Server (NTRS)

    Toomarian, Nikzad (Inventor); Barhen, Jacob (Inventor)

    1999-01-01

    A method and apparatus for supervised neural learning of time dependent trajectories exploits the concepts of adjoint operators to enable computation of the gradient of an objective functional with respect to the various parameters of the network architecture in a highly efficient manner. Specifically, it combines the advantage of dramatic reductions in computational complexity inherent in adjoint methods with the ability to solve two adjoint systems of equations together forward in time. Not only is a large amount of computation and storage saved, but the handling of real-time applications also becomes possible. The invention has been applied to two examples of representative complexity which have recently been analyzed in the open literature, demonstrating that a circular trajectory can be learned in approximately 200 iterations compared to the 12000 reported in the literature. A figure eight trajectory was achieved in under 500 iterations compared to 20000 previously required. The trajectories computed using our new method are much closer to the target trajectories than was reported in previous studies.

  4. Neural network training by integration of adjoint systems of equations forward in time

    NASA Technical Reports Server (NTRS)

    Toomarian, Nikzad (Inventor); Barhen, Jacob (Inventor)

    1992-01-01

    A method and apparatus for supervised neural learning of time dependent trajectories exploits the concepts of adjoint operators to enable computation of the gradient of an objective functional with respect to the various parameters of the network architecture in a highly efficient manner. Specifically, it combines the advantage of dramatic reductions in computational complexity inherent in adjoint methods with the ability to solve two adjoint systems of equations together forward in time. Not only is a large amount of computation and storage saved, but the handling of real-time applications also becomes possible. The invention has been applied to two examples of representative complexity which have recently been analyzed in the open literature, demonstrating that a circular trajectory can be learned in approximately 200 iterations compared to the 12000 reported in the literature. A figure eight trajectory was achieved in under 500 iterations compared to 20000 previously required. The trajectories computed using our new method are much closer to the target trajectories than was reported in previous studies.

  5. Soft-copy sonography: cost reduction sensitivity analysis in a pediatric hospital.

    PubMed

    Don, S; Albertina, M J; Ammann, D

    1998-03-01

    Our objective was to determine whether interpreting sonograms of pediatric patients using soft-copy (computer workstation) instead of laser-printed film could reduce costs for a pediatric radiology department. We used theoretic models of growth to analyze costs. The costs of a sonographic picture archiving and communication system (three interface devices, two workstations, a network server, maintenance expenses, and storage media costs) were compared with the potential savings of eliminating film and increasing technologist efficiency or reducing the number of technologists. The model was based on historic trends and future capitation estimates that will reduce fee-for-service reimbursement. The effects of varying the study volume and reducing technologists' work hours were analyzed. By converting to soft-copy interpretation, we saved 6 min 32 sec per examination by eliminating film processing waiting time, thus reducing examination time from 30 min to 24 min. During an average day of 27 examinations, 176 min were saved. However, 33 min a day were spent retrieving prior studies from long-term storage; thus, 143 extra minutes a day were available for scanning. This improved efficiency could result in five more sonograms a day obtained by converting to soft-copy interpretation, using existing staff and equipment. Alternatively, five examinations a day would equate to one half of a full-time equivalent technologist's position. Our cost analysis, which accounted for the hospital's anticipated growth in sonography and 5-year equipment depreciation, showed a savings of more than $606,000. Increasing volume by just 200 sonograms in the first year, with no further growth, resulted in a savings of more than $96,000. If the number of sonograms stayed constant, elimination of film printing alone resulted in a loss of approximately $157,000; reduction of one half of a full-time equivalent technologist's position would recuperate approximately $134,000 of that loss. Soft-copy sonography can save money through improved technologist efficiency, thereby increasing the number of sonograms obtained and revenue generated. If the number of sonograms does not increase, elimination of printing costs and reduction of staff technologists will not result in a savings.
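
    The throughput arithmetic in this abstract is easy to verify:

        save_per_exam_min = (6 * 60 + 32) / 60            # 6 min 32 s saved per examination
        exams_per_day = 27
        gross_saved = save_per_exam_min * exams_per_day   # ~176 min/day
        net_saved = gross_saved - 33                      # minus retrieval of prior studies
        extra_exams = net_saved // 24                     # soft-copy exams take 24 min
        print(round(gross_saved), round(net_saved), int(extra_exams))   # 176 143 5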

  6. Roth 401(k): asking the right questions.

    PubMed

    Joyner, James F

    2006-01-01

    Roth 401(k) provisions are a newly available feature of 401(k) plans. Roth 401(k) provisions are after-tax savings that generally are tax-free at the time of distribution. Questions arise for plan sponsors about whether the new feature is beneficial, and to whom, and what needs to be done if the plan sponsor decides to offer this provision to its employees. This article tries to answer some of those common questions, including a simple computational analysis to try to answer the important question of how much an employee-participant genuinely benefits from this savings approach. Some practical issues of implementation are touched on, and some unanswered questions are identified.

  7. a Recursive Approach to Compute Normal Forms

    NASA Astrophysics Data System (ADS)

    HSU, L.; MIN, L. J.; FAVRETTO, L.

    2001-06-01

    Normal forms are instrumental in the analysis of dynamical systems described by ordinary differential equations, particularly when singularities close to a bifurcation are to be characterized. However, the computation of a normal form up to an arbitrary order is numerically hard. This paper focuses on the computer programming of some recursive formulas developed earlier to compute higher order normal forms. A computer program to reduce the system to its normal form on a center manifold is developed using the Maple symbolic language. However, it should be stressed that the program relies essentially on recursive numerical computations, while symbolic calculations are used only for minor tasks. Some strategies are proposed to save computation time. Examples are presented to illustrate the application of the program to obtain high order normalization or to handle systems with large dimension.

  8. 31 CFR 359.51 - What book-entry Series I savings bonds are included in the computation?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false What book-entry Series I savings bonds are included in the computation? 359.51 Section 359.51 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FISCAL SERVICE, DEPARTMENT OF THE TREASURY BUREAU OF THE PUBLIC...

  9. 31 CFR 359.32 - What definitive Series I savings bonds are excluded from the computation?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false What definitive Series I savings bonds are excluded from the computation? 359.32 Section 359.32 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FISCAL SERVICE, DEPARTMENT OF THE TREASURY BUREAU OF THE PUBLIC...

  10. Efficient rejection-based simulation of biochemical reactions with stochastic noise and delays

    NASA Astrophysics Data System (ADS)

    Thanh, Vo Hong; Priami, Corrado; Zunino, Roberto

    2014-10-01

    We propose a new exact stochastic rejection-based simulation algorithm for biochemical reactions and extend it to systems with delays. Our algorithm accelerates the simulation by pre-computing reaction propensity bounds to select the next reaction to perform. Exploiting such bounds, we are able to avoid recomputing propensities every time a (delayed) reaction is initiated or finished, as is typically necessary in standard approaches. Propensity updates in our approach are still performed, but only infrequently and only for a small number of reactions, saving computation time without sacrificing exactness. We evaluate the performance improvement of our algorithm by experimenting with concrete biological models.
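
    The published algorithm derives per-reaction propensity bounds from intervals around the current state and refreshes them only occasionally. A simplified sketch of the underlying accept/reject (thinning) idea, exact whenever the fixed bound truly dominates the total propensity, is shown below; the birth-death example is hypothetical.

        import numpy as np

        def thinning_ssa(x0, propensities, stoich, a0_bound, t_end, seed=4):
            """Simulate a reaction network by thinning: propose events at the constant
            bound rate a0_bound, then accept with probability a0(state)/a0_bound.
            Exact as long as a0_bound >= total propensity in every reachable state."""
            rng = np.random.default_rng(seed)
            x, t = np.array(x0, dtype=float), 0.0
            while True:
                t += rng.exponential(1.0 / a0_bound)      # candidate event time
                if t >= t_end:
                    return x
                a = np.array([f(x) for f in propensities])
                if rng.random() < a.sum() / a0_bound:     # accept: a real reaction fires
                    j = rng.choice(len(a), p=a / a.sum())
                    x += stoich[j]

        # Hypothetical birth-death example: 0 -> X at rate 5, X -> 0 at rate 0.1*X.
        props = [lambda x: 5.0, lambda x: 0.1 * x[0]]
        stoich = np.array([[1.0], [-1.0]])
        print(thinning_ssa([0], props, stoich, a0_bound=40.0, t_end=100.0))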

  11. Web Tools: The Second Generation

    ERIC Educational Resources Information Center

    Pascopella, Angela

    2008-01-01

    Web 2.0 tools and technologies, or second generation tools, help districts to save time and money, and eliminate the need to transfer or move files back and forth across computers. Many Web 2.0 tools help students think critically and solve problems, which falls under the 21st-century skills. The second-generation tools are growing in popularity…

  12. Analysis of airborne antenna systems using geometrical theory of diffraction and moment method computer codes

    NASA Technical Reports Server (NTRS)

    Hartenstein, Richard G., Jr.

    1985-01-01

    Computer codes have been developed to analyze antennas on aircraft and in the presence of scatterers. The purpose of this study is to use these codes to develop accurate computer models of various aircraft and antenna systems. The antenna systems analyzed are a P-3B L-Band antenna, an A-7E UHF relay pod antenna, and traffic advisory antenna system installed on a Bell Long Ranger helicopter. Computer results are compared to measured ones with good agreement. These codes can be used in the design stage of an antenna system to determine the optimum antenna location and save valuable time and costly flight hours.

  13. A hybrid computer program for rapidly solving flowing or static chemical kinetic problems involving many chemical species

    NASA Technical Reports Server (NTRS)

    Mclain, A. G.; Rao, C. S. R.

    1976-01-01

    A hybrid chemical kinetic computer program was assembled which provides a rapid solution to problems involving flowing or static, chemically reacting, gas mixtures. The computer program uses existing subroutines for problem setup, initialization, and preliminary calculations and incorporates a stiff ordinary differential equation solution technique. A number of check cases were recomputed with the hybrid program and the results were almost identical to those previously obtained. The computational time saving was demonstrated with a propane-oxygen-argon shock tube combustion problem involving 31 chemical species and 64 reactions. Information is presented to enable potential users to prepare an input data deck for the calculation of a problem.

  14. Consequences of Common Topological Rearrangements for Partition Trees in Phylogenomic Inference.

    PubMed

    Chernomor, Olga; Minh, Bui Quang; von Haeseler, Arndt

    2015-12-01

    In phylogenomic analysis the collection of trees with identical score (maximum likelihood or parsimony score) may hamper tree search algorithms. Such collections are termed phylogenetic terraces. For sparse supermatrices with a lot of missing data, the number of terraces and the number of trees on the terraces can be very large. If terraces are not taken into account, a lot of computation time might be unnecessarily spent to evaluate many trees that in fact have identical score. To save computation time during the tree search, it is worthwhile to quickly identify such cases. The score of a species tree is the sum of the scores of all the so-called induced partition trees. Therefore, if the topological rearrangement applied to a species tree does not change the induced partition trees, the score of these partition trees is unchanged. Here, we provide the conditions under which the three most widely used topological rearrangements (nearest neighbor interchange, subtree pruning and regrafting, and tree bisection and reconnection) change the topologies of induced partition trees. During the tree search, these conditions allow us to quickly identify whether we can save computation time on the evaluation of newly encountered trees. We also introduce the concept of partial terraces and demonstrate that they occur more frequently than the original "full" terrace. Hence, the partial terrace is the more important source of time saving compared to the full terrace. Therefore, taking the above conditions and the partial terrace concept into account will help to speed up the tree search in phylogenomic inference.

  15. AIRSLUG: A fortran program for the computation of type curves to estimate transmissivity and storativity from prematurely terminated air-pressurized slug tests

    USGS Publications Warehouse

    Greene, E.A.; Shapiro, A.M.

    1998-01-01

    The Fortran code AIRSLUG can be used to generate the type curves needed to analyze the recovery data from prematurely terminated air-pressurized slug tests. These type curves, when used with a graphical software package, enable the engineer or scientist to analyze field tests to estimate transmissivity and storativity. Prematurely terminating the slug test can significantly reduce the overall time needed to conduct the test, especially at low-permeability sites, thus saving time and money.

  16. Automated constraint checking of spacecraft command sequences

    NASA Astrophysics Data System (ADS)

    Horvath, Joan C.; Alkalaj, Leon J.; Schneider, Karl M.; Spitale, Joseph M.; Le, Dang

    1995-01-01

    Robotic spacecraft are controlled by onboard sets of commands called "sequences." Determining that sequences will have the desired effect on the spacecraft can be expensive in terms of both labor and computer coding time, with different particular costs for different types of spacecraft. Specification languages and an appropriate user interface to those languages can be used to make the most effective use of engineering validation time. This paper describes one specification and verification environment ("SAVE") designed for validating that command sequences have not violated any flight rules. The SAVE system was subsequently adapted for flight use on the TOPEX/Poseidon spacecraft. The relationship of this work to rule-based artificial intelligence and to other specification techniques is discussed, as well as the issues that arise in the transfer of technology from a research prototype to a full flight system.

  17. Spacelab Mission Implementation Cost Assessment (SMICA)

    NASA Technical Reports Server (NTRS)

    Guynes, B. V.

    1984-01-01

    A total savings of approximately 20 percent is attainable if: (1) mission management and ground processing schedules are compressed; (2) the equipping, staffing, and operating of the Payload Operations Control Center are revised; and (3) methods of working with experiment developers are changed. The development of a new mission implementation technique, which includes mission definition, experiment development, and mission integration/operations, is examined. The Payload Operations Control Center is to relocate and utilize new computer equipment to produce cost savings. Methods of reducing costs by minimizing the Spacelab and payload processing time during pre- and post-mission operation at KSC are analyzed. The changes required to reduce costs in the analytical integration process are studied. The influence of time, requirements accountability, and risk on costs is discussed. Recommendations for cost reductions developed by the Spacelab Mission Implementation Cost Assessment study are listed.

  18. Light extraction efficiency analysis of GaN-based light-emitting diodes with nanopatterned sapphire substrates.

    PubMed

    Pan, Jui-Wen; Tsai, Pei-Jung; Chang, Kao-Der; Chang, Yung-Yuan

    2013-03-01

    In this paper, we propose a method to analyze the light extraction efficiency (LEE) enhancement of a nanopatterned sapphire substrate (NPSS) light-emitting diode (LED) by comparing wave optics software with ray optics software. Finite-difference time-domain (FDTD) simulations represent the wave optics software and Light Tools (LTs) simulations represent the ray optics software. First, we find the trends of, and an optimal solution for, the LEE enhancement when the 2D-FDTD simulations are used to save on simulation time and computational memory. The rigorous coupled-wave analysis method is utilized to explain the trend we get from the 2D-FDTD algorithm. The optimal solution is then applied in 3D-FDTD and LTs simulations. The results are similar, and the difference in LEE enhancement between the two simulations does not exceed 8.5% for the small LED chip area. More than 10^4 times less computational memory is used in the LTs simulation than in the 3D-FDTD simulation. Moreover, LEE enhancement from the side of the LED can be obtained in the LTs simulation. An actual-size NPSS LED is simulated using the LTs. The results show a more than 307% improvement in the total LEE enhancement of the NPSS LED with the optimal solution compared to the conventional LED.

  19. Parallel-vector unsymmetric Eigen-Solver on high performance computers

    NASA Technical Reports Server (NTRS)

    Nguyen, Duc T.; Jiangning, Qin

    1993-01-01

    The popular QR algorithm for solving for all eigenvalues of an unsymmetric matrix is reviewed. Among the basic components of the QR algorithm, it was concluded from this study that the reduction of an unsymmetric matrix to Hessenberg form (before applying the QR algorithm itself) can be done effectively by exploiting the vector speed and multiple processors offered by modern high-performance computers. Numerical examples of several test cases have indicated that the proposed parallel-vector algorithm for converting a given unsymmetric matrix to Hessenberg form offers computational advantages over the existing algorithm. The time saving obtained by the proposed method increases as the problem size increases.
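
    The Hessenberg reduction that precedes QR iteration is a standard similarity transform, so it preserves the eigenvalues while zeroing everything below the first subdiagonal. A quick sketch using a library routine (the paper's parallel-vector reduction itself is not reproduced):

        import numpy as np
        from scipy.linalg import hessenberg

        rng = np.random.default_rng(5)
        A = rng.standard_normal((200, 200))   # unsymmetric test matrix
        H, Q = hessenberg(A, calc_q=True)     # A = Q @ H @ Q.T with Q orthogonal

        # H is upper Hessenberg: exact zeros below the first subdiagonal ...
        assert np.allclose(np.tril(H, -2), 0)
        # ... and the similarity transform preserves the spectrum, so the
        # QR iteration can then be run on the much cheaper Hessenberg form.
        assert np.allclose(Q @ H @ Q.T, A)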

  20. Calculation of recirculating flow behind flame-holders

    NASA Astrophysics Data System (ADS)

    Zeng, Q.; Sheng, Y.; Zhou, Q.

    1985-10-01

    The suitability of the standard K-epsilon turbulence model for the numerical calculation of recirculating flow is discussed. Many computations of the recirculating flows behind bluff bodies used as flame-holders in the afterburners of aeroengines have been completed. A blocking-off method for treating the inclined walls of the flame-holder gives good results. In isothermal recirculating flows the flame-holder wall is assumed to be isolated. Therefore, it is possible to remove the inactive zone from the calculation domain in programming to save computer time. The computation for a V-shaped flame-holder exhibits an interesting phenomenon: the recirculation zone extends into the cavity of the flame-holder.

  1. Chemistry Research

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Philip Morris research center scientists use a computer program called CECTRP, for Chemical Equilibrium Composition and Transport Properties, to gain insight into the behavior of atoms as they progress along the reaction pathway. Use of the program lets the scientist accurately predict the behavior of a given molecule or group of molecules. Computer generated data must be checked by laboratory experiment, but the use of CECTRP saves the researchers hundreds of hours of laboratory time since experiments must run only to validate the computer's prediction. Philip Morris estimates that had CECTRP not been available, at least two man years would have been required to develop a program to perform similar free energy calculations.

  2. Cost savings from peritoneal dialysis therapy time extension using icodextrin.

    PubMed

    Johnson, David W; Vincent, Kaia; Blizzard, Sophie; Rumpsfeld, Markus; Just, Paul

    2003-01-01

    Previous retrospective studies have reported that icodextrin may prolong peritoneal dialysis (PD) treatment time in patients with refractory fluid overload (RFO). Because the annual cost of PD therapy is lower than that of hemodialysis (HD) therapy in Australia, we prospectively investigated the ability of icodextrin to prolong PD technique survival in patients with RFO. We used a computer model to estimate the savings associated with that therapeutic strategy, based on annual therapy costs determined in a regional PD and HD costing exercise. Patients who met standard criteria for RFO and who were otherwise to be converted immediately to HD, were asked to consent to an open-label assessment of the ability of icodextrin to delay the need to start HD. Time to conversion to HD was measured. The study enrolled 39 patients who were followed for a mean period of 1.1 years. Icodextrin significantly increased peritoneal ultrafiltration by a median value of 368 mL daily. It prolonged technique survival by a mean period of 1.21 years [95% confidence interval (CI): 0.80-1.62 years]. Extension of PD treatment time by icodextrin was particularly marked for patients who had ultrafiltration failure (UFF, n = 20), defined as net daily peritoneal ultrafiltration < 1 L daily (mean extension time: 1.70 years; 95% CI: 1.16-2.25 years). Overall, annualized savings were US$3,683 per patient per year. If just the patients with UFF were considered, the savings increased to US$4,893 per year. Icodextrin prolongs PD technique survival in patients with RFO, permitting them to continue on their preferred therapy. In Australia, that practice is highly cost-effective, particularly in individuals with UFF.

  3. A regression-based approach to estimating retrofit savings using the Building Performance Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walter, Travis; Sohn, Michael D.

    Retrofitting building systems is known to provide cost-effective energy savings. This article addresses how the Building Performance Database is used to help identify potential savings. Currently, prioritizing retrofits and computing their expected energy savings and cost/benefits can be a complicated, costly, and an uncertain effort. Prioritizing retrofits for a portfolio of buildings can be even more difficult if the owner must determine different investment strategies for each of the buildings. Meanwhile, we are seeing greater availability of data on building energy use, characteristics, and equipment. These data provide opportunities for the development of algorithms that link building characteristics and retrofits empirically. In this paper we explore the potential of using such data for predicting the expected energy savings from equipment retrofits for a large number of buildings. We show that building data with statistical algorithms can provide savings estimates when detailed energy audits and physics-based simulations are not cost- or time-feasible. We develop a multivariate linear regression model with numerical predictors (e.g., operating hours, occupant density) and categorical indicator variables (e.g., climate zone, heating system type) to predict energy use intensity. The model quantifies the contribution of building characteristics and systems to energy use, and we use it to infer the expected savings when modifying particular equipment. We verify the model using residual analysis and cross-validation. We demonstrate the retrofit analysis by providing a probabilistic estimate of energy savings for several hypothetical building retrofits. We discuss the ways understanding the risk associated with retrofit investments can inform decision making. The contributions of this work are the development of a statistical model for estimating energy savings, its application to a large empirical building dataset, and a discussion of its use in informing building retrofit decisions.
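
    A compact sketch of the modeling approach on a synthetic stand-in for the Building Performance Database (all column names and coefficients are hypothetical): regress energy use intensity on numeric and one-hot categorical predictors, then difference predictions with an equipment indicator toggled to estimate the expected retrofit savings.

        import numpy as np
        import pandas as pd
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(6)
        n = 1000
        df = pd.DataFrame({
            "op_hours": rng.uniform(40, 120, n),       # weekly operating hours
            "occ_density": rng.uniform(1, 10, n),      # occupants per 100 m^2
            "climate": rng.choice(["hot", "mild", "cold"], n),
            "old_boiler": rng.integers(0, 2, n),       # equipment indicator
        })
        # Synthetic EUI with a known 12-unit penalty for the old boiler type.
        df["eui"] = (0.5 * df.op_hours + 3 * df.occ_density + 12 * df.old_boiler
                     + df.climate.map({"hot": 20, "mild": 5, "cold": 25})
                     + rng.normal(0, 5, n))

        X = pd.get_dummies(df.drop(columns="eui"), columns=["climate"], dtype=float)
        model = LinearRegression().fit(X, df.eui)

        before = X[X["old_boiler"] == 1].head(50).copy()
        after = before.assign(old_boiler=0)            # hypothetical boiler retrofit
        savings = model.predict(before) - model.predict(after)
        print(savings.mean())                          # recovers ~12 EUI units saved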

  4. Study of effects of injector geometry on fuel-air mixing and combustion

    NASA Technical Reports Server (NTRS)

    Bangert, L. H.; Roach, R. L.

    1977-01-01

    An implicit finite-difference method has been developed for computing the flow in the near field of a fuel injector as part of a broader study of the effects of fuel injector geometry on fuel-air mixing and combustion. Detailed numerical results have been obtained for cases of laminar and turbulent flow without base injection, corresponding to the supersonic base flow problem. These numerical results indicated that the method is stable and convergent, and that significant savings in computer time can be achieved, compared with explicit methods.

  5. An Automated Motion Detection and Reward System for Animal Training.

    PubMed

    Miller, Brad; Lim, Audrey N; Heidbreder, Arnold F; Black, Kevin J

    2015-12-04

    A variety of approaches has been used to minimize head movement during functional brain imaging studies in awake laboratory animals. Many laboratories expend substantial effort and time training animals to remain essentially motionless during such studies. We could not locate an "off-the-shelf" automated training system that suited our needs.  We developed a time- and labor-saving automated system to train animals to hold still for extended periods of time. The system uses a personal computer and modest external hardware to provide stimulus cues, monitor movement using commercial video surveillance components, and dispense rewards. A custom computer program automatically increases the motionless duration required for rewards based on performance during the training session but allows changes during sessions. This system was used to train cynomolgus monkeys (Macaca fascicularis) for awake neuroimaging studies using positron emission tomography (PET) and functional magnetic resonance imaging (fMRI). The automated system saved the trainer substantial time, presented stimuli and rewards in a highly consistent manner, and automatically documented training sessions. We have limited data to prove the training system's success, drawn from the automated records during training sessions, but we believe others may find it useful. The system can be adapted to a range of behavioral training/recording activities for research or commercial applications, and the software is freely available for non-commercial use.

  6. Using reconfigurable hardware to accelerate multiple sequence alignment with ClustalW.

    PubMed

    Oliver, Tim; Schmidt, Bertil; Nathan, Darran; Clemens, Ralf; Maskell, Douglas

    2005-08-15

    Aligning hundreds of sequences using progressive alignment tools such as ClustalW requires several hours on state-of-the-art workstations. We present a new approach to compute multiple sequence alignments in far shorter time using reconfigurable hardware. This results in an implementation of ClustalW with significant runtime savings on a standard off-the-shelf FPGA.

  7. Phase matrix induced symmetries for multiple scattering using the matrix operator method

    NASA Technical Reports Server (NTRS)

    Hitzfelder, S. J.; Kattawar, G. W.

    1973-01-01

    Entirely rigorous proofs of the symmetries induced by the phase matrix into the reflection and transmission operators used in the matrix operator theory are given. Results are obtained for multiple scattering in both homogeneous and inhomogeneous atmospheres. These results will be useful to researchers using the method since large savings in computer time and storage are obtainable.

  8. User’s Guide. To the Federal Insurance Administration’s 1978-1979 Flood Claims File for Computation of Depth-Damage Relationships.

    DTIC Science & Technology

    1981-12-01

    …reading a file either saved in a previous session or created as a result of the internal execution save file (described later). LOAD PFN loads… command is used to make new data retrievals. READ PFN: DIRECT ENTRY FROM A PREVIOUSLY SAVED FILE. This command bypasses the conventional terminal entry by… INTERNAL SAVE FILE: This command accesses a file created using the internal execution save file output option. Loading a file results in entering the…

  9. The impact of e-prescribing on prescriber and staff time in ambulatory care clinics: a time motion study.

    PubMed

    Hollingworth, William; Devine, Emily Beth; Hansen, Ryan N; Lawless, Nathan M; Comstock, Bryan A; Wilson-Norton, Jennifer L; Tharp, Kathleen L; Sullivan, Sean D

    2007-01-01

    Electronic prescribing has improved the quality and safety of care. One barrier preventing widespread adoption is the potential detrimental impact on workflow. We used time-motion techniques to compare prescribing times at three ambulatory care sites that used paper-based prescribing, desktop, or laptop e-prescribing. An observer timed all prescriber (n = 27) and staff (n = 42) tasks performed during a 4-hour period. At the sites with optional e-prescribing >75% of prescription-related events were performed electronically. Prescribers at e-prescribing sites spent less time writing, but time-savings were offset by increased computer tasks. After adjusting for site, prescriber and prescription type, e-prescribing tasks took marginally longer than hand written prescriptions (12.0 seconds; -1.6, 25.6 CI). Nursing staff at the e-prescribing sites spent longer on computer tasks (5.4 minutes/hour; 0.0, 10.7 CI). E-prescribing was not associated with an increase in combined computer and writing time for prescribers. If carefully implemented, e-prescribing will not greatly disrupt workflow.

  10. The economics and timing of preoperative antibiotics for orthopaedic procedures.

    PubMed

    Norman, B A; Bartsch, S M; Duggan, A P; Rodrigues, M B; Stuckey, D R; Chen, A F; Lee, B Y

    2013-12-01

    The efficacy of antibiotics in preventing surgical site infections (SSIs) depends on the timing of administration relative to the start of surgery. However, currently, both the timing of and recommendations for administration vary substantially. To determine how the economic value from the hospital perspective of preoperative antibiotics varies with the timing of administration for orthopaedic procedures. Computational decision and operational models were developed from the hospital perspective. Baseline analyses looked at current timing of administration, while additional analyses varied the timing of administration, compliance with recommended guidelines, and the goal time-interval. Beginning antibiotic administration within 0-30 min prior to surgery resulted in the lowest costs and SSIs. Operationally, linking to a pre-surgical activity, administering antibiotics prior to incision but after anaesthesia-ready time was optimal, as 92.1% of the time, antibiotics were administered in the optimal time-interval (0-30 min prior to incision). Improving administration compliance from 80% to 90% for this pre-surgical activity results in cost savings of $447 per year for a hospital performing 100 orthopaedic operations a year. This study quantifies the potential cost-savings when antibiotic administration timing is improved, which in turn can guide the amount hospitals should invest to address this issue.
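
    The economic core of such a model is a simple expected-value calculation. A hedged illustration: the SSI probabilities and treatment cost below are invented for the example; the paper's own model and inputs are what yield the $447 figure quoted above.

        # Expected annual SSI cost as a function of compliance with the optimal
        # administration window. All numbers here are assumptions.
        ops_per_year = 100
        p_ssi_in_window, p_ssi_outside = 0.010, 0.016   # assumed SSI risks
        cost_per_ssi = 25_000.0                          # assumed treatment cost

        def annual_ssi_cost(compliance):
            p = compliance * p_ssi_in_window + (1 - compliance) * p_ssi_outside
            return ops_per_year * p * cost_per_ssi

        saving = annual_ssi_cost(0.80) - annual_ssi_cost(0.90)
        print(f"expected annual saving from 80% -> 90% compliance: ${saving:,.0f}")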

  11. Performance Comparison of Mainframe, Workstations, Clusters, and Desktop Computers

    NASA Technical Reports Server (NTRS)

    Farley, Douglas L.

    2005-01-01

    A performance evaluation of a variety of computers frequently found in a scientific or engineering research environment was conducted using synthetic and application-program benchmarks. From a performance perspective, emerging commodity processors have superior performance relative to legacy mainframe computers. In many cases, the PC clusters exhibited performance comparable with traditional mainframe hardware when 8-12 processors were used. The main advantage of the PC clusters was their cost. Regardless of whether the clusters were built from new computers or created from retired computers, their performance-to-cost ratio was superior to that of the legacy mainframe computers. Finally, the typical annual maintenance cost of legacy mainframe computers is several times the cost of new equipment such as multiprocessor PC workstations. The savings from eliminating the annual maintenance fee on legacy hardware can result in a yearly increase in total computational capability for an organization.

  12. Cogeneration Technology Alternatives Study (CTAS). Volume 6: Computer data. Part 2: Residual-fired nocogeneration process boiler

    NASA Technical Reports Server (NTRS)

    Knightly, W. F.

    1980-01-01

    About fifty industrial processes from the largest energy-consuming sectors were used as a basis for matching a similar number of energy conversion systems considered to be candidates that could be made available in the 1985-to-2000 time period. The sectors considered included food, textiles, lumber, paper, chemicals, petroleum, glass, and primary metals. The energy conversion systems included steam and gas turbines, diesels, thermionics, Stirling engines, closed-cycle and steam-injected gas turbines, and fuel cells. The fuels considered were coal, both coal- and petroleum-based residual and distillate liquid fuels, and low-Btu gas obtained through on-site gasification of coal. Computer-generated reports of the fuel consumption and savings, capital costs, economics, and emissions of the cogeneration energy conversion systems (ECS's) heat- and power-matched to the individual industrial processes are presented. National fuel and emissions savings are also reported for each ECS assuming it alone is implemented. Two nocogeneration base cases are included: coal-fired and residual-fired process boilers.

  13. Comparison of joint space versus task force load distribution optimization for a multiarm manipulator system

    NASA Technical Reports Server (NTRS)

    Soloway, Donald I.; Alberts, Thomas E.

    1989-01-01

    It is often proposed that the redundancy in choosing a force distribution for multiple arms grasping a single object should be handled by minimizing a quadratic performance index. The performance index may be formulated in terms of joint torques or in terms of the Cartesian space force/torque applied to the body by the grippers. The former seeks to minimize power consumption while the latter minimizes body stresses. Because the cost functions are related to each other by a joint angle dependent transformation on the weight matrix, it might be argued that either method tends to reduce power consumption, but clearly the joint space minimization is optimal. A comparison of these two options is presented with consideration given to computational cost and power consumption. Simulation results using a two arm robot system are presented to show the savings realized by employing the joint space optimization. These savings are offset by additional complexity, computation time and in some cases processor power consumption.
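
    The "joint angle dependent transformation on the weight matrix" can be written out explicitly. In assumed notation (not necessarily the paper's), let f be the Cartesian force/torque wrench applied by the grippers, J(q) the manipulator Jacobian, and \tau = J(q)^{\mathsf T} f the joint torques; then

        C_{\text{joint}} = \tau^{\mathsf T} W \tau = f^{\mathsf T} \left( J(q)\, W\, J(q)^{\mathsf T} \right) f ,

    so the joint-space cost is a task-space quadratic form whose weight matrix J(q) W J(q)^{\mathsf T} varies with the arm configuration, which is exactly why the two minimizations agree only up to that joint-angle-dependent transformation.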

  14. Dynamic VMs placement for energy efficiency by PSO in cloud computing

    NASA Astrophysics Data System (ADS)

    Dashti, Seyed Ebrahim; Rahmani, Amir Masoud

    2016-03-01

    Recently, cloud computing has been growing fast and helping to realise other high technologies. In this paper, we propose a hierarchical architecture to satisfy both providers' and consumers' requirements in these technologies. We design a new service in the PaaS layer for scheduling consumer tasks. From the provider's perspective, the mismatch between physical machine specifications and user requests in the cloud leads to problems such as the energy-performance trade-off and large power consumption, so that profits decrease. To guarantee the quality of service of users' tasks and to improve energy efficiency, we propose a modified Particle Swarm Optimisation to reallocate migrated virtual machines from overloaded hosts. We also dynamically consolidate under-loaded hosts, which provides power savings. Simulation results in CloudSim demonstrate that, under conditions close to a real environment, our method saves as much as 14% more energy and significantly reduces the number of migrations and the simulation time compared with previous works.
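
    A minimal sketch of PSO-based VM re-placement, not the authors' exact formulation: particle positions are continuous per-VM host indices rounded to integers, and the fitness uses a hypothetical linear host power model; all parameters are assumptions.

        import random

        def fitness(assign, vm_cpu, host_cap, p_idle=100.0, p_peak=250.0):
            # Assumed power model: an active host draws p_idle plus a
            # utilisation-proportional share up to p_peak; overloads rejected.
            load = [0.0] * len(host_cap)
            for vm, h in enumerate(assign):
                load[h] += vm_cpu[vm]
            power = 0.0
            for h, cap in enumerate(host_cap):
                u = load[h] / cap
                if u > 1.0:
                    return float("inf")
                if u > 0.0:
                    power += p_idle + (p_peak - p_idle) * u
            return power

        def pso_place(vm_cpu, host_cap, n_particles=20, iters=200,
                      w=0.7, c1=1.4, c2=1.4):
            n_vm, n_host = len(vm_cpu), len(host_cap)
            def score(x):
                return fitness([int(round(v)) for v in x], vm_cpu, host_cap)
            X = [[random.uniform(0, n_host - 1) for _ in range(n_vm)]
                 for _ in range(n_particles)]
            V = [[0.0] * n_vm for _ in range(n_particles)]
            pbest = [x[:] for x in X]
            gbest = min(pbest, key=score)[:]
            for _ in range(iters):
                for p in range(n_particles):
                    for d in range(n_vm):
                        V[p][d] = (w * V[p][d]
                                   + c1 * random.random() * (pbest[p][d] - X[p][d])
                                   + c2 * random.random() * (gbest[d] - X[p][d]))
                        X[p][d] = min(n_host - 1, max(0.0, X[p][d] + V[p][d]))
                    if score(X[p]) < score(pbest[p]):
                        pbest[p] = X[p][:]
                        if score(X[p]) < score(gbest):
                            gbest = X[p][:]
            return [int(round(v)) for v in gbest]

        print(pso_place([0.3, 0.5, 0.2, 0.6], [1.0, 1.0, 1.0]))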

  15. Cogeneration Technology Alternatives Study (CTAS). Volume 6: Computer data. Part 1: Coal-fired nocogeneration process boiler, section A

    NASA Technical Reports Server (NTRS)

    Knightly, W. F.

    1980-01-01

    About fifty industrial processes from the largest energy-consuming sectors were used as a basis for matching a similar number of energy conversion systems considered to be candidates that could be made available in the 1985-to-2000 time period. The sectors considered included food, textiles, lumber, paper, chemicals, petroleum, glass, and primary metals. The energy conversion systems included steam and gas turbines, diesels, thermionics, Stirling engines, closed-cycle and steam-injected gas turbines, and fuel cells. The fuels considered were coal, both coal- and petroleum-based residual and distillate liquid fuels, and low-Btu gas obtained through on-site gasification of coal. Computer-generated reports of the fuel consumption and savings, capital costs, economics, and emissions of the cogeneration energy conversion systems (ECS's) heat- and power-matched to the individual industrial processes are presented. National fuel and emissions savings are also reported for each ECS assuming it alone is implemented. Two nocogeneration base cases are included: coal-fired and residual-fired process boilers.

  16. Cogeneration Technology Alternatives Study (CTAS). Volume 6: Computer data. Part 1: Coal-fired nocogeneration process boiler, section B

    NASA Technical Reports Server (NTRS)

    Knightly, W. F.

    1980-01-01

    About fifty industrial processes from the largest energy-consuming sectors were used as a basis for matching a similar number of energy conversion systems considered to be candidates that could be made available in the 1985-to-2000 time period. The sectors considered included food, textiles, lumber, paper, chemicals, petroleum, glass, and primary metals. The energy conversion systems included steam and gas turbines, diesels, thermionics, Stirling engines, closed-cycle and steam-injected gas turbines, and fuel cells. The fuels considered were coal, both coal- and petroleum-based residual and distillate liquid fuels, and low-Btu gas obtained through on-site gasification of coal. Computer-generated reports of the fuel consumption and savings, capital costs, economics, and emissions of the cogeneration energy conversion systems (ECS's) heat- and power-matched to the individual industrial processes are presented. National fuel and emissions savings are also reported for each ECS assuming it alone is implemented. Two nocogeneration base cases are included: coal-fired and residual-fired process boilers.

  17. Stereoscopic, Force-Feedback Trainer For Telerobot Operators

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Schenker, Paul S.; Bejczy, Antal K.

    1994-01-01

    Computer-controlled simulator for training technicians to operate remote robots provides both visual and kinesthetic virtual reality. Used during initial stage of training; saves time and expense, increases operational safety, and prevents damage to robots by inexperienced operators. Computes virtual contact forces and torques of compliant robot in real time, providing operator with feel of forces experienced by manipulator as well as view in any of three modes: single view, two split views, or stereoscopic view. From keyboard, user specifies force-reflection gain and stiffness of manipulator hand for three translational and three rotational axes. System offers two simulated telerobotic tasks: insertion of peg in hole in three dimensions, and removal and insertion of drawer.

  18. The automation of an inlet mass flow control system

    NASA Technical Reports Server (NTRS)

    Supplee, Frank; Tcheng, Ping; Weisenborn, Michael

    1989-01-01

    The automation of a closed-loop computer controlled system for the inlet mass flow system (IMFS) developed for a wind tunnel facility at Langley Research Center is presented. This new PC-based control system is intended to replace the manual control system presently in use in order to fully automate the plug positioning of the IMFS during wind tunnel testing. Provision is also made for communication between the PC and a host computer in order to allow total automation of the plug positioning and data acquisition during the complete sequence of predetermined plug locations. As extensive running time is programmed for the IMFS, this new automated system will save both manpower and tunnel running time.

  19. An image compression algorithm for a high-resolution digital still camera

    NASA Technical Reports Server (NTRS)

    Nerheim, Rosalee

    1989-01-01

    The Electronic Still Camera (ESC) project will provide for the capture and transmission of high-quality images without the use of film. The image quality will be superior to video and will approach the quality of 35mm film. The camera, which will have the same general shape and handling as a 35mm camera, will be able to send images to Earth in near real-time. Images will be stored in computer memory (RAM) in removable cartridges readable by a computer. To save storage space, the image will be compressed and reconstructed at the time of viewing. Both lossless and lossy image compression algorithms are studied, described, and compared.

  20. Efficient rejection-based simulation of biochemical reactions with stochastic noise and delays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thanh, Vo Hong, E-mail: vo@cosbi.eu; Priami, Corrado, E-mail: priami@cosbi.eu; Department of Mathematics, University of Trento

    2014-10-07

    We propose a new exact stochastic rejection-based simulation algorithm for biochemical reactions and extend it to systems with delays. Our algorithm accelerates the simulation by pre-computing reaction propensity bounds to select the next reaction to perform. Exploiting such bounds, we are able to avoid recomputing propensities every time a (delayed) reaction is initiated or finished, as is typically necessary in standard approaches. Propensity updates in our approach are still performed, but only infrequently and limited to a small number of reactions, saving computation time without sacrificing exactness. We evaluate the performance improvement of our algorithm by experimenting with concrete biological models.
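
    A compact sketch in the spirit of that rejection scheme (delays omitted; the toy model, rate constants, and 10% fluctuation interval are illustrative assumptions, and propensities are assumed monotone in the state so that evaluating them at the interval's upper corner gives an upper bound):

        import math, random

        def rssa(x, reactions, width=0.1, t_end=1.0):
            # reactions: list of (propensity_fn, state_change_dict)
            t = 0.0
            while t < t_end:
                lo = [max(0, int(v * (1 - width))) for v in x]
                # +1 keeps the interval non-degenerate at zero copies
                hi = [int(math.ceil(v * (1 + width))) + 1 for v in x]
                a_hi = [prop(hi) for prop, _ in reactions]  # propensity bounds
                a0_hi = sum(a_hi)
                if a0_hi == 0.0:
                    break
                # Inside the interval: select and test candidates without ever
                # recomputing the full propensity vector.
                while all(lo[i] <= x[i] <= hi[i] for i in range(len(x))):
                    t += random.expovariate(a0_hi)
                    if t >= t_end:
                        return x, t
                    r = random.random() * a0_hi
                    j = 0
                    while r > a_hi[j]:
                        r -= a_hi[j]
                        j += 1
                    # Rejection test: exact propensity evaluated lazily.
                    if random.random() * a_hi[j] <= reactions[j][0](x):
                        for i, d in reactions[j][1].items():
                            x[i] += d
            return x, t

        # Toy model: dimerisation A + A -> B, dissociation B -> A + A.
        k1, k2 = 1e-3, 1e-2   # assumed rate constants
        reactions = [
            (lambda s: k1 * s[0] * (s[0] - 1) / 2.0, {0: -2, 1: +1}),
            (lambda s: k2 * s[1],                    {1: -1, 0: +2}),
        ]
        print(rssa([1000, 0], reactions, t_end=0.5))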

  1. Energy Use and Power Levels in New Monitors and Personal Computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberson, Judy A.; Homan, Gregory K.; Mahajan, Akshay

    2002-07-23

    Our research was conducted in support of the EPA ENERGY STAR Office Equipment program, whose goal is to reduce the amount of electricity consumed by office equipment in the U.S. The most energy-efficient models in each office equipment category are eligible for the ENERGY STAR label, which consumers can use to identify and select efficient products. As the efficiency of each category improves over time, the ENERGY STAR criteria need to be revised accordingly. The purpose of this study was to provide reliable data on the energy consumption of the newest personal computers and monitors that the EPA can use to evaluate revisions to current ENERGY STAR criteria as well as to improve the accuracy of ENERGY STAR program savings estimates. We report the results of measuring the power consumption and power management capabilities of a sample of new monitors and computers. These results will be used to improve estimates of program energy savings and carbon emission reductions, and to inform revisions of the ENERGY STAR criteria for these products. Our sample consists of 35 monitors and 26 computers manufactured between July 2000 and October 2001; it includes cathode ray tube (CRT) and liquid crystal display (LCD) monitors, Macintosh and Intel-architecture computers, desktop and laptop computers, and integrated computer systems, in which power consumption of the computer and monitor cannot be measured separately. For each machine we measured power consumption when off, on, and in each low-power level. We identify trends in and opportunities to reduce power consumption in new personal computers and monitors. Our results include a trend among monitor manufacturers to provide a single very low low-power level, well below the current ENERGY STAR criteria for sleep power consumption. These very low sleep power results mean that energy consumed when monitors are off or in active use has become more important in terms of contribution to the overall unit energy consumption (UEC). Current ENERGY STAR monitor and computer criteria do not specify off or on power, but our results suggest opportunities for saving energy in these modes. Also, significant differences between CRT and LCD technology, and between field-measured and manufacturer-reported power levels reveal the need for standard methods and metrics for measuring and comparing monitor power consumption.

  2. Fluid Structure Interaction in a Turbine Blade

    NASA Technical Reports Server (NTRS)

    Gorla, Rama S. R.

    2004-01-01

    An unsteady, three dimensional Navier-Stokes solution in rotating frame formulation for turbomachinery applications is presented. Casting the governing equations in a rotating frame enabled the freezing of grid motion and resulted in substantial savings in computer time. The turbine blade was computationally simulated and probabilistically evaluated in view of several uncertainties in the aerodynamic, structural, material and thermal variables that govern the turbine blade. The interconnection between the computational fluid dynamics code and finite element structural analysis code was necessary to couple the thermal profiles with the structural design. The stresses and their variations were evaluated at critical points on the Turbine blade. Cumulative distribution functions and sensitivity factors were computed for stress responses due to aerodynamic, geometric, mechanical and thermal random variables.

  3. An O(N squared) method for computing the eigensystem of N by N symmetric tridiagonal matrices by the divide and conquer approach

    NASA Technical Reports Server (NTRS)

    Gill, Doron; Tadmor, Eitan

    1988-01-01

    An efficient method is proposed to solve the eigenproblem of N by N Symmetric Tridiagonal (ST) matrices. Unlike the standard eigensolvers which necessitate O(N cubed) operations to compute the eigenvectors of such ST matrices, the proposed method computes both the eigenvalues and eigenvectors with only O(N squared) operations. The method is based on serial implementation of the recently introduced Divide and Conquer (DC) algorithm. It exploits the fact that by O(N squared) of DC operations, one can compute the eigenvalues of N by N ST matrix and a finite number of pairs of successive rows of its eigenvector matrix. The rest of the eigenvectors--all of them or one at a time--are computed by linear three-term recurrence relations. Numerical examples are presented which demonstrate the superiority of the proposed method by saving an order of magnitude in execution time at the expense of sacrificing a few orders of accuracy.
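
    The "linear three-term recurrence" can be made concrete. For a symmetric tridiagonal matrix T with diagonal a and off-diagonal b, row i of Tv = \lambda v gives v_{i+1} = ((\lambda - a_i) v_i - b_{i-1} v_{i-1}) / b_i, so each eigenvector grows from its first component in O(N) work. A sketch (notation assumed; forward recurrences like this are cheap but can lose accuracy, consistent with the trade-off the abstract reports):

        import numpy as np

        def st_eigenvector(a, b, lam):
            # a: diagonal (length n), b: off-diagonal (length n-1), lam: eigenvalue
            n = len(a)
            v = np.zeros(n)
            v[0] = 1.0
            if n > 1:
                v[1] = (lam - a[0]) * v[0] / b[0]
            for i in range(1, n - 1):
                v[i + 1] = ((lam - a[i]) * v[i] - b[i - 1] * v[i - 1]) / b[i]
            return v / np.linalg.norm(v)

        # Check on a small example against a dense eigensolver.
        a = np.array([2.0, 2.0, 2.0, 2.0]); b = np.array([1.0, 1.0, 1.0])
        T = np.diag(a) + np.diag(b, 1) + np.diag(b, -1)
        lam = np.linalg.eigvalsh(T)[0]
        v = st_eigenvector(a, b, lam)
        print(np.allclose(T @ v, lam * v))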

  4. Reprocessing Multiyear GPS Data from Continuously Operating Reference Stations on Cloud Computing Platform

    NASA Astrophysics Data System (ADS)

    Yoon, S.

    2016-12-01

    To define a geodetic reference frame using GPS data collected by the Continuously Operating Reference Stations (CORS) network, historical GPS data need to be reprocessed regularly. Reprocessing the GPS data collected by up to 2,000 CORS sites over the last two decades requires a great deal of computational resources. At the National Geodetic Survey (NGS), one reprocessing was completed in 2011, and a second reprocessing is currently under way. The first reprocessing effort used in-house computing resources; the current second effort uses an outsourced cloud computing platform. In this presentation, the data processing strategy at NGS is outlined, as well as the effort to parallelize the data processing procedure in order to maximize the benefit of cloud computing. The time and cost savings realized by the cloud computing approach will also be discussed.

  5. Usefulness of a Regional Health Care Information System in primary care: a case study.

    PubMed

    Maass, Marianne C; Asikainen, Paula; Mäenpää, Tiina; Wanne, Olli; Suominen, Tarja

    2008-08-01

    The goal of this paper is to describe some benefits and possible cost consequences of computer based access to specialised health care information. A before-after activity analysis regarding 20 diabetic patients' clinical appointments was performed in a Health Centre in Satakunta region in Finland. Cost data, an interview, time-and-motion studies, and flow charts based on modelling were applied. Access to up-to-date diagnostic information reduced redundant clinical re-appointments, repeated tests, and mail orders for missing data. Timely access to diagnostic information brought about several benefits regarding workflow, patient care, and disease management. These benefits resulted in theoretical net cost savings. The study results indicated that Regional Information Systems may be useful tools to support performance and improve efficiency. However, further studies are required in order to verify how the monetary savings would impact the performance of Health Care Units.

  6. File Cryptography with AES and RSA for Mobile Based on Android

    NASA Astrophysics Data System (ADS)

    laia, Yonata; Nababan, Marlince; Sihombing, Oloan; Aisyah, Siti; Sitanggang, Delima; Parsaoran, Saut; Zendato, Niskarto

    2018-04-01

    The number of users of Android-based mobile devices is increasing, and a mobile device is now nearly the equal of a computer; one common use is storing important personal data. Saving such data on a mobile device is risky because the device becomes a target for hackers. That is why the researchers added cryptography combining the Advanced Encryption Standard (AES) with the Rivest-Shamir-Adleman (RSA) algorithm. The combination of the two methods can encrypt data on a mobile device, with encryption times that vary with file size: 25.44 KB in 4 s, 200 KB in 5 s, 600 KB in 7 s, and 2.29 MB in 10 s. Decryption took 2 s for 25.44 KB, 1.5 s for 200 KB, 2.5 s for 600 KB, and 2.7 s for 2.29 MB.
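
    For readers who want the hybrid scheme in runnable form, here is a sketch in Python using the `cryptography` package; the paper's Android implementation differs in platform and API, but the idea is the same: a fresh AES key encrypts the file, and RSA encrypts only that small key.

        import os
        from cryptography.hazmat.primitives.asymmetric import rsa, padding
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM

        priv = rsa.generate_private_key(public_exponent=65537, key_size=2048)
        pub = priv.public_key()
        oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                            algorithm=hashes.SHA256(), label=None)

        def encrypt_file(data: bytes):
            key = AESGCM.generate_key(bit_length=256)   # fresh AES key per file
            nonce = os.urandom(12)
            ct = AESGCM(key).encrypt(nonce, data, None)
            return pub.encrypt(key, oaep), nonce, ct    # RSA wraps only the key

        def decrypt_file(wrapped_key, nonce, ct):
            key = priv.decrypt(wrapped_key, oaep)
            return AESGCM(key).decrypt(nonce, ct, None)

        blob = encrypt_file(b"individual important data")
        assert decrypt_file(*blob) == b"individual important data"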

  7. Optimal subhourly electricity resource dispatch under multiple price signals with high renewable generation availability

    DOE PAGES

    Chassin, David P.; Behboodi, Sahand; Djilali, Ned

    2018-01-28

    This article proposes a system-wide optimal resource dispatch strategy that enables a shift from a primarily energy cost-based approach, to a strategy using simultaneous price signals for energy, power and ramping behavior. A formal method to compute the optimal sub-hourly power trajectory is derived for a system when the price of energy and ramping are both significant. Optimal control functions are obtained in both time and frequency domains, and a discrete-time solution suitable for periodic feedback control systems is presented. The method is applied to the North America Western Interconnection for the planning year 2024, and it is shown that an optimal dispatch strategy that simultaneously considers both the cost of energy and the cost of ramping leads to significant cost savings in systems with high levels of renewable generation: the savings exceed 25% of the total system operating cost for a 50% renewables scenario.
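
    One schematic way to write the objective of such a dispatch (assumed notation, not the paper's exact functional) prices both energy and ramping over the scheduling horizon:

        \min_{P(\cdot)} \; J[P] = \int_0^T \left[ \pi_E(t)\, P(t) + \pi_R(t)\, \dot P(t)^2 \right] dt ,

    subject to serving the net load; when \pi_R is significant, the optimal trajectory smooths P(t) instead of chasing the cheapest instantaneous energy, which is the behavior that pays off at high renewable penetration.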

  8. Method and System For an Automated Tool for En Route Traffic Controllers

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz (Inventor); McNally, B. David (Inventor)

    2001-01-01

    A method and system for a new automation tool for en route air traffic controllers first finds all aircraft flying on inefficient routes, then determines whether it is possible to save time by bypassing some route segments, and finally whether the improved route is free of conflicts with other aircraft. The method displays all direct-to eligible aircraft to an air traffic controller in a list sorted by highest time savings. By allowing the air traffic controller to easily identify and work with the highest pay-off aircraft, the method of the present invention contributes to a significant increase in both air traffic controller and aircraft productivity. A graphical computer interface (GUI) is used to enable the air traffic controller to send the aircraft direct to a waypoint or fix closer to the destination airport by a simple point and click action.
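
    The time-savings test at the heart of the tool reduces to comparing the remaining flight-plan legs against a direct leg to a downstream fix. A back-of-the-envelope sketch (great-circle distances; the waypoints and ground speed are made up, and the real tool additionally checks the improved route for conflicts):

        import math

        R_NM = 3440.065  # mean Earth radius in nautical miles

        def gc_nm(p, q):
            # Great-circle distance via the spherical law of cosines.
            (la1, lo1), (la2, lo2) = [(math.radians(a), math.radians(b)) for a, b in (p, q)]
            return R_NM * math.acos(min(1.0, math.sin(la1) * math.sin(la2) +
                                        math.cos(la1) * math.cos(la2) * math.cos(lo2 - lo1)))

        def time_saved_min(route, gs_kt=450.0):
            via = sum(gc_nm(route[i], route[i + 1]) for i in range(len(route) - 1))
            direct = gc_nm(route[0], route[-1])
            return (via - direct) / gs_kt * 60.0

        # Aircraft position, two intermediate fixes, downstream fix (lat, lon):
        route = [(39.0, -104.0), (39.8, -101.5), (38.9, -99.0), (38.5, -95.0)]
        print(f"time saved going direct: {time_saved_min(route):.1f} min")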

  9. Method and system for an automated tool for en route traffic controllers

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz (Inventor); McNally, B. David (Inventor)

    2001-01-01

    A method and system for a new automation tool for en route air traffic controllers first finds all aircraft flying on inefficient routes, then determines whether it is possible to save time by bypassing some route segments, and finally whether the improved route is free of conflicts with other aircraft. The method displays all direct-to eligible aircraft to an air traffic controller in a list sorted by highest time savings. By allowing the air traffic controller to easily identify and work with the highest pay-off aircraft, the method of the present invention contributes to a significant increase in both air traffic controller and aircraft productivity. A graphical computer interface (GUI) is used to enable the air traffic controller to send the aircraft direct to a waypoint or fix closer to the destination airport by a simple point and click action.

  10. Optimal subhourly electricity resource dispatch under multiple price signals with high renewable generation availability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chassin, David P.; Behboodi, Sahand; Djilali, Ned

    This article proposes a system-wide optimal resource dispatch strategy that enables a shift from a primarily energy cost-based approach, to a strategy using simultaneous price signals for energy, power and ramping behavior. A formal method to compute the optimal sub-hourly power trajectory is derived for a system when the price of energy and ramping are both significant. Optimal control functions are obtained in both time and frequency domains, and a discrete-time solution suitable for periodic feedback control systems is presented. The method is applied to the North America Western Interconnection for the planning year 2024, and it is shown that an optimal dispatch strategy that simultaneously considers both the cost of energy and the cost of ramping leads to significant cost savings in systems with high levels of renewable generation: the savings exceed 25% of the total system operating cost for a 50% renewables scenario.

  11. Residential Capabilities | Buildings | NREL

    Science.gov Websites

    components, develop whole-house strategies, and predict performance at various levels of whole-house energy savings.

  12. An efficient method for computing unsteady transonic aerodynamics of swept wings with control surfaces

    NASA Technical Reports Server (NTRS)

    Liu, D. D.; Kao, Y. F.; Fung, K. Y.

    1989-01-01

    A transonic equivalent strip (TES) method was further developed for unsteady flow computations of arbitrary wing planforms. The TES method consists of two consecutive correction steps applied to a given nonlinear code such as LTRAN2: a chordwise mean flow correction and a spanwise phase correction. The computational procedure requires direct pressure input from other computed or measured data; otherwise, it requires neither the airfoil shape nor grid generation for given planforms. To validate the computed results, four swept wings of various aspect ratios, including some with control surfaces, were selected as computational examples. Overall trends in unsteady pressures are established by comparison with those obtained by the XTRAN3S codes, Isogai's full potential code, and measured data from NLR and RAE. In comparison with these methods, the TES achieves considerable savings in computer time with reasonable accuracy, which suggests immediate industrial applications.

  13. Use of cloud computing in biomedicine.

    PubMed

    Sobeslav, Vladimir; Maresova, Petra; Krejcar, Ondrej; Franca, Tanos C C; Kuca, Kamil

    2016-12-01

    Nowadays, biomedicine is characterised by a growing need for processing of large amounts of data in real time. This leads to new requirements for information and communication technologies (ICT). Cloud computing offers a solution to these requirements and provides many advantages, such as cost savings, elasticity and scalability of using ICT. The aim of this paper is to explore the concept of cloud computing and the related use of this concept in the area of biomedicine. Authors offer a comprehensive analysis of the implementation of the cloud computing approach in biomedical research, decomposed into infrastructure, platform and service layer, and a recommendation for processing large amounts of data in biomedicine. Firstly, the paper describes the appropriate forms and technological solutions of cloud computing. Secondly, the high-end computing paradigm of cloud computing aspects is analysed. Finally, the potential and current use of applications in scientific research of this technology in biomedicine is discussed.

  14. Motmot, an open-source toolkit for realtime video acquisition and analysis.

    PubMed

    Straw, Andrew D; Dickinson, Michael H

    2009-07-22

    Video cameras sense passively from a distance, offer a rich information stream, and provide intuitively meaningful raw data. Camera-based imaging has thus proven critical for many advances in neuroscience and biology, with applications ranging from cellular imaging of fluorescent dyes to tracking of whole-animal behavior at ecologically relevant spatial scales. Here we present 'Motmot': an open-source software suite for acquiring, displaying, saving, and analyzing digital video in real-time. At the highest level, Motmot is written in the Python computer language. The large amounts of data produced by digital cameras are handled by low-level, optimized functions, usually written in C. This high-level/low-level partitioning and use of select external libraries allow Motmot, with only modest complexity, to perform well as a core technology for many high-performance imaging tasks. In its current form, Motmot allows for: (1) image acquisition from a variety of camera interfaces (package motmot.cam_iface), (2) the display of these images with minimal latency and computer resources using wxPython and OpenGL (package motmot.wxglvideo), (3) saving images with no compression in a single-pass, low-CPU-use format (package motmot.FlyMovieFormat), (4) a pluggable framework for custom analysis of images in realtime and (5) firmware for an inexpensive USB device to synchronize image acquisition across multiple cameras, with analog input, or with other hardware devices (package motmot.fview_ext_trig). These capabilities are brought together in a graphical user interface, called 'FView', allowing an end user to easily view and save digital video without writing any code. One plugin for FView, 'FlyTrax', which tracks the movement of fruit flies in real-time, is included with Motmot, and is described to illustrate the capabilities of FView. Motmot enables realtime image processing and display using the Python computer language. In addition to the provided complete applications, the architecture allows the user to write relatively simple plugins, which can accomplish a variety of computer vision tasks and be integrated within larger software systems. The software is available at http://code.astraw.com/projects/motmot.

  15. Improving waveform inversion using modified interferometric imaging condition

    NASA Astrophysics Data System (ADS)

    Guo, Xuebao; Liu, Hong; Shi, Ying; Wang, Weihong; Zhang, Zhen

    2017-12-01

    Similar to reverse-time migration, full waveform inversion in the time domain is a memory-intensive processing method. The computational storage size for waveform inversion mainly depends on the model size and the time recording length. In general, 3D and 4D data volumes need to be saved for 2D and 3D waveform inversion gradient calculations, respectively. Even the boundary-region wavefield-saving strategy creates a huge storage demand. Using the last two slices of the wavefield to reconstruct wavefields at other moments through the random boundary avoids the need to store a large number of wavefields; however, the traditional random boundary method is less effective at low frequencies. In this study, we follow a new random boundary designed to regenerate random velocity anomalies in the boundary region for each shot of each iteration. The results obtained using the random boundary condition in less-illuminated areas are more seriously affected by random scattering than other areas due to the lack of coverage. In this paper, we have replaced direct correlation for computing the waveform inversion gradient with modified interferometric imaging, which enhances the continuity of the imaging path and reduces noise interference. The new imaging condition is a weighted average of extended imaging gathers and can be used directly in the gradient computation. In this process, we have not changed the objective function, and the role of the imaging condition is similar to regularization. The window size for the modified interferometric imaging condition-based waveform inversion plays an important role in this process. The numerical examples show that the proposed method significantly enhances waveform inversion performance.

  16. Improving waveform inversion using modified interferometric imaging condition

    NASA Astrophysics Data System (ADS)

    Guo, Xuebao; Liu, Hong; Shi, Ying; Wang, Weihong; Zhang, Zhen

    2018-02-01

    Similar to reverse-time migration, full waveform inversion in the time domain is a memory-intensive processing method. The computational storage size for waveform inversion mainly depends on the model size and the time recording length. In general, 3D and 4D data volumes need to be saved for 2D and 3D waveform inversion gradient calculations, respectively. Even the boundary-region wavefield-saving strategy creates a huge storage demand. Using the last two slices of the wavefield to reconstruct wavefields at other moments through the random boundary avoids the need to store a large number of wavefields; however, the traditional random boundary method is less effective at low frequencies. In this study, we follow a new random boundary designed to regenerate random velocity anomalies in the boundary region for each shot of each iteration. The results obtained using the random boundary condition in less-illuminated areas are more seriously affected by random scattering than other areas due to the lack of coverage. In this paper, we have replaced direct correlation for computing the waveform inversion gradient with modified interferometric imaging, which enhances the continuity of the imaging path and reduces noise interference. The new imaging condition is a weighted average of extended imaging gathers and can be used directly in the gradient computation. In this process, we have not changed the objective function, and the role of the imaging condition is similar to regularization. The window size for the modified interferometric imaging condition-based waveform inversion plays an important role in this process. The numerical examples show that the proposed method significantly enhances waveform inversion performance.

  17. A Computationally Efficient Visual Saliency Algorithm Suitable for an Analog CMOS Implementation.

    PubMed

    D'Angelo, Robert; Wood, Richard; Lowry, Nathan; Freifeld, Geremy; Huang, Haiyao; Salthouse, Christopher D; Hollosi, Brent; Muresan, Matthew; Uy, Wes; Tran, Nhut; Chery, Armand; Poppe, Dorothy C; Sonkusale, Sameer

    2018-06-27

    Computer vision algorithms are often limited in their application by the large amount of data that must be processed. Mammalian vision systems mitigate this high bandwidth requirement by prioritizing certain regions of the visual field with neural circuits that select the most salient regions. This work introduces a novel and computationally efficient visual saliency algorithm for performing this neuromorphic attention-based data reduction. The proposed algorithm has the added advantage that it is compatible with an analog CMOS design while still achieving comparable performance to existing state-of-the-art saliency algorithms. This compatibility allows for direct integration with the analog-to-digital conversion circuitry present in CMOS image sensors. This integration leads to power savings in the converter by quantizing only the salient pixels. Further system-level power savings are gained by reducing the amount of data that must be transmitted and processed in the digital domain. The analog CMOS compatible formulation relies on a pulse width (i.e., time mode) encoding of the pixel data that is compatible with pulse-mode imagers and slope based converters often used in imager designs. This letter begins by discussing this time-mode encoding for implementing neuromorphic architectures. Next, the proposed algorithm is derived. Hardware-oriented optimizations and modifications to this algorithm are proposed and discussed. Next, a metric for quantifying saliency accuracy is proposed, and simulation results of this metric are presented. Finally, an analog synthesis approach for a time-mode architecture is outlined, and postsynthesis transistor-level simulations that demonstrate functionality of an implementation in a modern CMOS process are discussed.

  18. Consequences of Common Topological Rearrangements for Partition Trees in Phylogenomic Inference

    PubMed Central

    Minh, Bui Quang; von Haeseler, Arndt

    2015-01-01

    Abstract In phylogenomic analysis the collection of trees with identical score (maximum likelihood or parsimony score) may hamper tree search algorithms. Such collections are coined phylogenetic terraces. For sparse supermatrices with a lot of missing data, the number of terraces and the number of trees on the terraces can be very large. If terraces are not taken into account, a lot of computation time might be unnecessarily spent to evaluate many trees that in fact have identical score. To save computation time during the tree search, it is worthwhile to quickly identify such cases. The score of a species tree is the sum of scores for all the so-called induced partition trees. Therefore, if the topological rearrangement applied to a species tree does not change the induced partition trees, the score of these partition trees is unchanged. Here, we provide the conditions under which the three most widely used topological rearrangements (nearest neighbor interchange, subtree pruning and regrafting, and tree bisection and reconnection) change the topologies of induced partition trees. During the tree search, these conditions allow us to quickly identify whether we can save computation time on the evaluation of newly encountered trees. We also introduce the concept of partial terraces and demonstrate that they occur more frequently than the original “full” terrace. Hence, partial terrace is the more important factor of timesaving compared to full terrace. Therefore, taking into account the above conditions and the partial terrace concept will help to speed up the tree search in phylogenomic inference. PMID:26448206

  19. A weight based genetic algorithm for selecting views

    NASA Astrophysics Data System (ADS)

    Talebian, Seyed H.; Kareem, Sameem A.

    2013-03-01

    Data warehousing is a technology designed to support decision making. A data warehouse is built by extracting large amounts of data from different operational systems, transforming it into a consistent form, and loading it into a central repository. The types of queries in a data warehouse environment differ from those in operational systems: in contrast to operational systems, the analytical queries issued in data warehouses involve summarization of large volumes of data and therefore under normal circumstances take a long time to answer. On the other hand, these queries must be answered quickly to enable managers to make decisions in as short a time as possible. As a result, an essential need in this environment is improving query performance. One of the most popular methods for this is utilizing pre-computed query results: whenever a new query is submitted by the user, instead of computing it on the fly over a large underlying database, the pre-computed results, or views, are used to answer it. Although the ideal option would be to pre-compute and save all possible views, in practice this is not feasible due to the disk-space constraint and the overhead of view updates. Therefore, we need to select a subset of possible views to save on disk. The problem of selecting the right subset of views is considered an important challenge in data warehousing. In this paper we suggest a Weight Based Genetic Algorithm (WBGA) for solving the view selection problem with two objectives.
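
    A skeleton of the genetic-algorithm approach (illustrative, not the paper's WBGA; the two objectives are collapsed here into a single benefit score under a disk-space budget, and all parameters are assumptions):

        import random

        def fitness(bits, benefit, size, budget):
            used = sum(s for b, s in zip(bits, size) if b)
            if used > budget:
                return 0.0                      # infeasible: over the disk budget
            return sum(w for b, w in zip(bits, benefit) if b)

        def ga_select(benefit, size, budget, pop_n=30, gens=200, p_mut=0.02):
            n = len(benefit)
            pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_n)]
            for _ in range(gens):
                pop.sort(key=lambda b: -fitness(b, benefit, size, budget))
                nxt = pop[:2]                   # elitism: keep the two best
                while len(nxt) < pop_n:
                    a, b = random.sample(pop[:10], 2)   # parents from the elite
                    cut = random.randrange(1, n)        # one-point crossover
                    child = a[:cut] + b[cut:]
                    nxt.append([g ^ (random.random() < p_mut) for g in child])
                pop = nxt
            return max(pop, key=lambda b: fitness(b, benefit, size, budget))

        # Bitstring marks which candidate views to materialise.
        print(ga_select(benefit=[40, 25, 18, 12, 9],
                        size=[50, 20, 15, 12, 8], budget=60))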

  20. Up Close and Personal: The Value of Feedback in Implementing an Individual Energy-Saving Adaptation

    ERIC Educational Resources Information Center

    Pollard, Carol Elaine

    2016-01-01

    Purpose: The purpose of this research is to explore the drivers of computer-related sustainability behavior at a medium-sized US university and the extent to which an inexpensive energy-saving device installed on 146 administrator, faculty and general staff workstations achieved significant savings in kWh, CO₂ kg and dollars.…

  1. 12 CFR 502.27 - How does OTS determine the risk/complexity component for a savings and loan holding company?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false How does OTS determine the risk/complexity...-Calculation of Assessments § 502.27 How does OTS determine the risk/complexity component for a savings and loan holding company? (a) OTS computes the risk/complexity component for responsible savings and loan...

  2. 31 CFR 359.30 - Are definitive Series I savings bonds purchased in the name of an individual computed separately...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false Are definitive Series I savings bonds....30 Section 359.30 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FISCAL SERVICE, DEPARTMENT OF THE TREASURY BUREAU OF THE PUBLIC DEBT OFFERING OF UNITED STATES SAVINGS...

  3. Modeling and design of a high efficiency hybrid heat pump clothes dryer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    TeGrotenhuis, Ward; Butterfield, Andrew; Caldwell, Dustin

    Computational modeling is used to design a hybrid heat pump clothes dryer capable of saving 50% of the energy used by residential clothes dryers with comparable drying times. The model represents the various stages of a drying cycle from warm-up through constant-drying-rate and falling-drying-rate phases, finishing with a cooldown phase. The model is fit to data acquired from a U.S. commercial standard vented electric dryer; when a hybrid heat pump system is added, the energy factor increases from 3.0 lbs/kWh to 5.7-6.0 lbs/kWh, depending on the increase in blower motor power. The hybrid heat pump system is designed from off-the-shelf components and includes a recuperative heat exchanger, an electric element, and an R-134a vapor compression heat pump. Parametric studies of element power and heating element use show a trade-off between energy savings and cycle time. Results show a step-change in energy savings over heat pump dryers currently marketed in the U.S., based on performance reported by Energy Star from standardized DOE testing.

  4. Manufacturing Methods and Technology Project Summary Reports

    DTIC Science & Technology

    1986-07-01

    Yuma Proving Ground in January 1985. The ARBAT system provides a unique real-time computer capability to identify all critical flight...cheaper than the existing radar system. This prototype is expected to save over $1 million per year at Yuma Proving Grounds. TECOM is planning to...purchase 4 production ballistic radar systems to be installed at Yuma Proving Grounds, Dugway Proving Grounds, and Jefferson Proving Grounds at a

  5. Improved Planning and Programming for Energy Efficient New Army Facilities

    DTIC Science & Technology

    1988-10-01

    setpoints to occupant comfort must be considered carefully. Cutting off the HVAC system to the bedrooms during the day produced only small savings...functions of a building and minimizing the energy usage through optimization. It includes thermostats, time switches, programmable controllers...microprocessor systems, computers, and sensing devices that are linked with control and power components to manage energy use. This system optimizes load

  6. Program Helps Decompose Complex Design Systems

    NASA Technical Reports Server (NTRS)

    Rogers, James L., Jr.; Hall, Laura E.

    1994-01-01

    DeMAID (A Design Manager's Aid for Intelligent Decomposition) computer program is knowledge-based software system for ordering sequence of modules and identifying possible multilevel structure for design problem. Groups modular subsystems on basis of interactions among them. Saves considerable money and time in total design process, particularly in new design problem in which order of modules has not been defined. Available in two machine versions: Macintosh and Sun.

  7. Ambient Assisted Living spaces validation by services and devices simulation.

    PubMed

    Fernández-Llatas, Carlos; Mocholí, Juan Bautista; Sala, Pilar; Naranjo, Juan Carlos; Pileggi, Salvatore F; Guillén, Sergio; Traver, Vicente

    2011-01-01

    The design of Ambient Assisted Living (AAL) products is a very demanding challenge. AAL product creation is a complex iterative process which must satisfy exhaustive prerequisites for accessibility and usability. In this process the early detection of errors is crucial to creating cost-effective systems. Computer-assisted tools can provide vital help to usability designers in avoiding design errors. Specifically, computer simulation of products in AAL environments can be used in all the design phases to support validation. In this paper, a computer simulation tool for supporting usability designers in the creation of innovative AAL products is presented. This application will benefit their work, saving time and improving the final system's functionality.

  8. Computer Graphics-aided systems analysis: application to well completion design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Detamore, J.E.; Sarma, M.P.

    1985-03-01

    The development of an engineering tool (in the form of a computer model) for solving design and analysis problems related with oil and gas well production operations is discussed. The development of the method is based on integrating the concepts of ''Systems Analysis'' with the techniques of ''Computer Graphics''. The concepts behind the method are very general in nature. This paper, however, illustrates the application of the method in solving gas well completion design problems. The use of the method will save time and improve the efficiency of such design and analysis problems. The method can be extended to other design and analysis aspects of oil and gas wells.

  9. A comparison of time-shared vs. batch development of space software

    NASA Technical Reports Server (NTRS)

    Forthofer, M.

    1977-01-01

    In connection with a study regarding the ground support software development for the Space Shuttle, an investigation was conducted concerning the most suitable software development techniques to be employed. A time-sharing 'trial period' was used to determine whether or not time-sharing would be a cost-effective software development technique for the Ground Based Shuttle system. It was found that time-sharing substantially improved job turnaround and programmer access to the computer for the representative group of ground support programmers. Moreover, this improvement resulted in an estimated saving of over fifty programmer days during the trial period.

  10. A Novel Sensor Platform Matching the Improved Version of IPMVP Option C for Measuring Energy Savings

    PubMed Central

    Tseng, Yen-Chieh; Lee, Da-Sheng; Lin, Cheng-Fang; Chang, Ching-Yuan

    2013-01-01

    It is easy to measure energy consumption with a power meter. However, energy savings cannot be computed directly from the power measured using existing power meter technologies, since the power consumption reflects only part of the real energy flows. The International Performance Measurement and Verification Protocol (IPMVP) was proposed by the Efficiency Valuation Organization (EVO) to quantify energy savings using four different methodologies, A, B, C and D. Although energy savings can be estimated following the IPMVP, there are limitations on its practical implementation. Moreover, the data processing methods of the four IPMVP alternatives use multiple sensors (thermometer, hygrometer, occupant information) and power meter readings to simulate all facilities, in order to determine an energy usage benchmark and the energy savings. This study proposes a simple sensor platform to measure energy savings. Using the Electronic Product Code (EPC) global standard, an architecture framework for an information system is constructed that integrates sensor data, power meter readings and occupancy conditions. The proposed sensor platform is used to monitor a building with a newly built vertical garden system (VGS). A VGS shields solar radiation and saves energy that would otherwise be expended on air-conditioning. With this platform, the amount of energy saved in the whole facility is measured and reported in real time. The data are compared with those obtained from detailed measurement and verification (M&V) processes; the discrepancy is less than 1.565%. Using measurements from the proposed sensor platform, the energy savings for the entire facility are quantified with a resolution of ±1.2%. The VGS gives an 8.483% daily electricity saving for the building. Thus, the results show that the simple sensor platform proposed by this study is more widely applicable than the four complicated IPMVP alternatives, and the VGS is an effective tool in reducing the carbon footprint of a building. PMID:23698273

  11. The influence of multiple goals on driving behavior: the case of safety, time saving, and fuel saving.

    PubMed

    Dogan, Ebru; Steg, Linda; Delhomme, Patricia

    2011-09-01

    Due to the innate complexity of the task drivers have to manage multiple goals while driving and the importance of certain goals may vary over time leading to priority being given to different goals depending on the circumstances. This study aimed to investigate drivers' behavioral regulation while managing multiple goals during driving. To do so participants drove on urban and rural roads in a driving simulator while trying to manage fuel saving and time saving goals, besides the safety goals that are always present during driving. A between-subjects design was used with one group of drivers managing two goals (safety and fuel saving) and another group managing three goals (safety, fuel saving, and time saving) while driving. Participants were provided continuous feedback on the fuel saving goal via a meter on the dashboard. The results indicate that even when a fuel saving or time saving goal is salient, safety goals are still given highest priority when interactions with other road users take place and when interacting with a traffic light. Additionally, performance on the fuel saving goal diminished for the group that had to manage fuel saving and time saving together. The theoretical implications for a goal hierarchy in driving tasks and practical implications for eco-driving are discussed. Copyright © 2011 Elsevier Ltd. All rights reserved.

  12. Novel techniques for data decomposition and load balancing for parallel processing of vision systems: Implementation and evaluation using a motion estimation system

    NASA Technical Reports Server (NTRS)

    Choudhary, Alok Nidhi; Leung, Mun K.; Huang, Thomas S.; Patel, Janak H.

    1989-01-01

    Computer vision systems employ a sequence of vision algorithms in which the output of an algorithm is the input of the next algorithm in the sequence. Algorithms that constitute such systems exhibit vastly different computational characteristics, and therefore, require different data decomposition techniques and efficient load balancing techniques for parallel implementation. However, since the input data for a task is produced as the output data of the previous task, this information can be exploited to perform knowledge based data decomposition and load balancing. Presented here are algorithms for a motion estimation system. The motion estimation is based on the point correspondence between the involved images which are a sequence of stereo image pairs. Researchers propose algorithms to obtain point correspondences by matching feature points among stereo image pairs at any two consecutive time instants. Furthermore, the proposed algorithms employ non-iterative procedures, which results in saving considerable amounts of computation time. The system consists of the following steps: (1) extraction of features; (2) stereo match of images in one time instant; (3) time match of images from consecutive time instants; (4) stereo match to compute final unambiguous points; and (5) computation of motion parameters.

  13. An efficient two-stage approach for image-based FSI analysis of atherosclerotic arteries

    PubMed Central

    Rayz, Vitaliy L.; Mofrad, Mohammad R. K.; Saloner, David

    2010-01-01

    Patient-specific biomechanical modeling of atherosclerotic arteries has the potential to aid clinicians in characterizing lesions and determining optimal treatment plans. To attain high levels of accuracy, recent models use medical imaging data to determine plaque component boundaries in three dimensions, and fluid–structure interaction is used to capture mechanical loading of the diseased vessel. As the plaque components and vessel wall are often highly complex in shape, constructing a suitable structured computational mesh is very challenging and can require a great deal of time. Models based on unstructured computational meshes require relatively less time to construct and are capable of accurately representing plaque components in three dimensions. These models unfortunately require additional computational resources and computing time for accurate and meaningful results. A two-stage modeling strategy based on unstructured computational meshes is proposed to achieve a reasonable balance between meshing difficulty and computational resource and time demands. In this method, a coarse-grained simulation of the full arterial domain is used to guide and constrain a fine-scale simulation of a smaller region of interest within the full domain. Results for a patient-specific carotid bifurcation model demonstrate that the two-stage approach can afford large savings in both the time for mesh generation and the time and resources needed for computation. The effects of solid and fluid domain truncation were explored and were shown to minimally affect the accuracy of the stress fields predicted with the two-stage approach. PMID:19756798

  14. Impact of Extended Daylight Saving Time on National Energy Consumption Report to Congress

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belzer, D. B.; Hadley, S. W.; Chin, S-M.

    2008-10-01

    The Energy Policy Act of 2005 (Pub. L. No. 109-58; EPAct 2005) amended the Uniform Time Act of 1966 (Pub. L. No. 89-387) to increase the portion of the year that is subject to Daylight Saving Time. (15 U.S.C. 260a note) EPAct 2005 extended the duration of Daylight Saving Time in the spring by changing its start date from the first Sunday in April to the second Sunday in March, and in the fall by changing its end date from the last Sunday in October to the first Sunday in November. (15 U.S.C. 260a note) EPAct 2005 also called for the Department of Energy to evaluate the impact of Extended Daylight Saving Time on energy consumption in the United States and to submit a report to Congress. (15 U.S.C. 260a note) This report presents the results of impacts of Extended Daylight Saving Time on national energy consumption in the United States. The key findings are: (1) The total electricity savings of Extended Daylight Saving Time were about 1.3 terawatt-hours (TWh). This corresponds to 0.5 percent for each day of Extended Daylight Saving Time, or 0.03 percent of electricity consumption over the year. For reference, total 2007 electricity consumption in the United States was 3,900 TWh. (2) In terms of national primary energy consumption, the electricity savings translate to a reduction of 17 trillion Btu (TBtu) over the spring and fall Extended Daylight Saving Time periods, or roughly 0.02 percent of total U.S. energy consumption during 2007 of 101,000 TBtu. (3) During Extended Daylight Saving Time, electricity savings generally occurred over a three- to five-hour period in the evening, with small increases in usage during the early-morning hours. On a daily percentage basis, electricity savings were slightly greater during the March (spring) extension of Extended Daylight Saving Time than the November (fall) extension. On a regional basis, some southern portions of the United States exhibited slightly smaller impacts of Extended Daylight Saving Time on energy savings compared to the northern regions, a result possibly due to a small, offsetting increase in household air conditioning usage. (4) Changes in national traffic volume and motor gasoline consumption for passenger vehicles in 2007 were determined to be statistically insignificant and therefore could not be attributed to Extended Daylight Saving Time.

  15. Building a symbolic computer algebra toolbox to compute 2D Fourier transforms in polar coordinates.

    PubMed

    Dovlo, Edem; Baddour, Natalie

    2015-01-01

    The development of a symbolic computer algebra toolbox for the computation of two dimensional (2D) Fourier transforms in polar coordinates is presented. Multidimensional Fourier transforms are widely used in image processing, tomographic reconstructions and in fact any application that requires a multidimensional convolution. By examining a function in the frequency domain, additional information and insights may be obtained. The advantages of our method include: • The implementation of the 2D Fourier transform in polar coordinates within the toolbox via the combination of two significantly simpler transforms. • The modular approach along with the idea of lookup tables implemented help avoid the issue of indeterminate results which may occur when attempting to directly evaluate the transform. • The concept also helps prevent unnecessary computation of already known transforms thereby saving memory and processing time.
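
    The decomposition described above can be illustrated numerically: expand f(r, θ) in an angular Fourier series, then apply an nth-order Hankel transform to each coefficient, so that F(ρ, ψ) = Σ_n 2π(−i)^n e^{inψ} ∫ f_n(r) J_n(ρr) r dr. The sketch below is a minimal quadrature-based illustration of this idea, not the toolbox's API; the function names and sampling scheme are assumptions.

```python
import numpy as np
from scipy.special import jv

def polar_ft_2d(f_vals, r, n_max=8):
    """Approximate 2D Fourier transform of f(r, theta) sampled on a polar
    grid: an angular FFT gives f_n(r), then an order-n Hankel transform
    (rectangle-rule quadrature) gives each term of the result."""
    n_theta = f_vals.shape[1]
    fn = np.fft.fft(f_vals, axis=1) / n_theta   # angular coefficients f_n(r)
    dr = r[1] - r[0]

    def F(rho, psi):
        total = 0.0 + 0.0j
        for n in range(-n_max, n_max + 1):
            fn_hat = np.sum(fn[:, n % n_theta] * jv(n, rho * r) * r) * dr
            total += 2.0 * np.pi * (-1j) ** n * fn_hat * np.exp(1j * n * psi)
        return total

    return F

# Radially symmetric Gaussian: only n = 0 contributes, and the exact
# transform is 2*pi*exp(-rho**2/2), so F(1, psi) ≈ 2*pi*exp(-0.5) ≈ 3.81.
r = np.linspace(0.0, 12.0, 2000)
theta = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
f = np.exp(-r[:, None] ** 2 / 2.0) * np.ones_like(theta)[None, :]
print(polar_ft_2d(f, r)(1.0, 0.3))
```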

  16. Integrated Laser Characterization, Data Acquisition, and Command and Control Test System

    NASA Technical Reports Server (NTRS)

    Stysley, Paul; Coyle, Barry; Lyness, Eric

    2012-01-01

    Satellite-based laser technology has been developed for topographical measurements of the Earth and of other planets. Lasers for such missions must be highly efficient and stable over long periods in the temperature variations of orbit. In this innovation, LabVIEW is used on an Apple Macintosh to acquire and analyze images of the laser beam as it exits the laser cavity to evaluate the laser's performance over time, and to monitor and control the environmental conditions under which the laser is tested. One computer attached to multiple cameras and instruments running LabVIEW-based software replaces a conglomeration of computers and software packages, saving hours in maintenance and data analysis, and making very long-term tests possible. This all-in-one system was written primarily using LabVIEW for Mac OS X, which allows the combining of data from multiple RS-232, USB, and Ethernet instruments for comprehensive laser analysis and control. The system acquires data from CCDs (charge-coupled devices), power meters, thermistors, and oscilloscopes over a controllable period of time. These data are saved to an HTML file that can be accessed later from a variety of data analysis programs. Also, through the LabVIEW interface, engineers can easily control laser input parameters such as current, pulse width, chiller temperature, and repetition rates. All of these parameters can be adapted and cycled over a period of time.

  17. Threshold-based queuing system for performance analysis of cloud computing system with dynamic scaling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shorgin, Sergey Ya.; Pechinkin, Alexander V.; Samouylov, Konstantin E.

    Cloud computing is a promising technology to manage and improve utilization of computing center resources to deliver various computing and IT services. For the purpose of energy saving, there is no need to unnecessarily operate many servers under light loads, and they are switched off. On the other hand, some servers should be switched on in heavy-load cases to prevent very long delays. Thus, waiting times and system operating cost can be maintained at an acceptable level by dynamically adding or removing servers. One more fact that should be taken into account is significant server setup costs and activation times. For better energy efficiency, a cloud computing system should not react to instantaneous increases or decreases of load. That is the main motivation for using queuing systems with hysteresis for cloud computing system modelling. In the paper, we provide a model of a cloud computing system in terms of a multiple-server, threshold-based, infinite-capacity queuing system with hysteresis and non-instantaneous server activation. For the proposed model, we develop a method for computing steady-state probabilities that allows estimation of a number of performance measures.
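
    A toy discrete-time rendition of the hysteresis idea follows; the thresholds, arrival model, and setup delay are hypothetical stand-ins, not the paper's analytical queuing model, which computes steady-state probabilities exactly.

```python
import random

def simulate_hysteresis(steps=10_000, arrival_rate=8.0, up_threshold=20,
                        down_threshold=5, max_servers=16, setup_delay=10):
    """Queue with threshold-based scaling and hysteresis: request a server
    when the queue exceeds up_threshold, release one when it drops below
    down_threshold. The setup delay models non-instantaneous activation,
    so the system does not chase momentary load spikes."""
    queue, active, pending = 0, 1, []          # pending = activation times
    busy = 0
    for t in range(steps):
        queue += sum(1 for _ in range(20) if random.random() < arrival_rate / 20)
        queue = max(0, queue - active)         # each server serves 1 job/step
        active += sum(1 for s in pending if s <= t)   # setup finished
        pending = [s for s in pending if s > t]
        if queue > up_threshold and active + len(pending) < max_servers:
            pending.append(t + setup_delay)    # switch one server on, delayed
        elif queue < down_threshold and active > 1:
            active -= 1                        # switch one server off
        busy += active
    return queue, active, busy / steps         # final queue, servers, mean on

print(simulate_hysteresis())
```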

  18. The Power of Flexibility: Autonomous Agents That Conserve Energy in Commercial Buildings

    NASA Astrophysics Data System (ADS)

    Kwak, Jun-young

    Agent-based systems for energy conservation are now a growing area of research in multiagent systems, with applications ranging from energy management and control on the smart grid, to energy conservation in residential buildings, to energy generation and dynamic negotiations in distributed rural communities. Contributing to this area, my thesis presents new agent-based models and algorithms aiming to conserve energy in commercial buildings. More specifically, my thesis provides three sets of algorithmic contributions. First, I provide online predictive scheduling algorithms to handle massive numbers of meeting/event scheduling requests considering flexibility, which is a novel concept for capturing generic user constraints while optimizing the desired objective. Second, I present a novel BM-MDP (Bounded-parameter Multi-objective Markov Decision Problem) model and robust algorithms for multi-objective optimization under uncertainty at both planning and execution time. The BM-MDP model and its robust algorithms are useful in (re)scheduling events to achieve energy efficiency in the presence of uncertainty over users' preferences. Third, when multiple users contribute to energy savings, fair division of credit for such savings, to incentivize users for their energy-saving activities, arises as an important question. I appeal to cooperative game theory, and specifically to the concept of the Shapley value, for this fair division. Unfortunately, scaling up the Shapley value computation is a major hindrance in practice. Therefore, I present novel approximation algorithms to efficiently compute the Shapley value based on sampling and partitions and to speed up the characteristic function computation. These new models have not only advanced the state of the art in multiagent algorithms, but have actually been successfully integrated within agents dedicated to energy efficiency: SAVES, TESLA and THINC. SAVES focuses on the day-to-day energy consumption of individuals and groups in commercial buildings by reactively suggesting energy-conserving alternatives. TESLA takes a long-range planning perspective and optimizes the overall energy consumption of a large number of group events or meetings together. THINC provides an end-to-end integration within a single agent of energy-efficient scheduling, rescheduling and credit allocation. While SAVES, TESLA and THINC thus differ in their scope and applicability, they demonstrate the utility of agent-based systems in actually reducing energy consumption in commercial buildings. I evaluate my algorithms and agents using extensive analysis on data from over 110,000 real meetings/events at multiple educational buildings, including the main libraries at the University of Southern California. I also provide results on simulations and real-world experiments, clearly demonstrating the power of agent technology to assist human users in saving energy in commercial buildings.
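
    The permutation-sampling approximation of the Shapley value that the thesis builds on fits in a few lines; the characteristic function below is a toy stand-in with diminishing returns, not the thesis's energy-savings game.

```python
import random

def shapley_sample(players, value, n_samples=5_000):
    """Monte Carlo Shapley value: average each player's marginal
    contribution over randomly sampled orderings of the players."""
    phi = {p: 0.0 for p in players}
    for _ in range(n_samples):
        order = random.sample(players, len(players))
        coalition, prev = set(), 0.0
        for p in order:
            coalition.add(p)
            v = value(coalition)
            phi[p] += v - prev          # marginal contribution of p
            prev = v
    return {p: s / n_samples for p, s in phi.items()}

value = lambda coalition: len(coalition) ** 0.5   # illustrative game
print(shapley_sample(list(range(5)), value))      # ≈ 5**0.5 / 5 each, by symmetry
```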

  19. The Ames Power Monitoring System

    NASA Technical Reports Server (NTRS)

    Osetinsky, Leonid; Wang, David

    2003-01-01

    The Ames Power Monitoring System (APMS) is a centralized system of power meters, computer hardware, and special-purpose software that collects and stores electrical power data from various facilities at Ames Research Center (ARC). This system is needed because of the large and varying nature of the overall ARC power demand, which has been observed to range from 20 to 200 MW. Large portions of peak demand can be attributed to only three wind tunnels (60, 180, and 100 MW, respectively). The APMS helps ARC avoid or minimize costly demand charges by enabling wind-tunnel operators, test engineers, and the power manager to monitor total demand for the center in real time. These persons receive the information they need to manage and schedule energy-intensive research in advance and to adjust loads in real time to ensure that the overall maximum allowable demand is not exceeded. The APMS includes a server computer running the Windows NT operating system and can, in principle, include an unlimited number of power meters and client computers. As configured at the time of reporting the information for this article, the APMS includes more than 40 power meters monitoring all the major research facilities, plus 15 Windows-based client personal computers that display real-time and historical data to users via graphical user interfaces (GUIs). The power meters and client computers communicate with the server using Transmission Control Protocol/Internet Protocol (TCP/IP) on Ethernet networks, variously, through dedicated fiber-optic cables or through the pre-existing ARC local-area network (ARCLAN). The APMS has enabled ARC to achieve significant savings ($1.2 million in 2001) in the cost of power and electric energy by helping personnel to maintain total demand below monthly allowable levels, to manage the overall power factor to avoid low-power-factor penalties, and to use historical system data to identify opportunities for additional energy savings. The APMS also provides power engineers and electricians with the information they need to plan modifications in advance and perform day-to-day maintenance of the ARC electric-power distribution system.

  20. Telescience Support Center Data System Software

    NASA Technical Reports Server (NTRS)

    Rahman, Hasan

    2010-01-01

    The Telescience Support Center (TSC) team has developed a database-driven, increment-specific Data Requirement Document (DRD) generation tool that automates much of the work required for generating and formatting the DRD. It creates a database to load the required changes to configure the TSC data system, thus eliminating a substantial amount of labor in database entry and formatting. The TSC database contains the TSC systems configuration, along with the experimental data, in which human physiological data must be de-commutated in real time. The data for each experiment also must be cataloged and archived for future retrieval. TSC software provides tools and resources for ground operation and data distribution to remote users consisting of PIs (principal investigators), biomedical engineers, scientists, engineers, payload specialists, and computer scientists. Operations support is provided for computer systems access, detailed networking, and mathematical and computational problems of International Space Station telemetry data. User training is provided for on-site staff, biomedical researchers, and other remote personnel in the usage of the space-bound services via the Internet, which enables significant resource savings for the physical facility along with the time savings versus traveling to NASA sites. The software used in support of the TSC could easily be adapted to other control center applications. This would include not only other NASA payload monitoring facilities, but also other types of control activities, such as monitoring and control of the electric grid, chemical or nuclear plant processes, air traffic control, and the like.

  1. Development of a Nonequilibrium Radiative Heating Prediction Method for Coupled Flowfield Solutions

    NASA Technical Reports Server (NTRS)

    Hartung, Lin C.

    1991-01-01

    A method for predicting radiative heating and coupling effects in nonequilibrium flow-fields has been developed. The method resolves atomic lines with a minimum number of spectral points, and treats molecular radiation using the smeared band approximation. To further minimize computational time, the calculation is performed on an optimized spectrum, which is computed for each flow condition to enhance spectral resolution. Additional time savings are obtained by performing the radiation calculation on a subgrid optimally selected for accuracy. Representative results from the new method are compared to previous work to demonstrate that the speedup does not cause a loss of accuracy and is sufficient to make coupled solutions practical. The method is found to be a useful tool for studies of nonequilibrium flows.

  2. New protocol for construction of eyeglasses-supported provisional nasal prosthesis using CAD/CAM techniques.

    PubMed

    Ciocca, Leonardo; Fantini, Massimiliano; De Crescenzio, Francesca; Persiani, Franco; Scotti, Roberto

    2010-01-01

    A new protocol for making an immediate provisional eyeglasses-supported nasal prosthesis is presented that uses laser scanning, computer-aided design/computer-aided manufacturing procedures, and rapid prototyping techniques, reducing time and costs while increasing the quality of the final product. With this protocol, the eyeglasses were digitized, and the relative position of the nasal prosthesis was planned and evaluated in a virtual environment without any try-in appointment. This innovative method saves time, reduces costs, and restores the patient's aesthetic appearance after a disfiguration caused by ablation of the nasal pyramid better than conventional restoration methods. Moreover, the digital model of the designed nasal epithesis can be used to develop a definitive prosthesis anchored to osseointegrated craniofacial implants.

  3. Development of computer program NAS3D using Vector processing for geometric nonlinear analysis of structures

    NASA Technical Reports Server (NTRS)

    Mangalgiri, P. D.; Prabhakaran, R.

    1986-01-01

    An algorithm for vectorized computation of stiffness matrices of an 8-noded isoparametric hexahedron element for geometric nonlinear analysis was developed. This was used in conjunction with the earlier 2-D program GAMNAS to develop the new program NAS3D for geometric nonlinear analysis. A conventional, modified Newton-Raphson process is used for the nonlinear analysis. New schemes for the computation of stiffness and strain energy release rates are presented. The organization of the program is explained, and some results on four sample problems are given. A study of CPU times showed that savings by a factor of 11 to 13 were achieved when vectorized computation was used for the stiffness instead of the conventional scalar computation. Finally, the scheme for inputting data is explained.
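
    The flavor of the vectorization can be shown with a batched NumPy sketch in which a two-node bar element stands in for the 8-noded hexahedron; computing all element stiffness matrices in one array operation replaces the conventional scalar loop.

```python
import numpy as np

n_elems = 100_000
EA = np.random.uniform(1.0, 2.0, n_elems)   # axial stiffness per element
L = np.random.uniform(0.5, 1.5, n_elems)    # element lengths

def stiffness_loop():
    K = np.empty((n_elems, 2, 2))
    for e in range(n_elems):                # scalar-style element loop
        k = EA[e] / L[e]
        K[e] = [[k, -k], [-k, k]]
    return K

def stiffness_vectorized():
    k = (EA / L)[:, None, None]             # one broadcast op for all elements
    return k * np.array([[1.0, -1.0], [-1.0, 1.0]])

assert np.allclose(stiffness_loop(), stiffness_vectorized())
```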

  4. TICK: Transparent Incremental Checkpointing at Kernel Level

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petrini, Fabrizio; Gioiosa, Roberto

    2004-10-25

    TICK is a software package implemented in Linux 2.6 that allows saving and restoring user processes without any change to the user code or binary. With TICK, a process can be suspended by the Linux kernel upon receiving an interrupt and saved to a file. This file can later be thawed on another computer running Linux (potentially the same computer). TICK is implemented as a Linux kernel module in Linux version 2.6.5.

  5. A High-Performance Genetic Algorithm: Using Traveling Salesman Problem as a Case

    PubMed Central

    Tsai, Chun-Wei; Tseng, Shih-Pang; Yang, Chu-Sing

    2014-01-01

    This paper presents a simple but efficient algorithm for reducing the computation time of genetic algorithm (GA) and its variants. The proposed algorithm is motivated by the observation that genes common to all the individuals of a GA have a high probability of surviving the evolution and ending up being part of the final solution; as such, they can be saved away to eliminate the redundant computations at the later generations of a GA. To evaluate the performance of the proposed algorithm, we use it not only to solve the traveling salesman problem but also to provide an extensive analysis on the impact it may have on the quality of the end result. Our experimental results indicate that the proposed algorithm can significantly reduce the computation time of GA and GA-based algorithms while limiting the degradation of the quality of the end result to a very small percentage compared to traditional GA. PMID:24892038
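
    The core observation, that alleles shared by the entire population can be frozen and skipped in later generations, reduces to a one-line mask for a bit-string GA; this is a schematic of the idea, not the paper's TSP-specific implementation.

```python
import numpy as np

def frozen_gene_mask(population):
    """Positions where every individual agrees; these genes can be cached
    and excluded from crossover, mutation, and fitness re-evaluation."""
    return np.all(population == population[0], axis=0)

pop = np.random.randint(0, 2, size=(50, 100))  # 50 individuals, 100 genes
pop[:, 10] = 1                                 # gene 10 has converged
mask = frozen_gene_mask(pop)
free_genes = np.where(~mask)[0]                # only these keep evolving
print(mask[10], free_genes.size)
```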

  6. A high-performance genetic algorithm: using traveling salesman problem as a case.

    PubMed

    Tsai, Chun-Wei; Tseng, Shih-Pang; Chiang, Ming-Chao; Yang, Chu-Sing; Hong, Tzung-Pei

    2014-01-01

    This paper presents a simple but efficient algorithm for reducing the computation time of genetic algorithm (GA) and its variants. The proposed algorithm is motivated by the observation that genes common to all the individuals of a GA have a high probability of surviving the evolution and ending up being part of the final solution; as such, they can be saved away to eliminate the redundant computations at the later generations of a GA. To evaluate the performance of the proposed algorithm, we use it not only to solve the traveling salesman problem but also to provide an extensive analysis on the impact it may have on the quality of the end result. Our experimental results indicate that the proposed algorithm can significantly reduce the computation time of GA and GA-based algorithms while limiting the degradation of the quality of the end result to a very small percentage compared to traditional GA.

  7. Fast dictionary generation and searching for magnetic resonance fingerprinting.

    PubMed

    Jun Xie; Mengye Lyu; Jian Zhang; Hui, Edward S; Wu, Ed X; Ze Wang

    2017-07-01

    A super-fast dictionary generation and searching (DGS) algorithm was developed for MR parameter quantification using magnetic resonance fingerprinting (MRF). MRF is a new technique for simultaneously quantifying multiple MR parameters using one temporally resolved MR scan. However, it has a multiplicative computation complexity, resulting in a heavy burden of dictionary generation, storage, and retrieval that can easily become intractable even for state-of-the-art computers. Based on retrospective analysis of the dictionary-matching objective function, a multi-scale, ZOOM-like DGS algorithm, dubbed MRF-ZOOM, was proposed. MRF-ZOOM is quasi-parameter-separable, so the multiplicative computation complexity is broken into an additive one. Evaluations showed that MRF-ZOOM was hundreds or thousands of times faster than the original MRF parameter quantification method, even before accounting for dictionary generation time. Using real data, it yielded nearly the same results as the original method. MRF-ZOOM provides a super-fast solution for MR parameter quantification.
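
    A generic coarse-to-fine ("zoom") search over a two-parameter dictionary shows how a multiplicative grid cost becomes additive in the number of refinement levels; the signal model below is an arbitrary stand-in, not the Bloch-simulated MRF sequence.

```python
import numpy as np

tes = np.linspace(0.05, 3.0, 100)

def signal(t1, t2):
    # Stand-in two-parameter signal; real MRF atoms come from Bloch simulation.
    return np.sin(tes / t1) * np.exp(-tes / t2)

def zoom_match(measured, lo=(0.1, 0.1), hi=(3.0, 3.0), grid=8, levels=5):
    """At each level, evaluate a small grid, keep the best-matching cell
    (by normalized correlation), and zoom into it."""
    (t1_lo, t2_lo), (t1_hi, t2_hi) = lo, hi
    for _ in range(levels):
        best, best_c = None, -np.inf
        for t1 in np.linspace(t1_lo, t1_hi, grid):
            for t2 in np.linspace(t2_lo, t2_hi, grid):
                d = signal(t1, t2)
                c = abs(measured @ d) / (np.linalg.norm(d) + 1e-12)
                if c > best_c:
                    best, best_c = (t1, t2), c
        dt1 = (t1_hi - t1_lo) / grid
        dt2 = (t2_hi - t2_lo) / grid
        t1_lo, t1_hi = max(best[0] - dt1, 1e-3), best[0] + dt1
        t2_lo, t2_hi = max(best[1] - dt2, 1e-3), best[1] + dt2
    return best

print(zoom_match(signal(1.2, 0.8)))   # should land near (1.2, 0.8)
```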

  8. Symplectic multi-particle tracking on GPUs

    NASA Astrophysics Data System (ADS)

    Liu, Zhicong; Qiang, Ji

    2018-05-01

    A symplectic multi-particle tracking model is implemented on Graphics Processing Units (GPUs) using the Compute Unified Device Architecture (CUDA) language. The symplectic tracking model can preserve phase space structure and reduce non-physical effects in long-term simulation, which is important for beam property evaluation in particle accelerators. Though this model is computationally expensive, it is very suitable for parallelization and can be accelerated significantly by using GPUs. In this paper, we optimized the implementation of the symplectic tracking model on both single and multiple GPUs. Using a single GPU processor, the code achieves a factor of 2-10 speedup for a range of problem sizes compared with the time on a single state-of-the-art Central Processing Unit (CPU) node with similar power consumption and semiconductor technology. It also shows good scalability on a multi-GPU cluster at the Oak Ridge Leadership Computing Facility. In an application to beam dynamics simulation, the GPU implementation saves more than a factor of two in total computing time in comparison to the CPU implementation.
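
    The tracking loop is data-parallel over particles, which is why it maps so well to GPUs. A NumPy-vectorized second-order drift-kick-drift step shows the structure; the linear focusing kick is illustrative, not the paper's lattice model, and swapping `numpy` for a GPU array library such as CuPy runs the same arithmetic on the device.

```python
import numpy as np   # e.g. `import cupy as np` to target a GPU

def drift_kick_drift(x, p, dt, k=1.0):
    """One second-order symplectic step applied to all particles at once;
    F = -k*x is a stand-in for the real accelerator lattice forces."""
    x = x + 0.5 * dt * p        # half drift
    p = p - dt * k * x          # kick
    x = x + 0.5 * dt * p        # half drift
    return x, p

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1e-3, 1_000_000)    # transverse positions
p = rng.normal(0.0, 1e-3, 1_000_000)    # transverse momenta
for _ in range(100):                    # track one million particles
    x, p = drift_kick_drift(x, p, 1e-2)
print(float(x.std()), float(p.std()))
```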

  9. Testing simulation and structural models with applications to energy demand

    NASA Astrophysics Data System (ADS)

    Wolff, Hendrik

    2007-12-01

    This dissertation deals with energy demand and consists of two parts. Part one proposes a unified econometric framework for modeling energy demand and examples illustrate the benefits of the technique by estimating the elasticity of substitution between energy and capital. Part two assesses the energy conservation policy of Daylight Saving Time and empirically tests the performance of electricity simulation. In particular, the chapter "Imposing Monotonicity and Curvature on Flexible Functional Forms" proposes an estimator for inference using structural models derived from economic theory. This is motivated by the fact that in many areas of economic analysis theory restricts the shape as well as other characteristics of functions used to represent economic constructs. Specific contributions are (a) to increase the computational speed and tractability of imposing regularity conditions, (b) to provide regularity preserving point estimates, (c) to avoid biases existent in previous applications, and (d) to illustrate the benefits of our approach via numerical simulation results. The chapter "Can We Close the Gap between the Empirical Model and Economic Theory" discusses the more fundamental question of whether the imposition of a particular theory to a dataset is justified. I propose a hypothesis test to examine whether the estimated empirical model is consistent with the assumed economic theory. Although the proposed methodology could be applied to a wide set of economic models, this is particularly relevant for estimating policy parameters that affect energy markets. This is demonstrated by estimating the Slutsky matrix and the elasticity of substitution between energy and capital, which are crucial parameters used in computable general equilibrium models analyzing energy demand and the impacts of environmental regulations. Using the Berndt and Wood dataset, I find that capital and energy are complements and that the data are significantly consistent with duality theory. Both results would not necessarily be achieved using standard econometric methods. The final chapter "Daylight Time and Energy" uses a quasi-experiment to evaluate a popular energy conservation policy: we challenge the conventional wisdom that extending Daylight Saving Time (DST) reduces energy demand. Using detailed panel data on half-hourly electricity consumption, prices, and weather conditions from four Australian states we employ a novel 'triple-difference' technique to test the electricity-saving hypothesis. We show that the extension failed to reduce electricity demand and instead increased electricity prices. We also apply the most sophisticated electricity simulation model available in the literature to the Australian data. We find that prior simulation models significantly overstate electricity savings. Our results suggest that extending DST will fail as an instrument to save energy resources.

  10. Calculation Software

    NASA Technical Reports Server (NTRS)

    1994-01-01

    MathSoft Plus 5.0 is a calculation software package for electrical engineers and computer scientists who need advanced math functionality. It incorporates SmartMath, an expert system that determines a strategy for solving difficult mathematical problems. SmartMath was the result of the integration into Mathcad of CLIPS, a NASA-developed shell for creating expert systems. By using CLIPS, MathSoft, Inc. was able to save the time and money involved in writing the original program.

  11. A Reduced Dimension Static, Linearized Kalman Filter and Smoother

    NASA Technical Reports Server (NTRS)

    Fukumori, I.

    1995-01-01

    An approximate Kalman filter and smoother, based on approximations of the state estimation error covariance matrix, is described. Approximations include a reduction of the effective state dimension, use of a static asymptotic error limit, and a time-invariant linearization of the dynamic model for error integration. The approximations lead to dramatic computational savings in applying estimation theory to large complex systems. Examples of use come from TOPEX/POSEIDON.
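
    For a linear time-invariant system, the static error-limit approximation amounts to iterating the Riccati recursion offline to its fixed point and then filtering with the constant asymptotic gain, so no covariance matrix is propagated online. A small illustrative sketch follows; the model matrices are arbitrary, not the TOPEX/POSEIDON system.

```python
import numpy as np

def asymptotic_gain(A, C, Q, R, iters=500):
    """Iterate the discrete Riccati recursion to convergence and return
    the steady-state Kalman gain."""
    P = Q.copy()
    for _ in range(iters):
        P = A @ P @ A.T + Q                                # predict
        K = P @ C.T @ np.linalg.inv(C @ P @ C.T + R)       # gain
        P = (np.eye(len(A)) - K @ C) @ P                   # update
    return K

A = np.array([[1.0, 0.1], [0.0, 0.95]])
C = np.array([[1.0, 0.0]])
Q, R = 0.01 * np.eye(2), np.array([[0.25]])
K = asymptotic_gain(A, C, Q, R)

x_hat = np.zeros(2)
for y in [0.3, 0.5, 0.4]:                       # incoming measurements
    x_hat = A @ x_hat                           # predict with the model
    x_hat = x_hat + K @ (y - C @ x_hat)         # static-gain correction
print(x_hat)
```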

  12. Numerical Solutions for a Cylindrical Laser Diffuser Flowfield

    DTIC Science & Technology

    1990-06-01

    exhaust conditions with minimum losses to optimize performance of the system. Thus, the handling of the system of shock waves to decelerate the flow...requirement for exhaustive experimental work will result in significant savings of both time and resources. As more advanced computers are developed, the...Mach number (<0.5) flows. Recent interest in hypersonic engine inlet performance has resulted in an extension of the methodology to high Mach number

  13. When are solar refrigerators less costly than on-grid refrigerators: A simulation modeling study

    PubMed Central

    Haidari, Leila A.; Brown, Shawn T.; Wedlock, Patrick; Connor, Diana L.; Spiker, Marie; Lee, Bruce Y.

    2017-01-01

    Background Gavi recommends solar refrigerators for vaccine storage in areas with less than eight hours of electricity per day, and WHO guidelines are more conservative. The question remains: Can solar refrigerators provide value where electrical outages are less frequent? Methods Using a HERMES-generated computational model of the Mozambique routine immunization supply chain, we simulated the use of solar versus electric mains-powered refrigerators (hereafter referred to as “electric refrigerators”) at different locations in the supply chain under various circumstances. Results At their current price premium, the annual cost of each solar refrigerator is 132% more than each electric refrigerator at the district level and 241% more at health facilities. Solar refrigerators provided savings over electric refrigerators when one-day electrical outages occurred more than five times per year at either the district level or the health facilities, even when the electric refrigerator holdover time exceeded the duration of the outage. Two-day outages occurring more than three times per year at the district level or more than twice per year at the health facilities also caused solar refrigerators to be cost saving. Lowering the annual cost of a solar refrigerator to 75% more than an electric refrigerator allowed solar refrigerators to be cost saving at either level when one-day outages occurred more than once per year, or when two-day outages occurred more than once per year at the district level or even once per year at the health facilities. Conclusion Our study supports WHO and Gavi guidelines. In fact, solar refrigerators may provide savings in total cost per dose administered over electrical refrigerators when electrical outages are less frequent. Our study identified the frequency and duration at which electrical outages need to occur for solar refrigerators to provide savings in total cost per dose administered over electric refrigerators at different solar refrigerator prices. PMID:28364935

  14. When are solar refrigerators less costly than on-grid refrigerators: A simulation modeling study.

    PubMed

    Haidari, Leila A; Brown, Shawn T; Wedlock, Patrick; Connor, Diana L; Spiker, Marie; Lee, Bruce Y

    2017-04-19

    Gavi recommends solar refrigerators for vaccine storage in areas with less than eight hours of electricity per day, and WHO guidelines are more conservative. The question remains: Can solar refrigerators provide value where electrical outages are less frequent? Using a HERMES-generated computational model of the Mozambique routine immunization supply chain, we simulated the use of solar versus electric mains-powered refrigerators (hereafter referred to as "electric refrigerators") at different locations in the supply chain under various circumstances. At their current price premium, the annual cost of each solar refrigerator is 132% more than each electric refrigerator at the district level and 241% more at health facilities. Solar refrigerators provided savings over electric refrigerators when one-day electrical outages occurred more than five times per year at either the district level or the health facilities, even when the electric refrigerator holdover time exceeded the duration of the outage. Two-day outages occurring more than three times per year at the district level or more than twice per year at the health facilities also caused solar refrigerators to be cost saving. Lowering the annual cost of a solar refrigerator to 75% more than an electric refrigerator allowed solar refrigerators to be cost saving at either level when one-day outages occurred more than once per year, or when two-day outages occurred more than once per year at the district level or even once per year at the health facilities. Our study supports WHO and Gavi guidelines. In fact, solar refrigerators may provide savings in total cost per dose administered over electrical refrigerators when electrical outages are less frequent. Our study identified the frequency and duration at which electrical outages need to occur for solar refrigerators to provide savings in total cost per dose administered over electric refrigerators at different solar refrigerator prices. Copyright © 2017. Published by Elsevier Ltd.

  15. Statistical and temporal irradiance fluctuations modeling for a ground-to-geostationary satellite optical link.

    PubMed

    Camboulives, A-R; Velluet, M-T; Poulenard, S; Saint-Antonin, L; Michau, V

    2018-02-01

    The performance of an optical communication link between the ground and a geostationary satellite can be impaired by scintillation, beam wandering, and beam spreading due to propagation through atmospheric turbulence. These effects on link performance can be mitigated by tracking and by error-correction codes coupled with interleaving. Precise numerical tools capable of describing the irradiance fluctuations statistically and of creating an irradiance time series are needed to characterize the benefits of these techniques and optimize them. Wave-optics propagation methods have proven their capability of modeling the effects of atmospheric turbulence on a beam, but they are known to be computationally intensive. We present an analytical-numerical model which provides good results on the probability density functions of irradiance fluctuations, as well as a time series, with substantial savings in time and computational resources.

  16. DEEP: Database of Energy Efficiency Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong, Tianzhen; Piette, Mary; Lee, Sang Hoon

    A database of energy efficiency performance (DEEP) is a presimulated database to enable quick and accurate assessment of energy retrofits of commercial buildings. DEEP was compiled from the results of about 10 million EnergyPlus simulations. DEEP provides energy savings for screening and evaluation of retrofit measures targeting small and medium-sized office and retail buildings in California. The prototype building models are developed for a comprehensive assessment of building energy performance based on DOE commercial reference buildings and the California DEER [sic] prototype buildings. The prototype buildings represent seven building types across six vintages of construction and 16 California climate zones. DEEP uses these prototypes to evaluate the energy performance of about 100 energy conservation measures covering envelope, lighting, heating, ventilation, air conditioning, plug loads, and domestic hot water. DEEP consists of the energy simulation results for individual retrofit measures as well as packages of measures, to consider interactive effects between multiple measures. The large-scale EnergyPlus simulations are being conducted on the supercomputers at the National Energy Research Scientific Computing Center (NERSC) of Lawrence Berkeley National Laboratory. The presimulated database is part of the CEC PIER project to develop a web-based retrofit toolkit for small and medium-sized commercial buildings in California, which provides real-time energy retrofit feedback by querying DEEP with recommended measures, estimated energy savings, and financial payback period based on users' decision criteria of maximizing energy savings, energy cost savings, carbon reduction, or payback of investment. The presimulated database and associated comprehensive measure analysis enhance the ability to perform assessments of retrofits to reduce energy use for small and medium buildings and business owners who typically do not have the resources to conduct a costly building energy audit.

  17. Tutorial Guide: Computer-Aided Structural Modeling (CASM). Version 5.00

    DTIC Science & Technology

    1994-04-01

    SITE-SPECIFIC DATA DIALOG WINDOW ... SAVING PROJECT DATA ... PRINTING PROJECT CRITERIA DATA ... you to the main CASM screen without saving changes. REGIONAL DATA DIALOG WINDOW: The Regional Data dialog window ... information so that it will be included in your hardcopy output. 3. Select OK to save your Regional Data entries. The Regional Data dialog window will disappear

  18. Prediction of Software Reliability using Bio Inspired Soft Computing Techniques.

    PubMed

    Diwaker, Chander; Tomar, Pradeep; Poonia, Ramesh C; Singh, Vijander

    2018-04-10

    Many models have been developed for predicting software reliability. Existing reliability models are restricted to particular methodologies and limited numbers of parameters, although a range of techniques and methodologies may be used for reliability prediction. There is a need to focus on parameter selection when estimating reliability, since the reliability of a system may increase or decrease depending on the parameters chosen; factors that heavily affect system reliability must therefore be identified. Reusability is now widely studied and is the basis of Component-Based Systems (CBS). Cost, time, and human effort can be saved using Component-Based Software Engineering (CBSE) concepts, and CBSE metrics may be used to assess which techniques are more suitable for estimating system reliability. Soft computing is used for small- as well as large-scale problems where it is difficult to find accurate results due to uncertainty or randomness. Several possibilities exist for applying soft computing techniques to problems in medicine: clinical medicine uses fuzzy logic and neural network methodologies extensively, basic medical science most frequently uses neural-network-genetic-algorithm hybrids, and medical scientists have shown considerable interest in applying soft computing methodologies in genetics, physiology, radiology, cardiology, and neurology. CBSE encourages users to reuse past and existing software when making new products, providing quality with savings of time, memory space, and money. This paper focuses on the assessment of commonly used soft computing techniques: Genetic Algorithm (GA), Neural Network (NN), Fuzzy Logic, Support Vector Machine (SVM), Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), and Artificial Bee Colony (ABC). It presents the working of these soft computing techniques and assesses them for reliability prediction; the parameters considered while estimating and predicting reliability are also discussed. This study can be used in estimating and predicting the reliability of various instruments used in medical systems, software engineering, computer engineering, and mechanical engineering. These concepts can be applied to both software and hardware to predict reliability using CBSE.

  19. Methods for evaluating and ranking transportation energy conservation programs

    NASA Astrophysics Data System (ADS)

    Santone, L. C.

    1981-04-01

    The energy conservation programs are assessed in terms of petroleum savings, incremental costs to consumers, probability of technical and market success, and external impacts due to environmental, economic, and social factors. Three ranking functions and a policy matrix are used to evaluate the programs. The net present value measure, which computes the present worth of petroleum savings less the present worth of costs, is modified by dividing by the present value of DOE funding to obtain a net present value per program dollar. The comprehensive ranking function takes external impacts into account. Procedures are described for making computations of the ranking functions and the attributes that require computation. Computations are made for the electric vehicle, Stirling engine, gas turbine, and MPG mileage guide programs.
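
    In symbols (my notation, not the report's), the modified ranking function is the net present value per dollar of program funding:

```latex
\mathrm{NPV}_{\$}
  = \frac{\mathrm{PV}(\text{petroleum savings}) - \mathrm{PV}(\text{incremental costs})}
         {\mathrm{PV}(\text{DOE program funding})}
```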

  20. User interface user's guide for HYPGEN

    NASA Technical Reports Server (NTRS)

    Chiu, Ing-Tsau

    1992-01-01

    The user interface (UI) of HYPGEN is developed using Panel Library to shorten the learning curve for new users and provide easier ways to run HYPGEN for casual users as well as for advanced users. Menus, buttons, sliders, and type-in fields are used extensively in UI to allow users to point and click with a mouse to choose various available options or to change values of parameters. On-line help is provided to give users information on using UI without consulting the manual. Default values are set for most parameters and boundary conditions are determined by UI to further reduce the effort needed to run HYPGEN; however, users are free to make any changes and save it in a file for later use. A hook to PLOT3D is built in to allow graphics manipulation. The viewpoint and min/max box for PLOT3D windows are computed by UI and saved in a PLOT3D journal file. For large grids which take a long time to generate on workstations, the grid generator (HYPGEN) can be run on faster computers such as Crays, while UI stays at the workstation.

  1. Efficiency and Accuracy in Thermal Simulation of Powder Bed Fusion of Bulk Metallic Glass

    NASA Astrophysics Data System (ADS)

    Lindwall, J.; Malmelöv, A.; Lundbäck, A.; Lindgren, L.-E.

    2018-05-01

    Additive manufacturing by powder bed fusion processes can be utilized to create bulk metallic glass, as the process yields considerably high cooling rates. However, there is a risk that material in previously deposited layers may be reheated and become devitrified, i.e., crystallize. Therefore, it is advantageous to simulate the process to fully comprehend it and to design it so as to avoid the aforementioned risk. However, a detailed simulation is computationally demanding. It is necessary to increase the computational speed while maintaining accuracy of the computed temperature field in critical regions. The current study evaluates a few approaches based on temporal reduction to achieve this. It is found that the evaluated approaches save substantial time and accurately predict the temperature history.

  2. Use of computational fluid dynamics in respiratory medicine.

    PubMed

    Fernández Tena, Ana; Casan Clarà, Pere

    2015-06-01

    Computational Fluid Dynamics (CFD) is a computer-based tool for simulating fluid movement. The main advantages of CFD over other fluid mechanics studies include: substantial savings in time and cost, the analysis of systems or conditions that are very difficult to simulate experimentally (as is the case of the airways), and a practically unlimited level of detail. We used the Ansys-Fluent CFD program to develop a conducting airway model to simulate different inspiratory flow rates and the deposition of inhaled particles of varying diameters, obtaining results consistent with those reported in the literature using other procedures. We hope this approach will enable clinicians to further individualize the treatment of different respiratory diseases. Copyright © 2014 SEPAR. Published by Elsevier Espana. All rights reserved.

  3. Modular thermal analyzer routine, volume 1

    NASA Technical Reports Server (NTRS)

    Oren, J. A.; Phillips, M. A.; Williams, D. R.

    1972-01-01

    The Modular Thermal Analyzer Routine (MOTAR) is a general thermal analysis routine with strong capabilities for performing thermal analysis of systems containing flowing fluids, fluid system controls (valves, heat exchangers, etc.), life support systems, and thermal radiation situations. Its modular organization permits the analysis of a very wide range of thermal problems, from simple problems containing a few conduction nodes to those containing complicated flow and radiation analysis, with each problem type being analyzed with peak computational efficiency and maximum ease of use. The organization and programming methods applied to MOTAR achieved a high degree of computer utilization efficiency in terms of computer execution time and storage space required for a given problem. The computer time required to perform a given problem on MOTAR is approximately 40 to 50 percent of that required by the currently existing, widely used routines. The computer storage requirement for MOTAR is approximately 25 percent more than the most commonly used routines for the simplest problems, but the data storage techniques for the more complicated options should save a considerable amount of space.

  4. Assessing efficiency and economic viability of rainwater harvesting systems for meeting non-potable water demands in four climatic zones of China

    NASA Astrophysics Data System (ADS)

    Zhang, S.; Jing, X.

    2017-12-01

    Rainwater harvesting is now increasingly used to manage urban flooding and alleviate water scarcity. In this study, a computational tool based on the water balance equation is developed to assess the stormwater-capture and water-saving efficiency and economic viability of rainwater harvesting systems (RHS) in eight cities across four climatic zones of China. It requires daily rainfall, contributing area, runoff losses, first-flush volume, storage capacity, daily water demand, and economic parameters as inputs. Three non-potable water demand scenarios (i.e., toilet flushing, lawn irrigation, and a combination of the two) are considered. The water demand for lawn irrigation is estimated using Cropwat 8.0 and Climwat 2.0. Results indicate that higher water-saving efficiency and water-supply time reliability are achieved for RHS with larger storage capacities, lower water demand scenarios, and locations in more humid regions, while higher stormwater-capture efficiency is associated with larger storage capacity, higher water demand scenarios, and less rainfall. For instance, a 40 m3 RHS in Shanghai (humid climate) used for lawn irrigation can capture 17% of stormwater, while its water-saving efficiency and time reliability can reach 96% and 98%, respectively. The water-saving efficiency and time reliability of a 20 m3 RHS in Xining (semi-arid climate) used for toilet flushing are 19% and 16%, respectively, but it can capture 63% of stormwater. With current values of the economic parameters, economic viability of RHS can be achieved in humid and semi-humid regions for reasonably designed RHS; however, it is not financially viable to install RHS in arid regions, as the benefit-cost ratio is much smaller than 1.0.
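
    The daily bookkeeping behind such a water-balance tool fits in a short loop. The sketch below uses a yield-after-inflow convention and placeholder parameter values, not the study's calibrated inputs.

```python
import random

def simulate_rhs(daily_rain_mm, area_m2=200.0, runoff_coeff=0.85,
                 first_flush_mm=1.0, capacity_m3=40.0, demand_m3=0.8):
    """Daily water balance of a rainwater harvesting system. Returns the
    water-saving efficiency (supplied/demand) and the stormwater-capture
    efficiency (captured/total runoff)."""
    storage = supplied = captured = runoff_total = 0.0
    for rain in daily_rain_mm:
        runoff = max(0.0, rain - first_flush_mm) * runoff_coeff * area_m2 / 1000
        runoff_total += runoff
        inflow = min(runoff, capacity_m3 - storage)   # excess overflows
        storage += inflow
        captured += inflow
        use = min(storage, demand_m3)                 # meet demand if possible
        storage -= use
        supplied += use
    days = len(daily_rain_mm)
    return supplied / (demand_m3 * days), captured / max(runoff_total, 1e-9)

# One synthetic year: rain on ~30% of days, mean 4 mm on wet days.
rain = [random.expovariate(1 / 4.0) if random.random() < 0.3 else 0.0
        for _ in range(365)]
print(simulate_rhs(rain))
```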

  5. A high performance load balance strategy for real-time multicore systems.

    PubMed

    Cho, Keng-Mao; Tsai, Chun-Wei; Chiu, Yi-Shiuan; Yang, Chu-Sing

    2014-01-01

    Finding ways to distribute workloads to each processor core and efficiently reduce power consumption is of vital importance, especially for real-time systems. In this paper, a novel scheduling algorithm is proposed for real-time multicore systems to balance the computation loads and save power. The developed algorithm simultaneously considers multiple criteria, a novel factor, and task deadline, and is called power and deadline-aware multicore scheduling (PDAMS). Experiment results show that the proposed algorithm can greatly reduce energy consumption by up to 54.2% and the deadline times missed, as compared to the other scheduling algorithms outlined in this paper.

  6. A High Performance Load Balance Strategy for Real-Time Multicore Systems

    PubMed Central

    Cho, Keng-Mao; Tsai, Chun-Wei; Chiu, Yi-Shiuan; Yang, Chu-Sing

    2014-01-01

    Finding ways to distribute workloads to each processor core and efficiently reduce power consumption is of vital importance, especially for real-time systems. In this paper, a novel scheduling algorithm is proposed for real-time multicore systems to balance the computation loads and save power. The developed algorithm simultaneously considers multiple criteria, a novel factor, and task deadline, and is called power and deadline-aware multicore scheduling (PDAMS). Experiment results show that the proposed algorithm can greatly reduce energy consumption by up to 54.2% and the deadline times missed, as compared to the other scheduling algorithms outlined in this paper. PMID:24955382

  7. A fast parallel 3D Poisson solver with longitudinal periodic and transverse open boundary conditions for space-charge simulations

    NASA Astrophysics Data System (ADS)

    Qiang, Ji

    2017-10-01

    A three-dimensional (3D) Poisson solver with longitudinal periodic and transverse open boundary conditions can have important applications in beam physics of particle accelerators. In this paper, we present a fast, efficient method to solve the Poisson equation using a spectral finite-difference method. This method uses a computational domain that contains the charged particle beam only and has a computational complexity of O(N_u log N_mode), where N_u is the total number of unknowns and N_mode is the maximum number of longitudinal or azimuthal modes. This saves both the computational time and the memory usage of using an artificial boundary condition in a large extended computational domain. The new 3D Poisson solver is parallelized using a message passing interface (MPI) on multi-processor computers and shows a reasonable parallel performance up to hundreds of processor cores.
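
    The structure of such a solver is easiest to see in two dimensions: an FFT along the periodic direction decouples the longitudinal modes, leaving one small tridiagonal solve per mode. The sketch below replaces the transverse open-boundary treatment with a simple Dirichlet condition for brevity, so it illustrates the mode decomposition, not the paper's solver.

```python
import numpy as np
from scipy.linalg import solve_banded

def poisson_periodic_x(rho, Lx, Ly):
    """Solve u_xx + u_yy = rho with x periodic and u = 0 at the y edges.
    FFT in x decouples modes; each mode costs one tridiagonal solve."""
    nx, ny = rho.shape
    dy = Ly / (ny + 1)
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=Lx / nx)
    rho_hat = np.fft.fft(rho, axis=0)
    u_hat = np.empty_like(rho_hat)
    for i, k in enumerate(kx):               # independent per mode
        ab = np.zeros((3, ny), dtype=complex)
        ab[0, 1:] = 1.0 / dy**2              # superdiagonal
        ab[1, :] = -2.0 / dy**2 - k**2       # main diagonal
        ab[2, :-1] = 1.0 / dy**2             # subdiagonal
        u_hat[i] = solve_banded((1, 1), ab, rho_hat[i])
    return np.fft.ifft(u_hat, axis=0).real

# Manufactured-solution check: u = sin(2*pi*x/Lx) * sin(pi*y/Ly).
nx, ny, Lx, Ly = 64, 127, 1.0, 1.0
x = np.arange(nx) * Lx / nx
y = np.arange(1, ny + 1) * Ly / (ny + 1)
X, Y = np.meshgrid(x, y, indexing="ij")
u = np.sin(2 * np.pi * X / Lx) * np.sin(np.pi * Y / Ly)
rho = -((2 * np.pi / Lx) ** 2 + (np.pi / Ly) ** 2) * u
print(np.abs(poisson_periodic_x(rho, Lx, Ly) - u).max())   # small FD error
```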

  8. Computer Programs (Turbomachinery)

    NASA Technical Reports Server (NTRS)

    1978-01-01

    NASA computer programs are extensively used in design of industrial equipment. Available from the Computer Software Management and Information Center (COSMIC) at the University of Georgia, these programs are employed as analysis tools in design, test and development processes, providing savings in time and money. For example, two NASA computer programs are used daily in the design of turbomachinery by Delaval Turbine Division, Trenton, New Jersey. The company uses the NASA splint interpolation routine for analysis of turbine blade vibration and the performance of compressors and condensers. A second program, the NASA print plot routine, analyzes turbine rotor response and produces graphs for project reports. The photos show examples of Delaval test operations in which the computer programs play a part. In the large photo below, a 24-inch turbine blade is undergoing test; in the smaller photo, a steam turbine rotor is being prepared for stress measurements under actual operating conditions; the "spaghetti" is wiring for test instrumentation

  9. Building a symbolic computer algebra toolbox to compute 2D Fourier transforms in polar coordinates

    PubMed Central

    Dovlo, Edem; Baddour, Natalie

    2015-01-01

    The development of a symbolic computer algebra toolbox for the computation of two dimensional (2D) Fourier transforms in polar coordinates is presented. Multidimensional Fourier transforms are widely used in image processing, tomographic reconstructions and in fact any application that requires a multidimensional convolution. By examining a function in the frequency domain, additional information and insights may be obtained. The advantages of our method include: • The implementation of the 2D Fourier transform in polar coordinates within the toolbox via the combination of two significantly simpler transforms. • The modular approach along with the idea of lookup tables implemented help avoid the issue of indeterminate results which may occur when attempting to directly evaluate the transform. • The concept also helps prevent unnecessary computation of already known transforms thereby saving memory and processing time. PMID:26150988

  10. Computer Technology for Industry

    NASA Technical Reports Server (NTRS)

    1979-01-01

    In this age of the computer, more and more business firms are automating their operations for increased efficiency in a great variety of jobs, from simple accounting to managing inventories, from precise machining to analyzing complex structures. In the interest of national productivity, NASA is providing assistance both to longtime computer users and newcomers to automated operations. Through a special technology utilization service, NASA saves industry time and money by making available already developed computer programs which have secondary utility. A computer program is essentially a set of instructions which tells the computer how to produce desired information or effect by drawing upon its stored input. Developing a new program from scratch can be costly and time-consuming. Very often, however, a program developed for one purpose can readily be adapted to a totally different application. To help industry take advantage of existing computer technology, NASA operates the Computer Software Management and Information Center (COSMIC) (registered trademark), located at the University of Georgia. COSMIC maintains a large library of computer programs developed for NASA, the Department of Defense, the Department of Energy and other technology-generating agencies of the government. The Center gets a continual flow of software packages, screens them for adaptability to private sector usage, stores them and informs potential customers of their availability.

  11. Finding workers, offenders, or students most at-risk for violence: actuarial tests save lives and resources.

    PubMed

    Zagar, Robert John; Kovach, Joseph W; Basile, Benjamin; Hughes, John Russell; Grove, William M; Busch, Kenneth G; Zablocki, Michael; Osnowitz, William; Neuhengen, Jonas; Liu, Yutong; Zagar, Agata Karolina

    2013-12-01

    147 adults (107 men, 40 women) and 89 adolescents (61 boys, 28 girls), selected randomly from referrals and volunteers, were given the Ammons Quick Test (QT), the Beck Suicide Scale (BSS), the Minnesota Multiphasic Personality Inventory Second (MMPI-2) or Adolescent Versions (MMPI-A), the Raven's Advanced Progressive Matrices, and the Standard Predictor (SP) of Violence Potential Adult or Adolescent Versions. The goals were to: (a) demonstrate computer and paper-and-pencil tests correlated; (b) validate tests to identify at-risk for violence; (c) show that identifying at-risk saves lives and resources; and (d) find which industries benefited from testing at-risk. Paper-and-pencil vs. computer test correlations (.83-.99), sensitivity (.97-.98), and specificity (.50-.97) were computed. Testing at-risk saves lives and resources. Critical industries for testing at-risk individuals may include airlines, energy generating industries, insurance, military, nonprofit-religious, prisoners, trucking or port workers, and veterans.

  12. High performance pipelined multiplier with fast carry-save adder

    NASA Technical Reports Server (NTRS)

    Wu, Angus

    1990-01-01

    A high-performance pipelined multiplier is described. Its high performance results from the fast carry-save adder basic cell, which has a simple structure and is suitable for the Gate Forest semi-custom environment. The carry-save adder computes the sum and carry within two gate delays. Results show that the proposed adder can operate at 200 MHz for a 2-micron CMOS process; better performance is expected in a Gate Forest realization.
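
    Functionally, a carry-save stage reduces three addends to a sum word and a carry word with no carry propagation between bit positions, which is what bounds the delay at two gate levels (one XOR level, one AND-OR level). A bitwise sketch of the reduction:

```python
def carry_save_add(a, b, c):
    """Reduce three addends to (sum, carry) with no carry propagation:
    per bit, sum = a XOR b XOR c, carry = majority(a, b, c) shifted left."""
    s = a ^ b ^ c
    carry = ((a & b) | (b & c) | (a & c)) << 1
    return s, carry

s, c = carry_save_add(13, 7, 9)
assert s + c == 13 + 7 + 9   # one ordinary adder finishes the job
print(s, c)                  # 3 26
```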

  13. A Control-Theoretic Approach for the Combined Management of Quality-of-Service and Energy in Service Centers

    NASA Astrophysics Data System (ADS)

    Poussot-Vassal, Charles; Tanelli, Mara; Lovera, Marco

    The complexity of Information Technology (IT) systems is steadily increasing, and system complexity has been recognised as the main obstacle to further advancements of IT. This fact has recently raised energy management issues. Control techniques have been proposed and successfully applied to design Autonomic Computing systems, trading off system performance with energy saving goals. As user behaviour is highly time-varying and workload conditions can change substantially within the same business day, the Linear Parametrically Varying (LPV) framework is particularly promising for modeling such systems. In this chapter, a control-theoretic method to investigate the trade-off between Quality of Service (QoS) requirements and energy saving objectives in the case of admission control in Web service systems is proposed, considering as control variables the server CPU frequency and the admission probability. To quantitatively evaluate the trade-off, a dynamic model of the admission control dynamics is estimated via LPV identification techniques. Based on this model, an optimisation problem within the Model Predictive Control (MPC) framework is set up, by means of which it is possible to investigate the optimal trade-off policy to manage QoS and energy saving objectives at design time, taking into explicit account the system dynamics.

  14. Data Characterization Using Artificial-Star Tests: Performance Evaluation

    NASA Astrophysics Data System (ADS)

    Hu, Yi; Deng, Licai; de Grijs, Richard; Liu, Qiang

    2011-01-01

    Traditional artificial-star tests are widely applied to photometry in crowded stellar fields. However, to obtain reliable binary fractions (and their uncertainties) of remote, dense, and rich star clusters, one needs to recover huge numbers of artificial stars. Hence, this will consume much computation time for data reduction of the images to which the artificial stars must be added. In this article, we present a new method applicable to data sets characterized by stable, well-defined point-spread functions, in which we add artificial stars to the retrieved-data catalog instead of to the raw images. Taking the young Large Magellanic Cloud cluster NGC 1818 as an example, we compare results from both methods and show that they are equivalent, while our new method saves significant computational time.
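
    Injecting artificial stars at the catalog level rather than into the raw images amounts to drawing synthetic magnitudes and perturbing them with the measured error-magnitude relation; the error and completeness models below are assumed placeholder forms, not NGC 1818 calibrations.

```python
import numpy as np

rng = np.random.default_rng(1)

def mag_error(m, m0=24.0, floor=0.01):
    # Assumed photometric error model: noise floor plus growth at the faint end.
    return floor + 0.1 * np.exp(m - m0)

def inject_artificial_stars(n, m_range=(18.0, 25.0)):
    """Catalog-level artificial-star test: perturb true magnitudes with the
    error model and apply a completeness cut, with no image re-reduction."""
    m_true = rng.uniform(m_range[0], m_range[1], n)
    m_obs = m_true + rng.normal(0.0, mag_error(m_true))
    detected = rng.random(n) < 1.0 / (1.0 + np.exp(2.0 * (m_true - 24.5)))
    return m_true[detected], m_obs[detected]

m_true, m_obs = inject_artificial_stars(1_000_000)   # cheap: no images touched
print(m_true.size, np.std(m_obs - m_true))
```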

  15. A sub-space greedy search method for efficient Bayesian Network inference.

    PubMed

    Zhang, Qing; Cao, Yong; Li, Yong; Zhu, Yanming; Sun, Samuel S M; Guo, Dianjing

    2011-09-01

    Bayesian network (BN) has been successfully used to infer the regulatory relationships of genes from microarray dataset. However, one major limitation of BN approach is the computational cost because the calculation time grows more than exponentially with the dimension of the dataset. In this paper, we propose a sub-space greedy search method for efficient Bayesian Network inference. Particularly, this method limits the greedy search space by only selecting gene pairs with higher partial correlation coefficients. Using both synthetic and real data, we demonstrate that the proposed method achieved comparable results with standard greedy search method yet saved ∼50% of the computational time. We believe that sub-space search method can be widely used for efficient BN inference in systems biology. Copyright © 2011 Elsevier Ltd. All rights reserved.
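
    The pre-filtering step can be sketched with the standard precision-matrix identity for partial correlations, pcor_ij = -P_ij / sqrt(P_ii P_jj); pairs below a threshold are simply excluded from the greedy search space. The threshold and data below are illustrative.

```python
import numpy as np

def candidate_edges(data, threshold=0.2):
    """Keep only gene pairs whose partial correlation (controlling for all
    other genes) exceeds the threshold; the BN greedy search then explores
    edges among these pairs only."""
    P = np.linalg.inv(np.cov(data, rowvar=False))   # precision matrix
    d = np.sqrt(np.diag(P))
    pcor = -P / np.outer(d, d)
    n = len(d)
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if abs(pcor[i, j]) > threshold]

rng = np.random.default_rng(0)
x = rng.normal(size=(500, 10))          # 500 samples, 10 genes
x[:, 1] += 0.8 * x[:, 0]                # genes 0 and 1 are coupled
print(candidate_edges(x))               # (0, 1) should survive the filter
```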

  16. Productivity increase through implementation of CAD/CAE workstation

    NASA Technical Reports Server (NTRS)

    Bromley, L. K.

    1985-01-01

    The Tracking and Communication Division computer-aided design/computer-aided engineering (CAD/CAE) system is now operational. The system is utilized in an effort to automate certain tasks that were previously performed manually. These tasks include detailed test configuration diagrams of systems under certification test in the ESTL, floorplan layouts of future planned laboratory reconfigurations, and other graphical documentation of division activities. The significant time savings achieved with this CAD/CAE system are examined: (1) input of drawings and diagrams; (2) editing of initial drawings; (3) accessibility of the data; and (4) added versatility. It is shown that the Applicon CAD/CAE system, with its ease of input and editing, the accessibility of data, and its added versatility, has made more efficient many of the necessary but often time-consuming tasks associated with engineering design and testing.

  17. Neutron skyshine calculations for the PDX tokamak

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wheeler, F.J.; Nigg, D.W.

    1979-01-01

    The Poloidal Divertor Experiment (PDX) at Princeton will be the first operating tokamak to require a substantial radiation shield. The PDX shielding includes a water-filled roof shield over the machine to reduce air scattering skyshine dose in the PDX control room and at the site boundary. During the design of this roof shield a unique method was developed to compute the neutron source emerging from the top of the roof shield for use in Monte Carlo skyshine calculations. The method is based on simple, one-dimensional calculations rather than multidimensional calculations, resulting in considerable savings in computer time and input preparation effort. This method is described.

  18. Pulmonary Testing Laboratory Computer Application

    PubMed Central

    Johnson, Martin E.

    1980-01-01

    An interactive computer application reporting patient pulmonary function data has been developed by Washington, D.C. VA Medical Center staff. A permanent on-line data base of patient demographics, lung capacity, flows, diffusion, arterial blood gases, and physician interpretation is maintained by a minicomputer at the hospital. A user-oriented application program resulted from development in concert with the clinical users. Rapid program development resulted from employing a newly developed time-saving technique that has found wide application at other VA Medical Centers. Careful attention to user interaction has resulted in an application program that requires little training and has been used satisfactorily by a number of clinicians.

  19. An evaluation of computer assisted clinical classification algorithms.

    PubMed

    Chute, C G; Yang, Y; Buntrock, J

    1994-01-01

    The Mayo Clinic has a long tradition of indexing patient records in high resolution and volume. Several algorithms have been developed which promise to help human coders in the classification process. We evaluate variations on code browsers and free-text indexing systems with respect to their speed and error rates in our production environment. The more sophisticated indexing systems save measurable time in the coding process but suffer from incompleteness, which requires a back-up system or human verification. Expert Network does the best job of rank-ordering clinical text, potentially enabling the creation of thresholds for the pass-through of computer-coded data without human review.

  20. Combining Thermal And Structural Analyses

    NASA Technical Reports Server (NTRS)

    Winegar, Steven R.

    1990-01-01

    Computer code makes programs compatible so stresses and deformations can be calculated. Paper describes computer code combining thermal analysis with structural analysis. Called SNIP (for SINDA-NASTRAN Interfacing Program), code provides interface between finite-difference thermal model of system and finite-element structural model when there is no node-to-element correlation between models. Eliminates much manual work in converting temperature results of SINDA (Systems Improved Numerical Differencing Analyzer) program into thermal loads for NASTRAN (NASA Structural Analysis) program. Used to analyze concentrating reflectors for solar generation of electric power. Large thermal and structural models are needed to predict distortion of surface shapes, and SNIP saves considerable time and effort in combining models.

  1. Auto Design

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The 1987 Honda Acura Legend Coupe was designed with the aid of the NASA-developed NASTRAN computer program. NASTRAN takes an electronic look at a computerized design and predicts how the structure will react under a great many different conditions. Quick and inexpensive, it minimizes trial and error in the design process and makes possible better, lighter, safer structures while affording significant savings in development time. All Honda auto products designed in the 1980s have been analyzed with the NASTRAN program.

  2. Observations on SOFIA Observation Scheduling: Search and Inference in the Face of Discrete and Continuous Constraints

    NASA Technical Reports Server (NTRS)

    Frank, Jeremy; Gross, Michael; Kuerklu, Elif

    2003-01-01

    We reduced the number of initial-value problems (IVPs) and boundary-value problems (BVPs) needed to schedule SOFIA by restricting the problem. The restriction costs little in terms of the value of the flight plans that can be built, and it allowed us to reformulate part of the search problem as a zero-finding problem. The result is a simplified planning model and significant savings in computation time.

  3. DOD Weapon Systems Software Management Study, Appendix B. Shipborne Systems

    DTIC Science & Technology

    1975-06-01

    program management, from inception to development maintenance, 2. Detailed documentation requirements, 3. Standard high-level language development (CS-1...the Guided Missile School (GMS) at Dam Neck. The APL Land-Based Test Site (LETS) consisted of a Mk 152 digital fire control computer, SPG-55B radar...instruction and data segments are respectively placed in low and high core addresses to take advantage of UYK-7 memory accessing time savings. UYK-7

  4. The Environmental Qualification Specification as a Technical Management Tool,

    DTIC Science & Technology

    1981-11-01

    Comment: The dwell test for fatigue of the isolation system in a container is intended to be an accelerated test, in order to save test time and...diagnostic purposes * Response computation is not the only possible design approach. In the development of control system or servomechanism theory, emphasis...seldom aborts a mission in the same way as a complete failure of any function, and properly influences system effectiveness through a different type of

  5. Computer Based Education.

    ERIC Educational Resources Information Center

    Fauley, Franz E.

    1980-01-01

    A case study of what one company did to increase the productivity of its sales force and generate cost savings by using computer-assisted instruction to teach salespeople at regional offices. (Editor)

  6. Anticipation and Choice Heuristics in the Dynamic Consumption of Pain Relief

    PubMed Central

    Story, Giles W.; Vlaev, Ivo; Dayan, Peter; Seymour, Ben; Darzi, Ara; Dolan, Raymond J.

    2015-01-01

    Humans frequently need to allocate resources across multiple time-steps. Economic theory proposes that subjects do so according to a stable set of intertemporal preferences, but the computational demands of such decisions encourage the use of formally less competent heuristics. Few empirical studies have examined dynamic resource allocation decisions systematically. Here we conducted an experiment involving the dynamic consumption over approximately 15 minutes of a limited budget of relief from moderately painful stimuli. We had previously elicited the participants’ time preferences for the same painful stimuli in one-off choices, allowing us to assess self-consistency. Participants exhibited three characteristic behaviors: saving relief until the end, spreading relief across time, and early spending, of which the last was markedly less prominent. The likelihood that behavior was heuristic rather than normative is suggested by the weak correspondence between one-off and dynamic choices. We show that the consumption choices are consistent with a combination of simple heuristics involving early-spending, spreading or saving of relief until the end, with subjects predominantly exhibiting the last two. PMID:25793302

  7. Anticipation and choice heuristics in the dynamic consumption of pain relief.

    PubMed

    Story, Giles W; Vlaev, Ivo; Dayan, Peter; Seymour, Ben; Darzi, Ara; Dolan, Raymond J

    2015-03-01

    Humans frequently need to allocate resources across multiple time-steps. Economic theory proposes that subjects do so according to a stable set of intertemporal preferences, but the computational demands of such decisions encourage the use of formally less competent heuristics. Few empirical studies have examined dynamic resource allocation decisions systematically. Here we conducted an experiment involving the dynamic consumption over approximately 15 minutes of a limited budget of relief from moderately painful stimuli. We had previously elicited the participants' time preferences for the same painful stimuli in one-off choices, allowing us to assess self-consistency. Participants exhibited three characteristic behaviors: saving relief until the end, spreading relief across time, and early spending, of which the last was markedly less prominent. The likelihood that behavior was heuristic rather than normative is suggested by the weak correspondence between one-off and dynamic choices. We show that the consumption choices are consistent with a combination of simple heuristics involving early-spending, spreading or saving of relief until the end, with subjects predominantly exhibiting the last two.

  8. Handling Neighbor Discovery and Rendezvous Consistency with Weighted Quorum-Based Approach

    PubMed Central

    Own, Chung-Ming; Meng, Zhaopeng; Liu, Kehan

    2015-01-01

    Neighbor discovery and the power of sensors play an important role in the formation of Wireless Sensor Networks (WSNs) and mobile networks. Many asynchronous protocols based on wake-up time scheduling have been proposed to enable neighbor discovery among neighboring nodes while saving energy, especially given the difficulty of clock synchronization. Existing neighbor-discovery methods, however, fall into two classes: quorum-based protocols and co-primality-based protocols. They differ in how active time slots are arranged: the former selects quorums from a matrix, while the latter relies on numerical analysis. In our study, we propose the weighted heuristic quorum system (WQS), which builds on the quorum algorithm to eliminate redundant paths of active slots. We demonstrate the properties of our system: fewer active slots are required, the referring rate is balanced, and remaining power is taken into account, particularly when a device maintains rendezvous with discovered neighbors. The evaluation results show that the proposed method can effectively reschedule active slots and save computing time in the network system. PMID:26404297
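
    The guarantee that quorum-based protocols rely on is easy to see in a toy grid quorum. In the sketch below (illustrative, not the WQS construction itself), each node stays awake during one row and one column of an n x n slot matrix; any two such quorums intersect, so two nodes with slot-aligned cycles are guaranteed to be awake together at least twice per cycle.

    ```python
    def grid_quorum(n, row, col):
        """Active slots for one cycle of n*n slots: one full row plus one
        full column of the n x n slot matrix (classic grid quorum)."""
        return ({row * n + c for c in range(n)}
                | {r * n + col for r in range(n)})

    # Any row of A crosses any column of B (and vice versa), so the two
    # quorums always share at least two slots, guaranteeing rendezvous
    # within every cycle, assuming slot-aligned schedules.
    a = grid_quorum(7, row=2, col=5)
    b = grid_quorum(7, row=6, col=1)
    assert a & b
    print(sorted(a & b))   # the guaranteed common wake-up slots
    ```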

  9. Measuring, managing and maximizing performance of mineral processing plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bascur, O.A.; Kennedy, J.P.

    1995-12-31

    The implementation of continuous quality improvement is the confluence of Total Quality Management, People Empowerment, Performance Indicators and Information Engineering. The supporting information technologies allow a mineral processor to narrow the gap between management business objectives and the process control level. One of the most important contributors is the user-friendliness and flexibility of the personal computer in a client/server environment. This synergistic combination, when used for real-time performance monitoring, translates into production cost savings, improved communications, and enhanced decision support. Other savings come from reduced time to collect data and perform tedious calculations, the ability to act quickly on fresh data, and the generation and validation of data to be used by others. This paper presents an integrated view of plant management. The selection of the proper tools for continuous quality improvement is described, and the process of selecting critical performance-monitoring indices for improved plant performance is discussed. The importance of well-balanced technological improvement, personnel empowerment, total quality management, and organizational assets is stressed.

  10. Polimedication: applicability of a computer tool to reduce polypharmacy in nursing homes.

    PubMed

    García-Caballero, Tomás M; Lojo, Juan; Menéndez, Carlos; Fernández-Álvarez, Roberto; Mateos, Raimundo; Garcia-Caballero, Alejandro

    2018-05-11

    Background: The risks of polypharmacy can be far greater than the benefits, especially in the elderly. Comorbidity makes polypharmacy very prevalent in this population, thus increasing the occurrence of adverse effects. To address this problem, the most common strategy is to use lists of potentially inappropriate medications; however, this strategy is time-consuming. In order to minimize the expenditure of time, our group devised a pilot computer tool (Polimedication) that automatically processes lists of medication, providing the corresponding Screening Tool of Older Persons' potentially inappropriate Prescriptions (STOPP) alerts and facilitating standardized reports. The drug lists for 115 residents in Santa Marta Nursing Home (Fundación San Rosendo, Ourense, Spain) were processed. The program detected 10.04 alerts/patient, of which 74.29% were not repeated. After reviewing these alerts, 12.12% of the total (1.30 alerts/patient) were considered relevant. The largest number of alerts (41.48%) involved neuroleptic drugs. Finally, the patient's family physician or psychiatrist accepted the alert and made medication changes in 62.86% of the relevant alerts. The largest number of changes (38.64%) also involved neuroleptic drugs. The mean time spent in the generation and review of the warnings was 6.26 minutes/patient. Total changes represented a saving of 32.77 € per resident/year in medication. The Polimedication tool detected a high proportion of potentially inappropriate prescriptions in institutionalized elderly patients. The use of the computerized tool achieved significant savings in pharmaceutical expenditure, as well as a reduction in the time taken for medication review.

  11. Effect of the time window on the heat-conduction information filtering model

    NASA Astrophysics Data System (ADS)

    Guo, Qiang; Song, Wen-Jun; Hou, Lei; Zhang, Yi-Lu; Liu, Jian-Guo

    2014-05-01

    Recommendation systems have been proposed to filter out the potential tastes and preferences of users online; however, an account of the effect of the time window on their performance has been missing, even though such an account is critical for saving memory and decreasing computational complexity. In this paper, by gradually expanding the time window, we investigate the impact of the time window on the heat-conduction information filtering model with ten similarity measures. The experimental results on the benchmark dataset Netflix indicate that by using only approximately 11.11% of the most recent rating records, the accuracy could be improved by an average of 33.16% and the diversity by 30.62%. In addition, the recommendation performance on the MovieLens dataset could be preserved by considering only approximately 10.91% of the most recent records. Because they improve recommendation performance while largely reducing computational time and shortening data storage, these discoveries possess significant practical value.

  12. Valuation of travel time savings in viewpoint of WTA.

    PubMed

    Shao, Chang-Qiao; Liu, Yang; Liu, Xiao-Ming

    2014-01-01

    In order to investigate issues in the measurement of the value of travel time savings (VTTS), the willingness-to-accept (WTA) of private car owners is studied using survey data. The results show convincingly that trip purpose, trip length, time savings, cost savings, income, and employer allowances all affect WTA. Moreover, the influence of these variables differs across trip purposes: for commuting trips, the effects of income and employer allowances are significant, while time savings and cost savings dominate for leisure and shopping trips. It is also found that WTA is much higher than expected, which implies that there is a group of drivers who are reluctant to switch from the passenger car to other trip modes.

  13. Acceleration for 2D time-domain elastic full waveform inversion using a single GPU card

    NASA Astrophysics Data System (ADS)

    Jiang, Jinpeng; Zhu, Peimin

    2018-05-01

    Full waveform inversion (FWI) is a challenging procedure due to the high computational cost of the modeling, especially in the elastic case. The graphics processing unit (GPU) has become a popular device for high-performance computing (HPC). To reduce the long computation time, we design and implement a GPU-based 2D elastic FWI (EFWI) in the time domain using a single GPU card. We parallelize the forward modeling and gradient calculations using the CUDA programming language. To overcome the relatively small global memory on the GPU, a boundary-saving strategy is exploited to reconstruct the forward wavefield. Moreover, the L-BFGS optimization method used in the inversion accelerates the convergence of the misfit function. A multiscale inversion strategy is employed in the workflow to obtain accurate inversion results. In our tests, the GPU-based implementation using a single GPU device achieves a speedup of more than 15 times in forward modeling and about 12 times in gradient calculation, compared with eight-core CPU implementations optimized by OpenMP. The results from the GPU implementation are verified to have sufficient accuracy by comparison with results obtained from the CPU implementation.
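
    The boundary-saving strategy can be demonstrated with a 1D toy model: the forward stencil stores only the two edge samples per time step plus the final two snapshots, and the full wavefield is rebuilt by stepping the same stencil backwards. This is an illustrative Python sketch of the general idea, not the authors' CUDA implementation.

    ```python
    import numpy as np

    def toy_boundary_saving(nx=201, nt=400, r=0.5):
        """1D toy of the boundary-saving strategy: run the forward stencil,
        keep only the two edge samples per step (plus the last two
        snapshots), then rebuild earlier snapshots by stepping backwards."""
        r2 = r * r
        snaps = np.zeros((nt + 1, nx))
        snaps[1, nx // 2] = 1.0              # impulsive source at step 1
        edges = np.zeros((nt + 1, 2))        # the only per-step storage needed
        for k in range(1, nt):
            u, up, un = snaps[k], snaps[k - 1], snaps[k + 1]
            un[1:-1] = (2 * u[1:-1] - up[1:-1]
                        + r2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
            un[0], un[-1] = u[1], u[-2]      # crude absorbing edges
            edges[k + 1] = un[0], un[-1]
        # Reconstruction uses only snaps[nt], snaps[nt-1] and `edges`.
        rec = np.zeros_like(snaps)
        rec[nt], rec[nt - 1] = snaps[nt], snaps[nt - 1]
        for k in range(nt - 1, 1, -1):       # time-reversed stencil
            u, un = rec[k], rec[k + 1]
            rec[k - 1, 1:-1] = (2 * u[1:-1] - un[1:-1]
                                + r2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
            rec[k - 1, 0], rec[k - 1, -1] = edges[k - 1]
        assert np.allclose(rec[1:], snaps[1:])   # wavefield fully recovered
        return rec

    toy_boundary_saving()
    ```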

  14. Application configuration selection for energy-efficient execution on multicore systems

    DOE PAGES

    Wang, Shinan; Luo, Bing; Shi, Weisong; ...

    2015-09-21

    Balanced performance and energy consumption are incorporated in the design of modern computer systems. Several run-time factors, such as concurrency levels, thread mapping strategies, and dynamic voltage and frequency scaling (DVFS), should be considered in order to achieve optimal energy efficiency for a workload. Selecting appropriate run-time factors, however, is one of the most challenging tasks because they are architecture-specific and workload-specific. While most existing works concentrate on either static analysis of the workload or run-time prediction results, we present a hybrid two-step method that utilizes concurrency levels and DVFS settings to achieve the energy-efficient configuration for a workload. The experimental results based on a Xeon E5620 server with the NPB and PARSEC benchmark suites show that the model is able to predict the energy-efficient configuration accurately. On average, an additional 10% EDP (Energy Delay Product) saving is obtained by using run-time DVFS for the entire system. An off-line optimal solution is used for comparison with the proposed scheme. The experimental results show that the average extra EDP saved by the optimal solution is within 5% on selected parallel benchmarks.

  15. Coordinated platooning with multiple speeds

    DOE PAGES

    Luo, Fengqiao; Larson, Jeffrey; Munson, Todd

    2018-03-22

    In a platoon, vehicles travel one after another with small intervehicle distances; trailing vehicles in a platoon save fuel because they experience less aerodynamic drag. This work presents a coordinated platooning model with multiple speed options that integrates scheduling, routing, speed selection, and platoon formation/dissolution in a mixed-integer linear program that minimizes the total fuel consumed by a set of vehicles while traveling between their respective origins and destinations. The performance of this model is numerically tested on a grid network and the Chicago-area highway network. We find that the fuel-savings factor of a multivehicle system significantly depends on the time each vehicle is allowed to stay in the network; this time affects vehicles' available speed choices, possible routes, and the amount of time for coordinating platoon formation. For problem instances with a large number of vehicles, we propose and test a heuristic decomposed approach that applies a clustering algorithm to partition the set of vehicles and then routes each group separately. When the set of vehicles is large and the available computational time is small, the decomposed approach finds significantly better solutions than does the full model.

  16. Coordinated platooning with multiple speeds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Fengqiao; Larson, Jeffrey; Munson, Todd

    In a platoon, vehicles travel one after another with small intervehicle distances; trailing vehicles in a platoon save fuel because they experience less aerodynamic drag. This work presents a coordinated platooning model with multiple speed options that integrates scheduling, routing, speed selection, and platoon formation/dissolution in a mixed-integer linear program that minimizes the total fuel consumed by a set of vehicles while traveling between their respective origins and destinations. The performance of this model is numerically tested on a grid network and the Chicago-area highway network. We find that the fuel-savings factor of a multivehicle system significantly depends on the time each vehicle is allowed to stay in the network; this time affects vehicles' available speed choices, possible routes, and the amount of time for coordinating platoon formation. For problem instances with a large number of vehicles, we propose and test a heuristic decomposed approach that applies a clustering algorithm to partition the set of vehicles and then routes each group separately. When the set of vehicles is large and the available computational time is small, the decomposed approach finds significantly better solutions than does the full model.

  17. 17 CFR 232.13 - Date of filing; adjustment of filing date.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Standard Time or Eastern Daylight Saving Time, whichever is currently in effect, shall be deemed filed on.... Eastern Standard Time or Eastern Daylight Saving Time, whichever is currently in effect, shall be deemed... Daylight Savings Time, whichever is currently in effect, shall be deemed filed on the same business day. (4...

  18. 17 CFR 232.13 - Date of filing; adjustment of filing date.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Standard Time or Eastern Daylight Saving Time, whichever is currently in effect, shall be deemed filed on.... Eastern Standard Time or Eastern Daylight Saving Time, whichever is currently in effect, shall be deemed... Daylight Savings Time, whichever is currently in effect, shall be deemed filed on the same business day. (4...

  19. 17 CFR 232.13 - Date of filing; adjustment of filing date.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Standard Time or Eastern Daylight Saving Time, whichever is currently in effect, shall be deemed filed on.... Eastern Standard Time or Eastern Daylight Saving Time, whichever is currently in effect, shall be deemed... Daylight Savings Time, whichever is currently in effect, shall be deemed filed on the same business day. (4...

  20. 17 CFR 232.13 - Date of filing; adjustment of filing date.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Standard Time or Eastern Daylight Saving Time, whichever is currently in effect, shall be deemed filed on.... Eastern Standard Time or Eastern Daylight Saving Time, whichever is currently in effect, shall be deemed... Daylight Savings Time, whichever is currently in effect, shall be deemed filed on the same business day. (4...

  1. Fishing Forecasts

    NASA Technical Reports Server (NTRS)

    1988-01-01

    ROFFS stands for Roffer's Ocean Fishing Forecasting Service, Inc. Roffer combines satellite and computer technology with oceanographic information from several sources to produce frequently updated charts (sometimes as often as 30 times a day) showing clues to the location of marlin, sailfish, tuna, swordfish, and a variety of other species. The company also provides customized forecasts for racing boats and the shipping industry, along with seasonal forecasts that allow the marine industry to formulate fishing strategies based on foreknowledge of the arrival and departure times of different fish. The ROFFS service exemplifies the potential benefits to marine industries from satellite observations. The most notable results are reduced search time and substantial fuel savings.

  2. An energy efficient and high speed architecture for convolution computing based on binary resistive random access memory

    NASA Astrophysics Data System (ADS)

    Liu, Chen; Han, Runze; Zhou, Zheng; Huang, Peng; Liu, Lifeng; Liu, Xiaoyan; Kang, Jinfeng

    2018-04-01

    In this work we present a novel convolution computing architecture based on metal oxide resistive random access memory (RRAM) to process image data stored in RRAM arrays. The proposed image storage architecture achieves better speed and device-consumption efficiency than the previous kernel storage architecture. We further improve the architecture for high-accuracy and low-power computing by utilizing binary storage and a series resistor. For a 28 × 28 image and 10 kernels with a size of 3 × 3, compared with the previous kernel storage approach, the newly proposed architecture shows excellent performance, including: 1) almost 100% accuracy within 20% LRS variation and 90% HRS variation; 2) a speed boost of more than 67 times; and 3) 71.4% energy saving.

  3. CARMA: Software for continuous affect rating and media annotation

    PubMed Central

    Girard, Jeffrey M

    2017-01-01

    CARMA is a media annotation program that collects continuous ratings while displaying audio and video files. It is designed to be highly user-friendly and easily customizable. Based on Gottman and Levenson's affect rating dial, CARMA enables researchers and study participants to provide moment-by-moment ratings of multimedia files using a computer mouse or keyboard. The rating scale can be configured on a number of parameters including the labels for its upper and lower bounds, its numerical range, and its visual representation. Annotations can be displayed alongside the multimedia file and saved for easy import into statistical analysis software. CARMA provides a tool for researchers in affective computing, human-computer interaction, and the social sciences who need to capture the unfolding of subjective experience and observable behavior over time. PMID:29308198

  4. Improving a data-acquisition software system with abstract data type components

    NASA Technical Reports Server (NTRS)

    Howard, S. D.

    1990-01-01

    Abstract data types and object-oriented design are active research areas in computer science and software engineering. Much of the interest is aimed at new software development. Abstract data type packages developed for a discontinued software project were used to improve a real-time data-acquisition system under maintenance. The result saved effort and contributed to a significant improvement in the performance, maintainability, and reliability of the Goldstone Solar System Radar Data Acquisition System.

  5. Computer-Assisted Instruction and Its Application to Air Force Civil Engineering.

    DTIC Science & Technology

    1987-09-01

    train the student. The student's responses may cause the computer to present the previous material in a different manner if these responses indicated that the...presented by Schlechter. He reports study results that indicate that CAI time savings may be due to self-pacing, a characteristic of other less-expensive...lesson developer must ask: Is this instructional requirement suited for individual self-paced interactive instruction? If the answer is no, then

  6. YAMM - Yet Another Menu Manager

    NASA Technical Reports Server (NTRS)

    Mazer, Alan S.; Weidner, Richard J.

    1991-01-01

    Yet Another Menu Manager (YAMM) computer program is an application-independent menuing software package designed to remove much of the difficulty and save much of the time inherent in implementing the front ends of large software packages. Provides complete menuing front end for wide variety of applications, with provisions for independence from specific types of terminals, configurations that meet specific needs of users, and dynamic creation of menu trees. Consists of two parts: description of menu configuration and body of application code. Written in C.

  7. Finite-Size Scaling for the Baxter-Wu Model Using Block Distribution Functions

    NASA Astrophysics Data System (ADS)

    Velonakis, Ioannis N.; Hadjiagapiou, Ioannis A.

    2018-05-01

    In the present work, we describe an alternative way of applying the well-known finite-size scaling (FSS) theory to the Baxter-Wu model using Binder-like blocks. Binder's ideas are extended to estimate phase transition points and the corresponding scaling exponents not only for magnetic but also for energy properties, saving computational time and effort. The vast majority of our conclusions can be easily generalized to other models.

  8. BigDebug: Debugging Primitives for Interactive Big Data Processing in Spark.

    PubMed

    Gulzar, Muhammad Ali; Interlandi, Matteo; Yoo, Seunghyun; Tetali, Sai Deep; Condie, Tyson; Millstein, Todd; Kim, Miryung

    2016-05-01

    Developers use cloud computing platforms to process a large quantity of data in parallel when developing big data analytics. Debugging the massive parallel computations that run in today's data-centers is time consuming and error-prone. To address this challenge, we design a set of interactive, real-time debugging primitives for big data processing in Apache Spark, the next generation data-intensive scalable cloud computing platform. This requires re-thinking the notion of step-through debugging in a traditional debugger such as gdb, because pausing the entire computation across distributed worker nodes causes significant delay and naively inspecting millions of records using a watchpoint is too time consuming for an end user. First, BIGDEBUG's simulated breakpoints and on-demand watchpoints allow users to selectively examine distributed, intermediate data on the cloud with little overhead. Second, a user can also pinpoint a crash-inducing record and selectively resume relevant sub-computations after a quick fix. Third, a user can determine the root causes of errors (or delays) at the level of individual records through a fine-grained data provenance capability. Our evaluation shows that BIGDEBUG scales to terabytes and its record-level tracing incurs less than 25% overhead on average. It determines crash culprits orders of magnitude more accurately and provides up to 100% time saving compared to the baseline replay debugger. The results show that BIGDEBUG supports debugging at interactive speeds with minimal performance impact.

  9. Remembrance of phases past: An autoregressive method for generating realistic atmospheres in simulations

    NASA Astrophysics Data System (ADS)

    Srinath, Srikar; Poyneer, Lisa A.; Rudy, Alexander R.; Ammons, S. M.

    2014-08-01

    The advent of expensive, large-aperture telescopes and complex adaptive optics (AO) systems has strengthened the need for detailed simulation of such systems from the top of the atmosphere to control algorithms. The credibility of any simulation is underpinned by the quality of the atmosphere model used for introducing phase variations into the incident photons. Hitherto, simulations which incorporate wind layers have relied upon phase screen generation methods that tax the computation and memory capacities of the platforms on which they run. This places limits on parameters of a simulation, such as exposure time or resolution, thus compromising its utility. As aperture sizes and fields of view increase, the problem will only get worse. We present an autoregressive method for evolving atmospheric phase that is efficient in its use of computation resources and allows for variability in the power contained in frozen flow or stochastic components of the atmosphere. Users have the flexibility of generating atmosphere datacubes in advance of runs, where memory constraints allow, to save on computation time, or of computing the phase at each time step for long exposure times. Preliminary tests of model atmospheres generated using this method show power spectral density and rms phase in accordance with established metrics for Kolmogorov models.
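
    The autoregressive update itself is compact. The sketch below shows a plain AR(1) evolution of a phase screen, which keeps the variance stationary by construction; alpha near 1 behaves like frozen flow with a small boiling component. Illustrative only: the paper applies such updates with Kolmogorov-weighted power, typically per spatial-frequency mode.

    ```python
    import numpy as np

    def evolve_phase(phase, alpha, rng):
        """One AR(1) update of an atmospheric phase screen. The innovation
        term is scaled so the phase variance is stationary: if var(phase)
        is 1, it stays 1 for any 0 <= alpha <= 1."""
        innovation = rng.standard_normal(phase.shape)
        return alpha * phase + np.sqrt(1.0 - alpha ** 2) * innovation

    rng = np.random.default_rng(0)
    screen = rng.standard_normal((64, 64))        # unit-variance phase (rad)
    for _ in range(100):
        screen = evolve_phase(screen, 0.995, rng)  # mostly "frozen" memory
    print(screen.var())                            # stays close to 1
    ```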

  10. Energy Conservation Using Dynamic Voltage Frequency Scaling for Computational Cloud

    PubMed Central

    Florence, A. Paulin; Shanthi, V.; Simon, C. B. Sunil

    2016-01-01

    Cloud computing is a new technology which supports resource sharing on a "Pay as you go" basis around the world. It provides various services such as SaaS, IaaS, and PaaS. Computation is a part of IaaS, and all computational requests are to be served efficiently with optimal power utilization in the cloud. Recently, various algorithms have been developed to reduce power consumption, and the Dynamic Voltage and Frequency Scaling (DVFS) scheme has also been used in this perspective. In this paper we have devised a methodology which analyzes the behavior of a given cloud request and identifies the type of algorithm associated with it. Once the type of algorithm is identified, its time complexity is calculated using asymptotic notation. Using a best-fit strategy, the appropriate host is identified and the incoming job is allocated to it. From the measured time complexity, the required clock frequency of the host is computed, and the CPU frequency is scaled up or down accordingly using the DVFS scheme, enabling energy savings of up to 55% of total power consumption. PMID:27239551
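
    A hedged sketch of the scheme's core step: map the identified complexity class to an operation count, convert that to a required clock rate for the deadline, and pick the lowest available DVFS frequency that suffices. The frequency table, cycles-per-operation constant, and complexity classes are illustrative assumptions, not values from the paper.

    ```python
    import math

    AVAILABLE_FREQS_GHZ = [0.8, 1.2, 1.6, 2.0, 2.4]   # assumed P-state table
    OPS = {"O(n)": lambda n: n,
           "O(n log n)": lambda n: n * math.log2(n),
           "O(n^2)": lambda n: n * n}

    def pick_frequency(n, complexity, cycles_per_op, deadline_s):
        """Lowest DVFS frequency that still meets the deadline for a job
        whose cost is estimated from its asymptotic complexity class."""
        needed_hz = OPS[complexity](n) * cycles_per_op / deadline_s
        for f in AVAILABLE_FREQS_GHZ:
            if f * 1e9 >= needed_hz:
                return f
        return AVAILABLE_FREQS_GHZ[-1]   # saturate at the maximum frequency

    print(pick_frequency(10 ** 6, "O(n log n)",
                         cycles_per_op=8, deadline_s=0.5))   # -> 0.8
    ```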

  11. An FPGA computing demo core for space charge simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Jinyuan; Huang, Yifei; /Fermilab

    2009-01-01

    In accelerator physics, space charge simulation requires a large amount of computing power. In a particle system, each calculation requires time- and resource-consuming operations such as multiplications, divisions, and square roots. Because of the flexibility of field programmable gate arrays (FPGAs), we implemented this task with efficient use of the available computing resources and completely eliminated the non-calculating operations that are indispensable in regular micro-processors (e.g., instruction fetch, instruction decoding, etc.). We designed and tested a 16-bit demo core for computing Coulomb's force in an Altera Cyclone II FPGA device. To save resources, the inverse square-root cube operation in our design is computed using a memory look-up table addressed with the nine to ten most significant non-zero bits. At a 200 MHz internal clock, our demo core reaches a throughput of 200 M pairs/s/core, faster than a typical 2 GHz micro-processor by about a factor of 10. The temperature and power consumption of FPGAs were also lower than those of micro-processors. Fast and convenient, FPGAs can serve as alternatives to time-consuming micro-processors for space charge simulation.
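
    The look-up-table trick is easy to emulate in software. The sketch below tabulates f(m) = m^(-3/2) over the mantissa range [1, 2), addressed by 10 bits, then combines the table value with an exponent correction so that (r^2)^(-3/2) = 1/r^3 needs no divide or square root. This is a floating-point emulation for illustration; the FPGA core works in fixed point.

    ```python
    import numpy as np

    BITS = 10
    lut_x = np.linspace(1.0, 2.0, 2 ** BITS, endpoint=False)  # mantissas
    LUT = lut_x ** -1.5

    def inv_r3(r2):
        """Approximate (r^2)**(-3/2) = 1/r**3 by exponent/mantissa split
        plus table lookup, mimicking the MSB-addressed memory LUT."""
        m, e = np.frexp(r2)              # r2 = m * 2**e, m in [0.5, 1)
        m, e = m * 2.0, e - 1            # renormalise mantissa into [1, 2)
        idx = ((m - 1.0) * 2 ** BITS).astype(int)
        return LUT[idx] * 2.0 ** (-1.5 * e)

    r2 = np.array([1.0, 2.0, 10.0])
    print(inv_r3(r2))                    # approx [1.0, 0.3536, 0.0316]
    print(r2 ** -1.5)                    # exact values for comparison
    ```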

  12. Energy Conservation Using Dynamic Voltage Frequency Scaling for Computational Cloud.

    PubMed

    Florence, A Paulin; Shanthi, V; Simon, C B Sunil

    2016-01-01

    Cloud computing is a new technology which supports resource sharing on a "Pay as you go" basis around the world. It provides various services such as SaaS, IaaS, and PaaS. Computation is a part of IaaS, and all computational requests are to be served efficiently with optimal power utilization in the cloud. Recently, various algorithms have been developed to reduce power consumption, and the Dynamic Voltage and Frequency Scaling (DVFS) scheme has also been used in this perspective. In this paper we have devised a methodology which analyzes the behavior of a given cloud request and identifies the type of algorithm associated with it. Once the type of algorithm is identified, its time complexity is calculated using asymptotic notation. Using a best-fit strategy, the appropriate host is identified and the incoming job is allocated to it. From the measured time complexity, the required clock frequency of the host is computed, and the CPU frequency is scaled up or down accordingly using the DVFS scheme, enabling energy savings of up to 55% of total power consumption.

  13. A transient response analysis of the space shuttle vehicle during liftoff

    NASA Technical Reports Server (NTRS)

    Brunty, J. A.

    1990-01-01

    A proposed transient response method is formulated for the liftoff analysis of the space shuttle vehicles. It uses a power series approximation with unknown coefficients for the interface forces between the space shuttle and the mobile launch platform. This allows the equations of motion of the two structures to be solved separately, with the unknown coefficients determined at the end of each step by enforcing the interface compatibility conditions between the two structures. Once the unknown coefficients are determined, the total response is computed for that time step. The method is validated by a numerical example of a cantilevered beam and by the liftoff analysis of the space shuttle vehicles. The proposed method is compared to an iterative transient response analysis method used by Martin Marietta for their space shuttle liftoff analysis. It is shown that the proposed method uses less computer time than the iterative method and does not require as small a time step for integration. The space shuttle vehicle model is reduced using two different types of component mode synthesis (CMS) methods, the Lanczos method and the Craig and Bampton CMS method. By varying the cutoff frequency in the Craig and Bampton method, it was shown that the space shuttle interface loads can be computed with reasonable accuracy. Both the Lanczos CMS method and the Craig and Bampton CMS method give similar results. A substantial amount of computer time is saved using the Lanczos CMS method over that of the Craig and Bampton method. However, when a large number of Lanczos vectors was computed, input/output time increased, raising the overall computer time. The application of several liftoff release mechanisms that can be adapted to the proposed method is discussed.

  14. Temporal acceleration of spatially distributed kinetic Monte Carlo simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chatterjee, Abhijit; Vlachos, Dionisios G.

    The computational intensity of kinetic Monte Carlo (KMC) simulation is a major impediment in simulating large length and time scales. In recent work, an approximate method for KMC simulation of spatially uniform systems, termed the binomial τ-leap method, was introduced [A. Chatterjee, D.G. Vlachos, M.A. Katsoulakis, Binomial distribution based τ-leap accelerated stochastic simulation, J. Chem. Phys. 122 (2005) 024112], where molecular bundles instead of individual processes are executed over coarse-grained time increments. This temporal coarse-graining can lead to significant computational savings, but its generalization to spatial lattice KMC simulation has not been realized yet. Here we extend the binomial τ-leap method to lattice KMC simulations by combining it with spatially adaptive coarse-graining. Absolute stability and computational speed-up analyses for spatial systems, along with simulations, provide insights into the conditions where accuracy and substantial acceleration of the new spatio-temporal coarse-graining method are ensured. Model systems demonstrate that the r-time increment criterion of Chatterjee et al. obeys the absolute stability limit for values of r up to near 1.
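
    For a first-order decay channel, the binomial τ-leap update is one line: the number of firings in a leap is drawn from a binomial over the current population, so it can never exceed the molecules available (the method's key property). A toy sketch under that assumption follows; the paper's lattice version adds spatially adaptive coarse-graining on top.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def binomial_tau_leap(n, rate, tau):
        """One coarse-grained step of A -> 0 decay: fire a *bundle* of
        events drawn from Binomial(n, p) instead of simulating each event
        individually, so the count is always bounded by n."""
        p = min(1.0, rate * tau)      # per-molecule firing probability
        return n - rng.binomial(n, p)

    n, t, tau = 10_000, 0.0, 0.05
    while n > 0 and t < 5.0:          # decay with unit rate, leap tau = 0.05
        n = binomial_tau_leap(n, 1.0, tau)
        t += tau
    print(t, n)                       # population after the leaped run
    ```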

  15. Renewable energy in electric utility capacity planning: a decomposition approach with application to a Mexican utility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Staschus, K.

    1985-01-01

    In this dissertation, efficient algorithms for electric-utility capacity expansion planning with renewable energy are developed. The algorithms include a deterministic phase that quickly finds a near-optimal expansion plan using derating and a linearized approximation to the time-dependent availability of nondispatchable energy sources. A probabilistic second phase needs comparatively few computer-time-consuming probabilistic simulation iterations to modify this solution towards the optimal expansion plan. For the deterministic first phase, two algorithms, based on a Lagrangian dual decomposition and a Generalized Benders Decomposition, are developed. The probabilistic second phase uses a Generalized Benders Decomposition approach. Extensive computational tests of the algorithms are reported. Among the deterministic algorithms, the one based on Lagrangian duality proves fastest. The two-phase approach is shown to save up to 80% in computing time as compared to a purely probabilistic algorithm. The algorithms are applied to determine the optimal expansion plan for the Tijuana-Mexicali subsystem of the Mexican electric utility system. A strong recommendation to push conservation programs in the desert city of Mexicali results from this implementation.

  16. An energy-efficient failure detector for vehicular cloud computing.

    PubMed

    Liu, Jiaxi; Wu, Zhibo; Dong, Jian; Wu, Jin; Wen, Dongxin

    2018-01-01

    Failure detectors are one of the fundamental components for maintaining the high availability of vehicular cloud computing. In vehicular cloud computing, many RSUs are deployed along the road to improve connectivity. Many of them are equipped with solar batteries due to the unavailability or excess expense of wired electrical power, so it is important to reduce the battery consumption of RSUs. However, existing failure detection algorithms are not designed to save the battery consumption of RSUs. To solve this problem, a new energy-efficient failure detector, 2E-FD, is proposed specifically for vehicular cloud computing. 2E-FD not only provides acceptable failure detection service but also saves the battery consumption of RSUs. Comparative experiments show that our failure detector has better performance in terms of speed, accuracy, and battery consumption.

  17. An energy-efficient failure detector for vehicular cloud computing

    PubMed Central

    Liu, Jiaxi; Wu, Zhibo; Wu, Jin; Wen, Dongxin

    2018-01-01

    Failure detectors are one of the fundamental components for maintaining the high availability of vehicular cloud computing. In vehicular cloud computing, many RSUs are deployed along the road to improve connectivity. Many of them are equipped with solar batteries due to the unavailability or excess expense of wired electrical power, so it is important to reduce the battery consumption of RSUs. However, existing failure detection algorithms are not designed to save the battery consumption of RSUs. To solve this problem, a new energy-efficient failure detector, 2E-FD, is proposed specifically for vehicular cloud computing. 2E-FD not only provides acceptable failure detection service but also saves the battery consumption of RSUs. Comparative experiments show that our failure detector has better performance in terms of speed, accuracy, and battery consumption. PMID:29352282

  18. Faster simulation plots

    NASA Technical Reports Server (NTRS)

    Fowell, Richard A.

    1989-01-01

    Most simulation plots are heavily oversampled. Ignoring unnecessary data points dramatically reduces plot time with imperceptible effect on quality, and the technique is suited to most plot devices. The department's laser printer was tripled in speed for large simulation plots by data thinning. This reduced printer delays without the expense of a faster laser printer and, surprisingly, saved computer time as well. All plot data are now thinned, including PostScript and terminal plots. The problem, solution, and conclusions are described; the thinning algorithm is presented along with performance studies. To obtain FORTRAN 77 or C source listings, mail a SASE to the author.
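
    The abstract does not give the exact thinning rule, so the sketch below uses a simple stand-in with the same flavor: drop a point whenever it lies within a tolerance of the chord between the last kept point and the next point.

    ```python
    def thin(points, tol):
        """Drop oversampled plot points (a simple stand-in for the 1989
        algorithm): keep a point only when it deviates from the straight
        line between the last kept point and the next point by more than
        `tol`. Assumes points sorted by x."""
        if len(points) <= 2:
            return list(points)
        kept = [points[0]]
        for cur, nxt in zip(points[1:-1], points[2:]):
            (x0, y0), (x1, y1), (x2, y2) = kept[-1], cur, nxt
            if x2 != x0:
                y_on_chord = y0 + (y2 - y0) * (x1 - x0) / (x2 - x0)
                if abs(y1 - y_on_chord) <= tol:
                    continue          # cur is visually redundant; skip it
            kept.append(cur)
        kept.append(points[-1])
        return kept

    import math
    pts = [(x / 100, math.sin(x / 100)) for x in range(628)]
    print(len(pts), "->", len(thin(pts, 1e-3)))   # far fewer points plotted
    ```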

  19. Transforming parts of a differential equations system to difference equations as a method for run-time savings in NONMEM.

    PubMed

    Petersson, K J F; Friberg, L E; Karlsson, M O

    2010-10-01

    Computer models of biological systems grow more complex as computing power increases. Often these models are defined as differential equations for which no analytical solutions exist. Numerical integration is used to approximate the solution; this can be computationally intensive and time-consuming, and it can account for a large proportion of the total computer runtime. The performance of different integration methods depends on the mathematical properties of the differential equation system at hand. In this paper we investigate the possibility of runtime gains by calculating parts of, or the whole of, the differential equation system at given time intervals, outside of the differential equation solver. This approach was tested on nine models defined as differential equations, with the goal of reducing runtime while maintaining model fit, based on the objective function value (OFV). The software used was NONMEM. In four models the computational runtime was successfully reduced (by 59-96%). The differences in parameter estimates, compared to using only the differential equation solver, were less than 12% for all fixed-effects parameters. For the variance parameters, estimates were within 10% for the majority of the parameters. Population and individual predictions were similar, and the differences in OFV were between 1 and -14 units. When computational runtime seriously affects the usefulness of a model, we suggest evaluating this approach for repetitive elements of model building and evaluation, such as covariate inclusions or bootstraps.
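
    The idea translates outside NONMEM as well: pre-tabulate a slowly varying component of the system on a coarse grid as a difference equation, then interpolate it inside the right-hand side so the solver integrates only the fast states. A hypothetical Python/SciPy illustration (an induced enzyme pool scaling a drug's clearance; all names and constants are invented):

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.interpolate import interp1d

    # Slow component: enzyme induction advanced as a difference equation
    # on a coarse 1-hour grid, instead of inside the ODE solver.
    coarse_t = np.linspace(0.0, 24.0, 25)
    enz = np.empty_like(coarse_t)
    enz[0] = 1.0
    for k in range(len(coarse_t) - 1):
        enz[k + 1] = enz[k] + 1.0 * 0.05 * (2.0 - enz[k])  # dt = 1 h
    enz_of_t = interp1d(coarse_t, enz, kind="linear")

    def rhs(t, y):
        # Fast state only: drug amount, clearance scaled by the
        # pre-tabulated (interpolated) enzyme pool.
        return -0.7 * float(enz_of_t(t)) * y[0]

    sol = solve_ivp(rhs, (0.0, 24.0), [10.0], rtol=1e-8)
    print(sol.y[0, -1])   # drug remaining at 24 h under the hybrid scheme
    ```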

  20. Selective cultivation and rapid detection of Staphylococcus aureus by computer vision.

    PubMed

    Wang, Yong; Yin, Yongguang; Zhang, Chaonan

    2014-03-01

    In this paper, we developed a selective growth medium and a more rapid detection method based on computer vision for the selective isolation and identification of Staphylococcus aureus from foods. The selective medium consisted of tryptic soy broth basal medium, 3 inhibitors (NaCl, K2TeO3, and phenethyl alcohol), and 2 accelerators (sodium pyruvate and glycine). After 4 h of selective cultivation, bacterial detection was accomplished using computer vision. The total analysis time was 5 h. Compared to the Baird-Parker plate count method, which requires 4 to 5 days, this new detection method offers great time savings. Moreover, our novel method had a correlation coefficient of greater than 0.998 when compared with the Baird-Parker plate count method. The detection range for S. aureus was 10 to 10^7 CFU/mL. Our new, rapid detection method for microorganisms in foods has great potential for routine food safety control and microbiological detection applications. © 2014 Institute of Food Technologists®

  1. Assessment of time-dependent density functional theory with the restricted excitation space approximation for excited state calculations of large systems

    NASA Astrophysics Data System (ADS)

    Hanson-Heine, Magnus W. D.; George, Michael W.; Besley, Nicholas A.

    2018-06-01

    The restricted excitation subspace approximation is explored as a basis to reduce the memory storage required in linear response time-dependent density functional theory (TDDFT) calculations within the Tamm-Dancoff approximation. It is shown that excluding the core orbitals and up to 70% of the virtual orbitals in the construction of the excitation subspace does not result in significant changes in computed UV/vis spectra for large molecules. The reduced size of the excitation subspace greatly reduces the size of the subspace vectors that need to be stored when using the Davidson procedure to determine the eigenvalues of the TDDFT equations. Furthermore, additional screening of the two-electron integrals in combination with a reduction in the size of the numerical integration grid used in the TDDFT calculation leads to significant computational savings. The use of these approximations represents a simple approach to extend TDDFT to the study of large systems and make the calculations increasingly tractable using modest computing resources.

  2. Zonal and tesseral harmonic coefficients for the geopotential function, from zero to 18th order

    NASA Technical Reports Server (NTRS)

    Kirkpatrick, J. C.

    1976-01-01

    Zonal and tesseral harmonic coefficients for the geopotential function are usually tabulated in normalized form to provide immediate information as to the relative significance of the coefficients in the gravity model. The normalized form of the geopotential coefficients cannot be used for computational purposes unless the gravity model has been modified to receive them. This modification is usually not done because the absolute or unnormalized form of the coefficients can be obtained from the simple mathematical relationship that relates the two forms. This computation can be quite tedious for hand calculation, especially for the higher order terms, and can be costly in terms of storage and execution time for machine computation. In this report, zonal and tesseral harmonic coefficients for the geopotential function are tabulated in absolute or unnormalized form. The report is designed to be used as a ready reference for both hand and machine calculation to save the user time and effort.
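
    For reference, the relationship mentioned above between a fully normalized coefficient Cbar(n,m) and its unnormalized counterpart is C(n,m) = Cbar(n,m) * sqrt[(2 - delta_m0)(2n + 1)(n - m)!/(n + m)!], and similarly for S(n,m). A small Python helper following this standard convention (illustrative; check the sign and normalization conventions of your gravity model):

    ```python
    from math import factorial, sqrt

    def unnormalize(c_bar, n, m):
        """Convert a fully normalized geopotential coefficient Cbar(n, m)
        to its unnormalized value:
            C(n,m) = Cbar(n,m) * sqrt(k (2n+1) (n-m)! / (n+m)!),
        with k = 1 for m = 0 and k = 2 otherwise."""
        k = 1 if m == 0 else 2
        return c_bar * sqrt(k * (2 * n + 1)
                            * factorial(n - m) / factorial(n + m))

    # Example: the normalized Cbar(2,0) = -4.84165e-4 unnormalizes to
    # about -1.0826e-3, the familiar magnitude of the J2 term.
    print(unnormalize(-4.84165e-4, 2, 0))
    ```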

  3. Closing University Departments: The Perception of Tax Payers.

    ERIC Educational Resources Information Center

    Furnham, Adrian; Sisterson, Grant

    2000-01-01

    This pilot study looked at the British lay public's evaluation of 20 different disciplines by asking them to rank-order them. In a cost-cutting exercise, those departments (disciplines) thought most worthy of saving were English, mathematics, and computer science; those rated as least important were anthropology, film and media studies, and…

  4. G2LC: Resources Autoscaling for Real Time Bioinformatics Applications in IaaS.

    PubMed

    Hu, Rongdong; Liu, Guangming; Jiang, Jingfei; Wang, Lixin

    2015-01-01

    Cloud computing has started to change the way bioinformatics research is carried out. Researchers who have taken advantage of this technology can process larger amounts of data and speed up scientific discovery. The variability in data volume results in variable computing requirements; therefore, bioinformatics researchers are pursuing more reliable and efficient methods for conducting sequencing analyses. This paper proposes an automated resource provisioning method, G2LC, for bioinformatics applications in IaaS. It enables applications to output results in real time. Its main purpose is to guarantee application performance while improving resource utilization. Real sequence searching data from BLAST is used to evaluate the effectiveness of G2LC. Experimental results show that G2LC guarantees the application performance while saving up to 20.14% of resources.

  5. G2LC: Resources Autoscaling for Real Time Bioinformatics Applications in IaaS

    PubMed Central

    Hu, Rongdong; Liu, Guangming; Jiang, Jingfei; Wang, Lixin

    2015-01-01

    Cloud computing has started to change the way bioinformatics research is carried out. Researchers who have taken advantage of this technology can process larger amounts of data and speed up scientific discovery. The variability in data volume results in variable computing requirements; therefore, bioinformatics researchers are pursuing more reliable and efficient methods for conducting sequencing analyses. This paper proposes an automated resource provisioning method, G2LC, for bioinformatics applications in IaaS. It enables applications to output results in real time. Its main purpose is to guarantee application performance while improving resource utilization. Real sequence searching data from BLAST is used to evaluate the effectiveness of G2LC. Experimental results show that G2LC guarantees the application performance while saving up to 20.14% of resources. PMID:26504488

  6. Computed-tomography modeled polyether ether ketone (PEEK) implants in revision cranioplasty.

    PubMed

    O'Reilly, Eamon B; Barnett, Sam; Madden, Christopher; Welch, Babu; Mickey, Bruce; Rozen, Shai

    2015-03-01

    Traditional cranioplasty methods focus on pre-operative or intraoperative hand molding. Recently, CT-guided polyether ether ketone (PEEK) plate reconstruction has enabled precise, time-saving reconstruction. This case series aims to show a single-institution experience with PEEK cranioplasty as an effective, safe, precise, reusable, and time-saving technique for large, complex cranial defects. We performed a 6-year retrospective review of cranioplasty procedures performed at our affiliated hospitals using PEEK implants. A total of nineteen patients underwent twenty-two cranioplasty procedures. Pre-operative, intra-operative, and post-operative data were collected. Time interval from injury to loss of primary cranioplasty averaged 57.7 months (0-336 mo); 4.0 months (n=10, range 0-19) in cases of trauma. Time interval from primary cranioplasty loss to PEEK cranioplasty was 11.8 months for infection (n=11, range 6-25 mo), 12.2 months for trauma (n=5, range 2-27 mo), and 0.3 months for cosmetic or functional reconstructions (n=3, range 0-1). Similar surgical techniques were used in all patients. Drains were placed in 11/22 procedures. Varying techniques were used in skin closure, including adjacent tissue transfer (4/22) and free tissue transfer (1/22). The PEEK plate required modification in four procedures. Three patients had reoperation following PEEK plate reconstruction. Cranioplasty utilizing a CT-guided PEEK plate allows easy inset, anatomic accuracy, mirror-image aesthetics, simplification of complex 3D defects, and potential time savings. Additionally, it is easily manipulated in the operating room and can be re-utilized in cases of intraoperative course changes or infection. Copyright © 2014 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.

  7. HAlign-II: efficient ultra-large multiple sequence alignment and phylogenetic tree reconstruction with distributed and parallel computing.

    PubMed

    Wan, Shixiang; Zou, Quan

    2017-01-01

    Multiple sequence alignment (MSA) plays a key role in biological sequence analyses, especially in phylogenetic tree construction. The extreme increase in next-generation sequencing output has created a shortage of efficient approaches for aligning ultra-large sets of biological sequences of different types. Distributed and parallel computing represents a crucial technique for accelerating ultra-large (e.g., files of more than 1 GB) sequence analyses. Based on HAlign and the Spark distributed computing system, we implement a highly cost-efficient and time-efficient HAlign-II tool to address ultra-large multiple biological sequence alignment and phylogenetic tree construction. Experiments on large-scale DNA and protein data sets (files larger than 1 GB) showed that HAlign-II saves time and space and outperforms current software tools. HAlign-II can efficiently carry out MSA and construct phylogenetic trees with ultra-large numbers of biological sequences, shows extremely high memory efficiency, and scales well with increases in computing resources. HAlign-II provides a user-friendly web server based on our distributed computing infrastructure. HAlign-II with open-source codes and datasets is available at http://lab.malab.cn/soft/halign.

  8. Computationally-Efficient Minimum-Time Aircraft Routes in the Presence of Winds

    NASA Technical Reports Server (NTRS)

    Jardin, Matthew R.

    2004-01-01

    A computationally efficient algorithm for minimizing the flight time of an aircraft in a variable wind field has been invented. The algorithm, referred to as Neighboring Optimal Wind Routing (NOWR), is based upon neighboring-optimal-control (NOC) concepts and achieves minimum-time paths by adjusting aircraft heading according to wind conditions at an arbitrary number of wind measurement points along the flight route. The NOWR algorithm may either be used in a fast-time mode to compute minimum-time routes prior to flight, or may be used in a feedback mode to adjust aircraft heading in real time. By traveling minimum-time routes instead of great-circle (direct) routes, flights across the United States can save an average of about 7 minutes, and as much as one hour of flight time during periods of strong jet-stream winds. The neighboring optimal routes computed via the NOWR technique have been shown to be within 1.5 percent of the absolute minimum-time routes for flights across the continental United States. On a typical 450-MHz Sun Ultra workstation, the NOWR algorithm produces complete minimum-time routes in less than 40 milliseconds. This corresponds to a rate of 25 optimal routes per second. The closest comparable optimization technique runs approximately 10 times slower. Airlines currently use various trial-and-error search techniques to determine which of a set of commonly traveled routes will minimize flight time. These algorithms are too computationally expensive for use in real-time systems, or in systems where many optimal routes need to be computed in a short amount of time. Instead of operating in real time, airlines will typically plan a trajectory several hours in advance using wind forecasts. If winds change significantly from forecasts, the resulting flights will no longer be minimum-time. The need for a computationally efficient wind-optimal routing algorithm is even greater in the case of new air-traffic-control automation concepts. For air-traffic-control automation, thousands of wind-optimal routes may need to be computed and checked for conflicts in just a few minutes. These factors motivated the need for a more efficient wind-optimal routing algorithm.
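
    The geometric building block behind any wind-routing scheme is the wind triangle: given a desired ground track and the local wind, choose the heading whose crosswind component cancels the wind's. A small sketch of that supporting geometry (not the NOWR algorithm itself; assumes the wind speed is below the airspeed):

    ```python
    import math

    def heading_for_track(track_deg, tas, wind_to_deg, wind_speed):
        """Heading that makes the ground track equal `track_deg`, given
        true airspeed `tas` and a wind blowing TOWARD `wind_to_deg`.
        Compass angles in degrees, speeds in any consistent unit."""
        rel = math.radians(wind_to_deg - track_deg)
        cross = wind_speed * math.sin(rel)    # cross-track wind component
        wca = math.asin(-cross / tas)         # wind-correction angle
        return (track_deg + math.degrees(wca)) % 360.0

    # Flying an easterly track (090) at 450 kt with a 100 kt wind blowing
    # toward the north requires crabbing right of track to about 102.8 deg.
    print(heading_for_track(90.0, 450.0, 0.0, 100.0))
    ```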

  9. Year-Round Daylight Saving Time Study : Volume 1. Interim Report on the Operation and Effects of Daylight Saving Time

    DOT National Transportation Integrated Search

    1975-06-01

    The analyses of the effects of Year-Round Daylight Saving Time were not conclusive because the effects could not be reliably separated from other changes occurring simultaneously, including fuel availability constraints, speed limit reductions, Sunday gasoline...

  10. A SCILAB Program for Computing Rotating Magnetic Compact Objects

    NASA Astrophysics Data System (ADS)

    Papasotiriou, P. J.; Geroyannis, V. S.

    We apply the so-called "complex-plane iterative technique" (CIT) to the computation of classical differentially rotating magnetic white dwarf and neutron star models. The program has been written in SCILAB (© INRIA-ENPC), a matrix-oriented high-level programming language, which can be downloaded free of charge from http://www-rocq.inria.fr/scilab. Due to the advanced capabilities of this language, the code is short and understandable. Highlights of the program are: (a) its time-saving character, (b) ease of use due to the built-in graphical user interface, (c) easy interfacing with Fortran via online dynamic linking. We interpret our numerical results in various ways by making extensive use of the graphics environment of SCILAB.

  11. Mixed Single/Double Precision in OpenIFS: A Detailed Study of Energy Savings, Scaling Effects, Architectural Effects, and Compilation Effects

    NASA Astrophysics Data System (ADS)

    Fagan, Mike; Dueben, Peter; Palem, Krishna; Carver, Glenn; Chantry, Matthew; Palmer, Tim; Schlacter, Jeremy

    2017-04-01

    It has been shown that a mixed precision approach that judiciously replaces double precision with single precision calculations can speed up global simulations. In particular, a mixed precision variation of the Integrated Forecast System (IFS) of the European Centre for Medium-Range Weather Forecasts (ECMWF) showed virtually the same quality of model results as the standard double precision version (Vana et al., Single precision in weather forecasting models: An evaluation with the IFS, Monthly Weather Review, in print). In this study, we perform detailed measurements of the savings in computing time and energy using a mixed precision variation of the OpenIFS model, analogous to the IFS variation used in Vana et al. We (1) present energy measurements for simulations in single and double precision using Intel's RAPL technology, (2) conduct a scaling study to quantify the effects that increasing model resolution has on both energy dissipation and computing cycles, (3) analyze the differences between single-core and multicore processing, and (4) compare the effects of different compiler technologies on the mixed precision OpenIFS code, in particular Intel icc/ifort versus GNU gcc/gfortran.
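
    The core effect is easy to reproduce in miniature (a generic sketch, unrelated to the OpenIFS code): time the same matrix product in double and single precision, where float32 halves the memory traffic and typically runs substantially faster:

      import time
      import numpy as np

      # Time an n x n matrix product in double and in single precision.
      n = 2000
      for dtype in (np.float64, np.float32):
          a = np.ones((n, n), dtype=dtype)
          t0 = time.perf_counter()
          a @ a
          print(dtype.__name__, round(time.perf_counter() - t0, 3), "s")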

  12. Chapter 10: Peak Demand and Time-Differentiated Energy Savings Cross-Cutting Protocol. The Uniform Methods Project: Methods for Determining Energy Efficiency Savings for Specific Measures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurnik, Charles W; Stern, Frank; Spencer, Justin

    Savings from electric energy efficiency measures and programs are often expressed in terms of annual energy and presented as kilowatt-hours per year (kWh/year). However, for a full assessment of the value of these savings, it is usually necessary to consider the measure or program's impact on peak demand as well as time-differentiated energy savings. This cross-cutting protocol describes methods for estimating the peak demand and time-differentiated energy impacts of measures implemented through energy efficiency programs.
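
    As a minimal illustration of the distinction (the hourly profile and the peak-period definition below are assumed, not the protocol's prescribed values), annual energy savings and peak-coincident demand savings can be computed from the same 8760-hour savings profile:

      import numpy as np

      # Weight an hourly savings profile by when the savings occur.
      hours = np.arange(8760)
      savings_kw = np.full(8760, 1.2)                # assumed flat 1.2 kW savings
      peak = (hours % 24 >= 14) & (hours % 24 < 18)  # assumed 2-6 pm peak window
      print("annual energy savings (kWh):", savings_kw.sum())
      print("average peak-period demand savings (kW):", savings_kw[peak].mean())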

  13. Project Lateday : The Level of Accidents Under the Effect of Daylight Saving All Year

    DOT National Transportation Integrated Search

    1975-10-01

    Year-round daylight saving time (YRDST) has recently been observed in the United States. The observance of double daylight saving time (DDST) is under some consideration. One of the principal expected effects of the adoption of these time systems is ...

  14. Playback system designed for X-Band SAR

    NASA Astrophysics Data System (ADS)

    Yuquan, Liu; Changyong, Dou

    2014-03-01

    SAR (Synthetic Aperture Radar) has extensive applications because it is daylight- and weather-independent. In particular, the X-Band SAR strip map designed by the Institute of Remote Sensing and Digital Earth, Chinese Academy of Sciences, provides high-ground-resolution images together with a large spatial coverage and a short acquisition time, so it is promising for many applications. When a sudden disaster strikes, emergency response requires radar signal data and imagery as soon as possible in order to take action quickly, reduce losses, and save lives. This paper summarizes a type of X-Band SAR playback processing system designed for disaster response and scientific needs. It describes the SAR data workflow, including the payload data transmission and reception process. The playback processing system performs signal analysis on the original data, providing SAR level-0 products and quick-look images. A gigabit network ensures efficient radar signal transmission from the recorder to the computation unit, while multi-thread parallel computing and a ping-pong buffering scheme ensure computation speed. Together, these allow high-speed data transmission and processing to meet the real-time requirements of SAR radar data playback.
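
    The ping-pong operation can be sketched generically (this is not the playback system's code; block sizes and the per-block work are placeholders): two buffers alternate between a reading thread and a processing thread so that I/O and computation overlap:

      import threading
      import queue

      free, full = queue.Queue(), queue.Queue()
      for _ in range(2):                  # the two ping-pong buffers
          free.put(bytearray(1 << 20))

      def reader(n_blocks):
          for i in range(n_blocks):
              buf = free.get()            # wait for an empty buffer
              buf[0] = i % 256            # stand-in for reading a radar record
              full.put(buf)
          full.put(None)                  # end-of-stream marker

      def processor():
          while (buf := full.get()) is not None:
              _ = sum(buf[:1024])         # stand-in for level-0 processing
              free.put(buf)               # recycle the buffer

      t1 = threading.Thread(target=reader, args=(8,))
      t2 = threading.Thread(target=processor)
      t1.start(); t2.start(); t1.join(); t2.join()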

  15. Computer Simulations Improve University Instructional Laboratories

    PubMed Central

    2004-01-01

    Laboratory classes are commonplace and essential in biology departments but can sometimes be cumbersome, unreliable, and a drain on time and resources. As university intakes increase, pressure on budgets and staff time can often lead to a reduction in practical class provision. Frequently, the abilities to use laboratory equipment, mix solutions, and manipulate test animals are essential learning outcomes, and “wet” laboratory classes are thus appropriate. In other classes, however, interpretation and manipulation of the data are the primary learning outcomes, and here computer-based simulations can provide a cheaper, easier, and less time- and labor-intensive alternative. We report the evaluation of two computer-based simulations of practical exercises: the first in chromosome analysis, the second in bioinformatics. Simulations can provide significant time savings to students (by a factor of four in our first case study) without affecting learning, as measured by performance in assessment. Moreover, under certain circumstances, performance can be improved by the use of simulations (by 7% in our second case study). We concluded that the introduction of these simulations can significantly enhance student learning where consideration of the learning outcomes indicates that it might be appropriate. In addition, they can offer significant benefits to teaching staff. PMID:15592599

  16. Power strain imaging based on vibro-elastography techniques

    NASA Astrophysics Data System (ADS)

    Wen, Xu; Salcudean, S. E.

    2007-03-01

    This paper describes a new ultrasound elastography technique, power strain imaging, based on vibro-elastography (VE) techniques. With this method, tissue is compressed by a vibrating actuator driven by low-pass or band-pass filtered white noise, typically in the 0-20 Hz range. Tissue displacements at different spatial locations are estimated by correlation-based approaches on the raw ultrasound radio-frequency signals and recorded as time sequences. The power spectra of these time sequences are computed by Fourier spectral analysis techniques. As the average of the power spectrum is proportional to the squared amplitude of the tissue motion, the square root of the average power over the range of excitation frequencies is used as a measure of the tissue displacement. Tissue strain is then determined by least-squares estimation of the gradient of the displacement field. The computation of the power spectra of the time sequences can be implemented efficiently by using Welch's periodogram method with moving windows or with accumulative windows and a forgetting factor. Compared to the transfer-function estimation originally used in VE, the computation of cross-spectral densities is not needed, which saves both memory and computation time. Phantom experiments demonstrate that the proposed method produces stable and operator-independent strain images with a high signal-to-noise ratio in real time. The approach has also been tested on a few patient data sets of the prostate region, and the results are encouraging.
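
    A hedged sketch of the spectral step (the signal, sampling rate, and window length below are illustrative, not the paper's settings): estimate the power spectrum of a displacement time sequence with Welch's method, then take the square root of the mean power over the excitation band as the displacement measure:

      import numpy as np
      from scipy.signal import welch

      fs = 200.0                                   # assumed sampling rate (Hz)
      t = np.arange(0, 10, 1 / fs)
      rng = np.random.default_rng(1)
      disp = 0.5 * np.sin(2 * np.pi * 8 * t) + 0.05 * rng.standard_normal(t.size)

      f, pxx = welch(disp, fs=fs, nperseg=512)     # averaged periodogram
      band = (f > 0) & (f <= 20)                   # 0-20 Hz excitation range
      print(np.sqrt(pxx[band].mean()))             # displacement-amplitude measure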

  17. An iterative truncation method for unbounded electromagnetic problems using varying order finite elements

    NASA Astrophysics Data System (ADS)

    Paul, Prakash

    2009-12-01

    The finite element method (FEM) is used to solve three-dimensional electromagnetic scattering and radiation problems. Finite element (FE) solutions of this kind contain two main types of error: discretization error and boundary error. Discretization error depends on the number of free parameters used to model the problem, and on how effectively these parameters are distributed throughout the problem space. To reduce the discretization error, the polynomial order of the finite elements is increased, either uniformly over the problem domain or selectively in those areas with the poorest solution quality. Boundary error arises from the condition applied to the boundary that is used to truncate the computational domain. To reduce the boundary error, an iterative absorbing boundary condition (IABC) is implemented. The IABC starts with an inexpensive boundary condition and gradually improves the quality of the boundary condition as the iteration continues. An automatic error control (AEC) is implemented to balance the two types of error. With the AEC, the boundary condition is improved when the discretization error has fallen to a low enough level to make this worth doing. The AEC has these characteristics: (i) it uses a very inexpensive truncation method initially; (ii) it allows the truncation boundary to be very close to the scatterer/radiator; (iii) it puts more computational effort on the parts of the problem domain where it is most needed; and (iv) it can provide as accurate a solution as needed, depending on the computational price one is willing to pay. To further reduce the computational cost, disjoint scatterers and radiators that are relatively far from each other are bounded separately and solved using a multi-region method (MRM). A simple analytical way to decide whether the MRM or the single-region method will be computationally cheaper is also described. To validate the accuracy and savings in computation time, differently shaped metallic and dielectric obstacles (spheres, ogives, cubes, flat plates, multi-layer slabs, etc.) are used for the scattering problems. For the radiation problems, waveguide-excited antennas (horn antenna, waveguide with flange, microstrip patch antenna) are used. Using the AEC, the peak reduction in computation time during the iteration is typically a factor of 2 compared to the IABC using the same element orders throughout; in some cases, it can be as high as a factor of 4.

  18. Estimating health benefits and cost-savings for achieving the Healthy People 2020 objective of reducing invasive colorectal cancer.

    PubMed

    Hung, Mei-Chuan; Ekwueme, Donatus U; White, Arica; Rim, Sun Hee; King, Jessica B; Wang, Jung-Der; Chang, Su-Hsin

    2018-01-01

    This study aims to quantify the aggregate potential life-years (LYs) saved and healthcare cost-savings if the Healthy People 2020 (HP2020) objective of reducing invasive colorectal cancer (CRC) incidence by 15% were met. We identified patients (n=886,380) diagnosed with invasive CRC between 2001 and 2011 from a nationally representative cancer dataset. We stratified these patients by sex, race/ethnicity, and age. Using these data and data from the 2001-2011 U.S. life tables, we estimated a survival function for each CRC group and the corresponding reference group and computed per-person LYs saved. We estimated per-person annual healthcare cost-savings using the 2008-2012 Medical Expenditure Panel Survey. We calculated aggregate LYs saved and cost-savings by multiplying the reduced number of CRC patients by the per-person LYs saved and lifetime healthcare cost-savings, respectively. We estimated an aggregate of 84,569 and 64,924 LYs saved for men and women, respectively, accounting for healthcare cost-savings of $329.3 million and $294.2 million (in 2013$), respectively. Per person, we estimated 6.3 potential LYs saved for both men and women who would have developed CRC, and healthcare cost-savings of $24,000 for men and $28,000 for women. Non-Hispanic whites and those aged 60-64 had the highest aggregate potential LYs saved and cost-savings. Achieving the HP2020 objective of reducing invasive CRC incidence by 15% by the year 2020 would potentially save nearly 150,000 life-years and $624 million in healthcare costs. Copyright © 2017. Published by Elsevier Inc.

  19. 78 FR 69925 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Bureau of the Fiscal Service...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-21

    ... regarding protections for such persons. The Privacy Act, as amended, regulates the use of computer matching... savings securities. C. Authority for Conducting the Matching Program This computer matching agreement sets... amended by the Computer Matching and Privacy Protection Act of 1988, as amended, and the regulations and...

  20. Estimated time of arrival and debiasing the time saving bias.

    PubMed

    Eriksson, Gabriella; Patten, Christopher J D; Svenson, Ola; Eriksson, Lars

    2015-01-01

    The time saving bias predicts that the time saved is overestimated when increasing speed from a high speed, and underestimated when increasing speed from a low speed. In a questionnaire, time saving judgements were investigated when information about estimated time of arrival was provided. In an active driving task, an alternative meter indicating the inverted speed was used to debias judgements. The simulated task was to first drive a distance at a given speed, and then drive the same distance again at the speed the driver judged was required to gain exactly 3 min in travel time compared with the first drive. A control group performed the same task with a standard speedometer and saved less than the targeted 3 min when increasing speed from a high speed, and more than 3 min when increasing from a low speed. Participants in the alternative-meter condition were closer to the target. The two studies corroborate a time saving bias and show that biased intuitive judgements can be debiased by displaying the inverted speed. Practitioner Summary: Previous studies have shown a cognitive bias in judgements of the time saved by increasing speed. This simulator study aims to improve driver judgements by introducing a speedometer indicating the inverted speed in active driving. The results show that the bias can be reduced by presenting the inverted speed, a finding that can be used when designing in-car information systems.
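
    A short worked example makes the bias concrete (the distance and speeds are illustrative): the same +10 km/h increase saves far more time at low speed than at high speed, and the inverted speed (pace, in min/km) turns the relationship into a linear one:

      # Minutes saved over d_km when increasing speed from v1 to v2 (km/h).
      def minutes_saved(d_km, v1, v2):
          return 60 * d_km * (1 / v1 - 1 / v2)

      print(minutes_saved(10, 40, 50))    # ~3.00 min saved at low speed
      print(minutes_saved(10, 120, 130))  # ~0.38 min saved at high speed
      # Pace makes saved time linear: 60/v1 - 60/v2 is the saving per km.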

  1. Balancing the benefits and detriments among women targeted by the Norwegian Breast Cancer Screening Program.

    PubMed

    Hofvind, Solveig; Román, Marta; Sebuødegård, Sofie; Falk, Ragnhild S

    2016-12-01

    To compute a ratio between the estimated numbers of lives saved from breast cancer death and the number of women diagnosed with a breast cancer that never would have been diagnosed during the woman's lifetime had she not attended screening (epidemiologic over-diagnosis) in the Norwegian Breast Cancer Screening Program. The Norwegian Breast Cancer Screening Program invites women aged 50-69 to biennial mammographic screening. Results from published studies using individual level data from the programme for estimating breast cancer mortality and epidemiologic over-diagnosis comprised the basis for the ratio. The mortality reduction varied from 36.8% to 43% among screened women, while estimates on epidemiologic over-diagnosis ranged from 7% to 19.6%. We computed the average estimates for both values. The benefit-detriment ratio, number of lives saved, and number of women over-diagnosed were computed for different scenarios of reduction in breast cancer mortality and epidemiologic over-diagnosis. For every 10,000 biennially screened women, followed until age 79, we estimated that 53-61 (average 57) women were saved from breast cancer death, and 45-126 (average 82) were over-diagnosed. The benefit-detriment ratio using average estimates was 1:1.4, indicating that the programme saved about one life per 1-2 women with epidemiologic over-diagnosis. The benefit-detriment ratio estimates of the Norwegian Breast Cancer Screening Program, expressed as lives saved from breast cancer death and epidemiologic over-diagnosis, should be interpreted with care due to substantial uncertainties in the estimates, and the differences in the scale of values of the events compared. © The Author(s) 2016.

  2. Potential savings in prescription drug costs for hypertension, hyperlipidemia, and diabetes mellitus by equivalent drug substitution in Austria: a nationwide cohort study.

    PubMed

    Heinze, Georg; Hronsky, Milan; Reichardt, Berthold; Baumgärtel, Christoph; Müllner, Marcus; Bucsics, Anna; Winkelmayer, Wolfgang C

    2015-04-01

    Healthcare systems spend considerable proportions of their budgets on the pharmaceutical treatment of hypertension, hyperlipidemia, and diabetes mellitus. From data on almost all residents of Austria, a country with mandatory health insurance and universal health coverage, we estimated the potential cost savings from substituting prescribed medicines with the cheapest medicines of the same chemical substance and strength available at the same time. Data from 8.3 million persons (98.5% of the total Austrian insured population) from 2009-2012 were analyzed. Real prescription costs for antihypertensive, lipid-lowering, and hypoglycemic medicines, and the savings achievable by same-substance, same-strength drug substitution, were computed for each active ingredient, and per gender and 1-year age category of patients. In 2012, health insurance providers spent EUR 231.3 million, EUR 77.8 million, and EUR 91.9 million for antihypertensive, lipid-lowering, and diabetes medications, of which EUR 52.2 million (22.6%), EUR 15.9 million (20.5%), and EUR 4.1 million (4.5%), respectively, could have been saved by same-substance drug substitution. The highest potential savings were calculated for amlodipine (EUR 8.0 million, 65.4%), simvastatin (EUR 12.2 million, 59.3%), and metformin (EUR 2.4 million, 54.6%), respectively. Higher savings for men than for women resulted from differing prescribed cumulative dosages and proportions of patients with a co-payment waiver. Potential cost savings in antihypertensive and lipid-lowering drugs increased from 2009 to 2012. Our study highlights the cost-savings potential of arguably the most acceptable of interventions: simply switching to the cheapest available same-substance, same-strength product. In 2012, this strategy could have reduced costs for antihypertensive, lipid-lowering, and hypoglycemic treatment by up to 18.0%.

  3. Ground testing and simulation. II - Aerodynamic testing and simulation: Saving lives, time, and money

    NASA Technical Reports Server (NTRS)

    Dayman, B., Jr.; Fiore, A. W.

    1974-01-01

    The present work discusses in general terms the various kinds of ground facilities, in particular, wind tunnels, which support aerodynamic testing. Since not all flight parameters can be simulated simultaneously, an important problem consists in matching parameters. It is pointed out that there is a lack of wind tunnels for a complete Reynolds-number simulation. Using a computer to simulate flow fields can result in considerable reduction of wind-tunnel hours required to develop a given flight vehicle.

  4. Effects of daylight savings time changes on stock market volatility.

    PubMed

    Berument, M Hakan; Dogan, Nukhet; Onar, Bahar

    2010-04-01

    The presence of daylight savings time effects on stock returns and on stock volatility was investigated using an EGARCH specification to model the conditional variance. The evidence gathered from the major United States stock markets for the period between 1967 and 2007 did not support the existence of the daylight savings time effect on stock returns or on volatility. Returns on the first business day following daylight savings time changes were not lower nor was the volatility higher, as would be expected if there were an effect.
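
    For readers unfamiliar with the specification, a minimal sketch with the Python `arch` package (fit to synthetic returns, not the study's stock-market data) sets up an EGARCH conditional-variance model of the kind used here:

      import numpy as np
      from arch import arch_model

      rng = np.random.default_rng(42)
      returns = rng.standard_normal(2000)  # stand-in for daily % returns

      # EGARCH(1,1) with an asymmetry (o) term for the conditional variance;
      # the study would additionally test DST-transition-day effects.
      am = arch_model(returns, mean="Constant", vol="EGARCH", p=1, o=1, q=1)
      res = am.fit(disp="off")
      print(res.params)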

  5. Computer Cache. Environmental Protection: Websites on the Environment

    ERIC Educational Resources Information Center

    Byerly, Greg; Brodie, Carolyn S.

    2005-01-01

    "Give a hoot, don't pollute!" "Save the environment!" "Save the Whales!" Ranger Rick. Recycle. These are all well-known phrases and emblems of the fight to "protect the environment." Young children seem to understand almost intuitively the need to do those simple things that will make the Earth a better place to live and play. However, especially…

  6. DEEP: A Database of Energy Efficiency Performance to Accelerate Energy Retrofitting of Commercial Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoon Lee, Sang; Hong, Tianzhen; Sawaya, Geof

    The paper presents a method and process to establish a database of energy efficiency performance (DEEP) to enable quick and accurate assessment of energy retrofits of commercial buildings. DEEP was compiled from the results of about 35 million EnergyPlus simulations. DEEP provides energy savings for the screening and evaluation of retrofit measures targeting small and medium-sized office and retail buildings in California. The prototype building models were developed for a comprehensive assessment of building energy performance based on the DOE commercial reference buildings and the California DEER prototype buildings. The prototype buildings represent seven building types across six construction vintages and 16 California climate zones. DEEP uses these prototypes to evaluate the energy performance of about 100 energy conservation measures covering envelope, lighting, heating, ventilation, air-conditioning, plug loads, and domestic hot water. DEEP contains the energy simulation results for individual retrofit measures as well as for packages of measures, to account for the interactive effects between multiple measures. The large-scale EnergyPlus simulations were conducted on the supercomputers at the National Energy Research Scientific Computing Center of Lawrence Berkeley National Laboratory. The pre-simulated database is part of an ongoing project to develop a web-based retrofit toolkit for small and medium-sized commercial buildings in California, which provides real-time energy retrofit feedback by querying DEEP with recommended measures, estimated energy savings, and financial payback periods based on users’ decision criteria of maximizing energy savings, energy cost savings, carbon reduction, or payback of investment. The pre-simulated database and associated comprehensive measure analysis enhance the ability to assess the performance of retrofits to reduce energy use for small and medium buildings, whose owners typically do not have the resources to conduct costly building energy audits. DEEP will be migrated into DEnCity - DOE’s Energy City, which integrates large-scale energy data into a multi-purpose, open, and dynamic database leveraging diverse sources of existing simulation data.

  7. The NIST Internet time service

    NASA Astrophysics Data System (ADS)

    Levine, Judah

    1994-05-01

    We will describe the NIST Network Time Service which provides time and frequency information over the Internet. Our first time server is located in Boulder, Colorado, a second backup server is under construction there, and we plan to install a third server on the East Coast later this year. The servers are synchronized to UTC(NIST) with an uncertainty of about 0.8 ms RMS and they will respond to time requests from any client on the Internet in several different formats, including the DAYTIME, TIME and NTP protocols. The DAYTIME and TIME protocols are the easiest to use and are suitable for providing time to PCs and other small computers. In addition to UTC(NIST), the DAYTIME message provides advance notice of leap seconds and of the transitions to and from Daylight Saving Time. The Daylight Saving Time notice is based on the US transition dates of the first Sunday in April and the last one in October. The NTP is a more complex protocol that is suitable for larger machines; it is normally run as a 'daemon' process in the background and can keep the time of the client to within a few milliseconds of UTC(NIST). We will describe the operating principles of various kinds of client software ranging from a simple program that queries the server once and sets the local clock to more complex 'daemon' processes (such as NTP) that continuously correct the time of the local clock based on periodic calibrations.
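
    For illustration, a DAYTIME query is a single short TCP exchange on port 13 (the sketch below is generic client code, not NIST's; NIST recommends NTP for automated clock-setting and asks clients to rate-limit such requests):

      import socket

      # Query a NIST server via the DAYTIME protocol (RFC 867, TCP port 13);
      # the reply is one human-readable line containing UTC(NIST).
      with socket.create_connection(("time.nist.gov", 13), timeout=5) as s:
          print(s.recv(256).decode("ascii", errors="replace").strip())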

  8. The NIST Internet time service

    NASA Technical Reports Server (NTRS)

    Levine, Judah

    1994-01-01

    We will describe the NIST Network Time Service which provides time and frequency information over the Internet. Our first time server is located in Boulder, Colorado, a second backup server is under construction there, and we plan to install a third server on the East Coast later this year. The servers are synchronized to UTC(NIST) with an uncertainty of about 0.8 ms RMS and they will respond to time requests from any client on the Internet in several different formats, including the DAYTIME, TIME and NTP protocols. The DAYTIME and TIME protocols are the easiest to use and are suitable for providing time to PCs and other small computers. In addition to UTC(NIST), the DAYTIME message provides advance notice of leap seconds and of the transitions to and from Daylight Saving Time. The Daylight Saving Time notice is based on the US transition dates of the first Sunday in April and the last one in October. The NTP is a more complex protocol that is suitable for larger machines; it is normally run as a 'daemon' process in the background and can keep the time of the client to within a few milliseconds of UTC(NIST). We will describe the operating principles of various kinds of client software ranging from a simple program that queries the server once and sets the local clock to more complex 'daemon' processes (such as NTP) that continuously correct the time of the local clock based on periodic calibrations.

  9. CPU architecture for a fast and energy-saving calculation of convolution neural networks

    NASA Astrophysics Data System (ADS)

    Knoll, Florian J.; Grelcke, Michael; Czymmek, Vitali; Holtorf, Tim; Hussmann, Stephan

    2017-06-01

    One of the most difficult problems in the use of artificial neural networks is the required computational capacity. Although large search-engine companies own specially developed hardware to provide the necessary computing power, the conventional user is left with the state-of-the-art method, namely the use of a graphics processing unit (GPU) as the computational basis. Although these processors are well suited to large matrix computations, they consume massive amounts of energy. Therefore, a new processor based on a field programmable gate array (FPGA) has been developed and optimized for the application of deep learning. This processor is presented in this paper. The processor can be adapted to a particular application (in this paper, an organic farming application). Its power consumption is only a fraction of that of a GPU implementation, and it should therefore be well suited for energy-saving applications.

  10. New Computational Approach to Electron Transport in Irregular Graphene Nanostructures

    NASA Astrophysics Data System (ADS)

    Mason, Douglas; Heller, Eric; Prendergast, David; Neaton, Jeffrey

    2009-03-01

    For novel graphene devices of nanoscale-to-macroscopic scale, many aspects of their transport properties are not easily understood due to difficulties in fabricating devices with regular edges. Here we develop a framework to efficiently calculate and potentially screen electronic transport properties of arbitrary nanoscale graphene device structures. A generalization of the established recursive Green's function method is presented, providing access to arbitrary device and lead geometries with substantial computer-time savings. Using single-orbital nearest-neighbor tight-binding models and the Green's function-Landauer scattering formalism, we will explore the transmission function of irregular two-dimensional graphene-based nanostructures with arbitrary lead orientation. Prepared by LBNL under contract DE-AC02-05CH11231 and supported by the U.S. Dept. of Energy Computer Science Graduate Fellowship under grant DE-FG02-97ER25308.
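
    The recursion at the heart of such methods can be shown in miniature (a textbook nearest-neighbour chain with scalar slices, not the paper's generalized algorithm): the Green's function is built up slice by slice through the Dyson equation instead of inverting the full Hamiltonian:

      import numpy as np

      # Surface Green's function of a 1-D tight-binding chain (on-site
      # energy 0, hopping t), grown one site at a time:
      #   g_n = 1 / (E + i*eta - t^2 * g_{n-1})
      def surface_green(E, n_sites, t=1.0, eta=1e-6):
          z = E + 1j * eta
          g = 1.0 / z                      # first slice
          for _ in range(n_sites - 1):
              g = 1.0 / (z - t * g * t)    # attach the next slice
          return g

      print(surface_green(0.5, 1000))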

  11. Computer aided radiation analysis for manned spacecraft

    NASA Technical Reports Server (NTRS)

    Appleby, Matthew H.; Griffin, Brand N.; Tanner, Ernest R., II; Pogue, William R.; Golightly, Michael J.

    1991-01-01

    In order to assist in the design of radiation shielding an analytical tool is presented that can be employed in combination with CAD facilities and NASA transport codes. The nature of radiation in space is described, and the operational requirements for protection are listed as background information for the use of the technique. The method is based on the Boeing radiation exposure model (BREM) for combining NASA radiation transport codes and CAD facilities, and the output is given as contour maps of the radiation-shield distribution so that dangerous areas can be identified. Computational models are used to solve the 1D Boltzmann transport equation and determine the shielding needs for the worst-case scenario. BREM can be employed directly with the radiation computations to assess radiation protection during all phases of design which saves time and ultimately spacecraft weight.

  12. Los Alamos Plutonium Facility Waste Management System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, K.; Montoya, A.; Wieneke, R.

    1997-02-01

    This paper describes the new computer-based transuranic (TRU) Waste Management System (WMS) being implemented at the Plutonium Facility at Los Alamos National Laboratory (LANL). The Waste Management System is a distributed computer processing system stored in a Sybase database and accessed by a graphical user interface (GUI) written in Omnis7. It resides on the local area network at the Plutonium Facility and is accessible by authorized TRU waste originators, count room personnel, radiation protection technicians (RPTs), quality assurance personnel, and waste management personnel for data input and verification. Future goals include bringing outside groups like the LANL Waste Management Facility on-line to participate in this streamlined system. The WMS is changing the TRU paper trail into a computer trail, saving time and eliminating errors and inconsistencies in the process.

  13. Remote multi-function fire alarm system based on internet of things

    NASA Astrophysics Data System (ADS)

    Wang, Lihui; Zhao, Shuai; Huang, Jianqing; Ji, Jianyu

    2018-05-01

    This project uses the MCU STC15W408AS (stable, energy-saving, high-speed), the temperature sensor DS18B20 (cheap, efficient, stable), the MQ2 resistance-type semiconductor smoke sensor (highly stable, fast-responding, economical) and the NRF24L01 wireless transceiver module (energy-saving, compact, reliable) as the main hardware to provide smoke-concentration and temperature readouts, intelligent voice alarms and short-distance wireless transmission. The whole system is safe, reliable, cheap, fast-reacting and performs well. The project uses the MCU STM32F103RCT6 as the main control chip, and uses the WIFI module ESP8266 and the wireless module NRF24L01 to build the gateway. Users can remotely monitor and control the related devices in real time on smartphones or computers. The system also realizes intelligent fire monitoring, remote fire extinguishing and cloud data storage through the third-party server Big IOT.

  14. Interactive home telehealth and burns: A pilot study.

    PubMed

    Hickey, Sean; Gomez, Jason; Meller, Benjamin; Schneider, Jeffery C; Cheney, Meredith; Nejad, Shamim; Schulz, John; Goverman, Jeremy

    2017-09-01

    The objective of this study is to review our experience incorporating Interactive Home Telehealth (IHT) visits into follow-up burn care. A retrospective review of all burn patients participating in IHT encounters over the course of 15 months was performed. Connections were established through secure video conferencing and call-routing software. Patients connected with a personal computer or tablet and providers connected with a desktop computer with a high-definition web camera. In some cases, high-definition digital images were emailed to the provider prior to the virtual consultation. For each patient, the following was collected: (1) patient and injury demographics (diagnosis, prognosis, and clinical management), (2) total number of encounters, (3) service for each encounter (burn, psychiatry, and rehabilitation), (4) length of visit, including travel distance and time saved and, (5) complications, including re-admissions and connectivity issues. 52 virtual encounters were performed with 31 patients during the first year of the pilot project from March 2015 to June 2016. Mean age of the participant was 44 years (range 18-83 years). Mean total burn surface area of the participant was 12% (range 1-80%). Average round-trip travel distance saved was 188 miles (range 4-822 miles). Average round-trip travel time saved was 201 min (range 20-564 min). There were no unplanned re-admissions and no complications. Five connectivity issues were reported, none of which prevented completion of the visit. Interactive Home Telehealth is a safe and feasible modality for delivering follow-up care to burn patients. Burn care providers benefit from the potential to improve outpatient clinic utilization. Patients benefit from improved access to multiple members of their specialized burn care team, as well as cost-reductions for patient travel expenses. Future studies are needed to ensure patient and provider satisfaction and to further validate the significance, cost-effectiveness and safety. Copyright © 2016 Elsevier Ltd and ISBI. All rights reserved.

  15. Progressive Damage and Failure Analysis of Composite Laminates

    NASA Astrophysics Data System (ADS)

    Joseph, Ashith P. K.

    Composite materials are widely used in various industries for making structural parts due to their higher strength-to-weight ratio, better fatigue life, corrosion resistance and material-property tailorability. To fully exploit the capability of composites, it is required to know the load carrying capacity of the parts made of them. Unlike metals, composites are orthotropic in nature and fail in a complex manner under various loading conditions, which makes them hard to analyze. The lack of reliable and efficient failure-analysis tools for composites has led industries to rely more on coupon- and component-level testing to estimate the design space. Due to the complex failure mechanisms, composite materials require a very large number of coupon-level tests to fully characterize the behavior. This makes the entire testing process very time consuming and costly. The alternative is to use virtual testing tools which can predict the complex failure mechanisms accurately. This reduces the cost to only the associated computational expenses, yielding significant savings. Some of the most desired features in a virtual testing tool are: (1) Accurate representation of failure mechanisms: the failure progression predicted by the virtual tool must be the same as that observed in experiments, and a tool has to be assessed based on the mechanisms it can capture. (2) Computational efficiency: the greatest advantages of virtual tools are the savings in time and money, so computational efficiency is one of the most needed features. (3) Applicability to a wide range of problems: structural parts are subjected to a variety of loading conditions, including static, dynamic and fatigue conditions, and a good virtual testing tool should be able to make good predictions for all of them. The aim of this PhD thesis is to develop a computational tool which can model the progressive failure of composite laminates under different quasi-static loading conditions. The analysis tool is validated by comparing the simulations against experiments for a selected number of quasi-static loading cases.

  16. Integrated Computational Materials Engineering Development of Advanced High Strength Steel for Lightweight Vehicles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hector, Jr., Louis G.; McCarty, Eric D.

    The goal of the ICME 3GAHSS project was to successfully demonstrate the applicability of Integrated Computational Materials Engineering (ICME) for the development and deployment of third generation advanced high strength steels (3GAHSS) for immediate weight reduction in passenger vehicles. The ICME approach integrated results from well-established computational and experimental methodologies to develop a suite of material constitutive models (deformation and failure), manufacturing process and performance simulation modules, a properties database, as well as the computational environment linking them together for both performance prediction and material optimization. This is the Final Report for the ICME 3GAHSS project, which achieved the following objectives: 1) Developed a 3GAHSS ICME model, which includes atomistic, crystal plasticity, state variable and forming models. The 3GAHSS model was implemented in commercially available LS-DYNA and a user guide was developed to facilitate use of the model. 2) Developed and produced two 3GAHSS alloys using two different chemistries and manufacturing processes, for use in calibrating and validating the 3GAHSS ICME Model. 3) Optimized the design of an automotive subassembly by substituting 3GAHSS for AHSS, yielding a design that met or exceeded all baseline performance requirements with a 30% mass savings. A technical cost model was also developed to estimate the cost per pound of weight saved when substituting 3GAHSS for AHSS. The project demonstrated the potential for 3GAHSS to achieve up to 30% weight savings in an automotive structure at a cost penalty of up to $0.32 to $1.26 per pound of weight saved. The 3GAHSS ICME Model enables the user to design 3GAHSS to desired mechanical properties in terms of strength and ductility.

  17. Re-Computation of Numerical Results Contained in NACA Report No. 496

    NASA Technical Reports Server (NTRS)

    Perry, Boyd, III

    2015-01-01

    An extensive examination of NACA Report No. 496 (NACA 496), "General Theory of Aerodynamic Instability and the Mechanism of Flutter," by Theodore Theodorsen, is described. The examination included checking equations and solution methods and re-computing interim quantities and all numerical examples in NACA 496. The checks revealed that NACA 496 contains computational shortcuts (time- and effort-saving devices for engineers of the time) and clever artifices (employed in its solution methods), but, unfortunately, also contains numerous tripping points (aspects of NACA 496 that have the potential to cause confusion) and some errors. The re-computations were performed employing the methods and procedures described in NACA 496, but using modern computational tools. With some exceptions, the magnitudes and trends of the original results were in fair-to-very-good agreement with the re-computed results. The exceptions included what are speculated to be computational errors in the original in some instances and transcription errors in the original in others. Independent flutter calculations were performed and, in all cases, including those where the original and re-computed results differed significantly, were in excellent agreement with the re-computed results. Appendix A contains NACA 496; Appendix B contains a Matlab(Registered) program that performs the re-computation of results; Appendix C presents three alternate solution methods, with examples, for the two-degree-of-freedom solution method of NACA 496; Appendix D contains the three-degree-of-freedom solution method (outlined in NACA 496 but never implemented), with examples.

  18. Large-scale 3-D EM modelling with a Block Low-Rank multifrontal direct solver

    NASA Astrophysics Data System (ADS)

    Shantsev, Daniil V.; Jaysaval, Piyoosh; de la Kethulle de Ryhove, Sébastien; Amestoy, Patrick R.; Buttari, Alfredo; L'Excellent, Jean-Yves; Mary, Theo

    2017-06-01

    We put forward the idea of using a Block Low-Rank (BLR) multifrontal direct solver to efficiently solve the linear systems of equations arising from a finite-difference discretization of the frequency-domain Maxwell equations for 3-D electromagnetic (EM) problems. The solver uses a low-rank representation for the off-diagonal blocks of the intermediate dense matrices arising in the multifrontal method to reduce the computational load. A numerical threshold, the so-called BLR threshold, controlling the accuracy of low-rank representations was optimized by balancing errors in the computed EM fields against savings in floating point operations (flops). Simulations were carried out over large-scale 3-D resistivity models representing typical scenarios for marine controlled-source EM surveys, and in particular the SEG SEAM model which contains an irregular salt body. The flop count, size of factor matrices and elapsed run time for matrix factorization are reduced dramatically by using BLR representations and can go down to, respectively, 10, 30 and 40 per cent of their full-rank values for our largest system with N = 20.6 million unknowns. The reductions are almost independent of the number of MPI tasks and threads at least up to 90 × 10 = 900 cores. The BLR savings increase for larger systems, which reduces the factorization flop complexity from O(N^2) for the full-rank solver to O(N^m) with m = 1.4-1.6. The BLR savings are significantly larger for deep-water environments that exclude the highly resistive air layer from the computational domain. A study in a scenario where simulations are required at multiple source locations shows that the BLR solver can become competitive in comparison to iterative solvers as an engine for 3-D controlled-source electromagnetic Gauss-Newton inversion that requires forward modelling for a few thousand right-hand sides.
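
    The compression idea behind BLR can be sketched with a truncated SVD (an illustration of the principle only; the actual solver's low-rank kernels and threshold semantics differ, and the relative cutoff here is an assumption): an off-diagonal block is replaced by two thin factors whose rank is set by a threshold on the singular values:

      import numpy as np

      # Rank-revealing compression of a block: keep singular values above
      # a relative threshold and store the block as the product X @ Yt.
      def low_rank_compress(block, threshold):
          U, s, Vt = np.linalg.svd(block, full_matrices=False)
          k = max(1, int(np.sum(s > threshold * s[0])))
          return U[:, :k] * s[:k], Vt[:k]

      rng = np.random.default_rng(0)
      A = rng.standard_normal((200, 50)) @ rng.standard_normal((50, 200))
      X, Yt = low_rank_compress(A, 1e-8)
      print(X.shape, Yt.shape, np.linalg.norm(A - X @ Yt) / np.linalg.norm(A))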

  19. Daylight Saving Time Transitions and Road Traffic Accidents

    PubMed Central

    Lahti, Tuuli; Nysten, Esa; Haukka, Jari; Sulander, Pekka; Partonen, Timo

    2010-01-01

    Circadian rhythm disruptions may have harmful impacts on health. Circadian rhythm disruptions caused by jet lag compromise the quality and amount of sleep and may lead to a variety of symptoms such as fatigue, headache, and loss of attention and alertness. Even a minor change in time schedule may cause considerable stress for the body. Transitions into and out of daylight saving time alter the social and environmental timing twice a year. According to earlier studies, this change in time-schedule leads to sleep disruption and fragmentation of the circadian rhythm. Since sleep deprivation decreases motivation, attention, and alertness, transitions into and out of daylight saving time may increase the amount of accidents during the following days after the transition. We studied the amount of road traffic accidents one week before and one week after transitions into and out of daylight saving time during years from 1981 to 2006. Our results demonstrated that transitions into and out of daylight saving time did not increase the number of traffic road accidents. PMID:20652036

  20. Sensitivity analysis of dynamic biological systems with time-delays.

    PubMed

    Wu, Wu Hsiung; Wang, Feng Sheng; Chang, Maw Shang

    2010-10-15

    Mathematical modeling has been applied to the study and analysis of complex biological systems for a long time. Some processes in biological systems, such as gene expression and feedback control in signal transduction networks, involve a time delay. These systems are represented as delay differential equation (DDE) models. Numerical sensitivity analysis of a DDE model by the direct method requires the solutions of model and sensitivity equations with time-delays. The major effort is the computation of the Jacobian matrix when computing the solution of the sensitivity equations. The computation of partial derivatives of complex equations, either by the analytic method or by symbolic manipulation, is time consuming, inconvenient, and prone to human error. To address this problem, an automatic approach to obtain the derivatives of complex functions efficiently and accurately is necessary. We previously proposed an efficient algorithm with adaptive step size control to compute the solution and dynamic sensitivities of biological systems described by ordinary differential equations (ODEs). Here, the adaptive direct-decoupled algorithm is extended to compute the solution and dynamic sensitivities of time-delay systems described by DDEs. To save human effort and avoid human error in the computation of partial derivatives, an automatic differentiation technique is embedded in the extended algorithm to evaluate the Jacobian matrix. The extended algorithm is implemented and applied to two realistic models with time-delays: the cardiovascular control system and the TNF-α signal transduction network. The results show that the extended algorithm is a good tool for dynamic sensitivity analysis on DDE models with little user intervention. A theoretical comparison with direct-coupled methods shows that the extended algorithm is efficient, accurate, and easy to use for end users without a programming background to perform dynamic sensitivity analysis on complex biological systems with time-delays.
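
    The role of automatic differentiation is easy to demonstrate (a generic sketch with a hypothetical two-state right-hand side, not the paper's solver): the Jacobian required by the sensitivity equations is obtained exactly, with no symbolic algebra or hand-coded partial derivatives:

      import jax
      import jax.numpy as jnp

      # Hypothetical right-hand side of a two-state system.
      def rhs(x):
          return jnp.array([-0.5 * x[0] + x[1] ** 2,
                            x[0] * x[1] - 1.0])

      jacobian = jax.jacfwd(rhs)               # exact Jacobian via forward-mode AD
      print(jacobian(jnp.array([1.0, 2.0])))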

  1. Energy 101: Energy Efficient Data Centers

    ScienceCinema

    None

    2018-04-16

    Data centers provide mission-critical computing functions vital to the daily operation of top U.S. economic, scientific, and technological organizations. These data centers consume large amounts of energy to run and maintain their computer systems, servers, and associated high-performance components—up to 3% of all U.S. electricity powers data centers. And as more information comes online, data centers will consume even more energy. Data centers can become more energy efficient by incorporating features like power-saving "stand-by" modes, energy monitoring software, and efficient cooling systems instead of energy-intensive air conditioners. These and other efficiency improvements to data centers can produce significant energy savings, reduce the load on the electric grid, and help protect the nation by increasing the reliability of critical computer operations.

  2. Operational flood control of a low-lying delta system using large time step Model Predictive Control

    NASA Astrophysics Data System (ADS)

    Tian, Xin; van Overloop, Peter-Jules; Negenborn, Rudy R.; van de Giesen, Nick

    2015-01-01

    The safety of low-lying deltas is threatened not only by riverine flooding but by storm-induced coastal flooding as well. For flood control, these deltas are mostly protected by a man-made environment in which dikes, dams and other adjustable infrastructures, such as gates, barriers and pumps, are widely constructed. Instead of always reinforcing and heightening these structures, it is worth considering making the most of the existing infrastructure to reduce damage and manage the delta in an operational, system-wide way. In this study, an advanced real-time control approach, Model Predictive Control (MPC), is proposed to operate these structures in the Dutch delta system (the Rhine-Meuse delta). The application covers non-linearity in the dynamic behavior of the water system and the structures. To deal with the non-linearity, a linearization scheme is applied which directly uses the gate height, instead of the structure flow, as the control variable. Given that MPC needs to compute control actions in real time, we also address computational time. A new large time step scheme is proposed to save computation time, in which different control variables can have different control time steps. Simulation experiments demonstrate that Model Predictive Control with the large time step setting is able to control a delta system better and much more efficiently than conventional operational schemes.
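
    The large-time-step idea is closely related to move blocking in MPC and can be sketched as follows (toy scalar dynamics and hypothetical numbers; the real controller handles the full non-linear delta model): the model steps at a small interval while the gate decision is only re-optimized every few steps, which shrinks the optimization problem:

      import cvxpy as cp

      n_steps, block = 24, 4                # 24 model steps, control changes every 4
      a, b = 0.98, 0.02                     # toy linear water-level dynamics
      u = cp.Variable(n_steps // block)     # one decision per large control step

      x, cost = 1.0, 0                      # initial deviation from target level
      for k in range(n_steps):
          x = a * x + b * u[k // block]     # hold the control within each block
          cost += cp.square(x)
      cost += 0.1 * cp.sum_squares(u)

      prob = cp.Problem(cp.Minimize(cost), [cp.abs(u) <= 1.0])
      prob.solve()
      print(u.value)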

  3. Girls Save the World through Computer Science

    ERIC Educational Resources Information Center

    Murakami, Christine

    2011-01-01

    It's no secret that fewer and fewer women are entering computer science fields. Attracting high school girls to computer science is only part of the solution. Retaining them while they are in higher education or the workforce is also a challenge. To solve this, there is a need to show girls that computer science is a wide-open field that offers…

  4. Practical Considerations of Waste Heat Reuse for a Mars Mission Advanced Life Support System

    NASA Technical Reports Server (NTRS)

    Levri, Julie; Finn, Cory; Luna, Bernadette (Technical Monitor)

    2000-01-01

    Energy conservation is a key issue in design optimization of Advanced Life Support Systems (ALSS) for long-term space missions. By considering designs for conservation at the system level, energy saving opportunities arise that would otherwise go unnoticed. This paper builds on a steady-state investigation of system-level waste heat reuse in an ALSS with a low degree of crop growth for a Mars mission. In past studies, such a system has been defined in terms of technology types, hot and cold stream identification and stream energy content. The maximum steady-state potential for power and cooling savings within the system was computed via the Pinch Method. In this paper, several practical issues are considered for achieving a pragmatic estimate of total system savings in terms of equivalent system mass (ESM), rather than savings solely in terms of power and cooling. In this paper, more realistic ESM savings are computed by considering heat transfer inefficiencies during material transfer. An estimate of the steady-state mass, volume and crewtime requirements associated with heat exchange equipment is made by considering heat exchange equipment material type and configuration, stream flow characteristics and associated energy losses during the heat exchange process. Also, previously estimated power and cooling savings are adjusted to reflect the impact of such energy losses. This paper goes one step further than the traditional Pinch Method of considering waste heat reuse in heat exchangers to include ESM savings that occur with direct reuse of a stream. For example, rather than exchanging heat between crop growth lamp cooling air and air going to a clothes dryer, air used to cool crop lamps might be reused directly for clothes drying purposes. When thermodynamically feasible, such an approach may increase ESM savings by minimizing the mass, volume and crewtime requirements associated with stream routing equipment.

  5. Modeling an enhanced ridesharing system with meet points and time windows

    PubMed Central

    Li, Xin; Hu, Sangen; Deng, Kai

    2018-01-01

    With the rise of e-hailing services in urban areas, ride sharing is becoming a common mode of transportation. This paper presents a mathematical model to design an enhanced ridesharing system with meet points and users’ preferred time windows. The introduction of meet points allows ridesharing operators to trade off the benefit of saving en-route delays against the cost of additional walking for some passengers to be collectively picked up or dropped off. This extension of the traditional door-to-door ridesharing problem brings more operational flexibility in urban areas (where potential requests may be densely distributed in a neighborhood), and thus can achieve better system performance in terms of reducing the total travel time and increasing the number of served passengers. We design and implement a Tabu-based meta-heuristic algorithm to solve the proposed mixed integer linear program (MILP). To evaluate the validity and effectiveness of the proposed model and solution algorithm, several scenarios are designed and also solved to optimality by CPLEX. Results demonstrate that (i) detailed route plans with passenger assignments to meet points can be obtained with en-route delay savings; (ii) compared to CPLEX, the meta-heuristic algorithm has the advantage of higher computational efficiency and produces good-quality solutions within 8%~15% of the global optima; and (iii) introducing meet points to a ridesharing system saves the total travel time by 2.7%-3.8% for small-scale ridesharing systems, with more benefits expected for systems with large fleets. This study provides a new tool to efficiently operate ridesharing systems, particularly when ridesharing vehicles are in short supply during peak hours. Traffic congestion mitigation can also be expected. PMID:29715302

  6. Cogeneration Technology Alternatives Study (CTAS). Volume 6: Computer data. Part 2: Residual-fired nocogeneration process boiler

    NASA Technical Reports Server (NTRS)

    Knightly, W. F.

    1980-01-01

    Computer generated data on the performance of the cogeneration energy conversion system are presented. Performance parameters included fuel consumption and savings, capital costs, economics, and emissions of residual fired process boilers.

  7. An Energy Saving System for a Beam Pumping Unit

    PubMed Central

    Lv, Hongqiang; Liu, Jun; Han, Jiuqiang; Jiang, An

    2016-01-01

    Beam pumping units are widely used in the oil production industry, but the energy efficiency of this artificial-lift machinery is generally low, especially for low-production wells and for high-production wells in their later stages. There are a number of ways to save energy in pumping units, with the periodic adjustment of stroke speed and the correction of balance deviation being two important methods. In this paper, an energy saving system for a beam pumping unit (ESS-BPU) based on the Internet of Things (IoT) is proposed. A total of four types of sensors, including load, angle, voltage, and current sensors, were used to detect the operating conditions of the pumping unit. Data from these sensors were fed into a controller installed in the oilfield to adjust the stroke speed automatically and estimate the degree of balance in real time. Additionally, remote supervision could be fulfilled using a browser on a computer or smartphone. Furthermore, data from a practical application were recorded and analyzed, showing that ESS-BPU helps reduce the energy loss caused by an unnecessarily high stroke speed and a poor degree of balance. PMID:27187402

  8. Facilitating energy savings with programmable thermostats: evaluation and guidelines for the thermostat user interface.

    PubMed

    Peffer, Therese; Perry, Daniel; Pritoni, Marco; Aragon, Cecilia; Meier, Alan

    2013-01-01

    Thermostats control heating and cooling in homes - representing a major part of domestic energy use - yet, poor ergonomics of these devices has thwarted efforts to reduce energy consumption. Theoretically, programmable thermostats can reduce energy by 5-15%, but in practice little to no savings compared to manual thermostats are found. Several studies have found that programmable thermostats are not installed properly, are generally misunderstood and have poor usability. After conducting a usability study of programmable thermostats, we reviewed several guidelines from ergonomics, general device usability, computer-human interfaces and building control sources. We analysed the characteristics of thermostats that enabled or hindered successfully completing tasks and in a timely manner. Subjects had higher success rates with thermostat displays with positive examples of guidelines, such as visibility of possible actions, consistency and standards, and feedback. We suggested other guidelines that seemed missing, such as navigation cues, clear hierarchy and simple decision paths. Our evaluation of a usability test of five residential programmable thermostats led to the development of a comprehensive set of specific guidelines for thermostat design including visibility of possible actions, consistency, standards, simple decision paths and clear hierarchy. Improving the usability of thermostats may facilitate energy savings.

  9. An efficient hybrid pseudospectral/finite-difference scheme for solving the TTI pure P-wave equation

    NASA Astrophysics Data System (ADS)

    Zhan, Ge; Pestana, Reynam C.; Stoffa, Paul L.

    2013-04-01

    The pure P-wave equation for modelling and migration in tilted transversely isotropic (TTI) media has attracted growing attention for imaging seismic data with anisotropy. Its desirable features are that it is completely free of shear-wave artefacts and that it consequently alleviates the numerical instabilities from which some systems of coupled equations generally suffer. However, because wavefield updating requires several forward and backward Fourier transforms at each time step, the computational cost is significant, which has hampered its adoption. We propose a hybrid pseudospectral (PS) and finite-difference (FD) scheme to solve the pure P-wave equation. In the hybrid solution, most of the costly wavenumber terms in the equation are replaced by inexpensive FD operators, which accelerates the computation and reduces the computational cost. To demonstrate the cost savings of the new scheme, 2D and 3D reverse-time migration (RTM) examples using the hybrid solution to the pure P-wave equation are carried out, and the respective runtimes are listed and compared. Numerical results show that the hybrid strategy demands less computation time and is faster than using the PS method alone. Furthermore, this new TTI RTM algorithm with the hybrid method is computationally less expensive than one based on the FD solution to the conventional TTI coupled equations.
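
    As a rough illustration of the trade-off the hybrid scheme exploits, the 1D toy below advances a scalar wave equation with either an FFT-based (pseudospectral) Laplacian or a cheap finite-difference stencil; the grid, velocity and stencil order are our assumptions, not the paper's TTI operators.

    ```python
    import numpy as np

    n, dx, c, dt = 256, 10.0, 2000.0, 1e-3
    x = np.arange(n) * dx
    k = 2 * np.pi * np.fft.fftfreq(n, dx)             # wavenumber axis

    def lap_ps(u):                                    # pseudospectral: exact,
        return np.real(np.fft.ifft(-(k ** 2) * np.fft.fft(u)))   # needs FFTs

    def lap_fd(u):                                    # 4th-order stencil: cheaper
        return (-np.roll(u, 2) + 16 * np.roll(u, 1) - 30 * u
                + 16 * np.roll(u, -1) - np.roll(u, -2)) / (12 * dx ** 2)

    u0 = np.exp(-(((x - x.mean()) / (8 * dx)) ** 2))  # Gaussian initial pulse
    for lap in (lap_ps, lap_fd):
        prev, cur = u0.copy(), u0.copy()
        for _ in range(500):                          # leapfrog: u_tt = c^2 u_xx
            prev, cur = cur, 2 * cur - prev + (c * dt) ** 2 * lap(cur)
        print(lap.__name__, float(np.abs(cur).max()))
    ```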

  10. BigDebug: Debugging Primitives for Interactive Big Data Processing in Spark

    PubMed Central

    Gulzar, Muhammad Ali; Interlandi, Matteo; Yoo, Seunghyun; Tetali, Sai Deep; Condie, Tyson; Millstein, Todd; Kim, Miryung

    2016-01-01

    Developers use cloud computing platforms to process a large quantity of data in parallel when developing big data analytics. Debugging the massive parallel computations that run in today’s data-centers is time consuming and error-prone. To address this challenge, we design a set of interactive, real-time debugging primitives for big data processing in Apache Spark, the next generation data-intensive scalable cloud computing platform. This requires re-thinking the notion of step-through debugging in a traditional debugger such as gdb, because pausing the entire computation across distributed worker nodes causes significant delay and naively inspecting millions of records using a watchpoint is too time consuming for an end user. First, BIGDEBUG’s simulated breakpoints and on-demand watchpoints allow users to selectively examine distributed, intermediate data on the cloud with little overhead. Second, a user can also pinpoint a crash-inducing record and selectively resume relevant sub-computations after a quick fix. Third, a user can determine the root causes of errors (or delays) at the level of individual records through a fine-grained data provenance capability. Our evaluation shows that BIGDEBUG scales to terabytes and its record-level tracing incurs less than 25% overhead on average. It determines crash culprits orders of magnitude more accurately and provides up to 100% time saving compared to the baseline replay debugger. The results show that BIGDEBUG supports debugging at interactive speeds with minimal performance impact. PMID:27390389

  11. SEMICONDUCTOR INTEGRATED CIRCUITS: A quasi-3-dimensional simulation method for a high-voltage level-shifting circuit structure

    NASA Astrophysics Data System (ADS)

    Jizhi, Liu; Xingbi, Chen

    2009-12-01

    A new quasi-three-dimensional (quasi-3D) numerical simulation method for a high-voltage level-shifting circuit structure is proposed. The performance of the 3D structure is analyzed by combining several 2D device structures; the 2D devices lie in two planes perpendicular to each other and to the surface of the semiconductor. In comparison with Davinci, a full 3D device simulation tool, the quasi-3D simulation method gives results for the potential and current distribution of the 3D high-voltage level-shifting circuit structure with appropriate accuracy, and the total CPU time for simulation is significantly reduced. The quasi-3D simulation technique can be used in many cases, with advantages such as saving computing time, making no demands on high-end computing hardware, and being easy to operate.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hasan, Iftekhar; Husain, Tausif; Sozer, Yilmaz

    This paper proposes an analytical machine design tool using magnetic equivalent circuit (MEC)-based particle swarm optimization (PSO) for a double-sided, flux-concentrating transverse flux machine (TFM). The magnetic equivalent circuit method is applied to analytically establish the relationship between the design objective and the input variables of prospective TFM designs. This is computationally less intensive and more time efficient than finite element solvers. A PSO algorithm is then used to design a machine with the highest torque density within the specified power range along with some geometric design constraints. The stator pole length, magnet length, and rotor thickness are the variables that define the optimization search space. Finite element analysis (FEA) was carried out to verify the performance of the MEC-PSO optimized machine. The proposed analytical design tool helps save computation time by at least 50% when compared to commercial FEA-based optimization programs, with results found to be in agreement with less than 5% error.
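
    The MEC model itself is not given in the record, but the optimizer named there is standard; the sketch below is a minimal particle swarm over a three-variable box (stator pole length, magnet length, rotor thickness), with a made-up quadratic objective standing in for the MEC torque-density evaluation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def pso(f, lo, hi, n=30, iters=100, w=0.7, c1=1.5, c2=1.5):
        """Minimal particle swarm optimizer (minimization) over box bounds."""
        dim = len(lo)
        x = rng.uniform(lo, hi, (n, dim))           # particle positions
        v = np.zeros_like(x)                        # particle velocities
        pbest = x.copy()                            # per-particle best points
        pval = np.apply_along_axis(f, 1, x)
        g = pbest[pval.argmin()].copy()             # global best point
        for _ in range(iters):
            r1, r2 = rng.random((2, n, dim))
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
            x = np.clip(x + v, lo, hi)              # respect geometric bounds
            val = np.apply_along_axis(f, 1, x)
            better = val < pval
            pbest[better], pval[better] = x[better], val[better]
            g = pbest[pval.argmin()].copy()
        return g, float(pval.min())

    # Hypothetical stand-in for the MEC evaluation of a candidate design.
    f = lambda z: (z[0] - 1.0) ** 2 + (z[1] - 0.5) ** 2 + (z[2] - 2.0) ** 2
    print(pso(f, np.zeros(3), np.full(3, 3.0)))
    ```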

  13. Investigation and evaluation of a computer program to minimize three-dimensional flight time tracks

    NASA Technical Reports Server (NTRS)

    Parke, F. I.

    1981-01-01

    The program for DC 8-D3 flight planning was slightly modified for three-dimensional flight planning for DC-10 aircraft. Several test runs of the modified program over the North Atlantic and North America were made to verify the program. While a previous program used geopotential height and temperature as meteorological data, the modified program uses wind direction, wind speed, and temperature received from the National Weather Service. A scanning program was written to collect the required weather information from the raw data, which are received in a packed decimal format. Two sets of weather data, the 12-hour and the 24-hour forecast based on 0000 GMT, are used for dynamic processes in test runs. To save computing time, only the weather data for the North Atlantic and North America are stored in advance in a PCF file and then scanned one by one.

  14. The effectiveness of element downsizing on a three-dimensional finite element model of bone trabeculae in implant biomechanics.

    PubMed

    Sato, Y; Wadamoto, M; Tsuga, K; Teixeira, E R

    1999-04-01

    Improving the validity of finite element analysis in implant biomechanics requires element downsizing; however, excessive downsizing demands more computer memory and calculation time. To investigate the effectiveness of element downsizing in the construction of a three-dimensional finite element model of bone trabeculae, models with different element sizes (600, 300, 150 and 75 microm) were constructed and the stress induced by a vertical 10 N load was analysed. The difference in von Mises stress values between the models with 600 and 300 microm element sizes was larger than that between the 300 and 150 microm models. On the other hand, no clear difference in stress values was detected among the models with 300, 150 and 75 microm element sizes. Downsizing of elements from 600 to 300 microm is therefore suggested to be effective in the construction of a three-dimensional finite element bone trabeculae model, with possible savings of computer memory and calculation time in the laboratory.

  15. Software Solution Saves Dollars

    ERIC Educational Resources Information Center

    Trotter, Andrew

    2004-01-01

    This article discusses computer software that can give classrooms and computer labs the capabilities of costly PCs at a small fraction of the cost. A growing number of cost-conscious school districts are finding budget relief in low-cost computer software known as "open source" that can do everything from manage school Web sites to equip…

  16. Cutting Technology Costs with Refurbished Computers

    ERIC Educational Resources Information Center

    Dessoff, Alan

    2010-01-01

    Many district administrators are finding that they can save money on computers by buying preowned ones instead of new ones. The practice has other benefits as well: It allows districts to give more computers to more students who need them, and it also promotes good environmental practices by keeping the machines out of landfills, where they…

  17. Best Manufacturing Practices: Report of Survey Conducted at UNISYS corporation Computer Systems Division, St. Paul, Minnesota

    DTIC Science & Technology

    1987-11-01

    assistance to the ATE test technicians by means of computer generated graphics on a 19" display terminal. The TEG presents colorized annotations on ACCA ...perform outstanding acts to meet goals. Savings and goals are auditable from reports, charts, SPC, and Oregon Matrix. COMPUTER-AIDED MANUFACTURING

  18. Wang-Landau sampling: Saving CPU time

    NASA Astrophysics Data System (ADS)

    Ferreira, L. S.; Jorge, L. N.; Leão, S. A.; Caparica, A. A.

    2018-04-01

    In this work we propose an improvement to the Wang-Landau (WL) method that saves about 60% of the CPU time while leading to the same results with the same accuracy. Using the 2D Ising model, we show that all WL simulations can be initiated from the output of an advanced WL level of a previous simulation. We show that up to the seventh WL level (f6) the simulations are not yet biased and can proceed to any value that a simulation started from the very beginning would reach; as a result, the initial WL levels need to be simulated only once. We also observe that the saving in CPU time is larger for larger lattice sizes, exactly where the computational cost is considerable. We carried out high-resolution simulations beginning from the first WL level (f0), and others beginning from the eighth WL level (f7) using all the data from the end of the previous level, and showed that the results for the critical temperature Tc and the critical static exponents β and γ coincide within the error bars. Finally, we applied the same procedure to the spin-1/2 Baxter-Wu model, where the saving in CPU time was about 64%.
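
    A minimal flat-histogram sketch of the method being improved, on the same 2D Ising model the authors use; the lattice size, the fixed sweep budget and the omission of the usual histogram-flatness check are our simplifications. The paper's saving comes from reusing the density-of-states estimate of an advanced level instead of re-simulating the early levels.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    L = 4                                    # tiny 2D Ising lattice, periodic
    N = L * L
    s = rng.choice([-1, 1], size=(L, L))

    def energy(spins):
        return -int(np.sum(spins * (np.roll(spins, 1, 0) + np.roll(spins, 1, 1))))

    levels = {e: i for i, e in enumerate(range(-2 * N, 2 * N + 1, 4))}
    lng = np.zeros(len(levels))              # running estimate of ln g(E)
    lnf = 1.0                                # modification factor ln f
    E = energy(s)
    while lnf > 1e-2:                        # a few WL levels only, for brevity
        H = np.zeros(len(levels))            # a real run would also require H
                                             # to be flat before halving ln f
        for _ in range(20000):
            i, j = rng.integers(L, size=2)
            dE = 2 * s[i, j] * (s[(i + 1) % L, j] + s[(i - 1) % L, j]
                                + s[i, (j + 1) % L] + s[i, (j - 1) % L])
            # accept flip with probability min(1, g(E_old) / g(E_new))
            if np.log(rng.random()) < lng[levels[E]] - lng[levels[E + dE]]:
                s[i, j] = -s[i, j]
                E += dE
            lng[levels[E]] += lnf
            H[levels[E]] += 1
        lnf /= 2.0                           # the paper's point: early levels
                                             # can be reused, not re-simulated
    print(lng[levels[-2 * N]] - lng.min())   # relative ln g of the ground state
    ```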

  19. Development of a small-scale computer cluster

    NASA Astrophysics Data System (ADS)

    Wilhelm, Jay; Smith, Justin T.; Smith, James E.

    2008-04-01

    An increase in demand for computing power in academia has created a need for high-performance machines. The computing power of a single processor has been steadily increasing but lags behind the demand for fast simulations. Since a single processor has hard limits on its performance, a cluster of computers with the proper software can multiply the performance of a single computer. Cluster computing has therefore become a much sought-after technology. Typical desktop computers could be used for cluster computing, but they are not intended for constant full-speed operation and take up more space than rack-mount servers. Specialty computers designed for use in clusters meet high-availability and space requirements but can be costly. A market segment exists where custom-built desktop computers can be arranged in a rack-mount configuration, gaining the space savings of traditional rack-mount computers while remaining cost effective. To explore these possibilities, an experiment was performed to develop a computing cluster from desktop components for the purpose of decreasing the computation time of advanced simulations. This study indicates that a small-scale cluster that multiplies the performance of a single desktop machine can be built from off-the-shelf components while minimizing occupied space and remaining cost effective.

  20. Computer-aided vs. tutor-delivered teaching of exposure therapy for phobia/panic: randomized controlled trial with pre-registration nursing students.

    PubMed

    Gega, L; Norman, I J; Marks, I M

    2007-03-01

    Exposure therapy is effective for phobic anxiety disorders (specific phobias, agoraphobia, social phobia) and panic disorder. Despite the high prevalence of these disorders in the community, sufferers often get no treatment, or get it only after a long delay. This is largely due to the scarcity of healthcare professionals trained in exposure therapy, which is due, in part, to the high cost of training: the traditional teaching methods employed are labour intensive, being based mainly on role-play in small groups with feedback and coaching from experienced trainers. In an attempt to increase knowledge and skills in exposure therapy, there is now some interest in providing relevant teaching as part of pre-registration nurse education. Computer programs have been developed to teach terminology and simulate clinical scenarios for health professionals, and they offer a potentially cost-effective alternative to traditional teaching methods. The study aimed to test whether student nurses would learn about exposure therapy for phobia/panic as well by computer-aided self-instruction as by face-to-face teaching, and to compare the individual and combined effects of the two educational methods on students' knowledge, skills and satisfaction: traditional face-to-face teaching, comprising a presentation with discussion and questions/answers led by a specialist cognitive behaviour nurse therapist, and a computer-aided self-instructional programme based on FearFighter, a self-help programme for patients with phobia/panic. The design was a randomised controlled trial with a crossover, conducted on 2 consecutive days, 4 h per day. Participants were 92 mental health pre-registration nursing students, of mixed gender, age and ethnic origin, with no previous training in cognitive behaviour therapy, studying at one UK university. The two teaching methods led to similar improvements in knowledge and skills, and to similar satisfaction, when used alone; using them in tandem conferred no added benefit. Computer-aided self-instruction was more efficient, as it saved teacher preparation and delivery time and needed no specialist tutor: it saved almost all preparation time and delivery effort for the expert teacher. When added to past results in medical students, the present results in nurses justify the use of computer-aided self-instruction for learning about exposure therapy and phobia/panic, and research into its value for other areas of health education.

  1. Impact of a University-Based Outpatient Telemedicine Program on Time Savings, Travel Costs, and Environmental Pollutants.

    PubMed

    Dullet, Navjit W; Geraghty, Estella M; Kaufman, Taylor; Kissee, Jamie L; King, Jesse; Dharmar, Madan; Smith, Anthony C; Marcin, James P

    2017-04-01

    The objective of this study was to estimate travel-related and environmental savings resulting from the use of telemedicine for outpatient specialty consultations with a university telemedicine program. The study was designed to retrospectively analyze the telemedicine consultation database at the University of California Davis Health System (UCDHS) between July 1996 and December 2013. Travel distances and travel times were calculated between the patient home, the telemedicine clinic, and the UCDHS in-person clinic. Travel cost savings and environmental impact were calculated by determining differences in mileage reimbursement rate and emissions between those incurred in attending telemedicine appointments and those that would have been incurred if a visit to the hub site had been necessary. There were 19,246 consultations identified among 11,281 unique patients. Telemedicine visits resulted in a total travel distance savings of 5,345,602 miles, a total travel time savings of 4,708,891 minutes or 8.96 years, and a total direct travel cost savings of $2,882,056. The mean per-consultation round-trip distance savings were 278 miles, average travel time savings were 245 minutes, and average cost savings were $156. Telemedicine consultations resulted in a total emissions savings of 1969 metric tons of CO2, 50 metric tons of CO, 3.7 metric tons of NOx, and 5.5 metric tons of volatile organic compounds. This study demonstrates the positive impact of a health system's outpatient telemedicine program on patient travel time, patient travel costs, and environmental pollutants. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  2. Building Specifications

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The building in the top photo is the new home of the National Permanent Savings Bank in Washington, D.C., designed by Hartman-Cox Architects. Its construction was based on a money-saving method of preparing building specifications which derived from NASA technology developed to obtain quality construction while holding down cost of launch facilities, test centers and other structures. Written technical specifications spell out materials and components to be used on construction projects and identify the quality tests each item must pass. Specifications can have major impact on construction costs. Poorly formulated specifications can lead to unacceptable construction which must be replaced, unnecessarily high materials costs, safety hazards, disputes and often additional costs due to delays and litigation. NASA's Langley Research Center developed a novel approach to providing accurate, uniform, cost-effective specifications which can be readily updated to incorporate new building technologies. Called SPECSINTACT, it is a computerized system accessible to all NASA centers involved in construction programs. The system contains a comprehensive catalog of master specifications applicable to many types of construction. It enables designers of any structure to call out relevant sections from computer storage and modify them to fit the needs of the project at hand. Architects and engineers can save time by concentrating their efforts on needed modifications rather than developing all specifications from scratch. Successful use of SPECSINTACT has led to a number of spinoff systems. One of the first was MASTERSPEC, developed from NASA's experience by Production Systems for Architects and Engineers, Inc., an organization established by the American Institute of Architects. MASTERSPEC, used in construction of the bank building pictured, follows the same basic format as SPECSINTACT and can be used in either automated or manual modes. The striking appearance of the bank building shows that, while MASTERSPEC saves time and money, its use involves no sacrifice in architectural design freedom. The Naval Engineering Facilities Command employs an automated specifications system based on SPECSINTACT. The Public Buildings Service of the General Services Administration used SPECSINTACT as a starting point in a plan to make its guideline specifications available to architects and engineers on a nationwide computer network. Public Technology, Inc., a NASA Technology Application Team, is working with Production Systems for Architects and Engineers, Inc., to promote widespread use of the system by state and local governments for cost benefits to taxpayers.

  3. Mathematical Description of Complex Chemical Kinetics and Application to CFD Modeling Codes

    NASA Technical Reports Server (NTRS)

    Bittker, D. A.

    1993-01-01

    A major effort in combustion research at the present time is devoted to the theoretical modeling of practical combustion systems. These include turbojet and ramjet air-breathing engines as well as ground-based gas-turbine power generating systems. The ability to use computational modeling extensively in designing these products not only saves time and money, but also helps designers meet the quite rigorous environmental standards that have been imposed on all combustion devices. The goal is to combine the very complex solution of the Navier-Stokes flow equations with realistic turbulence and heat-release models into a single computer code. Such a computational fluid-dynamic (CFD) code simulates the coupling of fluid mechanics with the chemistry of combustion to describe the practical devices. This paper will focus on the task of developing a simplified chemical model which can predict realistic heat-release rates as well as species composition profiles, and is also computationally rapid. We first discuss the mathematical techniques used to describe a complex, multistep fuel oxidation chemical reaction and develop a detailed mechanism for the process. We then show how this mechanism may be reduced and simplified to give an approximate model which adequately predicts heat release rates and a limited number of species composition profiles, but is computationally much faster than the original one. Only such a model can be incorporated into a CFD code without adding significantly to long computation times. Finally, we present some of the recent advances in the development of these simplified chemical mechanisms.
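
    As a hedged illustration of why such reduced mechanisms are attractive inside a CFD code, the toy below integrates a two-step global mechanism with made-up Arrhenius rates (not the hydrogen-air model discussed in the text) using sub-cycled explicit steps, the kind of cheap per-cell chemistry update a flow solver can afford.

    ```python
    import numpy as np

    # Hypothetical two-step global mechanism  F + O -> 2 I,  2 I -> P,
    # with invented nondimensional Arrhenius constants.
    A1, E1, A2, E2, T = 200.0, 8.0, 500.0, 4.0, 3.0

    def rates(c):
        F, O, I, P = c
        r1 = A1 * np.exp(-E1 / T) * F * O
        r2 = A2 * np.exp(-E2 / T) * I * I
        return np.array([-r1, -r1, 2.0 * r1 - 2.0 * r2, r2])

    c = np.array([1.0, 1.0, 0.0, 0.0])    # fuel, oxidizer, intermediate, product
    dt_flow, nsub = 1e-3, 100             # flow-solver step, chemistry sub-steps
    for _ in range(400):                  # stand-in for the flow iteration loop
        for _ in range(nsub):             # sub-cycling tames the stiff source
            c += (dt_flow / nsub) * rates(c)
    print(c)                              # fuel largely consumed, product formed
    ```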

  5. Asynchronous sampled-data approach for event-triggered systems

    NASA Astrophysics Data System (ADS)

    Mahmoud, Magdi S.; Memon, Azhar M.

    2017-11-01

    While aperiodically triggered network control systems save a considerable amount of communication bandwidth, they also pose challenges such as the coupling between control and event-condition design, the optimisation of the available resources (control, communication and computation power), and time delays due to computation and the communication network. With this motivation, the paper presents three contributions: separate designs of the control and event-triggering mechanisms, which simplify the overall analysis; an asynchronous linear quadratic Gaussian controller, which handles delays and the aperiodic nature of transmissions; and a novel event mechanism, which compares the cost of the aperiodic system against a reference periodic implementation. The proposed scheme is simulated on a linearised wind turbine model for pitch angle control, and the results show significant improvement over the periodic counterpart.
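
    A minimal sketch of the event-triggering idea only (not the paper's LQG design): a scalar plant is sampled periodically, but a new measurement is transmitted to the controller only when the error since the last transmission crosses a threshold. The plant, gain and threshold values are arbitrary assumptions.

    ```python
    # Toy event-triggered loop for the scalar plant x' = a x + b u.
    a, b, k = 1.0, 1.0, -3.0           # unstable plant, stabilizing gain
    dt, thresh = 0.01, 0.05
    x, xhat, sent = 1.0, 1.0, 0
    for step in range(1000):
        if abs(x - xhat) > thresh:      # event condition (a periodic scheme
            xhat, sent = x, sent + 1    # would transmit on all 1000 steps)
        u = k * xhat                    # control uses last transmitted state
        x += dt * (a * x + b * u)       # Euler step of the plant
    print(f"final |x| = {abs(x):.3f}, transmissions = {sent}/1000")
    ```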

  6. A VLBI variance-covariance analysis interactive computer program. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Bock, Y.

    1980-01-01

    An interactive computer program (in FORTRAN) for the variance-covariance analysis of VLBI experiments is presented for use in experiment planning, simulation studies and optimal design problems. The interactive mode is especially suited to these types of analyses, providing ease of operation as well as savings in time and cost. The geodetic parameters include baseline vector parameters and variations in polar motion and Earth rotation. A discussion of the theory on which the program is based provides an overview of the VLBI process, emphasizing the areas of interest to geodesy. Special emphasis is placed on the problem of determining correlations between simultaneous observations from a network of stations. A model suitable for covariance analyses is presented. Suggestions towards developing optimal observation schedules are included.

  7. Onboard Short Term Plan Viewer

    NASA Technical Reports Server (NTRS)

    Hall, Tim; LeBlanc, Troy; Ulman, Brian; McDonald, Aaron; Gramm, Paul; Chang, Li-Min; Keerthi, Suman; Kivlovitz, Dov; Hadlock, Jason

    2011-01-01

    Onboard Short Term Plan Viewer (OSTPV) is a computer program for electronic display of mission plans and timelines, both aboard the International Space Station (ISS) and in ISS ground control stations located in several countries. OSTPV was specifically designed both (1) for use within the limited ISS computing environment and (2) to be compatible with computers used in ground control stations. OSTPV supplants a prior system in which, aboard the ISS, timelines were printed on paper and incorporated into files that also contained other paper documents. Hence, the introduction of OSTPV has both reduced the consumption of resources and saved time in updating plans and timelines. OSTPV accepts, as input, the mission timeline output of a legacy, print-oriented, UNIX-based program called "Consolidated Planning System" and converts the timeline information for display in an interactive, dynamic, Windows Web-based graphical user interface that is used by both the ISS crew and ground control teams in real time. OSTPV enables the ISS crew to electronically indicate execution of timeline steps, launch electronic procedures, and efficiently report to ground control teams on the statuses of ISS activities, all by use of laptop computers aboard the ISS.

  8. Exhaustive Versus Randomized Searchers for Nonlinear Optimization in 21st Century Computing: Solar Application

    NASA Technical Reports Server (NTRS)

    Sen, Syamal K.; AliShaykhian, Gholam

    2010-01-01

    We present a simple multi-dimensional exhaustive search method to obtain, in a reasonable time, the optimal solution of a nonlinear programming problem. It is all the more relevant in the present-day non-mainframe computing scenario, where an estimated 95% of computing resources remain unutilized and computing speed touches petaflops; while processor speed is doubling every 18 months, bandwidth is doubling every 12 months and hard disk space every 9 months. A randomized search algorithm or, equivalently, an evolutionary search method is often used instead of an exhaustive search algorithm, the reason being that a randomized approach is usually polynomial-time, i.e., fast, while an exhaustive search method is exponential-time, i.e., slow. We discuss the increasing importance of exhaustive search in optimization, given the steady increase of computing power, for solving many real-world problems of reasonable size. We also discuss the computational error and complexity of the search algorithm, noting that no measuring device can usually measure a quantity with an accuracy greater than 0.005%. We stress that the quality of solution of the exhaustive search - a deterministic method - is better than that of a randomized search. In the 21st-century computing environment, exhaustive search cannot be set aside as untouchable, and it is not always exponential. We also describe a possible application of these algorithms to improving the efficiency of solar cells - a genuinely hot topic in the current energy crisis. These algorithms could be excellent tools in the hands of experimentalists: they could save not only a large amount of experimental time but also validate theory against experimental results quickly.
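
    A small illustration of the comparison being argued, under our own toy objective rather than the solar-cell application: a full grid sweep versus a random search with the same evaluation budget. The grid resolution and objective are assumptions for demonstration.

    ```python
    import itertools
    import random
    import time

    random.seed(0)

    f = lambda x, y, z: (x - 0.3) ** 2 + (y + 0.1) ** 2 + abs(z - 0.7)

    grid = [i / 20.0 - 1.0 for i in range(41)]   # 41 points per axis in [-1, 1]
    t0 = time.time()
    best_ex = min(itertools.product(grid, repeat=3), key=lambda p: f(*p))
    print("exhaustive:", best_ex, round(f(*best_ex), 4),
          f"({time.time() - t0:.2f}s, certifiably the grid optimum)")

    samples = [tuple(random.uniform(-1, 1) for _ in range(3))
               for _ in range(41 ** 3)]          # same evaluation budget
    best_rnd = min(samples, key=lambda p: f(*p))
    print("randomized:", tuple(round(v, 3) for v in best_rnd),
          round(f(*best_rnd), 4))
    ```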

  9. Provider-Independent Use of the Cloud

    NASA Astrophysics Data System (ADS)

    Harmer, Terence; Wright, Peter; Cunningham, Christina; Perrott, Ron

    Utility computing offers researchers and businesses the potential of significant cost savings, making it possible for them to match the cost of their computing and storage to their demand for such resources. A utility compute provider enables the purchase of compute infrastructures on demand: when a user requires computing resources, a provider provisions a resource for them and charges them only for their period of use of that resource. There has been significant growth in the number of cloud computing resource providers, and each has a different resource usage model, application process and application programming interface (API); developing generic multi-provider applications is thus difficult and time consuming. We have developed an abstraction layer that provides a single resource usage model, user authentication model and API for compute providers, enabling cloud-provider-neutral applications to be developed. In this paper we outline the issues in using external resource providers, give examples of using a number of the most popular cloud providers, and provide examples of developing provider-neutral applications. In addition, we discuss the development of the API to create a generic provisioning model based on a common architecture for cloud computing providers.
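
    The record does not give the layer's actual API, so the following is only a hypothetical shape of such an abstraction: applications code against one provider-neutral interface, and per-vendor adapters translate to each provider's provisioning calls. All class and method names here are ours, not the authors'.

    ```python
    from abc import ABC, abstractmethod

    class Provider(ABC):
        """Provider-neutral interface; one adapter per cloud vendor."""
        @abstractmethod
        def provision(self, cpus: int, mem_gb: int) -> str: ...
        @abstractmethod
        def release(self, resource_id: str) -> None: ...

    class VendorA(Provider):
        def provision(self, cpus, mem_gb):
            # a real adapter would call vendor A's provisioning API here
            return f"a-{cpus}x{mem_gb}"
        def release(self, resource_id):
            print("released", resource_id)

    def run_job(cloud: Provider):
        """Application code that never names a concrete provider."""
        rid = cloud.provision(cpus=4, mem_gb=16)
        try:
            print("running on", rid)        # the actual work happens here
        finally:
            cloud.release(rid)              # pay only for the period of use

    run_job(VendorA())
    ```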

  10. Syringeless power injector versus dual-syringe power injector: economic evaluation of user performance, the impact on contrast enhanced computed tomography (CECT) workflow exams, and hospital costs.

    PubMed

    Colombo, Giorgio L; Andreis, Ivo A Bergamo; Di Matteo, Sergio; Bruno, Giacomo M; Mondellini, Claudio

    2013-01-01

    The utilization of diagnostic imaging has substantially increased over the past decade in Europe and North America and continues to grow worldwide. The purpose of this study was to develop an economic evaluation of a syringeless power injector (PI) versus a dual-syringe PI for contrast enhanced computed tomography (CECT) in a hospital setting. Patients (n=2379) were enrolled at the Legnano Hospital between November 2012 and January 2013. They had been referred to the hospital for a CECT analysis and were randomized into two groups. The first group was examined with a 256-MDCT (MultiDetector Computed Tomography) scanner using a syringeless power injector, while the other group was examined with a 64-MDCT scanner using a dual-syringe injector. Data were collected on the operators' time required for the patient analysis steps as well as on the quantity of consumable materials used. The radiologic technologists' satisfaction with the use of the PIs was rated on a 10-point scale. A budget impact analysis and sensitivity analysis were performed under the base-case scenario. A total of 1,040 patients were examined using the syringeless system, and 1,339 with the dual-syringe system; the CECT examination quality was comparable for both PI systems. Equipment preparation time and releasing time per examination were 100±30 versus 180±30 seconds and 90±30 versus 140±20 seconds for the syringeless and dual-syringe PIs, respectively. On average, 10±3 mL of contrast media (CM) wastage per examination was observed with the dual-syringe PI and 0±1 mL with the syringeless PI. Technologists reported higher satisfaction with the syringeless PI than with the dual-syringe system (8.8 versus 8.0). The syringeless PI allows a saving of about €6.18 per patient, due both to the lower cost of the devices and to the better performance of the syringeless system. The univariate sensitivity analysis carried out on the base-case results within the standard deviation range confirmed the saving generated by using the syringeless device, with saving values between €5.40 and €6.20 per patient. The syringeless PI was found to be more user-friendly and efficient, minimizing contrast wastage and providing similar contrast enhancement quality compared to the dual-syringe injector, with comparable CECT examination quality.

  11. Compression of 3D Point Clouds Using a Region-Adaptive Hierarchical Transform.

    PubMed

    De Queiroz, Ricardo; Chou, Philip A

    2016-06-01

    In free-viewpoint video, there is a recent trend to represent scene objects as solids rather than using multiple depth maps. Point clouds have been used in computer graphics for a long time, and with the recent possibility of real-time capturing and rendering, point clouds have been favored over meshes in order to save computation. Each point in the cloud is associated with its 3D position and its color. We devise a method to compress the colors in point clouds which is based on a hierarchical transform and arithmetic coding. The transform is a hierarchical sub-band transform that resembles an adaptive variation of a Haar wavelet. The arithmetic encoding of the coefficients assumes Laplace distributions, one per sub-band, and the Laplace parameter for each distribution is transmitted to the decoder using a custom method. The geometry of the point cloud is encoded using the well-established octree scanning. Results show that the proposed solution performs comparably to the current state of the art, on many occasions outperforming it, while being much more computationally efficient. We believe this work represents the state of the art in intra-frame compression of point clouds for real-time 3D video.
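
    The hierarchical transform described is the region-adaptive Haar transform (RAHT); a single weighted-Haar butterfly, the building block of that transform, can be sketched in a few lines. The colors and weights below are toy values.

    ```python
    import numpy as np

    def raht_pair(c1, w1, c2, w2):
        """One butterfly of the region-adaptive Haar transform: two occupied
        cells with point-count weights merge into a low-pass coefficient
        (carried up the octree) and a high-pass coefficient (entropy-coded)."""
        a, b, n = np.sqrt(w1), np.sqrt(w2), np.sqrt(w1 + w2)
        return (a * c1 + b * c2) / n, (-b * c1 + a * c2) / n, w1 + w2

    # Two similar red points and two similar blue points, merged pairwise.
    c = [np.array([200.0, 90, 40]), np.array([210.0, 85, 42]),
         np.array([60.0, 120, 180]), np.array([62.0, 118, 178])]
    l1, h1, w1 = raht_pair(c[0], 1, c[1], 1)
    l2, h2, w2 = raht_pair(c[2], 1, c[3], 1)
    dc, h3, _ = raht_pair(l1, w1, l2, w2)
    print("DC:", dc)              # energy compacts into the DC term
    print("AC:", h1, h2, h3)      # h1, h2 are small; h3 carries the contrast
    ```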

  12. Fatal alcohol-related traffic crashes increase subsequent to changes to and from daylight savings time.

    PubMed

    Hicks, G J; Davis, J W; Hicks, R A

    1998-06-01

    On the hypothesis that sleepiness and alcohol interact to increase the risk of alcohol-related traffic fatalities, the percentages of alcohol-related fatal traffic crashes were assessed for the entire state of New Mexico for the years 1989-1992, for each of the seven days that preceded the changes to and from Daylight Savings Time and for each of the 14 days that followed them. Consistent with our hypothesis, the percentage of alcohol-related fatal crashes increased significantly during the first seven days after these changes in Daylight Savings Time.

  13. Daylight saving time transitions and hospital treatments due to accidents or manic episodes

    PubMed Central

    Lahti, Tuuli A; Haukka, Jari; Lönnqvist, Jouko; Partonen, Timo

    2008-01-01

    Background Daylight saving time affects millions of people annually but its impacts are still widely unknown. Sleep deprivation and the change of circadian rhythm can trigger mental illness and cause higher accident rates. Transitions into and out of daylight saving time changes the circadian rhythm and may cause sleep deprivation. Thus it seems plausible that the prevalence of accidents and/or manic episodes may be higher after transition into and out of daylight saving time. The aim of this study was to explore the effects of transitions into and out of daylight saving time on the incidence of accidents and manic episodes in the Finnish population during the years of 1987 to 2003. Methods The nationwide data were derived from the Finnish Hospital Discharge Register. From the register we obtained the information about the hospital-treated accidents and manic episodes during two weeks before and two weeks after the transitions in 1987–2003. Results The results were negative, as the transitions into or out of daylight saving time had no significant effect on the incidence of accidents or manic episodes. Conclusion One-hour transitions do not increase the incidence of manic episodes or accidents which require hospital treatment. PMID:18302734

  14. Automated procedures for sizing aerospace vehicle structures /SAVES/

    NASA Technical Reports Server (NTRS)

    Giles, G. L.; Blackburn, C. L.; Dixon, S. C.

    1972-01-01

    Results from a continuing effort to develop automated methods for structural design are described. A system of computer programs presently under development, called SAVES, is intended to automate the preliminary structural design of a complete aerospace vehicle. Each step in the automated design process of the SAVES system of programs is discussed, with emphasis placed on the use of automated routines for generation of finite-element models. The versatility of these routines is demonstrated by structural models generated for a space shuttle orbiter, an advanced technology transport, and a hydrogen-fueled Mach 3 transport. Illustrative numerical results are presented for the Mach 3 transport wing.

  15. Power Saving Control for Battery-Powered Portable WLAN APs

    NASA Astrophysics Data System (ADS)

    Ogawa, Masakatsu; Hiraguri, Takefumi

    This paper proposes a power saving control function for battery-powered portable wireless LAN (WLAN) access points (APs) to extend the battery life. The IEEE802.11 standard does not support power saving control for APs. To enable a sleep state for an AP, the AP forces the stations (STAs) to refrain from transmitting frames using the network allocation vector (NAV) while the AP is sleeping. Thus the sleep state for the AP can be employed without causing frame loss at the STAs. Numerical analysis and computer simulation reveal that the newly proposed control technique conserves power compared to the conventional control.

  16. Identification of cost effective energy conservation measures

    NASA Technical Reports Server (NTRS)

    Bierenbaum, H. S.; Boggs, W. H.

    1978-01-01

    In addition to a successful program of readily implemented conservation actions for reducing building energy consumption at Kennedy Space Center, recent detailed analyses have identified further substantial savings for buildings representative of technical facilities designed when energy costs were low. The techniques employed to determine these energy savings consisted of facility configuration analysis, power and lighting measurements, detailed computer simulations, and simulation verification. Use of these methods resulted in the identification of projected energy savings as large as $330,000 a year (approximately a two-year break-even period) in a single building. Application of these techniques to other commercial buildings is discussed.

  17. Numerical study of hydrogen-air supersonic combustion by using elliptic and parabolized equations

    NASA Technical Reports Server (NTRS)

    Chitsomboon, T.; Tiwari, S. N.

    1986-01-01

    The two-dimensional Navier-Stokes and species continuity equations are used to investigate supersonic chemically reacting flow problems which are related to scramjet-engine configurations. A global two-step finite-rate chemistry model is employed to represent the hydrogen-air combustion in the flow. An algebraic turbulence model is adopted for turbulent flow calculations. The explicit unsplit MacCormack finite-difference algorithm is used to develop a computer program suitable for a vector processing computer. The computer program developed is then used to integrate the system of governing equations in time until convergence is attained. The chemistry source terms in the species continuity equations are evaluated implicitly to alleviate the stiffness associated with fast chemical reactions. The problems solved by the elliptic code are re-investigated using a set of two-dimensional parabolized Navier-Stokes and species equations. A linearized fully-coupled fully-implicit finite-difference algorithm is used to develop a second computer code which solves the governing equations by marching in space rather than time, resulting in a considerable saving in computer resources. Results obtained using the parabolized formulation are compared with the results obtained using the fully-elliptic equations. The comparisons indicate fairly good agreement between the results of the two formulations.

  18. Scalable Analysis Methods and In Situ Infrastructure for Extreme Scale Knowledge Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duque, Earl P.N.; Whitlock, Brad J.

    High performance computers have for many years been on a trajectory that gives them extraordinary compute power with the addition of more and more compute cores. At the same time, other system parameters such as the amount of memory per core and bandwidth to storage have remained constant or have barely increased. This creates an imbalance in the computer, giving it the ability to compute a lot of data that it cannot reasonably save out due to time and storage constraints. While technologies have been invented to mitigate this problem (burst buffers, etc.), software has been adapting to employ in situ libraries which perform data analysis and visualization on simulation data while it is still resident in memory. This avoids the need to ever have to pay the costs of writing many terabytes of data files. Instead, in situ enables the creation of more concentrated data products such as statistics, plots, and data extracts, which are all far smaller than the full-sized volume data. With the increasing popularity of in situ, multiple in situ infrastructures have been created, each with its own mechanism for integrating with a simulation. To make it easier to instrument a simulation with multiple in situ infrastructures and include custom analysis algorithms, this project created the SENSEI framework.
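
    A schematic of the in situ pattern itself (not the SENSEI API): the solver hands each timestep's in-memory field to registered analysis callbacks, which emit small data products instead of full volume dumps. All names and the diffusion-like update are illustrative assumptions.

    ```python
    import numpy as np

    class InSituHub:
        """Registry of analysis callbacks run while data is still in memory."""
        def __init__(self):
            self.analyses, self.products = [], []
        def register(self, fn):
            self.analyses.append(fn)
        def execute(self, step, field):
            for fn in self.analyses:
                self.products.append((step, fn.__name__, fn(field)))

    def mean_value(f):
        return float(f.mean())

    def max_abs(f):
        return float(np.abs(f).max())

    hub = InSituHub()
    hub.register(mean_value)              # small data products are kept...
    hub.register(max_abs)                 # ...instead of raw field dumps

    field = np.random.default_rng(0).normal(size=(64, 64))
    for step in range(3):                 # stand-in for the simulation loop
        field = 0.25 * (np.roll(field, 1, 0) + np.roll(field, -1, 0)
                        + np.roll(field, 1, 1) + np.roll(field, -1, 1))
        hub.execute(step, field)          # analyze before the data is gone
    print(hub.products)
    ```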

  19. Efficient pairwise RNA structure prediction using probabilistic alignment constraints in Dynalign

    PubMed Central

    2007-01-01

    Background Joint alignment and secondary structure prediction of two RNA sequences can significantly improve the accuracy of the structural predictions. Methods addressing this problem, however, are forced to employ constraints that reduce computation by restricting the alignments and/or structures (i.e. folds) that are permissible. In this paper, a new methodology is presented for establishing alignment constraints based on nucleotide alignment and insertion posterior probabilities. Using a hidden Markov model, posterior probabilities of alignment and insertion are computed for all possible pairings of nucleotide positions from the two sequences. These alignment and insertion posterior probabilities are additively combined to obtain probabilities of co-incidence for nucleotide position pairs. A suitable alignment constraint is obtained by thresholding the co-incidence probabilities. The constraint is integrated with Dynalign, a free energy minimization algorithm for joint alignment and secondary structure prediction. The resulting method is benchmarked against the previous version of Dynalign and against other programs for pairwise RNA structure prediction. Results The proposed technique eliminates manual parameter selection in Dynalign and provides significant computational time savings over the prior constraints in Dynalign, while simultaneously providing a small improvement in structural prediction accuracy. Savings are also realized in memory. In experiments over a 5S RNA dataset with an average sequence length of approximately 120 nucleotides, the method reduces computation by a factor of 2. The method performs favorably in comparison to other programs for pairwise RNA structure prediction, yielding better accuracy on average and requiring significantly fewer computational resources. Conclusion Probabilistic analysis can be utilized to automate the determination of alignment constraints for pairwise RNA structure prediction methods in a principled fashion. These constraints can reduce the computational and memory requirements of these methods while maintaining or improving their accuracy of structural prediction. This extends the practical reach of these methods to longer sequences. The revised Dynalign code is freely available for download. PMID:17445273
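
    A toy sketch of the constraint construction, with random numbers standing in for the HMM posteriors: alignment and insertion posteriors are additively combined into co-incidence probabilities, and only position pairs above a threshold are kept for the joint-folding dynamic program. The sizes and the 0.1 threshold are our assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n1, n2 = 8, 9                               # toy sequence lengths

    p_align = rng.dirichlet(np.ones(n2), size=n1) * 0.9   # P(i aligned to j)
    p_ins1 = rng.random((n1, n2)) * 0.05                  # toy insertion terms
    p_ins2 = rng.random((n1, n2)) * 0.05

    p_coincide = p_align + p_ins1 + p_ins2      # additive combination
    allowed = p_coincide >= 0.1                 # thresholded constraint mask
    print(f"DP cells kept: {allowed.sum()} of {n1 * n2}")
    ```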

  20. Modeling Pilot Behavior for Assessing Integrated Alert and Notification Systems on Flight Decks

    NASA Technical Reports Server (NTRS)

    Cover, Mathew; Schnell, Thomas

    2010-01-01

    Numerous new flight deck configurations for caution, warning, and alerts can be conceived, yet testing them with human-in-the-loop experiments to evaluate each one would not be practical. New sensors, instruments, and displays are being put into cockpits every day, and this is particularly true as we enter the dawn of the Next Generation Air Transportation System (NextGen). By modeling pilot behavior in a computer simulation, an unlimited number of unique caution, warning, and alert configurations can be evaluated 24/7 by a computer. These computer simulations can then identify the most promising candidate formats for further evaluation in higher fidelity, but more costly, human-in-the-loop (HITL) simulations. Evaluations using batch simulations with human performance models save time and money, and enable a broader consideration of possible caution, warning, and alerting configurations for future flight decks.

  1. On the Development of a Computing Infrastructure that Facilitates IPPD from a Decision-Based Design Perspective

    NASA Technical Reports Server (NTRS)

    Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.

    1995-01-01

    Integrated Product and Process Development (IPPD) embodies the simultaneous application of both system and quality engineering methods throughout an iterative design process. The use of IPPD results in the time-conscious, cost-saving development of engineering systems. Georgia Tech has proposed the development of an Integrated Design Engineering Simulator that will merge Integrated Product and Process Development with interdisciplinary analysis techniques and state-of-the-art computational technologies. To implement IPPD, a Decision-Based Design perspective is encapsulated in an approach that focuses on the role of the human designer in product development. The approach has two parts and is outlined in this paper. First, an architecture, called DREAMS, is being developed that facilitates design from a decision-based perspective. Second, a supporting computing infrastructure, called IMAGE, is being designed. The current status of development is given and future directions are outlined.

  2. Flow in nonrotating passages of radial inflow turbines

    NASA Technical Reports Server (NTRS)

    Baskharone, E.; Hamed, A.; Tabakoff, W.

    1979-01-01

    The analysis of the irrotational incompressible flow field in the stator unit of a radial inflow turbine is presented. The solution in the combined scroll-nozzle assembly is complicated by the domain geometry and by its multiconnectivity. This model is necessary, however, in order to provide a better understanding of the mutual interaction effects of these two components on the flow field. The finite element method is used in the solution, which is limited to the two-dimensional case. A substructuring technique is adopted in the computational procedure and results in considerable savings in both computer time and core storage requirements. The results are presented for the flow velocity magnitude and direction in the scroll and through the various nozzles, for two nozzle blade geometries. In addition, the mass flow rates in the different nozzles are computed and their deviations from the mean value determined.

  3. Software error detection

    NASA Technical Reports Server (NTRS)

    Buechler, W.; Tucker, A. G.

    1981-01-01

    Several methods were employed to detect both the occurrence and the source of errors in the operational software of the AN/SLQ-32, a large embedded real-time electronic warfare command and control system for the ROLM 1606 computer. The ROLM computer provides information about invalid addressing, improper use of privileged instructions, stack overflows, and unimplemented instructions. Additionally, software techniques were developed to detect invalid jumps, indices out of range, infinite loops, stack underflows, and field size errors. Finally, data are saved to provide information about the status of the system when an error is detected. This information includes I/O buffers, interrupt counts, stack contents, and recently passed locations. The various errors detected, techniques to assist in debugging problems, and segment simulation on a non-target computer are discussed. These error detection techniques were a major factor in the success of finding the primary cause of error in 98% of over 500 system dumps.

  4. Turbine Blade and Endwall Heat Transfer Measured in NASA Glenn's Transonic Turbine Blade Cascade

    NASA Technical Reports Server (NTRS)

    Giel, Paul W.

    2000-01-01

    Higher operating temperatures increase the efficiency of aircraft gas turbine engines, but can also degrade internal components. High-pressure turbine blades just downstream of the combustor are particularly susceptible to overheating. Computational fluid dynamics (CFD) computer programs can predict the flow around the blades so that potential hot spots can be identified and appropriate cooling schemes can be designed. Various blade and cooling schemes can be examined computationally before any hardware is built, thus saving time and effort. Often though, the accuracy of these programs has been found to be inadequate for predicting heat transfer. Code and model developers need highly detailed aerodynamic and heat transfer data to validate and improve their analyses. The Transonic Turbine Blade Cascade was built at the NASA Glenn Research Center at Lewis Field to help satisfy the need for this type of data.

  5. GPU accelerated implementation of NCI calculations using promolecular density.

    PubMed

    Rubez, Gaëtan; Etancelin, Jean-Matthieu; Vigouroux, Xavier; Krajecki, Michael; Boisson, Jean-Charles; Hénon, Eric

    2017-05-30

    The NCI approach is a modern tool to reveal chemical noncovalent interactions. It is particularly attractive for describing ligand-protein binding. A custom implementation of NCI using promolecular density is presented. It is designed to leverage the computational power of NVIDIA graphics processing unit (GPU) accelerators through the CUDA programming model. The performance of three versions of the code is examined on a test set of 144 systems. NCI calculations are particularly well suited to the GPU architecture, which reduces the computational time drastically. On a single compute node, the dual-GPU version leads to a 39-fold improvement for the biggest instance compared to the optimal OpenMP parallel run (C code, icc compiler) with 16 CPU cores. Energy consumption measurements carried out on both CPU and GPU NCI tests show that the GPU approach provides substantial energy savings. © 2017 Wiley Periodicals, Inc.
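
    A CPU-side sketch of the promolecular-density part of an NCI calculation, using a single made-up exponential per atom instead of the real tabulated atomic densities; the per-grid-point sum over atoms is independent work, which is why the calculation maps so well onto GPU threads.

    ```python
    import numpy as np

    atoms = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 1.4]])   # two toy atoms
    coef, zeta = 0.3, 1.6                                  # invented parameters

    ax = np.linspace(-3, 4, 40)
    X, Y, Z = np.meshgrid(ax, ax, ax, indexing="ij")
    pts = np.stack([X, Y, Z], axis=-1)                     # (40,40,40,3) grid

    r = np.linalg.norm(pts[..., None, :] - atoms, axis=-1) # dists to each atom
    rho = (coef * np.exp(-zeta * r)).sum(axis=-1)          # promolecular rho

    grad = np.gradient(rho, ax, ax, ax)                    # numerical gradient
    gnorm = np.sqrt(sum(gi ** 2 for gi in grad))
    s = gnorm / (2 * (3 * np.pi ** 2) ** (1 / 3) * rho ** (4 / 3))
    print("min reduced density gradient:", float(s.min()))  # low s flags NCIs
    ```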

  6. Occupational risk identification using hand-held or laptop computers.

    PubMed

    Naumanen, Paula; Savolainen, Heikki; Liesivuori, Jyrki

    2008-01-01

    This paper describes the Work Environment Profile (WEP) program and its use in risk identification by computer. It is installed on a hand-held computer or a laptop to be used in risk identification during work site visits. A 5-category system is used to describe the identified risks in 7 groups, i.e., accidents, biological and physical hazards, ergonomic and psychosocial load, chemicals, and information technology hazards. Each group contains several qualifying factors. The 5 categories are colour-coded at this stage to aid visualization. Risk identification produces visual summary images whose interpretation is facilitated by the colours. The WEP program is a risk assessment tool that is easy to learn and use for both experts and nonprofessionals. It is especially well adapted for use in both small and larger enterprises. Considerable time is saved as no paper notes are needed.

  7. Examining Effects of Virtual Machine Settings on Voice over Internet Protocol in a Private Cloud Environment

    ERIC Educational Resources Information Center

    Liao, Yuan

    2011-01-01

    The virtualization of computing resources, as represented by the sustained growth of cloud computing, continues to thrive. Information Technology departments are building their private clouds due to the perception of significant cost savings by managing all physical computing resources from a single point and assigning them to applications or…

  8. The New Film Technologies: Computerized Video-Assisted Film Production.

    ERIC Educational Resources Information Center

    Mott, Donald R.

    Over the past few years, video technology has been used to assist film directors after they have shot a scene, to control costs, and to create special effects, especially computer assisted graphics. At present, a computer based editing system called "Film 5" combines computer technology and video tape with film to save as much as 50% of…

  9. The long term financial impacts of CVD: living standards in retirement.

    PubMed

    Schofield, Deborah; Kelly, Simon; Shrestha, Rupendra; Passey, Megan; Callander, Emily; Percival, Richard

    2012-03-22

    Cardiovascular disease (CVD) has significant economic costs; however, these are generally estimated for the present time, and little consideration is given to the long-term economic consequences. This study estimates the value of savings that those who retire early due to CVD will have accumulated by the time they reach the traditional retirement age of 65 years, and how much lower the value of these savings is compared with that of those who remain healthy and in the workforce. Using Health&WealthMOD - a microsimulation model of Australians aged 45 to 64 years - regression models were used to analyse the differences in projected savings and retirement incomes at age 65 between people currently working with no chronic condition and people not in the labour force due to CVD. Over 99% of individuals who are employed full-time will have accumulated some savings by age 65, whereas only 77% of those who are out of the labour force due to CVD will have done so. Those who retire early due to CVD will have a median value of total savings at age 65 of $1,833. This is far lower than the expected median value of savings for those who remain in the labour force full-time, who will have $281,841 of savings. Not only will early retirement due to cardiovascular disease limit the immediate income and wealth available to individuals, but it will also reduce their long-term financial capacity by reducing their savings. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  10. Irregular large-scale computed tomography on multiple graphics processors improves energy-efficiency metrics for industrial applications

    NASA Astrophysics Data System (ADS)

    Jimenez, Edward S.; Goodman, Eric L.; Park, Ryeojin; Orr, Laurel J.; Thompson, Kyle R.

    2014-09-01

    This paper investigates energy efficiency for various real-world industrial computed-tomography reconstruction algorithms, in both CPU- and GPU-based implementations. This work shows that the energy required for a given reconstruction is based on performance and problem size. There are many ways to describe performance and energy efficiency, so this work investigates multiple metrics including performance-per-watt, energy-delay product, and energy consumption. This work found that irregular GPU-based approaches realized tremendous savings in energy consumption when compared to CPU implementations, while also significantly improving the performance-per-watt and energy-delay product metrics. Additional energy savings and further metric improvements were realized on the GPU-based reconstructions by improving storage I/O through a parallel MIMD-like modularization of the compute and I/O tasks.

  11. A Web-based home welfare and care services support system using a pen type image sensor.

    PubMed

    Ogawa, Hidekuni; Yonezawa, Yoshiharu; Maki, Hiromichi; Sato, Haruhiko; Hahn, Allen W; Caldwell, W Morton

    2003-01-01

    A long-term care insurance law for elderly persons was put in force two years ago in Japan. The Home Helpers, who are employed by hospitals, care companies or the welfare office, provide home welfare and care services for the elderly, such as cooking, bathing, washing, cleaning, shopping, etc. We developed a web-based home welfare and care services support system using wireless Internet mobile phones and Internet client computers, which employs a pen type image sensor. The pen type image sensor is used by the elderly people as the entry device for their care requests. The client computer sends the requests to the server computer in the Home Helper central office, and then the server computer automatically transfers them to the Home Helper's mobile phone. This newly-developed home welfare and care services support system is easily operated by elderly persons and enables Home Helpers to save a significant amount of time and travel.

  12. Fast H.264/AVC FRExt intra coding using belief propagation.

    PubMed

    Milani, Simone

    2011-01-01

    In the H.264/AVC FRExt coder, the coding performance of Intra coding significantly surpasses that of previous still-image coding standards, such as JPEG2000, thanks to a massive use of spatial prediction. Unfortunately, the adoption of an extensive set of predictors induces a significant increase in the computational complexity required by the rate-distortion optimization routine. The paper presents a complexity-reduction strategy that aims at reducing the computational load of Intra coding with a small loss in compression performance. The proposed algorithm relies on selecting a reduced set of prediction modes according to their probabilities, which are estimated by a belief-propagation procedure. Experimental results show that the proposed method saves up to 60% of the coding time required by an exhaustive rate-distortion optimization method with a negligible loss in performance. Moreover, it permits accurate control of the computational complexity, unlike other methods where the computational complexity depends upon the coded sequence.
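
    A minimal sketch of the mode-reduction idea, not the paper's actual belief-propagation estimator: given per-mode probabilities, rate-distortion optimization is restricted to the smallest set of modes reaching a chosen cumulative-probability coverage. The mode count and probabilities below are illustrative placeholders:

        def reduced_mode_set(mode_probs, coverage=0.9):
            """Smallest set of modes whose probabilities sum past `coverage`."""
            ranked = sorted(range(len(mode_probs)), key=lambda m: -mode_probs[m])
            chosen, total = [], 0.0
            for m in ranked:
                chosen.append(m)
                total += mode_probs[m]
                if total >= coverage:
                    break
            return chosen

        probs = [0.30, 0.25, 0.15, 0.10, 0.08, 0.05, 0.04, 0.02, 0.01]
        print(reduced_mode_set(probs))     # RDO now searches 6 modes instead of 9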

  13. Streaming support for data intensive cloud-based sequence analysis.

    PubMed

    Issa, Shadi A; Kienzler, Romeo; El-Kalioby, Mohamed; Tonellato, Peter J; Wall, Dennis; Bruggmann, Rémy; Abouelhoda, Mohamed

    2013-01-01

    Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of "resources-on-demand" and "pay-as-you-go", scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client's site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks, where the NGS sequences can be processed independently from one another. We also provide the elastream package that supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation.
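
    The core of the scheme is that per-read analysis starts while the data is still in flight. A hedged sketch of that overlap, with a GC count standing in for a real analysis task and FASTQ assumed as the record format (the elastream package itself is not used here):

        import sys

        def stream_reads(handle):
            """Yield one 4-line FASTQ record at a time from a stream."""
            while True:
                record = [handle.readline() for _ in range(4)]
                if not record[0]:          # end of stream
                    return
                yield record

        def analyze(record):               # stand-in per-read task
            seq = record[1].strip()
            return seq.count("G") + seq.count("C")

        # reads are analyzed as they arrive, e.g.:  curl ... | python this.py
        print("GC bases seen:", sum(analyze(r) for r in stream_reads(sys.stdin)))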

  14. Use of agents to implement an integrated computing environment

    NASA Technical Reports Server (NTRS)

    Hale, Mark A.; Craig, James I.

    1995-01-01

    Integrated Product and Process Development (IPPD) embodies the simultaneous application of both systems and quality engineering methods throughout an iterative design process. The use of IPPD results in the time-conscious, cost-saving development of engineering systems. To implement IPPD, a Decision-Based Design perspective is encapsulated in an approach that focuses on the role of the human designer in product development. The approach has two parts and is outlined in this paper. First, an architecture, called DREAMS, is being developed that facilitates design from a decision-based perspective. Second, a supporting computing infrastructure, called IMAGE, is being designed. Agents are used to implement the overall infrastructure on the computer. Successful agent utilization requires that they be made of three components: the resource, the model, and the wrap. Current work is focused on the development of generalized agent schemes and associated demonstration projects. When in place, the technology-independent computing infrastructure will aid the designer in systematically generating knowledge used to facilitate decision-making.

  15. [THE FAILURE MODES AND EFFECTS ANALYSIS FACILITATES A SAFE, TIME AND MONEY SAVING OPEN ACCESS COLONOSCOPY SERVICE].

    PubMed

    Gingold-Belfer, Rachel; Niv, Yaron; Horev, Nehama; Gross, Shuli; Sahar, Nadav; Dickman, Ram

    2017-04-01

    Failure modes and effects analysis (FMEA) is used for the identification of potential risks in health care processes. We used a specific FMEA-based form for direct referral for colonoscopy and assessed it for procedure-related perforations. Ten experts in endoscopy evaluated the entire referral process, the modes of preparation for the endoscopic procedure, the endoscopic procedure itself, and the discharge process. We used FMEA to assess likelihood of occurrence, detection and severity, and calculated the risk profile number (RPN) for each of the above points. According to the highest RPN results we designed a specific open access referral form and then compared the occurrence of colonic perforations (between 2010 and 2013) in patients who were referred through the open access arm (Group 1) to those who had a prior clinical consultation (non-open access, Group 2). Our experts in endoscopy (5 physicians and 5 nurses) identified 3 categories of failure modes that, on average, reached the highest RPNs. We identified 9,558 colonoscopies in group 1, and 12,567 in group 2. Perforations were identified in three patients from the open access group (1:3186, 0.03%) and in 10 from group 2 (1:1256, 0.07%) (p = 0.024). Direct referral for colonoscopy saved 9,558 pre-procedure consultations and a total of $850,000. The FMEA tool-based specific referral form facilitates a safe, time and money saving open access colonoscopy service. Our form may be adopted by other gastroenterological clinics in Israel.
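
    A small illustration of the FMEA scoring described above, assuming the conventional product form for the risk profile number, RPN = occurrence x severity x detectability, with hypothetical failure modes and 1-10 ratings:

        failure_modes = {                        # hypothetical (O, S, D) ratings
            "incomplete bowel preparation": (7, 5, 4),
            "anticoagulants not stopped":   (4, 9, 5),
            "referral form incomplete":     (6, 6, 3),
        }
        rpn = {name: o * s * d for name, (o, s, d) in failure_modes.items()}
        for name, score in sorted(rpn.items(), key=lambda kv: -kv[1]):
            print(f"RPN {score:3d}  {name}")     # highest-RPN modes drive redesign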

  16. Structure and application of an interface program between a geographic-information system and a ground-water flow model

    USGS Publications Warehouse

    Van Metre, P.C.

    1990-01-01

    A computer-program interface between a geographic-information system and a groundwater flow model links two unrelated software systems for use in developing the flow models. The interface program allows the modeler to compile and manage geographic components of a groundwater model within the geographic information system. A significant savings of time and effort is realized in developing, calibrating, and displaying the groundwater flow model. Four major guidelines were followed in developing the interface program: (1) no changes to the groundwater flow model code were to be made; (2) a data structure was to be designed within the geographic information system that follows the same basic data structure as the groundwater flow model; (3) the interface program was to be flexible enough to support all basic data options available within the model; and (4) the interface program was to be as efficient as possible in terms of computer time used and online-storage space needed. Because some programs in the interface are written in control-program language, the interface will run only on a computer with the PRIMOS operating system. (USGS)

  17. [Saving motives in young, middle-aged, and older adults. Preliminary results of a new inventory for exploring lifespan saving motives].

    PubMed

    Rager, B; Lang, F R; Wagner, G G

    2012-12-01

    There is some research in the economic sciences on personal reasons for saving money. However, not much is known about age differences in saving motives. In this vein, future time perspective (FTP) is known to play a critical role for motivation across the life span. In this study, we introduce a new Saving Motive Inventory (SMI), which also covers saving goals after retirement. Furthermore, it is argued that additional saving motives exist that are not based on economic models of life-cycle saving. In accordance with socio-emotional selectivity theory, we explored age differences in an online survey with 496 participants from young (19-44 years), middle-aged (45-64 years), and older (65-86 years) adulthood, who completed a questionnaire on saving motives, personality, and future-related thinking (e.g., Future Time Perspective Scale, Life Orientation Test). Results of the exploratory factor analysis (EFA) are consistent with the theoretical expectations. The factors are generativity, educational investment, consumption, indifference, and provision for death and dying. Together these five factors account for 67% of the variance. In general, the inventory is reliable and valid with respect to the expected internal and external criteria. It contributes to a better understanding of saving motives over the lifespan, especially with respect to effects of the future time perspective.

  18. Standardized ultrasound templates for diagnosing appendicitis reduce annual imaging costs.

    PubMed

    Nordin, Andrew B; Sales, Stephen; Nielsen, Jason W; Adler, Brent; Bates, David Gregory; Kenney, Brian

    2018-01-01

    Ultrasound is preferred over computed tomography (CT) for diagnosing appendicitis in children to avoid undue radiation exposure. We previously reported our experience in instituting a standardized appendicitis ultrasound template, which decreased CT rates by 67.3%. In this analysis, we demonstrate the ongoing cost savings associated with using this template. Retrospective chart review for the time period preceding template implementation (June 2012-September 2012) was combined with prospective review through December 2015 for all patients in the emergency department receiving diagnostic imaging for appendicitis. The type of imaging was recorded, and imaging rates and ultrasound test statistics were calculated. Estimated annual imaging costs based on pretemplate ultrasound and CT utilization rates were compared with post-template annual costs to calculate annual and cumulative savings. In the pretemplate period, ultrasound and CT rates were 80.2% and 44.3%, respectively, resulting in a combined annual cost of $300,527.70. Similar calculations were performed for each succeeding year, accounting for changes in patient volume. Using pretemplate rates, our projected 2015 imaging cost was $371,402.86; however, our ultrasound rate had increased to 98.3%, whereas the CT rate declined to 9.6%, yielding an annual estimated cost of $224,853.00 and a savings of $146,549.86. Since implementation, annual savings have steadily increased for a cumulative cost savings of $336,683.83. Standardizing ultrasound reports for appendicitis not only reduces the use of CT scans and the associated radiation exposure but also decreases annual imaging costs despite increased numbers of imaging studies. Continued cost reduction may be possible by using diagnostic algorithms. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. The effectiveness of the 55 MPH national maximum speed limit as a life saving benefit

    DOT National Transportation Integrated Search

    1980-10-01

    The report contains an analysis of the life saving benefits resulting from the 55 mph NMSL from 1974-1979. Monthly fatality data from 1970-1979 was used in a time series model to arrive at the estimated safety benefits (lives saved). The time series ...

  20. An energy saving mechanism of EPON networks for real time video transmission

    NASA Astrophysics Data System (ADS)

    Liu, Chien-Ping; Wu, Ho-Ting; Chiang, Yun-Ting; Chien, Shieh-Chieh; Ke, Kai-Wei

    2015-07-01

    Modern access networks are widely constructed from passive optical networks (PONs) to meet the growing bandwidth demand. However, higher bandwidth means more energy consumption. To save energy, a few research works propose a dual-mode energy saving mechanism that allows the ONU to operate between active and sleep modes periodically. However, such a dual-mode energy saving design may induce unnecessary power consumption or increased packet delay in the case where only downstream data exist for most of the time. In this paper, we propose a new tri-mode energy saving scheme for Ethernet PON (EPON). The new tri-mode energy saving design, combining the dual-mode saving mechanism with a doze mode, allows the ONU to switch among these three modes alternately. In the doze mode, the ONU may receive downstream data while keeping its transmitter closed. Such a scenario is often observed for real time video downstream transmission. Furthermore, low packet delay for high priority upstream data can be attained through the early wake-up mechanism employed in both energy saving modes. Energy saving and system efficiency can thus be achieved jointly while maintaining differentiated QoS for data with various priorities. Performance results via simulation have demonstrated the effectiveness of this mechanism.
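
    A toy state machine for the tri-mode design, under assumed per-mode power draws and a fixed cycle length (all numbers hypothetical): the ONU dozes when traffic is downstream-only, sleeps when idle, and wakes early when upstream data arrives:

        POWER_W = {"active": 4.0, "doze": 2.0, "sleep": 0.5}   # hypothetical draws

        def next_mode(downstream, upstream, urgent_upstream):
            if upstream or urgent_upstream:    # early wake-up for upstream data
                return "active"
            if downstream:                     # receive only: transmitter stays off
                return "doze"
            return "sleep"

        energy = 0.0
        cycle_s = 1e-3                         # fixed cycle length, s
        traffic = [(1, 0, 0), (1, 0, 0), (0, 0, 0), (1, 1, 0), (1, 0, 1)]
        for down, up, urgent in traffic:       # (downstream, upstream, urgent) flags
            mode = next_mode(down, up, urgent)
            energy += POWER_W[mode] * cycle_s  # E = P * t per cycle
        print(f"{energy * 1e3:.2f} mJ over {len(traffic)} cycles")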

  1. Accurate acceleration of kinetic Monte Carlo simulations through the modification of rate constants.

    PubMed

    Chatterjee, Abhijit; Voter, Arthur F

    2010-05-21

    We present a novel computational algorithm called the accelerated superbasin kinetic Monte Carlo (AS-KMC) method that enables a more efficient study of rare-event dynamics than the standard KMC method while maintaining control over the error. In AS-KMC, the rate constants for processes that are observed many times are lowered during the course of a simulation. As a result, rare processes are observed more frequently than in KMC and the time progresses faster. We first derive error estimates for AS-KMC when the rate constants are modified. These error estimates are next employed to develop a procedure for lowering process rates with control over the maximum error. Finally, numerical calculations are performed to demonstrate that the AS-KMC method captures the correct dynamics, while providing significant CPU savings over KMC in most cases. We show that the AS-KMC method can be employed with any KMC model, even when no time scale separation is present (although in such cases no computational speed-up is observed), without requiring the knowledge of various time scales present in the system.
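
    A heavily simplified sketch of the AS-KMC idea, not the authors' full method: a standard KMC loop in which any process observed many times has its rate constant lowered, so rare events surface sooner. The threshold and scaling factor are arbitrary here; in AS-KMC they follow from the derived error estimates:

        import math, random

        rates = {"hop_A": 1e6, "hop_B": 9e5, "rare_escape": 1.0}
        counts = dict.fromkeys(rates, 0)
        t = 0.0
        for _ in range(100_000):
            total = sum(rates.values())
            r = random.random() * total            # pick a process proportionally
            acc = 0.0
            for proc, k in rates.items():
                acc += k
                if r <= acc:
                    break
            counts[proc] += 1
            t += -math.log(random.random()) / total  # exponential time increment
            if counts[proc] % 1000 == 0:            # "observed many times":
                rates[proc] *= 0.1                  # lower its rate constant
        print(counts, f"t = {t:.3e}")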

  2. Prefetching in file systems for MIMD multiprocessors

    NASA Technical Reports Server (NTRS)

    Kotz, David F.; Ellis, Carla Schlatter

    1990-01-01

    The question of whether prefetching blocks of a file into the block cache can effectively reduce overall execution time of a parallel computation, even under favorable assumptions, is considered. Experiments have been conducted with an interleaved file system testbed on the Butterfly Plus multiprocessor. Results of these experiments suggest that (1) the hit ratio, the accepted measure in traditional caching studies, may not be an adequate measure of performance when the workload consists of parallel computations and parallel file access patterns, (2) caching with prefetching can significantly improve the hit ratio and the average time to perform an I/O (input/output) operation, and (3) an improvement in overall execution time has been observed in most cases. In spite of these gains, prefetching sometimes results in increased execution times (a negative result, given the optimistic nature of the study). The authors explore why it is not trivial to translate savings on individual I/O requests into consistently better overall performance and identify the key problems that need to be addressed in order to improve the potential of prefetching techniques in the environment.
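
    A toy version of the hit-ratio comparison, assuming an LRU block cache and one-block-lookahead prefetching on a purely sequential access pattern (the favorable case; as the abstract notes, a better hit ratio need not translate into better overall parallel performance):

        from collections import OrderedDict

        def run(accesses, cache_size, prefetch):
            cache, hits = OrderedDict(), 0
            def touch(block):
                cache[block] = None
                cache.move_to_end(block)
                if len(cache) > cache_size:
                    cache.popitem(last=False)          # evict least-recently used
            for block in accesses:
                if block in cache:
                    hits += 1
                touch(block)
                if prefetch:
                    touch(block + 1)                   # fetch the next block early
            return hits / len(accesses)

        blocks = list(range(1000))                     # sequential read pattern
        print("no prefetch:", run(blocks, 64, False))  # 0.0
        print("prefetch   :", run(blocks, 64, True))   # ~0.999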

  3. Reinforcement Learning and Savings Behavior.

    PubMed

    Choi, James J; Laibson, David; Madrian, Brigitte C; Metrick, Andrew

    2009-12-01

    We show that individual investors over-extrapolate from their personal experience when making savings decisions. Investors who experience particularly rewarding outcomes from saving in their 401(k)-a high average and/or low variance return-increase their 401(k) savings rate more than investors who have less rewarding experiences with saving. This finding is not driven by aggregate time-series shocks, income effects, rational learning about investing skill, investor fixed effects, or time-varying investor-level heterogeneity that is correlated with portfolio allocations to stock, bond, and cash asset classes. We discuss implications for the equity premium puzzle and interventions aimed at improving household financial outcomes.

  4. MARVEL: A knowledge-based productivity enhancement tool for real-time multi-mission and multi-subsystem spacecraft operations

    NASA Astrophysics Data System (ADS)

    Schwuttke, Ursula M.; Veregge, John R.; Angelino, Robert; Childs, Cynthia L.

    1990-10-01

    The Monitor/Analyzer of Real-time Voyager Engineering Link (MARVEL) is described. It is the first automation tool to be used in an online mode for telemetry monitoring and analysis in mission operations. MARVEL combines standard automation techniques with embedded knowledge base systems to simultaneously provide real time monitoring of data from subsystems, near real time analysis of anomaly conditions, and both real time and non-real time user interface functions. MARVEL is currently capable of monitoring the Computer Command Subsystem (CCS), Flight Data Subsystem (FDS), and Attitude and Articulation Control Subsystem (AACS) for both Voyager spacecraft, simultaneously, on a single workstation. The goal of MARVEL is to provide cost savings and productivity enhancement in mission operations and to reduce the need for constant availability of subsystem expertise.

  5. Automatic target detection using binary template matching

    NASA Astrophysics Data System (ADS)

    Jun, Dong-San; Sun, Sun-Gu; Park, HyunWook

    2005-03-01

    This paper presents a new automatic target detection (ATD) algorithm to detect targets such as battle tanks and armored personnel carriers in ground-to-ground scenarios. Whereas most ATD algorithms were developed for forward-looking infrared (FLIR) images, we have developed an ATD algorithm for charge-coupled device (CCD) images, which have superior quality to FLIR images in daylight. The proposed algorithm uses fast binary template matching with an adaptive binarization, which is robust to various light conditions in CCD images and saves computation time. Experimental results show that the proposed method has good detection performance.
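
    A minimal sketch of the two stages named above, assuming a local-mean adaptive threshold and a pixel-agreement score for the binary matching; the window size and image sizes are arbitrary:

        import numpy as np
        from scipy.ndimage import uniform_filter

        def binarize(img, win=15):
            """Adaptive binarization: threshold each pixel on its local mean."""
            return img > uniform_filter(img.astype(float), size=win)

        def match(binary, tmpl):
            """Exhaustive binary matching scored by fraction of agreeing pixels."""
            (H, W), (h, w) = binary.shape, tmpl.shape
            best, pos = -1.0, None
            for y in range(H - h + 1):
                for x in range(W - w + 1):
                    score = np.mean(binary[y:y + h, x:x + w] == tmpl)
                    if score > best:
                        best, pos = score, (y, x)
            return pos, best

        img = np.random.default_rng(0).integers(0, 256, (64, 64))
        tmpl = binarize(img)[10:20, 15:30]             # plant a known target
        print(match(binarize(img), tmpl))              # ((10, 15), 1.0) here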

  6. On numerical solution of the Schrödinger equation: the shooting method revisited

    NASA Astrophysics Data System (ADS)

    Indjin, D.; Todorović, G.; Milanović, V.; Ikonić, Z.

    1995-09-01

    An alternative formulation of the "shooting" method for the numerical solution of the Schrödinger equation is described for cases of a general asymmetric one-dimensional potential (planar geometry) and a spherically symmetric potential. The method relies on matching the asymptotic wavefunctions and the potential core region wavefunctions in the course of finding bound-state energies. It is demonstrated on the examples of the Morse and Kratzer potentials, where high accuracy of the calculated eigenvalues is found, together with a considerable saving of computation time.
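
    For contrast with the paper's matching formulation, a plain one-sided shooting sketch for -psi'' + V(x) psi = E psi: integrate from the left boundary and bisect on E until the wavefunction vanishes at the right boundary. The harmonic well V = x^2 is used, for which the exact ground state is E0 = 1 in these units:

        import numpy as np

        def psi_end(E, x, V):
            dx = x[1] - x[0]
            psi = np.zeros_like(x)
            psi[1] = 1e-6                      # tiny kick off the left boundary
            for i in range(1, len(x) - 1):     # psi'' = (V - E) psi, discretized
                psi[i + 1] = 2 * psi[i] - psi[i - 1] + dx**2 * (V[i] - E) * psi[i]
            return psi[-1]

        x = np.linspace(-6.0, 6.0, 2001)
        V = x**2
        lo, hi = 0.5, 1.5                      # bracket around the ground state
        for _ in range(60):                    # bisect on the boundary value
            mid = 0.5 * (lo + hi)
            if psi_end(lo, x, V) * psi_end(mid, x, V) <= 0.0:
                hi = mid
            else:
                lo = mid
        print("E0 =", 0.5 * (lo + hi))         # -> 1.000 (exact: 1)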

  7. Computational tools for multi-linked flexible structures

    NASA Technical Reports Server (NTRS)

    Lee, Gordon K. F.; Brubaker, Thomas A.; Shults, James R.

    1990-01-01

    A software module which designs and tests controllers and filters in Kalman Estimator form, based on a polynomial state-space model is discussed. The user-friendly program employs an interactive graphics approach to simplify the design process. A variety of input methods are provided to test the effectiveness of the estimator. Utilities are provided which address important issues in filter design such as graphical analysis, statistical analysis, and calculation time. The program also provides the user with the ability to save filter parameters, inputs, and outputs for future use.

  8. SLAM, a Mathematica interface for SUSY spectrum generators

    NASA Astrophysics Data System (ADS)

    Marquard, Peter; Zerf, Nikolai

    2014-03-01

    We present and publish a Mathematica package, which can be used to automatically obtain any numerical MSSM input parameter from SUSY spectrum generators, which follow the SLHA standard, like SPheno, SOFTSUSY, SuSeFLAV or Suspect. The package enables a very comfortable way of numerical evaluations within the MSSM using Mathematica. It implements easy-to-use predefined high-scale and low-scale scenarios like mSUGRA or mhmax and, if needed, enables the user to directly specify the input required by the spectrum generators. In addition it supports automatic saving and loading of SUSY spectra to and from a SQL database, avoiding the rerun of a spectrum generator for a known spectrum.
    Catalogue identifier: AERX_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERX_v1_0.html
    Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 4387
    No. of bytes in distributed program, including test data, etc.: 37748
    Distribution format: tar.gz
    Programming language: Mathematica
    Computer: Any computer where Mathematica version 6 or higher is running, providing bash and sed
    Operating system: Linux
    Classification: 11.1
    External routines: A SUSY spectrum generator such as SPheno, SOFTSUSY, SuSeFLAV or SUSPECT
    Nature of problem: Interfacing published spectrum generators for automated creation, saving and loading of SUSY particle spectra
    Solution method: SLAM automatically writes/reads SLHA spectrum generator input/output and is able to save/load generated data in/from a data base
    Restrictions: No general restrictions, specific restrictions are given in the manuscript
    Running time: A single spectrum calculation takes much less than one second on a modern PC

  9. Online Learners’ Reading Ability Detection Based on Eye-Tracking Sensors

    PubMed Central

    Zhan, Zehui; Zhang, Lei; Mei, Hu; Fong, Patrick S. W.

    2016-01-01

    The detection of university online learners’ reading ability is generally problematic and time-consuming. Thus, eye-tracking sensors were employed in this study to record temporal and spatial human eye movements. Learners’ pupil size, blinks, fixations, saccades, and regressions are recognized as primary indicators for detecting reading ability. A computational model is established from the empirical eye-tracking data by applying a multi-feature regularized machine-learning mechanism based on a low-rank constraint. The model shows good generalization ability, with an error of only 4.9% averaged over 100 random runs. It has clear advantages in saving time and improving precision, requiring only 20 min of testing to predict an individual learner’s reading ability. PMID:27626418

  10. Design synthesis and optimization of permanent magnet synchronous machines based on computationally-efficient finite element analysis

    NASA Astrophysics Data System (ADS)

    Sizov, Gennadi Y.

    In this dissertation, a model-based multi-objective optimal design of permanent magnet ac machines, supplied by sine-wave current regulated drives, is developed and implemented. The design procedure uses an efficient electromagnetic finite element-based solver to accurately model nonlinear material properties and complex geometric shapes associated with magnetic circuit design. Application of an electromagnetic finite element-based solver allows for accurate computation of intricate performance parameters and characteristics. The first contribution of this dissertation is the development of a rapid computational method that allows accurate and efficient exploration of large multi-dimensional design spaces in search of optimum design(s). The computationally efficient finite element-based approach developed in this work provides a framework of tools that allow rapid analysis of synchronous electric machines operating under steady-state conditions. In the developed modeling approach, major steady-state performance parameters such as, winding flux linkages and voltages, average, cogging and ripple torques, stator core flux densities, core losses, efficiencies and saturated machine winding inductances, are calculated with minimum computational effort. In addition, the method includes means for rapid estimation of distributed stator forces and three-dimensional effects of stator and/or rotor skew on the performance of the machine. The second contribution of this dissertation is the development of the design synthesis and optimization method based on a differential evolution algorithm. The approach relies on the developed finite element-based modeling method for electromagnetic analysis and is able to tackle large-scale multi-objective design problems using modest computational resources. Overall, computational time savings of up to two orders of magnitude are achievable, when compared to current and prevalent state-of-the-art methods. These computational savings allow one to expand the optimization problem to achieve more complex and comprehensive design objectives. The method is used in the design process of several interior permanent magnet industrial motors. The presented case studies demonstrate that the developed finite element-based approach practically eliminates the need for using less accurate analytical and lumped parameter equivalent circuit models for electric machine design optimization. The design process and experimental validation of the case-study machines are detailed in the dissertation.
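
    The optimization stage can be sketched with SciPy's differential evolution; here a cheap analytic stand-in (a hypothetical ripple/loss trade-off in two per-unit design variables) replaces the finite element-based objective of the dissertation:

        from scipy.optimize import differential_evolution

        def objective(x):                      # hypothetical ripple/loss surrogate
            magnet_width, slot_opening = x
            ripple = (magnet_width - 0.7)**2 + 0.3 * (slot_opening - 0.25)**2
            loss = 0.1 * magnet_width * slot_opening
            return ripple + loss               # weighted single-objective stand-in

        bounds = [(0.4, 0.9), (0.1, 0.5)]      # per-unit geometric design variables
        result = differential_evolution(objective, bounds, seed=1)
        print(result.x, result.fun)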

  11. Energy Savings Lifetimes and Persistence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, Ian M.; Schiller, Steven R.; Todd, Annika

    2016-02-01

    This technical brief explains the concepts of energy savings lifetimes and savings persistence and discusses how program administrators use these factors to calculate savings for efficiency measures, programs and portfolios. Savings lifetime is the length of time that one or more energy efficiency measures or activities save energy, and savings persistence is the change in savings throughout the functional life of a given efficiency measure or activity. Savings lifetimes are essential for assessing the lifecycle benefits and cost effectiveness of efficiency activities and for forecasting loads in resource planning. The brief also provides estimates of savings lifetimes derived from a national collection of costs and savings for electric efficiency programs and portfolios.

  12. A New Evaluation Method of Stored Heat Effect of Reinforced Concrete Wall of Cold Storage

    NASA Astrophysics Data System (ADS)

    Nomura, Tomohiro; Murakami, Yuji; Uchikawa, Motoyuki

    Today it has become imperative to save energy by intermittently operating the refrigerator of a cold storage built with externally insulated reinforced-concrete walls. The aim of the paper is an evaluation method capable of numerically calculating the interval time for which the refrigerator can remain stopped, using the reinforced concrete wall as a heat store. Experiments with concrete models were performed in order to examine the time variation of the internal temperature after the refrigerator stopped. In addition, a simulation method using three-dimensional unsteady FEM, suitable for personal computers, was introduced to easily analyze the internal temperature variation. Using this method, it is possible to obtain the time variation of the internal temperature and to calculate the interval time for stopping the refrigerator.
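
    A lumped-capacitance stand-in for the FEM calculation described: after the refrigerator stops, the interior relaxes exponentially toward ambient, and the allowable stopping interval is the time to reach the upper temperature limit. The wall heat capacity and loss conductance below are hypothetical:

        import math

        T_amb, T0, T_limit = 30.0, -20.0, -15.0   # ambient, start, allowed max (degC)
        UA = 150.0           # overall heat-loss conductance, W/K (hypothetical)
        C = 8.0e8            # heat capacity of air + concrete walls, J/K (hypothetical)
        tau = C / UA         # time constant of the room, s

        # T(t) = T_amb + (T0 - T_amb) * exp(-t / tau); solve T(t_stop) = T_limit
        t_stop = -tau * math.log((T_limit - T_amb) / (T0 - T_amb))
        print(f"refrigerator can stay off for {t_stop / 3600.0:.1f} hours")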

  13. Development of energy-saving devices for a full slow-speed ship through improving propulsion performance

    NASA Astrophysics Data System (ADS)

    Kim, Jung-Hun; Choi, Jung-Eun; Choi, Bong-Jun; Chung, Seok-Ho; Seo, Heung-Won

    2015-06-01

    Energy-saving devices for a 317K VLCC have been developed from a propulsion standpoint. Two ESD candidates were designed via computational tools. The first device, WAFon, consists of flow-control fins adapted to the ship's wake to reduce the loss of rotational energy. The other is WAFon-D, a WAFon with a duct added to obtain additional thrust and to make the inflow velocity distribution on the propeller plane more uniform. After selecting the candidates from the computed results, the speed performances were validated with model tests. The hydrodynamic benefit of the ESDs appears as improved hull and propulsive efficiencies through an increased wake fraction.

  14. Operating Room Time Savings with the Use of Splint Packs: A Randomized Controlled Trial

    PubMed Central

    Gonzalez, Tyler A.; Bluman, Eric M.; Palms, David; Smith, Jeremy T.; Chiodo, Christopher P.

    2016-01-01

    Background: The most expensive variable in the operating room (OR) is time. Lean Process Management is being used in the medical field to improve efficiency in the OR. Streamlining individual processes within the OR is crucial to a comprehensive time-saving and cost-cutting health care strategy. At our institution, one hour of OR time costs approximately $500, exclusive of supply and personnel costs. Commercially prepared splint packs (SP) contain all components necessary for plaster-of-Paris short-leg splint application and have the potential to decrease splint application time and overall costs by making splinting a leaner process. We conducted a randomized controlled trial comparing OR time savings between SP use and bulk supply (BS) splint application. Methods: Fifty consecutive adult operative patients on whom post-operative short-leg splint immobilization was indicated were randomized to either a control group using BS or an experimental group using SP. One orthopaedic surgeon (EMB) prepared and applied all of the splints in a standardized fashion. Retrieval time, preparation time, splint application time, and total splinting time for both groups were measured and statistically analyzed. Results: The retrieval time, preparation time, and total splinting time were significantly less (p<0.001) in the SP group compared with the BS group. There was no significant difference in application time between the SP group and the BS group. Conclusion: The use of SP made the splinting process leaner, saving an average of 2 minutes 52 seconds in total splinting time compared with BS and making it an effective cost-cutting and time-saving technique. For high-volume ORs, use of splint packs may contribute to substantial time and cost savings without impacting patient safety. PMID:26894212

  15. The increasing importance of a continence nurse specialist to improve outcomes and save costs of urinary incontinence care: an analysis of future policy scenarios.

    PubMed

    Franken, Margreet G; Corro Ramos, Isaac; Los, Jeanine; Al, Maiwenn J

    2018-02-17

    In an ageing population, it is inevitable to improve the management of care for community-dwelling elderly with incontinence. A previous study showed that implementation of the Optimum Continence Service Specification (OCSS) for urinary incontinence in community-dwelling elderly with four or more chronic diseases results in a reduction of urinary incontinence, an improved quality of life, and lower healthcare and societal costs. The aim of this study was to explore the future consequences of the OCSS strategy under various healthcare policy scenarios in an ageing population. We adapted a previously developed decision-analytical model in which the OCSS new care strategy was operationalised as the appointment of a continence nurse specialist located within the general practice in The Netherlands. We used a societal perspective including healthcare costs (healthcare providers, treatment costs, insured containment products, insured home care) and societal costs (informal caregiving, containment products paid out-of-pocket, travelling expenses, home care paid out-of-pocket). All outcomes were computed over a three-year time period using two different base years (2014 and 2030). Settings for future policy scenarios were based on desk research and expert opinion. Our results show that implementation of the OCSS new care strategy for urinary incontinence would yield large health gains in community-dwelling elderly (2030: 2592-2618 QALYs gained) and large cost savings in The Netherlands (2030: healthcare perspective: €32.4 million - €72.5 million; societal perspective: €182.0 million - €250.6 million). Savings can be generated in different categories, depending on healthcare policy. The uncertainty analyses and extreme-case scenarios showed the robustness of the results. Implementation of the OCSS new care strategy for urinary incontinence results in an improvement in the quality of life of community-dwelling elderly, a reduction of the costs for payers and affected elderly, and a reduction in time invested by carers. Various realistic policy scenarios even forecast larger health gains and cost savings in the future. More importantly, the longer implementation is postponed, the larger the savings foregone. The future organisation of healthcare affects the category in which the greatest savings will be generated.

  16. A Parallel Sliding Region Algorithm to Make Agent-Based Modeling Possible for a Large-Scale Simulation: Modeling Hepatitis C Epidemics in Canada.

    PubMed

    Wong, William W L; Feng, Zeny Z; Thein, Hla-Hla

    2016-11-01

    Agent-based models (ABMs) are computer simulation models that define interactions among agents and simulate emergent behaviors that arise from the ensemble of local decisions. ABMs have been increasingly used to examine trends in infectious disease epidemiology. However, the main limitation of ABMs is the high computational cost for a large-scale simulation. To improve the computational efficiency for large-scale ABM simulations, we built a parallelizable sliding region algorithm (SRA) for ABM and compared it to a nonparallelizable ABM. We developed a complex agent network and performed two simulations to model hepatitis C epidemics based on real demographic data from Saskatchewan, Canada. The first simulation used the SRA, which processed each postal-code subregion in turn. The second simulation processed the entire population simultaneously. It was concluded that the parallelizable SRA showed computational time savings with comparable results in a province-wide simulation. Using the same method, the SRA can be generalized to perform a country-wide simulation. Thus, this parallel algorithm enables the possibility of using ABMs for large-scale simulations with limited computational resources.
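
    A minimal sketch of the region-parallel idea, with agents bucketed by subregion and each bucket simulated in a worker process; cross-boundary interactions, which the real SRA must handle, are omitted here:

        import random
        from multiprocessing import Pool

        def simulate_region(args):
            """Advance one subregion's agents independently (toy dynamics)."""
            region, agents = args
            rng = random.Random(region)               # per-region reproducibility
            infected = sum(rng.random() < a["risk"] for a in agents)
            return region, infected

        regions = {r: [{"risk": 0.01}] * 10_000 for r in range(8)}

        if __name__ == "__main__":
            with Pool(4) as pool:
                for region, infected in pool.map(simulate_region, regions.items()):
                    print(region, infected)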

  17. [Measures to reduce lighting-related energy use and costs at hospital nursing stations].

    PubMed

    Su, Chiu-Ching; Chen, Chen-Hui; Chen, Shu-Hwa; Ping, Tsui-Chu

    2011-06-01

    Hospitals have long been expected to deliver medical services in an environment that is comfortable and bright. This expectation keeps hospital energy demand stubbornly high and energy costs spiraling due to escalating utility fees. Hospitals must identify appropriate strategies to control electricity usage in order to control operating costs effectively. This paper proposes several electricity saving measures that both support government policies aimed at reducing global warming and help reduce energy consumption at the authors' hospital. The authors held educational seminars, established a website teaching energy saving methods, maximized facility and equipment use effectiveness (e.g., adjusting lamp placements, power switch and computer saving modes), posted signs promoting electricity saving, and established a regularized energy saving review mechanism. After implementation, average nursing staff energy saving knowledge had risen from 71.8% to 100% and total nursing station electricity costs fell from NT$16,456 to NT$10,208 per month, representing an effective monthly savings of 37.9% (NT$6,248). This project demonstrated the ability of a program designed to slightly modify nursing staff behavior to achieve effective and meaningful results in reducing overall electricity use.

  18. 31 CFR 315.10 - Limitations.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... each series of bonds and savings notes for each specific year, which has varied from time to time, can..., DEPARTMENT OF THE TREASURY BUREAU OF THE PUBLIC DEBT REGULATIONS GOVERNING U.S. SAVINGS BONDS, SERIES A, B, C.... Specific limitations have been placed on the amounts of bonds of each series and savings notes that might...

  19. 31 CFR 315.10 - Limitations.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... each series of bonds and savings notes for each specific year, which has varied from time to time, can..., DEPARTMENT OF THE TREASURY BUREAU OF THE PUBLIC DEBT REGULATIONS GOVERNING U.S. SAVINGS BONDS, SERIES A, B, C.... Specific limitations have been placed on the amounts of bonds of each series and savings notes that might...

  20. 31 CFR 315.10 - Limitations.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... each series of bonds and savings notes for each specific year, which has varied from time to time, can..., DEPARTMENT OF THE TREASURY BUREAU OF THE PUBLIC DEBT REGULATIONS GOVERNING U.S. SAVINGS BONDS, SERIES A, B, C.... Specific limitations have been placed on the amounts of bonds of each series and savings notes that might...

  1. 31 CFR 315.10 - Limitations.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... each series of bonds and savings notes for each specific year, which has varied from time to time, can..., DEPARTMENT OF THE TREASURY BUREAU OF THE PUBLIC DEBT REGULATIONS GOVERNING U.S. SAVINGS BONDS, SERIES A, B, C.... Specific limitations have been placed on the amounts of bonds of each series and savings notes that might...

  2. Predicting Cloud Computing Technology Adoption by Organizations: An Empirical Integration of Technology Acceptance Model and Theory of Planned Behavior

    ERIC Educational Resources Information Center

    Ekufu, ThankGod K.

    2012-01-01

    Organizations are finding it difficult in today's economy to implement the vast information technology infrastructure required to effectively conduct their business operations. Despite the fact that some of these organizations are leveraging on the computational powers and the cost-saving benefits of computing on the Internet cloud, others…

  3. Analyzing Log Files to Predict Students' Problem Solving Performance in a Computer-Based Physics Tutor

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2015-01-01

    This study investigates whether information saved in the log files of a computer-based tutor can be used to predict the problem solving performance of students. The log files of a computer-based physics tutoring environment called Andes Physics Tutor was analyzed to build a logistic regression model that predicted success and failure of students'…
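
    A hedged sketch of the analysis described, with hypothetical per-student log features (hints requested, errors made, median seconds between steps) feeding scikit-learn's logistic regression:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # columns: hints requested, errors made, median seconds between steps
        X = np.array([[1, 2, 14], [7, 9, 55], [0, 1, 10], [5, 6, 40], [2, 3, 20]])
        y = np.array([1, 0, 1, 0, 1])                    # 1 = problem solved
        model = LogisticRegression().fit(X, y)
        print(model.predict_proba([[3, 4, 30]])[0, 1])   # P(success), new student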

  4. Fast Dynamic Simulation-Based Small Signal Stability Assessment and Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Acharya, Naresh; Baone, Chaitanya; Veda, Santosh

    2014-12-31

    Power grid planning and operation decisions are made based on simulation of the dynamic behavior of the system. Enabling substantial energy savings while increasing the reliability of the aging North American power grid through improved utilization of existing transmission assets hinges on the adoption of wide-area measurement systems (WAMS) for power system stabilization. However, adoption of WAMS alone will not suffice if the power system is to reach its full entitlement in stability and reliability. It is necessary to enhance predictability with "faster than real-time" dynamic simulations that will enable assessment of dynamic stability margins and proactive real-time control, and improve grid resiliency to fast time-scale phenomena such as cascading network failures. Present-day dynamic simulations are performed only during offline planning studies, considering only worst case conditions such as summer peak, winter peak days, etc. With widespread deployment of renewable generation, controllable loads, energy storage devices and plug-in hybrid electric vehicles expected in the near future and greater integration of cyber infrastructure (communications, computation and control), monitoring and controlling the dynamic performance of the grid in real-time would become increasingly important. The state-of-the-art dynamic simulation tools have limited computational speed and are not suitable for real-time applications, given the large set of contingency conditions to be evaluated. These tools are optimized for best performance of single-processor computers, but the simulation is still several times slower than real-time due to its computational complexity. With recent significant advances in numerical methods and computational hardware, the expectations have been rising towards more efficient and faster techniques to be implemented in power system simulators. This is a natural expectation, given that the core solution algorithms of most commercial simulators were developed decades ago, when High Performance Computing (HPC) resources were not commonly available.

  5. Overcoming the Power Wall by Exploiting Application Inexactness and Emerging COTS Architectural Features

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fagan, Mike; Schlachter, Jeremy; Yoshii, Kazutomo

    Energy and power consumption are major limitations to continued scaling of computing systems. Inexactness, where the quality of the solution can be traded for energy savings, has been proposed as a counterintuitive approach to overcoming those limitations. However, in the past, inexactness necessitated highly customized or specialized hardware. In order to move away from customization, in earlier work [4] it was shown that by interpreting precision in the computation as the parameter to trade for inexactness, weather prediction and page rank could both yield energy savings through reduced precision while preserving the quality of the application. However, this required representations of numbers that were not readily available on commercial off-the-shelf (COTS) processors. In this paper, we provide opportunities for extending the notion of trading precision for energy savings into the world of COTS. We provide a model and analyze the opportunities and behavior of all three IEEE-compliant precision values available on COTS processors: (i) double, (ii) single, and (iii) half. Through a limit study based on measurements, we show that the energy savings in going from double to half precision can potentially exceed a factor of four, largely due to memory and cache effects.
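
    The accuracy half of the trade-off is easy to reproduce on any COTS machine; the energy half requires hardware power measurement. A sketch comparing the same reduction in the three IEEE formats (the narrow float16 accumulator makes the precision loss visible):

        import numpy as np

        x = np.random.default_rng(0).random(10_000)
        exact = x.astype(np.float64).sum()
        for dtype in (np.float64, np.float32, np.float16):
            approx = x.astype(dtype).sum(dtype=dtype)    # accumulate in-format
            print(dtype.__name__, abs(float(approx) - exact) / exact)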

  6. Value of three-dimensional volume rendering images in the assessment of the centrality index for preoperative planning in patients with renal masses.

    PubMed

    Sofia, C; Magno, C; Silipigni, S; Cantisani, V; Mucciardi, G; Sottile, F; Inferrera, A; Mazziotti, S; Ascenti, G

    2017-01-01

    To evaluate the precision of the centrality index (CI) measurement on three-dimensional (3D) volume rendering technique (VRT) images in patients with renal masses, compared to its standard measurement on axial images. Sixty-five patients with renal lesions underwent contrast-enhanced multidetector (MD) computed tomography (CT) for preoperative imaging. Two readers calculated the CI on two-dimensional axial images and on VRT images, measuring it in the plane that the tumour and centre of the kidney were lying in. Correlation and agreement of interobserver measurements and inter-method results were calculated using intraclass correlation (ICC) coefficients and the Bland-Altman method. Time saving was also calculated. The correlation coefficients were r=0.99 (p<0.05) and r=0.99 (p<0.05) for both the CI on axial and VRT images, with an ICC of 0.99, and 0.99, respectively. Correlation between the two methods of measuring the CI on VRT and axial CT images was r=0.99 (p<0.05). The two methods showed a mean difference of -0.03 (SD 0.13). Mean time saving per each examination with VRT was 45.5%. The present study showed that VRT and axial images produce almost identical values of CI, with the advantages of greater ease of execution and a time saving of almost 50% for 3D VRT images. In addition, VRT provides an integrated perspective that can better assist surgeons in clinical decision making and in operative planning, suggesting this technique as a possible standard method for CI measurement. Copyright © 2016 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
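
    A small sketch, assuming the usual definition of the centrality index (distance from kidney center to tumour center divided by tumour radius): on VRT both centers can be picked in one oblique plane, while on axial images the distance must be assembled across slices:

        import math

        def centrality_index(kidney_center_mm, tumour_center_mm, tumour_radius_mm):
            d = math.dist(kidney_center_mm, tumour_center_mm)  # 3-D distance, mm
            return d / tumour_radius_mm

        print(centrality_index((0.0, 0.0, 0.0), (18.0, 6.0, 12.0), 15.0))  # ~1.5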

  7. 31 CFR 321.27 - Supplements, amendments, or revisions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... FINANCIAL INSTITUTIONS OF UNITED STATES SAVINGS BONDS AND UNITED STATES SAVINGS NOTES (FREEDOM SHARES... any time or from time to time, revise, supplement, amend or withdraw, in whole or in part, the...

  9. Saving in cycles: how to get people to save more money.

    PubMed

    Tam, Leona; Dholakia, Utpal

    2014-02-01

    Low personal savings rates are an important social issue in the United States. We propose and test one particular method to get people to save more money that is based on the cyclical time orientation. In contrast to conventional, popular methods that encourage individuals to ignore past mistakes, focus on the future, and set goals to save money, our proposed method frames the savings task in cyclical terms, emphasizing the present. Across the studies, individuals who used our proposed cyclical savings method, compared with individuals who used a linear savings method, provided an average of 74% higher savings estimates and saved an average of 78% more money. We also found that the cyclical savings method was more efficacious because it increased implementation planning and lowered future optimism regarding saving money.

  10. Efficacy of the SU(3) scheme for ab initio large-scale calculations beyond the lightest nuclei

    DOE PAGES

    Dytrych, T.; Maris, P.; Launey, K. D.; ...

    2016-06-22

    We report on the computational characteristics of ab initio nuclear structure calculations in a symmetry-adapted no-core shell model (SA-NCSM) framework. We examine the computational complexity of the current implementation of the SA-NCSM approach, dubbed LSU3shell, by analyzing ab initio results for 6Li and 12C in large harmonic oscillator model spaces and SU3-selected subspaces. We demonstrate LSU3shell’s strong-scaling properties achieved with highly-parallel methods for computing the many-body matrix elements. Results compare favorably with complete model space calculations and significant memory savings are achieved in physically important applications. In particular, a well-chosen symmetry-adapted basis affords memory savings in calculations of states with a fixed total angular momentum in large model spaces while exactly preserving translational invariance.

  12. Direct numerical simulation of sheared turbulent flow

    NASA Technical Reports Server (NTRS)

    Harris, Vascar G.

    1994-01-01

    The summer assignment to study sheared turbulent flow was divided into three phases: (1) literature survey, (2) computational familiarization, and (3) pilot computational studies. The governing equations of fluid dynamics, or Navier-Stokes equations, describe the velocity, pressure, and density as functions of position and time. In principle, when combined with conservation equations for mass, energy, and the thermodynamic state of the fluid, a determinate system could be obtained. In practice the Navier-Stokes equations have not been solved analytically due to their nonlinear nature and complexity. Consequently, experiments have been essential for gaining insight into the physics of the problem. Reasonable computer simulations of the problem have become possible as the computational speed and storage of computers have evolved. The importance of the microstructure of the turbulence dictates the need for high-resolution grids to extract solutions which contain the physical mechanisms essential to a successful simulation. The recognized breakthrough occurred as a result of the pioneering work of Orszag and Patterson, in which the Navier-Stokes equations were solved numerically utilizing a time-saving toggling technique between physical and wave space, known as a spectral method. An equally analytically unsolvable problem, containing the same quasi-chaotic nature as turbulence, is the three-body problem, which was studied computationally as a first step this summer. This study was followed by computations of a two-dimensional (2D) free shear layer.

  13. Space-time least-squares Petrov-Galerkin projection in nonlinear model reduction.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, Youngsoo; Carlberg, Kevin Thomas

    Our work proposes a space-time least-squares Petrov-Galerkin (ST-LSPG) projection method for model reduction of nonlinear dynamical systems. In contrast to typical nonlinear model-reduction methods that first apply Petrov-Galerkin projection in the spatial dimension and subsequently apply time integration to numerically resolve the resulting low-dimensional dynamical system, the proposed method applies projection in space and time simultaneously. To accomplish this, the method first introduces a low-dimensional space-time trial subspace, which can be obtained by computing tensor decompositions of state-snapshot data. The method then computes discrete-optimal approximations in this space-time trial subspace by minimizing the residual arising after time discretization over all space and time in a weighted ℓ2-norm. This norm can be defined to enable complexity reduction (i.e., hyper-reduction) in time, which leads to space-time collocation and space-time GNAT variants of the ST-LSPG method. Advantages of the approach relative to typical spatial-projection-based nonlinear model reduction methods such as Galerkin projection and least-squares Petrov-Galerkin projection include: (1) a reduction of both the spatial and temporal dimensions of the dynamical system, (2) the removal of spurious temporal modes (e.g., unstable growth) from the state space, and (3) error bounds that exhibit slower growth in time. Numerical examples performed on model problems in fluid dynamics demonstrate the ability of the method to generate orders-of-magnitude computational savings relative to spatial-projection-based reduced-order models without sacrificing accuracy.

  14. 31 CFR 351.6 - When may I redeem my Series EE savings bond?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... before January 1, 2003. You may redeem your Series EE savings bond at any time beginning six months after... 31 Money and Finance:Treasury 2 2012-07-01 2012-07-01 false When may I redeem my Series EE savings... SAVINGS BONDS, SERIES EE Maturities, Redemption Values, and Investment Yields of Series EE Savings Bonds...

  15. 31 CFR 351.6 - When may I redeem my Series EE savings bond?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... before January 1, 2003. You may redeem your Series EE savings bond at any time beginning six months after... 31 Money and Finance:Treasury 2 2013-07-01 2013-07-01 false When may I redeem my Series EE savings... SAVINGS BONDS, SERIES EE Maturities, Redemption Values, and Investment Yields of Series EE Savings Bonds...

  16. 31 CFR 351.6 - When may I redeem my Series EE savings bond?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... before January 1, 2003. You may redeem your Series EE savings bond at any time beginning six months after... 31 Money and Finance:Treasury 2 2011-07-01 2011-07-01 false When may I redeem my Series EE savings... SAVINGS BONDS, SERIES EE Maturities, Redemption Values, and Investment Yields of Series EE Savings Bonds...

  17. Web-based management of diabetes through glucose uploads: has the time come for telemedicine?

    PubMed

    Azar, Madona; Gabbay, Robert

    2009-01-01

    This review focuses on the burgeoning use of web-based systems allowing patient-initiated glucometer uploads to facilitate provider treatment intensification. Studies in type 1 diabetes tended to show equivalent HbA1c improvements in both intervention and control groups without statistically significant difference. In contrast, type 2 patients seemed to do better than controls with significant differences in HbA1c. Patients were the beneficiaries of web-based diabetes management both through savings in time and cost. Major obstacles to wider implementation are patient computer skills, adherence to the technology, architectural and technical design, and the need to reimburse providers for their care.

  18. Approximate method for predicting the permanent set in a beam in vacuo and in water subject to a shock wave

    NASA Technical Reports Server (NTRS)

    Stiehl, A. L.; Haberman, R. C.; Cowles, J. H.

    1988-01-01

    An approximate method to compute the maximum deformation and permanent set of a beam subjected to shock wave loading in vacuo and in water was investigated. The method equates the maximum kinetic energy of the beam (and water) to the elastic-plastic work done by a static uniform load applied to the beam. Results for the water case indicate that the plastic deformation is controlled by the kinetic energy of the water. The simplified approach can result in significant savings in computer time, or it can expediently be used as a check of results from a more rigorous approach. The accuracy of the method is demonstrated by various examples of beams with simple-support and clamped-support boundary conditions.
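
    A sketch of the stated energy balance under an assumed bilinear (elastic-perfectly-plastic) static load-deflection curve: solve work(d) = KE for the peak deflection, then subtract the elastic springback. All structural numbers are hypothetical:

        from scipy.optimize import brentq

        k = 2.0e6        # elastic stiffness under uniform load, N/m (hypothetical)
        F_y = 5.0e4      # load at which the plastic mechanism forms, N (hypothetical)
        d_y = F_y / k    # deflection at yield, m

        def work(d):     # area under the bilinear load-deflection curve
            if d <= d_y:
                return 0.5 * k * d**2
            return 0.5 * F_y * d_y + F_y * (d - d_y)

        KE = 1.8e3       # peak kinetic energy of beam plus added water mass, J
        d_max = brentq(lambda d: work(d) - KE, 0.0, 10.0)
        d_perm = max(0.0, d_max - d_y)     # elastic portion springs back
        print(d_max, d_perm)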

  19. Multiplexed Predictive Control of a Large Commercial Turbofan Engine

    NASA Technical Reports Server (NTRS)

    Richter, Hanz; Singaraju, Anil; Litt, Jonathan S.

    2008-01-01

    Model predictive control is a strategy well-suited to handle the highly complex, nonlinear, uncertain, and constrained dynamics involved in aircraft engine control problems. However, it has thus far been infeasible to implement model predictive control in engine control applications, because of the combination of model complexity and the time allotted for the control update calculation. In this paper, a multiplexed implementation is proposed that dramatically reduces the computational burden of the quadratic programming optimization that must be solved online as part of the model-predictive-control algorithm. Actuator updates are calculated sequentially and cyclically in a multiplexed implementation, as opposed to the simultaneous optimization taking place in conventional model predictive control. Theoretical aspects are discussed based on a nominal model, and actual computational savings are demonstrated using a realistic commercial engine model.
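
    A minimal sketch of the multiplexing idea follows: one small optimization per actuator per update, cycling through the actuators, instead of one large simultaneous optimization. The toy linear model, horizon, and weights are our own assumptions, and scipy's general-purpose minimizer stands in for the quadratic programming solver the paper describes.

      import numpy as np
      from scipy.optimize import minimize

      # Toy multiplexed MPC: re-optimize one actuator channel per control
      # update, cycling through them. The 2-state/2-input model is hypothetical.
      A = np.array([[0.9, 0.1], [0.0, 0.8]])
      B = np.array([[0.1, 0.0], [0.0, 0.2]])
      H = 10  # prediction horizon

      def cost(u_seq, x0, plan):
          """Quadratic regulation cost when only actuator k follows u_seq."""
          k, u_fixed = plan
          x, J = x0.copy(), 0.0
          for t in range(H):
              ut = u_fixed.copy()
              ut[k] = u_seq[t]
              x = A @ x + B @ ut
              J += x @ x + 0.01 * (ut @ ut)
          return J

      x = np.array([1.0, -0.5])
      u_hold = np.zeros(2)           # last applied value for each actuator
      for step in range(20):
          k = step % 2               # multiplexing: one actuator per update
          res = minimize(cost, np.zeros(H), args=(x, (k, u_hold)))
          u_hold[k] = res.x[0]       # apply only the first move (receding horizon)
          x = A @ x + B @ u_hold
      print(x)                       # state regulated toward the origin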

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Byoung Yoon; Roberts, Barry L.

    The three-dimensional finite element mesh capturing realistic geometries of the Bayou Choctaw site has been constructed using the sonar and seismic survey data obtained from the field. The mesh consists of hexahedral elements because the salt constitutive model is coded using hexahedral elements. Various ideas and techniques to construct finite element meshes capturing artificially and naturally formed geometries are provided. Techniques to reduce the number of elements as much as possible, saving computer run time while maintaining computational accuracy, are also introduced. The steps and methodologies could be applied to construct the meshes of the Big Hill, Bryan Mound, and West Hackberry strategic petroleum reserve sites. The methodology could also be applied to complicated shapes, not only for various civil and geological structures but also for biological applications such as artificial limbs.

  1. Automatic domain updating technique for improving computational efficiency of 2-D flood-inundation simulation

    NASA Astrophysics Data System (ADS)

    Tanaka, T.; Tachikawa, Y.; Ichikawa, Y.; Yorozu, K.

    2017-12-01

    Flood is one of the most hazardous disasters and causes serious damage to people and property around the world. To prevent or mitigate flood damage through early warning systems and/or river management planning, numerical modelling of flood-inundation processes is essential. In the literature, flood-inundation models have been extensively developed and improved to achieve flood flow simulation with complex topography at high resolution. With increasing demands on flood-inundation modelling, its computational burden is now one of the key issues. Improvements to the computational efficiency of the full shallow water equations have been made from various perspectives, such as approximations of the momentum equations, parallelization techniques, and coarsening approaches. To complement these techniques and further improve the computational efficiency of flood-inundation simulations, this study proposes an Automatic Domain Updating (ADU) method for 2-D flood-inundation simulation. The ADU method traces the wet-dry interface and automatically updates the simulation domain in response to the progress and recession of flood propagation. The updating algorithm is as follows: first, register the simulation cells potentially flooded at the initial stage (such as floodplains near river channels); then, whenever a registered cell is flooded, register its surrounding cells. The time for this additional process is kept small by checking only cells at the wet-dry interface, and the computation time is reduced by skipping the processing of non-flooded areas. This algorithm is easily applied to any type of 2-D flood-inundation model. The proposed ADU method is implemented in the 2-D local inertial equations for the Yodo River basin, Japan. Case studies of two flood events show that the simulation finishes two to ten times faster while producing the same results as the simulation without the ADU method.
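
    The registration logic is simple enough to sketch directly. Below is an illustrative implementation of the active-cell bookkeeping on a uniform grid; the flood source and the water-spreading rule are placeholders for a real local inertial solver.

      # Illustrative Automatic Domain Updating on a uniform grid; the source
      # location and spreading rule are made up, not the paper's solver.
      import numpy as np

      ny, nx = 100, 100
      depth = np.zeros((ny, nx))
      depth[50, 50] = 2.0                       # hypothetical flood source
      active = {(50, 50)}                       # cells registered for computation

      def neighbors(i, j):
          for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
              if 0 <= i + di < ny and 0 <= j + dj < nx:
                  yield i + di, j + dj

      for step in range(200):
          newly_wet = set()
          for (i, j) in active:                 # non-registered cells are skipped
              if depth[i, j] <= 0.0:
                  continue
              share = 0.1 * depth[i, j]         # stand-in for the momentum solver
              for (ni, nj) in neighbors(i, j):
                  was_dry = depth[ni, nj] == 0.0
                  depth[ni, nj] += share / 4.0
                  if was_dry:
                      newly_wet.add((ni, nj))   # register surrounding cells
              depth[i, j] -= share
          active |= newly_wet

      print(len(active), "of", ny * nx, "cells ever computed")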

  2. Bidirectional Reflectance of a Macroscopically Flat, High-Albedo Particulate Surface: An Efficient Radiative Transfer Solution and Applications to Regoliths

    NASA Technical Reports Server (NTRS)

    Mishchenko, Michael I.; Zakharova, Nadia T.

    1999-01-01

    Many remote sensing applications rely on accurate knowledge of the bidirectional reflection function (BRF) of surfaces composed of discrete, randomly positioned scattering particles. Theoretical computations of BRFs for plane-parallel particulate layers are usually reduced to solving the radiative transfer equation (RTE) using one of the existing exact or approximate techniques. Since semi-empirical approximate approaches are notorious for their low accuracy, violation of the energy conservation law, and ability to produce unphysical results, the use of numerically exact solutions of the RTE has gained justified popularity. For example, the computation of BRFs for macroscopically flat particulate surfaces in many geophysical publications is based on the adding-doubling (AD) and discrete ordinate (DO) methods. A further saving of computer resources can be achieved by using a more efficient technique to solve the plane-parallel RTE than the AD and DO methods. Since many natural particulate surfaces can be well represented by the model of an optically semi-infinite, homogeneous scattering layer, one can find the BRF directly by solving Ambartsumian's nonlinear integral equation using a simple iterative technique. In this way, the computation of the internal radiation field is avoided and the computer code becomes highly efficient, very accurate, and compact. Furthermore, the BRF thus obtained fully obeys the fundamental physical laws of energy conservation and reciprocity. In this paper, we discuss numerical aspects and the computer implementation of this technique, examine the applicability of the Henyey-Greenstein phase function and the delta-Eddington approximation in BRF and flux calculations, and describe sample applications demonstrating the potential effect of particle shape on the bidirectional reflectance of flat regolith surfaces. Although the effects of packing density and coherent backscattering are currently neglected, they can also be incorporated. The FORTRAN implementation of the technique is available on the World Wide Web, and can be applied to a wide range of remote sensing problems. BRF computations for undulated (macroscopically rough) surfaces are more complicated and often rely on time-consuming Monte Carlo procedures. This approach is especially inefficient for optically thick, weakly absorbing media (e.g., snow and desert surfaces at visible wavelengths), since a photon may undergo many internal scattering events before it exits the medium or is absorbed. However, undulated surfaces can often be represented as collections of locally flat tilted facets characterized by the BRF found from the traditional plane-parallel RTE. In this way the Monte Carlo procedure could be used only to evaluate the effects of surface shadowing and multiple surface reflections, thereby bypassing the time-consuming ray tracing inside the medium and providing great savings of CPU time.
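
    To illustrate the flavor of the iterative approach (though not the paper's actual vector implementation), the special case of isotropic scattering reduces to Chandrasekhar's H-function, which obeys a nonlinear integral equation that a plain fixed-point iteration solves quickly:

      # Fixed-point iteration for a semi-infinite, isotropically scattering
      # layer: Chandrasekhar's H-function satisfies
      #   1/H(mu) = sqrt(1 - w0) + (w0/2) * int_0^1 mu' H(mu') / (mu + mu') dmu'.
      # The albedo and quadrature below are illustrative choices.
      import numpy as np

      w0 = 0.9                                 # single-scattering albedo (assumed)
      mu = np.linspace(0.005, 0.995, 100)      # quadrature nodes on (0, 1)
      dmu = mu[1] - mu[0]
      H = np.ones_like(mu)

      for _ in range(200):
          integral = ((mu * H)[None, :] / (mu[:, None] + mu[None, :])).sum(1) * dmu
          H_new = 1.0 / (np.sqrt(1.0 - w0) + 0.5 * w0 * integral)
          if np.max(np.abs(H_new - H)) < 1e-12:
              H = H_new
              break
          H = H_new

      print(H[-1])  # H near mu = 1; approximately 2.0 for w0 = 0.9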

  3. Pharmacy costs associated with nonformulary drug requests.

    PubMed

    Sweet, B V; Stevenson, J G

    2001-09-15

    Pharmacy costs associated with handling nonformulary drug requests were studied. Data for all nonformulary drug orders received at a university hospital between August 1 and October 31, 1999, were evaluated to determine their outcome and the cost differential between the nonformulary drug and formulary alternative. Two sets of data were used to analyze medication costs: data from nonformulary medication request forms, which allowed the cost of nonformulary drugs and their formulary alternatives to be calculated, and data from the pharmacy computer system, which enabled actual nonformulary drug use to be captured. Labor costs associated with processing these requests were determined through time analysis, which included the potential for orders to be received at different times of the day and with different levels of technician and pharmacist support. Economic analysis revealed that the greatest cost saving occurred when converting nonformulary injectable products to formulary alternatives. Interventions were least costly during normal business hours, when all the satellite pharmacies were open and fully staffed. Pharmacists' interventions in oral product orders resulted in a net increase in expenditures. Incremental pharmacy costs associated with processing nonformulary medication requests in an inpatient setting are greater than the drug acquisition cost saving for most agents, particularly oral medications.

  4. Efficiency Benefits Using the Terminal Area Precision Scheduling and Spacing System

    NASA Technical Reports Server (NTRS)

    Thipphavong, Jane; Swenson, Harry N.; Lin, Paul; Seo, Anthony Y.; Bagasol, Leonard N.

    2011-01-01

    NASA has developed a capability for terminal area precision scheduling and spacing (TAPSS) to increase the use of fuel-efficient arrival procedures during periods of traffic congestion at a high-density airport. Sustained use of fuel-efficient procedures throughout the entire arrival phase of flight reduces overall fuel burn, greenhouse gas emissions and noise pollution. The TAPSS system is a 4D trajectory-based strategic planning and control tool that computes schedules and sequences for arrivals to facilitate optimal profile descents. This paper focuses on quantifying the efficiency benefits associated with using the TAPSS system, measured by reduction of level segments during aircraft descent and flight distance and time savings. The TAPSS system was tested in a series of human-in-the-loop simulations and compared to current procedures. Compared to the current use of the TMA system, simulation results indicate a reduction of total level segment distance by 50% and flight distance and time savings by 7% in the arrival portion of flight (200 nm from the airport). The TAPSS system resulted in aircraft maintaining continuous descent operations longer and with more precision, both achieved under heavy traffic demand levels.

  5. A Computer Simulation of Employee Vaccination to Mitigate an Influenza Epidemic

    PubMed Central

    Lee, Bruce Y.; Brown, Shawn T.; Cooley, Philip C.; Zimmerman, Richard K.; Wheaton, William D.; Zimmer, Shanta M.; Grefenstette, John J.; Assi, Tina-Marie; Furphy, Timothy J.; Wagener, Diane K.; Burke, Donald S.

    2010-01-01

    Background Determining the effects of varying vaccine coverage, compliance, administration rates, prioritization, and timing among employees during an influenza pandemic. Methods As part of the Models of Infectious Disease Agent Study (MIDAS) network’s H1N1 influenza planning efforts, an agent-based computer simulation model (ABM) was developed for the Washington, DC metropolitan region, encompassing five metropolitan statistical areas. Each simulation run involved introducing 100 infectious individuals to initiate a 1.3 reproductive rate (R0) epidemic, consistent with H1N1 parameters to date. Another set of scenarios represented an R0=1.6 epidemic. Results An unmitigated epidemic resulted in substantial productivity losses (a mean of $112.6 million for a serologic 15% attack rate and $193.8 million for a serologic 25% attack rate), even with the relatively low estimated mortality impact of H1N1. While vaccinating Advisory Committee on Immunization Practices (ACIP) priority groups resulted in the largest savings, vaccinating all remaining workers captured additional savings and, in fact, reduced healthcare workers’ and critical infrastructure workers’ chances of infection. While employee vaccination compliance affected the epidemic, once 20% compliance was achieved, additional increases in compliance provided less incremental benefit. Even though a vast majority of the workplaces in the DC Metro region had fewer than 100 employees, focusing on vaccinating only those in larger firms (≥100 employees) was just as effective in mitigating the epidemic as trying to vaccinate all workplaces. Conclusions Timely vaccination of at least 20% of the large company workforce can play an important role in epidemic mitigation. PMID:20042311

  6. VA Telemedicine: An Analysis of Cost and Time Savings.

    PubMed

    Russo, Jack E; McCool, Ryan R; Davies, Louise

    2016-03-01

    The Veterans Affairs (VA) healthcare system provides beneficiary travel reimbursement ("travel pay") to qualifying patients for traveling to appointments. Travel pay is a large expense for the VA and hence the U.S. Government, projected to cost nearly $1 billion in 2015. Telemedicine in the VA system has the potential to save money by reducing patient travel and thus the amount of travel pay disbursed. In this study, we quantify this savings and also report trends in VA telemedicine volumes over time. All telemedicine visits based at the VA Hospital in White River Junction, VT between 2005 and 2013 were reviewed (5,695 visits). Travel distance and time saved as a result of telemedicine were calculated. Clinical volume in the mental health department, which has had the longest participation in telemedicine, was analyzed. Telemedicine resulted in an average travel savings of 145 miles and 142 min per visit. This led to an average travel payment savings of $18,555 per year. Telemedicine volume grew significantly over the study period such that by the final year the travel pay savings had increased to $63,804, or about 3.5% of the total travel pay disbursement for that year. The number of mental health telemedicine visits rose over the study period but remained small relative to the number of face-to-face visits. A higher proportion of telemedicine visits involved new patients. Telemedicine at the VA saves travel distance and time, although the reduction in travel payments remains modest at current telemedicine volumes.

  7. Entrepreneurial systems. Do it yourself for profit and pleasure.

    PubMed

    Manuel, G; Young, K

    1991-11-28

    You don't have to have a degree in computing or a £1m budget to create a system that improves efficiency or saves money. Gren Manuel introduces a special report which celebrates the small-scale initiatives devised and implemented by health staff armed only with a personal computer.

  8. Computing in the Clouds

    ERIC Educational Resources Information Center

    Johnson, Doug

    2010-01-01

    Web-based applications offer teachers, students, and school districts a convenient way to accomplish a wide range of tasks, from accounting to word processing, for free. Cloud computing has the potential to offer staff and students better services at a lower cost than the technology deployment models they're using now. Saving money and improving…

  9. Wireless Computing in the Library: A Successful Model at St. Louis Community College.

    ERIC Educational Resources Information Center

    Patton, Janice K.

    2001-01-01

    Describes the St. Louis Community College (Missouri) library's use of laptop computers in the instruction lab as a way to save space and wiring costs. Discusses the pros and cons of wireless library instruction--advantages include its flexibility and its ability to eliminate cabling. (NB)

  10. Influence of Implementation of Composite Materials in Civil Aircraft Industry on reduction of Environmental Pollution and Greenhouse Effect

    NASA Astrophysics Data System (ADS)

    Beck, A. J.; Hodzic, A.; Soutis, C.; Wilson, C. W.

    2011-12-01

    Computer-based Life Cycle Analysis (LCA) models were used to compare lightweight composites with traditional aluminium over their useful lifetime. The analysis included raw materials, production, useful life in operation, and disposal at the end of the material's useful life. The carbon fibre epoxy resin composite could in some cases reduce the weight of a component by up to 40% compared to aluminium. As the fuel consumption of an aircraft is strongly influenced by its total weight, emissions can be significantly reduced by increasing the proportion of composites used in the aircraft structure. The higher emissions produced during composite manufacture, compared to aluminium, reach their 'break even' point after a certain number of time units when the material is used in aircraft structures, after which the composites continue to save emissions over their long-term operation. The study highlighted the environmental benefits of using lightweight structures in aircraft design, and also showed that utilisation of composites in products without energy savings may lead to increased emissions in the environment.

  11. Discretization of the induced-charge boundary integral equation.

    PubMed

    Bardhan, Jaydeep P; Eisenberg, Robert S; Gillespie, Dirk

    2009-07-01

    Boundary-element methods (BEMs) for solving integral equations numerically have been used in many fields to compute the induced charges at dielectric boundaries. In this paper, we consider a more accurate implementation of BEM in the context of ions in aqueous solution near proteins, but our results are applicable more generally. The ions that modulate protein function are often within a few angstroms of the protein, which leads to the significant accumulation of polarization charge at the protein-solvent interface. Computing the induced charge accurately and quickly poses a numerical challenge in solving a popular integral equation using BEM. In particular, the accuracy of simulations can depend strongly on seemingly minor details of how the entries of the BEM matrix are calculated. We demonstrate that when the dielectric interface is discretized into flat tiles, the qualocation method of Tausch [IEEE Trans Comput.-Comput.-Aided Des. 20, 1398 (2001)] to compute the BEM matrix elements is always more accurate than the traditional centroid-collocation method. Qualocation is not more expensive to implement than collocation and can save significant computational time by reducing the number of boundary elements needed to discretize the dielectric interfaces.

  12. Discretization of the induced-charge boundary integral equation

    NASA Astrophysics Data System (ADS)

    Bardhan, Jaydeep P.; Eisenberg, Robert S.; Gillespie, Dirk

    2009-07-01

    Boundary-element methods (BEMs) for solving integral equations numerically have been used in many fields to compute the induced charges at dielectric boundaries. In this paper, we consider a more accurate implementation of BEM in the context of ions in aqueous solution near proteins, but our results are applicable more generally. The ions that modulate protein function are often within a few angstroms of the protein, which leads to the significant accumulation of polarization charge at the protein-solvent interface. Computing the induced charge accurately and quickly poses a numerical challenge in solving a popular integral equation using BEM. In particular, the accuracy of simulations can depend strongly on seemingly minor details of how the entries of the BEM matrix are calculated. We demonstrate that when the dielectric interface is discretized into flat tiles, the qualocation method of Tausch [IEEE Trans Comput.-Comput.-Aided Des. 20, 1398 (2001)] to compute the BEM matrix elements is always more accurate than the traditional centroid-collocation method. Qualocation is not more expensive to implement than collocation and can save significant computational time by reducing the number of boundary elements needed to discretize the dielectric interfaces.

  13. Industrial Productivity

    NASA Technical Reports Server (NTRS)

    1977-01-01

    NASTRAN is an offshoot of the computer-design technique used in construction of airplanes and spacecraft. In this technique engineers create a mathematical model of the aeronautical or space vehicle and "fly" it on the ground by means of computer simulation. The technique enables them to study performance and structural behavior of a number of different designs before settling on the final configuration and proceeding with construction. From this base of aerospace experience, NASA-Goddard developed the NASTRAN general purpose computer program, which offers an exceptionally wide range of analytic capability with regard to structures. NASTRAN has been applied to autos, trucks, railroad cars, ships, nuclear power reactors, steam turbines, bridges, and office buildings. NASA-Langley provides program maintenance services regarded as vital by many NASTRAN users. NASTRAN is essentially a predictive tool. It takes an electronic look at a computerized design and reports how the structure will react under a great many different conditions. It can, for example, note areas where high stress levels will occur: potential failure points that need strengthening. Conversely, it can identify over-designed areas where weight and material might be saved safely. NASTRAN can tell how pipes stand up under strong fluid flow, how metals are affected by high temperatures, how a building will fare in an earthquake, or how powerful winds will cause a bridge to oscillate. NASTRAN analysis is quick and inexpensive. It minimizes trial-and-error in the design process and makes possible better, safer, lighter structures, affording large-scale savings in development time and materials. Some examples of the broad utility NASTRAN is finding among industrial firms are shown on these pages.

  14. Leyla loop: a time-saving suture technique for robotic atrial closure

    PubMed Central

    Kılıç, Leyla; Şenay, Şahin; Ümit Güllü, A.; Alhan, Cem

    2013-01-01

    The longer durations of cardiopulmonary bypass and aortic cross-clamp times remain the disadvantages of robotic or minimally invasive cardiac surgery. For this reason, every small contribution to speeding up these procedures is of the utmost importance. Here, we present a practical, easy and time-saving suture technique for atrial closure. It consists of a hand-made loop at one end of the suture and saves the time otherwise consumed by knotting. It may also be used during conventional or minimally invasive cardiac surgery. PMID:23760357

  15. Energy Efficiency in Public Buildings through Context-Aware Social Computing.

    PubMed

    García, Óscar; Alonso, Ricardo S; Prieto, Javier; Corchado, Juan M

    2017-04-11

    The challenge of promoting behavioral changes in users that lead to energy savings in public buildings has become a complex task requiring the involvement of multiple technologies. Wireless sensor networks have a great potential for the development of tools, such as serious games, that encourage acquiring good energy and healthy habits among users in the workplace. This paper presents the development of a serious game using CAFCLA, a framework that allows for integrating multiple technologies, which provide both context-awareness and social computing. Game development has shown that the data provided by sensor networks encourage users to reduce energy consumption in their workplace, and that social interactions and competitiveness accelerate the achievement of good results and behavioral changes that favor energy savings.

  16. Visualization of Unsteady Computational Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    Haimes, Robert

    1997-01-01

    The current compute environment that most researchers are using for the calculation of 3D unsteady Computational Fluid Dynamic (CFD) results is a super-computer class machine. The Massively Parallel Processors (MPP's), such as the 160 node IBM SP2 at NAS and clusters of workstations acting as a single MPP (like NAS's SGI Power-Challenge array and the J90 cluster), provide the required computation bandwidth for CFD calculations of transient problems. If we follow the traditional computational analysis steps for CFD (and we wish to construct an interactive visualizer) we need to be aware of the following: (1) Disk space requirements. A single snap-shot must contain at least the values (primitive variables) stored at the appropriate locations within the mesh. For most simple 3D Euler solvers that means 5 floating point words. Navier-Stokes solutions with turbulence models may contain 7 state-variables. (2) Disk speed vs. computational speeds. The time required to read the complete solution of a saved time frame from disk is now longer than the compute time for a set number of iterations from an explicit solver. Depending on the hardware and solver, an iteration of an implicit code may also take less time than reading the solution from disk. If one examines the performance improvements of the last decade or two, it is easy to see that relying on disk performance (versus CPU improvement) may not be the best method for enhancing interactivity. (3) Cluster and parallel machine I/O problems. Disk access time is much worse within current parallel machines and clusters of workstations that are acting in concert to solve a single problem. In this case we are not trying to read the volume of data; rather, the solver is running and outputs the solution, and these traditional network interfaces must be used for the file system. (4) Numerics of particle traces. Most visualization tools can work upon a single snap-shot of the data, but some visualization tools for transient problems require dealing with time.
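
    Item (1) is easy to make concrete. A back-of-envelope sizing, with a hypothetical mesh and frame count, shows why disk space and disk bandwidth dominate the discussion:

      # Back-of-envelope snapshot sizing for the storage concern in item (1);
      # the mesh size and frame count below are hypothetical.
      nodes = 5_000_000          # mesh points in a hypothetical 3D unsteady case
      words = 7                  # state variables: Navier-Stokes + turbulence model
      bytes_per_word = 8         # double precision
      frames = 1000              # saved time steps

      snapshot_gb = nodes * words * bytes_per_word / 1e9
      print(f"one frame: {snapshot_gb:.2f} GB; run: {frames * snapshot_gb:.0f} GB")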

  17. Spinoff 2015

    NASA Technical Reports Server (NTRS)

    2015-01-01

    Topics covered include: 3D Endoscope to Boost Safety, Cut Cost of Surgery; Audio App Brings a Better Night's Sleep; Liquid Cooling Technology Increases Exercise Efficiency; Algae-Derived Dietary Ingredients Nourish Animals; Space Grant Research Launches Rehabilitation Chair; Vision Trainer Teaches Focusing Techniques at Home; Aircraft Geared Architecture Reduces Fuel Cost and Noise; Ubiquitous Supercritical Wing Design Cuts Billions in Fuel Costs; Flight Controller Software Protects Lightweight Flexible Aircraft; Cabin Pressure Monitors Notify Pilots to Save Lives; Ionospheric Mapping Software Ensures Accuracy of Pilots' GPS; Water Mapping Technology Rebuilds Lives in Arid Regions; Shock Absorbers Save Structures and Lives during Earthquakes; Software Facilitates Sharing of Water Quality Data Worldwide; Underwater Adhesives Retrofit Pipelines with Advanced Sensors; Laser Imaging Video Camera Sees through Fire, Fog, Smoke; 3D Lasers Increase Efficiency, Safety of Moving Machines; Air Revitalization System Enables Excursions to the Stratosphere; Magnetic Fluids Deliver Better Speaker Sound Quality; Bioreactor Yields Extracts for Skin Cream; Private Astronaut Training Prepares Commercial Crews of Tomorrow; Activity Monitors Help Users Get Optimum Sun Exposure; LEDs Illuminate Bulbs for Better Sleep, Wake Cycles; Charged Particles Kill Pathogens and Round Up Dust; Balance Devices Train Golfers for a Consistent Swing; Landsat Imagery Enables Global Studies of Surface Trends; Ruggedized Spectrometers Are Built for Tough Jobs; Gas Conversion Systems Reclaim Fuel for Industry; Remote Sensing Technologies Mitigate Drought; Satellite Data Inform Forecasts of Crop Growth; Probes Measure Gases for Environmental Research; Cloud Computing Technologies Facilitate Earth Research; Software Cuts Homebuilding Costs, Increases Energy Efficiency; Portable Planetariums Teach Science; Schedule Analysis Software Saves Time for Project Planners; Sound Modeling Simplifies Vehicle Noise Management; Custom 3D Printers Revolutionize Space Supply Chain; Improved Calibration Shows Images' True Colors; Micromachined Parts Advance Medicine, Astrophysics, and More; Metalworking Techniques Unlock a Unique Alloy; Low-Cost Sensors Deliver Nanometer-Accurate Measurements; Electrical Monitoring Devices Save on Time and Cost; Dry Lubricant Smooths the Way for Space Travel, Industry; and Compact Vapor Chamber Cools Critical Components.

  18. Extraction of linear features on SAR imagery

    NASA Astrophysics Data System (ADS)

    Liu, Junyi; Li, Deren; Mei, Xin

    2006-10-01

    Linear features are usually extracted from SAR imagery by a few edge detectors derived from the contrast ratio edge detector with a constant probability of false alarm. On the other hand, the Hough Transform (HT) is an elegant way of extracting global features like curve segments from binary edge images. The Randomized Hough Transform can reduce the computation time and memory usage of the HT drastically. However, the Randomized Hough Transform invalidates a great number of accumulator cells during random sampling. In this paper, we propose a new approach to extract linear features from SAR imagery, an almost automatic algorithm based on edge detection and the Randomized Hough Transform. The presented improved method makes full use of the directional information of each edge candidate point so as to solve the invalid accumulation problem. The applied results are in good agreement with the theoretical study, and the main linear features of the SAR imagery have been extracted automatically. The method saves storage space and computational time, which shows its effectiveness and applicability.
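
    For orientation, here is a sketch of the classic pair-sampling Randomized Hough Transform (not the paper's direction-guided variant): random pairs of edge points vote for line parameters, and cells that accumulate enough votes are reported. The data and thresholds are made up.

      # Classic pair-sampling Randomized Hough Transform for lines; the edge
      # data, quantization, and vote threshold are illustrative assumptions.
      import random
      import math
      from collections import defaultdict

      random.seed(0)
      edge_points = [(x, 2 * x + 1) for x in range(50)] + \
                    [(random.uniform(0, 50), random.uniform(0, 100)) for _ in range(50)]

      accumulator = defaultdict(int)
      for _ in range(5000):
          (x1, y1), (x2, y2) = random.sample(edge_points, 2)   # random point pair
          theta = math.atan2(y2 - y1, x2 - x1)                 # line direction
          rho = x1 * math.sin(theta) - y1 * math.cos(theta)    # signed offset
          cell = (round(theta, 1), round(rho))                 # coarse quantization
          accumulator[cell] += 1

      lines = [c for c, votes in accumulator.items() if votes >= 50]
      print(lines)  # dominant (theta, rho) cells, e.g. the y = 2x + 1 line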

  19. Graphical Interface for the Study of Gas-Phase Reaction Kinetics: Cyclopentene Vapor Pyrolysis

    NASA Astrophysics Data System (ADS)

    Marcotte, Ronald E.; Wilson, Lenore D.

    2001-06-01

    The undergraduate laboratory experiment on the pyrolysis of gaseous cyclopentene has been modernized to improve safety, speed, and precision and to better reflect the current practice of physical chemistry. It now utilizes virtual instrument techniques to create a graphical computer interface for the collection and display of experimental data. An electronic pressure gauge has replaced the mercury manometer formerly needed in proximity to the 500 °C pyrolysis oven. Students have much better real-time information available to them and no longer require multiple lab periods to get rate constants and acceptable Arrhenius parameters. The time saved on manual data collection is used to give the students a tour of the computer interfacing hardware and software and a hands-on introduction to gas-phase reagent preparation using a research-grade high-vacuum system. This includes loading the sample, degassing it by the freeze-pump-thaw technique, handling liquid nitrogen and working through the logic necessary for each reconfiguration of the diffusion pump section and the submanifolds.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    de Vega, F F; Cantu-Paz, E; Lopez, J I

    The population size of genetic algorithms (GAs) affects the quality of the solutions and the time required to find them. While progress has been made in estimating the population sizes required to reach a desired solution quality for certain problems, in practice the sizing of populations is still usually performed by trial and error. These trials might find a population that is large enough to reach a satisfactory solution, but there may still be opportunities to optimize the computational cost by reducing the size of the population. This paper presents a technique called plague that periodically removes a number of individuals from the population as the GA executes. Recently, the usefulness of the plague has been demonstrated for genetic programming. The objective of this paper is to extend the study of plagues to genetic algorithms. We experiment with deceptive trap functions, a tunably difficult problem for GAs, and the experiments show that plagues can save computational time while maintaining solution quality and reliability.
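
    The mechanism is simple to sketch. Below, a toy GA on a OneMax-style problem culls the worst 10% of its population every ten generations; the population size, plague period, and cull fraction are illustrative, not the paper's settings.

      # Toy GA with a "plague" that periodically removes the worst individuals,
      # shrinking the population as the run progresses; parameters are invented.
      import random

      random.seed(1)
      L = 20                                             # bitstring length
      fitness = lambda ind: sum(ind)                     # OneMax stand-in problem

      pop = [[random.randint(0, 1) for _ in range(L)] for _ in range(200)]
      for gen in range(1, 101):
          def offspring():
              parent = max(random.sample(pop, 3), key=fitness)  # tournament
              return [b ^ (random.random() < 1.0 / L) for b in parent]  # mutate
          pop = [offspring() for _ in pop]
          if gen % 10 == 0 and len(pop) > 20:            # the plague: cull 10%
              pop.sort(key=fitness, reverse=True)
              pop = pop[: int(len(pop) * 0.9)]

      print(len(pop), max(fitness(i) for i in pop))      # smaller pop, good solution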

  1. Analytical Model-Based Design Optimization of a Transverse Flux Machine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hasan, Iftekhar; Husain, Tausif; Sozer, Yilmaz

    This paper proposes an analytical machine design tool using magnetic equivalent circuit (MEC)-based particle swarm optimization (PSO) for a double-sided, flux-concentrating transverse flux machine (TFM). The magnetic equivalent circuit method is applied to analytically establish the relationship between the design objective and the input variables of prospective TFM designs. This is computationally less intensive and more time efficient than finite element solvers. A PSO algorithm is then used to design a machine with the highest torque density within the specified power range along with some geometric design constraints. The stator pole length, magnet length, and rotor thickness are the variables that define the optimization search space. Finite element analysis (FEA) was carried out to verify the performance of the MEC-PSO optimized machine. The proposed analytical design tool helps save computation time by at least 50% when compared to commercial FEA-based optimization programs, with results found to be in agreement with less than 5% error.
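
    The surrogate-plus-PSO pattern is compact enough to sketch. The code below optimizes the same three geometric variables against a hypothetical smooth stand-in for the MEC evaluation; the bounds, swarm settings, and objective are our assumptions, not the paper's model.

      # Generic PSO over a fast analytical surrogate (stand-in for the MEC);
      # every number here is a placeholder for illustration.
      import numpy as np

      rng = np.random.default_rng(0)

      def torque_density_surrogate(x):
          """Hypothetical smooth objective for a design x = (stator pole
          length, magnet length, rotor thickness), in mm."""
          return -np.sum((x - np.array([12.0, 6.0, 20.0])) ** 2)  # peak at optimum

      lo, hi = np.array([5.0, 2.0, 10.0]), np.array([20.0, 10.0, 30.0])
      pos = rng.uniform(lo, hi, size=(30, 3))
      vel = np.zeros_like(pos)
      pbest = pos.copy()
      pbest_f = np.array([torque_density_surrogate(p) for p in pos])
      gbest = pbest[pbest_f.argmax()].copy()

      for _ in range(100):
          r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
          vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
          pos = np.clip(pos + vel, lo, hi)          # geometric design constraints
          f = np.array([torque_density_surrogate(p) for p in pos])
          improved = f > pbest_f
          pbest[improved], pbest_f[improved] = pos[improved], f[improved]
          gbest = pbest[pbest_f.argmax()].copy()

      print(gbest)  # converges near the surrogate's optimum (12, 6, 20)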

  2. The effectiveness of a new algorithm on a three-dimensional finite element model construction of bone trabeculae in implant biomechanics.

    PubMed

    Sato, Y; Teixeira, E R; Tsuga, K; Shindoi, N

    1999-08-01

    Improving the validity of finite element analysis (FEA) in implant biomechanics requires smaller elements. However, excessive downsizing demands more computer memory and calculation time. To evaluate the effectiveness of a new algorithm established for more valid FEA model construction without downsizing, three-dimensional FEA bone trabeculae models with different element sizes (300, 150 and 75 micron) were constructed. Four algorithms assigning Young's modulus stepwise (1 to 4 ranks) according to the bone volume in each cubic element were used, and the stress distribution under vertical loading was analysed. The model with 300 micron element size and 4 ranks of Young's moduli according to the bone volume in each element presented a stress distribution similar to the model with 75 micron element size. These results show that the new algorithm was effective; the use of 300 micron elements for bone trabeculae representation was therefore proposed, without critical changes in stress values and with possible savings in computer memory and calculation time.
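
    The rank assignment itself is a one-line binning operation. A sketch, with hypothetical rank boundaries and moduli (the paper's actual values are not reproduced here):

      # Stepwise stiffness assignment: each coarse element's Young's modulus
      # is picked from 4 ranks by bone volume fraction; cuts and moduli assumed.
      import numpy as np

      bone_volume_fraction = np.array([0.05, 0.30, 0.55, 0.90])  # per element
      rank_edges = [0.25, 0.50, 0.75]                # 4 ranks (hypothetical cuts)
      youngs_moduli = np.array([0.5e3, 5.0e3, 10.0e3, 15.0e3])   # MPa, assumed

      ranks = np.digitize(bone_volume_fraction, rank_edges)       # ranks 0..3
      print(youngs_moduli[ranks])  # modulus assigned to each coarse element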

  3. OpenMP parallelization of a gridded SWAT (SWATG)

    NASA Astrophysics Data System (ADS)

    Zhang, Ying; Hou, Jinliang; Cao, Yongpan; Gu, Juan; Huang, Chunlin

    2017-12-01

    Large-scale, long-term and high spatial resolution simulation is a common issue in environmental modeling. A Gridded Hydrologic Response Unit (HRU)-based Soil and Water Assessment Tool (SWATG) that integrates a grid modeling scheme with different spatial representations also presents such problems. This time-consuming problem limits applications of very high resolution, large-scale watershed modeling. The OpenMP (Open Multi-Processing) parallel application interface is integrated with SWATG (called SWATGP) to accelerate grid modeling at the HRU level. Such a parallel implementation takes better advantage of the computational power of a shared-memory computer system. We conducted two experiments at multiple temporal and spatial scales of hydrological modeling using SWATG and SWATGP on a high-end server. At 500-m resolution, SWATGP was found to be up to nine times faster than SWATG in modeling a roughly 2000 km2 watershed on one CPU with a 15-thread configuration. The study results demonstrate that parallel models save considerable time relative to traditional sequential simulation runs. Parallel computation of environmental models is beneficial for model applications, especially at large spatial and temporal scales and at high resolutions. The proposed SWATGP model is thus a promising tool for large-scale and high-resolution water resources research and management, in addition to offering data fusion and model coupling ability.

  4. Numerical study of the effects of icing on viscous flow over wings

    NASA Technical Reports Server (NTRS)

    Sankar, L. N.

    1994-01-01

    An improved hybrid method for computing unsteady compressible viscous flows is presented. This method divides the computational domain into two zones. In the outer zone, the unsteady full-potential equation (FPE) is solved. In the inner zone, the Navier-Stokes equations are solved using a diagonal form of an alternating-direction implicit (ADI) approximate factorization procedure. The two zones are tightly coupled so that steady and unsteady flows may be efficiently solved. Characteristic-based viscous/inviscid interface boundary conditions are employed to avoid spurious reflections at that interface. The resulting CPU times are less than 60 percent of those required for a full-blown Navier-Stokes analysis for steady flow applications, and about 60 percent of the Navier-Stokes CPU times for unsteady flows on non-vector processing machines. Applications of the method are presented for a rectangular NACA 0012 wing in low subsonic steady flow at moderate and high angles of attack, and for an F-5 wing in steady and unsteady subsonic and transonic flows. Steady surface pressures are in very good agreement with experimental data and are essentially identical to Navier-Stokes predictions. Density contours show that shocks cross the viscous/inviscid interface smoothly, so that the accuracy of the full Navier-Stokes equations can be retained with a significant savings in computational time.

  5. Bending and stretching finite element analysis of anisotropic viscoelastic composite plates

    NASA Technical Reports Server (NTRS)

    Hilton, Harry H.; Yi, Sung

    1990-01-01

    Finite element algorithms have been developed to analyze linear anisotropic viscoelastic plates, with or without holes, subjected to mechanical (bending, tension), temperature, and hygrothermal loadings. The analysis is based on Laplace transforms rather than direct time integrations in order to improve the accuracy of the results and save on extensive computational time and storage. The time dependent displacement fields in the transverse direction for the cross ply and angle ply laminates are calculated and the stacking sequence effects of the laminates are discussed in detail. Creep responses for the plates with or without a circular hole are also studied. The numerical results compare favorably with analytical solutions, i.e., within 1.8 percent for bending and 10(exp -3) percent for tension. The tension results of the present method are compared with those using the direct time integration scheme.

  6. A fast sequence assembly method based on compressed data structures.

    PubMed

    Liang, Peifeng; Zhang, Yancong; Lin, Kui; Hu, Jinglu

    2014-01-01

    Assembling a large genome using next generation sequencing reads requires large computer memory and a long execution time. To reduce these requirements, a memory- and time-efficient assembler, called FMJ-Assembler, is presented by applying the FM-index in JR-Assembler, where FM stands for the FMR-index derived from the FM-index and the BWT, and J for jumping extension. The FMJ-Assembler uses an expanded FM-index and the BWT to compress read data to save memory, and the jumping extension method makes it faster in CPU time. An extensive comparison of the FMJ-Assembler with current assemblers shows that the FMJ-Assembler achieves a better or comparable overall assembly quality and requires lower memory use and less CPU time. All these advantages of the FMJ-Assembler indicate that the FMJ-Assembler will be an efficient assembly method in next generation sequencing technology.
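
    The query at the core of any FM-index-based tool is backward search over the Burrows-Wheeler transform. The toy below builds the index naively (via a suffix array) and counts pattern occurrences; production assemblers use compressed rank structures instead, and none of this code is from the paper.

      # Toy FM-index backward search: count occurrences of a pattern.
      def bwt_index(text):
          text += "$"
          sa = sorted(range(len(text)), key=lambda i: text[i:])  # suffix array
          bwt = "".join(text[i - 1] for i in sa)                 # BWT string
          # C[c]: number of characters in text strictly smaller than c.
          C, total = {}, 0
          for c in sorted(set(text)):
              C[c] = total
              total += text.count(c)
          return bwt, C

      def occ(bwt, c, i):
          """Occurrences of character c in bwt[:i] (a slow rank query)."""
          return bwt[:i].count(c)

      def backward_search(bwt, C, pattern):
          lo, hi = 0, len(bwt)              # current suffix-array interval
          for c in reversed(pattern):       # extend the match one char at a time
              if c not in C:
                  return 0
              lo = C[c] + occ(bwt, c, lo)
              hi = C[c] + occ(bwt, c, hi)
              if lo >= hi:
                  return 0
          return hi - lo                    # number of occurrences

      bwt, C = bwt_index("ACGTACGTACCGT")
      print(backward_search(bwt, C, "ACG"))  # -> 2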

  7. Evaluation of subgrid-scale turbulence models using a fully simulated turbulent flow

    NASA Technical Reports Server (NTRS)

    Clark, R. A.; Ferziger, J. H.; Reynolds, W. C.

    1977-01-01

    An exact turbulent flow field was calculated on a three-dimensional grid with 64 points on a side. The flow simulates grid-generated turbulence from wind tunnel experiments. In this simulation, the grid spacing is small enough to include essentially all of the viscous energy dissipation, and the box is large enough to contain the largest eddy in the flow. The method is limited to low turbulence Reynolds numbers, in our case R_λ = 36.6. To complete the calculation using a reasonable amount of computer time with reasonable accuracy, a third-order time-integration scheme was developed which runs at about the same speed as a simple first-order scheme. It obtains this accuracy by saving the velocity field and its first time derivative at each time step. Fourth-order accurate space-differencing is used.

  8. Understanding survival analysis: Kaplan-Meier estimate.

    PubMed

    Goel, Manish Kumar; Khanna, Pardeep; Kishore, Jugal

    2010-10-01

    The Kaplan-Meier estimate is one of the best options for measuring the fraction of subjects living for a certain amount of time after treatment. In clinical trials or community trials, the effect of an intervention is assessed by measuring the number of subjects who survived or were saved by that intervention over a period of time. The time starting from a defined point to the occurrence of a given event, for example death, is called the survival time, and the analysis of such group data is called survival analysis. The analysis can be complicated by subjects who are uncooperative and refuse to remain in the study, by subjects who do not experience the event or death before the end of the study although they would have if observation had continued, or by subjects with whom we lose touch midway through the study. We label these situations censored observations. The Kaplan-Meier estimate is the simplest way of computing survival over time in spite of all these difficulties associated with subjects or situations. The survival curve can be created assuming various situations. It involves computing the probability of occurrence of the event at a certain point in time and multiplying these successive probabilities by any earlier computed probabilities to get the final estimate. This can be calculated for two groups of subjects, and the statistical difference between their survival curves can also be tested. This can be used in Ayurveda research when comparing two drugs and assessing the survival of subjects.
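
    The product rule in the passage above takes only a few lines to implement. A minimal sketch on made-up (time, event) data, where event 1 is a death and event 0 a censored observation:

      # Minimal Kaplan-Meier estimate following the product rule; data invented.
      data = [(2, 1), (3, 0), (4, 1), (4, 1), (5, 0), (7, 1), (9, 0)]

      at_risk = len(data)
      survival = 1.0
      for t in sorted({t for t, e in data}):
          deaths = sum(1 for ti, ei in data if ti == t and ei == 1)
          if deaths:
              survival *= 1.0 - deaths / at_risk   # multiply successive probabilities
              print(f"t={t}: S(t)={survival:.3f}")
          at_risk -= sum(1 for ti, ei in data if ti == t)  # deaths and censorings leave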

  9. Identification of Linear and Nonlinear Aerodynamic Impulse Responses Using Digital Filter Techniques

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.

    1997-01-01

    This paper discusses the mathematical existence and the numerically-correct identification of linear and nonlinear aerodynamic impulse response functions. Differences between continuous-time and discrete-time system theories, which permit the identification and efficient use of these functions, will be detailed. Important input/output definitions and the concept of linear and nonlinear systems with memory will also be discussed. It will be shown that indicial (step or steady) responses (such as Wagner's function), forced harmonic responses (such as Theodorsen's function or those from doublet lattice theory), and responses to random inputs (such as gusts) can all be obtained from an aerodynamic impulse response function. This paper establishes the aerodynamic impulse response function as the most fundamental, and, therefore, the most computationally efficient, aerodynamic function that can be extracted from any given discrete-time, aerodynamic system. The results presented in this paper help to unify the understanding of classical two-dimensional continuous-time theories with modern three-dimensional, discrete-time theories. First, the method is applied to the nonlinear viscous Burger's equation as an example. Next the method is applied to a three-dimensional aeroelastic model using the CAP-TSD (Computational Aeroelasticity Program - Transonic Small Disturbance) code and then to a two-dimensional model using the CFL3D Navier-Stokes code. Comparisons of accuracy and computational cost savings are presented. Because of its mathematical generality, an important attribute of this methodology is that it is applicable to a wide range of nonlinear, discrete-time problems.
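
    The central claim, that one identified impulse response yields the response to any input by convolution, is easy to demonstrate in discrete time. The first-order decaying impulse response below is a made-up stand-in for an identified aerodynamic kernel:

      # Discrete-time illustration: the response to any input is a convolution
      # with the identified impulse response h; h here is an invented toy kernel.
      import numpy as np

      n = 200
      h = 0.3 * 0.9 ** np.arange(n)            # identified impulse response (toy)

      def respond(u):
          """Response to an arbitrary input via convolution with h."""
          return np.convolve(u, h)[: len(u)]

      step = np.ones(n)                         # indicial (step) input
      gust = np.random.default_rng(0).standard_normal(n)   # random (gust) input
      print(respond(step)[-1])                  # steady value, 0.3/(1-0.9) = 3.0
      print(respond(gust)[:3])                  # response to the random input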

  10. Identification of Linear and Nonlinear Aerodynamic Impulse Responses Using Digital Filter Techniques

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.

    1997-01-01

    This paper discusses the mathematical existence and the numerically-correct identification of linear and nonlinear aerodynamic impulse response functions. Differences between continuous-time and discrete-time system theories, which permit the identification and efficient use of these functions, will be detailed. Important input/output definitions and the concept of linear and nonlinear systems with memory will also be discussed. It will be shown that indicial (step or steady) responses (such as Wagner's function), forced harmonic responses (such as Theodorsen's function or those from doublet lattice theory), and responses to random inputs (such as gusts) can all be obtained from an aerodynamic impulse response function. This paper establishes the aerodynamic impulse response function as the most fundamental, and, therefore, the most computationally efficient, aerodynamic function that can be extracted from any given discrete-time, aerodynamic system. The results presented in this paper help to unify the understanding of classical two-dimensional continuous-time theories with modern three-dimensional, discrete-time theories. First, the method is applied to the nonlinear viscous Burger's equation as an example. Next the method is applied to a three-dimensional aeroelastic model using the CAP-TSD (Computational Aeroelasticity Program - Transonic Small Disturbance) code and then to a two-dimensional model using the CFL3D Navier-Stokes code. Comparisons of accuracy and computational cost savings are presented. Because of its mathematical generality, an important attribute of this methodology is that it is applicable to a wide range of nonlinear, discrete-time problems.

  11. Determining the Cost-Savings Threshold and Alignment Accuracy of Patient-Specific Instrumentation in Total Ankle Replacements.

    PubMed

    Hamid, Kamran S; Matson, Andrew P; Nwachukwu, Benedict U; Scott, Daniel J; Mather, Richard C; DeOrio, James K

    2017-01-01

    Traditional intraoperative referencing for total ankle replacements (TARs) involves multiple steps and fluoroscopic guidance to determine mechanical alignment. Recent adoption of patient-specific instrumentation (PSI) allows for referencing to be determined preoperatively, resulting in fewer steps and potentially decreased operative time. We hypothesized that usage of PSI would result in decreased operating room time that would offset the additional cost of PSI compared with standard referencing (SR). In addition, we aimed to compare postoperative radiographic alignment between PSI and SR. Between August 2014 and September 2015, 87 patients undergoing TAR were enrolled in a prospectively collected TAR database. Patients were divided into cohorts based on PSI vs SR, and operative times were reviewed. Radiographic alignment parameters were retrospectively measured at 6 weeks postoperatively. Time-driven activity-based costing (TDABC) was used to derive direct costs. Cost vs operative time-savings were examined via 2-way sensitivity analysis to determine cost-saving thresholds for PSI applicable to a range of institution types. Cost-saving thresholds defined the price of PSI below which PSI would be cost-saving. A total of 35 PSI and 52 SR cases were evaluated with no significant differences identified in patient characteristics. Operative time from incision to completion of casting in cases without adjunct procedures was 127 minutes with PSI and 161 minutes with SR (P < .05). PSI demonstrated similar postoperative accuracy to SR in coronal tibial-plafond alignment (1.1 vs 0.3 degrees varus, P = .06), tibial-plafond alignment (0.3 ± 2.1 vs 1.1 ± 2.1 degrees varus, P = .06), and tibial component sagittal alignment (0.7 vs 0.9 degrees plantarflexion, P = .14). The TDABC method estimated a PSI cost-savings threshold range at our institution of $863 below which PSI pricing would provide net cost-savings. Two-way sensitivity analysis generated a globally applicable cost-savings threshold model based on institution-specific costs and surgeon-specific time-savings. This study demonstrated equivalent postoperative TAR alignment with PSI and SR referencing systems but with a significant decrease in operative time with PSI. Based on TDABC and associated sensitivity analysis, a cost-savings threshold of $863 was identified for PSI pricing at our institution below which PSI was less costly than SR. Similar internal cost accounting may benefit health care systems for identifying cost drivers and obtaining leverage during price negotiations. Level III, therapeutic study.

  12. I-SAVE: AN INTERACTIVE REAL-TIME MONITOR AND CONTROLLER TO INFLUENCE ENERGY CONSERVATION BEHAVIOR BY IMPULSE SAVING

    EPA Science Inventory

    Simulation-based model to explore the benefits of monitoring and control to energy saving opportunities in residential homes; an adaptive algorithm to predict the type of electrical loads; a prototype user friendly interface monitoring and control device to save energy; a p...

  13. A Parallel Pipelined Renderer for the Time-Varying Volume Data

    NASA Technical Reports Server (NTRS)

    Chiueh, Tzi-Cker; Ma, Kwan-Liu

    1997-01-01

    This paper presents a strategy for efficiently rendering time-varying volume data sets on a distributed-memory parallel computer. Time-varying volume data take up large storage space, and visualizing them requires reading large files continuously or periodically throughout the course of the visualization process. Instead of using all the processors to collectively render one volume at a time, a pipelined rendering process is formed by partitioning processors into groups to render multiple volumes concurrently. In this way, the overall rendering time may be greatly reduced because the pipelined rendering tasks are overlapped with the I/O required to load each volume into a group of processors; moreover, parallelization overhead may be reduced as a result of partitioning the processors. We modify an existing parallel volume renderer to exploit various levels of rendering parallelism and to study how the partitioning of processors may lead to optimal rendering performance. Two factors which are important to the overall execution time are resource utilization efficiency and pipeline startup latency. The optimal partitioning configuration is the one that balances these two factors. Tests on Intel Paragon computers show that in general optimal partitionings do exist for a given rendering task and result in 40-50% savings in overall rendering time.
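
    The utilization-versus-startup trade-off lends itself to a tiny timing model. The sketch below assumes double-buffered I/O within each group and near-linear render speedup; every number is invented for illustration and is not from the paper.

      # Tiny timing model of the partitioning trade-off: g groups of P/g
      # processors each render volumes while their next volume loads from disk.
      P, volumes, t_io, t_render_1cpu = 64, 32, 10.0, 120.0

      def total_time(g):
          render = t_render_1cpu / (P // g)    # per-volume render time per group
          # Each group handles volumes/g volumes; after the first volume, the
          # group advances at the slower of the I/O and render rates.
          return t_io + render + (volumes / g - 1) * max(t_io, render)

      for g in (1, 2, 4, 8, 16, 32):
          print(g, total_time(g))              # the optimum balances I/O and render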

  14. Improving value of travel time savings estimation for more effective transportation project evaluation.

    DOT National Transportation Integrated Search

    2012-12-01

    Estimates of value of time (VOT) and value of travel time savings (VTTS) are critical elements in benefit-cost analyses of transportation projects and in developing congestion pricing policies. In addition, differences in VTTS among various modes ...

  15. First-Order Frameworks for Managing Models in Engineering Optimization

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia M.; Lewis, Robert Michael

    2000-01-01

    Approximation/model management optimization (AMMO) is a rigorous methodology for attaining solutions of high-fidelity optimization problems with minimal expense in high-fidelity function and derivative evaluation. First-order AMMO frameworks allow for a wide variety of models and underlying optimization algorithms. Recent demonstrations with aerodynamic optimization achieved three-fold savings in terms of high-fidelity function and derivative evaluation in the case of variable-resolution models and five-fold savings in the case of variable-fidelity physics models. The savings are problem dependent but certain trends are beginning to emerge. We give an overview of the first-order frameworks, current computational results, and an idea of the scope of the first-order framework applicability.

  16. Controlling the error on target motion through real-time mesh adaptation: Applications to deep brain stimulation.

    PubMed

    Bui, Huu Phuoc; Tomar, Satyendra; Courtecuisse, Hadrien; Audette, Michel; Cotin, Stéphane; Bordas, Stéphane P A

    2018-05-01

    An error-controlled mesh refinement procedure for needle insertion simulations is presented. As an example, the procedure is applied to simulations of electrode implantation for deep brain stimulation. We take into account the brain shift phenomena occurring when a craniotomy is performed. We observe that the error in the computation of the displacement and stress fields is localised around the needle tip and the needle shaft during needle insertion simulation. By suitably and adaptively refining the mesh in this region, our approach enables us to control, and thus to reduce, the error whilst maintaining a coarser mesh in other parts of the domain. Through academic and practical examples we demonstrate that our adaptive approach, as compared with a uniform coarse mesh, increases the accuracy of the displacement and stress fields around the needle shaft and, for a given accuracy, saves computational time with respect to a uniform finer mesh. This facilitates real-time simulations. The proposed methodology has direct implications in increasing the accuracy, and controlling the computational expense, of the simulation of percutaneous procedures such as biopsy, brachytherapy, regional anaesthesia, or cryotherapy. Moreover, the proposed approach can be helpful in the development of robotic surgeries because the simulation taking place in the control loop of a robot needs to be accurate and to occur in real time. Copyright © 2018 John Wiley & Sons, Ltd.

  17. Results of Computer Based Training.

    ERIC Educational Resources Information Center

    1978

    This report compares the projected savings of using computer-based training to conduct training for newly hired pilots to the actual results of that application. New Hire training, one of a number of programs conducted continuously at the United Airline Flight Operations Training Center, is designed to assure that any newly hired pilot will be able to…

  18. Expanding HPC and Research Computing--The Sustainable Way

    ERIC Educational Resources Information Center

    Grush, Mary

    2009-01-01

    Increased demands for research and high-performance computing (HPC)--along with growing expectations for cost and environmental savings--are putting new strains on the campus data center. More and more, CIOs like the University of Notre Dame's (Indiana) Gordon Wishon are seeking creative ways to build more sustainable models for data center and…

  19. Drowning in PC Management: Could a Linux Solution Save Us?

    ERIC Educational Resources Information Center

    Peters, Kathleen A.

    2004-01-01

    Short on funding and IT staff, a Western Canada library struggled to provide adequate public computing resources. Staff turned to a Linux-based solution that supports up to 10 users from a single computer, and blends Web browsing and productivity applications with session management, Internet filtering, and user authentication. In this article,…

  20. Economics of Computing: The Case of Centralized Network File Servers.

    ERIC Educational Resources Information Center

    Solomon, Martin B.

    1994-01-01

    Discusses computer networking and the cost effectiveness of decentralization, including local area networks. A planned experiment with a centralized approach to the operation and management of file servers at the University of South Carolina is described that hopes to realize cost savings and the avoidance of staffing problems. (Contains four…

  1. Kinetic energy classification and smoothing for compact B-spline basis sets in quantum Monte Carlo

    DOE PAGES

    Krogel, Jaron T.; Reboredo, Fernando A.

    2018-01-25

    Quantum Monte Carlo calculations of defect properties of transition metal oxides have become feasible in recent years due to increases in computing power. As the system size has grown, availability of on-node memory has become a limiting factor. Saving memory while minimizing computational cost is now a priority. The main growth in memory demand stems from the B-spline representation of the single particle orbitals, especially for heavier elements such as transition metals where semi-core states are present. Despite the associated memory costs, splines are computationally efficient. In this paper, we explore alternatives to reduce the memory usage of splined orbitals without significantly affecting numerical fidelity or computational efficiency. We make use of the kinetic energy operator to both classify and smooth the occupied set of orbitals prior to splining. By using a partitioning scheme based on the per-orbital kinetic energy distributions, we show that memory savings of about 50% are possible for select transition metal oxide systems. Finally, for production supercells of practical interest, our scheme incurs a performance penalty of less than 5%.

  2. [Introduction of a bar coding pharmacy stock replenishment system in a prehospital emergency medical unit: economical impact].

    PubMed

    Dupuis, S; Fecci, J-L; Noyer, P; Lecarpentier, E; Chollet-Xémard, C; Margenet, A; Marty, J; Combes, X

    2009-01-01

    To assess the economic impact of introducing a bar-code pharmacy stock replenishment system in a prehospital emergency medical unit. Observational before-and-after study. A computer system using specific software and bar-code technology was introduced in the prehospital emergency medical unit (Smur). Overall activity and pharmacy-related costs were recorded annually during two periods: the first, the two years before the computer system was introduced; the second, the four years following its installation. Overall clinical activity increased by 10% between the two periods, whereas pharmacy-related costs decreased continuously after the pharmacy management computer system came into use. Pharmacy stock management was easier after introduction of the new stock replenishment system. The mean pharmacy-related cost per patient was 13 Euros before and 9 Euros after the introduction of the system. The overall cost savings during the study period were calculated to reach 134,000 Euros. The introduction of a specific pharmacy management computer system yielded substantial cost savings in a prehospital emergency medical unit.

  3. Extension of a nonlinear systems theory to general-frequency unsteady transonic aerodynamic responses

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.

    1993-01-01

    A methodology for modeling nonlinear unsteady aerodynamic responses, for subsequent use in aeroservoelastic analysis and design, using the Volterra-Wiener theory of nonlinear systems is presented. The methodology is extended to predict nonlinear unsteady aerodynamic responses of arbitrary frequency. The Volterra-Wiener theory uses multidimensional convolution integrals to predict the response of nonlinear systems to arbitrary inputs. The CAP-TSD (Computational Aeroelasticity Program - Transonic Small Disturbance) code is used to generate linear and nonlinear unit impulse responses that correspond to each of the integrals for a rectangular wing with a NACA 0012 section with pitch and plunge degrees of freedom. The computed kernels are then used to predict linear and nonlinear unsteady aerodynamic responses via convolution, and the predictions are compared to responses obtained using the CAP-TSD code directly. The results indicate that the approach can predict linear unsteady aerodynamic responses exactly, for any input amplitude or frequency, at a significant cost savings. Convolution of the nonlinear terms yields nonlinear unsteady aerodynamic responses that compare reasonably well with those computed using the CAP-TSD code directly, again at a significant computational cost savings.
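
    The linear (first-order) part of the approach reduces to ordinary convolution of a unit-impulse response with an arbitrary input, which a toy discrete system illustrates (a sketch; the recursive system below stands in for CAP-TSD and is purely an assumption):

        import numpy as np

        # Toy discrete LTI "aerodynamic" system: y[n] = a*y[n-1] + b*u[n].
        a, b, n = 0.9, 0.1, 200

        def simulate(u):
            # Direct time-marching "code", the expensive reference path.
            y = np.zeros(len(u))
            for i in range(len(u)):
                y[i] = a * (y[i - 1] if i else 0.0) + b * u[i]
            return y

        h = simulate(np.eye(n)[0])           # unit-impulse response (one run)
        u = np.sin(0.05 * np.arange(n))      # arbitrary input
        y_conv = np.convolve(u, h)[:n]       # predicted cheaply via convolution
        y_direct = simulate(u)               # direct simulation for comparison
        print("max error:", np.abs(y_conv - y_direct).max())

    For a linear system the convolution prediction matches the direct run to machine precision; the paper's higher-order Volterra kernels extend the same idea to the nonlinear terms.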

  4. Software for Testing Electroactive Structural Components

    NASA Technical Reports Server (NTRS)

    Moses, Robert W.; Fox, Robert L.; Dimery, Archie D.; Bryant, Robert G.; Shams, Qamar

    2003-01-01

    A computer program generates a graphical user interface that, in combination with its other features, facilitates the acquisition and preprocessing of experimental data on the strain response, hysteresis, and power consumption of a multilayer composite-material structural component containing one or more built-in sensor(s) and/or actuator(s) based on piezoelectric materials. This program runs in conjunction with LabVIEW software in a computer-controlled instrumentation system. For a test, a specimen is instrumented with applied-voltage and current sensors and with strain gauges. Once the computational connection to the test setup has been made via the LabVIEW software, this program causes the test instrumentation to step through specified configurations. If the user is satisfied with the test results as displayed by the software, the user activates an icon on a front-panel display, causing the raw current, voltage, and strain data to be digitized and saved. The data are also put into a spreadsheet and can be plotted on a graph. Graphical displays are saved in an image file for future reference. The program also computes and displays the power and the phase angle between voltage and current.
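
    The last step, computing power and the voltage-current phase angle from sampled waveforms, can be sketched as follows (an illustrative Python stand-in, not the LabVIEW program; the sampling rate, drive frequency, and signals are assumed):

        import numpy as np

        fs, f0, n = 4096.0, 60.0, 4096       # sample rate, drive freq, samples
        t = np.arange(n) / fs
        v = 5.0 * np.sin(2 * np.pi * f0 * t)              # excitation voltage
        i = 0.2 * np.sin(2 * np.pi * f0 * t - np.pi / 6)  # current lags 30 deg

        # Phase angle from the spectra at the drive-frequency bin.
        k = int(round(f0 * n / fs))          # f0 falls exactly on bin k here
        V, I = np.fft.rfft(v)[k], np.fft.rfft(i)[k]
        phase = np.angle(V) - np.angle(I)
        real_power = np.mean(v * i)          # average (real) power
        print(f"phase = {np.degrees(phase):.1f} deg, P = {real_power:.3f} W")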

  5. Kinetic energy classification and smoothing for compact B-spline basis sets in quantum Monte Carlo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krogel, Jaron T.; Reboredo, Fernando A.

    Quantum Monte Carlo calculations of defect properties of transition metal oxides have become feasible in recent years due to increases in computing power. As the system size has grown, availability of on-node memory has become a limiting factor. Saving memory while minimizing computational cost is now a priority. The main growth in memory demand stems from the B-spline representation of the single particle orbitals, especially for heavier elements such as transition metals where semi-core states are present. Despite the associated memory costs, splines are computationally efficient. In this paper, we explore alternatives to reduce the memory usage of splined orbitals without significantly affecting numerical fidelity or computational efficiency. We make use of the kinetic energy operator to both classify and smooth the occupied set of orbitals prior to splining. By using a partitioning scheme based on the per-orbital kinetic energy distributions, we show that memory savings of about 50% is possible for select transition metal oxide systems. Finally, for production supercells of practical interest, our scheme incurs a performance penalty of less than 5%.

  6. Kinetic energy classification and smoothing for compact B-spline basis sets in quantum Monte Carlo

    NASA Astrophysics Data System (ADS)

    Krogel, Jaron T.; Reboredo, Fernando A.

    2018-01-01

    Quantum Monte Carlo calculations of defect properties of transition metal oxides have become feasible in recent years due to increases in computing power. As the system size has grown, availability of on-node memory has become a limiting factor. Saving memory while minimizing computational cost is now a priority. The main growth in memory demand stems from the B-spline representation of the single particle orbitals, especially for heavier elements such as transition metals where semi-core states are present. Despite the associated memory costs, splines are computationally efficient. In this work, we explore alternatives to reduce the memory usage of splined orbitals without significantly affecting numerical fidelity or computational efficiency. We make use of the kinetic energy operator to both classify and smooth the occupied set of orbitals prior to splining. By using a partitioning scheme based on the per-orbital kinetic energy distributions, we show that memory savings of about 50% is possible for select transition metal oxide systems. For production supercells of practical interest, our scheme incurs a performance penalty of less than 5%.

  7. Theorems on symmetries and flux conservation in radiative transfer using the matrix operator theory.

    NASA Technical Reports Server (NTRS)

    Kattawar, G. W.

    1973-01-01

    The matrix operator approach to radiative transfer is shown to be a very powerful technique for establishing symmetry relations for multiple scattering in inhomogeneous atmospheres. Symmetries are derived for the reflection and transmission operators using only the symmetry of the phase function. These results mean large savings in computer time and storage when performing calculations for realistic planetary atmospheres with this method. The results have also been extended to establish a condition on the reflection matrix of a boundary in order to preserve reciprocity. Finally, energy conservation is rigorously proven for conservative scattering in inhomogeneous atmospheres.

  8. A Boon for the Architect Engineer

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Langley Research Center's need for an improved construction specification system led to an automated system called SPECSINTACT. A catalog of specifications, the system enables designers to retrieve relevant sections from computer storage and modify them as needed. SPECSINTACT has also been adopted by other government agencies. The system is an integral part of the Construction Criteria Base (CCB), a single disc containing design and construction information for 10 government agencies, including the American Institute of Architects' MASTERSPEC. CCB employs CD-ROM technology and is available from the National Institute of Building Sciences. Users report substantial savings in time and productivity.

  9. EBR-II and TREAT Digitization Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Griffith, George W.; Rabiti, Cristian

    2015-09-01

    Digitizing the technical drawings for EBR-II and TREAT provides multiple benefits. Moving the scanned or hard-copy drawings to a modern 3-D CAD (computer-aided design) format preserves data that could otherwise be lost over time. The 3-D drawings produce models that can be combined with other drawings into complex assemblies. The 3-D CAD format can also include detailed material properties and parametric coding that ties critical dimensions together, allowing easier modification. Creating the new files from the old drawings has uncovered multiple inconsistencies, which are being flagged or corrected, improving understanding of the reactors.

  10. Shaggy aorta syndrome after acute arterial macroembolism: report of a case.

    PubMed

    Hayashida, Naoki; Murayama, Hirokazu; Pearce, Yoko; Asano, Souichi; Ohashi, Yukio; Kohno, Hiroki; Handa, Takemi; Matsuo, Kozo; Nakagawa, Yasutsugu; Tatsuno, Katsuhiko

    2004-01-01

    We report the case of a patient who underwent treatment for a macroembolism in the right lower leg, which led to shaggy aorta syndrome. Anticoagulant therapy for the macroembolism and intra-aortic catheterization worsened the patient's renal function and triggered another massive microembolization of the visceral arteries, with a fatal outcome. To minimize the incremental complications inherent to this syndrome, awareness and prompt diagnosis with enhanced computed tomography or intravenous digital subtraction aortography are essential. Axillo-bifemoral bypass with bilateral external iliac artery ligations, performed with optimal timing, could save patients with shaggy aorta syndrome.

  11. Adaptive implicit-explicit and parallel element-by-element iteration schemes

    NASA Technical Reports Server (NTRS)

    Tezduyar, T. E.; Liou, J.; Nguyen, T.; Poole, S.

    1989-01-01

    Adaptive implicit-explicit (AIE) and grouped element-by-element (GEBE) iteration schemes are presented for the finite element solution of large-scale problems in computational mechanics and physics. The AIE approach is based on the dynamic arrangement of the elements into differently treated groups. The GEBE procedure, which rewrites the EBE formulation to make its parallel-processing potential and implementation clearer, is based on the static arrangement of the elements into groups with no inter-element coupling within each group. Various numerical tests demonstrate the savings in CPU time and memory.
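
    The static grouping underlying GEBE can be sketched as a greedy partition in which no two elements of a group share a node (toy element and node lists assumed purely for illustration):

        # Toy element list: each element is the tuple of node ids it touches.
        elements = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (0, 2), (3, 5)]

        groups = []                  # list of (nodes_in_use, element_ids)
        for eid, nodes in enumerate(elements):
            for used, members in groups:
                if not used & set(nodes):     # no shared node: no coupling
                    used |= set(nodes)
                    members.append(eid)
                    break
            else:
                groups.append((set(nodes), [eid]))

        for g, (_, members) in enumerate(groups):
            print(f"group {g} (parallel-safe): elements {members}")

    Because elements within a group touch disjoint node sets, their contributions can be processed in parallel without write conflicts, which is the parallel-processing potential the abstract refers to.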

  12. [Research on a non-invasive pulse wave detection and analysis system].

    PubMed

    Li, Ting; Yu, Gang

    2008-10-01

    A novel non-invasive pulse wave detection and analysis system, comprising both software and hardware, has been developed. Bi-channel signals can be acquired, stored, and displayed on screen dynamically and simultaneously. Pulse waves can be redisplayed and printed after pulse wave analysis and pulse wave velocity analysis. The system comprises a computer designed for fast data saving, analysis and processing, and a portable data-sampling unit based on a single-chip microcontroller. Experimental results have shown that the system is stable and easy to use, and that the parameters are calculated accurately.

  13. Interactive program for analysis and design problems in advanced composites technology

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Swedlow, J. L.

    1971-01-01

    During the past year an experimental program in the fracture of advanced fiber composites has been completed. The experimental program has given direction to additional experimental and theoretical work. A synthesis program for designing low weight multifastener joints in composites is proposed, based on extensive analytical background. A number of failed joints have been thoroughly analyzed to evaluate the failure hypothesis used in the synthesis procedure. Finally, a new solution is reported for isotropic and anisotropic laminates using the boundary-integral method. The solution method offers significant savings of computer core and time for important problems.

  14. Computer Labs Get Rebooted as Lounges: New Gathering Places for Laptop Users Help Colleges Save on Upkeep

    ERIC Educational Resources Information Center

    Terris, Ben

    2010-01-01

    Colleges are looking for ways to cut costs, and most students now own laptops. As a result, many campus technology leaders are taking a hard look at those brightly lit rooms with rows of networked computers, which cost hundreds of thousands of dollars a year to maintain. More than 11% of colleges and universities are phasing out computer labs or…

  15. A modeling approach to energy savings of flying Canada geese using computational fluid dynamics.

    PubMed

    Maeng, Joo-Sung; Park, Jae-Hyung; Jang, Seong-Min; Han, Seog-Young

    2013-03-07

    A flapping flight mechanism of the Canada goose (Branta canadensis) was modelled using a two-jointed arm under unsteady aerodynamic conditions to examine how much energy can be saved in migration. Computational fluid dynamics (CFD) was used to evaluate airflow fields around the wing and in the wake. From the distributions of velocity and pressure on the wing, it was found that about 15% of goose flight energy could be saved by drag reduction from changing the morphology of the wing. From the airflow field in the wake, it was found that a pair of three-dimensional spiral flapping advantage vortices (FAV) was alternately generated. We quantitatively deduced that the optimal depth (the distance along the flight path between birds) was around 4 m from the wing tip of the goose ahead, and that the optimal wing tip spacing (WTS, the distance between wing tips of adjacent birds perpendicular to the flight path) ranged between 0 and -0.40 m in the spanwise section. It was found that a goose behind can save about 16% of its energy through induced power from FAV in V-formation. The phase difference of flapping between the goose ahead and the one behind was estimated at around 90.7° to take full aerodynamic benefit of the FAV. Copyright © 2012 Elsevier Ltd. All rights reserved.

  16. Electrode channel selection based on backtracking search optimization in motor imagery brain-computer interfaces.

    PubMed

    Dai, Shengfa; Wei, Qingguo

    2017-01-01

    The common spatial pattern algorithm is widely used to estimate spatial filters in motor imagery based brain-computer interfaces. However, using a large number of channels makes common spatial pattern prone to over-fitting and makes the classification of electroencephalographic signals time-consuming. To overcome these problems, it is necessary to choose an optimal subset of the whole channel set, to save computational time and improve classification accuracy. In this paper, a novel method based on the backtracking search optimization algorithm is proposed to automatically select the optimal channel set for common spatial pattern. Each individual in the population is an N-dimensional binary vector, with each component representing one channel. A population of binary codes is generated randomly at the beginning, and channels are then selected according to the evolution of these codes. The number and positions of 1's in a code denote the number and positions of the chosen channels. The objective function of the backtracking search optimization algorithm is defined as a combination of the classification error rate and the relative number of channels. Experimental results suggest that higher classification accuracy can be achieved with far fewer channels than standard common spatial pattern using all channels.
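
    A condensed sketch of such a selection loop follows (the mock classifier and the simple bit-flip variation are assumptions standing in for the CSP pipeline and for BSA's historical-population operators, which are detailed in the paper):

        import numpy as np

        rng = np.random.default_rng(0)
        N_CH, POP, GENS, LAM = 22, 30, 40, 0.2

        def error_rate(mask):
            # Mock CSP+classifier error: pretend channels 3..8 carry the
            # discriminative motor-imagery information.
            informative = np.zeros(N_CH, dtype=bool)
            informative[3:9] = True
            return (0.5 - 0.07 * np.sum(mask & informative)
                    + 0.01 * np.sum(mask & ~informative))

        def fitness(mask):
            # Objective: error rate plus the relative number of channels.
            if not mask.any():
                return np.inf               # at least one channel required
            return error_rate(mask) + LAM * mask.sum() / N_CH

        pop = rng.random((POP, N_CH)) < 0.5     # random binary population
        for _ in range(GENS):
            trial = pop ^ (rng.random(pop.shape) < 0.05)  # bit-flip variation
            for i in range(POP):                # greedy one-to-one selection
                if fitness(trial[i]) < fitness(pop[i]):
                    pop[i] = trial[i]

        best = min(pop, key=fitness)
        print("channels:", np.where(best)[0], "fitness:", round(fitness(best), 3))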

  17. A Kirchhoff approach to seismic modeling and prestack depth migration

    NASA Astrophysics Data System (ADS)

    Liu, Zhen-Yue

    1993-05-01

    The Kirchhoff integral provides a robust method for implementing seismic modeling and prestack depth migration that can handle lateral velocity variation and turning waves. With little extra computational cost, Kirchhoff-type migration can produce multiple outputs that have the same phase but different amplitudes, unlike other migration methods; the ratio of these amplitudes is helpful in computing quantities such as the reflection angle. I develop a seismic modeling and prestack depth migration method, based on the Kirchhoff integral, that handles both laterally varying velocity and dips beyond 90 degrees. The method uses a finite-difference algorithm to calculate travel times and WKBJ amplitudes for the Kirchhoff integral. Compared to ray-tracing algorithms, the finite-difference algorithm gives an efficient implementation and single-valued (first-arrival) outputs. In my finite-difference algorithm, an upwind scheme is used to calculate travel times, and the Crank-Nicolson scheme is used to calculate amplitudes. Moreover, interpolation is applied to reduce computational cost. The modeling and migration algorithms require a smooth velocity function, so I develop a velocity-smoothing technique based on damped least-squares to aid in obtaining a successful migration.
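
    The diffraction-stack form of Kirchhoff migration can be illustrated with a constant-velocity toy example (a sketch only; the paper computes travel times and WKBJ amplitudes by finite differences, whereas analytic times and a synthetic point diffractor are assumed here):

        import numpy as np

        V = 2000.0                          # constant velocity, m/s (assumed)
        dt, n_t = 0.002, 500                # trace sampling
        recv_x = np.arange(0, 1000.0, 20)   # zero-offset receiver positions, m

        # Synthetic zero-offset data: one point diffractor at (500 m, 300 m).
        data = np.zeros((len(recv_x), n_t))
        for r, xr in enumerate(recv_x):
            t = 2.0 * np.hypot(xr - 500.0, 300.0) / V     # two-way time
            data[r, min(int(round(t / dt)), n_t - 1)] = 1.0

        # Migration: sum the data along each image point's diffraction curve.
        xs = np.arange(0, 1000.0, 20)
        zs = np.arange(20, 600.0, 20)
        image = np.zeros((len(xs), len(zs)))
        for ix, x in enumerate(xs):
            for iz, z in enumerate(zs):
                t = 2.0 * np.hypot(recv_x - x, z) / V
                idx = np.round(t / dt).astype(int)
                ok = idx < n_t
                image[ix, iz] = data[np.arange(len(recv_x))[ok], idx[ok]].sum()

        ix, iz = np.unravel_index(image.argmax(), image.shape)
        print("diffractor imaged near x =", xs[ix], "m, z =", zs[iz], "m")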

  18. Time-varying value of electric energy efficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mims, Natalie A.; Eckman, Tom; Goldman, Charles

    Electric energy efficiency resources save energy and may reduce peak demand. Historically, quantification of energy efficiency benefits has largely focused on the economic value of energy savings during the first year and lifetime of the installed measures. Due in part to the lack of publicly available research on end-use load shapes (i.e., the hourly or seasonal timing of electricity savings) and energy savings shapes, consideration of the impact of energy efficiency on peak demand reduction (i.e., capacity savings) has been more limited. End-use load research and the hourly valuation of efficiency savings are used for a variety of electricity planning functions, including load forecasting, demand-side management and evaluation, capacity and demand response planning, long-term resource planning, renewable energy integration, assessing potential grid modernization investments, establishing rates and pricing, and customer service. This study reviews existing literature on the time-varying value of energy efficiency savings, provides examples in four geographically diverse locations of how consideration of the time-varying value of efficiency savings impacts the calculation of power system benefits, and identifies future research needs to enhance the consideration of the time-varying value of energy efficiency in cost-effectiveness screening analysis. Findings from this study include:
    - The time-varying value of individual energy efficiency measures varies across the locations studied because of the physical and operational characteristics of the individual utility system (e.g., summer or winter peaking, load factor, reserve margin) as well as the time periods during which savings from measures occur.
    - Across the four locations studied, some of the largest capacity benefits from energy efficiency are derived from the deferral of transmission and distribution system infrastructure upgrades. However, the deferred cost of such upgrades also exhibited the greatest range in value of all the components of avoided costs across the locations studied.
    - Of the five energy efficiency measures studied, those targeting residential air conditioning in summer-peaking electric systems have the most significant added value when the total time-varying value is considered.
    - The increased use of rooftop solar systems, storage, and demand response, and the addition of electric vehicles and other major new electricity-consuming end uses, are anticipated to significantly alter the load shape of many utility systems in the future. Data used to estimate the impact of energy efficiency measures on electric system peak demands will need to be updated periodically to accurately reflect the value of savings as system load shapes change.
    - Publicly available components of electric system costs avoided through energy efficiency are not uniform across states and utilities. Inclusion or exclusion of these components and differences in their value affect estimates of the time-varying value of energy efficiency.
    - Publicly available data on end-use load and energy savings shapes are limited, are concentrated regionally, and should be expanded.

  19. Optimization Algorithm for Kalman Filter Exploiting the Numerical Characteristics of SINS/GPS Integrated Navigation Systems.

    PubMed

    Hu, Shaoxing; Xu, Shike; Wang, Duhu; Zhang, Aiwu

    2015-11-11

    To address the high computational cost of the traditional Kalman filter in SINS/GPS integrated navigation, a practical optimization algorithm using offline derivation and parallel processing, based on the numerical characteristics of the system, is presented in this paper. The algorithm exploits the sparseness and/or symmetry of matrices to simplify the computational procedure, so that many invalid operations can be avoided through offline derivation using a block-matrix technique. For enhanced efficiency, a new parallel computational mechanism is established by subdividing and restructuring the calculation processes after analyzing the extracted "useful" data. As a result, the algorithm saves about 90% of the CPU processing time and 66% of the memory usage needed by a classical Kalman filter. Meanwhile, as a numerical approach, the method needs no precision-losing transformation or approximation of system modules, and accuracy suffers little in comparison with the filter before computational optimization. Furthermore, since no complicated matrix theories are needed, the algorithm can easily be transplanted into other modified filters as a secondary optimization method to achieve further efficiency.
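
    The flavour of the matrix-structure savings can be shown with a generic example (a sketch under the assumption F = I + dt*A with sparse A; the paper's offline-derived block formulas for the SINS/GPS error state are more elaborate):

        import numpy as np
        from scipy import sparse

        n, dt = 15, 0.01                    # typical error-state size, step
        A = sparse.random(n, n, density=0.1, random_state=1, format="csr")
        P = np.eye(n)                       # symmetric covariance
        Q = 1e-6 * np.eye(n)

        # Structured update: F P F^T expands to
        # P + dt*(AP + (AP)^T) + dt^2 * A P A^T, never forming dense F.
        AP = A @ P                          # sparse @ dense: O(nnz * n)
        P_structured = P + dt * (AP + AP.T) + dt**2 * (A @ AP.T) + Q

        # Dense reference update for comparison.
        F = np.eye(n) + dt * A.toarray()
        P_dense = F @ P @ F.T + Q
        print("max difference:", np.abs(P_structured - P_dense).max())

    The two updates agree to machine precision, but the structured form touches only the nonzero entries of A, which is where the kind of CPU savings the abstract reports comes from.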

  20. Computational Methodology for Absolute Calibration Curves for Microfluidic Optical Analyses

    PubMed Central

    Chang, Chia-Pin; Nagel, David J.; Zaghloul, Mona E.

    2010-01-01

    Optical fluorescence and absorption are two of the primary techniques used for analytical microfluidics. We provide a thorough yet tractable method for computing the performance of diverse optical micro-analytical systems. Sample sizes range from nano- to many micro-liters and concentrations from nano- to milli-molar. Equations are provided to trace quantitatively the flow of the fundamental entities, namely photons and electrons, and the conversion of energy from the source, through optical components, samples and spectral-selective components, to the detectors and beyond. The equations permit facile computations of calibration curves that relate the concentrations or numbers of molecules measured to the absolute signals from the system. This methodology provides the basis for both detailed understanding and improved design of microfluidic optical analytical systems. It saves prototype turn-around time, and is much simpler and faster to use than ray tracing programs. Over two thousand spreadsheet computations were performed during this study. We found that some design variations produce higher signal levels and, for constant noise levels, lower minimum detection limits. Improvements of more than a factor of 1,000 were realized. PMID:22163573
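
    The style of source-to-detector photon bookkeeping the methodology formalizes can be sketched end to end (a minimal Python sketch; every component value below is an illustrative assumption, not a parameter from the paper):

        import numpy as np

        h, c = 6.626e-34, 3.0e8             # Planck constant, speed of light

        # Source and optics (assumed values).
        P_source = 1e-3                     # W at the excitation wavelength
        lam_ex = 480e-9                     # m
        t_int = 1.0                         # s integration time
        f_excite = 0.5                      # fraction of power reaching sample
        path = 100e-6                       # m optical path through channel
        eps = 7.0e4 * np.log(10) * 1e2      # 7e4 L/(mol*cm), natural-log, per m
        phi_fl = 0.9                        # fluorophore quantum yield
        f_collect = 0.05                    # collection solid-angle fraction
        qe = 0.6                            # detector quantum efficiency

        conc = np.logspace(-9, -3, 13)      # molar concentrations
        photons_in = P_source * f_excite * t_int * lam_ex / (h * c)
        absorbed = photons_in * (1 - np.exp(-eps * conc * path))  # Beer-Lambert
        signal_e = absorbed * phi_fl * f_collect * qe             # photoelectrons

        for C, S in zip(conc, signal_e):
            print(f"{C:9.1e} M -> {S:10.3e} photoelectrons")

    Sweeping the component values in such a chain is what produces the absolute calibration curves, and comparing design variants this way is far cheaper than building prototypes or running ray-tracing software.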
