Fast single-pass alignment and variant calling using sequencing data
USDA-ARS's Scientific Manuscript database
Sequencing research requires efficient computation. Few programs use already known information about DNA variants when aligning sequence data to the reference map. New program findmap.f90 reads the previous variant list before aligning sequence, calling variant alleles, and summing the allele counts...
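The abstract's core idea, using a previously known variant list while scanning aligned reads, can be sketched as follows. This is a hypothetical, much-simplified Python illustration of the approach described, not the actual findmap.f90 program; all names and the no-indel assumption are mine.

```python
# Simplified sketch of single-pass allele counting against a known variant
# list (hypothetical; findmap.f90 itself is a Fortran program).

def count_alleles(known_variants, aligned_reads):
    """known_variants: {position: (ref_base, alt_base)}
    aligned_reads: list of (start_position, sequence) tuples,
    assumed already aligned to the reference with no indels."""
    counts = {pos: {"ref": 0, "alt": 0} for pos in known_variants}
    for start, seq in aligned_reads:
        for offset, base in enumerate(seq):
            pos = start + offset
            site = known_variants.get(pos)
            if site is None:
                continue  # not a known variant site
            ref, alt = site
            if base == ref:
                counts[pos]["ref"] += 1
            elif base == alt:
                counts[pos]["alt"] += 1
    return counts

variants = {100: ("A", "G"), 105: ("C", "T")}
reads = [(98, "TTACCGACC"), (100, "GTACCT")]
summary = count_alleles(variants, reads)
print(summary)
```

Because the variant list is read first, allele counts accumulate in the same pass that scans the alignments, which is the efficiency argument the abstract makes.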
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2009-02-16
Building Science Corporation collaborated with ICI Homes in Daytona Beach, FL on a 2008 prototype Showcase House that demonstrates the energy efficiency and durability upgrades that ICI currently promotes through its in-house efficiency program called EFactor.
Rim, Matthew H; Thomas, Karen C; Chandramouli, Jane; Barrus, Stephanie A; Nickman, Nancy A
2018-05-15
The implementation and quality assessment of a pharmacy services call center (PSCC) for outpatient pharmacies and specialty pharmacy services within an academic health system are described. Prolonged wait times in outpatient pharmacies or hold times on the phone affect the ability of pharmacies to capture and retain prescriptions. To support outpatient pharmacy operations and improve quality, a PSCC was developed to centralize handling of all outpatient and specialty pharmacy calls. The purpose of the PSCC was to improve the quality of pharmacy telephone services by (1) decreasing the call abandonment rate, (2) improving the speed of answer, (3) increasing first-call resolution, (4) centralizing all specialty pharmacy and prior authorization calls, (5) increasing labor efficiency and pharmacy capacities, (6) implementing a quality evaluation program, and (7) improving workplace satisfaction and retention of outpatient pharmacy staff. The PSCC centralized pharmacy calls from 9 pharmacy locations, 2 outpatient clinics, and a specialty pharmacy. Since implementation, the PSCC has achieved and maintained program goals, including improved abandonment rate, speed of answer, and first-call resolution. A centralized 24-7 support line for specialty pharmacy patients was also successfully established. A quality calibration program was implemented to ensure service quality and excellent patient experience. Additional ongoing evaluations measure the impact of the PSCC on improving workplace satisfaction and retention of outpatient pharmacy staff. The design and implementation of the PSCC have significantly improved the health system's patient experiences, efficiency, and quality. Copyright © 2018 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
China’s R&D for Energy Efficient Buildings: Insights for U.S. Cooperation with China
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Sha; Evans, Meredydd
2010-04-01
This report includes an evaluation of China’s current activities and future direction in building energy efficiency R&D and its relevance to DOE’s R&D activities under the Building Technologies Program in the Office of Energy Efficiency and Renewable Energy. The researchers reviewed the major R&D programs in China including the so-called 973 Program, the 863 Program, and the Key Technology R&D Program1 as well as the research activities of major research institutes. The report also reviewed several relevant documents of the Chinese government, websites (including the International Energy Agency and national and local governments in China), newsletters, and financial information listed in the program documents and websites.
77 FR 55218 - Homeland Security Advisory Council
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-07
... childhood arrivals program. The HSAC will also receive a report from the Sustainability and Efficiency Task Force, review and discuss the task forces' report, and formulate recommendations for the Department. The.... HSAC conference call details and the Sustainability and Efficiency Task Force report will be provided...
Ju, Melody; Berman, Abigail T; Vapiwala, Neha
2015-09-01
Several key medical and oncologic professional societies have endorsed the importance of physician communication as a quality improvement metric. Despite this clear message, there remain substantial barriers to communication skills training (CST) in oncologic specialties. Herein, we describe the major barriers to communications training and propose standardized patient (SP) programs as efficient and strategic starting points and as expansion opportunities for new and existing CSTs.
Visualization of Concurrent Program Executions
NASA Technical Reports Server (NTRS)
Artho, Cyrille; Havelund, Klaus; Honiden, Shinichi
2007-01-01
Various program analysis techniques are efficient at discovering failures and properties. However, it is often difficult to evaluate results, such as program traces. This calls for abstraction and visualization tools. We propose an approach based on UML sequence diagrams, addressing shortcomings of such diagrams for concurrency. The resulting visualization is expressive and provides all the necessary information at a glance.
NASA Technical Reports Server (NTRS)
Sandy, Michael
2015-01-01
The Regolith Advanced Surface Systems Operations Robot (RASSOR) Phase 2 is an excavation robot for mining regolith on a planet like Mars. The robot is programmed using the Robotic Operating System (ROS) and it also uses a physical simulation program called Gazebo. This internship focused on various functions of the program in order to make it a more professional and efficient robot. During the internship another project called the Smart Autonomous Sand-Swimming Excavator was worked on. This is a robot that is designed to dig through sand and extract sample material. The intern worked on programming the Sand-Swimming robot, and designing the electrical system to power and control the robot.
Slurry combustion. Volume 2: Appendices, Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Essenhigh, R.
1993-06-01
Volume II contains the following appendices: coal analyses and slurryability characteristics; listings of programs used to call and file experimental data, and to reduce data in enthalpy and efficiency calculations; and tabulated data sets.
OPTIGRAMI: Optimum lumber grade mix program for hardwood dimension parts
David G. Martens; Robert L. Nevel, Jr.
1985-01-01
With rapidly increasing lumber prices and shortages of some grades and species, the furniture industry must find ways to use its hardwood lumber resource more efficiently. A computer program called OPTIGRAMI is designed to help managers determine the best lumber to use in producing furniture parts. OPTIGRAMI determines the least-cost grade mix of lumber required to...
''Do-it-yourself'' software program calculates boiler efficiency
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1984-03-01
An easy-to-use software package is described which runs on the IBM Personal Computer. The package calculates boiler efficiency, an important parameter of operating costs and equipment wellbeing. The program stores inputs and calculated results for 20 sets of boiler operating data, called cases. Cases can be displayed and modified on the CRT screen through multiple display pages or copied to a printer. All intermediate calculations are performed by this package. They include: steam enthalpy; water enthalpy; air humidity; gas, oil, coal, and wood heat capacity; and radiation losses.
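The heat-loss side of such a calculation can be illustrated with a minimal sketch. This is generic boiler arithmetic (efficiency as fuel heat input minus losses), not the 1984 package's actual algorithm, and the loss figures below are invented for illustration.

```python
# Illustrative heat-loss method: efficiency = 100% minus the sum of the
# individual losses, each expressed as a percentage of fuel heat input.
# (Not the IBM PC package's actual routine; numbers are hypothetical.)

def boiler_efficiency(losses_pct):
    """losses_pct: individual losses as % of fuel heat input
    (dry flue gas, moisture, unburned combustibles, radiation, ...)."""
    return 100.0 - sum(losses_pct)

# Hypothetical operating case: dry gas 5.2%, moisture 4.1%,
# unburned combustibles 0.8%, radiation and unaccounted 1.5%
case = [5.2, 4.1, 0.8, 1.5]
print(round(boiler_efficiency(case), 1))
```

A per-case loss breakdown like this is what the package's stored "cases" would let an operator compare across operating conditions.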
48 CFR 1823.7101 - Contract clause.
Code of Federal Regulations, 2013 CFR
2013-10-01
... SOCIOECONOMIC PROGRAMS ENVIRONMENT, ENERGY AND WATER EFFICIENCY, RENEWABLE ENERGY TECHNOLOGIES, OCCUPATIONAL SAFETY, AND DRUG-FREE WORKPLACE Frequency Authorization 1823.7101 Contract clause. The contracting... calling for developing, producing, constructing, testing, or operating a device for which a radio...
A Cache Design to Exploit Structural Locality
1991-12-01
memory and secondary storage. Main memory was used to store the instructions and data of an executing program, while secondary storage held programs ...efficiency of the CPU and faster turnaround of executing programs. In addition to the well known spatial and temporal aspects of locality, Hobart has...identified a third aspect, which he has called structural locality (9). This type of locality is defined as the tendency of an executing program to
The Creation of a CPU Timer for High Fidelity Programs
NASA Technical Reports Server (NTRS)
Dick, Aidan A.
2011-01-01
Using the C and C++ programming languages, a tool was developed that measures the efficiency of a program by recording the amount of CPU time that various functions consume. By inserting the tool between lines of code in the program, one can receive a detailed report of the absolute and relative time consumption associated with each section. After adapting the generic tool for a high-fidelity launch vehicle simulation program called MAVERIC, the components of a frequently used function called "derivatives()" were measured. Out of the 34 sub-functions in "derivatives()", it was found that the top 8 sub-functions made up 83.1% of the total time spent. In order to decrease the overall run time of MAVERIC, a change was implemented in the sub-function "Event_Controller()". Reformatting "Event_Controller()" led to a 36.9% decrease in the total CPU time spent by that sub-function, and a 3.2% decrease in the total CPU time spent by the overarching function "derivatives()".
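The kind of instrumentation described, accumulating CPU time per labeled section and reporting absolute and relative consumption, can be sketched briefly. This is a hypothetical Python analogue, not the C/C++ tool from the abstract; the section names are invented.

```python
import time
from collections import defaultdict

# Minimal per-section CPU timer sketch (hypothetical analogue of the tool
# described): accumulate process CPU time per labeled section, then report
# absolute seconds and percentage of the total.

class CpuTimer:
    def __init__(self):
        self.totals = defaultdict(float)

    def section(self, name):
        timer = self
        class _Section:
            def __enter__(self):
                self.t0 = time.process_time()
            def __exit__(self, *exc):
                timer.totals[name] += time.process_time() - self.t0
        return _Section()

    def report(self):
        grand = sum(self.totals.values()) or 1.0  # avoid divide-by-zero
        return {name: (t, 100.0 * t / grand)
                for name, t in self.totals.items()}

timer = CpuTimer()
with timer.section("derivatives"):        # invented section names
    sum(i * i for i in range(200_000))
with timer.section("event_controller"):
    sum(i * i for i in range(50_000))
for name, (secs, pct) in timer.report().items():
    print(f"{name}: {secs:.4f} s ({pct:.1f}%)")
```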
75 FR 45123 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-02
... proposed projects or to obtain a copy of the information collection plans, call the SAMHSA Reports... effectiveness, efficiency and sustainability of the GBHI project services for client abstinence, housing... the implementation and sustainability of project activities under the GBHI program. Each survey...
ERIC Educational Resources Information Center
Erickson, Paul
2010-01-01
As student enrollment drops, school districts need less learning space and fewer facilities. With cuts in funding, budgets cannot sustain existing building operations and program costs, and buildings must be taken offline or repurposed for financial efficiency. How does a community address this issue? Whether a district is having to shutter…
The University of Wisconsin OAO operating system
NASA Technical Reports Server (NTRS)
Heacox, H. C.; Mcnall, J. F.
1972-01-01
The Wisconsin OAO operating system is presented, which consists of two parts: a computer program called HARUSPEX, which makes reasonably efficient and convenient operation of the package possible, and ground operations equipment, which provides real-time status monitoring, commanding, and a quick look at the data.
Transparent Ada rendezvous in a fault tolerant distributed system
NASA Technical Reports Server (NTRS)
Racine, Roger
1986-01-01
There are many problems associated with distributing an Ada program over a loosely coupled communication network. Some of these problems involve the various aspects of the distributed rendezvous. The problems addressed involve supporting the delay statement in a selective call and supporting the else clause in a selective call. Most of these difficulties are compounded by the need for an efficient communication system. The difficulties are compounded even more by considering the possibility of hardware faults occurring while the program is running. With a hardware fault tolerant computer system, it is possible to design a distribution scheme and communication software which is efficient and allows Ada semantics to be preserved. An Ada design for the communications software of one such system will be presented, including a description of the services provided in the seven layers of an International Standards Organization (ISO) Open System Interconnect (OSI) model communications system. The system capabilities (hardware and software) that allow this communication system will also be described.
2012-01-09
NASA Goddard Space Flight Center Financial Manager and White House 2011 SAVE award winner Matthew Ritsko is seen during a television interview at NASA Headquarters shortly after meeting with President Obama at the White House on Monday, Jan. 9, 2012, in Washington. The Presidential Securing Americans' Value and Efficiency (SAVE) program gives front-line federal workers the chance to submit their ideas on how their agencies can save money and work more efficiently. Matthew's proposal calls for NASA to create a "lending library" where specialized space tools and hardware purchased by one NASA organization will be made available to other NASA programs and projects. Photo Credit: (NASA/Bill Ingalls)
Nakato, Ryuichiro; Itoh, Tahehiko; Shirahige, Katsuhiko
2013-07-01
Chromatin immunoprecipitation with high-throughput sequencing (ChIP-seq) can identify genomic regions that bind proteins involved in various chromosomal functions. Although the development of next-generation sequencers offers the technology needed to identify these protein-binding sites, the analysis can be computationally challenging because sequencing data sometimes consist of >100 million reads/sample. Herein, we describe a cost-effective and time-efficient protocol that is generally applicable to ChIP-seq analysis; this protocol uses a novel peak-calling program termed DROMPA to identify peaks and an additional program, parse2wig, to preprocess read-map files. This two-step procedure drastically reduces computational time and memory requirements compared with other programs. DROMPA enables the identification of protein localization sites in repetitive sequences and efficiently identifies both broad and sharp protein localization peaks. Specifically, DROMPA outputs a protein-binding profile map in pdf or png format, which can be easily manipulated by users who have a limited background in bioinformatics. © 2013 The Authors Genes to Cells © 2013 by the Molecular Biology Society of Japan and Wiley Publishing Asia Pty Ltd.
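The preprocessing step the protocol describes, turning huge read-map files into a compact coverage track before peak calling, can be sketched in miniature. This is a hypothetical simplification of what a tool like parse2wig does (binning mapped-read positions into fixed-width windows), not its actual implementation.

```python
# Sketch of read-map preprocessing for peak calling (hypothetical,
# simplified analogue of parse2wig): bin mapped-read start positions into
# fixed-width windows to produce a per-chromosome coverage track.

def bin_reads(read_starts, chrom_len, bin_size=100):
    nbins = (chrom_len + bin_size - 1) // bin_size  # ceiling division
    track = [0] * nbins
    for pos in read_starts:
        track[pos // bin_size] += 1
    return track

reads = [5, 12, 250, 260, 270, 999]   # invented mapped positions
print(bin_reads(reads, chrom_len=1000, bin_size=100))
```

Working from a binned track rather than 100 million individual reads is what makes the two-step procedure fast and memory-light; a peak caller then only scans the track for enriched windows.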
Uchida, Takahiro; Ikeno, Fumiaki; Ikeda, Koji; Suzuki, Yuka; Todaka, Koji; Yokoi, Hiroyoshi; Thompson, Gary; Krucoff, Mitchel; Saito, Shigeru
2013-01-01
Global medical devices have become more popular, but investment money for medical device development is not easily available in the market. Worldwide health-care budget constraints mean that efficient medical device development has become essential. To achieve efficient development, globalization is a key to success. Spending large amounts of money in different regions for medical device development is no longer feasible. In order to streamline processes of global medical device development, an academic, governmental, and industrial consortium, called the Harmonization by Doing program, has been set up. The program has been operating between Japan and the USA since 2003. The program has 4 working groups: (1) Global Cardiovascular Device Trials; (2) Study on Post-Market Registry; (3) Clinical Trials; and (4) Infrastructure and Methodology Regulatory Convergence and Communication. Each working group has as its goals the achievement of speedy and efficient medical device development in Japan and the USA. The program has held multiple international meetings to deal with obstacles against efficient medical device development. This kind of program is very important to deliver novel medical devices. Involvement of physicians in this type of activity is also very helpful to achieve these goals.
de Castro Lacaze, Denise Helena; Sacco, Isabel de C. N.; Rocha, Lys Esther; de Bragança Pereira, Carlos Alberto; Casarotto, Raquel Aparecida
2010-01-01
AIM: We sought to evaluate musculoskeletal discomfort and mental and physical fatigue in the call-center workers of an airline company before and after a supervised exercise program compared with rest breaks during the work shift. INTRODUCTION: This was a longitudinal pilot study conducted in a flight-booking call-center for an airline in São Paulo, Brazil. Occupational health activities are recommended to decrease the negative effects of the call-center working conditions. In practice, exercise programs are commonly recommended for computer workers, but their effects have not been studied in call-center operators. METHODS: Sixty-four call-center operators participated in this study. Thirty-two subjects were placed into the experimental group and attended a 10-min daily exercise session for 2 months. Conversely, 32 participants were placed into the control group and took a 10-min daily rest break during the same period. Each subject was evaluated once a week by means of the Corlett-Bishop body map with a visual analog discomfort scale and the Chalder fatigue questionnaire. RESULTS: Musculoskeletal discomfort decreased in both groups, but the reduction was only statistically significant for the spine and buttocks (p=0.04) and the sum of the segments (p=0.01) in the experimental group. In addition, the experimental group showed significant differences in the level of mental fatigue, especially in questions related to memory and tiredness (p=0.001). CONCLUSIONS: Our preliminary results demonstrate that appropriately designed and supervised exercise programs may be more efficient than rest breaks in decreasing discomfort and fatigue levels in call-center operators. PMID:20668622
A mechanism for efficient debugging of parallel programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, B.P.; Choi, J.D.
1988-01-01
This paper addresses the design and implementation of an integrated debugging system for parallel programs running on shared memory multi-processors (SMMP). The authors describe the use of flowback analysis to provide information on causal relationships between events in a program's execution without re-executing the program for debugging. The authors introduce a mechanism called incremental tracing that, by using semantic analyses of the debugged program, makes the flowback analysis practical with only a small amount of trace generated during execution. They extend flowback analysis to apply to parallel programs and describe a method to detect race conditions in the interactions of the co-operating processes.
The improving efficiency frontier of inpatient rehabilitation hospitals.
Harrison, Jeffrey P; Kirkpatrick, Nicole
2011-01-01
This study uses a linear programming technique called data envelopment analysis to identify changes in the efficiency frontier of inpatient rehabilitation hospitals after implementation of the prospective payment system. The study provides a time series analysis of the efficiency frontier for inpatient rehabilitation hospitals in 2003 immediately after implementation of PPS and then again in 2006. Results indicate that the efficiency frontier of inpatient rehabilitation hospitals increased from 84% in 2003 to 85% in 2006. Similarly, an analysis of slack or inefficiency shows improvements in output efficiency over the study period. This clearly documents that efficiency in the inpatient rehabilitation hospital industry after implementation of PPS is improving. Hospital executives, health care policymakers, taxpayers, and other stakeholders benefit from studies that improve health care efficiency.
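The general technique named in the abstract, data envelopment analysis via linear programming, can be illustrated with a small sketch. This is the standard input-oriented CCR multiplier model solved with SciPy, not the study's exact model, and the hospital data below are invented: for each unit, find output/input weights that maximize its efficiency while no unit exceeds efficiency 1 under the same weights.

```python
import numpy as np
from scipy.optimize import linprog

# Input-oriented CCR data envelopment analysis (illustrative; hypothetical
# data, not the rehabilitation-hospital dataset from the study).

def dea_ccr(inputs, outputs):
    """inputs: (n_dmus, n_in) array; outputs: (n_dmus, n_out) array.
    Returns an efficiency score in (0, 1] for each decision-making unit."""
    X, Y = np.asarray(inputs, float), np.asarray(outputs, float)
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # variables: [u (s output weights), v (m input weights)], all >= 0
        c = np.concatenate([-Y[o], np.zeros(m)])       # maximize u.y_o
        A_ub = np.hstack([Y, -X])                      # u.y_j - v.x_j <= 0
        b_ub = np.zeros(n)
        A_eq = [np.concatenate([np.zeros(s), X[o]])]   # normalize v.x_o = 1
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                      bounds=[(0, None)] * (s + m))
        scores.append(-res.fun)
    return scores

# Two hospitals, one input (staff) and one output (discharges):
# the first produces the same output with half the input, so it is efficient.
print(dea_ccr(inputs=[[1.0], [2.0]], outputs=[[1.0], [1.0]]))
```

Units scoring 1.0 lie on the efficiency frontier the study tracks; scores below 1.0 quantify the slack (inefficiency) the abstract mentions.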
Water Efficient Installations - A New Army Guidance Document
2010-06-01
Toilets 1.28 gpf or less, 50 manuf., 500+ models Required in CA Dual flush options also available WaterSense program provides certification and...lose 8760 to 219,000 gal/year Broken flush valve on toilet can lose 40 gal/hour US Army Corps of Engineers® Engineer Research and Development Center...Engineer Research and Development Center Toilets and Urinals ULFTs Ultra-Low Flush Toilet, also called low flow, 1.28 gpf to 1.6 gpf HETs High Efficiency
MPI_XSTAR: MPI-based Parallelization of the XSTAR Photoionization Program
NASA Astrophysics Data System (ADS)
Danehkar, Ashkbiz; Nowak, Michael A.; Lee, Julia C.; Smith, Randall K.
2018-02-01
We describe a program for the parallel implementation of multiple runs of XSTAR, a photoionization code that is used to predict the physical properties of an ionized gas from its emission and/or absorption lines. The parallelization program, called MPI_XSTAR, has been developed and implemented in the C++ language by using the Message Passing Interface (MPI) protocol, a conventional standard of parallel computing. We have benchmarked parallel multiprocessing executions of XSTAR, using MPI_XSTAR, against a serial execution of XSTAR, in terms of the parallelization speedup and the computing resource efficiency. Our experience indicates that the parallel execution runs significantly faster than the serial execution; however, the efficiency in terms of computing resource usage decreases as the number of processors increases.
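The two benchmark metrics the abstract uses have standard definitions: speedup S = T_serial / T_parallel and resource efficiency E = S / p for p processors. A minimal sketch, with invented timings rather than the paper's measurements:

```python
# Parallel speedup and computing-resource efficiency (standard definitions;
# the timing numbers below are hypothetical, not MPI_XSTAR's results).

def speedup_and_efficiency(t_serial, t_parallel, n_procs):
    s = t_serial / t_parallel
    return s, s / n_procs

# Hypothetical timings: 1000 s serial vs. 150 s on 8 processors
s, e = speedup_and_efficiency(1000.0, 150.0, 8)
print(f"speedup {s:.2f}x, efficiency {e:.1%}")
```

An efficiency well below 100% at higher processor counts is exactly the trade-off the abstract reports: the parallel run finishes sooner but uses total CPU resources less efficiently.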
User-Defined Data Distributions in High-Level Programming Languages
NASA Technical Reports Server (NTRS)
Diaconescu, Roxana E.; Zima, Hans P.
2006-01-01
One of the characteristic features of today's high performance computing systems is a physically distributed memory. Efficient management of locality is essential for meeting key performance requirements for these architectures. The standard technique for dealing with this issue has involved the extension of traditional sequential programming languages with explicit message passing, in the context of a processor-centric view of parallel computation. This has resulted in complex and error-prone assembly-style codes in which algorithms and communication are inextricably interwoven. This paper presents a high-level approach to the design and implementation of data distributions. Our work is motivated by the need to improve the current parallel programming methodology by introducing a paradigm supporting the development of efficient and reusable parallel code. This approach is currently being implemented in the context of a new programming language called Chapel, which is designed in the HPCS project Cascade.
Social Emotional Optimization Algorithm for Nonlinear Constrained Optimization Problems
NASA Astrophysics Data System (ADS)
Xu, Yuechun; Cui, Zhihua; Zeng, Jianchao
Nonlinear programming is an important branch of operational research and has been successfully applied to various real-life problems. In this paper, a new approach called the Social emotional optimization algorithm (SEOA), a new swarm intelligence technique that simulates human behavior guided by emotion, is used to solve this problem. Simulation results show that the social emotional optimization algorithm proposed in this paper is effective and efficient for nonlinear constrained programming problems.
Brain Gym[R]: Building Stronger Brains or Wishful Thinking?
ERIC Educational Resources Information Center
Hyatt, Keith J.
2007-01-01
As part of the accountability movement, schools are increasingly called upon to provide interventions that are based on sound scientific research and that provide measurable outcomes for children. Brain Gym[R] is a popular commercial program claiming that adherence to its regimen will result in more efficient learning in an almost miraculous…
Lean and Efficient Software: Whole Program Optimization of Executables
2016-12-31
format string “baked in”? (If multiple printf calls pass the same format string, they could share the same new function.) This leads to the...format string becomes baked into the target function. Moving down: o Moving from the first row to the second makes any potential user control of the
Development of a digital camera tree evaluation system
Neil Clark; Daniel L. Schmoldt; Philip A. Araman
2000-01-01
Within the Strategic Plan for Forest Inventory and Monitoring (USDA Forest Service 1998), there is a call to "conduct applied research in the use of [advanced technology] towards the end of increasing the operational efficiency and effectiveness of our program". The digital camera tree evaluation system is part of that research, aimed at decreasing field...
The fastclime Package for Linear Programming and Large-Scale Precision Matrix Estimation in R.
Pang, Haotian; Liu, Han; Vanderbei, Robert
2014-02-01
We develop an R package fastclime for solving a family of regularized linear programming (LP) problems. Our package efficiently implements the parametric simplex algorithm, which provides a scalable and sophisticated tool for solving large-scale linear programs. As an illustrative example, one use of our LP solver is to implement an important sparse precision matrix estimation method called CLIME (Constrained L1 Minimization Estimator). Compared with existing packages for this problem such as clime and flare, our package has three advantages: (1) it efficiently calculates the full piecewise-linear regularization path; (2) it provides an accurate dual certificate as stopping criterion; (3) it is completely coded in C and is highly portable. This package is designed to be useful to statisticians and machine learning researchers for solving a wide range of problems.
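The connection between CLIME and linear programming can be made concrete with a small sketch. Each column b of the CLIME estimate solves: minimize ||b||_1 subject to ||Σ̂·b − e_j||_∞ ≤ λ, which is a standard LP. The code below is the textbook formulation solved with SciPy's generic solver, not fastclime's parametric simplex implementation.

```python
import numpy as np
from scipy.optimize import linprog

# One CLIME column as a linear program (standard reformulation, solved with
# a generic LP solver rather than fastclime's parametric simplex method):
#   minimize sum(u)  s.t.  -u <= b <= u,  -lam <= Sigma @ b - e_j <= lam

def clime_column(sigma, j, lam):
    sigma = np.asarray(sigma, float)
    p = sigma.shape[0]
    e = np.zeros(p); e[j] = 1.0
    # variables: [b (p, free), u (p, >= 0)]
    c = np.concatenate([np.zeros(p), np.ones(p)])
    I = np.eye(p)
    A_ub = np.vstack([
        np.hstack([ sigma, np.zeros((p, p))]),  #  Sigma b <= e + lam
        np.hstack([-sigma, np.zeros((p, p))]),  # -Sigma b <= -e + lam
        np.hstack([ I, -I]),                    #  b - u <= 0
        np.hstack([-I, -I]),                    # -b - u <= 0
    ])
    b_ub = np.concatenate([e + lam, -e + lam, np.zeros(p), np.zeros(p)])
    bounds = [(None, None)] * p + [(0, None)] * p
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.x[:p]

# With Sigma = identity and lam = 0.1, column 0 shrinks toward e_0:
print(np.round(clime_column(np.eye(3), j=0, lam=0.1), 3))
```

Solving one such LP per column is what makes a fast, scalable LP engine the natural backbone for large-scale precision matrix estimation.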
NASA Technical Reports Server (NTRS)
Platt, M. E.; Lewis, E. E.; Boehm, F.
1991-01-01
A Monte Carlo Fortran computer program was developed that uses two variance reduction techniques for computing system reliability, applicable to solving very large, highly reliable fault-tolerant systems. The program is consistent with the hybrid automated reliability predictor (HARP) code, which employs behavioral decomposition and complex fault-error handling models. This new capability, called MC-HARP, efficiently solves reliability models with non-constant failure rates (Weibull). Common-mode failure modeling is also a specialty.
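The basic computation can be illustrated with a plain-sampling sketch: estimate the probability that a system survives to time t when component lifetimes are Weibull-distributed. This omits MC-HARP's variance-reduction techniques and fault-handling models entirely; it is only the naive baseline those techniques accelerate.

```python
import math
import random

# Plain Monte Carlo reliability estimate with Weibull component lifetimes
# (illustrative baseline; MC-HARP adds variance reduction on top of this).

def series_reliability_mc(t, scale, shape, n_comp=3,
                          samples=100_000, seed=1):
    """Probability an n_comp-component series system survives past t."""
    rng = random.Random(seed)
    survive = 0
    for _ in range(samples):
        lifetimes = [rng.weibullvariate(scale, shape)
                     for _ in range(n_comp)]
        if min(lifetimes) > t:   # series system fails at first failure
            survive += 1
    return survive / samples

est = series_reliability_mc(t=1.0, scale=5.0, shape=1.5)
exact = math.exp(-3 * (1.0 / 5.0) ** 1.5)  # product of Weibull survivals
print(est, round(exact, 4))
```

For highly reliable systems the failure event becomes rare, so naive sampling needs enormous sample counts; that is precisely why variance reduction matters in this setting.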
Efficient Thread Labeling for Monitoring Programs with Nested Parallelism
NASA Astrophysics Data System (ADS)
Ha, Ok-Kyoon; Kim, Sun-Sook; Jun, Yong-Kee
It is difficult and cumbersome to detect data races that occur in an execution of parallel programs. Any on-the-fly race detection technique using Lamport's happened-before relation needs a thread labeling scheme for generating unique identifiers which maintain logical concurrency information for the parallel threads. NR labeling is an efficient thread labeling scheme for the fork-join program model with nested parallelism, because its efficiency depends only on the nesting depth for every fork and join operation. This paper presents an improved NR labeling, called e-NR labeling, in which every thread generates its label by inheriting the pointer to its ancestor list from the parent thread or by updating the pointer in a constant amount of time and space. This labeling is more efficient than NR labeling, because its efficiency does not depend on the nesting depth for every fork and join operation. Some experiments were performed with OpenMP programs having nesting depths of three or four and maximum parallelism varying from 10,000 to 1,000,000. The results show that e-NR labeling is 5 times faster than NR labeling and 4.3 times faster than OS labeling in the average time for creating and maintaining thread labels. In the average space required for labeling, it is 3.5 times smaller than NR labeling and 3 times smaller than OS labeling.
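What a thread labeling scheme encodes can be shown with a deliberately naive sketch. This is NOT NR or e-NR labeling: it labels each thread with its fork path from the root and ignores join ordering entirely, so it only captures ancestor/descendant ordering. It is included purely to illustrate the kind of happened-before query such labels answer.

```python
# Naive fork-path labeling sketch (NOT NR/e-NR labeling; joins ignored):
# a thread's label is the tuple of child indices along its fork path, and
# a thread happens before another iff its label is a proper prefix.

def fork(parent_label, child_index):
    return parent_label + (child_index,)

def happened_before(a, b):
    return len(a) < len(b) and b[:len(a)] == a

root = ()
t0 = fork(root, 0)    # first forked child of the root thread
t1 = fork(root, 1)    # sibling of t0
t00 = fork(t0, 0)     # nested fork inside t0

print(happened_before(root, t00))  # ancestor of a nested thread: ordered
print(happened_before(t0, t1))     # siblings: concurrent (may race)
```

Real schemes like NR and e-NR must also encode join synchronization and keep label creation cheap at large nesting depths, which is exactly the cost the paper's constant-time pointer inheritance attacks.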
Sizing of complex structure by the integration of several different optimal design algorithms
NASA Technical Reports Server (NTRS)
Sobieszczanski, J.
1974-01-01
Practical design of large-scale structures can be accomplished with the aid of the digital computer by bringing together in one computer program algorithms of nonlinear mathematical programming and optimality criteria with weight-strength and other so-called engineering methods. Applications of this approach to aviation structures are discussed with a detailed description of how the total problem of structural sizing can be broken down into subproblems for best utilization of each algorithm and for efficient organization of the program into iterative loops. Typical results are examined for a number of examples.
A Change Impact Analysis to Characterize Evolving Program Behaviors
NASA Technical Reports Server (NTRS)
Rungta, Neha Shyam; Person, Suzette; Branchaud, Joshua
2012-01-01
Change impact analysis techniques estimate the potential effects of changes made to software. Directed Incremental Symbolic Execution (DiSE) is an intraprocedural technique for characterizing the impact of software changes on program behaviors. DiSE first estimates the impact of the changes on the source code using program slicing techniques, and then uses the impact sets to guide symbolic execution to generate path conditions that characterize impacted program behaviors. DiSE, however, cannot reason about the flow of impact between methods and will fail to generate path conditions for certain impacted program behaviors. In this work, we present iDiSE, an extension to DiSE that performs an interprocedural analysis. iDiSE combines static and dynamic calling context information to efficiently generate impacted program behaviors across calling contexts. Information about impacted program behaviors is useful for testing, verification, and debugging of evolving programs. We present a case study of our implementation of the iDiSE algorithm to demonstrate its efficiency at computing impacted program behaviors. Traditional notions of coverage are insufficient for characterizing the testing efforts used to validate evolving program behaviors because they do not take into account the impact of changes to the code. In this work we present novel definitions of impacted coverage metrics that are useful for evaluating the testing effort required to test evolving programs. We then describe how the notions of impacted coverage can be used to configure techniques such as DiSE and iDiSE in order to support regression testing related tasks. We also discuss how DiSE and iDiSE can be configured for debugging: finding the root cause of errors introduced by changes made to the code. In our empirical evaluation we demonstrate that the configurations of DiSE and iDiSE can be used to support various software maintenance tasks.
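The interprocedural idea can be illustrated with a very coarse approximation. This sketch is far simpler than iDiSE's calling-context analysis: it merely approximates the impact set as everything reachable in the call graph from the changed methods, with an invented toy call graph.

```python
from collections import deque

# Coarse interprocedural impact approximation (hypothetical sketch, not
# iDiSE): treat every method reachable in the call graph from a changed
# method as potentially impacted.

def impacted_methods(call_graph, changed):
    """call_graph: {caller: [callees]}; changed: iterable of method names."""
    seen = set(changed)
    queue = deque(seen)
    while queue:
        m = queue.popleft()
        for callee in call_graph.get(m, []):
            if callee not in seen:
                seen.add(callee)
                queue.append(callee)
    return seen

graph = {"main": ["parse", "run"], "run": ["step"],
         "step": ["log"], "parse": []}
print(sorted(impacted_methods(graph, changed={"run"})))
```

Techniques like DiSE/iDiSE refine such an over-approximation with slicing and symbolic execution so that only behaviors actually affected by the change are characterized, rather than everything transitively reachable.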
Carroll, T; Rock, B
2003-01-01
Objective: The study sought to measure the relative efficiency of different television advertisements and types of television programmes in which advertisements were placed, in generating calls to Australia's national Quitline. Design: The study entailed an analysis of the number of calls generated to the Quitline relative to the weight of advertising exposure (in target audience rating points (TARPs) for particular television advertisements and for placement of these advertisements in particular types of television programmes. A total of 238 television advertisement placements and 1769 calls to the Quitline were analysed in Sydney and Melbourne. Results: The more graphic "eye" advertisement conveying new information about the association between smoking and macular degeneration leading to blindness was more efficient in generating quitline calls than the "tar" advertisement, which reinforced the message of tar in a smoker's lungs. Combining the health effects advertisements with a quitline modelling advertisement tended to increase the efficiency of generating Quitline calls. Placing advertisements in lower involvement programmes appears to provide greater efficiency in generating Quitline calls than in higher involvement programmes. Conclusions: Tobacco control campaign planners can increase the number of calls to telephone quitlines by assessing the efficiency of particular advertisements to generate such calls. Pairing of health effect and quitline modelling advertisements can increase efficiency in generating calls. Placement of advertisements in lower involvement programme types may increase efficiency in generating Quitline calls. PMID:12878772
Carroll, T; Rock, B
2003-09-01
The study sought to measure the relative efficiency of different television advertisements and types of television programmes in which advertisements were placed, in generating calls to Australia's national Quitline. The study entailed an analysis of the number of calls generated to the Quitline relative to the weight of advertising exposure (in target audience rating points (TARPs) for particular television advertisements and for placement of these advertisements in particular types of television programmes. A total of 238 television advertisement placements and 1769 calls to the Quitline were analysed in Sydney and Melbourne. The more graphic "eye" advertisement conveying new information about the association between smoking and macular degeneration leading to blindness was more efficient in generating quitline calls than the "tar" advertisement, which reinforced the message of tar in a smoker's lungs. Combining the health effects advertisements with a quitline modelling advertisement tended to increase the efficiency of generating Quitline calls. Placing advertisements in lower involvement programmes appears to provide greater efficiency in generating Quitline calls than in higher involvement programmes. Tobacco control campaign planners can increase the number of calls to telephone quitlines by assessing the efficiency of particular advertisements to generate such calls. Pairing of health effect and quitline modelling advertisements can increase efficiency in generating calls. Placement of advertisements in lower involvement programme types may increase efficiency in generating Quitline calls.
Algorithms and software for nonlinear structural dynamics
NASA Technical Reports Server (NTRS)
Belytschko, Ted; Gilbertsen, Noreen D.; Neal, Mark O.
1989-01-01
The objective of this research is to develop efficient methods for explicit time integration in nonlinear structural dynamics on computers that exploit both concurrency and vectorization. As a framework for these studies, the program WHAMS is used, which is described in Explicit Algorithms for the Nonlinear Dynamics of Shells (T. Belytschko, J. I. Lin, and C.-S. Tsay, Computer Methods in Applied Mechanics and Engineering, Vol. 42, 1984, pp. 225-251). Two factors make the development of efficient concurrent explicit time integration a challenge in a structural dynamics program: (1) the need for a variety of element types, which complicates the scheduling-allocation problem; and (2) the need for different time steps in different parts of the mesh, here called mixed delta t integration, so that a few stiff elements do not reduce the time step throughout the mesh.
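The mixed delta t idea can be sketched in a few lines. This is an illustrative model of subcycling, not the WHAMS implementation; the element time steps and the power-of-two grouping rule are assumptions for the example. Time steps are integers in some base unit so the bookkeeping stays exact.

```python
# Illustrative sketch of mixed "delta t" (subcycled) explicit integration:
# each element advances at the largest power-of-two multiple of the global
# minimum stable step that its own stability limit allows, so a few stiff
# elements no longer force tiny steps on the whole mesh.
def subcycle_counts(stable_dts, t_end):
    """Compare total element updates for uniform vs. mixed time stepping."""
    dt_min = min(stable_dts)
    updates_uniform = updates_mixed = 0
    for dt_e in stable_dts:
        m = 1
        while dt_min * m * 2 <= dt_e:   # largest power-of-two multiple <= dt_e
            m *= 2
        updates_uniform += t_end // dt_min        # everyone marches at dt_min
        updates_mixed += t_end // (dt_min * m)    # element at its own step
    return updates_uniform, updates_mixed

# One stiff element among three soft ones (steps in units of, say, 0.1 ms):
uniform, mixed = subcycle_counts([1, 8, 8, 8], t_end=1000)
```

With one stiff element the mixed scheme cuts the update count by roughly the ratio of the two stable steps, which is the saving the paper's scheduling-allocation problem has to preserve on a concurrent machine.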
DOE Office of Scientific and Technical Information (OSTI.GOV)
Letschert, Virginie E.; McNeil, Michael A.; Leiva Ibanez, Francisco Humberto
2011-06-01
Minimum Efficiency Performance Standards (MEPS) have been chosen as part of Chile's national energy efficiency action plan. As a first MEPS, the Ministry of Energy has decided to focus on a regulation for lighting that would ban the sale of inefficient bulbs, effectively phasing out the use of incandescent lamps. Following major economies such as the US (EISA, 2007), the EU (Ecodesign, 2009) and Australia (AS/NZS, 2008), which planned a phase out based on minimum efficacy requirements, the Ministry of Energy has undertaken the impact analysis of a MEPS on the residential lighting sector. Fundacion Chile (FC) and Lawrence Berkeley National Laboratory (LBNL) collaborated with the Ministry of Energy and the National Energy Efficiency Program (Programa Pais de Eficiencia Energetica, or PPEE) in order to produce a techno-economic analysis of this future policy measure. LBNL has developed for CLASP (CLASP, 2007) a spreadsheet tool called the Policy Analysis Modeling System (PAMS) that allows for evaluation of costs and benefits at the consumer level but also a wide range of impacts at the national level, such as energy savings, net present value of savings, greenhouse gas (CO2) emission reductions and avoided capacity generation due to a specific policy. Because historically Chile has followed European schemes in energy efficiency programs (test procedures, labelling program definitions), we take the Ecodesign commission regulation No 244/2009 as a starting point when defining our phase out program, which means a tiered phase out based on minimum efficacy per lumen category. The following data were collected in order to perform the techno-economic analysis: (1) retail prices, efficiency and wattage category in the current market; (2) usage data (hours of lamp use per day); and (3) stock data, penetration of efficient lamps in the market.
Using these data, PAMS calculates the costs and benefits of efficiency standards from two distinct but related perspectives: (1) the Life-Cycle Cost (LCC) calculation examines costs and benefits from the perspective of the individual household; and (2) the National Perspective projects the total national costs and benefits, including both financial benefits and energy savings and environmental benefits. The national perspective calculations are called the National Energy Savings (NES) and the Net Present Value (NPV) calculations. PAMS also calculates total emission mitigation and avoided generation capacity. This paper describes the data and methodology used in PAMS and presents the results of the proposed phase out of incandescent bulbs in Chile.
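The LCC calculation described above amounts to discounting each lamp's purchase and electricity costs over an analysis horizon. The sketch below is a minimal stand-in for that calculation, not PAMS itself; all prices, wattages, lifetimes, tariffs and discount rates are illustrative assumptions, not Chilean market data.

```python
# Minimal life-cycle-cost sketch in the spirit of tools like PAMS.
# Purchase cost is re-incurred whenever the lamp burns out within the
# horizon; electricity costs are discounted year by year.
def life_cycle_cost(price, watts, life_years, hours_per_day=3,
                    tariff=0.15, horizon=6, discount=0.05):
    """Discounted purchase plus electricity cost over the horizon (assumed inputs)."""
    annual_energy_cost = watts / 1000.0 * hours_per_day * 365 * tariff
    lcc = 0.0
    for year in range(horizon):
        if year % life_years == 0:          # buy a replacement lamp
            lcc += price / (1 + discount) ** year
        lcc += annual_energy_cost / (1 + discount) ** (year + 1)
    return lcc

# Illustrative comparison: 60 W incandescent vs. 14 W CFL
incandescent = life_cycle_cost(price=0.5, watts=60, life_years=1)
cfl = life_cycle_cost(price=3.0, watts=14, life_years=6)
```

Under these assumptions the efficient lamp's higher purchase price is repaid several times over by its lower discounted operating cost, which is the consumer-perspective result the LCC analysis quantifies.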
An MIP model to schedule the call center workforce and organize the breaks
NASA Astrophysics Data System (ADS)
Türker, Turgay; Demiriz, Ayhan
2016-06-01
In modern economies, companies place a premium on managing their workforce efficiently, especially in the labor-intensive service sector, since services have become a significant portion of these economies. Tour scheduling is an important tool for minimizing overall workforce costs while satisfying minimum service level constraints. In this study, we consider the workforce management problem of an inbound call center: satisfying the call demand within short time periods at minimum cost. We propose a mixed-integer programming model to assign workers to the daily shifts, to determine the weekly off-days, and to determine the timing of lunch and other daily breaks for each worker. The proposed model has been verified on weekly demand data observed at a specific call center location of a satellite TV operator. The model was run with both 15- and 10-minute demand estimation periods (planning time intervals).
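A toy version of the shift-covering core of such a model can make the structure concrete. The shifts, coverage windows and demands below are invented for illustration (this is not the paper's formulation, which also handles off-days and breaks), and exhaustive search stands in for the MIP solver a real instance would require.

```python
from itertools import product

# Toy shift-covering problem: choose how many agents start each shift so
# that staffing in every period meets forecast call demand, at minimum
# total head count. Shift definitions and demands are assumptions.
SHIFTS = {              # shift -> planning periods it covers
    "early": [0, 1, 2],
    "mid":   [1, 2, 3],
    "late":  [2, 3, 4],
}
DEMAND = [2, 3, 4, 3, 1]  # agents required in each period

def solve(max_per_shift=5):
    """Brute-force the small integer program; a MIP solver scales this up."""
    best = None
    names = list(SHIFTS)
    for counts in product(range(max_per_shift + 1), repeat=len(names)):
        staffing = [0] * len(DEMAND)
        for n, name in zip(counts, names):
            for p in SHIFTS[name]:
                staffing[p] += n
        if all(s >= d for s, d in zip(staffing, DEMAND)):
            total = sum(counts)
            if best is None or total < best[0]:
                best = (total, dict(zip(names, counts)))
    return best

best_total, best_counts = solve()
```

The real model replaces the enumeration with integer variables and linear covering constraints, which is what lets it add weekly off-day and break-timing decisions without the search space exploding.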
DROP: Detecting Return-Oriented Programming Malicious Code
NASA Astrophysics Data System (ADS)
Chen, Ping; Xiao, Hai; Shen, Xiaobin; Yin, Xinchun; Mao, Bing; Xie, Li
Return-Oriented Programming (ROP) is a new technique that helps an attacker construct malicious code mounted on x86/SPARC executables without any function call at all. The technique means the ROP malicious code contains no instructions of its own, which differs from existing attacks. Moreover, it hides the malicious code within benign code. Thus, it circumvents approaches that prevent control flow diversion outside legitimate regions (such as W ⊕ X) and most malicious code scanning techniques (such as anti-virus scanners). However, ROP has intrinsic features that differ from normal program design: (1) it uses short instruction sequences ending in "ret", called gadgets, and (2) it executes the gadgets contiguously in specific memory spaces, such as the standard GNU libc. Based on these features of ROP malicious code, in this paper we present a tool, DROP, which is focused on dynamically detecting ROP malicious code. Preliminary experimental results show that DROP can efficiently detect ROP malicious code, with no false positives or false negatives.
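The two features above suggest a simple detection heuristic, sketched here over an idealized instruction trace. The trace format, thresholds and mnemonics are assumptions for the example; DROP itself instruments real binary execution.

```python
# Simplified illustration of the gadget heuristic: flag execution when the
# number of *consecutive* instruction sequences that are short (<= max_len
# instructions) and end in "ret" exceeds a threshold. Normal code returns
# after long sequences; ROP chains return after a handful of instructions.
def looks_like_rop(trace, max_len=5, threshold=3):
    run = 0          # consecutive short, ret-terminated sequences seen
    seq_len = 0      # instructions since the last "ret"
    for insn in trace:
        seq_len += 1
        if insn == "ret":
            if seq_len <= max_len:
                run += 1
                if run >= threshold:
                    return True
            else:
                run = 0      # a long sequence breaks the gadget run
            seq_len = 0
    return False

benign = ["push", "mov", "call"] + ["mov"] * 20 + ["ret"]
gadgets = ["pop", "ret", "mov", "pop", "ret", "add", "ret"]
```

Tuning `max_len` and `threshold` is exactly the false-positive/false-negative trade-off the paper's evaluation measures.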
resource planning (ERP) solution called the Expeditionary Combat Support System (ECSS), a big-bang approach. In early 2012, the ECSS program was cancelled...Repair, and Overhaul initiative (MROi), a small-bang approach, to increase enterprise visibility and efficiency across all three Air Logistics
Ceramic applications in turbine engines
NASA Technical Reports Server (NTRS)
Helms, H. E.; Heitman, P. W.; Lindgren, L. C.; Thrasher, S. R.
1984-01-01
The application of ceramic components to demonstrate improved cycle efficiency by raising the operating temperature of the existing Allison IGT 404 vehicular gas turbine engine is discussed. This effort, called the Ceramic Applications in Turbine Engines (CATE) program, successfully demonstrated ceramic components, among them two design configurations featuring stationary and rotating ceramic components in the IGT 404 engine. A complete discussion of all phases of the program (design, materials development, fabrication of ceramic components, and testing, including rig, engine, and vehicle demonstration tests) is presented. During the CATE program, a ceramic technology base was established that is now being applied to automotive and other gas turbine engine programs. This technology base is outlined, along with a description of the CATE program accomplishments.
Technology needs for high speed rotorcraft (3)
NASA Technical Reports Server (NTRS)
Detore, Jack; Conway, Scott
1991-01-01
The spectrum of vertical takeoff and landing (VTOL) type aircraft is examined to determine which aircraft are most likely to achieve high subsonic cruise speeds and have hover qualities similar to a helicopter. Two civil mission profiles are considered: a 600-n.mi. mission for a 15- and a 30-passenger payload. Applying current technology, only the 15- and 30-passenger tiltfold aircraft are capable of attaining the 450-knot design goal. The two tiltfold aircraft at 450 knots and a 30-passenger tiltrotor at 375 knots were further developed for the Task II technology analysis. A program called High-Speed Total Envelope Proprotor (HI-STEP) is recommended to address several of these issues based on the tiltrotor concept. A program called Tiltfold System (TFS) is recommended based on the tiltfold concept. A task is identified to resolve the best design speed from productivity and demand considerations based on the technology that emerges from the recommended programs. HI-STEP's goals are to investigate propulsive efficiency, maneuver loads, and aeroelastic stability. Programs currently in progress that may meet the other technology needs include the Integrated High Performance Turbine Engine Technology (IHPTET) program (NASA Lewis) and the Advanced Structural Concepts Program funded through NASA Langley.
49 CFR 198.37 - State one-call damage prevention program.
Code of Federal Regulations, 2010 CFR
2010-10-01
Title 49 (Transportation), Volume 3, 2010-10-01 edition: Regulations for Grants to Aid State Pipeline Safety Programs, Adoption of One-Call Damage Prevention Program, § 198.37 State one-call damage prevention program. A State must adopt a one-call damage prevention...
ERIC Educational Resources Information Center
Hytönen, Kaisa; Palonen, Tuire; Lehtinen, Erno; Hakkarainen, Kai
2014-01-01
In order to address the requirements of future education in different fields of academic professional activity, a model called Academic Apprenticeship Education was initiated in Finland in 2009. The aim of this article is to analyse the development of expert networks in the context of a 1-year Academic Apprenticeship Education model in the field…
Joint Program Executive Office for Chemical and Biological Defense Strategic Plan FY13-18
2012-06-01
strategy calls for a global Biosurveillance network for timely disease surveillance of biological pathogens, whether intentionally made or naturally...Biosurveillance. The President and Secretary of Defense provided renewed emphasis on rapidly and efficiently developing and manufacturing effective medical...and Force Protection Solutions to the Whole of Government and the Nation. Provide MCM for pre- and post-radiation exposure application, including
Programming models for energy-aware systems
NASA Astrophysics Data System (ADS)
Zhu, Haitao
Energy efficiency is an important goal of modern computing, with direct impact on system operational cost, reliability, usability, and environmental sustainability. This dissertation describes the design and implementation of two innovative programming languages for constructing energy-aware systems. First, it introduces ET, a strongly typed programming language to promote and facilitate energy-aware programming, with a novel type system design called Energy Types. Energy Types is built upon a key insight into today's energy-efficient systems and applications: despite the popular perception that energy and power can only be described in joules and watts, real-world energy management is often based on discrete phases and modes, which in turn can be reasoned about very effectively by type systems. A phase characterizes a distinct pattern of program workload, and a mode represents an energy state the program is expected to execute in. Energy Types is designed to reason about energy phases and energy modes, bringing programmers into the optimization of energy management. Second, the dissertation develops Eco, an energy-aware programming language centered on sustainability. A sustainable program built with Eco adaptively adjusts its own behavior to stay within a given energy budget, avoiding both a deficit, which would lead to battery drain or CPU overheating, and a surplus, which could have been spent improving the quality of the program's output. Sustainability is viewed as a form of supply and demand matching, and a sustainable program consistently maintains the equilibrium between supply and demand. ET is implemented as a prototype compiler for smartphone programming on Android, and Eco is implemented as a minimal extension to Java.
Programming practices and benchmarking experiments in these two new languages showed that ET can lead to significant energy savings for Android Apps and Eco can efficiently promote battery awareness and temperature awareness in real-world Java programs.
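The supply-and-demand matching idea behind Eco can be sketched as a loop that picks each iteration's quality level from the energy remaining in the budget. The cost model and the 1-10 quality scale below are assumptions for the example, not Eco's actual semantics or Java syntax.

```python
# Sketch of budget-driven adaptation: each iteration's "demand" is
# proportional to its quality level, and quality is chosen as the fair
# share of the remaining "supply", so the program neither overdraws the
# budget (deficit) nor leaves energy unused at full quality (surplus).
def run_with_budget(budget, iterations, cost_per_quality=1.0):
    """Return total energy spent and the quality chosen per iteration."""
    spent = 0.0
    history = []
    for i in range(iterations):
        remaining = budget - spent
        fair_share = remaining / (iterations - i)   # supply left per iteration
        quality = max(1, min(10, int(fair_share / cost_per_quality)))
        spent += quality * cost_per_quality         # this iteration's demand
        history.append(quality)
    return spent, history

spent, history = run_with_budget(budget=50.0, iterations=10)
```

With a tight budget the loop settles on a reduced but steady quality; with an ample one it saturates at full quality and simply stops spending, the two failure modes (deficit and surplus) the dissertation describes.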
The TeraShake Computational Platform for Large-Scale Earthquake Simulations
NASA Astrophysics Data System (ADS)
Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas
Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes for ever larger problems. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component in the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform, which provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM's BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.
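The explicit stencil update at the heart of such a wave propagation code can be illustrated in one dimension. This toy solves the 1-D acoustic wave equation with central differences and omits everything that makes TS-AWP hard: anelastic attenuation, absorbing boundaries, and the 3-D domain decomposition behind its parallel scaling.

```python
# Second-order explicit update for u_tt = c^2 u_xx. Each grid point is
# advanced from its two previous time levels and its spatial neighbours;
# it is this local, regular stencil that large codes decompose across
# thousands of processors.
def step_wave(u_prev, u_curr, c, dx, dt):
    """Advance the wavefield one explicit step (fixed zero-displacement ends)."""
    r2 = (c * dt / dx) ** 2                  # squared Courant number (<= 1)
    n = len(u_curr)
    u_next = [0.0] * n
    for i in range(1, n - 1):
        u_next[i] = (2.0 * u_curr[i] - u_prev[i]
                     + r2 * (u_curr[i + 1] - 2.0 * u_curr[i] + u_curr[i - 1]))
    return u_next

# A point disturbance splitting into two travelling pulses:
n = 101
u_prev = [0.0] * n
u_prev[50] = 1.0
u_curr = u_prev[:]                           # zero initial velocity
for _ in range(40):
    u_prev, u_curr = u_curr, step_wave(u_prev, u_curr, c=1.0, dx=1.0, dt=0.5)
```

Because each point needs only its immediate neighbours, subdomains exchange only a halo of boundary values per step, which is why strong scaling to tens of thousands of processors is achievable.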
EMPRESS: A European Project to Enhance Process Control Through Improved Temperature Measurement
NASA Astrophysics Data System (ADS)
Pearce, J. V.; Edler, F.; Elliott, C. J.; Rosso, L.; Sutton, G.; Andreu, A.; Machin, G.
2017-08-01
A new European project called EMPRESS, funded by the EURAMET "European Metrology Programme for Innovation and Research" (EMPIR), is described. The 3 year project, which started in the summer of 2015, is intended to substantially augment the efficiency of high-value manufacturing processes by improving temperature measurement techniques at the point of use. The project consortium has 18 partners and 5 external collaborators, from the metrology sector, high-value manufacturing, sensor manufacturing, and academia. Accurate control of temperature is key to ensuring process efficiency and product consistency and is often not achieved to the level required for modern processes. Enhanced efficiency of processes may take several forms including reduced product rejection/waste; improved energy efficiency; increased intervals between sensor recalibration/maintenance; and increased sensor reliability, i.e., reduced amount of operator intervention. Traceability of temperature measurements to the International Temperature Scale of 1990 (ITS-90) is a critical factor in establishing low measurement uncertainty and reproducible, consistent process control. Introducing such traceability in situ (i.e., within the industrial process) is a theme running through this project.
Halvade-RNA: Parallel variant calling from transcriptomic data using MapReduce.
Decap, Dries; Reumers, Joke; Herzeel, Charlotte; Costanza, Pascal; Fostier, Jan
2017-01-01
Given the current cost-effectiveness of next-generation sequencing, the amount of DNA-seq and RNA-seq data generated is ever increasing. One of the primary objectives of NGS experiments is calling genetic variants. While highly accurate, most variant calling pipelines are not optimized to run efficiently on large data sets. However, as variant calling in genomic data has become common practice, several methods have been proposed to reduce runtime for DNA-seq analysis through the use of parallel computing. Determining the effectively expressed variants from transcriptomics (RNA-seq) data has only recently become possible, and as such does not yet benefit from efficiently parallelized workflows. We introduce Halvade-RNA, a parallel, multi-node RNA-seq variant calling pipeline based on the GATK Best Practices recommendations. Halvade-RNA makes use of the MapReduce programming model to create and manage parallel data streams on which multiple instances of existing tools such as STAR and GATK operate concurrently. Whereas the single-threaded processing of a typical RNA-seq sample requires ∼28h, Halvade-RNA reduces this runtime to ∼2h using a small cluster with two 20-core machines. Even on a single, multi-core workstation, Halvade-RNA can significantly reduce runtime compared to using multi-threading, thus providing for a more cost-effective processing of RNA-seq data. Halvade-RNA is written in Java and uses the Hadoop MapReduce 2.0 API. It supports a wide range of distributions of Hadoop, including Cloudera and Amazon EMR.
Applying industrial process improvement techniques to increase efficiency in a surgical practice.
Reznick, David; Niazov, Lora; Holizna, Eric; Siperstein, Allan
2014-10-01
The goal of this study was to examine how industrial process improvement techniques could help streamline the preoperative workup. Lean process improvement was used to streamline patient workup at an endocrine surgery service at a tertiary medical center utilizing multidisciplinary collaboration. The program consisted of several major changes in how patients are processed in the department. The goal was to shorten the wait time between initial call and consult visit and between consult and surgery. A total of 1,438 patients were enrolled in the program. The wait time from the initial call until consult was reduced from 18.3 ± 0.7 to 15.4 ± 0.9 days. Wait time from consult until operation was reduced from 39.9 ± 1.5 to 33.9 ± 1.3 days for the overall practice and to 15.0 ± 4.8 days for low-risk patients. Patient cancellations were reduced from 27.9 ± 2.4% to 17.3 ± 2.5%. Overall patient flow increased from 30.9 ± 5.1 to 52.4 ± 5.8 consults per month (all P < .01). Utilizing process improvement methodology, surgery patients can benefit from an improved, streamlined process with significant reduction in wait time from call to initial consult and initial consult to surgery, with reduced cancellations. This generalized process has resulted in increased practice throughput and efficiency and is applicable to any surgery practice. Copyright © 2014 Elsevier Inc. All rights reserved.
Adaptation of a program for nonlinear finite element analysis to the CDC STAR 100 computer
NASA Technical Reports Server (NTRS)
Pifko, A. B.; Ogilvie, P. L.
1978-01-01
The conversion of a nonlinear finite element program to the CDC STAR 100 pipeline computer is discussed. The program, called DYCAST, was developed for the crash simulation of structures. Initial results with the STAR 100 computer indicated that significant gains in computation time are possible for operations on global arrays. However, for element-level computations that do not lend themselves easily to long vector processing, the STAR 100 was slower than comparable scalar computers. On this basis it is concluded that in order for pipeline computers to impact the economic feasibility of large nonlinear analyses it is absolutely essential that algorithms be devised to improve the efficiency of element-level computations.
The past, present, and future of U.S. utility demand-side management programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eto, J.
Demand-side management or DSM refers to active efforts by electric and gas utilities to modify customers' energy use patterns. The experience in the US shows that utilities, when provided with appropriate incentives, can provide a powerful stimulus to energy efficiency in the private sector. This paper describes the range and history of DSM programs offered by US electric utilities, with a focus on the political, economic, and regulatory events that have shaped their evolution. It also describes the changes these programs are undergoing as a result of US electricity industry restructuring. DSM programs began modestly in the 1970s in response to growing concerns about dependence on foreign sources of oil and environmental consequences of electricity generation, especially nuclear power. The foundation for the unique US partnership between government and utility interests can be traced first to the private-ownership structure of the vertically integrated electricity industry and second to the monopoly franchise granted by state regulators. Electricity industry restructuring calls into question both of these basic conditions, and thus the future of utility DSM programs for the public interest. Future policies guiding ratepayer-funded energy-efficiency DSM programs will need to pay close attention to the specific market objectives of the programs and to the balance between public and private interests.
The future of almanac services --- an HMNAO perspective
NASA Astrophysics Data System (ADS)
Bell, S.; Nelmes, S.; Prema, P.; Whittaker, J.
2015-08-01
This talk will explore the means for delivering almanac data currently under consideration by HM Nautical Almanac Office for the near to medium future. While there will be a need to continue printed almanacs, almanac data must be available in a variety of forms, ranging from paper almanacs to traditional web services through to applications for mobile devices and smartphones. The supply of data using applications may call for a different philosophy in supplying ephemeris data, one that differentiates between an application that calls on a web server for its data and one that has built-in ephemerides. These ephemerides need to be of reasonably high precision while maintaining a modest machine footprint. These services also need to cover a wide range of applications, from traditional sunrise/set data through to more specialized services such as celestial navigation. The work necessary to meet these goals involves efficient programming, intuitive user interfaces, compact and efficient ephemerides and a suitable range of tools to meet the user's needs.
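Compact built-in ephemerides are commonly stored as Chebyshev polynomial coefficients per time interval, the representation used by the JPL development ephemerides; it is shown here as one plausible realization of a "modest machine footprint", not necessarily HMNAO's format. Clenshaw's recurrence evaluates such a series cheaply:

```python
# Evaluate sum_k c_k T_k(x) with Clenshaw's recurrence, where x is the
# evaluation time t mapped from the coefficient interval [t0, t1] onto
# [-1, 1]. A handful of coefficients per interval per coordinate is all
# the application needs to ship.
def chebyshev_eval(coeffs, t0, t1, t):
    """Clenshaw evaluation of a Chebyshev series at time t in [t0, t1]."""
    x = (2.0 * t - t0 - t1) / (t1 - t0)      # map t onto [-1, 1]
    b1 = b2 = 0.0
    for c in reversed(coeffs[1:]):           # b_k = 2 x b_{k+1} - b_{k+2} + c_k
        b1, b2 = 2.0 * x * b1 - b2 + c, b1
    return x * b1 - b2 + coeffs[0]

# Example: coefficients [1, 2, 3] encode 1 + 2*T1(x) + 3*T2(x) = 6x^2 + 2x - 2
mid_value = chebyshev_eval([1.0, 2.0, 3.0], 0.0, 1.0, 0.5)   # x = 0 -> -2.0
```

The recurrence uses one multiply-add pair per coefficient and no stored powers of x, which is what keeps both the data files and the evaluation code small on a mobile device.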
Antennas for mobile satellite communications
NASA Technical Reports Server (NTRS)
Huang, John
1991-01-01
A NASA sponsored program, called the Mobile Satellite (MSAT) system, has prompted the development of several innovative antennas at L-band frequencies. In the space segment of the MSAT system, an efficient, light weight, circularly polarized microstrip array that uses linearly polarized elements was developed as a multiple beam reflector feed system. In the ground segment, a low-cost, low-profile, and very efficient microstrip Yagi array was developed as a medium-gain mechanically steered vehicle antenna. Circularly shaped microstrip patches excited at higher-order modes were also developed as low-gain vehicle antennas. A more recent effort called for the development of a 20/30 GHz mobile terminal antenna for future-generation mobile satellite communications. To combat the high insertion loss encountered at 20/30 GHz, series-fed Monolithic Microwave Integrated Circuit (MMIC) microstrip array antennas are currently being developed. These MMIC arrays may lead to the development of several small but high-gain Ka-band antennas for the Personal Access Satellite Service planned for the 2000s.
Flexible Environmental Modeling with Python and Open - GIS
NASA Astrophysics Data System (ADS)
Pryet, Alexandre; Atteia, Olivier; Delottier, Hugo; Cousquer, Yohann
2015-04-01
Numerical modeling now represents a prominent task of environmental studies. During the last decades, numerous commercial programs have been made available to environmental modelers. These software applications offer user-friendly graphical user interfaces that allow efficient management of many case studies. However, they suffer from a lack of flexibility, and closed-source policies impede source code reviewing and enhancement for original studies. Advanced modeling studies require flexible tools capable of managing thousands of model runs for parameter optimization, uncertainty and sensitivity analysis. In addition, there is a growing need for the coupling of various numerical models associating, for instance, groundwater flow modeling to multi-species geochemical reactions. Researchers have produced hundreds of open-source, powerful command line programs. However, there is a need for a flexible graphical user interface allowing efficient processing of the geospatial data that comes along with any environmental study. Here, we present the advantages of using the free and open-source QGIS platform and the Python scripting language for conducting environmental modeling studies. The interactive graphical user interface is first used for the visualization and pre-processing of input geospatial datasets. The Python scripting language is then employed for further input data processing, calls to one or several models, and post-processing of model outputs. Model results are eventually sent back to the GIS program, processed and visualized. This approach combines the advantages of interactive graphical interfaces and the flexibility of the Python scripting language for data processing and model calls. The numerous Python modules available facilitate geospatial data processing and numerical analysis of model outputs. Once input data has been prepared with the graphical user interface, models may be run thousands of times from the command line with sequential or parallel calls.
We illustrate this approach with several case studies in groundwater hydrology and geochemistry and provide links to several python libraries that facilitate pre- and post-processing operations.
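The call pattern described above (prepare inputs in the GIS, then launch many model runs from Python) can be sketched with the standard library alone. The model command here is a placeholder one-liner; in a real study `run_model` would invoke the actual model executable and parse its output files, and `param` would be a parameter set rather than a single number.

```python
import subprocess
import sys
from concurrent.futures import ThreadPoolExecutor

# Launch many external model runs in parallel. Threads are enough here
# because each run is a separate OS process; the Python side only starts
# it and parses its output.
def run_model(param):
    """Run one model realization and return (param, parsed output)."""
    result = subprocess.run(
        [sys.executable, "-c", f"print({param} ** 2)"],  # stand-in for the model
        capture_output=True, text=True, check=True)
    return param, float(result.stdout)

def run_all(params, workers=4):
    """Map parameter values to model outputs, several runs at a time."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(run_model, params))

outputs = run_all([1, 2, 3, 4])   # {1: 1.0, 2: 4.0, 3: 9.0, 4: 16.0}
```

For calibration or sensitivity analysis the parameter list would come from a sampling design, and the collected outputs would be handed back to the GIS layer or to an analysis library for post-processing.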
Lien, Rebecca K; Schillo, Barbara A; Mast, Jay L; Lukowski, Amy V; Greenseid, Lija O; Keith, Jennifer D; Keller, Paula A
2016-01-01
Tobacco users in all 50 states have access to quitline telephone counseling and cessation medications. While studies show multiple calls relate to quit success, most participants do not complete a full call series. To date, quitline program use studies have analyzed single factors-such as number of calls or counseling minutes. This study combines multiple factors of quitline program use across 2 states to describe how participants use a 5-call program; assess whether intensity of program use is associated with participant subgroups; and assess whether key outcomes (quitting, satisfaction) are associated with intensity. This observational study examines data for quitline participants in Minnesota (n = 2844) and Pennsylvania (n = 14 359) in 2011 and 2012. A subset of participants was surveyed 7 months after registration to assess key outcomes (response rates: Minnesota 65%; Pennsylvania 60%). Quitline utilization data were used to identify program use variables: nicotine replacement therapy provision, number of counseling calls, number of counseling minutes, days from first to last counseling call, and days from registration to first counseling call. Ten program use groups were created using all 5 program use variables, from lowest (1) to highest (10) intensity. Results were similar for both states. Only 11% of Minnesota and 8% of Pennsylvania participants completed all 5 calls. Intensity of quitline program use was associated with several participant characteristics including health conditions and age. Both quit status and program satisfaction were associated with program use intensity. Quit rates peaked in group 9, participants who received the full 5-call program. Quitlines should focus on engaging participants in multiple calls to improve quit outcomes. In addition, it is important to leverage multiple program use factors for a fuller understanding of how quitline participants use a program.
1986 Year End Report for Road Following at Carnegie-Mellon
1987-05-01
how to make them work efficiently. We designed a hierarchical structure and a monitor module which manages all parts of the hierarchy (see figure 1...database, called the Local Map, is managed by a program known as the Local Map Builder (LMB). Each module stores and retrieves information in the...knowledge-intensive modules, and a database manager that synchronizes the modules, is characteristic of a traditional blackboard system. Such a system is
A novel approach in formulation of special transition elements: Mesh interface elements
NASA Technical Reports Server (NTRS)
Sarigul, Nesrin
1991-01-01
The objective of this research program is the development of more accurate and efficient methods for the solution of singular problems encountered in various branches of mechanics. The research program can be categorized under three levels. The first two levels involve the formulation of a new class of elements called 'mesh interface elements' (MIE) to connect meshes of traditional elements either in three dimensions or in three and two dimensions. The finite element formulations are based on boolean sum and blending operators. MIE are being formulated and tested in this research to account for the steep gradients encountered in aircraft and space structure applications. At present, the heat transfer and structural analysis problems are being formulated from an uncoupled theory point of view. The status report: (1) summarizes the formulation for heat transfer and structural analysis; (2) explains the formulation of MIE; (3) examines computational efficiency; and (4) shows verification examples.
ERIC Educational Resources Information Center
Samani, Ebrahim; Baki, Roselan; Razali, Abu Bakar
2014-01-01
Success in implementation of computer-assisted language learning (CALL) programs depends on the teachers' understanding of the roles of CALL programs in education. Consequently, it is also important to understand the barriers teachers face in the use of computer-assisted language learning (CALL) programs. The current study was conducted on 14…
eMBI: Boosting Gene Expression-based Clustering for Cancer Subtypes.
Chang, Zheng; Wang, Zhenjia; Ashby, Cody; Zhou, Chuan; Li, Guojun; Zhang, Shuzhong; Huang, Xiuzhen
2014-01-01
Identifying clinically relevant subtypes of a cancer using gene expression data is a challenging and important problem in medicine, and is a necessary premise to provide specific and efficient treatments for patients of different subtypes. Matrix factorization provides a solution by finding checker-board patterns in the matrices of gene expression data. In the context of gene expression profiles of cancer patients, these checkerboard patterns correspond to genes that are up- or down-regulated in patients with particular cancer subtypes. Recently, a new matrix factorization framework for biclustering called Maximum Block Improvement (MBI) was proposed; however, it still suffers from several problems when applied to cancer gene expression data analysis. In this study, we developed several effective strategies to improve MBI and designed a new program called enhanced MBI (eMBI), which is more effective and efficient at identifying cancer subtypes. Our tests on several gene expression profiling datasets of cancer patients consistently indicate that eMBI achieves significant improvements over MBI in terms of cancer subtype prediction accuracy, robustness, and running time. In addition, the performance of eMBI is much better than that of another widely used matrix factorization method called nonnegative matrix factorization (NMF) and of hierarchical clustering, which is often the first choice of clinical analysts in practice. PMID:25374455
NASA Technical Reports Server (NTRS)
1987-01-01
In a complex computer environment there is ample opportunity for error: a mistake by a programmer, or a software-induced undesirable side effect. In insurance, errors can cost a company heavily, so protection against inadvertent change is a must for the efficient firm. The data processing center at Transport Life Insurance Company has taken a step to guard against accidental changes by adopting a software package called EQNINT (Equations Interpreter Program). EQNINT cross-checks the basic formulas in a program against the formulas that make up the major production system. EQNINT assures that formulas are coded correctly and helps catch errors before they affect customer service or profitability.
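EQNINT's core idea, cross-checking two codings of the same formula, can be approximated by random differential testing. The sketch below is a hypothetical illustration, not EQNINT itself; the function name, input ranges, and tolerance are invented:

```python
import random

def formulas_agree(f, g, n_vars, trials=500, tol=1e-9, seed=1):
    """Compare two implementations of a formula on random inputs.
    Disagreement on any trial flags a likely coding error."""
    rng = random.Random(seed)
    for _ in range(trials):
        args = [rng.uniform(1.0, 100.0) for _ in range(n_vars)]
        fa, ga = f(*args), g(*args)
        if abs(fa - ga) > tol * max(1.0, abs(fa)):
            return False
    return True
```

For example, `p * (1 + r)` and `p + p * r` agree everywhere, while a mistyped `p * (1 + r / 2)` is caught within the first few trials.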
Joseph, G; Kaplan, C; Luce, J; Lee, R; Stewart, S; Guerra, C; Pasick, R
2012-01-01
Identification of low-income women with the rare but serious risk of hereditary cancer and their referral to appropriate services presents an important public health challenge. We report the results of formative research to reach thousands of women for efficient identification of those at high risk and expedient access to free genetic services. External validity is maximized by emphasizing intervention fit with the two end-user organizations who must connect to make this possible. This study phase informed the design of a subsequent randomized controlled trial. We conducted a randomized controlled pilot study (n = 38) to compare two intervention models for feasibility and impact. The main outcome was receipt of genetic counseling during a two-month intervention period. Model 1 was based on the usual outcall protocol of an academic hospital genetic risk program, and Model 2 drew on the screening and referral procedures of a statewide toll-free phone line through which large numbers of high-risk women can be identified. In Model 1, the risk program proactively calls patients to schedule genetic counseling; for Model 2, women are notified of their eligibility for counseling and make the call themselves. We also developed and pretested a family history screener for administration by phone to identify women appropriate for genetic counseling. There was no statistically significant difference in receipt of genetic counseling between women randomized to Model 1 (3/18) compared with Model 2 (3/20) during the intervention period. However, when unresponsive women in Model 2 were called after 2 months, 7 more obtained counseling; 4 women from Model 1 were also counseled after the intervention. Thus, the intervention model that closely aligned with the risk program's outcall to high-risk women was found to be feasible and brought more low-income women to free genetic counseling. 
Our screener was easy to administer by phone and appeared to identify high-risk callers effectively. The model and screener are now in use in the main trial to test the effectiveness of this screening and referral intervention. A validation analysis of the screener is also underway. Identification of intervention strategies and tools, and their systematic comparison for impact and efficiency in the context where they will ultimately be used are critical elements of practice-based research. Copyright © 2012 S. Karger AG, Basel.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Christopher; Hasanbeigi, Ali; Price, Lynn
Improving the efficiency of energy production and consumption and switching to lower carbon energy sources can significantly decrease carbon dioxide (CO2) emissions and reduce climate change impacts. A growing body of research has found that these measures can also directly mitigate many non-climate-change-related human health hazards and environmental damage. Positive impacts of policies and programs that occur in addition to the intended primary policy goal are called co-benefits. Policy analysis relies on forecasting and comparing the costs of policy and program implementation and the benefits that accrue to society from implementation. GHG reduction and energy efficiency policies and programs face political resistance in part because of the difficulty of quantifying their benefits. On the one hand, climate change mitigation policy benefits are often global, long-term, and subject to large uncertainties, and subsidized energy pricing can reduce the direct monetary benefits of energy efficiency policies to below their cost. On the other hand, the co-benefits that accrue from these efforts' resultant reductions in conventional air pollution (such as improved health, agricultural productivity, reduced damage to infrastructure, and local ecosystem improvements) are generally near term, local, more certain than climate change mitigation benefits, and larger than the monetary value of energy savings. The incorporation of co-benefits into energy efficiency and climate mitigation policy and program analysis therefore might significantly increase the uptake of these policies. Faster policy uptake is especially important in developing countries because ongoing development efforts that do not consider co-benefits may lock in suboptimal technologies and infrastructure and result in high costs in future years.
Over the past two decades, studies have repeatedly documented that the non-climate-change-related benefits of energy efficiency and fuel conversion efforts, as part of GHG mitigation strategies, can range from 30% to over 100% of the costs of such policies and programs. Policy makers around the world are increasingly interested in including both GHG and non-GHG impacts in analyses of energy efficiency and fuel switching policies and programs, and a set of methodologies has matured from the efforts of early-moving jurisdictions such as the European Union, the United States, and Japan.
Cloud-based large-scale air traffic flow optimization
NASA Astrophysics Data System (ADS)
Cao, Yi
The ever-increasing traffic demand makes the efficient use of airspace an imperative mission, and this paper presents an effort in response to this call. Firstly, a new aggregate model, called the Link Transmission Model (LTM), is proposed, which models the nationwide traffic as a network of flight routes identified by origin-destination pairs. The traversal time of a flight route is assumed to be the mode of the distribution of historical flight records, and the mode is estimated by using Kernel Density Estimation. As this simplification abstracts away physical trajectory details, the complexity of modeling is drastically decreased, resulting in efficient traffic forecasting. The predictive capability of LTM is validated against recorded traffic data. Secondly, a nationwide traffic flow optimization problem with airport and en route capacity constraints is formulated based on LTM. The optimization problem aims at alleviating traffic congestion with minimal global delays. This problem is intractable because it involves millions of variables. A dual decomposition method is applied to decompose the large-scale problem so that the subproblems become solvable. However, the whole problem is still computationally expensive to solve, since each subproblem is a smaller integer programming problem that pursues integer solutions. Solving an integer programming problem is known to be far more time-consuming than solving its linear relaxation. In addition, sequential execution on a standalone computer leads to a linear runtime increase as the problem size grows. To address the computational efficiency problem, a parallel computing framework is designed which accommodates concurrent executions via multithreaded programming. The multithreaded version is compared with its monolithic version to show the decreased runtime. Finally, an open-source cloud computing framework, Hadoop MapReduce, is employed for better scalability and reliability.
This framework is an "off-the-shelf" parallel computing model that can be used for both offline historical traffic data analysis and online traffic flow optimization. It provides an efficient and robust platform for easy deployment and implementation. A small cloud consisting of five workstations was configured and used to demonstrate the advantages of cloud computing in dealing with large-scale parallelizable traffic problems.
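The decomposition step can be illustrated on a deliberately tiny instance. The sketch below relaxes one coupling capacity constraint with a Lagrange multiplier and solves the per-route subproblems independently, which is the part that parallelizes across threads or MapReduce workers. The quadratic delay cost and all numbers are hypothetical, and the dissertation's real formulation is an integer program, not this continuous toy:

```python
import numpy as np

# Two "flight routes" must jointly absorb enough ground delay so that the
# flow on a shared link stays within capacity (all figures hypothetical).
demand = np.array([6.0, 5.0])              # scheduled flights per route
capacity = 8.0                             # shared link capacity
required_delay = demand.sum() - capacity   # total delay needed: 3.0

def solve_subproblem(lam, dem):
    """Each route minimizes its own delay cost d**2/2 minus the price lam*d.
    Closed form: d = lam, clipped to [0, dem].  Independent per route,
    hence parallelizable."""
    return min(max(lam, 0.0), dem)

lam = 0.0
for _ in range(200):
    delays = np.array([solve_subproblem(lam, q) for q in demand])
    # Dual (price) update: subgradient is the coupling-constraint violation.
    lam = max(0.0, lam + 0.1 * (required_delay - delays.sum()))
```

At convergence the price settles where the routes split the required delay evenly (1.5 units each here), and the link flow meets capacity exactly.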
Reliability of programs specified with equational specifications
NASA Astrophysics Data System (ADS)
Nikolik, Borislav
Ultrareliability is desirable (and sometimes a demand of regulatory authorities) for safety-critical applications, such as commercial flight-control programs, medical applications, nuclear reactor control programs, etc. A method is proposed, called the Term Redundancy Method (TRM), for obtaining ultrareliable programs through specification-based testing. Current specification-based testing schemes need a prohibitively large number of test cases for estimating ultrareliability. They assume the availability of an accurate program-usage distribution prior to testing, and they assume the availability of a test oracle. It is shown how to obtain ultrareliable programs (probability of failure near zero) with a practical number of test cases, without an accurate usage distribution, and without a test oracle. TRM applies to the class of decision Abstract Data Type (ADT) programs specified with unconditional equational specifications. TRM is restricted to programs that do not exceed certain efficiency constraints in generating test cases. The effectiveness of TRM in failure detection and recovery is demonstrated on formulas from the aircraft collision avoidance system TCAS.
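The oracle-free flavor of such specification-based testing can be sketched with a toy stack ADT: the equational axioms themselves decide pass or fail, so no separate expected-output oracle is needed. This is a loose illustration of the idea, not the TRM algorithm, and the stack implementation is invented for the example:

```python
import random

# ADT under test: an integer stack (hypothetical implementation).
def push(s, x): return s + [x]
def pop(s):     return s[:-1]
def top(s):     return s[-1]

# Unconditional equational axioms, encoded as executable checks.
AXIOMS = [
    lambda s, x: pop(push(s, x)) == s,    # pop(push(S, X)) = S
    lambda s, x: top(push(s, x)) == x,    # top(push(S, X)) = X
]

def run_tests(n=1000, seed=0):
    """Generate random terms and check every axiom; the spec is the oracle."""
    rng = random.Random(seed)
    for _ in range(n):
        s = [rng.randint(0, 9) for _ in range(rng.randint(0, 5))]
        x = rng.randint(0, 9)
        for ax in AXIOMS:
            if not ax(s, x):
                return False
    return True
```

A buggy implementation (say, a `top` that returns the bottom element) violates the second axiom on almost any generated term, so failures surface without any hand-written expected outputs.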
A low-cost, practical method for increasing smokers' interest in smoking cessation programs.
McDonald, Paul W
2004-01-01
Low participation rates reduce the public health impact of smoking cessation programs. Two barriers for improving participation are the cost of media campaigns and the proportion of smokers motivated to quit smoking. The objective of this study was to examine the feasibility of using classified newspaper ads and messages aimed at each stage of change to enhance participation in smoking cessation programs. Three classified ads were run concurrently in a local daily newspaper for five consecutive days. The ads were designed to engage smokers in each of Prochaska's five stages of change. Each ad invited smokers or former smokers to call the local health department to participate in a paid focus group to design a new health department program. Calls were received from 181 eligible smokers, including 124 who provided data for the study. Thirty-seven, 34, and 29 percent of smoking respondents were in precontemplation, contemplation and preparation respectively. Half of ex-smokers were in the action stage. Ads cost 174 dollars (Cdn), thus the cost per recruit was less than a dollar. Classified ads can recruit smokers from all stages of change. Compared to traditional mass media, classified ads may also be a highly cost-efficient promotional strategy. Results provide justification for further research.
Optimization methods and silicon solar cell numerical models
NASA Technical Reports Server (NTRS)
Girardini, K.
1986-01-01
The goal of this project is the development of an optimization algorithm for use with a solar cell model. It is possible to simultaneously vary design variables such as impurity concentrations, front junction depth, back junction depth, and cell thickness to maximize the predicted cell efficiency. An optimization algorithm has been developed and interfaced with the Solar Cell Analysis Program in 1 Dimension (SCAPID). SCAPID uses finite difference methods to solve the differential equations which, along with several relations from the physics of semiconductors, describe mathematically the operation of a solar cell. A major obstacle is that the numerical methods used in SCAPID require a significant amount of computer time, and during an optimization the model is called iteratively until the design variables converge to the values associated with the maximum efficiency. This problem has been alleviated by designing an optimization code specifically for use with numerically intensive simulations, to reduce the number of times the efficiency has to be calculated to achieve convergence to the optimal solution. Adapting SCAPID so that it could be called iteratively by the optimization code provided another means of reducing the CPU time required to complete an optimization. Instead of calculating the entire I-V curve, as is usually done in SCAPID, only the efficiency is calculated (maximum power voltage and current), and the solution from previous calculations is used to initiate the next solution.
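The call-count concern generalizes beyond SCAPID: any search that reuses prior evaluations reduces simulation cost. As a stand-in sketch (the parabolic "efficiency" model and its 180 um optimum are invented, not SCAPID output), golden-section search needs only one new model evaluation per iteration:

```python
CALLS = 0

def cell_efficiency(thickness_um):
    """Hypothetical stand-in for a full SCAPID run: a smooth efficiency
    curve (percent) peaking near 180 um.  Each call counts as one
    expensive simulation."""
    global CALLS
    CALLS += 1
    return 20.0 - 0.0005 * (thickness_um - 180.0) ** 2

def golden_max(f, lo, hi, tol=1e-4):
    """Golden-section search for the maximum of a unimodal function.
    Each iteration reuses one interior evaluation, so only one new call
    to the expensive model is made per step."""
    phi = (5 ** 0.5 - 1) / 2
    a, b = lo, hi
    c, d = b - phi * (b - a), a + phi * (b - a)
    fc, fd = f(c), f(d)
    while b - a > tol:
        if fc > fd:
            b, d, fd = d, c, fc        # keep left interval, reuse fc
            c = b - phi * (b - a)
            fc = f(c)
        else:
            a, c, fc = c, d, fd        # keep right interval, reuse fd
            d = a + phi * (b - a)
            fd = f(d)
    return (a + b) / 2
```

On this toy model, `golden_max(cell_efficiency, 50, 400)` locates the optimum thickness in a few dozen model calls, far fewer than a naive grid scan would need.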
1977-01-25
large numbers of refugees materialized. This plan, tentatively called Operation COMPASSION, envisioned that the refugees would at first be evacuated... [The remainder of this declassified 1977 document is garbled in extraction; legible fragments mention overnight housing for transient personnel, and state that the resettlement of refugees at Ft Chaffee and Ft Indiantown Gap was accomplished with compassion and efficiency by the Army teams.]
ERIC Educational Resources Information Center
Junge, Melissa; Krvaric, Sheara
2012-01-01
Title I of the Elementary and Secondary Education Act, a federal program to provide additional assistance to academically struggling students in high-poverty areas, has long contained a provision called the "supplement-not-supplant" requirement. This provision was designed to ensure Title I funds were spent on extra educational services for…
The EMT universe: space between cancer cell dissemination and metastasis initiation.
Ombrato, Luigi; Malanchi, Ilaria
2014-01-01
Tumor metastasis, the cause of more than 90% of cancer-related mortality, is a multistep process by which tumor cells disseminate from their primary site via local invasion and intravasation into blood or lymphatic vessels and reach secondary distant sites, where they survive and reinitiate tumor growth. Activation of a developmental program called the epithelial-to-mesenchymal transition (EMT) has been shown to be a very efficient strategy adopted by epithelial cancer cells to promote local invasion and dissemination to distant organs. Remarkably, the activation of EMT programs in epithelial cells correlates with the appearance of stemness. This finding suggests that the EMT process also drives the initial cancer cell colonization at distant sites. However, recent studies support the concept that its reverse program, a mesenchymal-to-epithelial transition, is required for efficient metastatic colonization and that EMT is not necessarily associated with stemness. This review analyzes the conflicting experimental evidence linking epithelial plasticity to stemness in the light of an "EMT gradient model," according to which the outcome of EMT program activation in epithelial cells would be bimodal: coupled to stemness during initial activation, but, when forced to reach an advanced mesenchymal status, incompatible with stem cell abilities.
A Rewriting-Based Approach to Trace Analysis
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)
2002-01-01
We present a rewriting-based algorithm for efficiently evaluating future-time Linear Temporal Logic (LTL) formulae on finite execution traces online. While the standard models of LTL are infinite traces, finite traces appear naturally when testing and/or monitoring real applications that only run for limited time periods. The presented algorithm is implemented in the Maude executable specification language and essentially consists of a set of equations establishing an executable semantics of LTL using a simple formula-transforming approach. The algorithm is further improved to build automata on-the-fly from formulae, using memoization. The result is a very efficient and small Maude program that can be used to monitor program executions. We furthermore present an alternative algorithm for synthesizing provably minimal observer finite state machines (or automata) from LTL formulae, which can be used to analyze execution traces without the need for a rewriting system, and can hence be used by observers written in conventional programming languages. The presented work is part of an ambitious runtime verification and monitoring project at NASA Ames, called PATHEXPLORER, and demonstrates that rewriting can be a tractable and attractive means for experimenting with and implementing program monitoring logics.
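The formula-transforming approach is compact enough to sketch outside Maude. The Python below implements the standard rewrite laws F f = f \/ X F f and G f = f /\ X G f over finite traces; it is a simplified illustration, not the PATHEXPLORER code, and the tuple encoding of formulas is invented for the example:

```python
# Formulas are nested tuples; a state is the set of atomic propositions
# observed at one step of the trace.
TRUE, FALSE = ('true',), ('false',)

def step(f, state):
    """Rewrite f into the obligation that must hold on the rest of the trace."""
    op = f[0]
    if op in ('true', 'false'):
        return f
    if op == 'ap':
        return TRUE if f[1] in state else FALSE
    if op in ('and', 'or'):
        l, r = step(f[1], state), step(f[2], state)
        zero, one = (FALSE, TRUE) if op == 'and' else (TRUE, FALSE)
        if zero in (l, r):
            return zero               # annihilator short-circuits
        if l == one:
            return r                  # identity element drops out
        if r == one:
            return l
        return (op, l, r)
    if op == 'next':
        return f[1]
    if op == 'eventually':            # F f  ==  f \/ X F f
        return step(('or', f[1], ('next', f)), state)
    if op == 'always':                # G f  ==  f /\ X G f
        return step(('and', f[1], ('next', f)), state)
    raise ValueError('unknown operator: %r' % (op,))

def finalize(f):
    """End-of-trace semantics: pending 'always' holds; pending
    'eventually', 'next', and bare atoms fail."""
    op = f[0]
    if op in ('true', 'always'):
        return True
    if op == 'and':
        return finalize(f[1]) and finalize(f[2])
    if op == 'or':
        return finalize(f[1]) or finalize(f[2])
    return False                      # false, ap, next, eventually

def holds(f, trace):
    for state in trace:
        f = step(f, state)
    return finalize(f)
```

With `trace = [{'a'}, {'a','b'}, {'a'}]`, `holds(('always', ('ap','a')), trace)` is true while `holds(('eventually', ('ap','c')), trace)` is false, matching finite-trace semantics.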
When seconds count: A study of communication variables in the opening segment of emergency calls.
Penn, Claire; Koole, Tom; Nattrass, Rhona
2017-09-01
The opening sequence of an emergency call influences the efficiency of the ambulance dispatch time. The greeting sequences in 105 calls to a South African emergency service were analysed. Initial results suggested the advantage of a specific two-part opening sequence. An on-site experiment aimed at improving call efficiency was conducted during one shift (1100 calls). Results indicated reduced conversational repairs and a significant reduction of 4 seconds in mean call length. Implications for systems and training are derived.
Unifying Model-Based and Reactive Programming within a Model-Based Executive
NASA Technical Reports Server (NTRS)
Williams, Brian C.; Gupta, Vineet; Norvig, Peter (Technical Monitor)
1999-01-01
Real-time, model-based deduction has recently emerged as a vital component in AI's tool box for developing highly autonomous reactive systems. Yet one of the current hurdles towards developing model-based reactive systems is the number of methods simultaneously employed, and their corresponding melange of programming and modeling languages. This paper offers an important step towards unification. We introduce RMPL, a rich modeling language that combines probabilistic, constraint-based modeling with reactive programming constructs, while offering a simple semantics in terms of hidden-state Markov processes. We introduce probabilistic, hierarchical constraint automata (PHCA), which allow Markov processes to be expressed in a compact representation that preserves the modularity of RMPL programs. Finally, a model-based executive, called Reactive Burton, is described that exploits this compact encoding to perform efficient simulation, belief state update, and control sequence generation.
Constructing linkage maps in the genomics era with MapDisto 2.0.
Heffelfinger, Christopher; Fragoso, Christopher A; Lorieux, Mathias
2017-07-15
Genotyping by sequencing (GBS) generates datasets that are challenging to handle by current genetic mapping software with graphical interface. Geneticists need new user-friendly computer programs that can analyze GBS data on desktop computers. This requires improvements in computation efficiency, both in terms of speed and use of random-access memory (RAM). MapDisto v.2.0 is a user-friendly computer program for construction of genetic linkage maps. It includes several new major features: (i) handling of very large genotyping datasets like the ones generated by GBS; (ii) direct importation and conversion of Variant Call Format (VCF) files; (iii) detection of linkage, i.e. construction of linkage groups in case of segregation distortion; (iv) data imputation on VCF files using a new approach, called LB-Impute. Features i to iv operate through inclusion of new Java modules that are used transparently by MapDisto; (v) QTL detection via a new R/qtl graphical interface. The program is available free of charge at mapdisto.free.fr. mapdisto@gmail.com. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
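The VCF-handling step can be sketched as follows. This is an illustrative fragment, not MapDisto's importer: it extracts GT calls from sample fields and computes a naive recombination-fraction proxy between two marker rows (MapDisto's own estimators are maximum-likelihood based, and the helper names here are invented):

```python
def parse_gt(field):
    """Extract a diploid genotype from a VCF sample field like '0/1:35'.
    Phased '|' and unphased '/' separators are treated alike; missing
    calls ('./.') return None."""
    gt = field.split(':')[0].replace('|', '/')
    return None if '.' in gt else tuple(sorted(int(a) for a in gt.split('/')))

def recomb_fraction(row1, row2):
    """Naive recombination-fraction estimate between two markers in an
    F2-like population: the share of individuals whose genotype differs
    between the markers, skipping missing data."""
    pairs = [(parse_gt(a), parse_gt(b)) for a, b in zip(row1, row2)]
    pairs = [(a, b) for a, b in pairs if a is not None and b is not None]
    if not pairs:
        return None
    return sum(a != b for a, b in pairs) / len(pairs)
```

Markers with a small recombination fraction would be grouped into the same linkage group; a real GBS pipeline must additionally handle the heavy missingness and genotyping error that motivate imputation tools such as LB-Impute.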
Biyikli, Emre; To, Albert C.
2015-01-01
A new topology optimization method called Proportional Topology Optimization (PTO) is presented. As a non-sensitivity method, PTO is simple to understand, easy to implement, and at the same time efficient and accurate. It is implemented in two MATLAB programs to solve the stress-constrained and minimum compliance problems. Descriptions of the algorithm and computer programs are provided in detail. The method is applied to solve three numerical examples for both types of problems. The method shows efficiency and accuracy comparable with an existing optimality criteria method which computes sensitivities. Also, the PTO stress-constrained algorithm and minimum compliance algorithm are compared by feeding output from one algorithm to the other in an alternating manner, where the former yields lower maximum stress and volume fraction but higher compliance compared to the latter. Advantages and disadvantages of the proposed method and future works are discussed. The computer programs are self-contained and publicly shared at the website www.ptomethod.org. PMID:26678849
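The proportional idea at the heart of PTO can be sketched in one update step. The fragment below is a simplified, hypothetical reduction (no finite element solve and no inner volume loop, both of which the published algorithm has): material is assigned in proportion to a per-element measure such as stress or compliance, then blended with the previous densities:

```python
import numpy as np

def pto_update(measure, target_volume, x, alpha=0.5, p=1.0, x_min=0.01):
    """One simplified Proportional Topology Optimization step.
    measure       -- per-element quantity (e.g. stress or compliance)
    target_volume -- total material to distribute
    x             -- current element densities
    alpha         -- history blending factor; p -- proportion exponent."""
    prop = measure ** p
    x_prop = target_volume * prop / prop.sum()   # proportional distribution
    x_new = alpha * x + (1 - alpha) * x_prop     # blend with previous design
    return np.clip(x_new, x_min, 1.0)            # enforce density bounds
```

Because no sensitivities are computed, each iteration only needs the analysis result itself, which is what makes the method simple to implement; the price is that convergence rests on the proportionality heuristic rather than on a gradient.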
Implementing Implementation Science: An Approach for HIV Prevention, Care and Treatment Programs.
Lambdin, Barrot H; Cheng, Ben; Peter, Trevor; Mbwambo, Jessie; Apollo, Tsitsi; Dunbar, Megan; Udoh, Ifeoma C; Cattamanchi, Adithya; Geng, Elvin H; Volberding, Paul
2015-01-01
Though great progress has been realized over the last decade in extending HIV prevention, care and treatment in some of the least resourced settings of the world, a substantial gap remains between what we know works and what we are actually achieving in HIV programs. To address this, leaders have called for the adoption of an implementation science framework to improve the efficiency and effectiveness of HIV programs. Implementation science (IS) is a multidisciplinary scientific field that seeks generalizable knowledge about the magnitude of, determinants of, and strategies to close the gap between evidence and routine practice for health in real-world settings. We propose an IS approach that is iterative in nature and composed of four major components: 1) Identifying Bottlenecks and Gaps, 2) Developing and Implementing Strategies, 3) Measuring Effectiveness and Efficiency, and 4) Utilizing Results. With this framework, IS initiatives draw from a variety of disciplines, including qualitative and quantitative methodologies, in order to develop new approaches responsive to the complexities of real-world program delivery. In order to remain useful for the changing programmatic landscape, IS research should factor in relevant timeframes and engage the multi-sectoral community of stakeholders, including community members, health care teams, program managers, researchers and policy makers, to facilitate the development of programs, practices and policies that lead to a more effective and efficient global AIDS response. The approach presented here is a synthesis of approaches and is a useful model to address IS-related questions for HIV prevention, care and treatment programs. This approach, however, is not a panacea, and we will continue to learn new ways of thinking as we move forward to close the implementation gap.
Jalilvand, Anahita; Suzo, Andrew; Hornor, Melissa; Layton, Kristina; Abdel-Rasoul, Mahmoud; Macadam, Luke; Mikami, Dean; Needleman, Bradley; Noria, Sabrena
2016-01-01
BACKGROUND Bariatric surgery is well established as an effective means of treating obesity; however, 30-day readmission rates remain high. The Bariatric Care Coaching Program was developed in response to a perceived need for better communication with patients upon discharge from hospital and prior to being seen at their first post-op visit. The lack of communication was apparent from the number of patient phone calls to clinic and readmissions to hospital. OBJECTIVES The aim of this study was to evaluate the impact of the Care Coach Program on hospital length of stay (LOS), readmission rates, patient phone calls, and patient satisfaction. SETTING The study was conducted at The Ohio State University, Wexner Medical Center. METHODS A retrospective review was conducted on patients who had primary bariatric surgery from July 1, 2013 to June 30, 2015. The control group included patients who underwent surgery from July 1, 2013 – June 30, 2014, before development of the program, and the experimental group comprised patients who received care coaching from July 1, 2014 – June 30, 2015. Demographics, post-operative complications, LOS, clinic phone calls and hospital readmissions prior to the first post-operative visit were collected from medical records. Patient satisfaction scores were collected from the Hospital Consumer Assessment of Healthcare Providers and Systems Survey (HCAHPS). Univariate and bivariate coefficient analyses and a conditional logistic regression model were performed using SAS software. RESULTS There were 261 and 264 patients in the care-coach and control groups, respectively. The care-coached group had fewer patients with intractable nausea/vomiting (11.11%; [p=0.0164]) and more patients with a shorter LOS (2.3 ± 1.1 days; [p=0.032]), related to laparoscopic sleeve gastrectomy (2 ± 0.9 days vs. 2.3 ± 0.8 days; [p=0.002]). There was no difference in readmission rates [p=0.841] or phone calls to clinic [p=0.407].
HCAHPS scores demonstrated an improvement in patients’ perception of communication regarding medications (59th versus 27th percentile), discharge information (98th versus 93rd percentile), and likelihood of recommending the hospital (85th versus 74th percentile). CONCLUSION: The Bariatric Care Coach Program is an important new adjunct in the care of our bariatric inpatients. It has had the greatest impact on post-operative nausea/vomiting, LOS for sleeve gastrectomy, and patient satisfaction. Further studies are needed to evaluate how to use this program to reduce readmission rates and phone calls to clinic. PMID:27320222
Bilingual parallel programming
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foster, I.; Overbeek, R.
1990-01-01
Numerous experiments have demonstrated that computationally intensive algorithms support adequate parallelism to exploit the potential of large parallel machines. Yet successful parallel implementations of serious applications are rare. The limiting factor is clearly programming technology. None of the approaches to parallel programming that have been proposed to date -- whether parallelizing compilers, language extensions, or new concurrent languages -- seem to adequately address the central problems of portability, expressiveness, efficiency, and compatibility with existing software. In this paper, we advocate an alternative approach to parallel programming based on what we call bilingual programming. We present evidence that this approach provides an effective solution to parallel programming problems. The key idea in bilingual programming is to construct the upper levels of applications in a high-level language while coding selected low-level components in low-level languages. This approach permits the advantages of a high-level notation (expressiveness, elegance, conciseness) to be obtained without the cost in performance normally associated with high-level approaches. In addition, it provides a natural framework for reusing existing code.
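The bilingual structure can be shown in miniature with Python as the high-level coordination layer and the C math library as the low-level compiled component (loaded here via ctypes; the paper's own language pairing differed, so this is purely an illustration of the pattern, not the authors' system):

```python
import ctypes
import ctypes.util

# Low-level component: a compiled C routine (libm's cos), loaded dynamically.
libm = ctypes.CDLL(ctypes.util.find_library('m') or 'libm.so.6')
libm.cos.restype = ctypes.c_double
libm.cos.argtypes = [ctypes.c_double]

def mean_cos(xs):
    """High-level driver: expressive Python orchestration, with the inner
    numeric kernel delegated to compiled C code."""
    return sum(libm.cos(x) for x in xs) / len(xs)
```

The division of labor mirrors the paper's argument: the coordination logic stays concise and portable in the high-level language, while the performance-critical inner loop runs at compiled-code speed.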
Multidisciplinary analysis of actively controlled large flexible spacecraft
NASA Technical Reports Server (NTRS)
Cooper, Paul A.; Young, John W.; Sutter, Thomas R.
1986-01-01
The Control of Flexible Structures (COFS) program has supported the development of an analysis capability at the Langley Research Center called the Integrated Multidisciplinary Analysis Tool (IMAT), which provides an efficient data storage and transfer capability among commercial computer codes to aid in the dynamic analysis of actively controlled structures. IMAT is a system of computer programs which transfers Computer-Aided-Design (CAD) configurations, structural finite element models, material property and stress information, structural and rigid-body dynamic model information, and linear system matrices for control law formulation among various commercial applications programs through a common database. Although general in its formulation, IMAT was developed specifically to aid in the evaluation of such structures. A description of the IMAT system and results of an application of the system are given.
Sustainable NREL Biennial Report, FY 2012 - 2013 (Management Report)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Slovensky, Michelle
2014-03-01
NREL's Sustainability Program plays a vital role bridging research and operations - integrating energy efficiency, water and material resource conservation and cultural change - adding depth in the fulfillment of NREL's mission. The report, per the GRI reporting format, elaborates on multi-year goals relative to executive orders, achievements, and challenges; and success stories provide specific examples. A section called "The Voice of NREL" gives an inside perspective of how to become more sustainable while at the same time addressing climate change.
Validation Data for Mechanical System Algorithms Used in Building Energy Analysis Programs.
1982-02-01
[Table of contents excerpt: Zone Design; Built-Up Air Handler; Ventilation Requirements; The DES; Duct Design; Air-Delivery System; VAV Operation; Constant Volume ...] The fan was observed to operate well at reduced air flows, even at low flow in the so-called surge region. Recommendations: 1. The HVAC system and component ... [Tests include: Test 1 -- Fan Operation With Inlet Guide Vanes Operating Within a Built-Up Air Handler; Test 2 -- Boiler Operation, Capacity, Efficiency, and Stand-By Losses; Test 3 ...]
Pickard, Katherine E; Wainer, Allison L; Bailey, Kathryn M; Ingersoll, Brooke R
2016-10-01
Research within the autism spectrum disorder field has called for the use of service delivery models that are able to more efficiently disseminate evidence-based practices into community settings. This study employed telehealth methods in order to deliver an Internet-based, parent training intervention for autism spectrum disorder, ImPACT Online. This study used mixed-methods analysis to create a more thorough understanding of parent experiences likely to influence the adoption and implementation of the program in community settings. Specific research questions included (1) What are parents' perceptions of the online program? (2) How does ImPACT Online compare to other services that parents are accessing for their children? and (3) Do parents' experiences in, and perceptions of, the program differ based on whether they received a therapist-assisted version of the program? Results from 28 parents of a child with autism spectrum disorder indicate that parents saw improvements in their child's social communication skills and their own competence during the course of the program, regardless of whether they received therapist assistance. However, qualitative interviews indicate that parents who received therapist assistance were more likely to endorse the acceptability and observability of the program. These findings support the potential for Internet-based service delivery to more efficiently disseminate evidence-based parent training interventions for autism spectrum disorder. © The Author(s) 2016.
Creating a vision for your medical call center.
Barr, J L; Laufenberg, S; Sieckman, B L
1998-01-01
MCC technologies and applications that can have a positive impact on managed care delivery are almost limitless. As you determine your vision, be sure to have in mind the following questions: (1) Do you simply want an efficient front end for receiving calls? (2) Do you want to offer triage services? (3) Is your organization ready for a fully functional "electronic physician's office?" Understand your organization's strategy. Where are you going, not only today but five years from now? That information is essential to determine your vision. Once established, your vision will help determine what you need and whether you should build or outsource. Vendors will assist in cost/benefit analysis of their equipment, but do not lose sight of internal factors such as "prior inclination" costs in the case of a nurse triage program. The technology is available to take your vision to its outer reaches. With the projected increase in utilization of call center services, don't let your organization be left behind!
Benko, Matúš; Gfrerer, Helmut
2018-01-01
In this paper, we consider a sufficiently broad class of non-linear mathematical programs with disjunctive constraints, which, e.g. include mathematical programs with complementarity/vanishing constraints. We present an extension of the concept of [Formula: see text]-stationarity which can be easily combined with the well-known notion of M-stationarity to obtain the stronger property of so-called [Formula: see text]-stationarity. We show how the property of [Formula: see text]-stationarity (and thus also of M-stationarity) can be efficiently verified for the considered problem class by computing [Formula: see text]-stationary solutions of a certain quadratic program. We consider further the situation that the point which is to be tested for [Formula: see text]-stationarity is not known exactly but is approximated by some convergent sequence, as is usually the case when applying a numerical method.
Opening the door to coordination of care through teachable moments.
Berg, Gregory D; Korn, Allan M; Thomas, Eileen; Klemka-Walden, Linda; Bigony, Marysanta D; Newman, John F
2007-10-01
The challenge for care coordination is to identify members at a moment in time when they are receptive to intervention and provide the appropriate care management services. This manuscript describes a pilot program using inbound nurse advice calls from members to engage them in a care management program including disease management (DM). Annual medical claims diagnoses were used to identify members and their associated disease conditions. For each condition group for each year, nurse advice call data were used to calculate inbound nurse advice service call rates for each group. A pilot program was set up to engage inbound nurse advice callers in a broader discussion of their health concerns and refer them to a care management program. Among the program results, both the call rate by condition group and the correlation between average costs and call rates show that higher cost groups of members call the nurse advice service disproportionately more than lower cost members. Members who entered the DM programs through the nurse advice service were more likely to stay in the program than those who participated in the standard opt-in program. The results of this pilot program suggest that members who voluntarily call in to the nurse advice service for triage are at a "teachable moment" and highly motivated to participate in appropriate care management programs. The implication is that the nurse advice service may well be an innovative and effective way to enhance participation in a variety of care management programs including DM.
[Second victim : Critical incident stress management in clinical medicine].
Schiechtl, B; Hunger, M S; Schwappach, D L; Schmidt, C E; Padosch, S A
2013-09-01
Critical incidents in clinical medicine can have far-reaching consequences on patient health. In cases of severe medical errors they can seriously harm the patient or even lead to death. Involvement in such an event can result in a stress reaction, a so-called acute posttraumatic stress disorder, in the healthcare provider, the so-called second victim of an adverse event. Psychological distress may not only have a long-lasting impact on the quality of life of the physician or caregiver involved but may also affect the ability to provide safe patient care in the aftermath of adverse events. A literature review was performed to obtain information on caregiver responses to medical errors and to determine possible supportive strategies to mitigate negative consequences of an adverse event on the second victim. An internet search and a search in Medline/PubMed for scientific studies were conducted using the key words "second victim", "medical error", "critical incident stress management" (CISM) and "critical incident stress reporting system" (CIRS). Sources from academic medical societies and public institutions which offer crisis management programs were analyzed. The data were sorted by main categories and relevance for hospitals. Analysis was carried out using descriptive measures. In disaster medicine and aviation navigation services the implementation of a CISM program is an efficient intervention to help staff recover after a traumatic event and return to normal functioning and behavior. Several other concepts for a clinical crisis management plan were identified. The integration of CISM and CISM-related programs in a clinical setting may provide efficient support in an acute crisis and may help the caregiver deal effectively with future error events and employee safety.
Enumerating Substituted Benzene Isomers of Tree-Like Chemical Graphs.
Li, Jinghui; Nagamochi, Hiroshi; Akutsu, Tatsuya
2018-01-01
Enumeration of chemical structures is useful for drug design, which is one of the main targets of computational biology and bioinformatics. A chemical graph with no other cycles than benzene rings is called tree-like, and becomes a tree possibly with multiple edges if we contract each benzene ring into a single virtual atom of valence 6. All tree-like chemical graphs with a given tree representation are called the substituted benzene isomers of . When we replace each virtual atom in with a benzene ring to obtain a substituted benzene isomer, distinct isomers of are caused by the difference in arrangements of atom groups around a benzene ring. In this paper, we propose an efficient algorithm that enumerates all substituted benzene isomers of a given tree representation . Our algorithm first counts the number of all the isomers of the tree representation by a dynamic programming method. To enumerate all the isomers, for each , our algorithm then generates the th isomer by backtracking the counting phase of the dynamic programming. We also implemented our algorithm for computational experiments.
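The counting of "distinct isomers caused by the difference in arrangements of atom groups around a benzene ring" can be illustrated independently of the paper's dynamic program: distinct placements around one ring are orbits under the ring's rotations and reflections (the dihedral group of order 12). The brute-force canonicalization below is a hedged sketch, not the authors' algorithm, which avoids such enumeration by counting first.

```python
from itertools import permutations

def ring_arrangements(substituents):
    """Count distinct arrangements of six substituents around a benzene
    ring, identifying arrangements related by rotation or reflection."""
    assert len(substituents) == 6
    def canonical(t):
        # Enumerate the 12 symmetries of the hexagon and pick a representative.
        variants = []
        for s in (t, t[::-1]):              # identity and reflection
            for r in range(6):              # six rotations of each
                variants.append(s[r:] + s[:r])
        return min(variants)
    return len({canonical(p) for p in permutations(substituents)})

# Dichlorobenzene has exactly the ortho, meta, and para isomers.
print(ring_arrangements(("Cl", "Cl", "H", "H", "H", "H")))  # 3
```

The same check confirms one monosubstituted isomer (phenol-like) and three trichlorobenzenes (1,2,3-, 1,2,4-, 1,3,5-), matching basic organic chemistry.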
The force on the flex: Global parallelism and portability
NASA Technical Reports Server (NTRS)
Jordan, H. F.
1986-01-01
A parallel programming methodology, called the force, supports the construction of programs to be executed in parallel by an unspecified, but potentially large, number of processes. The methodology was originally developed on a pipelined, shared memory multiprocessor, the Denelcor HEP, and embodies the primitive operations of the force in a set of macros which expand into multiprocessor Fortran code. A small set of primitives is sufficient to write large parallel programs, and the system has been used to produce 10,000 line programs in computational fluid dynamics. The level of complexity of the force primitives is intermediate: high enough to mask detailed architectural differences between multiprocessors but low enough to give the user control over performance. The system is being ported to a medium scale multiprocessor, the Flex/32, which is a 20 processor system with a mixture of shared and local memory. Memory organization and the type of processor synchronization supported by the hardware on the two machines lead to some differences in efficient implementations of the force primitives, but the user interface remains the same. An initial implementation was done by retargeting the macros to Flexible Computer Corporation's ConCurrent C language. Subsequently, the macros were modified to produce directly the system calls which form the basis for ConCurrent C. The implementation of the Fortran-based system is in step with Flexible Computer Corporation's implementation of a Fortran system in the parallel environment.
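The force's execution model, a fixed team of processes all running the same code with prescheduled loop splitting and barrier synchronization, can be mimicked with Python threads. This is a hedged sketch of the model only; the real system generated multiprocessor Fortran, and all names below are invented.

```python
import threading

NPROCS = 4
barrier = threading.Barrier(NPROCS)

def worker(proc_id, data, partial):
    # Prescheduled loop split: process proc_id takes indices proc_id,
    # proc_id + NPROCS, proc_id + 2*NPROCS, ... (a force-style DOALL).
    partial[proc_id] = sum(data[i] for i in range(proc_id, len(data), NPROCS))
    barrier.wait()                      # force-style barrier synchronization
    if proc_id == 0:                    # one process combines the results
        partial.append(sum(partial[:NPROCS]))

data = list(range(100))
partial = [0] * NPROCS
threads = [threading.Thread(target=worker, args=(p, data, partial))
           for p in range(NPROCS)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(partial[-1])  # 4950, the sum of 0..99
```

The barrier guarantees all partial sums are written before process 0 reads them, which is exactly the role the force primitives play in hiding machine-specific synchronization.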
49 CFR 198.39 - Qualifications for operation of one-call notification system.
Code of Federal Regulations, 2010 CFR
2010-10-01
49 CFR Transportation, 2010-10-01. PIPELINE SAFETY REGULATIONS FOR GRANTS TO AID STATE PIPELINE SAFETY PROGRAMS, Adoption of One-Call Damage Prevention Program, § 198.39 Qualifications for operation of one-call notification system.
Using Decision Procedures to Build Domain-Specific Deductive Synthesis Systems
NASA Technical Reports Server (NTRS)
VanBaalen, Jeffrey; Roach, Steven; Lau, Sonie (Technical Monitor)
1998-01-01
This paper describes a class of decision procedures that we have found useful for efficient, domain-specific deductive synthesis. These procedures are called closure-based ground literal satisfiability procedures. We argue that this is a large and interesting class of procedures and show how to interface these procedures to a theorem prover for efficient deductive synthesis. Finally, we describe some results we have observed from our implementation. Amphion/NAIF is a domain-specific, high-assurance software synthesis system. It takes an abstract specification of a problem in solar system mechanics, such as 'when will a signal sent from the Cassini spacecraft to Earth be blocked by the planet Saturn?', and automatically synthesizes a FORTRAN program to solve it.
NASA Technical Reports Server (NTRS)
Tick, Evan
1987-01-01
This note describes an efficient software emulator for the Warren Abstract Machine (WAM) Prolog architecture. The version of the WAM implemented is called Lcode. The Lcode emulator, written in C, executes the 'naive reverse' benchmark at 3900 LIPS. The emulator is one of a set of tools used to measure the memory-referencing characteristics and performance of Prolog programs. These tools include a compiler, assembler, and memory simulators. An overview of the Lcode architecture is given here, followed by a description and listing of the emulator code implementing each Lcode instruction. This note will be of special interest to those studying the WAM and its performance characteristics. In general, this note will be of interest to those creating efficient software emulators for abstract machine architectures.
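The 'naive reverse' benchmark behind the 3900 LIPS figure can be reproduced outside Prolog by counting logical inferences (procedure calls) directly: for an n-element list, nrev performs (n+1)(n+2)/2 inferences, so the standard 30-element run is 496 inferences and 3900 LIPS corresponds to about 127 ms per run. A Python sketch of the count (a reconstruction of the benchmark, not the Lcode emulator):

```python
calls = 0

def app(xs, ys):
    """Prolog append/3 transliterated; each call is one logical inference."""
    global calls
    calls += 1
    return ys if not xs else [xs[0]] + app(xs[1:], ys)

def nrev(xs):
    """Prolog naive reverse; each call is one logical inference."""
    global calls
    calls += 1
    return [] if not xs else app(nrev(xs[1:]), [xs[0]])

result = nrev(list(range(30)))
print(calls)  # 496, the standard inference count for a 30-element list
```

Dividing 496 inferences by the measured wall-clock time of one run gives the LIPS rating used throughout the WAM literature.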
Investigating emergency room service quality using lean manufacturing.
Abdelhadi, Abdelhakim
2015-01-01
The purpose of this paper is to investigate a lean manufacturing metric called Takt time as a benchmark measure of a public hospital's service quality. Lean manufacturing is an established managerial philosophy with a proven track record in industry. Takt time is applied here to compare the relative efficiency of two emergency departments (EDs) belonging to the same public hospital, one serving male and the other female patients; the outcomes guide managers in improving patient services and hospital performance. The study focuses on patient treatment lead time within the two EDs and uses Takt time to determine their relative efficiency. Findings show that Takt time can be an effective measure of service efficiency, revealing relative efficiency and identifying bottlenecks in different departments that provide the same services. The paper presents a new procedure for comparing the relative efficiency of two EDs, and it can be applied to any healthcare facility.
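Takt time itself is a simple ratio, available working time divided by demand, which is then compared against the observed treatment lead time. A minimal sketch with invented figures (the paper's actual ED data are not reproduced here):

```python
def takt_time(available_minutes, patients):
    """Takt time: the pace at which patients must be served to meet demand."""
    return available_minutes / patients

# Hypothetical figures for two emergency departments over one 12-hour shift.
available = 12 * 60                       # shift length in minutes
takt_male = takt_time(available, 90)      # 8.0 minutes per patient
takt_female = takt_time(available, 60)    # 12.0 minutes per patient

# A department whose average treatment lead time exceeds its Takt time
# cannot keep pace with demand and is a bottleneck candidate.
print(takt_male, takt_female)  # 8.0 12.0
```

Comparing each department's lead time to its own Takt time is what makes the metric a relative-efficiency benchmark across units offering the same service.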
Expectations of iPad use in an internal medicine residency program: is it worth the "hype"?
Luo, Nancy; Chapman, Christopher G; Patel, Bhakti K; Woodruff, James N; Arora, Vineet M
2013-05-08
While early reports highlight the benefits of tablet computing in hospitals, introducing any new technology can result in inflated expectations. The aim of the study was to compare residents' anticipated expectations of Apple iPad use with their perceptions after deployment. 115 internal medicine residents received Apple iPads in October 2010. Residents completed matched surveys on anticipated usage and post-deployment perceptions 1 month prior to and 4 months after deployment. In total, 99% (114/115) of residents responded. Prior to deployment, most residents believed that the iPad would improve patient care and efficiency on the wards; however, fewer residents "strongly agreed" after deployment (34% vs 15% for patient care, P<.001; 41% vs 24% for efficiency, P=.005). Residents with higher expectations were more likely to report using the iPad for placing orders post call and during admission (71% vs 44% post call, P=.01, and 16% vs 0% admission, P=.04). Previous Apple iOS product owners were also more likely to use the iPad in key areas. Overall, 84% of residents thought the iPad was a good investment for the residency program, and over half of residents (58%) reported that patients commented on the iPad in a positive way. While tablets such as the iPad are generally well received by residents, the gap between high initial expectations and later perceptions highlights the danger of hype when implementing new technologies. Education on realistic expectations of iPad benefits may be warranted.
Everyone wins - a program to upgrade energy efficiency in manufactured housing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, A.D.; Onisko, S.A.; Sandahl, L.J.
1994-03-01
Other regions might well benefit from this case history, illustrating how a region marshalled its resources to bring manufactured housing--a significant share of its new residential sector--into the modern era of energy efficiency. Everyone was a winner. In the Pacific Northwest, as in many parts of the country, a significant proportion of new homes are HUD-code manufactured, or so-called mobile, homes. About 25% of new single-family houses in the Pacific Northwest are manufactured homes. They represent an even larger share - nearly 40% - of new electrically heated housing in the region, and this share has been growing. When Congress enacted the Pacific Northwest Power Planning Act of 1980, it also permitted the four Northwest states to establish an interstate compact body - the Northwest Power Planning Council - and required the Council to produce an integrated resource plan for the region served by the Bonneville Power Administration, the federal power marketing and transmission agency that operates the region's major transmission grid and sells most of its bulk power. Both the law and the plan charge Bonneville with developing cost-effective programs to save electricity in all end-use sectors through improved energy efficiency.
NASA Technical Reports Server (NTRS)
Sah, C. T.
1983-01-01
The performance improvements obtainable from extending the traditionally thin back-surface-field (BSF) layer deep into the base of silicon solar cells under terrestrial solar illumination (AM1) are analyzed. This extended BSF cell is also known as the back-drift-field cell. About 100 silicon cells were analyzed, each with a different emitter or base dopant impurity distribution whose selection was based on physically anticipated improvements. The four principal performance parameters (the open-circuit voltage, the short-circuit current, the fill factor, and the maximum efficiency) are computed using a FORTRAN program called Circuit Technique for Semiconductor-device Analysis (CTSA), which numerically solves the six Shockley Equations under AM1 solar illumination at 88.92 mW/cm², at an optimum cell thickness of 50 µm. The results show that very significant performance improvements can be realized by extending the BSF layer thickness from 2 µm (18% efficiency) to 40 µm (20% efficiency).
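The reported efficiencies follow from the standard photovoltaic relation η = Voc · Jsc · FF / Pin with Pin = 88.92 mW/cm². The terminal values below are illustrative choices that reproduce roughly the 20% figure, not parameters taken from the paper:

```python
def cell_efficiency(voc_volts, jsc_ma_per_cm2, fill_factor,
                    p_in_mw_per_cm2=88.92):
    """Maximum conversion efficiency from the three terminal parameters.

    Output power density is Voc * Jsc * FF in mW/cm^2 (volts times
    mA/cm^2 gives mW/cm^2), divided by the incident power density.
    """
    p_out = voc_volts * jsc_ma_per_cm2 * fill_factor
    return p_out / p_in_mw_per_cm2

# Hypothetical terminal parameters for an extended-BSF cell.
eta = cell_efficiency(voc_volts=0.62, jsc_ma_per_cm2=35.0, fill_factor=0.82)
print(round(eta * 100, 1))  # 20.0 (percent)
```

Any of the four principal parameters improving with the deeper BSF layer feeds directly into this product, which is why the paper tracks all of them together.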
Computer-based learning: interleaving whole and sectional representation of neuroanatomy.
Pani, John R; Chariker, Julia H; Naaz, Farah
2013-01-01
The large volume of material to be learned in biomedical disciplines requires optimizing the efficiency of instruction. In prior work with computer-based instruction of neuroanatomy, it was relatively efficient for learners to master whole anatomy and then transfer to learning sectional anatomy. It may, however, be more efficient to continuously integrate learning of whole and sectional anatomy. A study of computer-based learning of neuroanatomy was conducted to compare a basic transfer paradigm for learning whole and sectional neuroanatomy with a method in which the two forms of representation were interleaved (alternated). For all experimental groups, interactive computer programs supported an approach to instruction called adaptive exploration. Each learning trial consisted of time-limited exploration of neuroanatomy, self-timed testing, and graphical feedback. The primary result of this study was that interleaved learning of whole and sectional neuroanatomy was more efficient than the basic transfer method, without cost to long-term retention or generalization of knowledge to recognizing new images (Visible Human and MRI). Copyright © 2012 American Association of Anatomists.
Analysis of labor employment assessment on production machine to minimize time production
NASA Astrophysics Data System (ADS)
Hernawati, Tri; Suliawati; Sari Gumay, Vita
2018-03-01
Every company, whether in services or manufacturing, continually tries to improve the efficiency of its resource use. One resource with an important role is labor, and workers have different efficiency levels for different jobs. Problems concerning the optimal allocation of workers with differing efficiency levels to different jobs are called assignment problems, a special case of linear programming. In this research, an analysis of labor assignment on production machines to minimize production time at PT PDM is carried out using the Hungarian algorithm. The aim of the research is to obtain the optimal assignment of labor to production machines that minimizes production time. The results show that the existing labor assignment is not optimal, because its completion time is longer than that of the assignment obtained with the Hungarian algorithm. Applying the Hungarian algorithm yields a time savings of 16%.
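The assignment problem the Hungarian algorithm solves can be checked on a small instance by brute force over all worker-to-machine permutations; the Hungarian method reaches the same optimum in O(n³) time instead of O(n!). The time matrix below is invented for illustration, not PT PDM's data:

```python
from itertools import permutations

def best_assignment(times):
    """Minimize total completion time.

    times[w][m] is worker w's completion time on machine m; the result is
    the minimum total time and one optimal machine assignment per worker.
    """
    n = len(times)
    best_cost, best_perm = min(
        (sum(times[w][m] for w, m in enumerate(perm)), perm)
        for perm in permutations(range(n))
    )
    return best_cost, best_perm

times = [[4, 2, 8],
         [4, 3, 7],
         [3, 1, 6]]
cost, assignment = best_assignment(times)
print(cost)  # 12
```

Comparing an existing assignment's total time against this optimum is exactly the kind of gap (16% in the study) that motivates re-assignment.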
NASA In-Space Propulsion Technology Program: Overview and Update
NASA Technical Reports Server (NTRS)
Johnson, Les; Alexander, Leslie; Baggett, Randy M.; Bonometti, Joseph A.; Herrmann, Melody; James, Bonnie F.; Montgomery, Sandy E.
2004-01-01
NASA's In-Space Propulsion Technology Program is investing in technologies that have the potential to revolutionize the robotic exploration of deep space. For robotic exploration and science missions, increased efficiencies of future propulsion systems are critical to reduce overall life-cycle costs and, in some cases, enable missions previously considered impossible. Continued reliance on conventional chemical propulsion alone will not enable the robust exploration of deep space - the maximum theoretical efficiencies have almost been reached and they are insufficient to meet needs for many ambitious science missions currently being considered. The In-Space Propulsion Technology Program's technology portfolio includes many advanced propulsion systems. From the next-generation ion propulsion system operating in the 5- to 10-kW range to aerocapture and solar sails, substantial advances in spacecraft propulsion performance are anticipated. Some of the most promising technologies for achieving these goals use the environment of space itself for energy and propulsion and are generically called 'propellantless' because they do not require onboard fuel to achieve thrust. Propellantless propulsion technologies include scientific innovations such as solar sails, electrodynamic and momentum transfer tethers, aeroassist and aerocapture. This paper will provide an overview of both propellantless and propellant-based advanced propulsion technologies, as well as NASA's plans for advancing them as part of the In-Space Propulsion Technology Program.
NASA's In-Space Propulsion Technology Program: Overview and Status
NASA Technical Reports Server (NTRS)
Johnson, Les; Alexander, Leslie; Baggett, Randy; Bonometti, Joe; Herrmann, Melody; James, Bonnie; Montgomery, Sandy
2004-01-01
NASA's In-Space Propulsion Technology Program is investing in technologies that have the potential to revolutionize the robotic exploration of deep space. For robotic exploration and science missions, increased efficiencies of future propulsion systems are critical to reduce overall life-cycle costs and, in some cases, enable missions previously considered impossible. Continued reliance on conventional chemical propulsion alone will not enable the robust exploration of deep space - the maximum theoretical efficiencies have almost been reached and they are insufficient to meet needs for many ambitious science missions currently being considered. The In-Space Propulsion Technology Program's technology portfolio includes many advanced propulsion systems. From the next-generation ion propulsion system operating in the 5- to 10-kW range, to advanced cryogenic propulsion, substantial advances in spacecraft propulsion performance are anticipated. Some of the most promising technologies for achieving these goals use the environment of space itself for energy and propulsion and are generically called 'propellantless' because they do not require onboard fuel to achieve thrust. Propellantless propulsion technologies include scientific innovations such as solar sails, electrodynamic and momentum transfer tethers, aeroassist, and aerocapture. This paper will provide an overview of both propellantless and propellant-based advanced propulsion technologies, and NASA's plans for advancing them as part of the $60M per year In-Space Propulsion Technology Program.
Multiple Interactive Pollutants in Water Quality Trading
NASA Astrophysics Data System (ADS)
Sarang, Amin; Lence, Barbara J.; Shamsai, Abolfazl
2008-10-01
Efficient environmental management calls for the consideration of multiple pollutants, for which two main types of transferable discharge permit (TDP) program have been described: separate permits that manage each pollutant individually in separate markets, with each permit based on the quantity of the pollutant or its environmental effects, and weighted-sum permits that aggregate several pollutants as a single commodity to be traded in a single market. In this paper, we perform a mathematical analysis of TDP programs for multiple pollutants that jointly affect the environment (i.e., interactive pollutants) and demonstrate the practicality of this approach for cost-efficient maintenance of river water quality. For interactive pollutants, the relative weighting factors are functions of the water quality impacts, marginal damage function, and marginal treatment costs at optimality. We derive the optimal set of weighting factors required by this approach for important scenarios for multiple interactive pollutants and propose using an analytical elasticity of substitution function to estimate damage functions for these scenarios. We evaluate the applicability of this approach using a hypothetical example that considers two interactive pollutants. We compare the weighted-sum permit approach for interactive pollutants with individual permit systems and TDP programs for multiple additive pollutants. We conclude by discussing practical considerations and implementation issues that result from the application of weighted-sum permit programs.
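The weighted-sum instrument described above aggregates discharges into one tradable quantity, W = Σ w_i q_i, so a source complies as long as its total weighted discharge stays within its weighted allowance and may trade off one pollutant against another at the ratio of the weights. A toy two-pollutant sketch (weights and quantities invented, not from the paper's example):

```python
def weighted_permits(discharges, weights):
    """Aggregate several pollutants into a single permit commodity W."""
    return sum(w * q for q, w in zip(discharges, weights))

weights = (1.0, 2.5)          # relative environmental-impact weights w_i
allowance = weighted_permits((100.0, 40.0), weights)   # 200 permit units

# A source may substitute between pollutants: 150 units of pollutant 1
# plus 20 units of pollutant 2 carries the same weighted total.
usage = weighted_permits((150.0, 20.0), weights)
print(usage == allowance)  # True
```

For interactive pollutants the paper's point is that these weights are not constants but depend on water quality impacts and marginal damages at optimality, which is what the derived weighting factors capture.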
Efficient Algorithms for Segmentation of Item-Set Time Series
NASA Astrophysics Data System (ADS)
Chundi, Parvathi; Rosenkrantz, Daniel J.
We propose a special type of time series, which we call an item-set time series, to facilitate the temporal analysis of software version histories, email logs, stock market data, etc. In an item-set time series, each observed data value is a set of discrete items. We formalize the concept of an item-set time series and present efficient algorithms for segmenting a given item-set time series. Segmentation of a time series partitions the time series into a sequence of segments where each segment is constructed by combining consecutive time points of the time series. Each segment is associated with an item set that is computed from the item sets of the time points in that segment, using a function which we call a measure function. We then define a concept called the segment difference, which measures the difference between the item set of a segment and the item sets of the time points in that segment. The segment difference values are required to construct an optimal segmentation of the time series. We describe novel and efficient algorithms to compute segment difference values for each of the measure functions described in the paper. We outline a dynamic programming based scheme to construct an optimal segmentation of the given item-set time series. We use the item-set time series segmentation techniques to analyze the temporal content of three different data sets—Enron email, stock market data, and a synthetic data set. The experimental results show that an optimal segmentation of item-set time series data captures much more temporal content than a segmentation constructed based on the number of time points in each segment, without examining the item set data at the time points, and can be used to analyze different types of temporal data.
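The dynamic-programming scheme can be made concrete with a small sketch. Here the measure function is taken to be set union, the segment difference is the summed symmetric difference between a segment's item set and each of its time points' item sets, and dp[j][i] is the best cost of splitting the first i points into j segments. This is a hedged reconstruction under those assumptions, not the authors' exact formulation:

```python
def segment_cost(points, lo, hi):
    """Segment difference for points[lo:hi] under the union measure."""
    union = set().union(*points[lo:hi])
    return sum(len(union ^ p) for p in points[lo:hi])

def optimal_segmentation(points, k):
    """Minimum total segment difference over all k-segmentations."""
    n = len(points)
    INF = float("inf")
    # dp[j][i]: best cost of covering the first i points with j segments.
    dp = [[INF] * (n + 1) for _ in range(k + 1)]
    dp[0][0] = 0
    for j in range(1, k + 1):
        for i in range(j, n + 1):
            dp[j][i] = min(dp[j - 1][m] + segment_cost(points, m, i)
                           for m in range(j - 1, i))
    return dp[k][n]

series = [{"a"}, {"a"}, {"b"}, {"b"}]
print(optimal_segmentation(series, 2))  # 0: split between the {a}s and {b}s
```

With k = 1 the same series costs 4, since the union {a, b} differs from every point by one item; the drop from 4 to 0 is the kind of temporal-content gain an optimal segmentation captures.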
New York area and worldwide: call-in radio program on HIV.
1999-07-16
Treatment activist Jules Levin, founder of the National AIDS Treatment Advocacy Group, has begun a weekly radio program called "Living Well with HIV". Listeners can call in with questions for experts featured on the show. Programs on hepatitis and AIDS have already been scheduled. Contact information is provided.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knittel, Christopher; Wolfram, Catherine; Gandhi, Raina
A wide range of climate plans rely on energy efficiency to generate energy and carbon emissions reductions, but conventional wisdom holds that consumers have historically underinvested in energy efficiency upgrades. This underinvestment may occur for a variety of reasons, one of which is that consumers are not adequately informed about the benefits of energy efficiency. To address this, the U.S. Department of Energy created a tool called the Home Energy Score (HEScore) to act as a simple, low-cost means to provide clear information about a home's energy efficiency and motivate homeowners and homebuyers to invest in energy efficiency. The Department of Energy is in the process of conducting four evaluations assessing the impact of the Home Energy Score on residential energy efficiency investments and program participation. This paper describes one of these evaluations: a randomized controlled trial conducted in New Jersey in partnership with New Jersey Natural Gas. The evaluation randomly provides the Home Energy Score to homeowners who received an audit between May 2014 and October 2015, either because they had recently replaced their furnace, boiler, and/or gas water heater with a high-efficiency model and participated in a free audit to access an incentive, or because they requested an independent audit.
Near Zero Emissions at 50 Percent Thermal Efficiency
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
2012-12-31
Detroit Diesel Corporation (DDC) has successfully completed a 10-year DOE-sponsored heavy-duty truck engine program, hereafter referred to as the NZ-50 program. This program was split into two major phases. The first phase was called Near-Zero Emission at 50 Percent Thermal Efficiency and was completed in 2007. The second phase was initiated in 2006 and was named Advancements in Engine Combustion Systems to Enable High-Efficiency Clean Combustion for Heavy-Duty Engines; it was completed in September 2010. The key objectives of the first phase of the NZ-50 program were to: quantify the thermal efficiency degradation associated with reduction of engine-out NOx emissions to the 2007 regulated level of ~1.1 g/hp-hr; implement an integrated analytical/experimental development plan for improving subsystem and component capabilities in support of emerging engine technologies for the emissions and thermal efficiency goals of the program; test prototype subsystem hardware featuring technology enhancements and demonstrate effective application on a multi-cylinder, production-feasible heavy-duty engine test-bed; optimize subsystem components and engine controls (calibration) to demonstrate thermal efficiency in compliance with the DOE 2005 Joule milestone, i.e., greater than 45% thermal efficiency at 2007 emission levels; develop a technology roadmap for meeting emission regulations of 2010 and beyond while mitigating the associated degradation in engine fuel consumption; and, ultimately, develop the technical prime-path for meeting the overall goal of the NZ-50 program, i.e., 50% thermal efficiency at 2010 regulated emissions. These objectives were successfully met during the course of the NZ-50 program.
The most noteworthy achievements in this program are summarized as follows. Demonstrated technologies, through advanced integrated experiments and analysis, to achieve the technical objectives of the NZ-50 program with 50.2% equivalent thermal efficiency under EPA 2010 emissions regulations. Experimentally demonstrated brake efficiency of 48.5% at the EPA 2010 emission level at a single steady-state point. Analytically demonstrated additional brake efficiency benefits using an advanced aftertreatment configuration concept and air system enhancements including, but not limited to, turbo-compounding, a variable valve actuation system, and a cylinder head redesign, thus helping to achieve the final program goals. Experimentally demonstrated EPA 2010 emissions over FTP cycles using an advanced integrated engine and aftertreatment system. These aggressive thermal efficiency and emissions results were achieved by applying a robust systems technology development methodology, which used integrated analytical and experimental tools for subsystem component optimization encompassing an advanced fuel injection system, increased EGR cooling capacity, combustion process optimization, and advanced aftertreatment technologies. Model-based controls employing multiple-input, multiple-output techniques enabled efficient integration of the various subsystems and ensured optimal performance of each system within the total engine package. The key objective of the second phase of the NZ-50 program was to explore advancements in engine combustion systems using high-efficiency clean combustion (HECC) techniques to minimize cylinder-out emissions, targeting a 10% efficiency improvement. The most noteworthy achievement in this phase of the program was the experimental and analytical evaluation of numerous air system improvements related to the turbocharger and variable valve actuation.
Some of the items tested proved to be very successful, and modifications to the turbine discovered in this program have since been incorporated into production hardware. The combustion system development continued with evaluation of various designs of the 2-step piston bowl. Significant improvements in engine emissions were obtained, but fuel economy improvements have been tougher to realize. Development of a neural network control system progressed to the point that the system was fully functional and showing significant fuel economy gains in transient engine testing. Development of the QuantLogic injector, with the capability of both a hollow-cone spray during early injection and conventional diesel injection at later injection timings, was undertaken and proved to be problematic. This injector was designed to be a key component in a PCCI combustion system, but it required significantly more development effort than the program's resources or timing would allow.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Traylor, T.D.; Hicks, S.C.
1994-03-01
Transportation Energy Research announces on a monthly basis the current worldwide research and development information available on energy-efficient, environmentally sound transportation technologies. Its purpose is to enhance the technology transfer efforts of the Department of Energy. This publication contains the abstracts of DOE reports, journal articles, conference papers, patents, theses, and monographs added to the Energy Science and Technology Database during the past month. Also included are US information obtained through acquisition programs or interagency agreements and international information obtained through the International Energy Agency's Energy Technology Data Exchange or government-to-government agreements. The DOE Office of Transportation Technologies (OTT) manages federal R&D programs aimed at improving transportation-sector energy efficiency. OTT currently supports activities in four major program areas: Electric and Hybrid Vehicles; Advanced Propulsion Systems; magnetic levitation technology; and Advanced Materials. DOE and DOE contractors can obtain copies for $4.00 per issue by using VISA, MasterCard, or OSTI deposit accounts. Contact the Office of Scientific and Technical Information, P.O. Box 62, Oak Ridge, TN 37831, Attention: Information Services. For further information, call (615) 576-8401. Public availability is by subscription from the US Department of Commerce, Technology Administration, National Technical Information Service, Springfield, VA 22161. Order PB94-900900.
NASA Astrophysics Data System (ADS)
Lee, Hyunki; Kim, Min Young; Moon, Jeon Il
2017-12-01
Phase measuring profilometry and moiré methodology have been widely applied to the three-dimensional shape measurement of target objects because of their high measuring speed and accuracy. However, these methods suffer from an inherent limitation called the correspondence problem, or 2π-ambiguity problem. Although a sensing method combining well-known stereo vision and the phase measuring profilometry (PMP) technique has been developed to overcome this problem, it still requires definite improvement in sensing speed and measurement accuracy. We propose a dynamic-programming-based stereo PMP method to acquire more reliable depth information in a relatively short time. The proposed method efficiently fuses phase and intensity information from two stereo sensors simultaneously, based on a newly defined cost function for dynamic programming. In addition, the important parameters are analyzed from the viewpoint of the 2π-ambiguity problem and measurement accuracy. To analyze the influence of important hardware and software parameters on measurement performance and to verify the method's efficiency, accuracy, and sensing speed, a series of experimental tests was performed with various objects and sensor configurations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frank, R.N.
1990-02-28
The Inspection Shop at Lawrence Livermore Lab recently purchased a Sheffield Apollo RS50 Direct Computer Control Coordinate Measuring Machine. The performance of the machine was specified to conform to the B89 standard, which relies heavily upon using the measuring machine in its intended manner to verify its accuracy (rather than parametric tests). Although it would be possible to use the interactive measurement system to perform these tasks, a more thorough and efficient job can be done by creating Function Library programs for certain tasks, which integrate the Hewlett-Packard Basic 5.0 language and calls to proprietary analysis and machine control routines. This combination provides efficient use of the measuring machine with a minimum of keyboard input, plus an analysis of the data with respect to the B89 standard rather than a CMM analysis that would require subsequent interpretation. This paper discusses some characteristics of the Sheffield machine control and analysis software and my use of H-P Basic to create automated measurement programs to support the B89 performance evaluation of the CMM. 1 ref.
Multistage Stochastic Programming and its Applications in Energy Systems Modeling and Optimization
NASA Astrophysics Data System (ADS)
Golari, Mehdi
Electric energy constitutes one of the most crucial elements of almost every aspect of people's lives. Modern electric power systems face several challenges, such as efficiency, economics, sustainability, and reliability. Increases in electrical energy demand, distributed generation, integration of uncertain renewable energy resources, and demand-side management are among the main underlying reasons for such growing complexity. Additionally, the elements of power systems are often vulnerable to failures for many reasons, such as system limits, weak conditions, unexpected events, hidden failures, human errors, terrorist attacks, and natural disasters. One common factor complicating the operation of electrical power systems is the underlying uncertainty in the demands, supplies, and failures of system components. Stochastic programming provides a mathematical framework for decision making under uncertainty. It enables a decision maker to incorporate some knowledge of the intrinsic uncertainty into the decision making process. In this dissertation, we focus on the application of two-stage and multistage stochastic programming approaches to electric energy systems modeling and optimization. Particularly, we develop models and algorithms addressing sustainability and reliability issues in power systems. First, we consider how to improve the reliability of power systems under severe failures or contingencies prone to cascading blackouts by so-called islanding operations. We present a two-stage stochastic mixed-integer model to find optimal islanding operations as a powerful preventive action against cascading failures in case of extreme contingencies. Further, we study the properties of this problem and propose efficient solution methods to solve it for large-scale power systems. We present numerical results showing the effectiveness of the model and investigate the performance of the solution methods.
Next, we address the sustainability issue, considering the integration of renewable energy resources into the production planning of energy-intensive manufacturing industries. Recently, a growing number of manufacturing companies are considering renewable energies to meet their energy requirements, both to move towards green manufacturing and to decrease their energy costs. However, the intermittent nature of renewable energies imposes several difficulties in long-term planning of how to efficiently exploit renewables. In this study, we propose a scheme for manufacturing companies to use onsite and grid renewable energies, provided by their own investments and energy utilities, as well as conventional grid energy to satisfy their energy requirements. We propose a multistage stochastic programming model and study an efficient solution method to solve this problem. We examine the proposed framework on a test case simulated from a real-world semiconductor company. Moreover, we evaluate the long-term profitability of such a scheme via the so-called value of multistage stochastic programming.
ParticleCall: A particle filter for base calling in next-generation sequencing systems
2012-01-01
Background Next-generation sequencing systems are capable of rapid and cost-effective DNA sequencing, thus enabling routine sequencing tasks and taking us one step closer to personalized medicine. Accuracy and lengths of their reads, however, are yet to surpass those provided by the conventional Sanger sequencing method. This motivates the search for computationally efficient algorithms capable of reliable and accurate detection of the order of nucleotides in short DNA fragments from the acquired data. Results In this paper, we consider Illumina's sequencing-by-synthesis platform, which relies on reversible terminator chemistry, and describe the acquired signal by reformulating its mathematical model as a Hidden Markov Model. Relying on this model and sequential Monte Carlo methods, we develop a parameter estimation and base calling scheme called ParticleCall. ParticleCall is tested on a data set obtained by sequencing phiX174 bacteriophage using Illumina's Genome Analyzer II. The results show that the developed base calling scheme is significantly more computationally efficient than the best performing unsupervised method currently available, while achieving the same accuracy. Conclusions The proposed ParticleCall provides more accurate calls than Illumina's base calling algorithm, Bustard. At the same time, ParticleCall is significantly more computationally efficient than other recent schemes with similar performance, rendering it more feasible for high-throughput sequencing data analysis. Improvement of base calling accuracy will have immediate beneficial effects on the performance of downstream applications such as SNP and genotype calling. ParticleCall is freely available at https://sourceforge.net/projects/particlecall. PMID:22776067
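The sequential Monte Carlo machinery that ParticleCall builds on can be illustrated with a minimal bootstrap particle filter. This sketch is generic and hypothetical, not ParticleCall's actual model (which tracks the state of the reversible-terminator chemistry): here the hidden state is a scalar random walk observed with Gaussian noise, and the filter alternates propagation, likelihood weighting, and resampling.

```python
import math
import random

# Minimal bootstrap particle filter for a scalar random-walk state observed
# with Gaussian noise. Returns the posterior-mean state estimate per step.

def particle_filter(observations, n_particles=1000,
                    process_sd=1.0, obs_sd=1.0, seed=0):
    rng = random.Random(seed)
    particles = [0.0] * n_particles  # all particles start at state 0
    estimates = []
    for y in observations:
        # 1. Propagate each particle through the process model.
        particles = [x + rng.gauss(0.0, process_sd) for x in particles]
        # 2. Weight by the observation likelihood N(y; x, obs_sd).
        weights = [math.exp(-0.5 * ((y - x) / obs_sd) ** 2)
                   for x in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # 3. Posterior-mean estimate of the hidden state.
        estimates.append(sum(w * x for w, x in zip(weights, particles)))
        # 4. Multinomial resampling to concentrate on likely states.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates
```

Feeding a constant observation stream, the estimate climbs from the prior toward the observed value over a few cycles, which is the same tracking behavior a base caller exploits per sequencing cycle.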
DOE Office of Scientific and Technical Information (OSTI.GOV)
Todd, Annika; Perry, Michael; Smith, Brian
Smart meters, smart thermostats, and other new technologies provide previously unavailable high-frequency and location-specific energy usage data. Many utilities are now able to capture real-time, customer-specific hourly interval usage data for a large proportion of their residential and small commercial customers. These vast, constantly growing streams of rich data (or, "big data") have the potential to provide novel insights into key policy questions about how people make energy decisions. The richness and granularity of these data enable many types of creative and cutting-edge analytics. Technically sophisticated and rigorous statistical techniques can be used to pull useful insights out of this high-frequency, human-focused data. In this series, we call this "behavior analytics." This kind of analytics has the potential to provide tremendous value to a wide range of energy programs. For example, disaggregated and heterogeneous information about actual energy use allows energy efficiency (EE) and/or demand response (DR) program implementers to target specific programs to specific households; enables evaluation, measurement and verification (EM&V) of energy efficiency programs to be performed on a much shorter time horizon than was previously possible; and may provide better insights into the energy and peak hour savings associated with EE and DR programs (e.g., behavior-based (BB) programs). The goal of this series is to enable evidence-based and data-driven decision making by policy makers and industry stakeholders, including program planners, program administrators, utilities, state regulatory agencies, and evaluators. We focus on research findings that are immediately relevant.
Efficient reordering of PROLOG programs
NASA Technical Reports Server (NTRS)
Gooley, Markian M.; Wah, Benjamin W.
1989-01-01
PROLOG programs are often inefficient: execution corresponds to a depth-first traversal of an AND/OR graph; traversing subgraphs in another order can be less expensive. It is shown how the reordering of clauses within PROLOG predicates, and especially of goals within clauses, can prevent unnecessary search. The characterization and detection of restrictions on reordering is discussed. A system of calling modes for PROLOG, geared to reordering, is proposed, and ways to infer them automatically are discussed. The information needed for safe reordering is summarized, and which types can be inferred automatically and which must be provided by the user are considered. An improved method for determining a good order for the goals of PROLOG clauses is presented and used as the basis for a reordering system.
Speeding up parallel processing
NASA Technical Reports Server (NTRS)
Denning, Peter J.
1988-01-01
In 1967 Amdahl expressed doubts about the ultimate utility of multiprocessors. The formulation, now called Amdahl's law, became part of the computing folklore and has inspired much skepticism about the ability of the current generation of massively parallel processors to efficiently deliver all their computing power to programs. The widely publicized recent results of a group at Sandia National Laboratory, which showed speedup on a 1024 node hypercube of over 500 for three fixed size problems and over 1000 for three scalable problems, have convincingly challenged this bit of folklore and have given new impetus to parallel scientific computing.
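Both sides of this debate can be written down directly. The calculation below is my illustration, not from the article: Amdahl's law bounds fixed-size speedup by S(n) = 1 / ((1 - p) + p/n) for parallel fraction p on n processors, while Gustafson's scaled-speedup formulation, the usual resolution of the apparent paradox in results like Sandia's, lets the problem grow with the machine.

```python
# Amdahl's law: fixed problem size, p = parallelizable fraction of the work,
# n = number of processors.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Gustafson's law: problem size scales with the machine, so the serial
# fraction is measured on the large (parallel) run instead.
def gustafson_scaled_speedup(p: float, n: int) -> float:
    return n - (1.0 - p) * (n - 1)
```

Even a 5% serial fraction caps Amdahl speedup on 1024 processors below 20, whereas under the scaled view a 99% parallel workload yields speedup above 1000, which is consistent in spirit with the Sandia figures for scalable problems.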
1981-09-01
Caries was the most common cause of the emergencies (41.2%), followed by third molars/pericoronitis (15.9%), defective filling or fractured tooth (11.0%), and gingival and periodontal conditions. Tabulated frequencies from the exercises at Fort Irwin, CA, 1981: caries, 75 (41.2%); third molars/pericoronitis, 29 (15.9%); defective filling/fractured tooth, 20 (11.0%).
Determining Resident Sleep During and After Call With Commercial Sleep Monitoring Devices.
Morhardt, Duncan R; Luckenbaugh, Amy; Goldstein, Cathy; Faerber, Gary J
2017-08-01
To demonstrate that commercial activity monitoring devices (CAMDs) are practical for monitoring resident sleep while on call. Studies that have directly monitored resident sleep are limited, likely owing to both cost and difficulty in study interpretation. The advent of wearable CAMDs that estimate sleep presents the opportunity to more readily evaluate resident sleep in physically active settings and "home call," a coverage arrangement familiar to urology programs. Twelve urology residents were outfitted with Fitbit Flex devices during "home call" for a total of 57 (out of 64, or 89%) call or post-call night pairs. Residents were surveyed with the Stanford Sleepiness Scale (SSS), a single-question alertness survey. Time in bed (TIB) was "time to bed" to "rise for day." Fitbit accelerometers register activity as follows: (1) not moving; (2) minimal movement or restless; or (3) above threshold for accelerometer to register steps. Total sleep time (TST) was the number of minutes in level 1 activity during TIB. Sleep efficiency (SE) was defined as TST divided by TIB. While on call, 10 responding (of 12 available, 83%) residents on average reported TIB as 347 minutes, TST as 165 minutes, and had an SE of 47%. Interestingly, SSS responses did not correlate with sleep parameters. Post-call sleep demonstrated increases in TIB, SE, and TST (+23%, +15%, and +44%, respectively) while sleepiness was reduced by 22%. We demonstrate that urologic residents can consistently wear CAMDs while on home call. SSS did not correlate with Fitbit-estimated sleep duration. Further study with such devices may enhance sleep deprivation recognition to improve resident sleep. Copyright © 2017 Elsevier Inc. All rights reserved.
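The study's sleep-efficiency arithmetic is simple enough to state directly; the figures below are the on-call averages reported above (TIB 347 minutes, TST 165 minutes).

```python
# Sleep efficiency as defined in the study: SE = TST / TIB,
# i.e. the fraction of time in bed actually spent asleep.
def sleep_efficiency(tst_minutes: float, tib_minutes: float) -> float:
    return tst_minutes / tib_minutes

on_call_se = sleep_efficiency(165, 347)  # about 0.476, the ~47% reported
```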
A high-fidelity satellite ephemeris program for Earth satellites in eccentric orbits
NASA Technical Reports Server (NTRS)
Simmons, David R.
1990-01-01
A program for mission planning called the Analytic Satellite Ephemeris Program (ASEP) produces projected data for orbits that remain fairly close to the Earth. ASEP does not take into account lunar and solar perturbations. These perturbations are accounted for in another program called GRAVE, which incorporates more flexible means of input for initial data, provides additional kinds of output information, and makes use of structured programming techniques to make the program more understandable and reliable. GRAVE was revised, and a new program called ORBIT was developed. It is divided into three major phases: initialization, integration, and output. Results of the program development are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dean, J.; VanGeet, O.; Simkus, S.
This report outlines the lessons learned and sub-metered energy performance of an ultra-low-energy single-family ranch home and duplex unit, called the Paradigm Pilot Project, and presents the final design recommendations for a 153-unit net zero energy residential development called the Josephine Commons Project. Affordable housing development authorities throughout the United States continually struggle to find the most cost-effective pathway to provide quality, durable, and sustainable housing. The challenge for these authorities is to achieve the mission of delivering affordable housing at the lowest cost per square foot in environments that may be rural, urban, suburban, or within a designated redevelopment district. With the challenges the U.S. faces regarding energy, the environmental impacts of consumer use of fossil fuels, and the increased focus on reducing greenhouse gas emissions, housing authorities are pursuing the goal of constructing affordable, energy-efficient, and sustainable housing at the lowest life-cycle cost of ownership. In addition to describing the results of the performance monitoring from the pilot project, this report describes the recommended design process of (1) setting performance goals for energy efficiency and renewable energy on a life-cycle cost basis, (2) using an integrated, whole-building design approach, and (3) incorporating systems-built housing, a green jobs training program, and renewable energy technologies into a replicable high-performance, low-income housing project development model.
WhopGenome: high-speed access to whole-genome variation and sequence data in R.
Wittelsbürger, Ulrich; Pfeifer, Bastian; Lercher, Martin J
2015-02-01
The statistical programming language R has become a de facto standard for the analysis of many types of biological data, and is well suited for the rapid development of new algorithms. However, variant call data from population-scale resequencing projects are typically too large to be read and processed efficiently with R's built-in I/O capabilities. WhopGenome can efficiently read whole-genome variation data stored in the widely used variant call format (VCF) file format into several R data types. VCF files can be accessed either on local hard drives or on remote servers. WhopGenome can associate variants with annotations such as those available from the UCSC genome browser, and can accelerate the reading process by filtering loci according to user-defined criteria. WhopGenome can also read other Tabix-indexed files and create indices to allow fast selective access to FASTA-formatted sequence files. The WhopGenome R package is available on CRAN at http://cran.r-project.org/web/packages/WhopGenome/. A Bioconductor package has been submitted. lercher@cs.uni-duesseldorf.de. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
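The selective-access idea behind Tabix indexing can be illustrated with a toy in-memory example. This pure-Python sketch is mine, not WhopGenome's implementation: it records a byte offset per data line of a sorted VCF-like text, then answers a region query by binary search plus direct seeks, instead of scanning the whole file.

```python
import bisect
import io

def build_index(lines):
    """Record each data line's POS (column 2) and its character offset,
    skipping '#' header lines."""
    positions, offsets, offset = [], [], 0
    for line in lines:
        if not line.startswith("#"):
            positions.append(int(line.split("\t")[1]))
            offsets.append(offset)
        offset += len(line)
    return positions, offsets

def fetch(text, positions, offsets, start, end):
    """Return data lines whose POS lies in [start, end], seeking directly
    to each matching line rather than scanning from the top."""
    lo = bisect.bisect_left(positions, start)
    hi = bisect.bisect_right(positions, end)
    buf = io.StringIO(text)
    out = []
    for k in range(lo, hi):
        buf.seek(offsets[k])
        out.append(buf.readline().rstrip("\n"))
    return out
```

A real Tabix index works on compressed blocks and genomic bins rather than per-line offsets, but the access pattern (index lookup, then a seek into the file) is the same.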
Schillo, Barbara A; Mowery, Andrea; Greenseid, Lija O; Luxenberg, Michael G; Zieffler, Andrew; Christenson, Matthew; Boyle, Raymond G
2011-12-16
Background This observational study assessed the relation between mass media campaigns and service volume for a statewide tobacco cessation quitline and a stand-alone web-based cessation program. Methods Multivariate regression analysis was used to identify how weekly calls to a cessation quitline and weekly registrations to a web-based cessation program are related to levels of broadcast media, media campaigns, and media types, controlling for the impact of external and earned media events. Results There was a positive relation between weekly broadcast targeted rating points and the number of weekly calls to a cessation quitline and the number of weekly registrations to a web-based cessation program. Additionally, print secondhand smoke ads and online cessation ads were positively related to weekly quitline calls. Television and radio cessation ads and radio smoke-free law ads were positively related to web program registration levels. There was a positive relation between the number of web registrations and the number of calls to the cessation quitline, with increases in registrations to the web in 1 week corresponding to increases in calls to the quitline in the subsequent week. Web program registration levels were more highly influenced by earned media and other external events than were quitline call volumes. Conclusion Overall, broadcast advertising had a greater impact on registrations for the web program than calls to the quitline. Furthermore, registrations for the web program influenced calls to the quitline. These two findings suggest the evolving roles of web-based cessation programs and Internet-use practices should be considered when creating cessation programs and media campaigns to promote them. Additionally, because different types of media and campaigns were positively associated with calls to the quitline and web registrations, developing mass media campaigns that offer a variety of messages and communicate through different types of media to motivate tobacco users to seek services appears important to reach tobacco users. Further research is needed to better understand the complexities and opportunities involved in simultaneous promotion of quitline and web-based cessation services. PMID:22177237
Benchmarking and Self-Assessment in the Wine Industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galitsky, Christina; Radspieler, Anthony; Worrell, Ernst
2005-12-01
Not all industrial facilities have the staff or the opportunity to perform a detailed audit of their operations. The lack of knowledge of energy efficiency opportunities provides an important barrier to improving efficiency. Benchmarking programs in the U.S. and abroad have been shown to improve knowledge of the energy performance of industrial facilities and buildings and to fuel energy management practices. Benchmarking provides a fair way to compare the energy intensity of plants, while accounting for structural differences (e.g., the mix of products produced, climate conditions) between different facilities. In California, the winemaking industry is not only one of the economic pillars of the economy; it is also a large energy consumer, with considerable potential for energy-efficiency improvement. Lawrence Berkeley National Laboratory and Fetzer Vineyards developed the first benchmarking tool for the California wine industry, called "BEST (Benchmarking and Energy and water Savings Tool) Winery". BEST Winery enables a winery to compare its energy efficiency to a best-practice reference winery. Besides overall performance, the tool enables the user to evaluate the impact of implementing efficiency measures. The tool facilitates strategic planning of efficiency measures, based on the estimated impact of the measures, their costs, and savings. The tool will raise awareness of current energy intensities and offer an efficient way to evaluate the impact of future efficiency measures.
Koda, Hiroki
2012-09-01
Heterospecific communication signals sometimes convey relevant information for animal survival. For example, animals use or eavesdrop on heterospecific alarm calls concerning common predators. Indeed, most observations have been reported regarding anti-predator strategies. Use of heterospecific signals has rarely been observed as part of a foraging strategy. Here, I report empirical evidence, collected using playback experiments, showing that Japanese sika deer, Cervus nippon, use heterospecific food calls of Japanese macaques, Macaca fuscata yakui, for foraging efficiency. The deer and macaques both inhabit the wild forest of Yakushima Island with high population densities and share many food items. Anecdotal observations suggest that deer often wait to browse fruit falls under the tree where a macaque group is foraging. Furthermore, macaques frequently produce food calls during their foraging. If deer effectively obtain fruit from the leftovers of macaques, browsing fruit fall would provide a potential benefit to the deer, and, further, deer are likely to associate macaque food calls with feeding activity. The results showed that playback of macaque food calls under trees gathered significantly more deer than silence control periods. These results suggest that deer can associate macaque food calls with foraging activities and use heterospecific calls for foraging efficiency. Copyright © 2012 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Hughes, Christopher E.
2013-01-01
The National Aeronautics and Space Administration has taken an active role in collaborative research with the U.S. aerospace industry to investigate technologies to minimize the impact of aviation on the environment. In December 2006, a new program, called the Fundamental Aeronautics Program, was established to enhance U.S. aeronautics technology and conduct research on energy, efficiency and the environment. A project within the overall program, the Subsonic Fixed Wing Project, was formed to focus on research related to subsonic aircraft with specific goals and time-based milestones to reduce aircraft noise, emissions and fuel burn. This paper will present an overview of the Subsonic Fixed Wing Project environmental goals and describe a segment of the current research conducted within NASA, and collaboratively with partners from the U.S. aerospace industry, related to the next generation of aircraft that will have lower noise, emissions and fuel burn.
The X-windows interactive navigation data editor
NASA Technical Reports Server (NTRS)
Rinker, G. C.
1992-01-01
A new computer program called the X-Windows Interactive Data Editor (XIDE) was developed and demonstrated as a prototype application for editing radio metric data in the orbit-determination process. The program runs on a variety of workstations and employs pull-down menus and graphical displays, which allow users to easily inspect and edit radio metric data in the orbit data files received from the Deep Space Network (DSN). The XIDE program is based on the Open Software Foundation OSF/Motif Graphical User Interface (GUI) and has proven to be an efficient tool for editing radio metric data in the navigation operations environment. It was adopted by the Magellan Navigation Team as their primary data-editing tool. Because the software was designed from the beginning to be portable, the prototype was successfully moved to new workstation environments. It was also integrated into the design of the next-generation software tool for DSN multimission navigation interactive launch support.
From non-preemptive to preemptive scheduling using synchronization synthesis.
Černý, Pavol; Clarke, Edmund M; Henzinger, Thomas A; Radhakrishna, Arjun; Ryzhyk, Leonid; Samanta, Roopsha; Tarrach, Thorsten
2017-01-01
We present a computer-aided programming approach to concurrency. The approach allows programmers to program assuming a friendly, non-preemptive scheduler, and our synthesis procedure inserts synchronization to ensure that the final program works even with a preemptive scheduler. The correctness specification is implicit, inferred from the non-preemptive behavior. Let us consider sequences of calls that the program makes to an external interface. The specification requires that any such sequence produced under a preemptive scheduler should be included in the set of sequences produced under a non-preemptive scheduler. We guarantee that our synthesis does not introduce deadlocks and that the synchronization inserted is optimal w.r.t. a given objective function. The solution is based on a finitary abstraction, an algorithm for bounded language inclusion modulo an independence relation, and generation of a set of global constraints over synchronization placements. Each model of the global constraints set corresponds to a correctness-ensuring synchronization placement. The placement that is optimal w.r.t. the given objective function is chosen as the synchronization solution. We apply the approach to device-driver programming, where the driver threads call the software interface of the device and the API provided by the operating system. Our experiments demonstrate that our synthesis method is precise and efficient. The implicit specification helped us find one concurrency bug previously missed when model-checking using an explicit, user-provided specification. We implemented objective functions for coarse-grained and fine-grained locking and observed that different synchronization placements are produced for our experiments, favoring a minimal number of synchronization operations or maximum concurrency, respectively.
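The guarantee the synthesis enforces can be illustrated with a hypothetical sketch (not the authors' tool): two driver threads each make two calls to an external interface, and an inserted lock ensures that any call sequence observable under a preemptive scheduler is one a non-preemptive scheduler could have produced.

```python
import threading

calls = []               # observed sequence of external-interface calls
lock = threading.Lock()  # the synthesized synchronization (illustrative placement)

def driver(name):
    # Under a non-preemptive scheduler the two calls below would never be
    # interleaved with another thread's calls; the inserted lock enforces
    # the same guarantee under a preemptive scheduler.
    with lock:
        calls.append(name)  # first call to the external interface
        calls.append(name)  # second call to the external interface

threads = [threading.Thread(target=driver, args=(n,)) for n in ("A", "B")]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Any observed sequence must be one the non-preemptive scheduler could produce.
assert calls in (["A", "A", "B", "B"], ["B", "B", "A", "A"])
```

Without the lock, an interleaving such as A, B, A, B would be possible under preemption but not under the friendly scheduler, violating the implicit specification.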
Moeller, Andrew; Webber, Jordan; Epstein, Ian
2016-07-13
Resident duty hours have recently been under criticism, with concerns for resident and patient well-being. Historically, call shifts have been long, and some residency training programs have now restricted shift lengths. Data and opinions about the effects of such restrictions are conflicting. The Internal Medicine Residency Program at Dalhousie University recently moved from a traditional call structure to a day float/night float system. This study evaluated how this change in duty hours affected resident perceptions in several key domains. Senior residents from an internal medicine training program in Canada responded to an anonymous online survey immediately before and 6 months after the implementation of duty hour reform. The survey contained questions relating to three major domains: resident wellness, ability to deliver quality health care, and medical education experience. Mean pre- and post-intervention scores were compared using the t-test for paired samples. Twenty-three of 27 (85 %) senior residents completed both pre- and post-reform surveys. Residents perceived significant changes in many domains with duty hour reform. These included improved general wellness, less exposure to personal harm, fewer feelings of isolation, less potential for error, improvement in clinical skills expertise, increased work efficiency, more successful teaching, increased proficiency in medical skills, more successful learning, and fewer rotation disruptions. Senior residents in a Canadian internal medicine training program perceived significant benefits in medical education experience, ability to deliver healthcare, and resident wellness after implementation of duty hour reform.
Standish, Kristopher A; Carland, Tristan M; Lockwood, Glenn K; Pfeiffer, Wayne; Tatineni, Mahidhar; Huang, C Chris; Lamberth, Sarah; Cherkas, Yauheniya; Brodmerkel, Carrie; Jaeger, Ed; Smith, Lance; Rajagopal, Gunaretnam; Curran, Mark E; Schork, Nicholas J
2015-09-22
Next-generation sequencing (NGS) technologies have become much more efficient, allowing whole human genomes to be sequenced faster and cheaper than ever before. However, processing the raw sequence reads associated with NGS technologies requires care and sophistication in order to draw compelling inferences about phenotypic consequences of variation in human genomes. It has been shown that different approaches to variant calling from NGS data can lead to different conclusions. Ensuring appropriate accuracy and quality in variant calling can come at a computational cost. We describe our experience implementing and evaluating a group-based approach to calling variants on large numbers of whole human genomes. We explore the influence of many factors that may impact the accuracy and efficiency of group-based variant calling, including group size, the biogeographical backgrounds of the individuals who have been sequenced, and the computing environment used. We make efficient use of the Gordon supercomputer cluster at the San Diego Supercomputer Center by incorporating job-packing and parallelization considerations into our workflow while calling variants on 437 whole human genomes generated as part of a large association study. We ultimately find that our workflow resulted in high-quality variant calls in a computationally efficient manner. We argue that studies like ours should motivate further investigations combining hardware-oriented advances in computing systems with algorithmic developments to tackle emerging 'big data' problems in biomedical research brought on by the expansion of NGS technologies.
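The job-packing idea can be sketched with a generic first-fit-decreasing heuristic (an illustration, not the authors' actual scheduler): per-sample variant-calling jobs with known core requirements are packed onto nodes without exceeding the per-node core count.

```python
def pack_jobs(job_cores, cores_per_node):
    """First-fit-decreasing packing of per-sample jobs onto compute nodes.

    job_cores: list of core counts, one per job (illustrative numbers below).
    Returns a list of nodes, each a list of (job_id, cores) assignments.
    """
    nodes = []  # assignments per node
    free = []   # remaining free cores per node
    # Place the biggest jobs first: fewer awkward leftovers to fit later.
    for job, cores in sorted(enumerate(job_cores), key=lambda jc: -jc[1]):
        for i, f in enumerate(free):
            if f >= cores:              # first node with room wins
                nodes[i].append((job, cores))
                free[i] -= cores
                break
        else:                           # no node had room: open a new one
            nodes.append([(job, cores)])
            free.append(cores_per_node - cores)
    return nodes

# Hypothetical jobs (cores each) packed onto 16-core nodes.
nodes = pack_jobs([4, 8, 4, 2, 8, 6], cores_per_node=16)
```

Packing jobs this way keeps nodes near full utilization, which is the point of the job-packing considerations described in the workflow.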
Nishizawa, Hiroaki; Nishimura, Yoshifumi; Kobayashi, Masato; Irle, Stephan; Nakai, Hiromi
2016-08-05
The linear-scaling divide-and-conquer (DC) quantum chemical methodology is applied to the density-functional tight-binding (DFTB) theory to develop a massively parallel program that achieves on-the-fly molecular reaction dynamics simulations of huge systems from scratch. The functions to perform large-scale geometry optimization and molecular dynamics with the DC-DFTB potential energy surface are implemented in the program, called DC-DFTB-K. A novel interpolation-based algorithm is developed for parallelizing the determination of the Fermi level in the DC method. The performance of the DC-DFTB-K program is assessed using a laboratory computer and the K computer. Numerical tests show the high efficiency of the DC-DFTB-K program: a single-point energy gradient calculation of a one-million-atom system is completed within 60 s using 7290 nodes of the K computer. © 2016 Wiley Periodicals, Inc.
MaMR: High-performance MapReduce programming model for material cloud applications
NASA Astrophysics Data System (ADS)
Jing, Weipeng; Tong, Danyu; Wang, Yangang; Wang, Jingyuan; Liu, Yaqiu; Zhao, Peng
2017-02-01
With the increasing data size in materials science, existing programming models no longer satisfy the application requirements. MapReduce is a programming model that enables the easy development of scalable parallel applications to process big data on cloud computing systems. However, this model does not directly support the processing of multiple related data, and the processing performance does not reflect the advantages of cloud computing. To enhance the capability of workflow applications in material data processing, we defined a programming model for material cloud applications that supports multiple different Map and Reduce functions running concurrently, based on a hybrid shared-memory BSP model, called MaMR. An optimized data sharing strategy to supply the shared data to the different Map and Reduce stages was also designed. We added a new merge phase to MapReduce that can efficiently merge data from the map and reduce modules. Experiments showed that the model and framework present effective performance improvements compared to previous work.
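The key ideas, several Map/Reduce jobs over shared input plus an explicit merge phase joining their per-key outputs, can be sketched in a minimal single-process form (the function names are illustrative, not MaMR's API):

```python
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    """Plain MapReduce: map each record to (key, value) pairs, group by key,
    then reduce each group to a single value."""
    groups = defaultdict(list)
    for rec in records:
        for key, value in mapper(rec):
            groups[key].append(value)
    return {key: reducer(values) for key, values in groups.items()}

def merge(job_outputs):
    """The extra merge phase: join the per-key outputs of several jobs into
    one record per key, e.g. {'a': {'count': 2, 'length': 1}, ...}."""
    merged = defaultdict(dict)
    for name, output in job_outputs.items():
        for key, value in output.items():
            merged[key][name] = value
    return dict(merged)

# Two different Map/Reduce jobs run over the same shared input...
docs = ["a b a", "b c"]
counts = map_reduce(docs, lambda d: [(w, 1) for w in d.split()], sum)
maxlen = map_reduce(docs, lambda d: [(w, len(w)) for w in d.split()], max)

# ...and the merge phase combines their results per key.
combined = merge({"count": counts, "length": maxlen})
```

In a real framework the two jobs would run concurrently with the shared input served once to both, which is what the data sharing strategy optimizes.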
Energy-efficient membrane separations in the sweetener industry. Final report for Phase I
DOE Office of Scientific and Technical Information (OSTI.GOV)
Babcock, W.C.
1984-02-14
The objective of the program is to investigate the use of membrane processes as energy-efficient alternatives to conventional separation processes in current use in the corn sweetener industry. Two applications of membranes were studied during the program: (1) the concentration of corn steep water by reverse osmosis; and (2) the concentration of dilute wastes called sweetwater with a combination of reverse osmosis and a process known as countercurrent reverse osmosis. Laboratory experiments were conducted for both applications, and the results were used to conduct technical and economic analyses of the process. It was determined that the concentration of steep water by reverse osmosis plus triple-effect evaporation offers savings of a factor of 2.5 in capital costs and a factor of 4.5 in operating costs over currently used triple-effect evaporation. In the concentration of sweetwater by reverse osmosis and countercurrent reverse osmosis, capital costs would be about the same as those for triple-effect evaporation, but operating costs would be only about one-half those of triple-effect evaporation.
Fuel efficient traffic signal operation and evaluation: Garden Grove Demonstration Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1983-02-01
The procedures and results of a case study of fuel efficient traffic signal operation and evaluation in the City of Garden Grove, California are documented. Improved traffic signal timing was developed for a 70-intersection test network in Garden Grove using an optimization tool called the TRANSYT Version 8 computer program. Full-scale field testing of five alternative timing plans was conducted using two instrumented vehicles equipped to measure traffic performance characteristics and fuel consumption. The field tests indicated that significant improvements in traffic flow and fuel consumption result from the use of timing plans generated by the TRANSYT optimization model. Changing from the pre-existing to an optimized timing plan yields a networkwide 5 percent reduction in total travel time, more than 10 percent reduction in both the number of stops and stopped delay time, and 6 percent reduction in fuel consumption. Projections are made of the benefits and costs of implementing such a program at the 20,000 traffic signals in networks throughout the State of California.
PSOVina: The hybrid particle swarm optimization algorithm for protein-ligand docking.
Ng, Marcus C K; Fong, Simon; Siu, Shirley W I
2015-06-01
Protein-ligand docking is an essential step in the modern drug discovery process. The challenge here is to accurately predict and efficiently optimize the position and orientation of ligands in the binding pocket of a target protein. In this paper, we present a new method called PSOVina which combined the particle swarm optimization (PSO) algorithm with the efficient Broyden-Fletcher-Goldfarb-Shanno (BFGS) local search method adopted in AutoDock Vina to tackle the conformational search problem in docking. Using a diverse data set of 201 protein-ligand complexes from the PDBbind database and a full set of ligands and decoys for four representative targets from the directory of useful decoys (DUD) virtual screening data set, we assessed the docking performance of PSOVina in comparison to the original Vina program. Our results showed that PSOVina achieves a remarkable execution time reduction of 51-60% without compromising the prediction accuracies in the docking and virtual screening experiments. This improvement in time efficiency makes PSOVina a better choice of a docking tool in large-scale protein-ligand docking applications. Our work lays the foundation for the future development of swarm-based algorithms in molecular docking programs. PSOVina is freely available to non-commercial users at http://cbbio.cis.umac.mo.
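A minimal global-best PSO, the population half of the hybrid (the BFGS local search that PSOVina adds to refine candidates is omitted), can be sketched as follows; the search domain, swarm size, and coefficients are generic defaults, not PSOVina's settings.

```python
import random

def pso(f, dim, n_particles=20, iters=200, seed=0):
    """Minimal global-best particle swarm optimization of f over [-5, 5]^dim."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]              # each particle's best position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm's best position
    w, c1, c2 = 0.7, 1.5, 1.5                # inertia, cognitive, social weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:           # update personal and global bests
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy objective standing in for the docking scoring function.
best, val = pso(lambda x: sum(xi * xi for xi in x), dim=3)
```

In the hybrid, each promising swarm position would additionally be polished by a BFGS local search on the scoring function, combining global exploration with fast local convergence.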
NASA Astrophysics Data System (ADS)
Penzias, Gregory; Janowczyk, Andrew; Singanamalli, Asha; Rusu, Mirabela; Shih, Natalie; Feldman, Michael; Stricker, Phillip D.; Delprado, Warick; Tiwari, Sarita; Böhm, Maret; Haynes, Anne-Maree; Ponsky, Lee; Viswanath, Satish; Madabhushi, Anant
2016-07-01
In applications involving large tissue specimens that have been sectioned into smaller tissue fragments, manual reconstruction of a “pseudo whole-mount” histological section (PWMHS) can facilitate (a) pathological disease annotation, and (b) image registration and correlation with radiological images. We have previously presented a program called HistoStitcher, which allows for more efficient manual reconstruction than general purpose image editing tools (such as Photoshop). However HistoStitcher is still manual and hence can be laborious and subjective, especially when doing large cohort studies. In this work we present AutoStitcher, a novel automated algorithm for reconstructing PWMHSs from digitized tissue fragments. AutoStitcher reconstructs (“stitches”) a PWMHS from a set of 4 fragments by optimizing a novel cost function that is domain-inspired to ensure (i) alignment of similar tissue regions, and (ii) contiguity of the prostate boundary. The algorithm achieves computational efficiency by performing reconstruction in a multi-resolution hierarchy. Automated PWMHS reconstruction results (via AutoStitcher) were quantitatively and qualitatively compared to manual reconstructions obtained via HistoStitcher for 113 prostate pathology sections. Distances between corresponding fiducials placed on each of the automated and manual reconstruction results were between 2.7% and 3.2%, reflecting their excellent visual similarity.
Carrault, G; Cordier, M-O; Quiniou, R; Wang, F
2003-07-01
This paper proposes a novel approach to cardiac arrhythmia recognition from electrocardiograms (ECGs). ECGs record the electrical activity of the heart and are used to diagnose many heart disorders. The numerical ECG is first temporally abstracted into series of time-stamped events. Temporal abstraction makes use of artificial neural networks to extract interesting waves and their features from the input signals. A temporal reasoner called a chronicle recogniser processes such series in order to discover temporal patterns called chronicles which can be related to cardiac arrhythmias. Generally, it is difficult to elicit an accurate set of chronicles from a doctor. Thus, we propose to learn automatically from symbolic ECG examples the chronicles discriminating the arrhythmias belonging to some specific subset. Since temporal relationships are of major importance, inductive logic programming (ILP) is the tool of choice as it enables first-order relational learning. The approach has been evaluated on real ECGs taken from the MIT-BIH database. The performance of the different modules as well as the efficiency of the whole system is presented. The results are rather good and demonstrate that integrating numerical techniques for low level perception and symbolic techniques for high level classification is very valuable.
NASA's In-Space Propulsion Technology Program: A Step Toward Interstellar Exploration
NASA Technical Reports Server (NTRS)
Johnson, Les; James, Bonnie; Baggett, Randy; Montgomery, Sandy
2005-01-01
NASA's In-Space Propulsion Technology Program is investing in technologies that have the potential to revolutionize the robotic exploration of deep space. For robotic exploration and science missions, increased efficiencies of future propulsion systems are critical to reduce overall life-cycle costs and, in some cases, enable missions previously considered impossible. Continued reliance on conventional chemical propulsion alone will not enable the robust exploration of deep space. The maximum theoretical efficiencies have almost been reached and are insufficient to meet needs for many ambitious science missions currently being considered. By developing the capability to support mid-term robotic mission needs, the program is laying the technological foundation for travel to nearby interstellar space. The In-Space Propulsion Technology Program s technology portfolio includes many advanced propulsion systems. From the next-generation ion propulsion systems operating in the 5-10 kW range, to solar sail propulsion, substantial advances in spacecraft propulsion performance are anticipated. Some of the most promising technologies for achieving these goals use the environment of space itself for energy and propulsion and are generically called "propellantless" because they do not require onboard fuel to achieve thrust. Propellantless propulsion technologies include scientific innovations, such as solar sails, electrodynamic and momentum transfer tethers, and aerocapture. This paper will provide an overview of those propellantless and propellant-based advanced propulsion technologies that will most significantly advance our exploration of deep space.
NASA Astrophysics Data System (ADS)
Stock, Joachim W.; Kitzmann, Daniel; Patzer, A. Beate C.; Sedlmayr, Erwin
2018-06-01
For the calculation of complex neutral/ionized gas phase chemical equilibria, we present a semi-analytical versatile and efficient computer program, called FastChem. The applied method is based on the solution of a system of coupled nonlinear (and linear) algebraic equations, namely the law of mass action and the element conservation equations including charge balance, in many variables. Specifically, the system of equations is decomposed into a set of coupled nonlinear equations in one variable each, which are solved analytically whenever feasible to reduce computation time. Notably, the electron density is determined by using the method of Nelder and Mead at low temperatures. The program is written in object-oriented C++, which makes it easy to couple the code with other programs, although a stand-alone version is provided. FastChem can be used in parallel or sequentially and is available under the GNU General Public License version 3 at https://github.com/exoclime/FastChem together with several sample applications. The code has been successfully validated against previous studies and its convergence behavior has been tested even for extreme physical parameter ranges down to 100 K and up to 1000 bar. FastChem converges stably and robustly even in the most demanding chemical situations, which have sometimes posed extreme challenges for previous algorithms.
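The decomposition into one-variable equations with analytical solutions can be illustrated on the simplest possible case, hydrogen dissociation: the mass-action law K = n_H²/n_H2 combined with element conservation n_H + 2·n_H2 = n_tot collapses into a single quadratic in n_H. This is a toy analogue of the per-element closed forms FastChem exploits, not its actual equation set.

```python
import math

def h_abundance(n_tot, K):
    """Equilibrium number densities for the toy dissociation H2 <-> 2H.

    Mass action:   K = n_H**2 / n_H2
    Conservation:  n_H + 2 * n_H2 = n_tot
    Substituting n_H2 = n_H**2 / K gives one quadratic in n_H:
        (2/K) * n_H**2 + n_H - n_tot = 0
    which is solved analytically (positive root only).
    """
    a, b, c = 2.0 / K, 1.0, -n_tot
    n_h = (-b + math.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
    return n_h, n_h * n_h / K   # (n_H, n_H2)

n_h, n_h2 = h_abundance(n_tot=1.0, K=1.0)
```

In the full problem, one such closed-form (or single-variable Newton) solve is done per element, with the coupling between elements handled by iterating the set of one-variable equations to self-consistency.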
NASA Technical Reports Server (NTRS)
Haftka, R. T.; Adelman, H. M.
1984-01-01
Orbiting spacecraft such as large space antennas have to maintain a highly accurate shape to operate satisfactorily. Such structures require active and passive controls to maintain an accurate shape under a variety of disturbances. Methods for the optimum placement of control actuators for correcting static deformations are described. In particular, attention is focused on the case where control locations have to be selected from a large set of available sites, so that integer programming methods are called for. The effectiveness of three heuristic techniques for obtaining a near-optimal site selection is compared. In addition, efficient reanalysis techniques for the rapid assessment of control effectiveness are presented. Two examples are used to demonstrate the methods: a simple beam structure and a 55-m space-truss parabolic antenna.
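One common heuristic flavor for this kind of site selection is greedy placement: at each step, add the candidate actuator (with its least-squares optimal gain) that most reduces the residual static deformation. The sketch below is a generic illustration with made-up influence vectors, not one of the paper's three specific techniques.

```python
def greedy_placement(deformation, influences, k):
    """Greedily choose k actuator sites to correct a static deformation.

    deformation: measured shape error at a set of structural nodes.
    influences:  per-site influence vectors (shape change per unit actuator
                 gain), assumed nonzero; illustrative values below.
    Returns the chosen site indices and the final residual error (sum of squares).
    """
    residual = list(deformation)
    chosen = []
    for _ in range(k):
        best = None
        for site, u in enumerate(influences):
            if site in chosen:
                continue
            # Least-squares optimal gain for actuator u against the residual.
            g = sum(r * ui for r, ui in zip(residual, u)) / sum(ui * ui for ui in u)
            trial = [r - g * ui for r, ui in zip(residual, u)]
            err = sum(x * x for x in trial)
            if best is None or err < best[0]:
                best = (err, site, trial)
        err, site, residual = best
        chosen.append(site)
    return chosen, err

# Toy case: the third candidate site cancels the deformation exactly.
sites, err = greedy_placement([1.0, 1.0, 0.0],
                              [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [1.0, 1.0, 0.0]],
                              k=1)
```

Greedy selection is fast but not guaranteed optimal, which is why the paper compares several heuristics against the integer-programming formulation.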
Jou, Jonathan D; Jain, Swati; Georgiev, Ivelin S; Donald, Bruce R
2016-06-01
Sparse energy functions that ignore long range interactions between residue pairs are frequently used by protein design algorithms to reduce computational cost. Current dynamic programming algorithms that fully exploit the optimal substructure produced by these energy functions only compute the GMEC. This disproportionately favors the sequence of a single, static conformation and overlooks better binding sequences with multiple low-energy conformations. Provable, ensemble-based algorithms such as A* avoid this problem, but A* cannot guarantee better performance than exhaustive enumeration. We propose a novel, provable, dynamic programming algorithm called Branch-Width Minimization* (BWM*) to enumerate a gap-free ensemble of conformations in order of increasing energy. Given a branch-decomposition of branch-width w for an n-residue protein design with at most q discrete side-chain conformations per residue, BWM* returns the sparse GMEC in O([Formula: see text]) time and enumerates each additional conformation in merely O([Formula: see text]) time. We define a new measure, Total Effective Search Space (TESS), which can be computed efficiently a priori before BWM* or A* is run. We ran BWM* on 67 protein design problems and found that TESS discriminated between BWM*-efficient and A*-efficient cases with 100% accuracy. As predicted by TESS and validated experimentally, BWM* outperforms A* in 73% of the cases and computes the full ensemble or a close approximation faster than A*, enumerating each additional conformation in milliseconds. Unlike A*, the performance of BWM* can be predicted in polynomial time before running the algorithm, which gives protein designers the power to choose the most efficient algorithm for their particular design problem.
Beluga whale, Delphinapterus leucas, vocalizations from the Churchill River, Manitoba, Canada.
Chmelnitsky, Elly G; Ferguson, Steven H
2012-06-01
Classification of animal vocalizations is often done by a human observer using aural and visual analysis but more efficient, automated methods have also been utilized to reduce bias and increase reproducibility. Beluga whale, Delphinapterus leucas, calls were described from recordings collected in the summers of 2006-2008, in the Churchill River, Manitoba. Calls (n=706) were classified based on aural and visual analysis, and call characteristics were measured; calls were separated into 453 whistles (64.2%; 22 types), 183 pulsed/noisy calls (25.9%; 15 types), and 70 combined calls (9.9%; seven types). Measured parameters varied within each call type but less variation existed in pulsed and noisy call types and some combined call types than in whistles. A more efficient and repeatable hierarchical clustering method was applied to 200 randomly chosen whistles using six call characteristics as variables; twelve groups were identified. Call characteristics varied less in cluster analysis groups than in whistle types described by visual and aural analysis and results were similar to the whistle contours described. This study provided the first description of beluga calls in Hudson Bay and using two methods provides more robust interpretations and an assessment of appropriate methods for future studies.
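A naive single-linkage agglomerative clustering over call-feature vectors, in the spirit of the hierarchical method applied to the six whistle measurements (the study's actual linkage criterion and software may differ), can be sketched as:

```python
def hcluster(points, n_clusters):
    """Single-linkage agglomerative clustering down to n_clusters groups.

    points: feature vectors, one per call (here toy 2-D features; the study
    used six measured call characteristics). Returns lists of point indices.
    """
    def d2(a, b):  # squared Euclidean distance between two calls
        return sum((x - y) ** 2 for x, y in zip(points[a], points[b]))

    clusters = [[i] for i in range(len(points))]   # start: one call per cluster
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # Single linkage: distance between closest members.
                link = min(d2(a, b) for a in clusters[i] for b in clusters[j])
                if best is None or link < best[0]:
                    best = (link, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)             # merge the closest pair
    return clusters

# Toy "calls": two well-separated groups in a 2-D feature space.
calls = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (5.0, 5.0), (5.1, 5.0), (5.0, 5.1)]
groups = hcluster(calls, n_clusters=2)
```

Cutting the merge hierarchy at a chosen number of groups (twelve in the study) is what makes the procedure repeatable, in contrast to observer-dependent aural/visual typing.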
[The ALANAM statement on public health policy].
Goic, Alejandro; Armas, Rodolfo
2010-12-01
The ALANAM (Association of Latin American National Academies of Medicine) statement on public health policy, issued following its 19th Congress, held October 28–30, 2010, in Santiago, Chile, declares that cardiovascular diseases, cancer, accidents and violence are the leading causes of death in the region, while in several of its member nations, emergent and re-emergent infectious diseases, malnutrition, and mother-child illnesses remain prevalent. The statement calls attention to the lack of functioning water supply and sewage systems in many villages and rural areas. After describing the social causes of the present state of public health in Latin America (poverty levels reaching upwards of 44% of the total population, or some 110 million people), it calls on governments, first, to spare no efforts in the task of eradicating extreme poverty in the short-term, and poverty in the long-term. Second, considering that about 15 million 3-to-6 year-olds have no access to education, it recommends extending educational services to these children, and to improve the quality of existing pre-school and primary education. Third, the statement calls for universal health care coverage and for equal access to good quality medical care for everyone, and for programs aimed at promoting healthy personal habits and self-care. In this regard, it also recommends that disease prevention programs be sustained over time, that national sanitary objectives be defined, and that its results be periodically reviewed. Fourth, it recommends that primary health care be extended to everyone, and that it be enhanced by improving coverage and coordination with secondary and tertiary level health care institutions. The statement lays special stress on the need for adopting public health policies aimed at lowering the cost of medicines; to this end, it calls for the creation of an official list of generic drugs. 
The statement ends by calling on governments to support public health research as a necessary step in tackling with greater efficiency the health problems still prevalent in the region.
Multi-Year Program Plan FY'09-FY'15 Solid-State Lighting Research and Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2009-03-01
President Obama's energy and environment agenda calls for deployment of 'the Cheapest, Cleanest, Fastest Energy Source - Energy Efficiency.' The Department of Energy's (DOE) Office of Energy Efficiency and Renewable Energy (EERE) plays a critical role in advancing the President's agenda by helping the United States advance toward an energy-efficient future. Lighting in the United States is projected to consume nearly 10 quads of primary energy by 2012. A nation-wide move toward solid-state lighting (SSL) for general illumination could save a total of 32.5 quads of primary energy between 2012 and 2027. No other lighting technology offers the DOE and our nation so much potential to save energy and enhance the quality of our built environment. The DOE has set forth the following mission statement for the SSL R&D Portfolio: Guided by a Government-industry partnership, the mission is to create a new, U.S.-led market for high-efficiency, general illumination products through the advancement of semiconductor technologies, to save energy, reduce costs and enhance the quality of the lighted environment.
BLESS 2: accurate, memory-efficient and fast error correction method.
Heo, Yun; Ramachandran, Anand; Hwu, Wen-Mei; Ma, Jian; Chen, Deming
2016-08-01
The most important features of error correction tools for sequencing data are accuracy, memory efficiency and fast runtime. The previous version of BLESS was highly memory-efficient and accurate, but it was too slow to handle reads from large genomes. We have developed a new version of BLESS to improve runtime and accuracy while maintaining a small memory usage. The new version, called BLESS 2, has an error correction algorithm that is more accurate than BLESS, and the algorithm has been parallelized using hybrid MPI and OpenMP programming. BLESS 2 was compared with five top-performing tools, and it was found to be the fastest when it was executed on two computing nodes using MPI, with each node containing twelve cores. Also, BLESS 2 showed at least 11% higher gain while retaining the memory efficiency of the previous version for large genomes. Freely available at https://sourceforge.net/projects/bless-ec. Contact: dchen@illinois.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
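The k-mer-spectrum principle behind tools of this family can be sketched as follows (a generic illustration of the idea, not BLESS 2's actual algorithm): k-mers seen fewer times than a solidity threshold are assumed to contain errors, and a read is corrected by finding the single-base substitution that makes every overlapping k-mer solid.

```python
from collections import Counter

def correct_read(read, kmer_counts, k, solid=2):
    """Correct at most one substitution error in a read using solid k-mers.

    A k-mer is 'weak' if it occurs fewer than `solid` times across all reads;
    a read containing any weak k-mer is presumed erroneous.
    """
    def weak(r):
        return any(kmer_counts[r[i:i + k]] < solid for i in range(len(r) - k + 1))

    if not weak(read):
        return read                      # already consistent with the spectrum
    for i in range(len(read)):
        for base in "ACGT":
            if base == read[i]:
                continue
            candidate = read[:i] + base + read[i + 1:]
            if not weak(candidate):      # every overlapping k-mer is now solid
                return candidate
    return read                          # not correctable with one substitution

# Toy data set: five identical reads plus one with an error at position 3.
reads = ["ACGTACGT"] * 5 + ["ACGAACGT"]
counts = Counter(r[i:i + 3] for r in reads for i in range(len(r) - 2))
fixed = correct_read("ACGAACGT", counts, k=3)
```

Counting k-mers is the memory bottleneck this tool family targets (BLESS uses Bloom-filter-based structures for this), while the correction pass is what MPI/OpenMP parallelization accelerates.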
Melzer, S M; Poole, S R
1999-08-01
To describe the operating characteristics, financial performance, and perceived value of computerized children's hospital-based telephone triage and advice (TTA) programs. A written survey of all 32 children's hospital-based TTA programs in the United States that used the same proprietary pediatric TTA software product for at least 6 months. The expense, revenues, and perceived value of children's hospital-based TTA programs. Of 30 programs (94%) responding, 27 (90%) were eligible for the study and reported on their experience with nearly 1.3 million TTA calls over a 12-month period. Programs provided pediatric TTA services for 1560 physicians, serving an average of 82 physicians (range, 10-340 physicians) and answering 38880 calls (range, 8500-140000 calls) annually. The mean call duration was 11.3 minutes and the estimated mean total expense per call was $12.45. Of programs charging fees for TTA services, 16 (59%) used a per-call fee and 7 (26%) used a monthly service fee. All respondents indicated that fees did not cover all associated costs. Telephone triage and advice programs, when examined on a stand-alone basis, were all operating with annual deficits (mean, $447000; median, $325000; range, $74000-$1.3 million), supported by the sponsoring children's hospitals and their companion programs. Using a 3-point Likert scale, the TTA program managers rated the value of the TTA program very highly as a mechanism for marketing to physicians (2.85) and increasing physician (2.92) and patient (2.80) satisfaction. Children's hospital-based TTA programs operate at substantial financial deficits. Ongoing support of these programs may derive from the perception that they are a valuable mechanism for marketing and increase patient and physician satisfaction. Children's hospitals should develop strategies to ensure the long-term financial viability of TTA programs or they may have to discontinue these services.
Ye, Congting; Ji, Guoli; Li, Lei; Liang, Chun
2014-01-01
Inverted repeats are abundant in both prokaryotic and eukaryotic genomes and can form DNA secondary structures (hairpins and cruciforms) that are involved in many important biological processes. Bioinformatics tools for efficient and accurate detection of inverted repeats are desirable, because existing tools are often less accurate, time consuming, and sometimes incapable of handling genome-scale input data. Here, we present a MATLAB-based program called detectIR for perfect and imperfect inverted repeat detection that utilizes complex numbers and vector calculation and allows genome-scale data inputs. A novel algorithm is adopted in detectIR to convert the conventional sequence string comparison in inverted repeat detection into vector calculation over complex numbers, allowing non-complementary pairs (mismatches) in the pairing stem and a non-palindromic spacer (loop or gaps) in the middle of inverted repeats. Compared with existing popular tools, our program performs with significantly higher accuracy and efficiency. Using genome sequence data from HIV-1, Arabidopsis thaliana, Homo sapiens and Zea mays for comparison, detectIR finds many inverted repeats missed by existing tools, whose outputs often contain invalid cases. detectIR is open source and its source code is freely available at: https://sourceforge.net/projects/detectir.
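The complex-number trick described in this abstract can be illustrated with a short sketch (the function names below are illustrative, not the detectIR API): complementary bases are encoded as complex values that sum to zero, so comparing an inverted-repeat stem against its partner reduces to elementwise addition instead of string matching.

```python
# Hedged sketch of the complex-number encoding idea: complementary bases
# map to complex numbers whose sum is zero (A=1, T=-1, C=1j, G=-1j), so a
# base pair is complementary iff the encoded values cancel.
ENC = {"A": 1 + 0j, "T": -1 + 0j, "C": 1j, "G": -1j}

def stem_mismatches(left_arm, right_arm):
    """Count non-complementary pairs between left_arm and reversed right_arm."""
    pairs = zip(left_arm, reversed(right_arm))
    return sum(1 for a, b in pairs if ENC[a] + ENC[b] != 0)

def is_inverted_repeat(seq, max_mismatches=0):
    """True if seq forms a (near-)perfect inverted repeat; the middle base
    of an odd-length sequence is treated as the loop and ignored."""
    half = len(seq) // 2
    return stem_mismatches(seq[:half], seq[-half:]) <= max_mismatches

print(is_inverted_repeat("GAATTC"))  # True: the EcoRI site is a palindrome
```

Allowing `max_mismatches > 0` corresponds to the imperfect (mismatch-tolerant) detection mode the abstract mentions; a non-palindromic spacer would be handled by skipping more central bases before comparing the arms.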
NASA Astrophysics Data System (ADS)
Work, Paul R.
1991-12-01
This thesis investigates the parallelization of existing serial programs in computational electromagnetics for use in a parallel environment. Existing algorithms for calculating the radar cross section of an object are covered, and a ray-tracing code is chosen for implementation on a parallel machine. Current parallel architectures are introduced and a suitable parallel machine is selected for the implementation of the chosen ray-tracing algorithm. The standard techniques for the parallelization of serial codes are discussed, including load balancing and decomposition considerations, and appropriate methods for the parallelization effort are selected. A load balancing algorithm is modified to increase the efficiency of the application, and a high level design of the structure of the serial program is presented. A detailed design of the modifications for the parallel implementation is also included, with both the high level and the detailed design specified in a high level design language called UNITY. The correctness of the design is proven using UNITY and standard logic operations. The theoretical and empirical results show that it is possible to achieve an efficient parallel application for a serial computational electromagnetic program where the characteristics of the algorithm and the target architecture critically influence the development of such an implementation.
Parsons, M B; Reid, D H; Green, C W
1996-01-01
Shortcomings in the technology for training support staff in methods of teaching people with severe disabilities have recently resulted in calls to improve the technology. We evaluated a program for training basic teaching skills within one day. The program entailed classroom-based verbal and video instruction, practice, and feedback, followed by on-the-job feedback. In Study 1, four undergraduate interns participated in the program, and all four met the mastery criterion for teaching skills. Three teacher aides participated in Study 2, with results indicating that when the staff applied their newly acquired teaching skills, students with profound disabilities made progress in skill acquisition. Clinical replications occurred in Study 3, involving 17 staff in school classrooms, group homes, and an institution. Results of Studies 2 and 3 also indicated that staff were accepting of the program and improved their verbal skills. Results are discussed regarding the advantages of training staff in one day. Future research suggestions are offered, focusing on identifying means of rapidly training other teaching skills in order to develop the most effective, acceptable, and efficient technology for staff training.
There may be a new, more effective method for treating high-risk neuroblastoma, according to scientists at the Children’s Hospital of Philadelphia and collaborators in the Cancer and Inflammation Program at NCI at Frederick. Together, the groups published a study describing a previously unrecognized protein on neuroblastoma cells, called GPC2, as well as the creation of a novel antibody-drug conjugate, a combination of a human antibody and a naturally occurring anticancer drug, that locates and binds to GPC2 in a highly efficient way.
Portable parallel portfolio optimization in the Aurora Financial Management System
NASA Astrophysics Data System (ADS)
Laure, Erwin; Moritsch, Hans
2001-07-01
Financial planning problems are formulated as large-scale, stochastic, multiperiod, tree-structured optimization problems. An efficient technique for solving this kind of problem is the nested Benders decomposition method. In this paper we present a parallel, portable, asynchronous implementation of this technique. To achieve our portability goals we chose the programming language Java for our implementation and used a high-level Java-based framework, called OpusJava, for expressing the parallelism potential as well as synchronization constraints. Our implementation is embedded within a modular decision support tool for portfolio and asset liability management, the Aurora Financial Management System.
Energy-efficient container handling using hybrid model predictive control
NASA Astrophysics Data System (ADS)
Xin, Jianbin; Negenborn, Rudy R.; Lodewijks, Gabriel
2015-11-01
The performance of container terminals needs to be improved to accommodate the growth in container volumes while maintaining sustainability. This paper provides a methodology for determining the trajectories of three key interacting machines that carry out the so-called bay handling task, which involves transporting containers between a vessel and the stacking area in an automated container terminal. The behaviours of the interacting machines are modelled as a collection of interconnected hybrid systems. Hybrid model predictive control (MPC) is proposed to achieve optimal performance, balancing handling capacity against energy consumption. The underlying control problem is hereby formulated as a mixed-integer linear programming problem. Simulation studies illustrate that a higher penalty on energy consumption indeed leads to improved sustainability through lower energy use. Moreover, simulations illustrate how the proposed energy-efficient hybrid MPC controller performs under different types of uncertainties.
Radar target classification studies: Software development and documentation
NASA Astrophysics Data System (ADS)
Kamis, A.; Garber, F.; Walton, E.
1985-09-01
Three computer programs were developed to process and analyze calibrated radar returns. The first program, called DATABASE, was developed to create and manage a random-access data base. The second program, called FTRAN DB, was developed to process horizontally and vertically polarized radar returns into different formats (i.e., time domain, circular polarizations, and polarization parameters). The third program, called RSSE, was developed to simulate a variety of radar systems and to evaluate their ability to identify radar returns. Complete computer listings are included in the appendix volumes.
Rauber, R.; Manser, M. B.
2017-01-01
Sentinel behaviour, a form of coordinated vigilance, occurs in a limited range of species, mostly in cooperative breeders. In some species sentinels confirm their presence vocally by giving a single sentinel call type, whereby the rate and subtle acoustic changes provide graded information on the variation of perceived predation risk. In contrast, meerkat (Suricata suricatta) sentinels produce six different sentinel call types. Here we show that manipulation of perception of danger has different effects on the likelihood of emitting these different call types, and that these call types affect foraging individuals differently. Increasing the perceived predation risk by playing back alarm calls decreased the production rate of the common short note calls and increased the production rate of the rare long calls. Playbacks of short note calls increased foraging behaviour and decreased vigilance in the rest of the group, whereas the opposite was observed when playing long calls. This suggests that the common call types act as all-clear signals, while the rare call types have a warning function. Therefore, meerkats increase the efficiency of their sentinel system by producing several discrete call types that represent changes in predation risk and lead to adjustments of the group’s vigilance behaviour. PMID:28303964
Agwu, Allison L; Lee, Carlton K K; Jain, Sanjay K; Murray, Kara L; Topolski, Jason; Miller, Robert E; Townsend, Timothy; Lehmann, Christoph U
2008-09-15
Antimicrobial stewardship programs aim to reduce inappropriate hospital antimicrobial use. At the Johns Hopkins Children's Medical and Surgical Center (Baltimore, MD), we implemented a World Wide Web-based antimicrobial restriction program to address problems with the existing restriction program. A user survey identified opportunities for improvement of an existing antimicrobial restriction program and resulted in subsequent design, implementation, and evaluation of a World Wide Web-based antimicrobial restriction program at a 175-bed, tertiary care pediatric teaching hospital. The program provided automated clinical decision support, facilitated approval, and enhanced real-time communication among prescribers, pharmacists, and pediatric infectious diseases fellows. Approval status, duration, and rationale; missing request notifications; and expiring approvals were stored in a database that is accessible via a secure Intranet site. Before and after implementation of the program, user satisfaction, reports of missed and/or delayed doses, antimicrobial dispensing times, and cost were evaluated. After implementation of the program, there was a $370,069 reduction in projected annual cost associated with restricted antimicrobial use and an 11.6% reduction in the number of dispensed doses. User satisfaction increased from 22% to 68% and from 13% to 69% among prescribers and pharmacists, respectively. There were 21% and 32% reductions in the number of prescriber reports of missed and delayed doses, respectively, and there was a 37% reduction in the number of pharmacist reports of delayed approvals; measured dispensing times were unchanged (P = .24). In addition, 40% fewer restricted antimicrobial-related phone calls were noted by the pharmacy. The World Wide Web-based antimicrobial approval program led to improved communication, more-efficient antimicrobial administration, increased user satisfaction, and significant cost savings. 
Integrated tools, such as this World Wide Web-based antimicrobial approval program, will effectively enhance antimicrobial stewardship programs.
ERIC Educational Resources Information Center
Ali, Azad; Smith, David
2014-01-01
This paper presents a debate between two faculty members regarding the teaching of the legacy programming course (COBOL) in a Computer Science (CS) program. Among the two faculty members, one calls for the continuation of teaching this language and the other calls for replacing it with another modern language. Although CS programs are notorious…
The efficiency of asset management strategies to reduce urban flood risk.
ten Veldhuis, J A E; Clemens, F H L R
2011-01-01
In this study, three asset management strategies were compared with respect to their efficiency in reducing flood risk. Data from call centres at two municipalities were used to quantify urban flood risks associated with three causes of urban flooding: gully pot blockage, sewer pipe blockage and sewer overloading. The efficiency of three flood reduction strategies was assessed based on their effect on the causes contributing to flood risk. The sensitivity of the results to uncertainty in the data source, citizens' calls, was analysed by incorporating uncertainty ranges taken from the customer complaint literature. Based on the available data it could be shown that increasing the frequency of gully pot cleaning is the most efficient action to reduce flood risk, given data uncertainty. If differences between cause incidences are large, as in the presented case study, call data are sufficient to decide how flood risk can be most efficiently reduced. According to the results of this analysis, enlargement of sewer pipes is not an efficient strategy to reduce flood risk, because the flood risk associated with sewer overloading is small compared with that of other failure mechanisms.
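The comparison described here can be sketched in a few lines (a hypothetical illustration, not the authors' model: all incident counts and strategy effects below are made-up numbers): per-cause flood risk is estimated from call-centre incident counts, and each strategy is scored by the risk it removes.

```python
# Hypothetical sketch of ranking flood-reduction strategies by the number
# of call-reported incidents each one is assumed to prevent per year.
calls_per_cause = {              # citizen-reported incidents per year (made up)
    "gully_pot_blockage": 420,
    "sewer_pipe_blockage": 180,
    "sewer_overloading": 35,
}

# fraction of each cause's incidents a strategy is assumed to prevent
strategy_effect = {
    "more_gully_pot_cleaning": {"gully_pot_blockage": 0.6},
    "pipe_maintenance": {"sewer_pipe_blockage": 0.5},
    "pipe_enlargement": {"sewer_overloading": 0.8},
}

def risk_reduction(strategy):
    """Expected number of flood incidents avoided per year."""
    return sum(calls_per_cause[cause] * fraction
               for cause, fraction in strategy_effect[strategy].items())

ranking = sorted(strategy_effect, key=risk_reduction, reverse=True)
print(ranking[0])  # with these numbers the gully pot strategy dominates
```

With the large incidence gap assumed above, the ranking is insensitive to moderate uncertainty in the counts, which mirrors the paper's point that call data suffice when differences between cause incidences are large.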
Extending green technology innovations to enable greener fabs
NASA Astrophysics Data System (ADS)
Takahisa, Kenji; Yoo, Young Sun; Fukuda, Hitomi; Minegishi, Yuji; Enami, Tatsuo
2015-03-01
The semiconductor manufacturing industry has growing concerns over future environmental impacts as fabs expand and new generations of equipment become more powerful. In particular, the supply and price of rare gases are among the prime concerns for the operation of high-volume manufacturing (HVM) fabs. Over the past year it has come to our attention that helium and neon gas supplies could be unstable and become a threat to HVM fabs. To address these concerns, Gigaphoton has implemented various green technologies under its EcoPhoton program. One of the initiatives is the GigaTwin deep ultraviolet (DUV) lithography laser design, which enables highly efficient and stable operation. Under this design, laser systems run with 50% less electric energy and gas consumption compared to conventional laser designs. In 2014 we developed two technologies to further improve electric energy and gas efficiency. The electric energy reduction technology, called eGRYCOS (enhanced Gigaphoton Recycled Chamber Operation System), reduces electric energy by 15% without compromising any laser performance. The eGRYCOS system has a sophisticated gas flow design that allows the cross-flow-fan rotation speed to be reduced. The gas reduction technology, called eTGM (enhanced Total Gas Manager), improves the gas management system by optimizing the gas injection and exhaust amounts based on laser performance, resulting in 50% gas savings. The next steps in our technology roadmap are indicated, and we call for potential partners to work with us based on the OPEN INNOVATION concept to develop faster and better solutions in all possible areas where green innovation may exist.
Clouds of different colors: A prospective look at head and neck surgical resident call experience.
Melzer, Jonathan
2017-12-01
Graduate medical education programs typically set up call under the assumption that residents will have similar experiences. The terms black cloud and white cloud have frequently been used to describe residents with more difficult (black) or less difficult (white) call experiences. This study followed residents in the department of head and neck surgery during call to determine whether certain residents have a significantly different call experience than the norm. It is a prospective observational study conducted over 16 months in a tertiary care center with a resident training program in otolaryngology. Resident call data on total pages, consults, and operative interventions were examined, as well as subjective survey data about sleep and perceived difficulty of resident call. Analysis showed no significant difference in call activity (pages, consults, operative interventions) among residents. However, data from the resident call surveys revealed perceived disparities in call difficulty that were significant. Two residents were clearly labeled as black clouds compared to the rest. These residents did not have the highest average number of pages, consults, or operative interventions. This study suggests that factors affecting call perception are outside the objective, absolute workload. These results may be used to improve resident education on sleep training and nighttime patient management in the field of otolaryngology and may influence otolaryngology residency programs.
75 FR 25255 - Structure and Practices of the Video Relay Service Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-07
... Video Relay Service Program AGENCY: Federal Communications Commission. ACTION: Notice. SUMMARY: In this... compensability from the Interstate TRS Fund (Fund) of certain types of calls made through Video Relay Service... CA, after the VRS user has initiated the video call to the CA, call back the VRS user on a voice...
Jamaican Call-In Radio: A Uses and Gratification Analysis.
ERIC Educational Resources Information Center
Surlin, Stuart H.
Noting that radio call-in programs seem to contain the elements for active audience involvement and participation, a study was conducted to examine the hypothesis that information gain and surveillance are the primary gratifications sought through call-in radio programs, especially in a culture that has a strong oral tradition and relatively few…
49 CFR 198.35 - Grants conditioned on adoption of one-call damage prevention program.
Code of Federal Regulations, 2010 CFR
2010-10-01
... considers whether a State has adopted or is seeking to adopt a one-call damage prevention program in accordance with § 198.37. If a State has not adopted or is not seeking to adopt such program, the State...
Instruction-matrix-based genetic programming.
Li, Gang; Wang, Jin Feng; Lee, Kin Hong; Leung, Kwong-Sak
2008-08-01
In genetic programming (GP), evolving tree nodes separately would reduce the huge solution space. However, tree nodes are highly interdependent with respect to their fitness. In this paper, we propose a new GP framework, namely, instruction-matrix (IM)-based GP (IMGP), to handle their interactions. IMGP maintains an IM to evolve tree nodes and subtrees separately. IMGP extracts program trees from an IM and updates the IM with the information of the extracted program trees. Because the IM keeps most of the information about the schemata of GP and evolves the schemata directly, IMGP is effective and efficient. Our experimental results on benchmark problems verify that IMGP is not only better than canonical GP in terms of solution quality and the number of program evaluations, but also better than some related GP algorithms. IMGP can also be used to evolve programs for classification problems. The classifiers obtained have higher classification accuracies than four other GP classification algorithms on four benchmark classification problems. The testing errors are also comparable to or better than those obtained with well-known classifiers. Furthermore, an extended version, called condition matrix for rule learning, has been used successfully to handle multiclass classification problems.
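The instruction-matrix idea can be sketched as a toy (a loose reconstruction under stated assumptions, not the authors' algorithm): each node position of a fixed-shape expression tree keeps a weight per instruction, trees are sampled from the matrix, and the matrix entries of the best tree found so far are reinforced, so good node-level choices accumulate weight independently of any one tree.

```python
import math
import random

# Toy instruction-matrix GP for symbolic regression of x*x + x on a full
# binary tree of depth 2 (positions 0-2 are operators, 3-6 are leaves).
# All constants and the update rule are illustrative choices.
OPS = ["+", "-", "*"]
TERMS = ["x", "1"]
INTERNAL, LEAVES = 3, 4

def sample_tree(im):
    """Draw one instruction per node position, weighted by the matrix row."""
    return [random.choices(insts, weights=row)[0] for insts, row in im]

def evaluate(tree, x):
    def leaf(i):
        return x if tree[i] == "x" else 1.0
    def apply(op, a, b):
        return a + b if op == "+" else a - b if op == "-" else a * b
    left = apply(tree[1], leaf(3), leaf(4))
    right = apply(tree[2], leaf(5), leaf(6))
    return apply(tree[0], left, right)

def error(tree):
    """Sum of squared errors against the target x*x + x on a small grid."""
    return sum((evaluate(tree, x) - (x * x + x)) ** 2 for x in range(-3, 4))

random.seed(0)
im = [(OPS, [1.0] * len(OPS)) for _ in range(INTERNAL)] + \
     [(TERMS, [1.0] * len(TERMS)) for _ in range(LEAVES)]
best, best_err = None, math.inf
for _ in range(300):
    tree = sample_tree(im)
    err = error(tree)
    if err < best_err:
        best, best_err = tree, err
    # reinforce the matrix entries of the best-so-far tree
    for (insts, row), inst in zip(im, best):
        row[insts.index(inst)] += 0.1
```

Because weights are updated per node position rather than per whole tree, the sketch evolves node choices (schemata) directly, which is the core contrast with canonical GP's whole-tree crossover.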
NASA Technical Reports Server (NTRS)
Prahst, Patricia S.; Kulkarni, Sameer; Sohn, Ki H.
2015-01-01
NASA's Environmentally Responsible Aviation (ERA) Program calls for investigation of the technology barriers associated with improved fuel efficiency of large gas turbine engines. Under ERA, the task for a High Pressure Ratio Core Technology program calls for a higher overall pressure ratio of 60 to 70. This means that the HPC would have to almost double in pressure ratio while keeping its high level of efficiency. The challenge is how to match the corrected mass flow rate of the front two supersonic, high-reaction, high corrected tip speed stages with a total pressure ratio of 3.5. NASA and GE teamed to address this challenge by using the initial geometry of an advanced GE compressor design to meet the requirements of the first two stages of the very high pressure ratio core compressor. The rig was configured to run as a two-stage machine: the strut and IGV, Rotor 1, and Stator 1 were run as independent tests, followed by the addition of the second stage. The goal is to fully understand the stage performances under isolated and multistage conditions, fully understand any differences, and provide a detailed aerodynamic data set for CFD validation. Full use was made of steady and unsteady measurement methods to isolate fluid dynamic loss source mechanisms due to interaction and endwalls. The paper presents a description of the compressor test article, its predicted performance and operability, and the experimental results for both the single-stage and two-stage configurations. We focus the detailed measurements on 97% and 100% of design speed at three vane setting angles.
Deep Space Systems Technology Program Future Deliveries
NASA Technical Reports Server (NTRS)
Salvo, Christopher G.; Keuneke, Matthew S.
2000-01-01
NASA is in a period of frequent launches of low-cost deep space missions with challenging performance needs. The modest budgets of these missions make it impossible for each to develop its own technology; therefore, efficient and effective development and insertion of technology for these missions must be approached at a higher level than has been done in the past. The Deep Space Systems Technology Program (DSST), often referred to as X2000, has been formed to address this need. The program is divided into a series of "Deliveries" that develop and demonstrate a set of spacecraft system capabilities with broad applicability for use by multiple missions. The First Delivery Project, to be completed in 2001, will provide a 1 Mrad-tolerant flight computer, power switching electronics, an efficient radioisotope power source, and a transponder with services in the 8.4 GHz and 32 GHz bands. Plans call for a Second Delivery in late 2003 to enable complete deep space systems in the 10 to 50 kg class, and a Third Delivery built around systems on a chip (extreme levels of electronic and microsystems integration) around 2006. Formulation of Future Deliveries (past the First Delivery) is ongoing and includes plans for such developments as highly miniaturized digital/analog/power electronics, optical communications, multifunctional structures, miniature lightweight propulsion, advanced thermal control techniques, highly efficient radioisotope power sources, and a unified flight-ground software architecture to support the needs of future highly intelligent space systems. All developments are targeted at broad applicability and reuse, and will be commercialized within the US.
Impact of the FY 2009 Building Technologies Program on United States Employment and Earned Income
DOE Office of Scientific and Technical Information (OSTI.GOV)
Livingston, Olga V.; Scott, Michael J.; Hostick, Donna J.
2008-06-17
The Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy (EERE) is interested in assessing the potential economic impacts of its portfolio of subprograms on national employment and income. A special-purpose input-output model called ImSET is used in this study of 14 Building Technologies Program subprograms in the EERE final FY 2009 budget request to the Office of Management and Budget in February 2008. Energy savings, investments, and impacts on U.S. national employment and earned income are reported by subprogram for selected years to the year 2025. Energy savings and investments from these subprograms have the potential of creating a total of 258,000 jobs and about $3.7 billion in earned income (2007$) by the year 2025.
Outsourcing an Effective Postdischarge Call Program
Meek, Kevin L.; Williams, Paula; Unterschuetz, Caryn J.
2018-01-01
To improve patient satisfaction ratings and decrease readmissions, many organizations utilize internal staff to complete postdischarge calls to recently released patients. Developing, implementing, monitoring, and sustaining an effective call program can be challenging and have eluded some of the renowned medical centers in the country. By collaborating with an outsourced vendor that brings state-of-the-art call technology and specially trained callers, health systems can achieve elevated levels of engagement and satisfaction for their patients postdischarge. PMID:29494453
Java-based Graphical User Interface for MAVERIC-II
NASA Technical Reports Server (NTRS)
Seo, Suk Jai
2005-01-01
A computer program entitled "Marshall Aerospace Vehicle Representation in C II (MAVERIC-II)" is a vehicle flight simulation program written primarily in the C programming language. It was written by James W. McCarter at NASA/Marshall Space Flight Center. The goal of the MAVERIC-II development effort is to provide a simulation tool that facilitates the rapid development of high-fidelity flight simulations for launch, orbital, and reentry vehicles of any user-defined configuration for all phases of flight. MAVERIC-II has been found invaluable in performing flight simulations for various Space Transportation Systems. The flexibility provided by MAVERIC-II has allowed several different launch vehicles, including the Saturn V, a Space Launch Initiative Two-Stage-to-Orbit concept, and a Shuttle-derived launch vehicle, to be simulated during ascent and portions of on-orbit flight in an extremely efficient manner. It was found that MAVERIC-II provided the high-fidelity vehicle and flight environment models as well as the program modularity needed to allow efficient integration, modification, and testing of advanced guidance and control algorithms. In addition to serving as an analysis tool for technology development, many researchers have found MAVERIC-II to be an efficient, powerful analysis tool for evaluating guidance, navigation, and control designs, vehicle robustness, and requirements. MAVERIC-II is currently designed to execute in a UNIX environment. The input to the program is composed of three segments: 1) the vehicle models, such as propulsion, aerodynamics, and guidance, navigation, and control; 2) the environment models, such as atmosphere and gravity; and 3) a simulation framework, which is responsible for executing the vehicle and environment models, propagating the vehicle's states forward in time, and handling user input/output. MAVERIC users prepare data files for the above models and run the simulation program.
They can view the output on screen and/or store it in files for later examination. Users can also view the output stored in output files by calling a plotting program such as gnuplot. A typical scenario for the use of MAVERIC consists of three steps: editing existing input data files, running MAVERIC, and plotting the output results.
Bekelman, Justin E.; Deye, James A.; Vikram, Bhadrasain; Bentzen, Soren M.; Bruner, Deborah; Curran, Walter J.; Dignam, James; Efstathiou, Jason A.; FitzGerald, T. J.; Hurkmans, Coen; Ibbott, Geoffrey S.; Lee, J. Jack; Merchant, Timothy E.; Michalski, Jeff; Palta, Jatinder R.; Simon, Richard; Ten Haken, Randal K.; Timmerman, Robert; Tunis, Sean; Coleman, C. Norman; Purdy, James
2012-01-01
Background: In the context of national calls for reorganizing cancer clinical trials, the National Cancer Institute (NCI) sponsored a two-day workshop to examine the challenges and opportunities for optimizing radiotherapy quality assurance (QA) in clinical trial design. Methods: Participants reviewed the current processes of clinical trial QA and noted the QA challenges presented by advanced technologies. Lessons learned from the radiotherapy QA programs of recent trials were discussed in detail. Four potential opportunities for optimizing radiotherapy QA were explored, including the use of normal tissue toxicity and tumor control metrics, biomarkers of radiation toxicity, new radiotherapy modalities like proton beam therapy, and the international harmonization of clinical trial QA. Results: Four recommendations were made: 1) Develop a tiered (and more efficient) system for radiotherapy QA and tailor the intensity of QA to clinical trial objectives; tiers include (i) general credentialing, (ii) trial-specific credentialing, and (iii) individual case review; 2) Establish a case QA repository; 3) Develop an evidence base for clinical trial QA and introduce innovative prospective trial designs to evaluate radiotherapy QA in clinical trials; and 4) Explore the feasibility of consolidating clinical trial QA in the United States. Conclusion: Radiotherapy QA may impact clinical trial accrual, cost, outcomes and generalizability. To achieve maximum benefit, QA programs must become more efficient and evidence-based. PMID:22425219
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-25
... UNITED STATES INSTITUTE OF PEACE Call for Proposals for a Micro Support Program on International Conflict Resolution and Peacebuilding For Immediate Release AGENCY: United States Institute of Peace. ACTION: Notice. SUMMARY: Micro Support Program on International Conflict Resolution and Peacebuilding...
ERIC Educational Resources Information Center
Fukuzawa, Jeannette L.; Lubin, Jan M.
Five computer programs for the Macintosh that are geared for Computer-Assisted Language Learning (CALL) are described. All five programs allow the teacher to input material. The first program allows entry of new vocabulary lists including definition, a sentence in which the exact word is used, a fill-in-the-blank exercise, and the word's phonetics…
NASA Astrophysics Data System (ADS)
Pochampally, Kishore K.; Gupta, Surendra M.; Kamarthi, Sagar V.
2004-02-01
Although there are many quantitative models in the literature for designing a reverse supply chain, every model assumes that all the recovery facilities engaged in the supply chain have enough potential to efficiently re-process the incoming used products. Motivated by the risk of re-processing used products in facilities with insufficient potential, this paper proposes a method to identify potential facilities in a set of candidate recovery facilities operating in a region where a reverse supply chain is to be established. The problem is solved using a newly developed method called physical programming. The most significant advantage of using physical programming is that it allows a decision maker to express his preferences for the values of criteria (for comparing the alternatives) not in the traditional form of weights but in terms of ranges of different degrees of desirability, such as ideal, desirable, tolerable, undesirable, and unacceptable ranges. A numerical example is considered to illustrate the proposed method.
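The range-based preference idea can be made concrete with a small sketch (a minimal illustration of the concept, not the paper's full physical-programming formulation; the class names, breakpoints, and candidate values below are all hypothetical): each criterion gets breakpoints separating degrees of desirability, and candidates are compared by the worst class any of their criterion values falls into.

```python
# Hedged sketch of range-based preferences for smaller-is-better criteria.
# A candidate's overall standing is its worst desirability class.
CLASSES = ["ideal", "desirable", "tolerable", "undesirable", "unacceptable"]

def classify(value, breakpoints):
    """Return the desirability class of a criterion value; `breakpoints`
    holds the upper limits of the first four classes, in order."""
    for cls, limit in zip(CLASSES, breakpoints):
        if value <= limit:
            return cls
    return CLASSES[-1]

def rank_candidates(candidates, criteria_breakpoints):
    """Order (name, values) pairs by their worst class across criteria."""
    def worst(values):
        return max(CLASSES.index(classify(v, bp))
                   for v, bp in zip(values, criteria_breakpoints))
    return sorted(candidates, key=lambda kv: worst(kv[1]))

# example: two candidate facilities scored on two criteria (made-up data)
breakpoints = [[0.2, 0.4, 0.6, 0.8],    # criterion 1 class limits
               [0.25, 0.5, 0.75, 1.0]]  # criterion 2 class limits
candidates = [("site_B", [0.9, 0.2]), ("site_A", [0.1, 0.2])]
print(rank_candidates(candidates, breakpoints))  # site_A ranks first
```

Note how the decision maker's input is a set of value ranges per criterion rather than a weight vector, which is the distinction the abstract emphasizes.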
Lamouche, Florian; Gully, Djamel; Chaumeret, Anaïs; Nouwen, Nico; Verly, Camille; Pierre, Olivier; Sciallano, Coline; Fardoux, Joël; Jeudy, Christian; Szücs, Attila; Mondy, Samuel; Salon, Christophe; Nagy, István; Kereszt, Attila; Dessaux, Yves; Giraud, Eric; Mergaert, Peter; Alunni, Benoit
2018-06-19
To circumvent the paucity of nitrogen sources in the soil, legume plants establish a symbiotic interaction with nitrogen-fixing soil bacteria called rhizobia. During symbiosis, the plants form root organs called nodules, where bacteria are housed intracellularly and become active nitrogen fixers known as bacteroids. Depending on their host plant, bacteroids can adopt different morphotypes, being either unmodified (U), elongated (E) or spherical (S). E- and S-type bacteroids undergo a terminal differentiation leading to irreversible morphological changes and DNA endoreduplication. Previous studies suggest that differentiated bacteroids display an increased symbiotic efficiency (E>U and S>U). In this study, we used a combination of Aeschynomene species inducing E- or S-type bacteroids in symbiosis with Bradyrhizobium sp. ORS285 to show that S-type bacteroids present a better symbiotic efficiency than E-type bacteroids. We performed a transcriptomic analysis on E- and S-type bacteroids formed by Aeschynomene afraspera and Aeschynomene indica nodules and identified the bacterial functions activated in bacteroids and specific to each bacteroid type. Extending the expression analysis in E- and S-type bacteroids in other Aeschynomene species by qRT-PCR on selected genes from the transcriptome analysis narrowed down the set of bacteroid morphotype-specific genes. Functional analysis of a selected subset of 31 bacteroid-induced or morphotype-specific genes revealed no symbiotic phenotypes in the mutants. This highlights the robustness of the symbiotic program but could also indicate that the bacterial response to the plant environment is partially anticipatory or even maladaptive. Our analysis confirms the correlation between differentiation and efficiency of the bacteroids and provides a framework for the identification of bacterial functions that affect the efficiency of bacteroids. This article is protected by copyright. All rights reserved.
© 2018 Society for Applied Microbiology and John Wiley & Sons Ltd.
ACFA 2020 - An FP7 project on active control of flexible fuel efficient aircraft configurations
NASA Astrophysics Data System (ADS)
Maier, R.
2013-12-01
This paper gives an overview of the ACFA 2020 project, which is funded by the European Commission within the 7th Framework Programme. The acronym ACFA 2020 stands for Active Control for Flexible Aircraft 2020. The project deals with the design of highly fuel-efficient aircraft configurations and, in particular, with innovative active control concepts aimed at reducing loads and structural weight. The major focus lies on blended wing body (BWB) aircraft. Blended-wing-body configurations are seen as the most promising future concept for fulfilling the so-called ACARE (Advisory Council for Aeronautics Research in Europe) Vision 2020 goals with regard to reducing fuel consumption and external noise. The paper discusses in some detail the overall goals and how they are addressed in the work plan. Furthermore, the major achievements of the project are outlined and a short outlook on the remaining work is given.
Evaluation of 3-D graphics software: A case study
NASA Technical Reports Server (NTRS)
Lores, M. E.; Chasen, S. H.; Garner, J. M.
1984-01-01
An efficient 3-D geometry graphics software package which is suitable for advanced design studies was developed. The advanced design system is called GRADE--Graphics for Advanced Design. Efficiency and ease of use are gained by sacrificing flexibility in surface representation. The immediate options were either to continue development of GRADE or to acquire a commercially available system which would replace or complement GRADE. Test cases which would reveal the ability of each system to satisfy the requirements were developed. A scoring method which adequately captured the relative capabilities of the three systems was presented. While more complex multi-attribute decision methods could be used, the selected method provides all the needed information without being so complex that it is difficult to understand. If the value factors are modestly perturbed, system Z is a clear winner based on its overall capabilities. System Z is superior in two vital areas: surfacing and ease of interface with application programs.
NASA Astrophysics Data System (ADS)
Peng, Heng; Liu, Yinghua; Chen, Haofeng
2018-05-01
In this paper, a novel direct method called the stress compensation method (SCM) is proposed for limit and shakedown analysis of large-scale elastoplastic structures. Without needing to solve a specific mathematical programming problem, the SCM is a two-level iterative procedure based on a sequence of linear elastic finite element solutions in which the global stiffness matrix is decomposed only once. In the inner loop, the statically admissible residual stress field for shakedown analysis is constructed. In the outer loop, a series of decreasing load multipliers is updated to approach the shakedown limit multiplier by using an efficient and robust iteration control technique, where the static shakedown theorem is adopted. Three numerical examples with up to about 140,000 finite element nodes confirm the applicability and efficiency of this method for two-dimensional and three-dimensional elastoplastic structures, with detailed discussions on the convergence and accuracy of the proposed algorithm.
The drive for Aircraft Energy Efficiency
NASA Technical Reports Server (NTRS)
James, R. L., Jr.; Maddalon, D. V.
1984-01-01
NASA's Aircraft Energy Efficiency (ACEE) program, which began in 1976, has mounted a development effort in four major transport aircraft technology fields: laminar flow systems, advanced aerodynamics, flight controls, and composite structures. ACEE has explored two basic methods for achieving drag-reducing boundary layer laminarization: the use of suction through the wing structure (via slots or perforations) to remove boundary layer turbulence, and the encouragement of natural laminar flow maintenance through refined design practices. Wind tunnel tests have been conducted for wide bodied aircraft equipped with high aspect ratio supercritical wings and winglets. Maneuver load control and pitch-active stability augmentation control systems reduce fuel consumption by reducing the drag associated with high aircraft stability margins. Composite structures yield lighter airframes that in turn call for smaller wing and empennage areas, reducing induced drag for a given payload. In combination, all four areas of development are expected to yield a fuel consumption reduction of 40 percent.
NASA Technical Reports Server (NTRS)
Ramaswamy, Shankar; Banerjee, Prithviraj
1994-01-01
Appropriate data distribution has been found to be critical for obtaining good performance on Distributed Memory Multicomputers like the CM-5, Intel Paragon and IBM SP-1. It has also been found that some programs need to change their distributions during execution for better performance (redistribution). This work focuses on automatically generating efficient routines for redistribution. We present a new mathematical representation for regular distributions called PITFALLS and then discuss algorithms for redistribution based on this representation. One of the significant contributions of this work is being able to handle arbitrary source and target processor sets while performing redistribution. Another important contribution is the ability to handle an arbitrary number of dimensions for the array involved in the redistribution in a scalable manner. Our implementation of these techniques is based on an MPI-like communication library. The results presented show the low overheads for our redistribution algorithm as compared to naive runtime methods.
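The redistribution problem described above can be illustrated with a deliberately naive sketch: under a block-cyclic distribution, compute each processor's index set, then intersect source and target ownership to obtain the per-pair communication sets. All names here are hypothetical; the point of the PITFALLS representation is precisely to intersect families of index segments analytically instead of enumerating indices as this toy does.

```python
def owned_indices(n, nprocs, block, p):
    """Global indices of an n-element array owned by processor p
    under a block-cyclic distribution cyclic(block)."""
    owned = []
    start = p * block
    stride = nprocs * block          # one block per processor per cycle
    for seg_start in range(start, n, stride):
        owned.extend(range(seg_start, min(seg_start + block, n)))
    return owned

def transfer_sets(n, src_procs, src_block, dst_procs, dst_block):
    """For every (source, target) processor pair, the indices the source
    must send to the target: the intersection of their ownership sets."""
    sets = {}
    for ps in range(src_procs):
        src = set(owned_indices(n, src_procs, src_block, ps))
        for pd in range(dst_procs):
            dst = set(owned_indices(n, dst_procs, dst_block, pd))
            common = sorted(src & dst)
            if common:
                sets[(ps, pd)] = common
    return sets
```

For example, redistributing an 8-element array on 2 processors from cyclic(2) to cyclic(4) makes processor 0 keep indices 0-1 and send 4-5 to processor 1, and symmetrically for processor 1.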
NASA Technical Reports Server (NTRS)
1996-01-01
Under the Enabling Propulsion Materials (EPM) program - a partnership between NASA, Pratt & Whitney, and GE Aircraft Engines - the Materials and Structures Divisions of the NASA Lewis Research Center are involved in developing a fan-containment system for the High-Speed Civil Transport (HSCT). The program calls for a baseline system to be designed by the end of 1995, with subsequent testing of innovative concepts. Five metal candidate materials are currently being evaluated for the baseline system in the Structures Division's Ballistic Impact Facility. This facility was developed to provide the EPM program with cost-efficient and timely impact test data. At the facility, material specimens are impacted at speeds up to 350 m/sec by projectiles of various sizes and shapes to assess the specimens' ability to absorb energy and withstand impact. The tests can be conducted at either room or elevated temperatures. Posttest metallographic analysis is conducted to improve understanding of the failure modes. A dynamic finite element program is used to simulate the events and both guide the testing as well as aid in designing the fan-containment system.
State Roles in the Global Climate Change Issue.
NASA Astrophysics Data System (ADS)
Changnon, Stanley A.
1995-02-01
Events in 1988 helped focus the attention of several states on the global climate change issue. Consequently, the National Governors' Association conducted an assessment in 1989 and recommended various actions. By 1994, 22 states had enacted laws or regulations and/or established research programs addressing climate change. Most of these "no regrets" actions are set up to conserve energy or improve energy efficiency and also to reduce greenhouse gas emissions. Illinois has adopted an even broader program by 1) establishing a Global Climate Change Office to foster research and provide information and 2) forming a task force to address a wide array of issues, including state input to federal policies such as the Clinton administration's 1993 Climate Change Action Plan and to the research dimensions of the U.S. Global Climate Change Research Program. The Illinois program calls for increased attention to studies of regional impacts, including integrated assessments, and to research addressing means of adapting to future climate change. These various state efforts to date help show the direction of policy development and should be useful to those grappling with these issues.
Mashburn, Andrew J; Downer, Jason T; Rivers, Susan E; Brackett, Marc A; Martinez, Andres
2014-04-01
Social and emotional learning programs are designed to improve the quality of social interactions in schools and classrooms in order to positively affect students' social, emotional, and academic development. The statistical power of group randomized trials to detect effects of social and emotional learning programs and other preventive interventions on setting-level outcomes is influenced by the reliability of the outcome measure. In this paper, we apply generalizability theory to an observational measure of the quality of classroom interactions that is an outcome in a study of the efficacy of a social and emotional learning program called The Recognizing, Understanding, Labeling, Expressing, and Regulating emotions Approach. We estimate multiple sources of error variance in the setting-level outcome and identify observation procedures to use in the efficacy study that most efficiently reduce these sources of error. We then discuss the implications of using different observation procedures on both the statistical power and the monetary costs of conducting the efficacy study.
Outsourcing an Effective Postdischarge Call Program: A Collaborative Approach.
Meek, Kevin L; Williams, Paula; Unterschuetz, Caryn J
To improve patient satisfaction ratings and decrease readmissions, many organizations utilize internal staff to complete postdischarge calls to recently released patients. Developing, implementing, monitoring, and sustaining an effective call program can be challenging and has eluded even renowned medical centers in the country. By collaborating with an outsourced vendor that provides state-of-the-art call technology and specially trained callers, health systems can achieve elevated levels of engagement and satisfaction for their patients postdischarge.
Efficiently approximating the Pareto frontier: Hydropower dam placement in the Amazon basin
Wu, Xiaojian; Gomes-Selman, Jonathan; Shi, Qinru; Xue, Yexiang; Garcia-Villacorta, Roosevelt; Anderson, Elizabeth; Sethi, Suresh; Steinschneider, Scott; Flecker, Alexander; Gomes, Carla P.
2018-01-01
Real-world problems are often not fully characterized by a single optimal solution, as they frequently involve multiple competing objectives; it is therefore important to identify the so-called Pareto frontier, which captures solution trade-offs. We propose a fully polynomial-time approximation scheme based on Dynamic Programming (DP) for computing a polynomially succinct curve that approximates the Pareto frontier to within an arbitrarily small ε > 0 on tree-structured networks. Given a set of objectives, our approximation scheme runs in time polynomial in the size of the instance and 1/ε. We also propose a Mixed Integer Programming (MIP) scheme to approximate the Pareto frontier. The DP and MIP Pareto frontier approaches have complementary strengths and are surprisingly effective. We provide empirical results showing that our methods outperform other approaches in efficiency and accuracy. Our work is motivated by a problem in computational sustainability concerning the proliferation of hydropower dams throughout the Amazon basin. Our goal is to support decision-makers in evaluating impacted ecosystem services on the full scale of the Amazon basin. Our work is general and can be applied to approximate the Pareto frontier of a variety of multiobjective problems on tree-structured networks.
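The "polynomially succinct curve" rests on a standard rounding idea: bucket one objective on a (1+ε) geometric grid and keep only the best trade-off point per bucket, so the retained frontier has polynomially many points while losing at most a factor (1+ε). Below is a minimal sketch of that rounding step alone; the paper's actual scheme is a DP over tree-structured networks, and `epsilon_pareto` with its signature is a hypothetical illustration.

```python
import math

def epsilon_pareto(points, eps):
    """Keep, for each (1+eps) geometric bucket of the first objective,
    the point with the best second objective (both objectives maximized).
    The result is an eps-approximate Pareto frontier of `points`."""
    best = {}
    for f1, f2 in points:
        # bucket index under geometric rounding of f1
        k = 0 if f1 <= 0 else math.floor(math.log(f1, 1 + eps))
        if k not in best or f2 > best[k][1]:
            best[k] = (f1, f2)
    return sorted(best.values())
```

With eps = 0.5, the points (1, 10) and (1.05, 9) fall in the same bucket, so only the dominating (1, 10) survives; the retained set grows logarithmically in the objective range rather than linearly in the number of solutions.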
Efficient experimental design of high-fidelity three-qubit quantum gates via genetic programming
NASA Astrophysics Data System (ADS)
Devra, Amit; Prabhu, Prithviraj; Singh, Harpreet; Arvind; Dorai, Kavita
2018-03-01
We have designed efficient quantum circuits for the three-qubit Toffoli (controlled-controlled-NOT) and the Fredkin (controlled-SWAP) gate, optimized via genetic programming methods. The gates thus obtained were experimentally implemented on a three-qubit NMR quantum information processor with high fidelity. Toffoli and Fredkin gates in conjunction with the single-qubit Hadamard gates form a universal gate set for quantum computing and are an essential component of several quantum algorithms. Genetic algorithms are stochastic search algorithms based on the logic of natural selection and biological genetics and have been widely used for quantum information processing applications. We devised a new selection mechanism within the genetic algorithm framework to select individuals from a population. We call this mechanism the "Luck-Choose" mechanism and were able to achieve faster convergence to a solution using this mechanism than with existing selection mechanisms. The optimization was performed under the constraint that the experimentally implemented pulses are of short duration and can be implemented with high fidelity. We demonstrate the advantage of our pulse sequences by comparing our results with existing experimental schemes and other numerical optimization methods.
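The genetic-algorithm machinery the abstract refers to (a population, a selection mechanism, crossover, mutation) can be sketched generically. The abstract does not specify the "Luck-Choose" mechanism, so this toy uses plain tournament selection on a OneMax problem; every name here is hypothetical and unrelated to the authors' implementation.

```python
import random

def genetic_search(fitness, length, pop_size=40, gens=60, seed=0):
    """Minimal genetic algorithm over bit strings with tournament
    selection, one-point crossover, and bit-flip mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]

    def tournament():
        a, b = rng.sample(pop, 2)        # pick two, keep the fitter
        return a if fitness(a) >= fitness(b) else b

    for _ in range(gens):
        new_pop = [max(pop, key=fitness)]          # elitism: keep the best
        while len(new_pop) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = rng.randrange(1, length)
            child = p1[:cut] + p2[cut:]            # one-point crossover
            if rng.random() < 0.2:                 # occasional bit flip
                i = rng.randrange(length)
                child[i] ^= 1
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)

# maximize the number of ones ("OneMax")
best = genetic_search(sum, 16)
```

Swapping in a different `tournament()` body is all it takes to experiment with alternative selection mechanisms, which is the knob the abstract says the authors tuned.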
NASA Astrophysics Data System (ADS)
Intarasothonchun, Silada; Thipchaksurat, Sakchai; Varakulsiripunth, Ruttikorn; Onozato, Yoshikuni
In this paper, we propose a modified scheme of MSODB and PMS, called Predictive User Mobility Behavior (PUMB), to improve the performance of resource reservation and call admission control in cellular networks. In this algorithm, bandwidth is allocated more efficiently to neighboring cells using key mobility parameters in order to provide QoS guarantees for transferring traffic. Probability is used to form a cluster of cells and the shadow cluster, where a mobile unit is likely to visit. When a mobile unit changes direction and migrates to a cell that does not belong to its shadow cluster, we can support it by making efficient use of predicted nonconforming calls. Concomitantly, to ensure continuity of on-going calls with better utilization of resources, bandwidth is borrowed from predicted nonconforming calls and existing adaptive calls without affecting the minimum QoS guarantees. The performance of PUMB is demonstrated by simulation results in terms of new-call blocking probability, handoff-call dropping probability, bandwidth utilization, call success probability, and overhead message transmission when the arrival rate and speed of mobile units are varied. Our results show that PUMB provides better performance than MSODB and PMS under different traffic conditions.
77 FR 72364 - National Institute of Allergy and Infectious Diseases; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-05
... Conference Call). Contact Person: Lynn Rust, Ph.D., Scientific Review Officer, Scientific Review Program... Call). Contact Person: Lynn Rust, Ph.D., Scientific Review Officer, Scientific Review Program, Division...
Deicing System Protects General Aviation Aircraft
NASA Technical Reports Server (NTRS)
2007-01-01
Kelly Aerospace Thermal Systems LLC worked with researchers at Glenn Research Center on deicing technology with assistance from the Small Business Innovation Research (SBIR) program. Kelly Aerospace acquired Northcoast Technologies Ltd., a firm that had conducted work on a graphite foil heating element under a NASA SBIR contract and developed a lightweight, easy-to-install, reliable wing and tail deicing system. Kelly Aerospace engineers combined their experiences with those of the Northcoast engineers, leading to the certification and integration of a thermoelectric deicing system called Thermawing, a DC-powered air conditioner for single-engine aircraft called Thermacool, and high-output alternators to run them both. Thermawing, a reliable anti-icing and deicing system, allows pilots to safely fly through ice encounters and provides pilots of single-engine aircraft the heated wing technology usually reserved for larger, jet-powered craft. Thermacool, an innovative electric air conditioning system, uses a new compressor whose rotary pump design runs off an energy-efficient, brushless DC motor and allows pilots to use the air conditioner before the engine even starts.
Generalized SMO algorithm for SVM-based multitask learning.
Cai, Feng; Cherkassky, Vladimir
2012-06-01
Exploiting additional information to improve traditional inductive learning is an active research area in machine learning. In many supervised-learning applications, training data can be naturally separated into several groups, and incorporating this group information into learning may improve generalization. Recently, Vapnik proposed a general approach to formalizing such problems, known as "learning with structured data," and its support vector machine (SVM) based optimization formulation called SVM+. Liang and Cherkassky showed the connection between SVM+ and multitask learning (MTL) approaches in machine learning, and proposed an SVM-based formulation for MTL called SVM+MTL for classification. Training the SVM+MTL classifier requires the solution of a large quadratic programming optimization problem which scales as O(n³) with sample size n. So there is a need to develop computationally efficient algorithms for implementing SVM+MTL. This brief generalizes Platt's sequential minimal optimization (SMO) algorithm to the SVM+MTL setting. Empirical results show that, for typical SVM+MTL problems, the proposed generalized SMO achieves over 100 times speed-up in comparison with general-purpose optimization routines.
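Platt's SMO, which the brief generalizes, optimizes one pair of dual variables analytically per iteration while all the others are held fixed. Below is a sketch of that classic pairwise update for a standard SVM, not the SVM+MTL generalization (whose update rules the abstract does not give); `smo_step` and its signature are hypothetical.

```python
def smo_step(a1, a2, y1, y2, E1, E2, K11, K22, K12, C):
    """One SMO pairwise update for a standard SVM dual: optimize the
    pair (a1, a2) analytically while holding all other alphas fixed.
    E1, E2 are the current prediction errors f(x_i) - y_i."""
    eta = K11 + K22 - 2.0 * K12            # curvature along the pair
    if eta <= 0:
        return a1, a2                       # degenerate direction: skip
    a2_new = a2 + y2 * (E1 - E2) / eta      # unconstrained optimum
    # clip a2 to the box [L, H] implied by 0 <= a <= C and the
    # equality constraint y1*a1 + y2*a2 = const
    if y1 == y2:
        L, H = max(0.0, a1 + a2 - C), min(C, a1 + a2)
    else:
        L, H = max(0.0, a2 - a1), min(C, C + a2 - a1)
    a2_new = min(max(a2_new, L), H)
    a1_new = a1 + y1 * y2 * (a2 - a2_new)   # restore the equality constraint
    return a1_new, a2_new
```

Because each step is closed-form, a full pass costs O(n) per pair rather than invoking a general QP solver, which is the source of the speed-ups the brief reports for its generalized variant.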
76 FR 3653 - Alaska Region's Subsistence Resource Commission (SRC) Program; Public Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-20
... subsistence management issues. The NPS SRC program is authorized under Title VIII, Section 808 of the Alaska...: 1. Call to order. 2. SRC Roll Call and Confirmation of Quorum. 3. Welcome and Introductions. 4.... c. Resource Management Program Update. 14. Public and other Agency Comments. 15. SRC Work Session...
A Call for Reformation of Teacher Preparation Programs in the United States
ERIC Educational Resources Information Center
Dann, Ashley Ireland
2014-01-01
Although current research, educational theorists, and international comparisons demonstrate a need for reform, the United States' teacher preparation programs are failing. The following paper will call for the reform of teacher preparation programs in three distinct areas. Examination of current data, application of educational theorists'…
Ota, Ken S; Beutler, David S; Sheikh, Hassam; Weiss, Jessica L; Parkinson, Dallin; Nguyen, Peter; Gerkin, Richard D; Loli, Akil I
2013-10-01
This study investigated the trend of phone calls in the Banner Good Samaritan Medical Center (BGSMC) Heart Failure Transitional Care Program (HFTCP). The primary goal of the HFTCP is to reduce 30-day readmissions for heart failure patients by using a multi-pronged approach. This study included 104 patients in the HFTCP discharged over a 51-week period who had around-the-clock telephone access to the Transitionalist. Cellular phone records were reviewed, and the length and timing of calls were evaluated. A total of 4398 telephone calls were recorded, of which 39% were inbound and 61% were outbound, averaging 86 calls per week. Eighty-five percent of the total calls were made during the "Weekday Daytime" period. There were 229 calls during the "Weekday Nights" period, with 1.5 inbound calls per week. "Total Weekend" calls were 10.2% of the total calls, which equated to a weekly average of 8.8. Our experience is that direct physician-patient telephone contact is feasible with a panel of around 100 HF patients for one provider. If proper financial reimbursements are provided, physicians may be apt to participate in similar transitional care programs. Likewise, third-party payers will benefit from the reduction in unnecessary emergency room visits and hospitalizations.
Operation of the HP2250 with the HP9000 series 200 using PASCAL 3.0
NASA Technical Reports Server (NTRS)
Perry, John; Stroud, C. W.
1986-01-01
A computer program has been written to provide an interface between the HP Series 200 desktop computers, operating under HP Standard Pascal 3.0, and the HP2250 Data Acquisition and Control System. Pascal 3.0 for the HP9000 desktop computer gives a number of procedures for handling bus communication at various levels. It is necessary, however, to reach the lowest possible level in Pascal to handle the bus protocols required by the HP2250. This makes programming extremely complex since these protocols are not documented. The program described solves those problems and allows the user to immediately program, simply and efficiently, any measurement and control language (MCL/50) application with a few procedure calls. The complete set of procedures is available on a 5 1/4 inch diskette from Cosmic. Included in this group of procedures is an Exerciser which allows the user to exercise his HP2250 interactively. The exerciser operates in a fashion similar to the Series 200 operating system programs, but is adapted to the requirements of the HP2250. The programs on the diskette and the user's manual assume the user is acquainted with both the MCL/50 programming language and HP Standard Pascal 3.0 for the HP series 200 desktop computers.
Programs and Place: Risk and Asset Mapping for Fall Prevention
Smith, Matthew Lee; Towne, Samuel D.; Motlagh, Audry S.; Smith, Donald R.; Boolani, Ali; Horel, Scott A.; Ory, Marcia G.
2017-01-01
Identifying ways to measure access, availability, and utilization of health-care services, relative to at-risk areas or populations, is critical in providing practical and actionable information to key stakeholders. This study identified the prevalence and geospatial distribution of fall-related emergency medical services (EMS) calls in relation to the delivery of an evidence-based fall prevention program in Tarrant County, Texas over a 3-year time period. It aims to educate public health professionals and EMS first respondents about the application of geographic information system programs to identify risk-related “hot spots,” service gaps, and community assets to reduce falls among older adults. On average, 96.09 (±108.65) calls were received per ZIP Code (ranging from 0 calls to 386 calls). On average, EMS calls per ZIP Code increased from 30.80 (±34.70) calls in 2009 to 33.75 (±39.58) calls in 2011, which indicate a modest annual call increase over the 3-year study period. The percent of ZIP Codes offering A Matter of Balance/Volunteer Lay Leader Model (AMOB/VLL) workshops increased from 27.3% in 2009 to 34.5% in 2011. On average, AMOB/VLL workshops were offered in ZIP Codes with more fall-related EMS calls over the 3-year study period. Findings suggest that the study community was providing evidence-based fall prevention programming (AMOB/VLL workshops) in higher-risk areas. Opportunities for strategic service expansion were revealed through the identification of fall-related hot spots and asset mapping. PMID:28361049
I/O-Efficient Scientific Computation Using TPIE
NASA Technical Reports Server (NTRS)
Vengroff, Darren Erik; Vitter, Jeffrey Scott
1996-01-01
In recent years, input/output (I/O)-efficient algorithms for a wide variety of problems have appeared in the literature. However, systems specifically designed to assist programmers in implementing such algorithms have remained scarce. TPIE is a system designed to support I/O-efficient paradigms for problems from a variety of domains, including computational geometry, graph algorithms, and scientific computation. The TPIE interface frees programmers from having to deal not only with explicit read and write calls, but also the complex memory management that must be performed for I/O-efficient computation. In this paper we discuss applications of TPIE to problems in scientific computation. We discuss algorithmic issues underlying the design and implementation of the relevant components of TPIE and present performance results of programs written to solve a series of benchmark problems using our current TPIE prototype. Some of the benchmarks we present are based on the NAS parallel benchmarks while others are of our own creation. We demonstrate that the central processing unit (CPU) overhead required to manage I/O is small and that even with just a single disk, the I/O overhead of I/O-efficient computation ranges from negligible to the same order of magnitude as CPU time. We conjecture that if we use a number of disks in parallel this overhead can be all but eliminated.
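TPIE's actual interface is a C++ stream API, so the following is only a language-neutral illustration of the external-memory paradigm such systems package up: sort memory-sized runs, spill each to disk, then k-way merge the runs with one sequential pass per run. All names are hypothetical and unrelated to TPIE's real API.

```python
import heapq
import os
import tempfile

def external_sort(items, chunk_size):
    """External merge sort: sort runs that fit in memory (chunk_size
    elements), spill each sorted run to a temp file, then k-way merge
    the runs, reading each one sequentially exactly once."""
    run_files = []
    chunk = []

    def spill():
        if not chunk:
            return
        fd, path = tempfile.mkstemp(text=True)
        with os.fdopen(fd, "w") as f:
            for v in sorted(chunk):           # in-memory sort of one run
                f.write(f"{v}\n")
        run_files.append(path)
        chunk.clear()

    for v in items:
        chunk.append(v)
        if len(chunk) >= chunk_size:
            spill()
    spill()

    try:
        streams = [open(p) for p in run_files]
        merged = list(heapq.merge(*(map(int, s) for s in streams)))
        for s in streams:
            s.close()
        return merged
    finally:
        for p in run_files:
            os.remove(p)
```

The merge phase touches each run with purely sequential reads, which is why the I/O overhead of such algorithms can stay within a small factor of CPU time, as the benchmarks above report.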
Computer programs for calculating potential flow in propulsion system inlets
NASA Technical Reports Server (NTRS)
Stockman, N. O.; Button, S. L.
1973-01-01
In the course of designing inlets, particularly for VTOL and STOL propulsion systems, a calculational procedure utilizing three computer programs evolved. The chief program is the Douglas axisymmetric potential flow program called EOD, which calculates the incompressible potential flow about arbitrary axisymmetric bodies. The other two programs, original with Lewis, are called SCIRCL and COMBYN. Program SCIRCL generates input for EOD from various specified analytic shapes for the inlet components. Program COMBYN takes basic solutions output by EOD, combines them into solutions of interest, and applies a compressibility correction.
The North American Amphibian Monitoring Program. [abstract
Griffin, J.
1998-01-01
The North American Amphibian Monitoring Program has been under development for the past three years. The monitoring strategy for NAAMP has five main prongs: terrestrial salamander surveys, calling surveys, aquatic surveys, western surveys, and atlassing. Of these five, calling surveys were selected as one of the first implementation priorities due to their friendliness to volunteers of varying knowledge levels, relative low cost, and the fact that several groups had already pioneered the techniques involved. While some states and provinces had implemented calling surveys prior to NAAMP, like WI and IL, most states and provinces had little or no history of state/provincewide amphibian monitoring. Thus, the majority of calling survey programs were initiated in the past two years. To assess the progress of this pilot phase, a program review was conducted on the status of the NAAMP calling survey program, and the results of that review will be presented at the meeting. Topics to be discussed include: who is doing what where, extent of route coverage, the continuing random route discussions, quality assurance, strengths and weaknesses of calling surveys, reliability of data, and directions for the future. In addition, a brief overview of the DISPro project will be included. DISPro is a new amphibian monitoring program in National Parks, funded by the Demonstration of Intensive Sites Program (DISPro) through the EPA and NPS. It will begin this year at Big Bend and Shenandoah National Parks. The purpose of the DISPro Amphibian Project will be to investigate relationships between environmental factors and stressors and the distribution, abundance, and health of amphibians in these National Parks. At each Park, amphibian long-term monitoring protocols will be tested, distributions and abundance of amphibians will be mapped, and field research experiments will be conducted to examine stressor effects on amphibians (e.g., ultraviolet radiation, contaminants, acidification).
47 CFR 22.921 - 911 call processing procedures; 911-only calling mode.
Code of Federal Regulations, 2010 CFR
2010-10-01
... programming in the mobile unit that determines the handling of a non-911 call and permit the call to be... CARRIER SERVICES PUBLIC MOBILE SERVICES Cellular Radiotelephone Service § 22.921 911 call processing procedures; 911-only calling mode. Mobile telephones manufactured after February 13, 2000 that are capable of...
McInnes, Colin W; Vorstenbosch, Joshua; Chard, Ryan; Logsetty, Sarvesh; Buchel, Edward W; Islur, Avinash
2018-02-01
The impact of resident work hour restrictions on training and patient care remains a highly controversial topic, and to date, there lacks a formal assessment as it pertains to Canadian plastic surgery residents. To characterize the work hour profile of Canadian plastic surgery residents and assess the perspectives of residents and program directors regarding work hour restrictions related to surgical competency, resident wellness, and patient safety. An anonymous online survey developed by the authors was sent to all Canadian plastic surgery residents and program directors. Basic summary statistics were calculated. Eighty (53%) residents and 10 (77%) program directors responded. Residents reported working an average of 73 hours in hospital per week with 8 call shifts per month and sleep 4.7 hours/night while on call. Most residents (88%) reported averaging 0 post-call days off per month and 61% will work post-call without any sleep. The majority want the option of working post-call (63%) and oppose an 80-hour weekly maximum (77%). Surgical and medical errors attributed to post-call fatigue were self-reported by 26% and 49% of residents, respectively. Residents and program directors expressed concern about the ability to master surgical skills without working post-call. The majority of respondents oppose duty hour restrictions. The reason is likely multifactorial, including the desire of residents to meet perceived expectations and to master their surgical skills while supervised. If duty hour restrictions are aggressively implemented, many respondents feel that an increased duration of training may be necessary.
Optimization methods and silicon solar cell numerical models
NASA Technical Reports Server (NTRS)
Girardini, K.; Jacobsen, S. E.
1986-01-01
An optimization algorithm for use with numerical silicon solar cell models was developed. By coupling an optimization algorithm with a solar cell model, it is possible to simultaneously vary design variables such as impurity concentrations, front junction depth, back junction depth, and cell thickness to maximize the predicted cell efficiency. An optimization algorithm was developed and interfaced with the Solar Cell Analysis Program in 1 Dimension (SCAP1D). SCAP1D uses finite difference methods to solve the differential equations which, along with several relations from the physics of semiconductors, describe mathematically the performance of a solar cell. A major obstacle is that the numerical methods used in SCAP1D require a significant amount of computer time, and during an optimization the model is called iteratively until the design variables converge to the values associated with the maximum efficiency. This problem was alleviated by designing an optimization code specifically for use with numerically intensive simulations, to reduce the number of times the efficiency has to be calculated to achieve convergence to the optimal solution.
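The coupling described above, an optimizer repeatedly invoking an expensive simulation, can be sketched with a derivative-free pattern search that caches every evaluation so the simulation is never re-run on the same design point. This is a generic sketch with hypothetical names, not SCAP1D's actual interface; a toy quadratic stands in for the cell model, and the evaluation budget mimics the concern about costly simulation calls.

```python
def pattern_search(f, x0, step=0.5, tol=1e-3, max_calls=200):
    """Derivative-free maximization of an expensive simulation f(x).
    Caches evaluations so f is never called twice on the same point,
    which matters when each call is a full numerical simulation."""
    cache = {}

    def fc(x):
        key = tuple(round(v, 12) for v in x)
        if key not in cache:
            if len(cache) >= max_calls:
                raise RuntimeError("evaluation budget exhausted")
            cache[key] = f(list(key))
        return cache[key]

    x = list(x0)
    while step > tol:
        improved = False
        for i in range(len(x)):               # probe each design variable
            for d in (+step, -step):
                trial = x[:]
                trial[i] += d
                if fc(trial) > fc(x):
                    x, improved = trial, True
        if not improved:
            step *= 0.5                        # refine near the optimum
    return x, fc(x), len(cache)
```

On the toy objective -((x0 - 1)² + (x1 - 2)²), standing in for cell efficiency as a function of two design variables, the search converges to (1, 2) in well under the 200-call budget, illustrating why reducing the number of model evaluations was the design goal.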
Two-dimensional nonsteady viscous flow simulation on the Navier-Stokes computer miniNode
NASA Technical Reports Server (NTRS)
Nosenchuck, Daniel M.; Littman, Michael G.; Flannery, William
1986-01-01
The needs of large-scale scientific computation are outpacing the growth in performance of mainframe supercomputers. In particular, problems in fluid mechanics involving complex flow simulations require far more speed and capacity than that provided by current and proposed Class VI supercomputers. To address this concern, the Navier-Stokes Computer (NSC) was developed. The NSC is a parallel-processing machine, comprised of individual Nodes, each comparable in performance to current supercomputers. The global architecture is that of a hypercube, and a 128-Node NSC has been designed. New architectural features, such as a reconfigurable many-function ALU pipeline and a multifunction memory-ALU switch, have provided the capability to efficiently implement a wide range of algorithms. Efficient algorithms typically involve numerically intensive tasks, which often include conditional operations. These operations may be efficiently implemented on the NSC without, in general, sacrificing vector-processing speed. To illustrate the architecture, programming, and several of the capabilities of the NSC, the simulation of two-dimensional, nonsteady viscous flows on a prototype Node, called the miniNode, is presented.
Penalty dynamic programming algorithm for dim targets detection in sensor systems.
Huang, Dayu; Xue, Anke; Guo, Yunfei
2012-01-01
In order to detect and track multiple maneuvering dim targets in sensor systems, an improved dynamic programming track-before-detect algorithm (DP-TBD), called penalty DP-TBD (PDP-TBD), is proposed. The performance of tracking techniques is used as feedback to the detection stage. The feedback is constructed as a penalty term in the merit function; the penalty term is a function of the possible target state estimate, which can be obtained by the tracking methods. With this feedback, the algorithm combines traditional tracking techniques with DP-TBD, and it can be applied to simultaneously detect and track maneuvering dim targets. Meanwhile, a reasonable constraint, that a sensor measurement can originate from one target or from clutter, is proposed to minimize track separation. Thus, the algorithm can be used in multi-target situations with unknown target numbers. The efficiency and advantages of PDP-TBD compared with two existing methods are demonstrated by several simulations.
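A minimal one-dimensional sketch of the penalty idea: a standard DP-TBD merit recursion with an extra term penalizing deviation from a tracker-predicted velocity. The grid size, amplitudes, and penalty weight below are invented for illustration and do not come from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
K, N = 8, 20                          # frames, resolution cells
true_path = 2 + np.arange(K)          # planted target moving at +1 cell/frame
z = rng.normal(0.0, 0.1, size=(K, N))     # sensor noise floor
z[np.arange(K), true_path] += 3.0         # dim target amplitude

v_est, lam, vmax = 1, 1.0, 2    # tracker-predicted velocity, penalty weight, max move
I = np.full((K, N), -np.inf)    # accumulated merit
back = np.zeros((K, N), dtype=int)
I[0] = z[0]
for k in range(1, K):
    for s in range(N):
        best, arg = -np.inf, s
        for d in range(-vmax, vmax + 1):      # displacement from previous cell
            sp = s - d
            if 0 <= sp < N:
                # predecessor merit minus the PDP-TBD penalty term:
                # deviation from the tracking feedback's predicted motion
                cand = I[k - 1, sp] - lam * abs(d - v_est)
                if cand > best:
                    best, arg = cand, sp
        I[k, s] = z[k, s] + best
        back[k, s] = arg

# Backtrack from the best terminal cell to recover the detected track.
s = int(np.argmax(I[-1]))
track = [s]
for k in range(K - 1, 0, -1):
    s = int(back[k, s])
    track.append(s)
track.reverse()
```

With `lam = 0` this reduces to plain DP-TBD; the penalty term is what feeds the tracker's state estimate back into detection.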
Improving School Nurse Pain Assessment Practices for Students With Intellectual Disability.
Quinn, Brenna L; Smolinski, Megan
2017-01-01
School nurses are afforded minimal resources related to assessing pain in students with intellectual disability (ID) and have called for continuing education. The purpose of this study was to measure the effectiveness of an education program regarding best practices for assessing pain in students with ID. Educational sessions were presented to 248 school nurses. Pre-, post-, and follow-up surveys measured (1) difficulty school nurses face when assessing pain, (2) knowledge and use of pain assessment methods, and (3) intent to change and actual changes to professional practices. Participants experienced less difficulty assessing pain following the educational program. Almost all participants intended to change pain assessment practices, but large caseloads limited new practice adoption. Policy makers must consider population size and acuity when determining school nurse staffing. Trainings and other resources should be made available to school nurses in order to make pain assessments for students with ID more thorough and efficient.
Efficiency Analysis of Public Universities in Thailand
ERIC Educational Resources Information Center
Kantabutra, Saranya; Tang, John C. S.
2010-01-01
This paper examines the performance of Thai public universities in terms of efficiency, using a non-parametric approach called data envelopment analysis. Two efficiency models, the teaching efficiency model and the research efficiency model, are developed and the analysis is conducted at the faculty level. Further statistical analyses are also…
Accomplishments of the Oak Ridge National Laboratory Seed Money program
DOE R&D Accomplishments Database
1986-09-01
In 1974, a modest program for funding new, innovative research was initiated at ORNL. It was called the "Seed Money" program and has become part of a larger program, called Exploratory R and D, which is being carried out at all DOE national laboratories. This report highlights 12 accomplishments of the Seed Money Program: nickel aluminide, ion implantation, laser annealing, burn meter, Legionnaires' disease, whole-body radiation counter, the ANFLOW system, genetics and molecular biology, high-voltage equipment, microcalorimeter, positron probe, and atom science. (DLC)
When They Talk about CALL: Discourse in a Required CALL Class
ERIC Educational Resources Information Center
Kessler, Greg
2010-01-01
This study investigates preservice teachers' discourse about CALL in a required CALL class which combines theory and practice. Thirty-three students in a Linguistics MA program CALL course were observed over a 10-week quarter. For all of these students, it was their first formal exposure to CALL as a discipline. Communication in the class…
Insights from Smart Meters: The Potential for Peak-Hour Savings from Behavior-Based Programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Todd, Annika; Perry, Michael; Smith, Brian
The rollout of smart meters in the last several years has opened up new forms of previously unavailable energy data. Many utilities are now able in real-time to capture granular, household-level interval usage data at very high frequency for a large proportion of their residential and small commercial customer population. These data can be linked to other time- and location-specific information, providing vast, constantly growing streams of rich data (sometimes referred to by the recently popular buzzword, "big data"). Within the energy industry there is increasing interest in tapping into the opportunities that these data can provide. What can we do with all of these data? The richness and granularity of these data enable many types of creative and cutting-edge analytics. Technically sophisticated and rigorous statistical techniques can be used to pull interesting insights out of these high-frequency, human-focused data. We at LBNL are calling this "behavior analytics". This kind of analytics has the potential to provide tremendous value to a wide range of energy programs. For example, highly disaggregated and heterogeneous information about actual energy use would allow energy efficiency (EE) and/or demand response (DR) program implementers to target specific programs to specific households; would enable evaluation, measurement and verification (EM&V) of energy efficiency programs to be performed on a much shorter time horizon than was previously possible; and would provide better insights into the energy and peak-hour savings associated with specific types of EE and DR programs (e.g., behavior-based (BB) programs). In this series, "Insights from Smart Meters", we will present concrete, illustrative examples of the type of value that insights from behavior analytics of these data can provide (as well as pointing out its limitations).
We will supply several types of key findings, including: • Novel results, which answer questions the industry previously was unable to answer; • Proof-of-concept analytics tools that can be adapted and used by others; and • Guidelines and protocols that summarize analytical best practices. This report focuses on one example of the kind of value that analysis of these data can provide: insights into whether behavior-based (BB) efficiency programs have the potential to provide peak-hour energy savings.
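As a toy illustration of this kind of behavior analytics, the sketch below estimates peak-hour savings from synthetic hourly interval data by comparing treated and control loads during a peak window. All numbers are fabricated for the example:

```python
import numpy as np

rng = np.random.default_rng(1)
hours = np.tile(np.arange(24), 30)                  # 30 days of hourly reads
peak = (hours >= 16) & (hours < 20)                 # 4-8 pm peak window
base = 1.0 + 0.5 * peak                             # kWh load shape with a peak bump

control = base + rng.normal(0.0, 0.05, base.size)
# Treated homes shave 10% of load during peak hours (fabricated effect size).
treated = base * np.where(peak, 0.9, 1.0) + rng.normal(0.0, 0.05, base.size)

savings_kwh = control[peak].mean() - treated[peak].mean()
pct = 100.0 * savings_kwh / control[peak].mean()
```

A real EM&V analysis would of course use randomized assignment and regression adjustment rather than a raw difference in means, but the granularity point stands: interval data makes the peak-window comparison possible at all.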
Flight dynamics analysis and simulation of heavy lift airships. Volume 5: Programmer's manual
NASA Technical Reports Server (NTRS)
Ringland, R. F.; Tischler, M. B.; Jex, H. R.; Emmen, R. D.; Ashkenas, I. L.
1982-01-01
The Programmer's Manual contains explanations of the logic embodied in the various program modules, a dictionary of program variables, a subroutine listing, subroutine/common block/cross reference listing, and a calling/called subroutine cross reference listing.
An approach to efficient mobility management in intelligent networks
NASA Technical Reports Server (NTRS)
Murthy, K. M. S.
1995-01-01
Providing personal communication systems that support full mobility requires intelligent networks for tracking mobile users and facilitating outgoing and incoming calls over different physical and network environments. Databases play a major role in realizing the intelligent network functionalities. Currently proposed network architectures envision using the SS7-based signaling network for linking these DBs and also for interconnecting DBs with switches. If the network has to support ubiquitous, seamless mobile services, then it additionally has to support the mobile application parts, viz., mobile-origination calls, mobile-destination calls, mobile location updates, and inter-switch handovers. These functions will generate a significant amount of data and require it to be transferred between databases (HLR, VLR) and switches (MSCs) very efficiently. In the future, users (fixed or mobile) may use and communicate with sophisticated CPEs (e.g., multimedia, multipoint, and multisession calls), which may require complex signaling functions and will generate voluminous service-handling data requiring efficient transfer between databases and switches. With efficient mobility management, network providers would be able to add new services and capabilities to their networks incrementally, quickly, and cost-effectively.
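The HLR/VLR bookkeeping described above can be sketched with a toy data model. The function names and structures below are illustrative only, not an SS7/MAP implementation:

```python
# Home Location Register: subscriber -> currently serving MSC.
HLR = {}
# Visitor Location Registers: MSC -> set of subscribers it currently serves.
VLR = {}

def location_update(subscriber, new_msc):
    """Mobile location update / inter-switch handover: move the
    subscriber's record from the old MSC's VLR to the new one."""
    old = HLR.get(subscriber)
    if old is not None:
        VLR[old].discard(subscriber)
    HLR[subscriber] = new_msc
    VLR.setdefault(new_msc, set()).add(subscriber)

def route_call(subscriber):
    """Mobile-destination call: query the HLR for the serving MSC."""
    return HLR.get(subscriber)

location_update("alice", "MSC-1")
location_update("alice", "MSC-2")   # handover to a neighboring switch
```

Every update and routing query in this sketch is a database transaction, which is why the abstract emphasizes efficient data transfer between the HLR/VLR databases and the MSCs.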
Hardware independence checkout software
NASA Technical Reports Server (NTRS)
Cameron, Barry W.; Helbig, H. R.
1990-01-01
ACSI has developed a program utilizing CLIPS to assess compliance with various programming standards. Essentially, the program parses C code to extract the names of all function calls. These are asserted as CLIPS facts, which also include information about line numbers, source file names, and called functions. Rules have been devised to identify called functions that have not been defined in any of the source parsed. These are compared against lists of standards (represented as facts) using rules that check intersections and/or unions of these. By piping the output into other processes, the source is appropriately commented by generating and executing parsed scripts.
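A rough sketch of the parsing-and-checking step in Python (regex-based call extraction against a made-up standards list; the actual tool asserts the extracted calls as CLIPS facts and applies rules to them):

```python
import re

c_source = """
#include <stdio.h>
int main(void) {
    char buf[16];
    gets(buf);              /* banned by the standard being checked */
    printf("%s\\n", buf);
    return 0;
}
"""

# Crude call extraction: any identifier followed by '('. A real checker,
# like the CLIPS-based tool above, would also record line numbers and
# source file names with each call.
calls = set(re.findall(r"\b([A-Za-z_]\w*)\s*\(", c_source))
# Drop C keywords and locally defined functions (here just main).
calls -= {"if", "for", "while", "switch", "return", "sizeof", "main"}

banned = {"gets", "strcpy", "sprintf"}   # stand-in standards list
violations = calls & banned              # intersection check, as in the rules
```

The `violations` set corresponds to the rule-driven intersection test the abstract describes; the union of per-file call sets gives the "called but never defined" list.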
Jones, Deborah J; Forehand, Rex; Cuellar, Jessica; Parent, Justin; Honeycutt, Amanda; Khavjou, Olga; Gonzalez, Michelle; Anton, Margaret; Newey, Greg A
2014-01-01
Early onset disruptive behavior disorders are overrepresented in low-income families; yet these families are less likely to engage in behavioral parent training (BPT) than other groups. This project aimed to develop and pilot test a technology-enhanced version of one evidence-based BPT program, Helping the Noncompliant Child (HNC). The aim was to increase engagement of low-income families and, in turn, child behavior outcomes, with potential cost-savings associated with greater treatment efficiency. Low-income families of 3- to 8-year-old children with clinically significant disruptive behaviors were randomized to and completed standard HNC (n = 8) or Technology-Enhanced HNC (TE-HNC; n = 7). On average, caregivers were 37 years old; 87% were female, and 80% worked at least part-time. More than half (53%) of the youth were boys; the average age of the sample was 5.67 years. All families received the standard HNC program; however, TE-HNC also included the following smartphone enhancements: (a) skills video series, (b) brief daily surveys, (c) text message reminders, (d) video recording home practice, and (e) midweek video calls. TE-HNC yielded larger effect sizes than HNC for all engagement outcomes. Both groups yielded clinically significant improvements in disruptive behavior; however, findings suggest that the greater program engagement associated with TE-HNC boosted child treatment outcome. Further evidence for the boost afforded by the technology is revealed in family responses to postassessment interviews. Finally, cost analysis suggests that TE-HNC families also required fewer sessions than HNC families to complete the program, an efficiency that did not compromise family satisfaction. TE-HNC shows promise as an innovative approach to engaging low-income families in BPT with potential cost-savings and, therefore, merits further investigation on a larger scale.
Direct Telephonic Communication in a Heart Failure Transitional Care Program: An observational study
Ota, Ken S.; Beutler, David S.; Sheikh, Hassam; Weiss, Jessica L.; Parkinson, Dallin; Nguyen, Peter; Gerkin, Richard D.; Loli, Akil I.
2013-01-01
Background This study investigated the trend of phone calls in the Banner Good Samaritan Medical Center (BGSMC) Heart Failure Transitional Care Program (HFTCP). The primary goal of the HFTCP is to reduce 30-day readmissions for heart failure patients by using a multi-pronged approach. Methods This study included 104 patients in the HFTCP discharged over a 51-week period who had around-the-clock telephone access to the Transitionalist. Cellular phone records were reviewed. This study evaluated the length and timing of calls. Results A total of 4398 telephone calls were recorded, of which 39% were inbound and 61% were outbound, averaging 86 calls per week. Eighty-five percent of the total calls were made during the "Weekday Daytime" period. There were 229 calls during the "Weekday Nights" period, with 1.5 inbound calls per week. "Total Weekend" calls were 10.2% of the total, which equated to a weekly average of 8.8. Conclusions Our experience is that direct, physician-patient telephone contact is feasible with a panel of around 100 HF patients for one provider. If proper financial reimbursements are provided, physicians may be apt to participate in similar transitional care programs. Likewise, third-party payers will benefit from the reduction in unnecessary emergency room visits and hospitalizations. PMID:28352437
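The period breakdown in the results comes from straightforward timestamp bucketing. The cutoff hours and log entries below are hypothetical, since the abstract does not state the exact period boundaries:

```python
from datetime import datetime

# Hypothetical call-log timestamps.
call_log = [
    datetime(2012, 5, 7, 10, 30),   # Monday morning
    datetime(2012, 5, 7, 22, 15),   # Monday night
    datetime(2012, 5, 12, 14, 0),   # Saturday afternoon
]

def period(ts):
    """Classify a call into the study's three periods
    (daytime assumed here as 8 am-5 pm on weekdays)."""
    if ts.weekday() >= 5:           # Saturday (5) or Sunday (6)
        return "weekend"
    return "weekday_daytime" if 8 <= ts.hour < 17 else "weekday_night"

counts = {}
for ts in call_log:
    counts[period(ts)] = counts.get(period(ts), 0) + 1
```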
Primer on Computer Graphics Programming. Revision
1982-04-01
[OCR-garbled Fortran example listing; the recoverable content is a sequence of graphics-library calls: UWRIT1, UPRNT1, UMOVE, USET, UPRINT, and UEND, ending with STOP.]
A Set of Free Cross-Platform Authoring Programs for Flexible Web-Based CALL Exercises
ERIC Educational Resources Information Center
O'Brien, Myles
2012-01-01
The Mango Suite is a set of three freely downloadable cross-platform authoring programs for flexible network-based CALL exercises. They are Adobe Air applications, so they can be used on Windows, Macintosh, or Linux computers, provided the freely-available Adobe Air has been installed on the computer. The exercises which the programs generate are…
Melnick, Glenn A; Green, Lois; Rich, Jeremy
2016-01-01
In 2009 HealthCare Partners Affiliates Medical Group, based in Southern California, launched House Calls, an in-home program that provides, coordinates, and manages care primarily for recently discharged high-risk, frail, and psychosocially compromised patients. Its purpose is to reduce preventable emergency department visits and hospital readmissions. We present data over time from this well-established program to provide an example for other new programs that are being established across the United States to serve this population with complex needs. The findings show that the initial House Calls structure, staffing patterns, and processes differed across the geographic areas that it served, and that they also evolved over time in different ways. In the same time period, all areas experienced a reduction in operating costs per patient and showed substantial reductions in monthly per patient health care spending and hospital utilization after enrollment in the House Calls program, compared to the period before enrollment. Despite more than five years of experience, the program structure continues to evolve and adjust staffing and other features to accommodate the dynamic nature of this complex patient population. Project HOPE—The People-to-People Health Foundation, Inc.
Biological relevance of CNV calling methods using familial relatedness including monozygotic twins.
Castellani, Christina A; Melka, Melkaye G; Wishart, Andrea E; Locke, M Elizabeth O; Awamleh, Zain; O'Reilly, Richard L; Singh, Shiva M
2014-04-21
Studies involving the analysis of structural variation including Copy Number Variation (CNV) have recently exploded in the literature. Furthermore, CNVs have been associated with a number of complex diseases and neurodevelopmental disorders. Common methods for CNV detection use SNP, CNV, or CGH arrays, where the signal intensities of consecutive probes are used to define the number of copies associated with a given genomic region. These practices pose a number of challenges that interfere with the ability of available methods to accurately call CNVs. It has, therefore, become necessary to develop experimental protocols to test the reliability of CNV calling methods from microarray data so that researchers can properly discriminate biologically relevant data from noise. We have developed a workflow for the integration of data from multiple CNV calling algorithms using the same array results. It uses four CNV calling programs: PennCNV (PC), Affymetrix® Genotyping Console™ (AGC), Partek® Genomics Suite™ (PGS) and Golden Helix SVS™ (GH) to analyze CEL files from the Affymetrix® Human SNP 6.0 Array™. To assess the relative suitability of each program, we used individuals of known genetic relationships. We found significant differences in CNV calls obtained by different CNV calling programs. Although the programs showed variable patterns of CNVs in the same individuals, their distribution in individuals of different degrees of genetic relatedness has allowed us to offer two suggestions. The first involves the use of multiple algorithms for the detection of the largest possible number of CNVs, and the second suggests the use of PennCNV over all other methods when the use of only one software program is desirable.
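The integration step, combining calls from multiple programs into a consensus set, can be sketched as simple voting. The calls below are fabricated, and a real workflow would match overlapping (not just identical) intervals:

```python
from collections import Counter

# Fabricated per-program CNV calls as (chrom, start, end, copy_number);
# overlap handling is simplified to exact matches for this sketch.
calls_by_program = {
    "PennCNV": {("chr1", 1000, 5000, 1), ("chr2", 200, 900, 3)},
    "AGC":     {("chr1", 1000, 5000, 1)},
    "PGS":     {("chr1", 1000, 5000, 1), ("chr7", 50, 400, 3)},
    "GH":      {("chr2", 200, 900, 3)},
}

# Count how many programs report each call; keep those seen at least twice.
votes = Counter(c for calls in calls_by_program.values() for c in calls)
consensus = {c for c, n in votes.items() if n >= 2}
```

Taking the union of all four call sets instead (the paper's first suggestion) maximizes sensitivity, while the voting threshold trades sensitivity for confidence.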
Impact of a Temporary NRT Enhancement in a State Quitline and Web-Based Program.
Cole, Sam; Suter, Casey; Nash, Chelsea; Pollard, Joseph
2018-06-01
To examine the impact of a nicotine replacement therapy (NRT) enhancement on quit outcomes. Observational study using an intent-to-treat, as-treated analysis. Not available. A total of 4022 Idaho tobacco users aged ≥18 years who received services from the Idaho Tobacco Quitline or Idaho's web-based program. One-call phone or web-based participants were sent a single 4- or 8-week NRT shipment. Multiple-call participants were sent NRT in a single 4-week shipment or two 4-week shipments (second shipment sent only to those completing a second coaching call). North American Quitline Consortium recommended Minimal Data Set items were collected at registration and follow-up. Thirty-day point prevalence quit rates were assessed at 7-month follow-up. Multiple logistic regression models were used to examine the effects of program type and amount of NRT sent to participants while controlling for demographic and tobacco use characteristics. Abstinence rates were significantly higher among 8-week versus 4-week NRT recipients (42.5% vs 33.3%). The effect was only significant between multiple-call program participants who received both 4-week NRT shipments versus only the first of 2 possible 4-week shipments (51.1% vs 31.1%). Costs per quit were lowest among web-based participants who received 4 weeks of NRT (US$183 per quit) and highest among multiple-call participants who received only 1 of 2 possible NRT shipments (US$557 per quit). To better balance cost with clinical effectiveness, funders of state-based tobacco cessation services may want to consider (1) allowing tobacco users to choose between phone- and web-based programs while (2) limiting longer NRT benefits only to multiple-call program participants.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bekelman, Justin E., E-mail: bekelman@uphs.upenn.edu; Deye, James A.; Vikram, Bhadrasain
2012-07-01
Purpose: In the context of national calls for reorganizing cancer clinical trials, the National Cancer Institute sponsored a 2-day workshop to examine challenges and opportunities for optimizing radiotherapy quality assurance (QA) in clinical trial design. Methods and Materials: Participants reviewed the current processes of clinical trial QA and noted the QA challenges presented by advanced technologies. The lessons learned from the radiotherapy QA programs of recent trials were discussed in detail. Four potential opportunities for optimizing radiotherapy QA were explored, including the use of normal tissue toxicity and tumor control metrics, biomarkers of radiation toxicity, new radiotherapy modalities such as proton beam therapy, and the international harmonization of clinical trial QA. Results: Four recommendations were made: (1) to develop a tiered (and more efficient) system for radiotherapy QA and tailor the intensity of QA to the clinical trial objectives (tiers include general credentialing, trial-specific credentialing, and individual case review); (2) to establish a case QA repository; (3) to develop an evidence base for clinical trial QA and introduce innovative prospective trial designs to evaluate radiotherapy QA in clinical trials; and (4) to explore the feasibility of consolidating clinical trial QA in the United States. Conclusion: Radiotherapy QA can affect clinical trial accrual, cost, outcomes, and generalizability. To achieve maximum benefit, QA programs must become more efficient and evidence-based.
CASPER: A GENERALIZED PROGRAM FOR PLOTTING AND SCALING DATA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lietzke, M.P.; Smith, R.E.
A Fortran subroutine was written to scale floating-point data and generate a magnetic tape to plot it on the Calcomp 570 digital plotter. The routine permits a great deal of flexibility, and may be used with any type of FORTRAN or FAP calling program. A simple calling program was also written to permit the user to read in data from cards and plot it without any additional programming. Both the Fortran and binary decks are available. (auth)
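The core scaling step such a routine performs can be sketched as a min-max mapping onto plotter coordinates. The function name and coordinate range are illustrative, not the CASPER interface:

```python
def scale_to_plotter(values, lo=0.0, hi=1000.0):
    """Map floating-point data onto a fixed plotter coordinate range
    (a hypothetical stand-in for CASPER's scaling step)."""
    vmin, vmax = min(values), max(values)
    span = (vmax - vmin) or 1.0     # guard against constant data
    return [lo + (v - vmin) * (hi - lo) / span for v in values]

coords = scale_to_plotter([2.5, 5.0, 10.0])
```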
Incremental Parallelization of Non-Data-Parallel Programs Using the Charon Message-Passing Library
NASA Technical Reports Server (NTRS)
VanderWijngaart, Rob F.
2000-01-01
Message passing is among the most popular techniques for parallelizing scientific programs on distributed-memory architectures. The reasons for its success are wide availability (MPI), efficiency, and full tuning control provided to the programmer. A major drawback, however, is that incremental parallelization, as offered by compiler directives, is not generally possible, because all data structures have to be changed throughout the program simultaneously. Charon remedies this situation through mappings between distributed and non-distributed data. It allows breaking up the parallelization into small steps, guaranteeing correctness at every stage. Several tools are available to help convert legacy codes into high-performance message-passing programs. They usually target data-parallel applications, whose loops carrying most of the work can be distributed among all processors without much dependency analysis. Others do a full dependency analysis and then convert the code virtually automatically. Even more toolkits are available that aid construction from scratch of message passing programs. None, however, allows piecemeal translation of codes with complex data dependencies (i.e. non-data-parallel programs) into message passing codes. The Charon library (available in both C and Fortran) provides incremental parallelization capabilities by linking legacy code arrays with distributed arrays. During the conversion process, non-distributed and distributed arrays exist side by side, and simple mapping functions allow the programmer to switch between the two in any location in the program. Charon also provides wrapper functions that leave the structure of the legacy code intact, but that allow execution on truly distributed data. 
Finally, the library provides a rich set of communication functions that support virtually all patterns of remote data demands in realistic structured-grid scientific programs, including transposition, nearest-neighbor communication, pipelining, gather/scatter, and redistribution. At the end of the conversion process most intermediate Charon function calls will have been removed, the non-distributed arrays will have been deleted, and virtually the only remaining Charon function calls are the high-level, highly optimized communications. Distribution of the data is under complete control of the programmer, although a wide range of useful distributions is easily available through predefined functions. A crucial aspect of the library is that it does not allocate space for distributed arrays, but accepts programmer-specified memory. This has two major consequences. First, codes parallelized using Charon do not suffer from encapsulation; user data is always directly accessible. This provides high efficiency, and also retains the possibility of using message passing directly for highly irregular communications. Second, non-distributed arrays can be interpreted as (trivial) distributions in the Charon sense, which allows them to be mapped to truly distributed arrays, and vice versa. This is the mechanism that enables incremental parallelization. In this paper we provide a brief introduction to the library and then focus on the actual steps in the parallelization process, using some representative examples from, among others, the NAS Parallel Benchmarks. We show how a complicated two-dimensional pipeline, the prototypical non-data-parallel algorithm, can be constructed with ease. To demonstrate the flexibility of the library, we give examples of the stepwise, efficient parallel implementation of nonlocal boundary conditions common in aircraft simulations, as well as the construction of the sequence of grids required for multigrid.
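The mapping between non-distributed and distributed views can be illustrated serially with a block decomposition. The function names below are illustrative, not the Charon API; note that the blocks are views into the legacy storage, echoing Charon's use of programmer-specified memory rather than encapsulated copies:

```python
import numpy as np

def distribute(a, nprocs):
    """Map a legacy 1-D array to per-'process' blocks (views, not copies)."""
    return np.array_split(a, nprocs)

def collect(blocks):
    """Inverse mapping: reassemble the legacy view from the blocks."""
    return np.concatenate(blocks)

legacy = np.arange(10, dtype=float)
blocks = distribute(legacy, 3)       # blocks of sizes 4, 3, 3
blocks[1] *= 2.0                     # "local" work on one block
legacy2 = collect(blocks)            # legacy and block views agree, since
                                     # the blocks share the legacy storage
```

Because both views coexist, a loop nest can be converted to work on `blocks` while the rest of the program still reads `legacy`, which is the incremental-parallelization mechanism the abstract describes.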
PIRIA: a general tool for indexing, search, and retrieval of multimedia content
NASA Astrophysics Data System (ADS)
Joint, Magali; Moellic, Pierre-Alain; Hede, P.; Adam, P.
2004-05-01
The Internet is a continuously expanding source of multimedia content and information. There are many products in development to search, retrieve, and understand multimedia content, but most current image search/retrieval engines rely on an image database manually pre-indexed with keywords. Computers are still powerless to understand the semantic meaning of still or animated image content. Piria (Program for the Indexing and Research of Images by Affinity), the search engine we have developed, brings this possibility closer to reality. Piria is a novel search engine that uses the query-by-example method. A user query is submitted to the system, which then returns a list of images ranked by similarity, obtained by a metric distance that operates on every indexed image signature. These indexed images are compared according to several different classifiers, not only Keywords, but also Form, Color, and Texture, taking into account geometric transformations and invariances such as rotation, symmetry, mirroring, etc. Form - edges extracted by an efficient segmentation algorithm. Color - histogram, semantic color segmentation, and spatial color relationships. Texture - texture wavelets and local edge patterns. If required, Piria is also able to fuse results from multiple classifiers with a new classification of index categories: Single Indexer Single Call (SISC), Single Indexer Multiple Call (SIMC), Multiple Indexers Single Call (MISC) or Multiple Indexers Multiple Call (MIMC). Commercial and industrial applications will be explored and discussed as well as current and future development.
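The multi-indexer fusion (e.g., the MISC case) can be sketched as a min-max-normalized distance sum per image. The indexer names and distances below are stand-ins, not Piria internals:

```python
def fuse(distances_by_indexer):
    """Fuse per-indexer distances into one ranking.
    distances_by_indexer: {indexer_name: {image_id: distance}}.
    Distances are min-max normalized per indexer so that no single
    indexer's scale dominates, then summed; smaller is more similar."""
    fused = {}
    for dists in distances_by_indexer.values():
        lo, hi = min(dists.values()), max(dists.values())
        for img, d in dists.items():
            norm = (d - lo) / (hi - lo) if hi > lo else 0.0
            fused[img] = fused.get(img, 0.0) + norm
    return sorted(fused, key=fused.get)   # best match first

ranking = fuse({
    "color":   {"a": 0.1, "b": 0.9, "c": 0.5},
    "texture": {"a": 0.2, "b": 0.4, "c": 0.8},
})
```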
Frattaroli, Shannon; Schulman, Eric; McDonald, Eileen M; Omaki, Elise C; Shields, Wendy C; Jones, Vanya; Brewer, William
2018-05-17
Innovative strategies are needed to improve the prevalence of working smoke alarms in homes. To our knowledge, this is the first study to report on the effectiveness of Facebook advertising and automated telephone calls as population-level strategies to encourage an injury prevention behavior. We examine the effectiveness of Facebook advertising and automated telephone calls as strategies to enroll individuals in Baltimore City's Fire Department's free smoke alarm installation program. We directed our advertising efforts toward Facebook users eligible for the Baltimore City Fire Department's free smoke alarm installation program and all homes with a residential phone line included in Baltimore City's automated call system. The Facebook campaign targeted Baltimore City residents 18 years of age and older. In total, an estimated 300 000 Facebook users met the eligibility criteria. Facebook advertisements were delivered to users' desktop and mobile device newsfeeds. A prerecorded message was sent to all residential landlines listed in the city's automated call system. By the end of the campaign, the 3 advertisements generated 456 666 impressions reaching 130 264 Facebook users. Of the users reached, 4367 individuals (1.3%) clicked the advertisement. The automated call system included approximately 90 000 residential phone numbers. Participants attributed 25 smoke alarm installation requests to Facebook and 458 to the automated call. Facebook advertisements are a novel approach to promoting smoke alarms and appear to be effective in exposing individuals to injury prevention messages. However, converting Facebook message recipients to users of a smoke alarm installation program occurred infrequently in this study. Residents who participated in the smoke alarm installation program were more likely to cite the automated call as the impetus for their participation. 
Additional research is needed to understand the circumstances under which, and the strategies by which, social networking sites can convert passive users into active participants.
Integrating Corpus-Based CALL Programs in Teaching English through Children's Literature
ERIC Educational Resources Information Center
Johns, Tim F.; Hsingchin, Lee; Lixun, Wang
2008-01-01
This paper presents particular pedagogical applications of a number of corpus-based CALL (computer assisted language learning) programs such as "CONTEXTS" and "CLOZE," "MATCHUP" and "BILINGUAL SENTENCE SHUFFLER," in the teaching of English through children's literature. An elective course in Taiwan for…
GAPIT: genome association and prediction integrated tool.
Lipka, Alexander E; Tian, Feng; Wang, Qishan; Peiffer, Jason; Li, Meng; Bradbury, Peter J; Gore, Michael A; Buckler, Edward S; Zhang, Zhiwu
2012-09-15
Software programs that conduct genome-wide association studies and genomic prediction and selection need to use methodologies that maximize statistical power, provide high prediction accuracy and run in a computationally efficient manner. We developed an R package called Genome Association and Prediction Integrated Tool (GAPIT) that implements advanced statistical methods including the compressed mixed linear model (CMLM) and CMLM-based genomic prediction and selection. The GAPIT package can handle large datasets in excess of 10 000 individuals and 1 million single-nucleotide polymorphisms with minimal computational time, while providing user-friendly access and concise tables and graphs to interpret results. http://www.maizegenetics.net/GAPIT. zhiwu.zhang@cornell.edu Supplementary data are available at Bioinformatics online.
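GAPIT's compressed mixed linear model corrects for kinship and population structure, which is beyond a short sketch, but the single-marker scan that mixed-model methods extend can be illustrated in a few lines. The Python sketch below (the function name and 0/1/2 genotype layout are illustrative assumptions, not GAPIT's API) performs the naive per-SNP regression baseline:

```python
import numpy as np

def single_marker_scan(genotypes, phenotype):
    """Per-SNP ordinary least squares scan: regress the phenotype on each
    SNP (coded 0/1/2) plus an intercept (absorbed by centering) and return
    the |t| statistic of each SNP. Mixed-model methods such as CMLM add
    kinship/structure terms to this basic test."""
    n, m = genotypes.shape
    y = phenotype - phenotype.mean()
    stats = np.empty(m)
    for j in range(m):
        x = genotypes[:, j] - genotypes[:, j].mean()
        sxx = (x ** 2).sum()
        if sxx == 0:            # monomorphic SNP: no information
            stats[j] = 0.0
            continue
        beta = (x @ y) / sxx    # OLS slope
        resid = y - beta * x
        se = np.sqrt((resid @ resid) / (n - 2) / sxx)
        stats[j] = abs(beta / se) if se > 0 else np.inf
    return stats
```

Large |t| values flag candidate associations; GAPIT additionally controls the inflation this naive scan suffers under population structure.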
A Two-Layer Least Squares Support Vector Machine Approach to Credit Risk Assessment
NASA Astrophysics Data System (ADS)
Liu, Jingli; Li, Jianping; Xu, Weixuan; Shi, Yong
Least squares support vector machine (LS-SVM) is a revised version of the support vector machine (SVM) and has proved to be a useful tool for pattern recognition, offering excellent generalization performance at low computational cost. In this paper, we propose a new method called the two-layer least squares support vector machine, which combines kernel principal component analysis (KPCA) with the linear programming form of the least squares support vector machine. The method achieves sparseness and robustness while handling high-dimensional, large-scale databases. A U.S. commercial credit card database is used to test the efficiency of our method, and the results are satisfactory.
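The appeal of LS-SVM is that training reduces to solving one linear system rather than a quadratic program. The Python sketch below shows the standard single-layer LS-SVM classifier only, not the paper's two-layer KPCA combination; function names and parameter defaults are illustrative:

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel between row sets A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """LS-SVM training: solve the bordered linear system
       [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]          # bias b, coefficients alpha

def lssvm_predict(X, alpha, b, Xtrain, sigma=1.0):
    # decision: sign(sum_i alpha_i k(x, x_i) + b)
    return np.sign(rbf_kernel(X, Xtrain, sigma) @ alpha + b)
```

The price of this convenience is that every training point receives a nonzero alpha, which is exactly the loss of sparseness the paper's linear-programming formulation addresses.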
Programmed release triggered by osmotic gradients in multicomponent vesicles
NASA Astrophysics Data System (ADS)
Dong, Ruo-Yu; Jang, Hyun-Sook; Granick, Steve
Polymersomes, good candidates for the encapsulation and delivery of active ingredients, can be constructed with multiple interconnected compartments. These so-called multisomes enable the spatial separation of incompatible contents or processes while providing an efficient route for inter-compartment communication through semipermeable interface membranes. Here we show that establishing osmotic imbalances between different compartments produces striking synergetic morphology changes in the multisomes. By further adjusting the osmotic gradients and the arrangement of compartments, we can realize a cascade rupture of the individual units, a new step toward controlled mixing and timed sequences of chemical reactions.
Fibrous minerals from Somma-Vesuvius volcanic complex
NASA Astrophysics Data System (ADS)
Rossi, Manuela; Nestola, Fabrizio; Ghiara, Maria R.; Capitelli, Francesco
2016-08-01
A survey of fibrous minerals from the densely populated area of Campania around the Somma-Vesuvius volcanic complex (Italy) was performed by means of a multi-methodological approach based on morphological analyses, EMPA/WDS and SEM/EDS applications, and unit-cell determination through X-ray diffraction data. This mineralogical investigation aims to provide suitable tools for the identification of fibrous natural phases, to improve knowledge of the geochemistry, petrogenesis, and regional mineralogy of the Somma-Vesuvius area, and to emphasize the presence of minerals with fibrous habit in volcanic environments generally. The survey also fits well within the health and environment calls of the European Commission's Horizon 2020 program (Climate Action, Environment, Resource Efficiency and Raw Materials).
Interdisciplinary analysis procedures in the modeling and control of large space-based structures
NASA Technical Reports Server (NTRS)
Cooper, Paul A.; Stockwell, Alan E.; Kim, Zeen C.
1987-01-01
The paper describes a computer software system called the Integrated Multidisciplinary Analysis Tool, IMAT, that has been developed at NASA Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven interactive executive program, IMAT links a relational database to commercial structural and controls analysis codes. The paper describes the procedures followed to analyze a complex satellite structure and control system. The codes used to accomplish the analysis are described, and an example is provided of an application of IMAT to the analysis of a reference space station subject to a rectangular pulse loading at its docking port.
MIDAS: Software for the detection and analysis of lunar impact flashes
NASA Astrophysics Data System (ADS)
Madiedo, José M.; Ortiz, José L.; Morales, Nicolás; Cabrera-Caño, Jesús
2015-06-01
Since 2009 we have been running a project to identify flashes produced by the impact of meteoroids on the surface of the Moon, employing small telescopes and high-sensitivity CCD video cameras. To automatically identify these events, a software package called MIDAS was developed and tested. This package can also perform the photometric analysis of these flashes and estimate the value of the luminous efficiency. In addition, we have implemented in MIDAS a new method to establish the likely source of the meteoroids (a known meteoroid stream or the sporadic background). The main features of this computer program are analyzed here, and some examples of lunar impact events are presented.
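The core detection step in a video flash pipeline can be sketched as thresholding each frame against a per-pixel background model. This is a generic illustration of the idea, not MIDAS's actual algorithm, and the threshold parameter is an assumption:

```python
import numpy as np

def detect_flashes(frames, k=5.0):
    """Flag frames containing a candidate flash: any pixel exceeding the
    per-pixel mean of the whole sequence by more than k standard
    deviations. Real pipelines use a running background and reject
    cosmic-ray hits and satellite glints."""
    stack = np.stack(frames).astype(float)
    mean = stack.mean(axis=0)
    std = stack.std(axis=0) + 1e-9   # avoid zero threshold on dead pixels
    hits = []
    for i, frame in enumerate(stack):
        if np.any((frame - mean) > k * std):
            hits.append(i)
    return hits
```

Candidate frames would then go to photometry and to the stream-association test the abstract mentions.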
High-Efficiency and High-Power Mid-Wave Infrared Cascade Lasers
2012-10-01
internal quantum efficiency and a factor usually called the optical extraction efficiency. The optical extraction efficiency ... quantum efficiency involves more fundamental parameters corresponding to the microscopic processes of the device operation; nevertheless, it can be ... deriving parameters such as the internal quantum efficiency of a QC laser, the entire injector miniband can be treated as a single virtual state
ERIC Educational Resources Information Center
Shaw, Yun
2010-01-01
Many of the commercial Computer-Assisted Language Learning (CALL) programs available today typically take a generic approach. This approach standardizes the program so that it can be used to teach any language merely by translating the content from one language to another. These CALL programs rarely consider the cultural background or preferred…
Distributed Function Mining for Gene Expression Programming Based on Fast Reduction.
Deng, Song; Yue, Dong; Yang, Le-chan; Fu, Xiong; Feng, Ya-zhou
2016-01-01
For high-dimensional and massive data sets, traditional centralized gene expression programming (GEP) and its improved algorithms suffer from increased run time and decreased prediction accuracy. To solve this problem, this paper proposes a new improved algorithm called distributed function mining for gene expression programming based on fast reduction (DFMGEP-FR). In DFMGEP-FR, fast attribution reduction in binary search algorithms (FAR-BSA) is proposed to quickly find the optimal attribution set, and a function consistency replacement algorithm is given to integrate the local function models. Thorough comparative experiments for DFMGEP-FR, centralized GEP and the parallel gene expression programming algorithm based on simulated annealing (parallel GEPSA) are included in this paper. For the waveform, mushroom, connect-4 and musk datasets, the comparative results show that the average run time of DFMGEP-FR drops by 89.09%, 88.85%, 85.79% and 93.06%, respectively, in contrast to centralized GEP, and by 12.5%, 8.42%, 9.62% and 13.75%, respectively, compared with parallel GEPSA. Six well-studied UCI test data sets demonstrate the efficiency and capability of our proposed DFMGEP-FR algorithm for distributed function mining.
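At the heart of every GEP variant is the decoding of a linear chromosome (Karva notation) into an expression tree, filling argument slots breadth-first. The minimal Python sketch below shows only that shared core, independent of the distributed DFMGEP-FR machinery described above; the symbol set is an assumption:

```python
ARITY = {'+': 2, '-': 2, '*': 2, '/': 2}   # function symbols; others are terminals

def decode(gene):
    """Decode a Karva string into a nested-list tree, breadth-first:
    the first symbol is the root, and each function node takes its
    arguments from the next unread symbols, level by level."""
    root = [gene[0]]
    queue = [root]
    pos = 1
    while queue:
        node = queue.pop(0)
        for _ in range(ARITY.get(node[0], 0)):
            child = [gene[pos]]
            pos += 1
            node.append(child)
            queue.append(child)
    return root

def evaluate(node, env):
    """Evaluate a decoded tree against a terminal-to-value mapping."""
    op = node[0]
    if op not in ARITY:
        return env[op]
    a, b = evaluate(node[1], env), evaluate(node[2], env)
    return {'+': a + b, '-': a - b, '*': a * b, '/': a / b}[op]
```

For example, the gene "+*abb" decodes to (b*b) + a, because '+' at the root consumes '*' and 'a' as children before '*' consumes the two 'b' symbols.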
Exploring EFL Teachers' CALL Knowledge and Competencies: In-Service Program Perspectives
ERIC Educational Resources Information Center
Liu, Mei-Hui; Kleinsasser, Robert C.
2015-01-01
This article describes quantitative and qualitative data providing perspectives on how six English as a Foreign Language (EFL) vocational high school teachers perceived CALL knowledge and competencies in a yearlong technology-enriched professional development program. The teachers' developing technological pedagogical content knowledge (TPACK) and…
Program Flow Analyzer. Volume 3
1984-08-01
metrics are defined using these basic terms. Of interest is another measure for the size of the program, called the volume: V = N x log2(n). The unit of...correlated to actual data and most useful for test. The formula describing difficulty may be expressed as: D = (n1/2) x (N2/n2) = 1/L. Difficulty, then, is the...linearly independent program paths through any program graph. A maximal set of these linearly independent paths, called a "basis set," can always be found
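The Halstead measures in the snippet above follow directly from token counts. A short Python sketch (illustrative only; the snippet does not show how operators and operands are tokenized):

```python
import math

def halstead_volume(operators, operands):
    """V = N * log2(n): N is the total number of operator and operand
    occurrences, n the number of distinct operators plus distinct operands."""
    N = len(operators) + len(operands)
    n = len(set(operators)) + len(set(operands))
    return N * math.log2(n)

def halstead_difficulty(operators, operands):
    """D = (n1/2) * (N2/n2); the program level is L = 1/D."""
    n1 = len(set(operators))
    n2 = len(set(operands))
    N2 = len(operands)
    return (n1 / 2.0) * (N2 / n2)
```

The cyclomatic "basis set" measure mentioned afterward is a separate, control-flow-based count and is not derived from these token statistics.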
Cook, Tessa S; Hernandez, Jessica; Scanlon, Mary; Langlotz, Curtis; Li, Chun-Der L
2016-07-01
Despite its increasing use in training other medical specialties, high-fidelity simulation to prepare diagnostic radiology residents for call remains an underused educational resource. To characterize the barriers to adoption of this technology, we conducted a survey of academic radiologists and radiology trainees. An Institutional Review Board-approved survey was distributed to Association of University Radiologists members via e-mail. Survey results were collected electronically, tabulated, and analyzed. A total of 68 survey responses representing 51 programs were received from program directors, department chairs, chief residents, and program administrators. The most common form of educational activity for resident call preparation was lectures. Faculty-supervised "baby call" was also widely reported. Actual simulated call environments were quite rare, with only three programs reporting this type of educational activity. Barriers to the use of simulation include lack of faculty time, lack of faculty expertise, and lack of perceived need. High-fidelity simulation can be used to mimic the high-stress, high-stakes independent call environment that the typical radiology resident encounters during the second year of training, and can provide objective data for program directors to assess the Accreditation Council for Graduate Medical Education milestones. We predict that this technology will begin to supplement traditional diagnostic radiology teaching methods and to improve patient care and safety in the next decade. Published by Elsevier Inc.
A Note on Improving Process Efficiency in Panel Surveys with Paradata
ERIC Educational Resources Information Center
Kreuter, Frauke; Müller, Gerrit
2015-01-01
Call scheduling is a challenge for surveys around the world. Unlike cross-sectional surveys, panel surveys can use information from prior waves to enhance call-scheduling algorithms. Past observational studies showed the benefit of calling panel cases at times that had been successful in the past. This article is the first to experimentally assign…
Efficient Ada multitasking on a RISC register window architecture
NASA Technical Reports Server (NTRS)
Kearns, J. P.; Quammen, D.
1987-01-01
This work addresses the problem of reducing context switch overhead on a processor which supports a large register file - a register file much like that which is part of the Berkeley RISC processors and several other emerging architectures (which are not necessarily reduced instruction set machines in the purest sense). Such a reduction in overhead is particularly desirable in a real-time embedded application, in which task-to-task context switch overhead may result in failure to meet crucial deadlines. A storage management technique by which a context switch may be implemented as cheaply as a procedure call is presented. The essence of this technique is the avoidance of the save/restore of registers on the context switch. This is achieved through analysis of the static source text of an Ada tasking program. Information gained during that analysis directs the optimized storage management strategy for that program at run time. A formal verification of the technique in terms of an operational control model and an evaluation of the technique's performance via simulations driven by synthetic Ada program traces are presented.
NASA Technical Reports Server (NTRS)
1979-01-01
One of the most comprehensive and most effective programs is NECAP, an acronym for NASA Energy Cost Analysis Program. Developed by Langley Research Center, NECAP operates according to heating/cooling calculation procedures formulated by the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE). The program enables examination of a multitude of influences on heat flow into and out of buildings. For example, NECAP considers traditional weather patterns for a given locale and predicts the effects on a particular building design of sun, rain, wind, even shadows from other buildings. It takes into account the mass of structural materials, insulating values, the type of equipment the building will house, equipment operating schedules, heat generated by people and machinery, heat loss or gain through windows and other openings, and a variety of additional details. NECAP ascertains how much energy the building should ideally require, aids selection of the most economical and most efficient energy systems, and suggests design and operational measures for reducing the building's energy needs. Most importantly, NECAP determines cost effectiveness: whether an energy-saving measure will pay back its installation cost through monetary savings in energy bills.
Vlaisavljevich, Bess; Shiozaki, Toru
2016-08-09
We report the development of the theory and computer program for analytical nuclear energy gradients for (extended) multistate complete active space perturbation theory (CASPT2) with full internal contraction. The vertical shifts are also considered in this work. This is an extension of the fully internally contracted CASPT2 nuclear gradient program recently developed for a state-specific variant by us [MacLeod and Shiozaki, J. Chem. Phys. 2015, 142, 051103]; in this extension, the so-called λ equation is solved to account for the variation of the multistate CASPT2 energies with respect to the change in the amplitudes obtained in the preceding state-specific CASPT2 calculations, and the Z vector equations are modified accordingly. The program is parallelized using the MPI-3 remote memory access protocol, which allows us to perform efficient one-sided communication. The optimized geometries of the ground and excited states of a copper corrole and benzophenone are presented as numerical examples. The code is publicly available under the GNU General Public License.
NASA Technical Reports Server (NTRS)
Brown, David B.
1990-01-01
The results of research and development efforts are described for Task one, Phase two of a general project entitled The Development of a Program Analysis Environment for Ada. The scope of this task includes the design and development of a prototype system for testing Ada software modules at the unit level. The system is called Query Utility Environment for Software Testing of Ada (QUEST/Ada). The prototype for condition coverage provides a platform that implements expert system interaction with program testing. The expert system can modify data in the instrumented source code in order to achieve coverage goals. Given this initial prototype, it is possible to evaluate the rule base in order to develop improved rules for test case generation. The goals of Phase two are the following: (1) to continue to develop and improve the current user interface to support the other goals of this research effort (i.e., those related to improved testing efficiency and increased code reliability); (2) to develop and empirically evaluate a succession of alternative rule bases for the test case generator such that the expert system achieves coverage in a more efficient manner; and (3) to extend the concepts of the current test environment to address the issues of Ada concurrency.
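The bookkeeping behind condition-coverage instrumentation can be sketched compactly: each instrumented condition reports its identifier and outcome, and coverage is complete when both outcomes of every condition have been observed. The class below is a toy Python illustration of that idea, not QUEST/Ada itself:

```python
class CoverageTracker:
    """Toy condition-coverage tracker: cond() wraps each boolean
    condition in the program under test, records the outcome, and
    passes the value through unchanged."""

    def __init__(self, n_conditions):
        self.seen = {i: set() for i in range(n_conditions)}

    def cond(self, cid, outcome):
        self.seen[cid].add(bool(outcome))
        return outcome                       # transparent to the program

    def coverage(self):
        # fraction of conditions with both True and False observed
        return sum(len(s) == 2 for s in self.seen.values()) / len(self.seen)
```

A test-case generator like QUEST/Ada's expert system would inspect which entries of `seen` are missing an outcome and steer new inputs toward them.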
Johnson, Bill
2014-01-01
Medical practices receive hundreds if not thousands of calls every week from patients, payers, pharmacies, and others. Outsourcing call centers can be a smart move to improve efficiency, lower costs, improve customer care, ensure proper payer management, and ensure regulatory compliance. This article discusses how to know when it's time to move to an outsourced call center, the benefits of making the move, how to choose the right call center, and how to make the transition. It also provides tips on how to manage the call center to ensure the objectives are being met.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldman, Charles A.; Stuart, Elizabeth; Hoffman, Ian
2011-02-25
Since the spring of 2009, billions of federal dollars have been allocated to state and local governments as grants for energy efficiency and renewable energy projects and programs. The scale of this American Recovery and Reinvestment Act (ARRA) funding, focused on 'shovel-ready' projects to create and retain jobs, is unprecedented. Thousands of newly funded players - cities, counties, states, and tribes - and thousands of programs and projects are entering the existing landscape of energy efficiency programs for the first time or expanding their reach. The nation's experience base with energy efficiency is growing enormously, fed by federal dollars and driven by broader objectives than saving energy alone. State and local officials made countless choices in developing portfolios of ARRA-funded energy efficiency programs and deciding how their programs would relate to existing efficiency programs funded by utility customers. Those choices are worth examining as bellwethers of a future world where there may be multiple program administrators and funding sources in many states. What are the opportunities and challenges of this new environment? What short- and long-term impacts will this large infusion of funds have on utility customer-funded programs; for example, on infrastructure for delivering energy efficiency services or on customer willingness to invest in energy efficiency? To what extent has the attribution of energy savings been a critical issue, especially where administrators of utility customer-funded energy efficiency programs have performance or shareholder incentives? Do the new ARRA-funded energy efficiency programs provide insights on roles or activities that are particularly well-suited to state and local program administrators vs. administrators or implementers of utility customer-funded programs? The answers could have important implications for the future of U.S. energy efficiency.
This report focuses on a selected set of ARRA-funded energy efficiency programs administered by state energy offices: the State Energy Program (SEP) formula grants, the portion of Energy Efficiency and Conservation Block Grant (EECBG) formula funds administered directly by states, and the State Energy Efficient Appliance Rebate Program (SEEARP). Since these ARRA programs devote significant monies to energy efficiency and serve markets similar to those of utility customer-funded programs, there are frequent interactions between programs. We exclude the DOE low-income weatherization program and EECBG funding awarded directly to the over 2,200 cities, counties and tribes from our study to keep its scope manageable. We summarize the energy efficiency program design and funding choices made by the 50 state energy offices, 5 territories and the District of Columbia. We then focus on the specific choices made in 12 case study states, selected based on the level of utility customer program funding, diversity of program administrator models, and geographic diversity. Based on interviews with more than 80 energy efficiency actors in those 12 states, we draw observations about states' strategies for the use of Recovery Act funds. We examine interactions between ARRA programs and utility customer-funded energy efficiency programs in terms of program planning, program design and implementation, policy issues, and potential long-term impacts. We consider how the existing regulatory policy framework and energy efficiency programs in these 12 states may have affected the development of these selected ARRA programs. Finally, we summarize key trends and highlight issues that evaluators of these ARRA programs may want to examine in more depth in their process and impact evaluations.
47 CFR 64.1503 - Termination of pay-per-call and other information programs.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 47 Telecommunication 3 2010-10-01 2010-10-01 false Termination of pay-per-call and other information programs. 64.1503 Section 64.1503 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED... interstate information service through any 800 telephone number, or other telephone number advertised or...
33 CFR 402.7 - Service Incentive Program.
Code of Federal Regulations, 2014 CFR
2014-07-01
... number of calls scheduled for the Navigation Season. Additional calls to the system may be added during the season. (f) The carrier will advise the Manager of port rotation, outlining core ports of calls... carrier must meet 75% schedule adherence with a minimum of four (4) Great Lakes calls during the...
Data-Driven Benchmarking of Building Energy Efficiency Utilizing Statistical Frontier Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kavousian, A; Rajagopal, R
2014-01-01
Frontier methods quantify the energy efficiency of buildings by forming an efficient frontier (best-practice technology) and comparing all buildings against that frontier. Because energy consumption fluctuates over time, the efficiency scores are random variables. Existing applications of frontier methods in energy efficiency either treat efficiency scores as deterministic values or estimate their uncertainty by resampling from one set of measurements. The availability of smart meter data (repeated measurements of the energy consumption of buildings) makes it possible to estimate the uncertainty in efficiency scores from actual data. Additionally, existing applications assume a linear form for the efficient frontier; i.e., they assume that the best-practice technology scales up and down proportionally with building characteristics. However, previous research shows that buildings are nonlinear systems. This paper proposes a statistical method called the stochastic energy efficiency frontier (SEEF) to estimate a bias-corrected efficiency score and its confidence intervals from measured data. The paper proposes an algorithm to specify the functional form of the frontier, identify the probability distribution of the efficiency score of each building using measured data, and rank buildings based on their energy efficiency. To illustrate the power of SEEF, this paper presents the results from applying SEEF to a smart meter data set of 307 residential buildings in the United States. SEEF efficiency scores are used to rank individual buildings based on energy efficiency, to compare subpopulations of buildings, and to identify irregular behavior of buildings across different time-of-use periods. SEEF is an improvement over the energy-intensity method (comparing kWh/sq. ft.): whereas SEEF identifies efficient buildings across the entire spectrum of building sizes, the energy-intensity method showed bias toward smaller buildings.
The results of this research are expected to assist researchers and practitioners in comparing and ranking (i.e., benchmarking) buildings more robustly and over a wider range of building types and sizes. Eventually, doing so is expected to result in improved resource allocation in energy-efficiency programs.
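As a rough illustration of the frontier idea (not the SEEF method itself, which is nonlinear, stochastic, and bias-corrected), a deterministic corrected-OLS frontier can be computed in a few lines. All names and the log-log functional form are illustrative assumptions:

```python
import numpy as np

def cols_frontier_scores(size, energy):
    """Corrected OLS (COLS) frontier sketch: fit log-energy on log-size,
    shift the fitted line down to envelop the best-practice observation,
    and score each building as frontier / actual (1.0 = on the frontier)."""
    X = np.column_stack([np.ones_like(size), np.log(size)])
    y = np.log(energy)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    frontier = X @ beta + resid.min()   # shift so no building beats it
    return np.exp(frontier - y)         # efficiency score in (0, 1]
```

SEEF replaces the single shifted fit with a distribution over scores estimated from repeated smart-meter measurements, which is what yields confidence intervals instead of point values.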
Guide to Operating and Maintaining EnergySmart Schools
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Through a commitment to high performance, school districts are discovering that smart energy choices can create lasting benefits for students, communities, and the environment. For example, an energy efficient school district with 4,000 students can save as much as $160,000 a year in energy costs. Over 10 years, those savings can reach $1.6 million, translating into the ability to hire more teachers, purchase more textbooks and computers, or invest in additional high performance facilities. Beyond these bottom-line benefits, schools can better foster student health, decrease absenteeism, and serve as centers of community life. The U.S. Department of Energy's EnergySmart Schools Program promotes a 30 percent improvement in existing school energy use. It also encourages the building of new schools that exceed code (ASHRAE 90.1-1999) by 50 percent or more. The program provides resources like this Guide to Operating and Maintaining EnergySmart Schools to assist school decision makers in planning, financing, operating, and maintaining energy efficient, high performance schools. It also offers education and training for building industry professionals. Operations and maintenance refer to all scheduled and unscheduled actions for preventing equipment failure or decline, with the goal of increasing efficiency, reliability, and safety. A preventative maintenance program is the organized and planned performance of maintenance activities in order to prevent system or production problems or failures from occurring. In contrast, deferred maintenance or reactive maintenance (also called diagnostic or corrective maintenance) is conducted to address an existing problem. This guide is a primary resource for developing and implementing a district- or school-wide operations and maintenance (O&M) program that focuses on energy efficiency. The EnergySmart Schools Solutions companion CD contains additional supporting information for design, renovation, and retrofit projects.
The objective of this guide is to provide organizational and technical information for integrating energy and high performance facility management into existing O&M practices. The guide allows users to adapt and implement suggested O&M strategies to address specific energy efficiency goals. It recognizes and expands on existing tools and resources that are widely used throughout the high performance school industry. External resources are referenced throughout the guide and are also listed within the EnergySmart Schools O&M Resource List (Appendix J). While this guide emphasizes the impact of the energy efficiency component of O&M, it encourages taking a holistic approach to maintaining a high-performance school. This includes considering various environmental factors where energy plays an indirect or direct role. For example, indoor air quality, site selection, building orientation, and water efficiency should be considered. Resources to support these overlapping aspects will be cited throughout the guide.
Design, construction, operation, and evaluation of a prototype culm combustion boiler/heater unit
DOE Office of Scientific and Technical Information (OSTI.GOV)
D'Aciermo, J.; Richards, H.; Spindler, F.
1983-10-01
A process for utilizing anthracite culm in a fluidized bed combustion system was demonstrated by the design and construction of a prototype steam plant at Shamokin, PA, and operation of the plant for parametric tests and a nine-month extended durability test. The parametric tests evaluated turndown capability of the plant and established turndown techniques to be used to achieve best performance. Throughout the test program the fluidized bed boiler durability was excellent, showing very high resistance to corrosion and erosion. A series of 39 parametric tests was performed in order to demonstrate turndown capabilities of the atmospheric fluidized bed boiler burning anthracite culm. Four tests were performed with bituminous coal waste (called gob), which contains 4.8 to 5.5% sulfur. The heating value of both fuels is approximately 3000 Btu/lb and the ash content is approximately 70%. Combustion efficiency, boiler efficiency, and emissions of NOx and SO2 were also determined for the tests.
Automated Array Assembly, Phase 2
NASA Technical Reports Server (NTRS)
Carbajal, B. G.
1979-01-01
The solar cell module process development activities in the areas of surface preparation are presented. Process step development was carried out on texture etching, including the evolution of a conceptual process model for the texturing process; plasma etching; and diffusion studies that focused on doped polymer diffusion sources. Cell processing was carried out to test process steps, and a simplified diode solar cell process was developed. Cell processing runs were also used to fabricate square cells to populate sample minimodules. Module fabrication featured the demonstration of a porcelainized steel glass structure that should exceed the 20-year life goal of the low cost silicon array program. High-efficiency cell development focused on the tandem junction cell (TJC) and a modification of the TJC called the front surface field cell. Cell efficiencies in excess of 16 percent at AM1 have been attained with only modest fill factors. A transistor-like model was proposed that fits the cell performance and provides a guideline for future improvements in cell performance.
Optimization of composite sandwich cover panels subjected to compressive loadings
NASA Technical Reports Server (NTRS)
Cruz, Juan R.
1991-01-01
An analysis and design method is presented for the design of composite sandwich cover panels that includes transverse shear effects and damage tolerance considerations. This method is incorporated into an optimization program called SANDOP (SANDwich OPtimization). SANDOP is used in the present study to design optimized composite sandwich cover panels for transport aircraft wing applications as a demonstration of its capabilities. The results of this design study indicate that optimized composite sandwich cover panels have approximately the same structural efficiency as stiffened composite cover panels designed to identical constraints. Results indicate that in-plane stiffness requirements have a large effect on the weight of these composite sandwich cover panels at higher load levels. Increasing the maximum allowable strain and the upper percentage limit of the 0 degree and plus or minus 45 degree plies can yield significant weight savings. The results show that the structural efficiency of these optimized composite sandwich cover panels is relatively insensitive to changes in core density.
Code of Federal Regulations, 2010 CFR
2010-10-01
... for improving Medicare program efficiency and to reward suggesters for monetary savings. 420.410... Program Efficiency and to Reward Suggesters for Monetary Savings § 420.410 Establishment of a program to collect suggestions for improving Medicare program efficiency and to reward suggesters for monetary...
Code of Federal Regulations, 2011 CFR
2011-10-01
... for improving Medicare program efficiency and to reward suggesters for monetary savings. 420.410... Program Efficiency and to Reward Suggesters for Monetary Savings § 420.410 Establishment of a program to collect suggestions for improving Medicare program efficiency and to reward suggesters for monetary...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ecale Zhou, Carol L.
2016-07-05
Compare Gene Calls (CGC) is a Python code used for combining and comparing gene calls from any number of gene callers. A gene caller is a computer program that predicts the extents of open reading frames within the genomes of biological organisms.
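The core bookkeeping in a gene-call comparison can be sketched in a few lines of Python. This is not the actual CGC code; the tuple format, caller names, and coordinates below are hypothetical illustrations of the general idea of tallying which callers support each predicted open reading frame.

```python
from collections import defaultdict

def compare_gene_calls(calls_by_caller):
    """Group identical gene calls across callers.

    calls_by_caller maps a caller name to a list of
    (contig, start, end, strand) tuples. Returns a dict mapping each
    unique call to the set of callers that predicted it.
    """
    support = defaultdict(set)
    for caller, calls in calls_by_caller.items():
        for call in calls:
            support[call].add(caller)
    return dict(support)

# Two hypothetical callers agreeing on one ORF and disagreeing on another.
calls = {
    "caller_a": [("contig1", 100, 400, "+"), ("contig1", 600, 900, "-")],
    "caller_b": [("contig1", 100, 400, "+"), ("contig1", 650, 900, "-")],
}
support = compare_gene_calls(calls)
consensus = [c for c, who in support.items() if len(who) == 2]
print(consensus)  # [('contig1', 100, 400, '+')]
```

A real comparison would also need rules for near-matches (e.g., calls sharing a stop codon but differing in start), which this exact-match sketch ignores.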
Chang, Larry William; Kagaayi, Joseph; Nakigozi, Gertrude; Galiwango, Ronald; Mulamba, Jeremiah; Ludigo, James; Ruwangula, Andrew; Gray, Ronald H; Quinn, Thomas C; Bollinger, Robert C; Reynolds, Steven J
2008-01-01
Hotlines and warmlines have been successfully used in the developed world to provide clinical advice; however, reports on their replicability in resource-limited settings are limited. A warmline was established in Rakai, Uganda, to support an antiretroviral therapy program. Over a 17-month period, a database was kept of who called, why they called, and the result of the call. A program evaluation was also administered to clinical staff. A total of 1303 calls (3.5 calls per weekday) were logged. The warmline was used mostly by field staff and peripherally based peer health workers. Calls addressed important clinical issues, including the need for urgent care, medication side effects, and follow-up needs. Most clinical staff felt that the warmline made their jobs easier and improved the health of patients. An HIV/AIDS warmline leveraged the skills of a limited workforce to provide increased access to HIV/AIDS care, advice, and education.
NASA Technical Reports Server (NTRS)
Agrawal, Gagan; Sussman, Alan; Saltz, Joel
1993-01-01
Scientific and engineering applications often involve structured meshes. These meshes may be nested (for multigrid codes) and/or irregularly coupled (called multiblock or irregularly coupled regular mesh problems). A combined runtime and compile-time approach for parallelizing these applications on distributed memory parallel machines in an efficient and machine-independent fashion was described. A runtime library which can be used to port these applications to distributed memory machines was designed and implemented. The library is currently implemented on several different systems. To further ease the task of application programmers, methods were developed for integrating this runtime library with compilers for HPF-like parallel programming languages. How this runtime library was integrated with the Fortran 90D compiler being developed at Syracuse University is discussed. Experimental results to demonstrate the efficacy of our approach are presented. A multiblock Navier-Stokes solver template and a multigrid code were experimented with. Our experimental results show that our primitives have low runtime communication overheads. Further, the compiler-parallelized codes perform within 20 percent of the code parallelized by manually inserting calls to the runtime library.
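The kind of communication primitive such a runtime library provides can be illustrated with a serial Python sketch of ghost-cell (halo) exchange on a one-dimensional block decomposition. This is an assumption-laden toy, not the library's actual API; in the real system each copy would be a message-passing operation between processors owning neighboring subdomains.

```python
def exchange_ghost_cells(subdomains):
    """Fill one-cell ghost regions from neighboring subdomains (1-D).

    Each subdomain is a list [ghost_lo, interior..., ghost_hi]. This serial
    sketch mimics what a distributed-memory runtime primitive does: copy each
    neighbor's boundary interior cell into the local ghost cell so a stencil
    can be applied to every interior cell without special boundary cases.
    """
    for i, sub in enumerate(subdomains):
        if i > 0:                        # left neighbor's last interior cell
            sub[0] = subdomains[i - 1][-2]
        if i < len(subdomains) - 1:      # right neighbor's first interior cell
            sub[-1] = subdomains[i + 1][1]
    return subdomains

# Three subdomains holding interior values 1..6, ghost cells initialized to 0.
halos = exchange_ghost_cells([[0, 1, 2, 0], [0, 3, 4, 0], [0, 5, 6, 0]])
print(halos)  # [[0, 1, 2, 3], [2, 3, 4, 5], [4, 5, 6, 0]]
```

The "self-similar" appeal of such primitives is that the same local stencil code runs unchanged on each subdomain once the ghost cells are filled.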
A Concept for Run-Time Support of the Chapel Language
NASA Technical Reports Server (NTRS)
James, Mark
2006-01-01
A document presents a concept for run-time implementation of other concepts embodied in the Chapel programming language. (Now undergoing development, Chapel is intended to become a standard language for parallel computing that would surpass older such languages in both computational performance and in the efficiency with which pre-existing code can be reused and new code written.) The aforementioned other concepts are those of distributions, domains, allocations, and access, as defined in a separate document called "A Semantic Framework for Domains and Distributions in Chapel" and linked to a language specification defined in another separate document called "Chapel Specification 0.3." The concept presented in the instant report is recognition that a data domain that was invented for Chapel offers a novel approach to distributing and processing data in a massively parallel environment. The concept is offered as a starting point for development of working descriptions of functions and data structures that would be necessary to implement interfaces to a compiler for transforming the aforementioned other concepts from their representations in Chapel source code to their run-time implementations.
Kirsch, Sallie Davis; Wilson, Lauren S; Harkins, Michelle; Albin, Dawn; Del Beccaro, Mark A
2015-01-01
The primary aim of this intervention was to assess the feasibility of using call center nurses who are experts in telephone triage to conduct post-discharge telephone calls, as part of a quality improvement effort to prevent hospital readmission. Families of patients with bronchiolitis were called between 24 and 48 hours after discharge. The calls conducted by the nurses were efficient (average time was 12 minutes), and their assessments helped to identify gaps in inpatient family education. Overall, the project demonstrated the efficacy of using nurses who staff a call center to conduct post-hospitalization telephone calls in preventing readmission. Copyright © 2015 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Price, Phillip N.
2014-11-01
Snohomish County Public Utilities District (the District or Snohomish PUD) provides electricity to about 325,000 customers in Snohomish County, Washington. The District has an incentive program to encourage commercial customers to improve energy efficiency: the District partially reimburses the cost of approved retrofits if they provide a level of energy performance improvement that is specified by contract. In 2013 the District contracted with Lawrence Berkeley National Laboratory to provide a third-party review of the Monitoring and Verification (M&V) practices the District uses to evaluate whether companies are meeting their contractual obligations. This work helps LBNL understand the challenges faced by real-world practitioners of M&V of energy savings, and builds on a body of related work such as Price et al. (2013). The District selected a typical project for which they had already performed an evaluation. The present report includes the District's original evaluation as well as LBNL's review of their approach. The review is based on the document itself; on investigation of the load data and outdoor air temperature data from the building evaluated in the document; and on phone discussions with Bill Harris of the Snohomish County Public Utilities District. We will call the building studied in the document the subject building; the original Snohomish PUD report will be referred to as the Evaluation; and this discussion by LBNL is called the Review.
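A common M&V baseline technique of the sort such a review would examine is a weather-normalized regression: fit pre-retrofit load against outdoor air temperature, then compare post-retrofit metered load with what the baseline predicts. The sketch below is a generic illustration with made-up numbers, not the District's or LBNL's actual method.

```python
def fit_linear_baseline(temps, loads):
    """Ordinary least-squares fit of load = a + b * temperature.

    temps and loads are paired pre-retrofit observations (e.g., daily
    average outdoor air temperature and daily average kW).
    """
    n = len(temps)
    mt = sum(temps) / n
    ml = sum(loads) / n
    b = (sum((t - mt) * (l - ml) for t, l in zip(temps, loads))
         / sum((t - mt) ** 2 for t in temps))
    a = ml - b * mt
    return a, b

def avoided_energy(model, temps, actual_loads):
    """Sum of (baseline-predicted minus actual) load over the post period."""
    a, b = model
    return sum((a + b * t) - l for t, l in zip(temps, actual_loads))

# Hypothetical pre-retrofit data where load tracks temperature exactly.
model = fit_linear_baseline([50, 60, 70, 80], [100, 120, 140, 160])
# Post-retrofit: actual loads fall below the weather-adjusted baseline.
saved = avoided_energy(model, [55, 75], [95, 135])
print(saved)  # 30.0
```

Real M&V protocols add change-point (heating/cooling balance temperature) models and goodness-of-fit screens; a single linear segment is only the simplest case.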
Answering the Call: How Group Mentoring Makes a Difference
ERIC Educational Resources Information Center
Altus, Jillian
2015-01-01
Mentoring programs answer the call for social justice for many students who are in success-inhibiting environments. This study employed a case study design to investigate the perceived benefits from a group mentoring program. Data were collected from pre- and post-assessments, focus groups, and artifacts. Four participant benefits were revealed:…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-31
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Call for Co-Sponsors for Office of Healthcare Quality's Programs to Strengthen Coordination and Impact National Efforts in the Prevention of Healthcare-Associated... Health and Science, Office of Healthcare Quality. ACTION: Notice. SUMMARY: The U.S. Department of Health...
76 FR 54240 - National Institute of Allergy and Infectious Diseases; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-31
...: Robert G. Keefe, PhD, Scientific Review Officer, Scientific Review Program, DEA/NIAID/NIH/DHHS, Room 3256... Conference Call). Contact Person: Robert G. Keefe, PhD, Scientific Review Officer, Scientific Review Program... Drive, Bethesda, MD 20817 (Telephone Conference Call). Contact Person: Robert G. Keefe, PhD, Scientific...
Integrating CALL into the Classroom: The Role of Podcasting in an ESL Listening Strategies Course
ERIC Educational Resources Information Center
O'Brien, Anne; Hegelheimer, Volker
2007-01-01
Despite the increase of teacher preparation programs that emphasize the importance of training teachers to select and develop appropriate computer-assisted language learning (CALL) materials, integration of CALL into classroom settings is still frequently relegated to the use of selected CALL activities to supplement instruction or to provide…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-12
... DEPARTMENT OF ENERGY [Docket No. EESEP0216] State Energy Program and Energy Efficiency and Conservation Block Grant (EECBG) Program; Request for Information AGENCY: Office of Energy Efficiency and... (SEP) and Energy Efficiency and Conservation Block Grant (EECBG) program, in support of energy...
Sgaier, Sema K.; Reed, Jason B.; Thomas, Anne; Njeuhmeli, Emmanuel
2014-01-01
Voluntary medical male circumcision (VMMC) is capable of reducing the risk of sexual transmission of HIV from females to males by approximately 60%. In 2007, the WHO and the Joint United Nations Programme on HIV/AIDS (UNAIDS) recommended making VMMC part of a comprehensive HIV prevention package in countries with a generalized HIV epidemic and low rates of male circumcision. Modeling studies undertaken in 2009–2011 estimated that circumcising 80% of adult males in 14 priority countries in Eastern and Southern Africa within five years, and sustaining coverage levels thereafter, could avert 3.4 million HIV infections within 15 years and save US$16.5 billion in treatment costs. In response, WHO/UNAIDS launched the Joint Strategic Action Framework for accelerating the scale-up of VMMC for HIV prevention in Southern and Eastern Africa, calling for 80% coverage of adult male circumcision by 2016. While VMMC programs have grown dramatically since inception, they appear unlikely to reach this goal. This review provides an overview of findings from the PLOS Collection “Voluntary Medical Male Circumcision for HIV Prevention: Improving Quality, Efficiency, Cost Effectiveness, and Demand for Services during an Accelerated Scale-up.” The use of devices for VMMC is also explored. We propose emphasizing management solutions to help VMMC programs in the priority countries achieve the desired impact of averting the greatest possible number of HIV infections. Our recommendations include advocating for prioritization and funding of VMMC, increasing strategic targeting to achieve the goal of reducing HIV incidence, focusing on programmatic efficiency, exploring the role of new technologies, rethinking demand creation, strengthening data use for decision-making, improving governments' program management capacity, strategizing for sustainability, and maintaining a flexible scale-up strategy informed by a strong monitoring, learning, and evaluation platform. PMID:24800840
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeLaski, A.; Gauthier, J.; Shugars, J.
Distribution transformers offer a largely untapped opportunity for efficiency improvements in buildings. Application of energy-efficient equipment can reduce transformer losses by about 20%, substantially cutting a facility's total electricity bill and offering typical paybacks of less than three years. Since nearly all of the electricity powering the commercial and industrial sectors is stepped down in voltage by facility-owned distribution transformers, broad application of energy-efficient equipment will lead to huge economy-wide energy and dollar savings as well as associated environmental benefits. This opportunity has led to a multi-party coordinated effort that offers a new model for national partnerships to pursue market transformation. The model, called the Informal Collaborative Model for the purposes of this paper, is characterized by voluntary commitments of multiple stakeholders to carry out key market interventions in a coordinated fashion, but without pooling resources or control. Collaborative participants are joined by a common interest in establishing and expanding the market for a new product, service, or practice that will yield substantial energy savings. This paper summarizes the technical efficiency opportunity available in distribution transformers; discusses the market barriers to widespread adoption of energy-efficient transformers; and details an overall market transformation strategy to address the identified market barriers. The respective roles of each of the diverse players (manufacturers, government agencies, and utility and regional energy efficiency programs) are given particular attention. Each of the organizations involved brings a particular set of tools and capabilities for addressing the market barriers to more efficient transformers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldberg, Amelie; Hedman, Bruce; Taylor, Robert P.
Many states have implemented ratepayer-funded programs to acquire energy efficiency as a predictable and reliable resource for meeting existing and future energy demand. These programs have become a fixture in many U.S. electricity and natural gas markets as they help postpone or eliminate the need for expensive generation and transmission investments. Industrial energy efficiency (IEE) is an energy efficiency resource that is not only a low cost option for many of these efficiency programs, but offers productivity and competitive benefits to manufacturers as it reduces their energy costs. However, some industrial customers are less enthusiastic about participating in these programs. IEE ratepayer programs suffer low participation by industries across many states today despite a continual increase in energy efficiency program spending across all types of customers, and significant energy efficiency funds can often go unused for industrial customers. This paper provides four detailed case studies of companies that benefited from participation in their utility’s energy efficiency program offerings and highlights the business value brought to them by participation in these programs. The paper is designed both for rate-payer efficiency program administrators interested in improving the attractiveness and effectiveness of industrial efficiency programs for their industrial customers and for industrial customers interested in maximizing the value of participating in efficiency programs.
SCHOOL-BASED PROMOTION OF FRUIT AND VEGETABLE CONSUMPTION IN MULTICULTURALLY DIVERSE, URBAN SCHOOLS
BLOM-HOFFMAN, JESSICA
2009-01-01
Rates of childhood overweight have reached epidemic proportions (U.S. Department of Health and Human Services, 2001), and schools have been called on to play a role in the prevention of this medical condition. This article describes a multiyear health promotion effort—the Athletes in Service fruit and vegetable (F&V) promotion program—which is based on social learning theory for urban, elementary school children in kindergarten through third grade. Children participate in the program for a period of 3 years. The goals of the program are to increase opportunities for children to be more physically active during the school day and to help students increase their F&V consumption. This article describes the F&V promotion components of the program that were implemented in year 1, including implementation integrity and treatment acceptability data. Year 1 evaluation data demonstrated that the program is acceptable from the perspective of school staff and was implemented by school staff with high levels of integrity. Hallmarks of the program’s successful implementation and high acceptability include (a) having a school-based program champion; (b) designing the program to include low-cost, attractive, interactive materials; (c) including many school staff members to facilitate a culture of healthy eating in the school; and (d) spreading out implementation responsibilities among the multiple staff members so that each individual’s involvement is time efficient. PMID:19834582
Ackerman, Sara L; Boscardin, Christy; Karliner, Leah; Handley, Margaret A; Cheng, Sarah; Gaither, Thomas W; Hagey, Jill; Hennein, Lauren; Malik, Faizan; Shaw, Brian; Trinidad, Norver; Zahner, Greg; Gonzales, Ralph
2016-01-01
Systems-based practice focuses on the organization, financing, and delivery of medical services. The American Association of Medical Colleges has recommended that systems-based practice be incorporated into medical schools' curricula. However, experiential learning in systems-based practice, including practical strategies to improve the quality and efficiency of clinical care, is often absent from or inconsistently included in medical education. A multidisciplinary clinician and nonclinician faculty team partnered with a cardiology outpatient clinic to design a 9-month clerkship for 1st-year medical students focused on systems-based practice, delivery of clinical care, and strategies to improve the quality and efficiency of clinical operations. The clerkship was called the Action Research Program. In 2013-2014, 8 trainees participated in educational seminars, research activities, and 9-week clinic rotations. A qualitative process and outcome evaluation drew on interviews with students, clinic staff, and supervising physicians, as well as students' detailed field notes. The Action Research Program was developed and implemented at the University of California, San Francisco, an academic medical center in the United States. All educational activities took place at the university's medical school and at the medical center's cardiology outpatient clinic. Students reported and demonstrated increased understanding of how care delivery systems work, improved clinical skills, growing confidence in interactions with patients, and appreciation for patients' experiences. Clinicians reported increased efficiency at the clinic level and improved performance and job satisfaction among medical assistants as a result of their unprecedented mentoring role with students. Some clinicians felt burdened when students shadowed them and asked questions during interactions with patients. Most student-led improvement projects were not fully implemented. 
The Action Research Program is a small pilot project that demonstrates an innovative pairing of experiential and didactic training in systems-based practice. Lessons learned include the need for dedicated time and faculty support for students' improvement projects, which were the least successful aspect of the program. We recommend that future projects aiming to combine clinical training and quality improvement projects designate distinct blocks of time for trainees to pursue each of these activities independently. In 2014-2015, the University of California, San Francisco School of Medicine incorporated key features of the Action Research Program into the standard curriculum, with plans to build upon this foundation in future curricular innovations.
75 FR 31458 - Infrastructure Protection Data Call Survey
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-03
...-0022] Infrastructure Protection Data Call Survey AGENCY: National Protection and Programs Directorate... New Information Collection Request, Infrastructure Protection Data Call Survey. DHS previously... territories are able to achieve this mission, IP requests opinions and information in a survey from IP Data...
Bennett, Jeffrey I; Dzara, Kristina; Mazhar, Mir Nadeem; Behere, Aniruddh
2011-03-01
The Accreditation Council for Graduate Medical Education (ACGME) requirements stipulate that psychiatry residents need to be educated in the area of emergency psychiatry. Existing research investigating the current state of this training is limited, and no research to date has assessed whether the ACGME Residency Review Committee requirements for psychiatry residency training are followed by psychiatry residency training programs. We administered, to chief resident attendees of a national leadership conference, a 24-item paper survey on the types and amount of emergency psychiatry training provided by their psychiatric residency training programs. Descriptive statistics were used in the analysis. Of 154 surveys distributed, 111 were returned (72% response rate). Nearly one-third of chief resident respondents indicated that more than 50% of their program's emergency psychiatry training was provided during on-call periods. A minority indicated that they were aware of the ACGME program requirements for emergency psychiatry training. While training in emergency psychiatry occurred in many programs through rotations-different from the on-call period-direct supervision was available during on-call training only about one-third of the time. The findings suggest that about one-third of psychiatry residency training programs do not adhere to the ACGME standards for emergency psychiatry training. Enhanced knowledge of the ACGME requirements may enhance psychiatry residents' understanding on how their programs are fulfilling the need for more emergency psychiatry training. Alternative settings to the on-call period for emergency psychiatry training are more likely to provide for direct supervision.
2007-11-01
Engineering Research Laboratory is currently developing a set of facility ‘architectural’ programming tools, called Facility Composer™ (FC). FC...requirements in the early phases of project development. As the facility program, criteria, and requirements are chosen, these tools populate the IFC...developing a set of facility “architectural” programming tools, called Facility Composer (FC), to support the capture and tracking of facility criteria
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The report is an overview of electric energy efficiency programs. It takes a concise look at what states are doing to encourage energy efficiency and how it impacts electric utilities. Energy efficiency programs began to be offered by utilities as a response to the energy crises of the 1970s. These regulatory-driven programs peaked in the early 1990s and then tapered off as deregulation took hold. Today, rising electricity prices, environmental concerns, and national security issues have renewed interest in increasing energy efficiency as an alternative to additional supply. In response, new methods for administering, managing, and delivering energy efficiency programs are being implemented. Topics covered in the report include: analysis of the benefits of energy efficiency and key methods for achieving energy efficiency; evaluation of the business drivers spurring increased energy efficiency; discussion of the major barriers to expanding energy efficiency programs; evaluation of the economic impacts of energy efficiency; discussion of the history of electric utility energy efficiency efforts; analysis of the impact of energy efficiency on utility profits and methods for protecting profitability; discussion of non-utility management of energy efficiency programs; evaluation of major methods to spur energy efficiency (systems benefit charges, resource planning, and resource standards); and analysis of the alternatives for encouraging customer participation in energy efficiency programs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mallay, D.; Wiehagen, J.
2014-07-01
Winchester/Camberley Homes collaborated with the Building America program and its NAHB Research Center Industry Partnership to develop a new set of high performance home designs that could be applicable on a production scale. The new home designs are to be constructed in mixed-humid climate zone four and could eventually apply to all of the builder's home designs to meet or exceed future energy codes or performance-based programs. However, the builder recognized that the combination of new wall framing designs and materials, higher levels of insulation in the wall cavity, and more detailed air sealing to achieve lower infiltration rates changes the moisture characteristics of the wall system. In order to ensure long term durability and repeatable successful implementation with few call-backs, this report demonstrates through measured data that the wall system functions as a dynamic system, responding to changing interior and outdoor environmental conditions within recognized limits of the materials that make up the wall system. A similar investigation was made with respect to the complete redesign of the heating, cooling, air distribution, and ventilation systems intended to optimize the equipment size and configuration to significantly improve efficiency while maintaining indoor comfort. Recognizing the need to demonstrate the benefits of these efficiency features, the builder offered a new house model to serve as a test case to develop framing designs, evaluate material selections and installation requirements, changes to work scopes and contractor learning curves, as well as to compare theoretical performance characteristics with measured results.
NASA Technical Reports Server (NTRS)
Sang, Janche
2003-01-01
Within NASA's Aviation Safety Program, NASA GRC participates in the Modeling and Simulation Project called ASMM. NASA GRC's focus is to characterize propulsion system performance from a fleet management and maintenance perspective by using modeling and simulation to predict the characteristics of two classes of commercial engines (CFM56 and GE90). In prior years, the High Performance Computing and Communication (HPCC) program funded NASA Glenn to develop large-scale, detailed simulations for the analysis and design of aircraft engines, called the Numerical Propulsion System Simulation (NPSS). Three major aspects of this modeling (the integration of different engine components, the coupling of multiple disciplines, and engine component zooming at the appropriate level of fidelity) require relatively tight coupling of different analysis codes. Most of these codes in aerodynamics and solid mechanics are written in Fortran. Refitting these legacy Fortran codes with distributed objects can increase their reusability. Aviation Safety's modeling and simulation use in characterizing fleet management has similar needs. The modeling and simulation of these propulsion systems use existing Fortran and C codes that are instrumental in determining the performance of the fleet. The research centers on building a CORBA-based development environment for programmers to easily wrap and couple legacy Fortran codes. This environment consists of a C++ wrapper library to hide the details of CORBA and an efficient remote variable scheme to facilitate data exchange between the client and the server model. Additionally, a Web Service model should also be constructed for evaluation of this technology's use over the next two to three years.
HITRAN2016: new and improved data and tools towards studies of planetary atmospheres
NASA Astrophysics Data System (ADS)
Gordon, Iouli; Rothman, Laurence S.; Wilzewski, Jonas S.; Kochanov, Roman V.; Hill, Christian; Tan, Yan; Wcislo, Piotr
2016-10-01
The HITRAN2016 molecular spectroscopic database is scheduled to be released this year. It will replace the current edition, HITRAN2012 [1], which has been in use, along with some intermediate updates, since 2012. We have added, revised, and improved many transitions and bands of molecular species and their isotopologues. The number of parameters has also been significantly increased, now incorporating, for instance, broadening by He, H2, and CO2, which are dominant in different planetary atmospheres [2]; non-Voigt line profiles [3]; and other phenomena. This poster will provide a summary of the updates, emphasizing details of some of the most important or drastic improvements or additions. To allow flexible incorporation of the new parameters and improve the efficiency of database usage, the whole database has been reorganized into a relational database structure and presented to the user by means of a very powerful, easy-to-use internet program called HITRANonline [4] accessible at
2015-11-24
This final rule implements a new Medicare Part A and B payment model under section 1115A of the Social Security Act, called the Comprehensive Care for Joint Replacement (CJR) model, in which acute care hospitals in certain selected geographic areas will receive retrospective bundled payments for episodes of care for lower extremity joint replacement (LEJR) or reattachment of a lower extremity. All related care within 90 days of hospital discharge from the joint replacement procedure will be included in the episode of care. We believe this model will further our goals in improving the efficiency and quality of care for Medicare beneficiaries with these common medical procedures.
A computationally efficient modelling of laminar separation bubbles
NASA Technical Reports Server (NTRS)
Dini, Paolo; Maughmer, Mark D.
1989-01-01
The goal is to accurately predict the characteristics of the laminar separation bubble and its effects on airfoil performance. Toward this end, a computational model of the separation bubble was developed and incorporated into the Eppler and Somers airfoil design and analysis program. Thus far, the focus of the research was limited to the development of a model which can accurately predict situations in which the interaction between the bubble and the inviscid velocity distribution is weak, the so-called short bubble. A summary of the research performed in the past nine months is presented. The bubble model in its present form is then described. Lastly, the performance of this model in predicting bubble characteristics is shown for a few cases.
Heliostat cost optimization study
NASA Astrophysics Data System (ADS)
von Reeken, Finn; Weinrebe, Gerhard; Keck, Thomas; Balz, Markus
2016-05-01
This paper presents a methodology for a heliostat cost optimization study. First, different variants of small, medium-sized, and large heliostats are designed. Then the respective costs, tracking, and optical quality are determined. For the calculation of optical quality, a structural model of the heliostat is programmed and analyzed using finite element software. The costs are determined based on inquiries and from experience with similar structures. Eventually, the levelised cost of electricity (LCOE) for a reference power tower plant is calculated. Before each annual simulation run, the heliostat field is optimized. Calculated LCOEs are then used to identify the most suitable option(s). Finally, the conclusions and findings of this extensive cost study are used to define the concept of a new cost-efficient heliostat called `Stellio'.
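The ranking step the study describes can be sketched with the standard capital-recovery formula for LCOE. The cost and energy figures below are hypothetical placeholders, not values from the study:

```python
def lcoe(capex, opex_per_year, annual_energy_mwh, discount_rate, lifetime_years):
    """Levelised cost of electricity ($/MWh) via the capital recovery factor."""
    crf = (discount_rate * (1 + discount_rate) ** lifetime_years) / (
        (1 + discount_rate) ** lifetime_years - 1
    )
    return (capex * crf + opex_per_year) / annual_energy_mwh

# Compare two hypothetical heliostat-field variants feeding the same plant:
# the cheaper field collects slightly less energy per year.
small = lcoe(capex=180e6, opex_per_year=3.0e6, annual_energy_mwh=450_000,
             discount_rate=0.07, lifetime_years=25)
large = lcoe(capex=170e6, opex_per_year=3.2e6, annual_energy_mwh=430_000,
             discount_rate=0.07, lifetime_years=25)
```

Whichever variant yields the lower LCOE for the reference plant would be the "most suitable option" in the paper's sense.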
O'keefe, Matthew; Parr, Terence; Edgar, B. Kevin; ...
1995-01-01
Massively parallel processors (MPPs) hold the promise of extremely high performance that, if realized, could be used to study problems of unprecedented size and complexity. One of the primary stumbling blocks to this promise has been the lack of tools to translate application codes to MPP form. In this article we show how application codes written in a subset of Fortran 77, called Fortran-P, can be translated to achieve good performance on several massively parallel machines. This subset can express codes that are self-similar, where the algorithm applied to the global data domain is also applied to each subdomain. We have found many codes that match the Fortran-P programming style and have converted them using our tools. We believe a self-similar coding style will accomplish what a vectorizable style has accomplished for vector machines by allowing the construction of robust, user-friendly, automatic translation systems that increase programmer productivity and generate fast, efficient code for MPPs.
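The self-similar property, the same kernel applied to the whole domain and to each subdomain (plus halo cells), can be illustrated with a small sketch. This is a toy smoothing kernel for illustration only, not Fortran-P or its translator's output:

```python
def smooth(grid):
    """One Jacobi-style smoothing pass; endpoints are held fixed.
    The same kernel is used for the global domain and for each subdomain."""
    n = len(grid)
    return [grid[i] if i in (0, n - 1)
            else (grid[i - 1] + grid[i] + grid[i + 1]) / 3.0
            for i in range(n)]

def smooth_partitioned(grid, nparts):
    """Apply the identical kernel to each subdomain with one-cell halos,
    mimicking the self-similar style: subdomain logic == global logic."""
    n = len(grid)
    size = n // nparts
    out = list(grid)
    for p in range(nparts):
        lo = p * size
        hi = n if p == nparts - 1 else (p + 1) * size
        halo_lo, halo_hi = max(0, lo - 1), min(n, hi + 1)
        local = smooth(grid[halo_lo:halo_hi])
        # Copy back only the interior of the local result; halos are discarded.
        for i in range(lo, hi):
            out[i] = local[i - halo_lo]
    return out
```

Because the subdomain computation is literally the global computation on a smaller grid, the partitioned result matches the global pass exactly, which is what makes this style mechanically translatable to an MPP.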
Programming Native CRISPR Arrays for the Generation of Targeted Immunity.
Hynes, Alexander P; Labrie, Simon J; Moineau, Sylvain
2016-05-03
The adaptive immune system of prokaryotes, called CRISPR-Cas (clustered regularly interspaced short palindromic repeats and CRISPR-associated genes), results in specific cleavage of invading nucleic acid sequences recognized by the cell's "memory" of past encounters. Here, we exploited the properties of native CRISPR-Cas systems to program the natural "memorization" process, efficiently generating immunity not only to a bacteriophage or plasmid but to any specifically chosen DNA sequence. CRISPR-Cas systems have entered the public consciousness as genome editing tools due to their readily programmable nature. In industrial settings, natural CRISPR-Cas immunity is already exploited to generate strains resistant to potentially disruptive viruses. However, the natural process by which bacteria acquire new target specificities (adaptation) is difficult to study and manipulate. The target against which immunity is conferred is selected stochastically. By biasing the immunization process, we offer a means to generate customized immunity, as well as provide a new tool to study adaptation. Copyright © 2016 Hynes et al.
Penalty Dynamic Programming Algorithm for Dim Targets Detection in Sensor Systems
Huang, Dayu; Xue, Anke; Guo, Yunfei
2012-01-01
In order to detect and track multiple maneuvering dim targets in sensor systems, an improved dynamic programming track-before-detect algorithm (DP-TBD) called penalty DP-TBD (PDP-TBD) is proposed. The performances of tracking techniques are used as a feedback to the detection part. The feedback is constructed by a penalty term in the merit function, and the penalty term is a function of the possible target state estimation, which can be obtained by the tracking methods. With this feedback, the algorithm combines traditional tracking techniques with DP-TBD and it can be applied to simultaneously detect and track maneuvering dim targets. Meanwhile, a reasonable constraint that a sensor measurement can originate from one target or clutter is proposed to minimize track separation. Thus, the algorithm can be used in the multi-target situation with unknown target numbers. The efficiency and advantages of PDP-TBD compared with two existing methods are demonstrated by several simulations. PMID:22666074
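The core DP-TBD recursion with a penalty term in the merit function can be sketched on a 1-D measurement grid. The simple jump penalty below stands in for the paper's state-estimation feedback term; cell values, weights, and function names are illustrative assumptions:

```python
def dp_tbd(frames, max_step=2, penalty_weight=0.5):
    """Dynamic-programming track-before-detect on a 1-D grid.

    frames[k][x] is the measurement energy in cell x at frame k.  The merit
    accumulates energy along candidate trajectories; the penalty term
    discourages implausible jumps between frames.  Returns the best terminal
    merit and the recovered cell track.
    """
    n = len(frames[0])
    merit = list(frames[0])          # I_0(x) = z_0(x)
    back = []                        # backpointers, one list per later frame
    for k in range(1, len(frames)):
        new_merit, ptrs = [], []
        for x in range(n):
            best_val, best_prev = float("-inf"), x
            for xp in range(max(0, x - max_step), min(n, x + max_step + 1)):
                val = merit[xp] - penalty_weight * abs(x - xp)
                if val > best_val:
                    best_val, best_prev = val, xp
            new_merit.append(frames[k][x] + best_val)
            ptrs.append(best_prev)
        merit, back = new_merit, back + [ptrs]
    # Backtrack from the highest terminal merit.
    x = max(range(n), key=lambda i: merit[i])
    track = [x]
    for ptrs in reversed(back):
        x = ptrs[x]
        track.append(x)
    return max(merit), list(reversed(track))
```

A dim target drifting one cell per frame is recovered even though no single frame is conclusive on its own, which is the point of integrating before detecting.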
Evaluation of a telephone advice nurse in a nursing faculty managed pediatric community clinic.
Beaulieu, Richard; Humphreys, Janice
2008-01-01
Nurse-managed health centers face increasing obstacles to financial viability. Efficient use of clinic resources and timely and appropriate patient care are necessary for sustainability. A registered nurse with adequate education and support can provide high-quality triage and advice in community-based practice sites. The purpose of this program evaluation was to examine the effect of a telephone advice nurse service on parent/caregiver satisfaction and access to care. A quasi-experimental separate pre-post sample design study investigated parent/caregiver satisfaction with a telephone advice nurse in an urban pediatric nurse-managed health center. The clinic medical information system was used to retrieve client visit data prior to the service and in the first year of the program. Statistically significant differences were found on two items from the satisfaction with the advice nurse survey: the reason for calling (P < .05), and the importance of being involved in decision making (P < .05). A telephone advice nurse may increase both parent/caregiver and provider satisfaction and access to care.
Precise and Efficient Static Array Bound Checking for Large Embedded C Programs
NASA Technical Reports Server (NTRS)
Venet, Arnaud
2004-01-01
In this paper we describe the design and implementation of a static array-bound checker for a family of embedded programs: the flight control software of recent Mars missions. These codes are large (up to 250 KLOC), pointer intensive, heavily multithreaded and written in an object-oriented style, which makes their analysis very challenging. We designed a tool called C Global Surveyor (CGS) that can analyze the largest code in a couple of hours with a precision of 80%. The scalability and precision of the analyzer are achieved by using an incremental framework in which a pointer analysis and a numerical analysis of array indices mutually refine each other. CGS has been designed so that it can distribute the analysis over several processors in a cluster of machines. To the best of our knowledge this is the first distributed implementation of static analysis algorithms. Throughout the paper we will discuss the scalability setbacks that we encountered during the construction of the tool and their impact on the initial design decisions.
Anima: Modular Workflow System for Comprehensive Image Data Analysis
Rantanen, Ville; Valori, Miko; Hautaniemi, Sampsa
2014-01-01
Modern microscopes produce vast amounts of image data, and computational methods are needed to analyze and interpret these data. Furthermore, a single image analysis project may require tens or hundreds of analysis steps starting from data import and pre-processing to segmentation and statistical analysis; and ending with visualization and reporting. To manage such large-scale image data analysis projects, we present here a modular workflow system called Anima. Anima is designed for comprehensive and efficient image data analysis development, and it contains several features that are crucial in high-throughput image data analysis: programming language independence, batch processing, easily customized data processing, interoperability with other software via application programming interfaces, and advanced multivariate statistical analysis. The utility of Anima is shown with two case studies focusing on testing different algorithms developed in different imaging platforms and an automated prediction of alive/dead C. elegans worms by integrating several analysis environments. Anima is a fully open source and available with documentation at www.anduril.org/anima. PMID:25126541
Communication library for run-time visualization of distributed, asynchronous data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rowlan, J.; Wightman, B.T.
1994-04-01
In this paper we present a method for collecting and visualizing data generated by a parallel computational simulation during run time. Data distributed across multiple processes is sent across parallel communication lines to a remote workstation, which sorts and queues the data for visualization. We have implemented our method in a set of tools called PORTAL (for Parallel aRchitecture data-TrAnsfer Library). The tools comprise generic routines for sending data from a parallel program (callable from either C or FORTRAN), a semi-parallel communication scheme currently built upon Unix Sockets, and a real-time connection to the scientific visualization program AVS. Our method is most valuable when used to examine large datasets that can be efficiently generated and do not need to be stored on disk. The PORTAL source libraries, detailed documentation, and a working example can be obtained by anonymous ftp from info.mcs.anl.gov from the file portal.tar.Z from the directory pub/portal.
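The general pattern of shipping simulation data over a socket for remote sorting and queuing can be sketched with length-prefixed binary frames. The frame layout and function names below are this sketch's invention, not PORTAL's actual C/FORTRAN protocol:

```python
import socket
import struct

def send_field(sock, step, values):
    """Send one timestep of simulation data as a length-prefixed frame:
    a header (timestep number, value count) followed by packed doubles."""
    payload = struct.pack(f"!ii{len(values)}d", step, len(values), *values)
    sock.sendall(struct.pack("!i", len(payload)) + payload)

def recv_field(sock):
    """Receive one frame; returns (step, values) ready for queuing."""
    def read_exact(n):
        buf = b""
        while len(buf) < n:
            chunk = sock.recv(n - len(buf))
            if not chunk:
                raise ConnectionError("peer closed mid-frame")
            buf += chunk
        return buf
    (length,) = struct.unpack("!i", read_exact(4))
    payload = read_exact(length)
    step, count = struct.unpack_from("!ii", payload)
    values = struct.unpack_from(f"!{count}d", payload, 8)
    return step, list(values)

# Demo over a local socket pair, standing in for the simulation/workstation link.
a, b = socket.socketpair()
send_field(a, step=7, values=[0.0, 1.5, -2.25])
step, values = recv_field(b)
```

The explicit length prefix lets the receiving side sort and queue asynchronous frames from many processes without any framing ambiguity, the central requirement the paper describes.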
VASP- VARIABLE DIMENSION AUTOMATIC SYNTHESIS PROGRAM
NASA Technical Reports Server (NTRS)
White, J. S.
1994-01-01
VASP is a variable dimension Fortran version of the Automatic Synthesis Program, ASP. The program is used to implement Kalman filtering and control theory. Basically, it consists of 31 subprograms for solving most modern control problems in linear, time-variant (or time-invariant) control systems. These subprograms include operations of matrix algebra, computation of the exponential of a matrix and its convolution integral, and the solution of the matrix Riccati equation. The user calls these subprograms by means of a FORTRAN main program, and so can easily obtain solutions to most general problems of extremization of a quadratic functional of the state of the linear dynamical system. Particularly, these problems include the synthesis of the Kalman filter gains and the optimal feedback gains for minimization of a quadratic performance index. VASP, as an outgrowth of the Automatic Synthesis Program, has the following improvements: more versatile programming language; more convenient input/output format; some new subprograms which consolidate certain groups of statements that are often repeated; and variable dimensioning. The pertinent difference between the two programs is that VASP has variable dimensioning and more efficient storage. The documentation for the VASP program contains a VASP dictionary and example problems. The dictionary contains a description of each subroutine and instructions on its use. The example problems include dynamic response, optimal control gain, solution of the sampled data matrix Riccati equation, matrix decomposition, and a pseudo-inverse of a matrix. This program is written in FORTRAN IV and has been implemented on the IBM 360. The VASP program was developed in 1971.
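As an illustration of the kind of problem VASP's subprograms solve, a scalar discrete-time analogue of the matrix Riccati recursion can be iterated to a steady-state Kalman gain. This is a toy stand-in for exposition, not VASP's Fortran routines:

```python
def steady_state_kalman_gain(a, c, q, r, tol=1e-12, max_iter=10_000):
    """Iterate the scalar discrete-time Riccati recursion
        P <- a^2 P - (a P c)^2 / (c^2 P + r) + q
    until the error covariance P converges, then return (P, K), where
    K = a P c / (c^2 P + r) is the steady-state (predictor-form) Kalman gain.
    a: state transition, c: measurement, q: process noise, r: measurement noise.
    """
    p = q  # any positive start converges for this scalar case
    for _ in range(max_iter):
        p_next = a * a * p - (a * p * c) ** 2 / (c * c * p + r) + q
        if abs(p_next - p) < tol:
            return p_next, a * p_next * c / (c * c * p_next + r)
        p = p_next
    raise RuntimeError("Riccati recursion did not converge")
```

For a = c = q = r = 1 the fixed point satisfies P^2 = P + 1, so P converges to the golden ratio and the gain to its reciprocal, a convenient sanity check on the recursion.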
Developing Advanced Human Support Technologies for Planetary Exploration Missions
NASA Technical Reports Server (NTRS)
Berdich, Debra P.; Campbell, Paul D.; Jernigan, J. Mark
2004-01-01
The United States Vision for Space Exploration calls for sending robots and humans to explore the Earth's moon, the planet Mars, and beyond. The National Aeronautics and Space Administration (NASA) is developing a set of design reference missions that will provide further detail to these plans. Lunar missions are expected to provide a stepping stone, through operational research and evaluation, in developing the knowledge base necessary to send crews on long duration missions to Mars and other distant destinations. The NASA Exploration Systems Directorate (ExSD), in its program of bioastronautics research, manages the development of technologies that maintain human life, health, and performance in space. Using a system engineering process and risk management methods, ExSD's Human Support Systems (HSS) Program selects and performs research and technology development in several critical areas and transfers the results of its efforts to NASA exploration mission/systems development programs in the form of developed technologies and new knowledge about the capabilities and constraints of systems required to support human existence beyond Low Earth Orbit. HSS efforts include the areas of advanced environmental monitoring and control, extravehicular activity, food technologies, life support systems, space human factors engineering, and systems integration of all these elements. The HSS Program provides a structured set of deliverable products to meet the needs of exploration programs. These products reduce the gaps that exist in our knowledge of and capabilities for human support for long duration, remote space missions. They also reduce the performance gap between the efficiency of current space systems and the greater efficiency that must be achieved to make human planetary exploration missions economically and logistically feasible. 
In conducting this research and technology development program, it is necessary for HSS technologists and program managers to develop a common currency for decision making and the allocation of funding. A high-level assessment is made of both the knowledge gaps and the system performance gaps across the program's technical project portfolio. This allows decision making that assures proper emphasis areas and provides a key measure of annual technological progress as exploration mission plans continue to mature.
Following the water, the new program for Mars exploration.
Hubbard, G Scott; Naderi, Firouz M; Garvin, James B
2002-01-01
In the wake of the loss of Mars Climate Orbiter and Mars Polar Lander in late 1999, NASA embarked on a major review of the failures and subsequently restructured all aspects of what was then called the Mars Surveyor Program--now renamed the Mars Exploration Program. This paper presents the process and results of this reexamination and defines a new approach which we have called "Program System Engineering". Emphasis is given to the scientific, technological, and programmatic strategies that were used to shape the new Program. A scientific approach known as "follow the water" is described, as is an exploration strategy we have called "seek--in situ--sample". An overview of the mission queue from the continuing Mars Global Surveyor through a possible Mars Sample Return Mission launch in 2011 is provided. In addition, key proposed international collaborations, especially those between NASA, CNES and ASI are outlined, as is an approach for a robust telecommunications infrastructure. © 2002 Published by Elsevier Science Ltd.
The Probability of Hitting a Polygonal Target
1981-04-01
required for the use of this method for computing the probability of hitting a polygonal target. These functions are: 1. PHIT (called by user's main program...) 2. FIJ (called by PHIT) 3. FUN (called by FIJ). The user must include all three of these in his main program, but needs only to call PHIT. The
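The report's PHIT/FIJ/FUN internals are not reproduced in this record, but the quantity they compute can be sketched with a Monte Carlo estimate built on an even-odd point-in-polygon test. The function names and the impact-point sampling model here are illustrative assumptions, not the report's method:

```python
import random

def point_in_polygon(x, y, verts):
    """Even-odd ray-casting test; verts is a list of (x, y) vertices."""
    inside = False
    n = len(verts)
    for i in range(n):
        x1, y1 = verts[i]
        x2, y2 = verts[(i + 1) % n]
        # Count edge crossings of a horizontal ray extending to the right.
        if (y1 > y) != (y2 > y):
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

def hit_probability(verts, sample_impact, trials=100_000, seed=1):
    """Monte Carlo estimate of the probability that a random impact point
    drawn by sample_impact(rng) lands inside the polygonal target."""
    rng = random.Random(seed)
    hits = sum(point_in_polygon(*sample_impact(rng), verts)
               for _ in range(trials))
    return hits / trials

# Unit-square target, impacts uniform over [0, 2] x [0, 2]: exact answer 0.25.
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
p = hit_probability(square, lambda rng: (rng.uniform(0, 2), rng.uniform(0, 2)))
```

Replacing the uniform sampler with a bivariate normal centered on the aimpoint would model a more realistic delivery error distribution.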
The Ghost in the Machine: Are "Teacherless" CALL Programs Really Possible?
ERIC Educational Resources Information Center
Davies, Ted; Williamson, Rodney
1998-01-01
Reflects critically on pedagogical issues in the production of computer-assisted language learning (CALL) courseware and ways CALL has affected the practice of language learning. Concludes that if CALL is to reach full potential, it must be more than a simple medium of information; it should provide a teaching/learning process, with the real…
The Evidence on Universal Preschool: Are Benefits Worth the Cost? Policy Analysis. Number 760
ERIC Educational Resources Information Center
Armor, David J.
2014-01-01
Calls for universal preschool programs have become commonplace, reinforced by President Obama's call for "high-quality preschool for all" in 2013. Any program that could cost state and federal taxpayers $50 billion per year warrants a closer look at the evidence on its effectiveness. This report reviews the major evaluations of preschool…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-12
... by calling the Regulations Division at 202-708-3055 (this is not a toll-free number). Individuals with speech or hearing impairments may access this number through TTY by calling the toll-free Federal... a toll-free number). Persons with hearing or speech impairments may access this number through TTY...
78 FR 68367 - Approval and Promulgation of Air Quality Implementation Plans; Ohio; Ohio NOX
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-14
... Clean Air Act, which allows for Ohio's Clean Air Interstate Rule (CAIR) NO X Ozone Season Trading Program rules to supersede Ohio's nitrogen oxides (NO X ) State Implementation Plan (SIP) Call Budget Trading Program rules, but leave other requirements of the NO X SIP Call in place for units not covered by...
Preventing Boys' Problems in Schools through Psychoeducational Programming: A Call to Action
ERIC Educational Resources Information Center
O'Neil, James M.; Lujan, Melissa L.
2009-01-01
Controversy currently exists on whether boys are in crises and, if so, what to do about it. Research is reviewed that indicates that boys have problems that affect their emotional and interpersonal functioning. Psychoeducational and preventive programs for boys are recommended as a call to action in schools. Thematic areas for boys' programming…
USDA-ARS?s Scientific Manuscript database
Recently, a variant of stochastic dominance called stochastic efficiency with respect to a function (SERF) has been developed and applied. Unlike traditional stochastic dominance approaches, SERF uses the concept of certainty equivalents (CEs) to rank a set of risk-efficient alternatives instead of...
Implementation of a successful on-call system in clinical chemistry.
Hobbs, G A; Jortani, S A; Valdes, R
1997-11-01
Successful practice of clinical pathology depends on a wide variety of laboratory, clinical, and managerial decisions. The skills needed to make these decisions can most effectively be learned by residents and fellows in pathology using a service-oriented on-call approach. We report our experience implementing an on-call system in the clinical chemistry laboratory at the University of Louisville Hospital (Ky). We detail the guidelines used to establish this system and the elements required for its successful implementation. The system emphasizes a laboratory-initiated approach to linking laboratory results to patient care. From inception of the program during late 1990 through 1995, the number of beeper calls (including clinician contacts) steadily increased and is currently 8 to 20 per week. The on-call system is active 24 hours per day, 7 days per week, thus representing activity on all three laboratory shifts. Types of responses were separated into administrative (12%), analytical (42%), clinical (63%), quality control or quality assurance (12%), and consultation (13%) categories. We also present 6 case reports as examples demonstrating multiple elements in these categories. In 23% of the calls, clinician contact was required and achieved by the fellow or resident on call for the laboratory. The on-call reports are documented and presented informally at weekly on-call report sessions. Emphasis is placed on learning and refinement of investigative skills needed to function as an effective laboratory director. Educational emphasis for the medical staff is in establishing awareness of the presence of the laboratory as an important interactive component of patient care. In addition, we found this program to be beneficial to the hospital and to the department of pathology in fulfilling its clinical service and teaching missions. Our experience may be helpful to other institutions establishing such a program.
RACER: Effective Race Detection Using AspectJ
NASA Technical Reports Server (NTRS)
Bodden, Eric; Havelund, Klaus
2008-01-01
Programming errors occur frequently in large software systems, and even more so if these systems are concurrent. In the past, researchers have developed specialized programs to aid programmers in detecting concurrent programming errors such as deadlocks, livelocks, starvation, and data races. In this work we propose a language extension to the aspect-oriented programming language AspectJ, in the form of three new built-in pointcuts, lock(), unlock(), and maybeShared(), which allow programmers to monitor program events where locks are granted or handed back, and where values are accessed that may be shared amongst multiple Java threads. We decide thread-locality using a static thread-local-objects analysis developed by others. Using the three new primitive pointcuts, researchers can directly implement efficient monitoring algorithms to detect concurrent programming errors online. As an example, we present a new algorithm we call RACER, an adaptation of the well-known ERASER algorithm to the memory model of Java. We implemented the new pointcuts as an extension to the AspectBench Compiler, implemented the RACER algorithm using this language extension, and then applied the algorithm to the NASA K9 Rover Executive. Our experiments proved our implementation very effective: in the Rover Executive, RACER finds 70 data races, only one of which was previously known. We further applied the algorithm to two other multi-threaded programs written by computer science researchers, in which we found races as well.
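The lockset idea behind ERASER, which the lock()/unlock()/maybeShared() pointcuts make implementable, can be sketched in a few lines. This is a simplified event-trace version for illustration, not the RACER algorithm itself; it omits ERASER's initialization and read-shared state refinements:

```python
def detect_races(events):
    """Minimal lockset-style race check over a recorded event trace.

    events: list of (thread, op, target) tuples where op is "lock",
    "unlock", or "access".  Each shared variable's candidate lockset is
    intersected with the locks its accessor currently holds; an empty
    lockset flags a potential data race on that variable.
    """
    held = {}        # thread -> set of locks currently held
    lockset = {}     # variable -> candidate lockset
    races = set()
    for thread, op, target in events:
        if op == "lock":
            held.setdefault(thread, set()).add(target)
        elif op == "unlock":
            held.setdefault(thread, set()).discard(target)
        elif op == "access":
            current = held.get(thread, set())
            if target not in lockset:
                lockset[target] = set(current)   # first access initializes
            else:
                lockset[target] &= current       # refine on later accesses
            if not lockset[target]:
                races.add(target)
    return races

# "x" is consistently guarded by lock "m"; "y" is accessed with no lock held.
events = [
    ("t1", "lock", "m"), ("t1", "access", "x"), ("t1", "unlock", "m"),
    ("t2", "lock", "m"), ("t2", "access", "x"), ("t2", "unlock", "m"),
    ("t2", "access", "y"), ("t1", "access", "y"),
]
```

In the AspectJ setting, the three pointcuts would feed exactly these lock/unlock/access events to such a monitor at run time.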
Cockpit Resource Management (CRM) for FAR Parts 91 and 135 operators
NASA Technical Reports Server (NTRS)
Schwartz, Douglas
1987-01-01
The why, what, and how of CRM at Flight Safety International (FSI) are presented: the philosophy behind the program, its content, and some insight into how it is delivered to the pilot. A few of the concepts that are part of the program are discussed, including a view of statistics called the Safety Window, the concept of situational awareness, and an approach to training called the Cockpit Management Concept (CMC).
NASA Technical Reports Server (NTRS)
1995-01-01
As a Jet Propulsion Laboratory astronomer, John D. Callahan developed a computer program called Multimission Interactive Planner (MIP) to help astronomers analyze scientific and optical data collected on the Voyager's Grand Tour. The commercial version of the program called XonVu is published by XonTech, Inc. Callahan has since developed two more advanced programs based on MIP technology, Grand Tour and Jovian Traveler, which simulate Voyager and Giotto missions. The software allows astronomers and space novices to view the objects seen by the spacecraft, manipulating perspective, distance and field of vision.
Impact of an after-hours on-call emergency physician on ambulance transports from a county jail.
Chan, Theodore C; Vilke, Gary M; Smith, Sue; Sparrow, William; Dunford, James V
2003-01-01
The authors sought to determine if the availability of an after-hours on-call emergency physician by telephone for consultation to the staff at a county jail would safely reduce ambulance emergency department (ED) transport of inmates in the community. The authors conducted a prospective comparison study during the first ten months of an emergency physician on-call program for the county jail in which prospective data were collected on all consultations, including reason for call and disposition (ambulance, deputy, or no ED transport of inmate). They compared this time with a similar period a year before the program in terms of total ambulance transports from the jail. They also reviewed all hospital and jail medical records to assess for any adverse consequences within one month, or subsequent ambulance transport within 24 hours as a result of inmate care after the consultation call. Total after-hours ambulance transports from the jail decreased significantly from 30.3 transports/month (95% confidence interval [CI], 21.0-39.6) to 9.1 transports/month (95% CI, 4.1-14.0) (p < 0.05). The most common reasons for consultation calls were chest pain (16%), trauma (15%), and abnormal laboratory or radiology results (14%). Of all calls, only 30% resulted in ambulance transport to the ED. On review of records, no adverse outcome or subsequent ambulance transport was identified. The initiation of an on-call emergency physician program for after-hours consultation to jail nursing and law enforcement staff safely reduced ambulance transports from a county jail with no adverse outcomes identified.
Confidential close call reporting system (C3RS) lessons learned team baseline phased report
DOT National Transportation Integrated Search
2015-05-08
The Federal Railroad Administration (FRA) has established a program called the Confidential Close Call Reporting System : (C3RS), which allows events to be reported anonymously and dealt with non-punitively and without fear or reprisal through : stru...
Confidential close call reporting system (C3RS) lessons learned team baseline phase report.
DOT National Transportation Integrated Search
2015-05-01
The Federal Railroad Administration (FRA) has established a program called the Confidential Close Call Reporting System : (C3 : RS), which allows events to be reported anonymously and dealt with non-punitively and without fear or reprisal through : s...
HDF-EOS 2 and HDF-EOS 5 Compatibility Library
NASA Technical Reports Server (NTRS)
Ullman, Richard; Bane, Bob; Yang, Jingli
2008-01-01
The HDF-EOS 2 and HDF-EOS 5 Compatibility Library contains C-language functions that provide uniform access to HDF-EOS 2 and HDF-EOS 5 files through one set of application programming interface (API) calls. ("HDF-EOS 2" and "HDF-EOS 5" are defined in the immediately preceding article.) Without this library, differences between the APIs of HDF-EOS 2 and HDF-EOS 5 would necessitate writing different programs to cover HDF-EOS 2 and HDF-EOS 5. The API associated with this library is denoted "he25." For nearly every HDF-EOS 5 API call, there is a corresponding he25 API call. If a file in question is in the HDF-EOS 5 format, the code reverts to the corresponding HDF-EOS 5 call; if the file is in the HDF-EOS 2 format, the code translates the arguments to HDF-EOS 2 equivalents (if necessary), calls the HDF-EOS 2 call, and retranslates the results back to HDF-EOS 5 (if necessary).
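The dispatch pattern described, probing a file's format once and routing every subsequent call to the matching backend with argument translation, can be sketched as follows. The class and method names here are hypothetical stand-ins for illustration, not the real he25 C API:

```python
class CompatFile:
    """Sketch of an he25-style shim: one API that detects a file's format
    and forwards each call to the matching backend, where any argument or
    result translation between the two underlying formats would live."""

    def __init__(self, path, backends):
        # backends: list of classes, each with a `matches(path)` probe
        # and the shared reader interface.
        for backend in backends:
            if backend.matches(path):
                self._impl = backend(path)
                break
        else:
            raise ValueError(f"unrecognized format: {path}")

    def read_field(self, name):
        # Single entry point; format-specific behavior lives in the backend.
        return self._impl.read_field(name)

class V2Backend:
    """Stand-in for an older-format reader (format sniffed by extension here;
    a real probe would inspect file contents)."""
    @staticmethod
    def matches(path):
        return path.endswith(".he2")
    def __init__(self, path):
        self.path = path
    def read_field(self, name):
        return f"v2:{name}"

class V5Backend:
    """Stand-in for a newer-format reader."""
    @staticmethod
    def matches(path):
        return path.endswith(".he5")
    def __init__(self, path):
        self.path = path
    def read_field(self, name):
        return f"v5:{name}"
```

Callers write against CompatFile alone; adding a format means adding a backend, not rewriting application code, which is the maintenance benefit the library provides.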
Chang, Larry William; Kagaayi, Joseph; Nakigozi, Gertrude; Galiwango, Ronald; Mulamba, Jeremiah; Ludigo, James; Ruwangula, Andrew; Gray, Ronald H.; Quinn, Thomas C.; Bollinger, Robert C.; Reynolds, Steven J.
2009-01-01
Hotlines and warmlines have been successfully used in the developed world to provide clinical advice; however, reports on their replicability in resource-limited settings are limited. A warmline was established in Rakai, Uganda, to support an antiretroviral therapy program. Over a 17-month period, a database was kept of who called, why they called, and the result of the call. A program evaluation was also administered to clinical staff. A total of 1303 calls (3.5 calls per weekday) were logged. The warmline was used mostly by field staff and peripherally based peer health workers. Calls addressed important clinical issues, including the need for urgent care, medication side effects, and follow-up needs. Most clinical staff felt that the warmline made their jobs easier and improved the health of patients. An HIV/AIDS warmline leveraged the skills of a limited workforce to provide increased access to HIV/AIDS care, advice, and education. PMID:18441254
The Louisiana State University waste-to-energy incinerator
NASA Astrophysics Data System (ADS)
1994-10-01
This proposed action is for cost-shared construction of an incinerator/steam-generation facility at Louisiana State University under the State Energy Conservation Program (SECP). The SECP, created by the Energy Policy and Conservation Act, calls upon DOE to encourage energy conservation, renewable energy, and energy efficiency by providing Federal technical and financial assistance in developing and implementing comprehensive state energy conservation plans and projects. Currently, LSU runs a campus-wide recycling program in order to reduce the quantity of solid waste requiring disposal. This program has removed recyclable paper from the waste stream; however, a considerable quantity of other non-recyclable combustible wastes are produced on campus. Until recently, these wastes were disposed of in the Devil's Swamp landfill (also known as the East Baton Rouge Parish landfill). When this facility reached its capacity, a new landfill was opened a short distance away, and this new site is now used for disposal of the University's non-recyclable wastes. While this new landfill has enough capacity to last for at least 20 years (from 1994), the University has identified the need for a more efficient and effective manner of waste disposal than landfilling. The University also has non-renderable biological and potentially infectious waste materials from the School of Veterinary Medicine and the Student Health Center, primarily the former, whose wastes include animal carcasses and bedding materials. Renderable animal wastes from the School of Veterinary Medicine are sent to a rendering plant. Non-renderable, non-infectious animal wastes currently are disposed of in an existing on-campus incinerator near the School of Veterinary Medicine building.
Alloy Design Workbench-Surface Modeling Package Developed
NASA Technical Reports Server (NTRS)
Abel, Phillip B.; Noebe, Ronald D.; Bozzolo, Guillermo H.; Good, Brian S.; Daugherty, Elaine S.
2003-01-01
NASA Glenn Research Center's Computational Materials Group has integrated a graphical user interface with in-house-developed surface modeling capabilities, with the goal of using computationally efficient atomistic simulations to aid the development of advanced aerospace materials, through the modeling of alloy surfaces, surface alloys, and segregation. The software is also ideal for modeling nanomaterials, since surface and interfacial effects can dominate material behavior and properties at this level. Through the combination of an accurate atomistic surface modeling methodology and an efficient computational engine, it is now possible to directly model these types of surface phenomena and metallic nanostructures without a supercomputer. Fulfilling a High Operating Temperature Propulsion Components (HOTPC) project level-I milestone, a graphical user interface was created for a suite of quantum approximate atomistic materials modeling Fortran programs developed at Glenn. The resulting "Alloy Design Workbench-Surface Modeling Package" (ADW-SMP) is the combination of proven quantum approximate Bozzolo-Ferrante-Smith (BFS) algorithms (refs. 1 and 2) with a productivity-enhancing graphical front end. Written in the portable, platform-independent Java programming language, the graphical user interface calls on extensively tested Fortran programs running in the background for the detailed computational tasks. Designed to run on desktop computers, the package has been deployed on PC, Mac, and SGI computer systems. The graphical user interface integrates two modes of computational materials exploration. One mode uses Monte Carlo simulations to determine lowest energy equilibrium configurations. The second approach is an interactive "what if" comparison of atomic configuration energies, designed to provide real-time insight into the underlying drivers of alloying processes.
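The Monte Carlo mode described above can be illustrated with a minimal Metropolis search on a toy one-dimensional binary alloy. The bond energies below are hypothetical illustration values, not BFS parameters: swaps that lower the total bond energy are always accepted, and uphill swaps survive with Boltzmann probability.

```python
import math
import random

# Toy nearest-neighbour bond energies for a binary A/B chain (hypothetical
# values, not BFS parameters): unlike bonds are favourable, so the
# lowest-energy arrangement alternates the two species.
BOND_E = {("A", "A"): 0.0, ("B", "B"): 0.0, ("A", "B"): -1.0, ("B", "A"): -1.0}

def energy(chain):
    """Total energy as a sum over nearest-neighbour bonds."""
    return sum(BOND_E[(chain[i], chain[i + 1])] for i in range(len(chain) - 1))

def metropolis(chain, temperature=0.1, steps=20000, seed=1):
    """Search for a low-energy configuration by random pair swaps, accepting
    uphill moves with Boltzmann probability exp(-dE/T)."""
    rng = random.Random(seed)
    chain = list(chain)
    e = energy(chain)
    best, best_e = chain[:], e
    for _ in range(steps):
        i, j = rng.randrange(len(chain)), rng.randrange(len(chain))
        chain[i], chain[j] = chain[j], chain[i]
        e_new = energy(chain)
        if e_new <= e or rng.random() < math.exp(-(e_new - e) / temperature):
            e = e_new
            if e < best_e:
                best, best_e = chain[:], e
        else:
            chain[i], chain[j] = chain[j], chain[i]  # rejected: undo the swap
    return best, best_e

start = ["A"] * 4 + ["B"] * 4          # fully segregated starting arrangement
config, e_min = metropolis(start)
```

The swap move preserves composition, so the search explores only arrangements of the given atoms; the alternating chain is the global minimum for these bond energies.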
ERIC Educational Resources Information Center
Hardin, Julia P., Ed.; Moulden, Richard G., Ed.
This compilation of over 40 lesson plans on various topics in law-related education was written by classroom teachers from around the United States who had participated in the fifth of an annual series called Special Programs in Citizenship Education (SPICE)--weeklong institutes devoted to learning about different cultures and laws. Called SPICE V…
ERIC Educational Resources Information Center
Kathi, Pradeep Chandra
2012-01-01
The School of Planning Policy and Development at the University of Southern California brought together representatives of neighborhood councils and city agencies of the city of Los Angeles in an action research program. This action research program, called the Collaborative Learning Project, developed a collaboration process called the…
Justice Education as a Schoolwide Effort: Effective Religious Education in the Catholic School
ERIC Educational Resources Information Center
Horan, Michael P.
2005-01-01
This essay describes and analyzes one successful justice education program flowing from community service, and demonstrates how such a program in a Catholic school responds to several important "calls" to Catholic educators. These "calls" are issued by (a) the needs of the learners and the signs of the times, (b) official documents of the Church…
Evaluating the Generality and Limits of Blind Return-Oriented Programming Attacks
2015-12-01
We consider a recently proposed information disclosure vulnerability called blind return-oriented programming (BROP). Under certain conditions, this...
Evaluating the effectiveness of the Safety Investment Program (SIP) policies for Oregon.
DOT National Transportation Integrated Search
2009-10-01
The Safety Investment Program (SIP) was originally called the Statewide Transportation Improvement Program - Safety Investment Program (STIP-SIP). The concept of the program was first discussed in October 1997 and the program was adopted by the O...
Coalescent: an open-science framework for importance sampling in coalescent theory.
Tewari, Susanta; Spouge, John L
2015-01-01
Background. In coalescent theory, computer programs often use importance sampling to calculate likelihoods and other statistical quantities. An importance sampling scheme can exploit human intuition to improve statistical efficiency of computations, but unfortunately, in the absence of general computer frameworks on importance sampling, researchers often struggle to translate new sampling schemes computationally or benchmark against different schemes, in a manner that is reliable and maintainable. Moreover, most studies use computer programs lacking a convenient user interface or the flexibility to meet the current demands of open science. In particular, current computer frameworks can only evaluate the efficiency of a single importance sampling scheme or compare the efficiencies of different schemes in an ad hoc manner. Results. We have designed a general framework (http://coalescent.sourceforge.net; language: Java; License: GPLv3) for importance sampling that computes likelihoods under the standard neutral coalescent model of a single, well-mixed population of constant size over time following the infinite-sites model of mutation. The framework models the necessary core concepts, comes integrated with several data sets of varying size, implements the standard competing proposals, and integrates tightly with our previous framework for calculating exact probabilities. For a given dataset, it computes the likelihood and provides the maximum likelihood estimate of the mutation parameter. Well-known benchmarks in the coalescent literature validate the accuracy of the framework. The framework provides an intuitive user interface with minimal clutter. For performance, the framework switches automatically to modern multicore hardware, if available. It runs on three major platforms (Windows, Mac and Linux). Extensive tests and coverage make the framework reliable and maintainable. Conclusions.
In coalescent theory, many studies of computational efficiency consider only effective sample size. Here, we evaluate proposals in the coalescent literature, to discover that the order of efficiency among the three importance sampling schemes changes when one considers running time as well as effective sample size. We also describe a computational technique called "just-in-time delegation" available to improve the trade-off between running time and precision by constructing improved importance sampling schemes from existing ones. Thus, our systems approach is a potential solution to the "2^8 programs problem" highlighted by Felsenstein, because it provides the flexibility to include or exclude various features of similar coalescent models or importance sampling schemes.
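The importance-sampling idea underlying such frameworks can be sketched generically. This is an illustration in Python, not the Coalescent framework's Java API: to estimate a small quantity, draw from a proposal concentrated where the integrand matters and reweight each draw by target density over proposal density; effective sample size then diagnoses proposal quality.

```python
import math
import random

def normal_pdf(x, mu=0.0, sigma=1.0):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def importance_sample(n=100_000, seed=7):
    """Estimate the tail probability P(X > 4) for X ~ N(0,1) by drawing from
    the proposal N(4,1) and reweighting each draw by p(x)/q(x). Naive Monte
    Carlo from N(0,1) would see only a handful of tail hits at this n."""
    rng = random.Random(seed)
    weights = []
    for _ in range(n):
        x = rng.gauss(4.0, 1.0)                     # draw from the proposal q
        w = normal_pdf(x) / normal_pdf(x, mu=4.0) if x > 4.0 else 0.0
        weights.append(w)
    estimate = sum(weights) / n
    # Effective sample size: how many unweighted target draws the weighted
    # sample is "worth"; a standard diagnostic when comparing proposals.
    nz = [w for w in weights if w > 0.0]
    ess = sum(nz) ** 2 / sum(w * w for w in nz)
    return estimate, ess

estimate, ess = importance_sample()
```

The same skeleton applies when the "draws" are genealogies and the weight is the ratio of coalescent prior times data likelihood to the proposal probability of the genealogy.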
Duty hours and home call: the experience of plastic surgery residents and fellows.
Drolet, Brian C; Prsic, Adnan; Schmidt, Scott T
2014-05-01
Although resident duty hours are strictly regulated by the Accreditation Council for Graduate Medical Education, there are fewer restrictions on at-home call for residents. To date, no studies have examined the experience of home call for plastic surgery trainees or the impact of home call on patient care and education in plastic surgery. The authors distributed an anonymous electronic survey to plastic surgery trainees at 41 accredited programs, seeking to produce a descriptive assessment of home call and to evaluate the perceived impact of home call on training and patient care. A total of 214 responses were obtained (58.3 percent completion rate). Nearly all trainees reported taking home call (98.6 percent), with 66.7 percent reporting call frequency every third or fourth night. Most respondents (63.3 percent) felt that home call regulations are vague but that Council regulation (44.9 percent) and programmatic oversight (56.5 percent) are adequate. Most (91.2 percent) believe their program could not function without home call and that home call helps to avoid strict duty hour restrictions (71.5 percent). Nearly all respondents (92.3 percent) preferred home call to in-house call. This is the first study to examine how plastic surgery residents experience and perceive home call within the framework of Accreditation Council for Graduate Medical Education duty hour regulations. Most trainees feel the impact of home call is positive for education (50.2 percent) and quality of life (56.5 percent), with a neutral impact on patient care (66.7 percent). Under the Council's increasing regulations, home call provides a balance of education and patient care appropriate for training in plastic and reconstructive surgery.
Transmission and Distribution Efficiency Improvement Research and Development Survey.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brooks, C.L.; Westinghouse Electric Corporation. Advanced Systems Technology.
Purpose of this study was to identify and quantify those technologies for improving transmission and distribution (T and D) system efficiency that could provide the greatest benefits for utility customers in the Pacific Northwest. Improving the efficiency of transmission and distribution systems offers a potential source of conservation within the utility sector. An extensive review of this field resulted in a list of 49 state-of-the-art technologies and 39 future technologies. Of these, 15 from the former list and 7 from the latter were chosen as the most promising and then submitted to an evaluative test - a modeled sample system for Benton County PUD, a utility with characteristics typical of a BPA customer system. Reducing end-use voltage on secondary distribution systems to decrease the energy consumption of electrical users when possible, called ''Conservation Voltage Reduction,'' was found to be the most cost effective state-of-the-art technology. Volt-ampere reactive (var) optimization is a similarly cost effective alternative. The most significant reduction in losses on the transmission and distribution system would be achieved through the replacement of standard transformers with high efficiency transformers, such as amorphous steel transformers. Of the future technologies assessed, the ''Distribution Static VAR Generator'' appears to have the greatest potential for technological breakthroughs and, therefore in time, commercialization. ''Improved Dielectric Materials,'' with a relatively low cost and high potential for efficiency improvement, warrant R and D consideration. ''Extruded Three-Conductor Cable'' and ''Six- and Twelve-Phase Transmission'' programs provide only limited gains in efficiency and applicability and are therefore the least cost effective.
TPMG Northern California appointments and advice call center.
Conolly, Patricia; Levine, Leslie; Amaral, Debra J; Fireman, Bruce H; Driscoll, Tom
2005-08-01
Kaiser Permanente (KP) has been developing its use of call centers as a way to provide an expansive set of healthcare services to KP members efficiently and cost effectively. Since 1995, when The Permanente Medical Group (TPMG) began to consolidate primary care phone services into three physical call centers, the TPMG Appointments and Advice Call Center (AACC) has become the "front office" for primary care services across approximately 89% of Northern California. The AACC provides primary care phone service for approximately 3 million Kaiser Foundation Health Plan members in Northern California and responds to approximately 1 million calls per month across the three AACC sites. A database records each caller's identity as well as the day, time, and duration of each call; reason for calling; services provided to callers as a result of calls; and clinical outcomes of calls. We here summarize this information for the period 2000 through 2003.
Fracture network evaluation program (FraNEP): A software for analyzing 2D fracture trace-line maps
NASA Astrophysics Data System (ADS)
Zeeb, Conny; Gomez-Rivas, Enrique; Bons, Paul D.; Virgo, Simon; Blum, Philipp
2013-10-01
Fractures, such as joints, faults and veins, strongly influence the transport of fluids through rocks by either enhancing or inhibiting flow. Techniques used for the automatic detection of lineaments from satellite images and aerial photographs, LIDAR technologies and borehole televiewers significantly enhanced data acquisition. The analysis of such data is often performed manually or with different analysis software. Here we present a novel program for the analysis of 2D fracture networks called FraNEP (Fracture Network Evaluation Program). The program was developed using Visual Basic for Applications in Microsoft Excel™ and combines features from different existing software and characterization techniques. The main novelty of FraNEP is the possibility to analyse trace-line maps of fracture networks applying the (1) scanline sampling, (2) window sampling or (3) circular scanline and window method, without the need of switching programs. Additionally, binning problems are avoided by using cumulative distributions, rather than probability density functions. FraNEP is a time-efficient tool for the characterisation of fracture network parameters, such as density, intensity and mean length. Furthermore, fracture strikes can be visualized using rose diagrams and a fitting routine evaluates the distribution of fracture lengths. As an example of its application, we use FraNEP to analyse a case study of lineament data from a satellite image of the Oman Mountains.
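The window-sampling statistics named above (density, intensity, mean trace length) reduce to simple sums over a trace-line map. A minimal sketch with hypothetical trace coordinates and window size, not FraNEP's VBA code:

```python
import math

# Hypothetical trace-line map: each fracture trace is a pair of (x, y)
# endpoints inside a rectangular sampling window (units: metres).
TRACES = [((0.0, 0.0), (3.0, 4.0)),    # length 5.0
          ((1.0, 1.0), (1.0, 3.0)),    # length 2.0
          ((2.0, 0.5), (5.0, 0.5))]    # length 3.0

def trace_length(trace):
    (x1, y1), (x2, y2) = trace
    return math.hypot(x2 - x1, y2 - y1)

def window_statistics(traces, window_area):
    """Window-sampling estimates: density (traces per unit area), intensity
    (trace length per unit area) and mean trace length."""
    lengths = [trace_length(t) for t in traces]
    n = len(lengths)
    return {"density": n / window_area,              # P20, 1/m^2
            "intensity": sum(lengths) / window_area, # P21, 1/m
            "mean_length": sum(lengths) / n}         # m

stats = window_statistics(TRACES, window_area=25.0)  # a 5 m x 5 m window
```

A scanline estimate replaces the area by a line length and counts only intersecting traces; the cumulative length distribution mentioned above is obtained by sorting `lengths` and plotting rank against length, avoiding histogram binning.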
75 FR 6435 - Sunshine Act Meeting Notice
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-09
... will answer questions from the news media following the Board meeting. Status: Open. Agenda Old... Efficiency Committee. For more information: Please call TVA Media Relations at (865) 632- 6000, Knoxville, Tennessee. People who plan to attend the meeting and have special needs should call (865) 632-6000. Anyone...
Heget, Jeffrey R; Bagian, James P; Lee, Caryl Z; Gosbee, John W
2002-12-01
In 1998 the Veterans Health Administration (VHA) created the National Center for Patient Safety (NCPS) to lead the effort to reduce adverse events and close calls systemwide. NCPS's aim is to foster a culture of safety in the Department of Veterans Affairs (VA) by developing and providing patient safety programs and delivering standardized tools, methods, and initiatives to the 163 VA facilities. To create a system-oriented approach to patient safety, NCPS looked for models in fields such as aviation, nuclear power, human factors, and safety engineering. Core concepts included a non-punitive approach to patient safety activities that emphasizes systems-based learning, the active seeking out of close calls, which are viewed as opportunities for learning and investigation, and the use of interdisciplinary teams to investigate close calls and adverse events through a root cause analysis (RCA) process. Participation by VA facilities and networks was voluntary. NCPS has always aimed to develop a program that would be applicable both within the VA and beyond. NCPS's full patient safety program was tested and implemented throughout the VA system from November 1999 to August 2000. Program components included an RCA system for use by caregivers at the front line, a system for the aggregate review of RCA results, information systems software, alerts and advisories, and cognitive aids. Following program implementation, NCPS saw a 900-fold increase in reporting of close calls of high-priority events, reflecting the level of commitment to the program by VHA leaders and staff.
Cooking and disgust sensitivity influence preference for attending insect-based food events.
Hamerman, Eric J
2016-01-01
Insects are energy-efficient and sustainable sources of animal protein in a world with insufficient food resources to feed an ever-increasing population. However, much of the western world refuses to eat insects because they perceive them as disgusting. This research finds that both animal reminder disgust and core disgust reduced people's willingness to attend a program called "Bug Appétit" in which insects were served as food. Additionally, people who were low in sensitivity to animal reminder disgust were more willing to attend this program after having been primed to think about cooking. Cooking is a process by which raw ingredients are transformed into finished products, reducing the "animalness" of meat products that renders them disgusting. Sensitivity to core disgust did not interact with cooking to influence willingness to attend the program. While prior research has emphasized that direct education campaigns about the benefits of entomophagy (the consumption of insects) can increase willingness to attend events at which insect-based food is served, this is the first demonstration that indirect priming can have a similar effect among a subset of the population. Copyright © 2015 Elsevier Ltd. All rights reserved.
Use of Medicare summary notice inserts to generate interest in the Medicare stop smoking program.
Maglione, Margaret; Larson, Carrie; Giannotti, Tierney; Lapin, Pauline
2007-01-01
Evaluations of outreach strategies that effectively and efficiently reach the senior population often go unreported. The Medicare Stop Smoking Program (MSSP) was a seven-state demonstration project funded by the Centers for Medicare and Medicaid Services. The 1-year recruitment plan for MSSP included a multifaceted paid media campaign; however, enrollment was slower than anticipated. The purpose of this substudy was to test the effects of including envelope-sized advertisement inserts with Medicare Summary Notices (MSNs) as a supplemental recruitment strategy. Information obtained from enrollees on where they had learned about the program as well as overall enrollment rates were analyzed and compared with the time periods during which the inserts were included in MSN mailings. Average call volume to the enrollment center increased by 65.7% in Alabama, the pilot state, and by more than 200% in the subsequent demonstration states. Despite the introduction of the MSN inserts late in the recruitment period, 32.2% of the 7354 total enrollees stated that they learned about the project through the inserts. This recruitment method is highly recommended as a cost-effective way to reach the senior population.
NASA Technical Reports Server (NTRS)
Johnson, F. T.; Samant, S. S.; Bieterman, M. B.; Melvin, R. G.; Young, D. P.; Bussoletti, J. E.; Hilmes, C. L.
1992-01-01
A new computer program, called TranAir, for analyzing complex configurations in transonic flow (with subsonic or supersonic freestream) was developed. This program provides accurate and efficient simulations of nonlinear aerodynamic flows about arbitrary geometries with the ease and flexibility of a typical panel method program. The numerical method implemented in TranAir is described. The method solves the full potential equation subject to a set of general boundary conditions and can handle regions with differing total pressure and temperature. The boundary value problem is discretized using the finite element method on a locally refined rectangular grid. The grid is automatically constructed by the code and is superimposed on the boundary described by networks of panels; thus no surface fitted grid generation is required. The nonlinear discrete system arising from the finite element method is solved using a preconditioned Krylov subspace method embedded in an inexact Newton method. The solution is obtained on a sequence of successively refined grids which are either constructed adaptively based on estimated solution errors or are predetermined based on user inputs. Many results obtained by using TranAir to analyze aerodynamic configurations are presented.
Moseley, Hunter N B; Riaz, Nadeem; Aramini, James M; Szyperski, Thomas; Montelione, Gaetano T
2004-10-01
We present an algorithm and program called Pattern Picker that performs editing of raw peak lists derived from multidimensional NMR experiments with characteristic peak patterns. Pattern Picker detects groups of correlated peaks within peak lists from reduced dimensionality triple resonance (RD-TR) NMR spectra, with high fidelity and high yield. With typical quality RD-TR NMR data sets, Pattern Picker performs almost as well as human analysis, and is very robust in discriminating real peak sets from noise and other artifacts in unedited peak lists. The program uses a depth-first search algorithm with short-circuiting to efficiently explore a search tree representing every possible combination of peaks forming a group. The Pattern Picker program is particularly valuable for creating an automated peak picking/editing process. The Pattern Picker algorithm can be applied to a broad range of experiments with distinct peak patterns including RD, G-matrix Fourier transformation (GFT) NMR spectra, and experiments to measure scalar and residual dipolar coupling, thus promoting the use of experiments that are typically harder for a human to analyze. Since the complexity of peak patterns becomes a benefit rather than a drawback, Pattern Picker opens new opportunities in NMR experiment design.
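The depth-first search with short-circuiting can be sketched on a simplified one-dimensional analogue. The peak positions, offsets, and tolerance below are hypothetical; the real Pattern Picker works on multidimensional NMR peak lists.

```python
# Toy depth-first pattern search: given observed peak positions and an
# expected pattern of offsets relative to an anchor peak, enumerate the peak
# groups that realise the pattern within a tolerance. A branch is abandoned
# ("short-circuited") as soon as any offset in the prefix has no match, so
# the full combinatorial tree is never expanded.

def find_groups(peaks, offsets, tol=0.05):
    """Return groups [anchor, anchor+offsets[0], ...] present in `peaks`."""
    peaks = sorted(peaks)
    groups = []

    def extend(group, remaining):
        if not remaining:
            groups.append(group[:])         # full pattern matched
            return
        target = group[0] + remaining[0]
        for p in peaks:
            if abs(p - target) <= tol:      # only matching peaks extend the branch
                group.append(p)
                extend(group, remaining[1:])
                group.pop()
        # no peak near `target`: this branch dies here without recursing

    for anchor in peaks:
        extend([anchor], offsets)
    return groups

# Peaks near 1.0 and 7.2 each carry the expected (+0.5, +1.0) pattern;
# 3.0 is an isolated artifact peak with no partners.
peaks = [1.0, 1.5, 2.0, 3.0, 7.2, 7.7, 8.2]
groups = find_groups(peaks, offsets=[0.5, 1.0])
```

Noise peaks fail the prefix test immediately, which is how this style of search discriminates real peak sets from artifacts without enumerating every combination.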
NASA Technical Reports Server (NTRS)
1975-01-01
This NASA Dryden Flight Research Center photograph taken in 1975 shows the General Dynamics IPCS/F-111E Aardvark with a camouflage paint pattern. This prototype F-111E was used during the flight testing of the Integrated Propulsion Control System (IPCS). The wings of the IPCS/F-111E are swept back to near 60 degrees for supersonic flight. During the same period as the F-111 TACT program, an F-111E Aardvark (#67-0115) was flown at the NASA Flight Research Center to investigate an electronic versus a conventional hydro-mechanical controlled engine. The program, called the integrated propulsion control system (IPCS), was a joint effort by NASA's Lewis Research Center and Flight Research Center, the Air Force's Flight Propulsion Laboratory, and the Boeing, Honeywell, and Pratt & Whitney companies. The left engine of the F-111E was selected for modification to an all-electronic system. A Pratt & Whitney TF30-P-9 engine was modified and extensively laboratory- and ground-tested before installation into the F-111E. There were 14 IPCS flights made from 1975 through 1976. The flight demonstration program proved an engine could be controlled electronically, leading to a more efficient Digital Electronic Engine Control System flown in the F-15.
GPUmotif: An Ultra-Fast and Energy-Efficient Motif Analysis Program Using Graphics Processing Units
Zandevakili, Pooya; Hu, Ming; Qin, Zhaohui
2012-01-01
Computational detection of TF binding patterns has become an indispensable tool in functional genomics research. With the rapid advance of new sequencing technologies, large amounts of protein-DNA interaction data have been produced. Analyzing this data can provide substantial insight into the mechanisms of transcriptional regulation. However, the massive amount of sequence data presents daunting challenges. In our previous work, we have developed a novel algorithm called Hybrid Motif Sampler (HMS) that enables more scalable and accurate motif analysis. Despite much improvement, HMS is still time-consuming due to the requirement to calculate matching probabilities position-by-position. Using the NVIDIA CUDA toolkit, we developed a graphics processing unit (GPU)-accelerated motif analysis program named GPUmotif. We proposed a "fragmentation" technique to hide data transfer time between memories. Performance comparison studies showed that commonly-used model-based motif scan and de novo motif finding procedures such as HMS can be dramatically accelerated when running GPUmotif on NVIDIA graphics cards. As a result, energy consumption can also be greatly reduced when running motif analysis using GPUmotif. The GPUmotif program is freely available at http://sourceforge.net/projects/gpumotif/ PMID:22662128
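The position-by-position matching-probability computation that dominates the running time can be sketched for a toy position weight matrix. The motif and sequence below are hypothetical; the sketch shows the serial loop that a GPU implementation parallelises across window positions.

```python
import math

# Toy position weight matrix (PWM) for a hypothetical 3-bp motif; each row
# gives P(base) at that motif position. The background model is uniform.
PWM = [{"A": 0.7, "C": 0.1, "G": 0.1, "T": 0.1},
       {"A": 0.1, "C": 0.1, "G": 0.7, "T": 0.1},
       {"A": 0.1, "C": 0.7, "G": 0.1, "T": 0.1}]
BACKGROUND = 0.25

def scan(sequence, pwm):
    """Position-by-position motif scan: the log-likelihood-ratio score of the
    motif model versus background at every window start. Each window is
    independent of the others, which is what makes the scan GPU-friendly."""
    w = len(pwm)
    scores = []
    for i in range(len(sequence) - w + 1):
        s = sum(math.log(pwm[j][sequence[i + j]] / BACKGROUND) for j in range(w))
        scores.append(s)
    return scores

seq = "TTAGCTT"
scores = scan(seq, PWM)
best = max(range(len(scores)), key=scores.__getitem__)   # top-scoring window
```

On real data this loop runs over millions of windows per sequence set, so mapping one window per GPU thread yields the kind of speedup the abstract reports.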
Risk-Constrained Dynamic Programming for Optimal Mars Entry, Descent, and Landing
NASA Technical Reports Server (NTRS)
Ono, Masahiro; Kuwata, Yoshiaki
2013-01-01
A chance-constrained dynamic programming algorithm was developed that is capable of making optimal sequential decisions within a user-specified risk bound. This work handles stochastic uncertainties over multiple stages in the CEMAT (Combined EDL-Mobility Analyses Tool) framework. It was demonstrated by a simulation of Mars entry, descent, and landing (EDL) using real landscape data obtained from the Mars Reconnaissance Orbiter. Although standard dynamic programming (DP) provides a general framework for optimal sequential decision-making under uncertainty, it typically achieves risk aversion by imposing an arbitrary penalty on failure states. Such a penalty-based approach cannot explicitly bound the probability of mission failure. A key idea behind the new approach is called risk allocation, which decomposes a joint chance constraint into a set of individual chance constraints and distributes risk over them. The joint chance constraint was reformulated into a constraint on an expectation over a sum of an indicator function, which can be incorporated into the cost function by dualizing the optimization problem. As a result, the chance-constrained optimization problem can be turned into an unconstrained optimization over a Lagrangian, which can be solved efficiently using a standard DP approach.
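The difference between a joint chance constraint and a failure penalty can be illustrated on a toy two-stage problem. The costs and failure probabilities are hypothetical, and exhaustive enumeration stands in for the DP and Lagrangian machinery described above; risk allocation corresponds to how the bound is split across the chosen stages.

```python
from itertools import product

# Toy two-stage decision problem (hypothetical numbers): at each stage choose
# "safe" (higher cost, lower failure risk) or "fast" (cheaper, riskier).
ACTIONS = {"safe": {"cost": 10.0, "p_fail": 0.01},
           "fast": {"cost": 4.0, "p_fail": 0.10}}

def best_plan(risk_bound, n_stages=2):
    """Minimise total cost subject to the joint chance constraint
    P(any stage fails) <= risk_bound, by enumerating all action sequences.
    Unlike a penalty on failure states, the bound is enforced explicitly."""
    best = None
    for plan in product(ACTIONS, repeat=n_stages):
        cost = sum(ACTIONS[a]["cost"] for a in plan)
        p_ok = 1.0
        for a in plan:
            p_ok *= 1.0 - ACTIONS[a]["p_fail"]
        p_fail = 1.0 - p_ok          # joint probability that the plan fails
        if p_fail <= risk_bound and (best is None or cost < best[1]):
            best = (plan, cost, p_fail)
    return best

plan, cost, risk = best_plan(risk_bound=0.12)
```

Tightening or loosening the bound moves the optimum between all-safe, mixed, and all-fast plans, which a fixed failure penalty cannot guarantee for a user-specified risk level.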
An index-based algorithm for fast on-line query processing of latent semantic analysis
Zhang, Mingxi; Li, Pohan; Wang, Wei
2017-01-01
Latent Semantic Analysis (LSA) is widely used for finding documents whose semantics are similar to a query of keywords. Although LSA yields promising similarity results, existing LSA algorithms involve many unnecessary operations in similarity computation and candidate checking during on-line query processing, which is expensive in terms of time cost and cannot efficiently serve query requests, especially when the dataset becomes large. In this paper, we study the efficiency problem of on-line query processing for LSA, towards efficiently searching for the documents similar to a given query. We rewrite the similarity equation of LSA using an intermediate value, called the partial similarity, that is stored in a purpose-built index called the partial index. To reduce the search space, we give an approximate form of the similarity equation, and then develop an efficient algorithm for building the partial index, which skips partial similarities lower than a given threshold θ. Based on the partial index, we develop an efficient algorithm called ILSA for supporting fast on-line query processing. The given query is transformed into a pseudo-document vector, and the similarities between the query and candidate documents are computed by accumulating the partial similarities obtained from the index nodes corresponding to non-zero entries in the pseudo-document vector. Compared to the LSA algorithm, ILSA reduces the time cost of on-line query processing by pruning candidate documents that are not promising and skipping operations that contribute little to similarity scores. Extensive experiments comparing against LSA demonstrate the efficiency and effectiveness of our proposed algorithm. PMID:28520747
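A minimal sketch of the partial-index idea follows. It is simplified: the document and query vectors are assumed to be already projected into the LSA concept space, the "partial similarity" of a document along a dimension is taken to be its coordinate there, and pruning uses a fixed threshold θ.

```python
# Build an inverted index over concept-space dimensions. Per dimension, the
# index stores each document's coordinate (its partial-similarity
# contribution), skipping entries whose magnitude falls below theta.

def build_partial_index(docs, theta=0.05):
    index = {}
    for doc_id, vec in docs.items():
        for dim, value in enumerate(vec):
            if abs(value) >= theta:                 # skip negligible entries
                index.setdefault(dim, []).append((doc_id, value))
    return index

def query(index, query_vec):
    """Accumulate approximate dot-product similarities by walking only the
    index lists for the query's non-zero dimensions."""
    scores = {}
    for dim, q in enumerate(query_vec):
        if q == 0.0:
            continue                                # skip empty query entries
        for doc_id, value in index.get(dim, ()):
            scores[doc_id] = scores.get(doc_id, 0.0) + q * value
    return sorted(scores.items(), key=lambda kv: -kv[1])

docs = {"d1": [0.9, 0.1, 0.0],
        "d2": [0.2, 0.8, 0.01],
        "d3": [0.0, 0.02, 0.95]}
ranked = query(build_partial_index(docs), [1.0, 0.5, 0.0])
```

Document d3 never enters the accumulation at all: its only large coordinate lies on a dimension the query does not touch, and its small coordinate was pruned at index-build time. That is exactly the candidate pruning the abstract credits for the speedup.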
First-order irreversible thermodynamic approach to a simple energy converter
NASA Astrophysics Data System (ADS)
Arias-Hernandez, L. A.; Angulo-Brown, F.; Paez-Hernandez, R. T.
2008-01-01
Several authors have shown that dissipative thermal cycle models based on finite-time thermodynamics exhibit loop-shaped curves of power output versus efficiency, as occurs with actual dissipative thermal engines. Within the context of first-order irreversible thermodynamics (FOIT), in this work we show that for an energy converter consisting of two coupled fluxes it is also possible to find loop-shaped curves of both power output and the so-called ecological function versus efficiency. In a previous work Stucki [J. W. Stucki, Eur. J. Biochem. 109, 269 (1980)] used a FOIT approach to describe the modes of thermodynamic performance of oxidative phosphorylation involved in adenosine triphosphate (ATP) synthesis within mitochondria. In that work the author did not use the mentioned loop-shaped curves and he proposed that oxidative phosphorylation operates in a steady state at both minimum entropy production and maximum efficiency simultaneously, by means of a conductance matching condition between extreme states of zero and infinite conductances, respectively. In the present work we show that all of Stucki's results about the oxidative phosphorylation energetics can be obtained without the so-called conductance matching condition. On the other hand, we also show that the minimum entropy production state implies both null power output and efficiency and therefore this state is not fulfilled by the oxidative phosphorylation performance. Our results suggest that actual efficiency values of oxidative phosphorylation performance are better described by a mode of operation consisting of the simultaneous maximization of both the so-called ecological function and the efficiency.
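The two-coupled-flux setting can be written out explicitly. The following uses the standard FOIT parametrisation found in Caplan-Essig and Stucki-style treatments; the notation is an assumption for illustration, not reproduced from this abstract.

```latex
% Linear two-flux converter: driven flux J_1 (output), driving flux J_2 (input)
J_1 = L_{11} X_1 + L_{12} X_2, \qquad J_2 = L_{21} X_1 + L_{22} X_2
% degree of coupling and reduced force ratio
q = \frac{L_{12}}{\sqrt{L_{11} L_{22}}}, \qquad
x = \frac{X_1}{X_2}\sqrt{\frac{L_{11}}{L_{22}}}
% efficiency of the energy conversion
\eta = -\frac{J_1 X_1}{J_2 X_2} = -\frac{x\,(x+q)}{q\,x + 1}
```

Sweeping the force ratio x at fixed coupling q and plotting the output power, proportional to -J_1 X_1, against η traces the loop-shaped curves discussed above; the ecological function (power minus dissipation) yields a loop of the same kind.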
76 FR 17933 - Infrastructure Protection Data Call Survey
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-31
... Survey AGENCY: National Protection and Programs Directorate, DHS. ACTION: 60-Day Notice and request for... mission, IP requests opinions and information in a survey from IP Data Call participants regarding the IP Data Call process and the web-based application used to collect the CIKR data. The survey data...
77 FR 74828 - Call for Applications for the International Buyer Program Calendar Years 2014 and 2015
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-18
... DEPARTMENT OF COMMERCE International Trade Administration [Docket No. 120913451-2681-02] Call for... Administration, Department of Commerce. ACTION: Notice extending application deadline. SUMMARY: The U.S. Department of Commerce (DOC) is amending the Notice and Call for Applications for the International Buyer...
Pedagogy and Related Criteria: The Selection of Software for Computer Assisted Language Learning
ERIC Educational Resources Information Center
Samuels, Jeffrey D.
2013-01-01
Computer-Assisted Language Learning (CALL) is an established field of academic inquiry with distinct applications for second language teaching and learning. Many CALL professionals direct language labs or language resource centers (LRCs) in which CALL software applications and generic software applications support language learning programs and…
ERIC Educational Resources Information Center
Mangan, Marianne
2013-01-01
Call it physical activity, call it games, or call it play. Whatever its name, it's a place we all need to return to. In the physical education, recreation, and dance professions, we need to redesign programs to address the need for and want of play that is inherent in all of us.
Llusia, Diego; Gómez, Miguel; Penna, Mario; Márquez, Rafael
2013-01-01
Invasive species are a leading cause of the current biodiversity decline, and hence examining the major traits favouring invasion is a key and long-standing goal of invasion biology. Despite the prominent role of advertisement calls in sexual selection and reproduction, very little attention has been paid to the features of acoustic communication of invasive species in nonindigenous habitats and their potential impacts on native species. Here we compare for the first time the transmission efficiency of the advertisement calls of native and invasive species, searching for competitive advantages for acoustic communication and reproduction of introduced taxa, and providing insights into competing hypotheses in the evolutionary divergence of acoustic signals: acoustic adaptation vs. morphological constraints. Using sound propagation experiments, we measured the attenuation rates of pure tones (0.2-5 kHz) and playback calls (Lithobates catesbeianus and Pelophylax perezi) across four distances (1, 2, 4, and 8 m) and over two substrates (water and soil) in seven Iberian localities. All factors considered (signal type, distance, substrate, and locality) affected the transmission efficiency of acoustic signals, which was maximized with lower-frequency sounds, shorter distances, and over the water surface. Despite being broadcast in nonindigenous habitats, the advertisement calls of invasive L. catesbeianus propagated more efficiently than those of the native species, in both aquatic and terrestrial substrates and in most of the study sites. This implies the absence of an optimal relationship between native environments and the propagation of acoustic signals in anurans, in contrast to what is predicted by the acoustic adaptation hypothesis, and it might render these vertebrates particularly vulnerable to intrusion by invasive species producing low-frequency signals, such as L. catesbeianus.
Our findings suggest that mechanisms optimizing sound transmission in the native habitat can play a less significant role than other selective forces or biological constraints in the evolutionary design of anuran acoustic signals.
PMID:24155940
Code of Federal Regulations, 2012 CFR
2012-01-01
... 41 Public Contracts and Property Management 3 2012-01-01 2012-01-01 false When an agency's mission... REGULATION REAL PROPERTY 83-LOCATION OF SPACE Location of Space Urban Areas § 102-83.110 When an agency's mission and program requirements call for the location in an urban area, are Executive agencies required...
Code of Federal Regulations, 2010 CFR
2010-07-01
... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false When an agency's mission... REGULATION REAL PROPERTY 83-LOCATION OF SPACE Location of Space Urban Areas § 102-83.110 When an agency's mission and program requirements call for the location in an urban area, are Executive agencies required...
Code of Federal Regulations, 2011 CFR
2011-01-01
... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false When an agency's mission... REGULATION REAL PROPERTY 83-LOCATION OF SPACE Location of Space Urban Areas § 102-83.110 When an agency's mission and program requirements call for the location in an urban area, are Executive agencies required...
Code of Federal Regulations, 2014 CFR
2014-01-01
... 41 Public Contracts and Property Management 3 2014-01-01 2014-01-01 false When an agency's mission... REGULATION REAL PROPERTY 83-LOCATION OF SPACE Location of Space Urban Areas § 102-83.110 When an agency's mission and program requirements call for the location in an urban area, are Executive agencies required...
Code of Federal Regulations, 2013 CFR
2013-07-01
... 41 Public Contracts and Property Management 3 2013-07-01 2013-07-01 false When an agency's mission... REGULATION REAL PROPERTY 83-LOCATION OF SPACE Location of Space Urban Areas § 102-83.110 When an agency's mission and program requirements call for the location in an urban area, are Executive agencies required...
Demonstrating the value of a social science research program to a natural resource management agency
Pamela J. Jakes; John F. Dwyer; Deborah S. Carr
1998-01-01
With ever-tightening resources to address an increased number of diverse and complex issues, it has become common for scientists and managers to be called upon to demonstrate the value of their programs. In the spring of 1995, social scientists at the USDA Forest Service North Central Forest Experiment Station were so called upon. This paper discusses an effort to...
Prins, Pjotr; Goto, Naohisa; Yates, Andrew; Gautier, Laurent; Willis, Scooter; Fields, Christopher; Katayama, Toshiaki
2012-01-01
Open-source software (OSS) encourages computer programmers to reuse software components written by others. In evolutionary bioinformatics, OSS comes in a broad range of programming languages, including C/C++, Perl, Python, Ruby, Java, and R. To avoid writing the same functionality multiple times for different languages, it is possible to share components by bridging computer languages and Bio* projects, such as BioPerl, Biopython, BioRuby, BioJava, and R/Bioconductor. In this chapter, we compare the two principal approaches for sharing software between different programming languages: either by remote procedure call (RPC) or by sharing a local call stack. RPC provides a language-independent protocol over a network interface; examples are RSOAP and Rserve. The local call stack provides a between-language mapping not over the network interface, but directly in computer memory; examples are R bindings, RPy, and languages sharing the Java Virtual Machine stack. This functionality provides strategies for sharing of software between Bio* projects, which can be exploited more often. Here, we present cross-language examples for sequence translation, and measure throughput of the different options. We compare calling into R through native R, RSOAP, Rserve, and RPy interfaces, with the performance of native BioPerl, Biopython, BioJava, and BioRuby implementations, and with call stack bindings to BioJava and the European Molecular Biology Open Software Suite. In general, call stack approaches outperform native Bio* implementations and these, in turn, outperform RPC-based approaches. To test and compare strategies, we provide a downloadable BioNode image with all examples, tools, and libraries included. The BioNode image can be run on VirtualBox-supported operating systems, including Windows, OSX, and Linux.
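The sequence-translation task used above as the cross-language benchmark can be made concrete with a minimal stand-alone translator for the standard genetic code; this is a hedged plain-Python sketch, not code taken from any of the Bio* libraries discussed.

```python
# Standard genetic code (NCBI translation table 1), with codons ordered
# by base index in "TCAG": codon b1 b2 b3 maps to AAS[16*i1 + 4*i2 + i3].
BASES = "TCAG"
AAS = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"

def translate(dna):
    """Translate a DNA string (T alphabet) into a protein string;
    '*' marks stop codons; any trailing partial codon is ignored."""
    protein = []
    for i in range(0, len(dna) - len(dna) % 3, 3):
        codon = dna[i:i + 3].upper()
        idx = (BASES.index(codon[0]) * 16
               + BASES.index(codon[1]) * 4
               + BASES.index(codon[2]))
        protein.append(AAS[idx])
    return "".join(protein)
```

A native in-process function like this is the baseline against which the chapter's RPC round-trips (RSOAP, Rserve) and call-stack bindings are compared.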
Rimmer, James H; Vanderbom, Kerri A
2016-01-01
The growing evidence base of childhood obesity prevention and treatment programs does not adequately consider how to adapt these programs for children with disabilities. We propose a Call to Action for health researchers who conduct studies focused on the general population (i.e., without a disability) to work closely with disability researchers to adapt their programs (e.g., obesity management, increased physical activity, and caregiver training in diet and nutrition) to be relevant to both groups. We refer to this approach as inclusion team science. The hope for this Call to Action is that there will be greater synergy between researchers who have a high level of expertise in a specialty area of health (but little or no knowledge of how to adapt their programs for children with disabilities) and researchers who have a high level of expertise in adapting evidence-based health promotion recommendations and strategies for children with disabilities. Together, these two areas of expertise will lead to inclusive physical activity and nutrition programs for all children.
47 CFR 74.791 - Digital call signs.
Code of Federal Regulations, 2010 CFR
2010-10-01
..., AUXILIARY, SPECIAL BROADCAST AND OTHER PROGRAM DISTRIBUTIONAL SERVICES Low Power TV, TV Translator, and TV... −D. (b) Digital television translator stations. Call signs for digital television translator stations...
NASA Technical Reports Server (NTRS)
Sullivan, Thomas A.; Perchonek, M. H.; Ott, C. M.; Kaiser, M. K.
2011-01-01
Exploration missions will carry crews far beyond the relatively safe environs of cis-lunar space. Such trips will have little or no opportunity for resupply or rapid aborts and will be of a duration that far exceeds our experience to date. The challenges this imposes on the requirements of systems that monitor life support and provide food and shelter for the crew are the focus of much research within the Human Research Program. Making all of these technologies robust and reliable enough for multi-year missions with little or no ability to run for home calls for a thorough understanding of the risks and impacts of failure. The way we currently monitor for microbial contamination of water, air, and surfaces, by sampling and growing cultures on nutrient media, must be reconsidered for exploration missions, which have limited capacity for consumables. Likewise, the shelf life of food must be increased so that the nutrients required to keep the crewmembers healthy do not degrade over the life of the mission. Improved formulations, preservation, packaging, and storage technologies are all being investigated for ways to slow this process or to replace stowed food with key food items grown fresh in situ. Ensuring that the mass and volume of a spacecraft are used to maximum efficiency calls for infusing human factors into the design from its inception to increase efficiency, improve performance, and retain robustness toward operational realities. Integrating the human system with the spacecraft systems is the focus of many lines of investigation.
Ackerman, Sara L.; Boscardin, Christy; Karliner, Leah; Handley, Margaret A.; Cheng, Sarah; Gaither, Tom; Hagey, Jill; Hennein, Lauren; Malik, Faizan; Shaw, Brian; Trinidad, Norver; Zahner, Greg; Gonzales, Ralph
2016-01-01
Problem: Systems-based practice focuses on the organization, financing, and delivery of medical services. The American Association of Medical Colleges has recommended that systems-based practice be incorporated into medical schools' curricula. However, experiential learning in systems-based practice, including practical strategies to improve the quality and efficiency of clinical care, is often absent from or inconsistently included in medical education. Intervention: A multidisciplinary clinician and non-clinician faculty team partnered with a cardiology outpatient clinic to design a nine-month clerkship for first-year medical students focused on systems-based practice, delivery of clinical care, and strategies to improve the quality and efficiency of clinical operations. The clerkship was called the Action Research Program. In 2013-2014, eight trainees participated in educational seminars, research activities, and nine-week clinic rotations. A qualitative process and outcome evaluation drew on interviews with students, clinic staff, and supervising physicians, as well as students' detailed field notes. Context: The Action Research Program was developed and implemented at the University of California, San Francisco, an academic medical center in the U.S. All educational activities took place at the university's medical school and at the medical center's cardiology outpatient clinic. Outcome: Students reported and demonstrated increased understanding of how care delivery systems work, improved clinical skills, growing confidence in interactions with patients, and appreciation for patients' experiences. Clinicians reported increased efficiency at the clinic level and improved performance and job satisfaction among medical assistants as a result of their unprecedented mentoring role with students. Some clinicians felt burdened when students shadowed them and asked questions during interactions with patients.
Most student-led improvement projects were not fully implemented. Lessons Learned: The Action Research Program is a small pilot project that demonstrates an innovative pairing of experiential and didactic training in systems-based practice. Lessons learned include the need for dedicated time and faculty support for students' improvement projects, which were the least successful aspect of the program. We recommend that future projects aiming to combine clinical training and quality improvement projects designate distinct blocks of time for trainees to pursue each of these activities independently. In 2014-2015, the University of California, San Francisco School of Medicine incorporated key features of the Action Research Program into the standard curriculum, with plans to build upon this foundation in future curricular innovations. PMID:27064720
Krenzelok, Edward P; Mrvos, Rita
2009-05-01
In 2007, medication identification requests (MIRs) accounted for 26.2% of all calls to U.S. poison centers. MIRs are documented with minimal information, but they still require an inordinate amount of work by specialists in poison information (SPIs). An analysis was undertaken to identify options to reduce the impact of MIRs on both human and financial resources. All MIRs (2003-2007) to a certified regional poison information center were analyzed to determine call patterns and staffing. The data were used to justify an efficient and cost-effective solution. MIRs represented 42.3% of the 2007 call volume. Optimal staffing would require hiring an additional four full-time-equivalent SPIs. An interactive voice response (IVR) system was developed to respond to the MIRs. The IVR was used to develop the Medication Identification System, which allowed the diversion of up to 50% of the MIRs, enhancing surge capacity and allowing specialists to address the more emergent poison exposure calls. This technology is an entirely voice-activated response call management system that collects zip code, age, gender, and drug data and stores all responses as .csv files for reporting purposes. The query bank includes the 200 most common MIRs, and the system features text-to-voice synthesis that allows easy modification of the drug identification menu. Callers always have the option of engaging an SPI at any time during the IVR call flow. The IVR is an efficient and effective alternative that creates better staff utilization.
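As a minimal illustration of the .csv storage step described above, the sketch below appends completed IVR sessions to a CSV stream; the field names and function are hypothetical, not taken from the actual Medication Identification System.

```python
import csv

# Hypothetical prompt fields; the real system's schema is not published here.
IVR_FIELDS = ["zip_code", "age", "gender", "drug_query"]

def log_ivr_sessions(sessions, out_file):
    """Write each completed IVR session (a dict keyed by the prompts the
    caller answered) as one CSV row, for later reporting."""
    writer = csv.DictWriter(out_file, fieldnames=IVR_FIELDS)
    writer.writeheader()
    for session in sessions:
        writer.writerow(session)
```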
Evaluation of a Postdischarge Call System Using the Logic Model.
Frye, Timothy C; Poe, Terri L; Wilson, Marisa L; Milligan, Gary
2018-02-01
This mixed-method study was conducted to evaluate a postdischarge call program for congestive heart failure patients at a major teaching hospital in the southeastern United States. The program was implemented based on the premise that it would improve patient outcomes and overall quality of life, but it had never been evaluated for effectiveness. The Logic Model was used to evaluate the input of key staff members to determine whether the outputs and results of the program matched the expectations of the organization. Interviews, online surveys, reviews of existing patient outcome data, and reviews of publicly available program marketing materials were used to ascertain current program output. After analyzing both qualitative and quantitative data from the evaluation, recommendations were made to the organization to improve the effectiveness of the program.
NASA Astrophysics Data System (ADS)
Xing, F.; Masson, R.; Lopez, S.
2017-09-01
This paper introduces a new discrete fracture model accounting for non-isothermal compositional multiphase Darcy flows and complex networks of intersecting, immersed, and non-immersed fractures. The so-called hybrid-dimensional model, using a 2D model in the fractures coupled with a 3D model in the matrix, is first derived rigorously starting from the equi-dimensional matrix-fracture model. Then, it is discretized using a fully implicit time integration combined with the Vertex Approximate Gradient (VAG) finite volume scheme, which is adapted to polyhedral meshes and anisotropic heterogeneous media. The fully coupled systems are assembled and solved in parallel using the Single Program Multiple Data (SPMD) paradigm with one layer of ghost cells. This strategy allows for a local assembly of the discrete systems. An efficient preconditioner is implemented to solve the linear systems at each time step and each Newton-type iteration of the simulation. The numerical efficiency of our approach is assessed on different meshes, fracture networks, and physical settings in terms of parallel scalability, nonlinear convergence, and linear convergence.
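The one-ghost-layer SPMD strategy can be illustrated with a toy 1D Jacobi smoother; the flat lists below stand in for MPI ranks and are purely illustrative, not the paper's VAG discretization.

```python
def exchange_ghosts(subdomains):
    """Fill one layer of ghost cells per subdomain: each 'rank' copies the
    boundary value of its neighbour. Domain edges reuse the local boundary
    value (an assumed zero-flux condition for this sketch)."""
    ghosts = []
    last = len(subdomains) - 1
    for rank, u in enumerate(subdomains):
        left = subdomains[rank - 1][-1] if rank > 0 else u[0]
        right = subdomains[rank + 1][0] if rank < last else u[-1]
        ghosts.append((left, right))
    return ghosts

def jacobi_step(subdomains):
    """Local assembly: each rank updates only its own cells, reading
    neighbour data exclusively through its ghost layer."""
    new = []
    for u, (gl, gr) in zip(subdomains, exchange_ghosts(subdomains)):
        ext = [gl] + u + [gr]  # local cells padded with ghost values
        new.append([(ext[i - 1] + ext[i + 1]) / 2.0
                    for i in range(1, len(ext) - 1)])
    return new
```

Because each update touches only local cells plus the one ghost layer, assembly needs no global data structure, which is what makes the approach scale in the parallel runs reported above.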
treeman: an R package for efficient and intuitive manipulation of phylogenetic trees.
Bennett, Dominic J; Sutton, Mark D; Turvey, Samuel T
2017-01-07
Phylogenetic trees are hierarchical structures used for representing the inter-relationships between biological entities. They are the most common tool for representing evolution and are essential to a range of fields across the life sciences. The manipulation of phylogenetic trees, in terms of adding or removing tips, is often performed by researchers not just for reasons of management but also for performing simulations in order to understand the processes of evolution. Despite this, the most common programming language among biologists, R, has few class structures well suited to these tasks. We present an R package that contains a new class, called TreeMan, for representing the phylogenetic tree. This class has a list structure allowing phylogenetic trees to be manipulated more efficiently. Computational running times are reduced because of the ready ability to vectorise and parallelise methods. Development is also improved due to fewer lines of code being required for performing manipulation processes. We present three use cases (pinning missing taxa to a supertree, simulating evolution with a tree-growth model, and detecting significant phylogenetic turnover) that demonstrate the new package's speed and simplicity.
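The advantage of a flat, list-like tree representation for tip manipulation can be sketched in a few lines; this parent-table toy illustrates the general idea only and is not the TreeMan class's actual API.

```python
def add_tip(parent_of, new_tip, sister, new_node):
    """Flat parent-table phylogeny ({node: parent}): inserting a tip means
    splicing one new internal node above its sister - a few dictionary
    writes, with no recursion over nested node objects."""
    parent_of[new_node] = parent_of[sister]
    parent_of[sister] = new_node
    parent_of[new_tip] = new_node

def remove_tip(parent_of, tip):
    """Drop a tip and collapse its parent node if it became redundant
    (i.e., was left with a single child)."""
    node = parent_of.pop(tip)
    siblings = [k for k, v in parent_of.items() if v == node]
    if len(siblings) == 1:
        parent_of[siblings[0]] = parent_of.pop(node)
```

Because every node is a row in one flat table rather than a nested object, operations over all nodes vectorise naturally, which mirrors the running-time argument made in the abstract.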
Giving sustainable agriculture really good odds through innovative rainfall index insurance
NASA Astrophysics Data System (ADS)
Muneepeerakul, C. P.; Muneepeerakul, R.
2017-12-01
Population growth, increasing demands for food, and increasingly uncertain and limited water availability amidst competing demands for water by other users and the environment call for a novel approach to manage water in food production systems to be developed now. Tapping into broad popularity of crop insurance as a risk management intervention, we propose an innovative rainfall index insurance program as a novel systems approach that addresses water conservation in food production systems by exploiting two common currencies that tie the food production systems and others together, namely water and money. Our novel methodology allows for optimizing diverse farm and financial strategies together, revealing strategy portfolios that result in greater water use efficiency and higher incomes at a lower level of water use. Furthermore, it allows targeted interventions to achieve reduction in irrigation water, while providing financial protection to farmers against the increasing uncertainty in water availability. Not only would such a tool result in efficiently less use of water, it would also encourage diversification in farm practices, which reduces the farm's vulnerability against crop price volatility and pest or disease outbreaks and contributes to more sustainable agriculture.
Ecological study of ruffed grouse broods in Virginia
Stewart, R.E.
1956-01-01
The Ruffed Grouse (Bonasa umbellus), commonly called "pheasant" throughout the southern Appalachian region, is a popular game bird in the mountains of Virginia. Unfortunately, however, the grouse populations in this State have declined noticeably during the past fifty years. Because of this, special field studies were designed through the cooperation of the U. S. Fish and Wildlife Service and U. S. Forest Service, which would provide information that could be used in devising more efficient grouse management practices. As part of this program, I was assigned to investigate the ecology and habits of this species in the Shenandoah Mountains during the spring and summer of 1941. These studies were conducted within the George Washington National Forest in northwestern Augusta County, southwestern Rockingham County, and northeastern Highland County, Virginia.
Cavity radiation model for solar central receivers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lipps, F.W.
1981-01-01
The Energy Laboratory of the University of Houston has developed a computer simulation program called CREAM (i.e., Cavity Radiation Exchange Analysis Model) for application to the solar central receiver system. The zone generating capability of CREAM has been used in several solar re-powering studies. CREAM contains a geometric configuration factor generator based on Nusselt's method. A formulation of Nusselt's method provides support for the FORTRAN subroutine NUSSELT. Numerical results from NUSSELT are compared to analytic values and to values from Sparrow's method. Sparrow's method is based on a double contour integral and its reduction to a single integral, which is approximated by Gaussian methods. Nusselt's method is adequate for the intended engineering applications, but Sparrow's method is found to be an order of magnitude more efficient in many situations.
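For orientation, a geometric configuration (view) factor of the kind NUSSELT computes can also be approximated by direct quadrature of the double-area integral F12 = (1/A1) ∬ cosθ1 cosθ2 / (π r²) dA2 dA1; the routine below is a hedged sketch, not CREAM code, evaluating the factor between two coaxial parallel unit squares, whose catalog value at unit separation is about 0.2.

```python
import math

def view_factor_parallel_squares(d=1.0, n=20):
    """Midpoint-rule quadrature of the double-area integral for two
    coaxial parallel unit squares separated by distance d. For parallel
    surfaces cos(theta1) = cos(theta2) = d / r, so the integrand is
    d^2 / (pi * r^4)."""
    h = 1.0 / n
    pts = [(i + 0.5) * h for i in range(n)]  # cell midpoints on each axis
    total = 0.0
    for x1 in pts:
        for y1 in pts:
            for x2 in pts:
                for y2 in pts:
                    r2 = d * d + (x1 - x2) ** 2 + (y1 - y2) ** 2
                    total += d * d / (math.pi * r2 * r2)
    return total * h ** 4  # dA1 * dA2; A1 = 1 for a unit square
```

Brute-force area quadrature like this is the expensive baseline; Nusselt's unit-sphere projection and Sparrow's contour-integral reduction exist precisely to avoid the four-dimensional sum.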
Body Building Boons From Apollo
NASA Technical Reports Server (NTRS)
1978-01-01
The Exer-Genie program utilizes familiar types of exercise, such as isometrics (pushing or pulling against an immovable object) and isotonics (motive exercises such as calisthenics or weight lifting) but with the important added factor of controlled resistance. The device is an arrangement of hand grips and nylon cord wrapped around an aluminum shaft. Controlled friction determines the resistance and the user can set the amount of resistive force to his own physical conditioning needs. Since Apollo days, the Exer-Genie and a similar device called the Apollo Exerciser have found wide acceptance among professional, collegiate and high school athletic teams, and among the growing number of individuals interested in physical fitness. These devices are efficient and economical replacements for conventional conditioning equipment and extremely versatile, allowing more than 100 basic exercises for shaping up specific muscle groups.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NREL's Sustainability Program is responsible for upholding all executive orders, federal regulations, U.S. Department of Energy (DOE) orders, and goals related to sustainable and resilient facility operations. But NREL continues to expand sustainable practices above and beyond the laboratory's regulations and requirements to ensure that the laboratory fulfills its mission into the future, leaves the smallest possible legacy footprint, and models sustainable operations and behaviors on national, regional, and local levels. The report, per the GRI reporting format, elaborates on multi-year goals relative to executive orders, achievements, and challenges; and success stories provide specific examples. A section called 'NREL's Resiliency is Taking Many Forms' provides insight into how NREL is drawing on its deep knowledge of renewable energy and energy efficiency to help mitigate or avoid climate change impacts.
The new education frontier: clinical teaching at night.
Hanson, Joshua T; Pierce, Read G; Dhaliwal, Gurpreet
2014-02-01
Regulations that restrict resident work hours and call for increased resident supervision have increased attending physician presence in the hospital during the nighttime. The resulting increased interactions between attendings and trainees provide an important opportunity and obligation to enhance the quality of learning that takes place in the hospital between 6 PM and 8 AM. Nighttime education should be transformed in a way that maintains clinical productivity for both attending and resident physicians, integrates high-quality teaching and curricula, and achieves a balance between patient safety and resident autonomy. Direct observation of trainees, instruction in communication, and modeling of cost-efficient medical practice may be more feasible during the night than during daytime hours. To realize the potential of this educational opportunity, training programs should develop skilled nighttime educators and establish metrics to define success.
A Direct Algorithm Maple Package of One-Dimensional Optimal System for Group Invariant Solutions
NASA Astrophysics Data System (ADS)
Zhang, Lin; Han, Zhong; Chen, Yong
2018-01-01
To construct the one-dimensional optimal system of a finite dimensional Lie algebra automatically, we develop a new Maple package, One Optimal System. Meanwhile, we propose a new method to calculate the adjoint transformation matrix and to find all the invariants of the Lie algebra, using the Killing form to check possible constraints of each classification. Besides, a new concept called the invariance set is introduced. Moreover, this Maple package is shown to be more efficient and precise than previous approaches by applying it to some classic examples. Supported by the Global Change Research Program of China under Grant No. 2015CB95390, National Natural Science Foundation of China under Grant Nos. 11675054 and 11435005, and Shanghai Collaborative Innovation Center of Trustworthy Software for Internet of Things under Grant No. ZF1213
2012-01-01
Background: The economic downturn exacerbates the inadequacy of resources for combating the worldwide HIV/AIDS pandemic and amplifies the need to improve the efficiency of HIV/AIDS programs. Methods: We used data envelopment analysis (DEA) to evaluate the efficiency of national HIV/AIDS programs in transforming funding into services and implemented a Tobit model to identify determinants of the efficiency in 68 low- and middle-income countries. We considered the change from the lowest quartile to the average value of a variable a "notable" increase. Results: Overall, the average efficiency in implementing HIV/AIDS programs was moderate (49.8%). Program efficiency varied enormously among countries, with means by quartile of efficiency of 13.0%, 36.4%, 54.4%, and 96.5%. A country's governance, financing mechanisms, and economic and demographic characteristics influence program efficiency. For example, if countries achieved a notable increase in "voice and accountability" (e.g., greater participation of civil society in policy making), the efficiency of their HIV/AIDS programs would increase by 40.8%. For countries in the lowest quartile of per capita gross national income (GNI), a notable increase in per capita GNI would increase the efficiency of AIDS programs by 45.0%. Conclusions: There may be substantial opportunity for improving the efficiency of AIDS services by providing more services with existing resources. Actions beyond the health sector could be important factors affecting HIV/AIDS service delivery. PMID:22443135
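In the degenerate case of one input and one output per country, the DEA (CCR) efficiency score reduces to each unit's output/input ratio normalized by the best ratio in the sample; the sketch below illustrates that special case only, whereas the study itself solves multi-input linear programs.

```python
def dea_efficiency(units):
    """CCR efficiency for the single-input/single-output special case.
    units: {name: (input, output)}. A unit on the efficient frontier
    (the best output/input ratio) scores 1.0; others score below it."""
    ratios = {name: out / inp for name, (inp, out) in units.items()}
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}
```

With several inputs and outputs, each unit instead gets the weights most favourable to itself via a linear program, but the interpretation of the score (distance to the efficient frontier) is the same.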
76 FR 54748 - State Energy Advisory Board
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-02
... DEPARTMENT OF ENERGY Energy Efficiency and Renewable Energy State Energy Advisory Board AGENCY: Energy Efficiency and Renewable Energy, Department of Energy. ACTION: Notice of open teleconference. SUMMARY: This notice announces a teleconference call of the State Energy Advisory Board (STEAB). The...
76 FR 16763 - State Energy Advisory Board (STEAB)
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-25
... DEPARTMENT OF ENERGY Energy Efficiency and Renewable Energy State Energy Advisory Board (STEAB) AGENCY: Energy Efficiency and Renewable Energy, Department of Energy. ACTION: Notice of open teleconference. SUMMARY: This notice announces a teleconference call of the State Energy Advisory Board (STEAB...
76 FR 60012 - State Energy Advisory Board (STEAB)
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-28
... DEPARTMENT OF ENERGY Energy Efficiency and Renewable Energy State Energy Advisory Board (STEAB) AGENCY: Energy Efficiency and Renewable Energy, Department of Energy. ACTION: Notice of open teleconference. SUMMARY: This notice announces a teleconference call of the State Energy Advisory Board (STEAB...
76 FR 25317 - State Energy Advisory Board (STEAB)
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-04
... DEPARTMENT OF ENERGY Energy Efficiency and Renewable Energy State Energy Advisory Board (STEAB) AGENCY: Energy Efficiency and Renewable Energy, Department of Energy. ACTION: Notice of Open Teleconference. SUMMARY: This notice announces a teleconference call of the State Energy Advisory Board (STEAB...
76 FR 75876 - State Energy Advisory Board (STEAB)
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-05
... DEPARTMENT OF ENERGY Energy Efficiency and Renewable Energy State Energy Advisory Board (STEAB) AGENCY: Energy Efficiency and Renewable Energy, Department of Energy. ACTION: Notice of open teleconference. SUMMARY: This notice announces a teleconference call of the State Energy Advisory Board (STEAB...
NASA Technical Reports Server (NTRS)
Sayfi, Elias
2004-01-01
MER SPICE Interface is a software module for use in conjunction with the Mars Exploration Rover (MER) mission and the SPICE software system of the Navigation and Ancillary Information Facility (NAIF) at NASA's Jet Propulsion Laboratory. (SPICE is used to acquire, record, and disseminate engineering, navigational, and other ancillary data describing circumstances under which data were acquired by spaceborne scientific instruments.) Given a Spacecraft Clock value, MER SPICE Interface extracts MER-specific data from SPICE kernels (essentially, raw data files) and calculates values for Planet Day Number, Local Solar Longitude, Local Solar Elevation, Local Solar Azimuth, and Local Solar Time (UTC). MER SPICE Interface was adapted from a subroutine, denoted m98SpiceIF, written by Payam Zamani, that was intended to calculate SPICE values for the Mars Polar Lander. The main difference between MER SPICE Interface and m98SpiceIF is that MER SPICE Interface does not explicitly call CHRONOS, a time-conversion program that is part of a library of utility subprograms within SPICE. Instead, MER SPICE Interface mimics some portions of the CHRONOS code, the advantage being that it executes much faster and can efficiently be called from a pipeline of events in a parallel processing environment.
Pool, René; Heringa, Jaap; Hoefling, Martin; Schulz, Roland; Smith, Jeremy C; Feenstra, K Anton
2012-05-05
We report on a python interface to the GROMACS molecular simulation package, GromPy (available at https://github.com/GromPy). This application programming interface (API) uses the ctypes python module that allows function calls to shared libraries, for example, written in C. To the best of our knowledge, this is the first reported interface to the GROMACS library that uses direct library calls. GromPy can be used for extending the current GROMACS simulation and analysis modes. In this work, we demonstrate that the interface enables hybrid Monte-Carlo/molecular dynamics (MD) simulations in the grand-canonical ensemble, a simulation mode that is currently not implemented in GROMACS. For this application, the interplay between GromPy and GROMACS requires only minor modifications of the GROMACS source code, not affecting the operation, efficiency, and performance of the GROMACS applications. We validate the grand-canonical application against MD in the canonical ensemble by comparison of equations of state. The results of the grand-canonical simulations are in complete agreement with MD in the canonical ensemble. The python overhead of the grand-canonical scheme is only minimal. Copyright © 2012 Wiley Periodicals, Inc.
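The ctypes mechanism GromPy relies on can be illustrated with any C shared library. A minimal sketch using the standard C math library (the GROMACS-specific library name and symbols are omitted; GromPy loads the GROMACS library through the same ctypes machinery):

```python
import ctypes
import ctypes.util

# Load the standard C math library via ctypes, the same mechanism
# GromPy uses to make direct calls into a shared library written in C.
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the C signature before calling; without this, ctypes
# assumes int arguments and return values.
libm.cos.restype = ctypes.c_double
libm.cos.argtypes = [ctypes.c_double]

print(libm.cos(0.0))  # -> 1.0
```

Declaring `restype` and `argtypes` up front is the step that makes calls with non-integer C types safe; the same pattern applies to each library function an interface like GromPy exposes.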
NASA Astrophysics Data System (ADS)
Gaponenko, A. M.; Kagramanova, A. A.
2017-11-01
The paper considers the application of the Stirling engine with non-conventional and renewable energy sources, the advantages of such use, and derives an expression for the thermal efficiency of the Stirling engine. It is shown that the work per cycle is proportional to the quantity of matter, and hence to the pressure of the working fluid and the temperature difference, and depends to a lesser extent on the expansion coefficient; the efficiency of the ideal Stirling cycle coincides with that of an ideal engine operating on the Carnot cycle, which distinguishes the Stirling cycle from the Otto and Diesel cycles underlying conventional engines. It is established that, of the four input parameters, the only one that can easily be changed during operation, and that effectively affects the operation of the engine, is the phase difference. The dependence of work per cycle on the phase difference, called the phase characteristic, visually illustrates the operating mode of the Stirling engine. A mathematical model of the Schmidt cycle is constructed and the operation of the Stirling engine is analyzed numerically in the Schmidt approximation. A program was written in MATLAB to conduct the numerical experiments; the results are illustrated with graphical charts.
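The Schmidt-style analysis described above can be sketched numerically: isothermal hot and cold spaces with sinusoidal volume variation, and work per cycle computed as the closed integral of p dV. All parameter values below are illustrative, not taken from the paper:

```python
import math

def schmidt_cycle_work(t_hot, t_cold, phase,
                       m_r=1.0, v_swept=1.0, v_dead=0.2, n=2000):
    """Sketch of a Schmidt-style cycle: isothermal spaces, sinusoidal
    volumes, shared ideal-gas pressure. Work per cycle is the closed
    integral of p dV via the trapezoid rule. m_r is gas mass times the
    specific gas constant (it only sets the pressure scale)."""
    thetas = [2 * math.pi * i / n for i in range(n + 1)]
    vols, press = [], []
    for th in thetas:
        v_hot = 0.5 * v_swept * (1 + math.cos(th))
        v_cold = 0.5 * v_swept * (1 + math.cos(th - phase))
        # Dead volume kept at the cold temperature to avoid a singular
        # denominator when both swept volumes pass through zero.
        p = m_r / (v_hot / t_hot + (v_cold + v_dead) / t_cold)
        vols.append(v_hot + v_cold)
        press.append(p)
    return sum(0.5 * (press[i] + press[i + 1]) * (vols[i + 1] - vols[i])
               for i in range(n))
```

Evaluating the work over a sweep of phase differences traces out the phase characteristic the authors describe; in this sketch, reversing the sign of the phase difference reverses the sign of the work, and zero phase difference yields zero net work.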
Handling Data Skew in MapReduce Cluster by Using Partition Tuning
Gao, Yufei; Zhou, Yanjie; Zhou, Bing; Shi, Lei; Zhang, Jiacai
2017-01-01
The healthcare industry has generated large amounts of data, and analyzing these has emerged as an important problem in recent years. The MapReduce programming model has been successfully used for big data analytics. However, data skew invariably occurs in big data analytics and seriously affects efficiency. To overcome the data skew problem in MapReduce, we have in the past proposed a data processing algorithm called Partition Tuning-based Skew Handling (PTSH). In comparison with the one-stage partitioning strategy used in the traditional MapReduce model, PTSH uses a two-stage strategy and the partition tuning method to disperse key-value pairs in virtual partitions and recombines each partition in case of data skew. The robustness and efficiency of the proposed algorithm were tested on a wide variety of simulated datasets and real healthcare datasets. The results showed that PTSH algorithm can handle data skew in MapReduce efficiently and improve the performance of MapReduce jobs in comparison with the native Hadoop, Closer, and locality-aware and fairness-aware key partitioning (LEEN). We also found that the time needed for rule extraction can be reduced significantly by adopting the PTSH algorithm, since it is more suitable for association rule mining (ARM) on healthcare data. © 2017 Yufei Gao et al.
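The two-stage idea can be sketched in miniature: hash key-value pairs into many small virtual partitions first, then pack those partitions onto reducers by current load rather than by a fixed hash. This illustrates the partition-tuning concept only; it is not the PTSH implementation, and all names are hypothetical:

```python
import heapq

def tune_partitions(pairs, n_virtual, n_reducers):
    """Stage 1: hash pairs into many virtual partitions.
    Stage 2: greedily pack virtual partitions onto reducers by
    current load, so a few heavy keys cannot overload one reducer
    while its neighbors sit idle."""
    virtual = [[] for _ in range(n_virtual)]
    for key, value in pairs:
        virtual[hash(key) % n_virtual].append((key, value))
    reducers = [[] for _ in range(n_reducers)]
    loads = [(0, i) for i in range(n_reducers)]
    heapq.heapify(loads)
    # Largest virtual partition first, onto the least-loaded reducer.
    for part in sorted(virtual, key=len, reverse=True):
        load, i = heapq.heappop(loads)
        reducers[i].extend(part)
        heapq.heappush(loads, (load + len(part), i))
    return reducers

# Skewed toy data: one hot key plus a few light keys.
pairs = [("hot", 1)] * 90 + [(str(i), 1) for i in range(10)]
parts = tune_partitions(pairs, n_virtual=16, n_reducers=4)
print([len(p) for p in parts])
```

A one-stage hash partitioner would send every pair with the hot key, plus whatever light keys collide with it, to one fixed reducer; the second stage here at least isolates the hot partition and spreads the remaining partitions across the other reducers.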
ERIC Educational Resources Information Center
Pressley, Michael; And Others
1994-01-01
Describes a comprehension strategies instruction program called Students Achieving Independent Learning (SAIL). Relates the program to reader response and transactional theories of reading. Shows how the program works in one school system. Compares SAIL with basal series instruction programs. (HB)
31 CFR 50.94 - Data call authority.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 31 Money and Finance: Treasury 1 2010-07-01 2010-07-01 false Data call authority. 50.94 Section 50.94 Money and Finance: Treasury Office of the Secretary of the Treasury TERRORISM RISK INSURANCE PROGRAM Cap on Annual Liability § 50.94 Data call authority. For the purpose of determining initial or...
Formal and Informal CALL Preparation and Teacher Attitude toward Technology
ERIC Educational Resources Information Center
Kessler, Greg
2007-01-01
Recent research suggests that there is a general lack of a computer-assisted language learning (CALL) presence in teacher preparation programs. There is also evidence that teachers obtain a majority of their CALL knowledge from informal sources and personal experience rather than through formalized preparation. Further, graduates of these programs…
Ocean Drilling Program: TAMRF Administrative Services: Meeting, Travel, and Port-Call Information
All ODP meeting and port-call activities are complete.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-15
... (IP), Infrastructure Information Collection Division (IICD) published a 60-day comment period notice..., ``IP Data Call.'' This is a correction notice to correct the title of the published 60-day notice to read, ``IP Data Call Survey.'' There are no further updates. This correction notice is issued as...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hughes, T.P.; Clark, R.M.; Mostrom, M.A.
This report discusses the following topics on the LAMDA program: General maintenance; CTSS FCL script; DOS batch files; Macintosh MPW scripts; UNICOS FCL script; VAX/MS command file; LINC calling tree; and LAMDA calling tree.
NASA Technical Reports Server (NTRS)
Spuler, Linda M.; Ford, Patricia K.; Skeete, Darren C.; Hershman, Scot; Raviprakash, Pushpa; Arnold, John W.; Tran, Victor; Haenze, Mary Alice
2005-01-01
"Close Call Action Log Form" ("CCALF") is the name of both a computer program and a Web-based service provided by the program for creating an enhanced database of close calls (in the colloquial sense of mishaps that were avoided by small margins) assigned to the Center Operations Directorate (COD) at Johnson Space Center. CCALF provides a single facility for on-line collaborative review of close calls. Through CCALF, managers can delegate responses to employees. CCALF utilizes a pre-existing e-mail system to notify managers that there are close calls to review, but eliminates the need for the prior practices of passing multiple e-mail messages around the COD, then collecting and consolidating them into final responses: CCALF now collects comments from all responders for incorporation into reports that it generates. Also, whereas it was previously necessary to manually calculate metrics (e.g., numbers of maintenance-work orders necessitated by close calls) for inclusion in the reports, CCALF now computes the metrics, summarizes them, and displays them in graphical form. The reports and all pertinent information used to generate the reports are logged, tracked, and retained by CCALF for historical purposes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Messenger, Mike; Bharvirkar, Ranjit; Golemboski, Bill
Public and private funding for end-use energy efficiency actions is expected to increase significantly in the United States over the next decade. For example, Barbose et al (2009) estimate that spending on ratepayer-funded energy efficiency programs in the U.S. could increase from $3.1 billion in 2008 to $7.5 and $12.4 billion by 2020 under their medium and high scenarios. This increase in spending could yield annual electric energy savings ranging from 0.58% - 0.93% of total U.S. retail sales in 2020, up from 0.34% of retail sales in 2008. Interest in and support for energy efficiency has broadened among national and state policymakers. Prominent examples include approximately $18 billion in new funding for energy efficiency programs (e.g., State Energy Program, Weatherization, and Energy Efficiency and Conservation Block Grants) in the 2009 American Recovery and Reinvestment Act (ARRA). Increased funding for energy efficiency should result in more benefits as well as more scrutiny of these results. As energy efficiency becomes a more prominent component of the U.S. national energy strategy and policies, assessing the effectiveness and energy saving impacts of energy efficiency programs is likely to become increasingly important for policymakers and private and public funders of efficiency actions. Thus, it is critical that evaluation, measurement, and verification (EM&V) is carried out effectively and efficiently, which implies that: (1) Effective program evaluation, measurement, and verification (EM&V) methodologies and tools are available to key stakeholders (e.g., regulatory agencies, program administrators, consumers, and evaluation consultants); and (2) Capacity (people and infrastructure resources) is available to conduct EM&V activities and report results in ways that support program improvement and provide data that reliably compares achieved results against goals and similar programs in other jurisdictions (benchmarking).
The National Action Plan for Energy Efficiency (2007) presented commonly used definitions for EM&V in the context of energy efficiency programs: (1) Evaluation (E) - The performance of studies and activities aimed at determining the effects and effectiveness of EE programs; (2) Measurement and Verification (M&V) - Data collection, monitoring, and analysis associated with the calculation of gross energy and demand savings from individual measures, sites or projects. M&V can be a subset of program evaluation; and (3) Evaluation, Measurement, and Verification (EM&V) - This term is frequently seen in evaluation literature. EM&V is a catchall acronym for determining both the effectiveness of program designs and estimates of load impacts at the portfolio, program and project level. This report is a scoping study that assesses current practices and methods in the evaluation, measurement and verification (EM&V) of ratepayer-funded energy efficiency programs, with a focus on methods and practices currently used for determining whether projected (ex-ante) energy and demand savings have been achieved (ex-post). M&V practices for privately-funded energy efficiency projects (e.g., ESCO projects) or programs where the primary focus is greenhouse gas reductions were not part of the scope of this study. We identify and discuss key purposes and uses of current evaluations of end-use energy efficiency programs, methods used to evaluate these programs, processes used to determine those methods; and key issues that need to be addressed now and in the future, based on discussions with regulatory agencies, policymakers, program administrators, and evaluation practitioners in 14 states and national experts in the evaluation field. 
We also explore how EM&V may evolve in a future in which efficiency funding increases significantly, innovative mechanisms for rewarding program performance are adopted, the role of efficiency in greenhouse gas mitigation is more closely linked, and programs are increasingly funded from multiple sources often with multiple program administrators and intended to meet multiple purposes.
Coberley, Carter R; McGinnis, Matthew; Orr, Patty M; Coberley, Sadie S; Hobgood, Adam; Hamar, Brent; Gandy, Bill; Pope, James; Hudson, Laurel; Hara, Pam; Shurney, Dexter; Clarke, Janice L; Crawford, Albert; Goldfarb, Neil I
2007-04-01
Diabetes disease management (DM) programs strive to promote healthy behaviors, including obtaining hemoglobin A1c (A1c) and low-density lipoprotein (LDL) tests as part of standards of care. The purpose of this study was to examine the relationship between frequency of telephonic contact and A1c and LDL testing rates. A total of 245,668 members continuously enrolled in diabetes DM programs were evaluated for performance of an A1c or LDL test during their first 12 months in the programs. The association between the number of calls a member received and clinical testing rates was examined. Members who received four calls demonstrated a 24.1% and 21.5% relative increase in A1c and LDL testing rates, respectively, compared to members who received DM mailings alone. Response to the telephonic intervention as part of the diabetes DM programs was influenced by member characteristics including gender, age, and disease burden. For example, females who received four calls achieved a 27.7% and 23.6% increase in A1c and LDL testing, respectively, compared to females who received mailings alone; by comparison, males who were called achieved 21.2% and 19.9% relative increase in A1c and LDL testing, respectively, compared to those who received mailings alone. This study demonstrates a positive association between frequency of telephonic contact and increased performance of an A1c or LDL test in a large, diverse diabetes population participating in DM programs. The impact of member characteristics on the responsiveness to these programs provides DM program designers with knowledge for developing strategies to promote healthy behaviors and improve diabetes outcomes.
Estimating customer electricity savings from projects installed by the U.S. ESCO industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carvallo, Juan Pablo; Larsen, Peter H.; Goldman, Charles A.
The U.S. energy service company (ESCO) industry has a well-established track record of delivering substantial energy and dollar savings in the public and institutional facilities sector, typically through the use of energy savings performance contracts (ESPC) (Larsen et al. 2012; Goldman et al. 2005; Hopper et al. 2005; Stuart et al. 2013). This ~$6.4 billion industry, which is expected to grow significantly over the next five years, may play an important role in achieving demand-side energy efficiency under local/state/federal environmental policy goals. To date, there has been little or no research in the public domain to estimate electricity savings for the entire U.S. ESCO industry. Estimating these savings levels is a foundational step in order to determine total avoided greenhouse gas (GHG) emissions from demand-side energy efficiency measures installed by U.S. ESCOs. We introduce a method to estimate the total amount of electricity saved by projects implemented by the U.S. ESCO industry using the Lawrence Berkeley National Laboratory (LBNL)/National Association of Energy Service Companies (NAESCO) database of projects and LBNL’s biennial industry survey. We report two metrics: incremental electricity savings and savings from ESCO projects that are active in a given year (e.g., 2012). Overall, we estimate that in 2012 active U.S. ESCO industry projects generated about 34 TWh of electricity savings; 15 TWh of these electricity savings were for MUSH market customers who did not rely on utility customer-funded energy efficiency programs (see Figure 1). This analysis shows that almost two-thirds of 2012 electricity savings in municipal, local and state government facilities, universities/colleges, K-12 schools, and healthcare facilities (i.e., the so-called “MUSH” market) were not supported by a utility customer-funded energy efficiency program.
Transportation Energy Efficiency Program (TEEP) Report Abstracts
DOT National Transportation Integrated Search
1977-04-15
This bibliography summarizes the published research accomplished for the Department of Transportation's Transportation Energy Efficiency Program and its predecessor, the Automotive Energy Efficiency Program. The reports are indexed by corporate autho...
Wright, J. A.; Phillips, B.D.; Watson, B.L.; Newby, P.K.; Norman, G. J.; Adams, W.G.
2013-01-01
Objective To evaluate the acceptability and feasibility of a scalable obesity treatment program integrated with pediatric primary care and delivered using interactive voice response (IVR) technology to families from underserved populations. Design and Methods Fifty parent-child dyads (child 9–12 yrs, BMI >95th percentile) were recruited from a pediatric primary care clinic and randomized to either an IVR or a wait-list control (WLC) group. The majority were lower-income, African-American (72%) families. Dyads received IVR calls for 12 weeks. Call content was informed by two evidence-based interventions. Anthropometric and behavioral variables were assessed at baseline and 3-month follow-up. Results Forty-three dyads completed the study. IVR parents ate 1 cup more fruit than WLC (p < .05). No other group differences were found. Children classified as high users of the IVR decreased weight, BMI, and BMI z-score compared to low users (p < .05). Mean numbers of calls for parents and children were 9.1 (5.2 SD) and 9.0 (5.7 SD), respectively. Of those who made calls, >75% agreed that the calls were useful, made for people like them, credible, and helped them eat healthy foods. Conclusion An obesity treatment program delivered via IVR may be an acceptable and feasible resource for families from underserved populations. PMID:23512915
Saurman, Emily; Lyle, David; Kirby, Sue; Roberts, Russell
2014-07-31
The Mental Health Emergency Care-Rural Access Program (MHEC-RAP) is a telehealth solution providing specialist emergency mental health care to rural and remote communities across western NSW, Australia. This is the first time and motion (T&M) study to examine program efficiency and capacity for a telepsychiatry program. Clinical services are an integral aspect of the program accounting for 6% of all activities and 50% of the time spent conducting program activities, but half of this time is spent completing clinical paperwork. This finding emphasizes the importance of these services to program efficiency and the need to address variability of service provision to impact capacity. Currently, there is no efficiency benchmark for emergency telepsychiatry programs. Findings suggest that MHEC-RAP could increase its activity without affecting program responsiveness. T&M studies not only determine activity and time expenditure, but have a wider application assessing program efficiency by understanding, defining, and calculating capacity. T&M studies can inform future program development of MHEC-RAP and similar telehealth programs, both in Australia and overseas.
Contextuality and Cultural Texts: A Case Study of Workplace Learning in Call Centres
ERIC Educational Resources Information Center
Crouch, Margaret
2006-01-01
Purpose: The paper seeks to show the contextualisation of call centres as a work-specific ethnographically and culturally based community, which, in turn, influences pedagogical practices through the encoding and decoding of cultural texts in relation to two logics: cost-efficiency and customer-orientation. Design/methodology/approach: The paper…
76 FR 36103 - State Energy Advisory Board (STEAB)
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-21
... DEPARTMENT OF ENERGY Office of Energy Efficiency and Renewable Energy State Energy Advisory Board... Open Teleconference. SUMMARY: This notice announces an open teleconference call of the State Energy... Energy Efficiency and Renewable Energy, 1000 Independence Ave, SW., Washington DC, 20585 or telephone...
77 FR 43067 - State Energy Advisory Board
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-23
... DEPARTMENT OF ENERGY Office of Energy Efficiency and Renewable Energy State Energy Advisory Board AGENCY: Office of Energy Efficiency and Renewable Energy, Department of Energy. ACTION: Notice of open teleconference. SUMMARY: This notice announces a teleconference call of the State Energy Advisory Board (STEAB...
13 CFR 101.500 - Small Business Energy Efficiency Program.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 13 Business Credit and Assistance 1 2013-01-01 2013-01-01 false Small Business Energy Efficiency... ADMINISTRATION Small Business Energy Efficiency § 101.500 Small Business Energy Efficiency Program. (a) The.../energy, building on the Energy Star for Small Business Program, to assist small business concerns in...
13 CFR 101.500 - Small Business Energy Efficiency Program.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Small Business Energy Efficiency... ADMINISTRATION Small Business Energy Efficiency § 101.500 Small Business Energy Efficiency Program. (a) The.../energy, building on the Energy Star for Small Business Program, to assist small business concerns in...
13 CFR 101.500 - Small Business Energy Efficiency Program.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false Small Business Energy Efficiency... ADMINISTRATION Small Business Energy Efficiency § 101.500 Small Business Energy Efficiency Program. (a) The.../energy, building on the Energy Star for Small Business Program, to assist small business concerns in...
13 CFR 101.500 - Small Business Energy Efficiency Program.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 13 Business Credit and Assistance 1 2014-01-01 2014-01-01 false Small Business Energy Efficiency... ADMINISTRATION Small Business Energy Efficiency § 101.500 Small Business Energy Efficiency Program. (a) The.../energy, building on the Energy Star for Small Business Program, to assist small business concerns in...
13 CFR 101.500 - Small Business Energy Efficiency Program.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 13 Business Credit and Assistance 1 2012-01-01 2012-01-01 false Small Business Energy Efficiency... ADMINISTRATION Small Business Energy Efficiency § 101.500 Small Business Energy Efficiency Program. (a) The.../energy, building on the Energy Star for Small Business Program, to assist small business concerns in...
Model Energy Efficiency Program Impact Evaluation Guide
Find guidance on model approaches for calculating energy, demand, and emissions savings resulting from energy efficiency programs. It describes several standard approaches that can be used in order to make these programs more efficient.
Reporting Newborn Audiologic Results to State EHDI Programs.
Chung, Winnie; Beauchaine, Kathryn L; Grimes, Alison; O'Hollearn, Tammy; Mason, Craig; Ringwalt, Sharon
All US states and territories have an Early Hearing Detection and Intervention (EHDI) program to facilitate early hearing evaluation and intervention for infants who are deaf or hard of hearing. To ensure efficient coordination of care, the state EHDI programs rely heavily on audiologists' prompt reporting of a newborn's hearing status. Several states have regulations requiring mandatory reporting of a newborn's hearing status. This is an important public health responsibility of pediatric audiologists. Reasons for failing to report vary. The Early Hearing Detection and Intervention-Pediatric Audiology Links to Services (EHDI-PALS) facility survey was used to inform reporting compliance of audiology facilities throughout the United States. The survey was disseminated via articles, newsletters, and call-to-action notices to audiologists. Among 1024 facilities surveyed, 88 (8.6%) reported that they did not report newborns' hearing findings to their state EHDI program. Not knowing how to report to the state EHDI program was the most frequently chosen reason (60%). However, among the 936 facilities that were compliant with the reporting requirements, 51 (5.4%) estimated that they reported less than two-thirds of all hearing evaluation results. Some facilities did not report a normal-hearing result, and some failed to report because they assumed another facility would report the hearing results. Survey results indicated that audiologists were compliant in reporting hearing results to the state EHDI programs. However, there is room for improvement. Regular provider outreach and training by the state EHDI program is necessary to ensure those who are not reporting will comply and to clarify reporting requirements for those who are already compliant.
NASA Astrophysics Data System (ADS)
1997-10-01
NSF Course and Curriculum Development Program Call for Award Nominations; Gordon Conference: Innovations in College Chemistry Teaching; Summer Opportunity for Students; High School Chemistry Day; ACS Satellite TV Seminars; Wanted: Newsletter Editor; ACS Abstract Deadline; Call for Award Nominations
Practical Efficiency of Photovoltaic Panel Used for Solar Vehicles
NASA Astrophysics Data System (ADS)
Koyuncu, T.
2017-08-01
In this experimental investigation, the practical efficiency of the semi-flexible monocrystalline silicon solar panels used on a solar-powered car called “Firat Force” and a solar-powered minibus called “Commagene” was determined. Firat Force has 6 solar PV modules, a maintenance-free long-life gel battery pack, and a regenerative brushless DC electric motor; Commagene has 12 solar PV modules, a maintenance-free long-life gel battery pack, and a regenerative brushless DC electric motor. In addition, both solar vehicles have an MPPT (maximum power point tracker), an ECU (electronic control unit), a differential, an instrument panel, a steering system, a brake system, brake and gas pedals, mechanical equipment, a chassis, and a frame. These two solar vehicles were used for passenger transportation in Adiyaman city, Turkey, during a one-year test (June 2010-May 2011). As a result, the practical efficiency of the semi-flexible monocrystalline silicon solar panels used on Firat Force and Commagene was determined as 13%, despite the 18% efficiency (at 1000 W/m2 and 25 °C) stated by the producer company. In addition, the total efficiency of the system (from PV panels to vehicle wheel) was determined as 9%.
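The practical efficiency figure above is simply the ratio of electrical output to the solar power incident on the panel area. A minimal sketch with illustrative numbers (not measurements reported in the study):

```python
def pv_efficiency(power_out_w, irradiance_w_m2, area_m2):
    """Panel efficiency: electrical output divided by incident solar
    power (irradiance times panel area)."""
    return power_out_w / (irradiance_w_m2 * area_m2)

# Illustrative: a 1.5 m^2 panel delivering 195 W under the standard
# test irradiance of 1000 W/m^2 gives the 13% practical figure.
print(round(pv_efficiency(195.0, 1000.0, 1.5), 3))  # -> 0.13
```

Under real operating conditions (cell temperature above 25 °C, off-normal sun angles, dust), the measured ratio falls below the rated value, which is consistent with the 13% practical versus 18% rated efficiency reported above.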
Perceptions of self-esteem in a welfare-to-wellness-to-work program.
Martin, Carolyn Thompson; Keswick, Judith L; Crayton, Diane; Leveck, Paula
2012-01-01
The study investigates welfare recipients' perceptions of personal self-esteem in relationship with their participation in a welfare-to-wellness-to-work program. The cross-sectional, mixed-methods design examined a convenience sample of 33 participants who attended a welfare-to-wellness-to-work program called Work Wellness: The Basics that is based in an agency called Wellness Works!. A demographic survey, Rosenberg's Self-Esteem scale, and qualitative interviews were used. Even with normal self-esteem scores, the participants credited the program with decreasing negative thoughts and improving self-esteem. The themes identified include program, self-esteem, mental health, and domestic violence. Information about the benefits of a holistic wellness program and its relationship with self-reported enhanced self-esteem can be used to assist with health promotion, policy, and the development of innovative programs that assist with transition from public assistance. © 2011 Wiley Periodicals, Inc.
Estimating means and variances: The comparative efficiency of composite and grab samples.
Brumelle, S; Nemetz, P; Casey, D
1984-03-01
This paper compares the efficiencies of two sampling techniques for estimating a population mean and variance. One procedure, called grab sampling, consists of collecting and analyzing one sample per period. The second procedure, called composite sampling, collects n samples per period, which are then pooled and analyzed as a single sample. We review the well-known fact that composite sampling provides a superior estimate of the mean. However, it is somewhat surprising that composite sampling does not always generate a more efficient estimate of the variance. For populations with platykurtic distributions, grab sampling gives a more efficient estimate of the variance, whereas composite sampling is better for leptokurtic distributions. These conditions on kurtosis can be related to peakedness and skewness. For example, a necessary condition for composite sampling to provide a more efficient estimate of the variance is that the population density function evaluated at the mean (i.e., f(μ)) be greater than [Formula: see text]. If [Formula: see text], then a grab sample is more efficient. In spite of this result, however, composite sampling does provide a smaller estimate of standard error than does grab sampling in the context of estimating population means.
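The mean-estimation half of this comparison is easy to check numerically. The following pure-Python simulation is our illustration, not the authors' analysis: the lognormal population and all parameter values are assumed for the sake of the example.

```python
import random
import statistics

random.seed(42)

N_PERIODS = 2000   # analyses performed under each scheme
N_SUB = 8          # sub-samples pooled into each composite

def population_draw():
    # hypothetical population: lognormal, e.g. a pollutant concentration
    return random.lognormvariate(0.0, 1.0)

# Grab sampling: one sample collected and analyzed per period.
grab = [population_draw() for _ in range(N_PERIODS)]

# Composite sampling: N_SUB samples pooled (physically averaged)
# and analyzed as a single sample per period.
composite = [statistics.mean(population_draw() for _ in range(N_SUB))
             for _ in range(N_PERIODS)]

# Both schemes are unbiased for the population mean...
grab_var = statistics.variance(grab)
composite_var = statistics.variance(composite)
# ...but each composite analysis has roughly 1/N_SUB the variance of a grab.
```

With n = 8 sub-samples per composite, each composite analysis carries roughly one-eighth the variance of a grab analysis, which is why composite sampling gives the superior estimate of the mean per analysis performed.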
Directory of Post-Secondary Retailing and Marketing Vocational Programs.
ERIC Educational Resources Information Center
American Vocational Association, Inc., Washington, DC.
This directory lists 357 general and 135 specialized retailing and marketing vocational programs at the post-secondary level. Institutions vary somewhat in the identification of general programs; for example, they may be called retailing, merchandising, marketing, mid-management, or distributive education programs. Specialized programs offered by…
The Louisiana State University waste-to-energy incinerator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1994-10-26
This proposed action is for cost-shared construction of an incinerator/steam-generation facility at Louisiana State University under the State Energy Conservation Program (SECP). The SECP, created by the Energy Policy and Conservation Act, calls upon DOE to encourage energy conservation, renewable energy, and energy efficiency by providing Federal technical and financial assistance in developing and implementing comprehensive state energy conservation plans and projects. Currently, LSU runs a campus-wide recycling program in order to reduce the quantity of solid waste requiring disposal. This program has removed recyclable paper from the waste stream; however, a considerable quantity of other non-recyclable combustible waste is produced on campus. Until recently, these wastes were disposed of in the Devil's Swamp landfill (also known as the East Baton Rouge Parish landfill). When this facility reached its capacity, a new landfill was opened a short distance away, and this new site is now used for disposal of the University's non-recyclable wastes. While this new landfill has enough capacity to last for at least 20 years (from 1994), the University has identified the need for a more efficient and effective manner of waste disposal than landfilling. The University also has non-renderable biological and potentially infectious waste materials from the School of Veterinary Medicine and the Student Health Center, primarily the former, whose wastes include animal carcasses and bedding materials. Renderable animal wastes from the School of Veterinary Medicine are sent to a rendering plant. Non-renderable, non-infectious animal wastes currently are disposed of in an existing on-campus incinerator near the School of Veterinary Medicine building.
Communicating and Interacting: An Exploration of the Changing Roles of Media in CALL/CMC
ERIC Educational Resources Information Center
Hoven, Debra
2006-01-01
The sites of learning and teaching using CALL are shifting from CD-based, LAN-based, or stand-alone programs to the Internet. As this change occurs, pedagogical approaches to using CALL are also shifting to forms which better exploit the communication, collaboration, and negotiation aspects of the Internet. Numerous teachers and designers have…
Plans, providers experimenting with outbound call programs for Medicare risk seniors.
1997-10-01
Putting a new spin on health care call centers: They've been used for commercial and Medicaid populations, but now plans and providers are testing the call center concept among their Medicare seniors. And while it may hold great promise for controlling utilization, there are big start-up costs and serious liability concerns.
Steven MacCall: Winner of LJ's 2010 Teaching Award
ERIC Educational Resources Information Center
Berry, John N., III
2010-01-01
This article profiles Steven L. MacCall, winner of "Library Journal's" 2010 Teaching Award. An associate professor at the School of Library and Information Studies (SLIS) at the University of Alabama, Tuscaloosa, MacCall was nominated by Kathie Popadin, known as "Kpop" to the members of her cohort in the online MLIS program at SLIS. Sixteen of…
A Shellcode Detection Method Based on Full Native API Sequence and Support Vector Machine
NASA Astrophysics Data System (ADS)
Cheng, Yixuan; Fan, Wenqing; Huang, Wei; An, Jing
2017-09-01
Dynamically monitoring the behavior of a program is widely used to discriminate between benign programs and malware, usually on the basis of dynamic characteristics such as the API call sequence or API call frequency. The key innovation of this paper is to consider the full Native API sequence and use a support vector machine to detect shellcode. We also use a Markov chain to extract and digitize Native API sequence features. Our experimental results show that the method proposed in this paper has high accuracy and a low false-positive rate.
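The Markov-chain digitization step can be sketched as follows. This is a hypothetical illustration, not the authors' code: the API names are invented, the traces are toy data, and a nearest-centroid rule stands in for the paper's support vector machine; only the idea of turning a call sequence into transition frequencies follows the description above.

```python
from collections import Counter
from itertools import product

# Hypothetical Native API vocabulary (names are illustrative only).
APIS = ["NtOpenFile", "NtWriteFile", "NtAllocateVirtualMemory", "NtCreateThreadEx"]

def transition_features(seq):
    """Digitize a Native API call sequence as first-order Markov
    transition frequencies, flattened into a fixed-length vector."""
    counts = Counter(zip(seq, seq[1:]))
    total = max(sum(counts.values()), 1)
    return [counts[(a, b)] / total for a, b in product(APIS, APIS)]

# Toy training traces (benign programs vs. shellcode-like behavior).
benign = [["NtOpenFile", "NtWriteFile", "NtOpenFile", "NtWriteFile"],
          ["NtOpenFile", "NtOpenFile", "NtWriteFile", "NtOpenFile"]]
shell  = [["NtAllocateVirtualMemory", "NtWriteFile", "NtCreateThreadEx"],
          ["NtAllocateVirtualMemory", "NtCreateThreadEx", "NtCreateThreadEx"]]

def centroid(vectors):
    return [sum(col) / len(vectors) for col in zip(*vectors)]

benign_c = centroid([transition_features(s) for s in benign])
shell_c  = centroid([transition_features(s) for s in shell])

def classify(seq):
    # Nearest-centroid decision, standing in for the paper's SVM.
    v = transition_features(seq)
    d_b = sum((x - y) ** 2 for x, y in zip(v, benign_c))
    d_s = sum((x - y) ** 2 for x, y in zip(v, shell_c))
    return "shellcode" if d_s < d_b else "benign"
```

A real deployment would feed the same transition-frequency vectors to an SVM trained on large benign and malicious corpora.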
Multipole Algorithms for Molecular Dynamics Simulation on High Performance Computers.
NASA Astrophysics Data System (ADS)
Elliott, William Dewey
1995-01-01
A fundamental problem in modeling large molecular systems with molecular dynamics (MD) simulations is the underlying N-body problem of computing the interactions between all pairs of N atoms. The simplest algorithm to compute pair-wise atomic interactions scales in runtime O(N^2), making it impractical for interesting biomolecular systems, which can contain millions of atoms. Recently, several algorithms have become available that solve the N-body problem by computing the effects of all pair-wise interactions while scaling in runtime less than O(N^2). One algorithm, which scales O(N) for a uniform distribution of particles, is called the Greengard-Rokhlin Fast Multipole Algorithm (FMA). This work describes an FMA-like algorithm called the Molecular Dynamics Multipole Algorithm (MDMA). The algorithm contains several features that are new to N-body algorithms. MDMA uses new, efficient series expansion equations to compute general 1/r^n potentials to arbitrary accuracy. In particular, the 1/r Coulomb potential and the 1/r^6 portion of the Lennard-Jones potential are implemented. The new equations are based on multivariate Taylor series expansions. In addition, MDMA uses a cell-to-cell interaction region of cells that is closely tied to worst-case error bounds. The worst-case error bounds for MDMA are also derived in this work. These bounds apply to other multipole algorithms as well. Several implementation enhancements are described which apply to MDMA as well as other N-body algorithms such as FMA and tree codes. The mathematics of the cell-to-cell interactions are converted to the Fourier domain for reduced operation count and faster computation. A relative indexing scheme was devised to locate cells in the interaction region, which allows efficient pre-computation of redundant information and prestorage of much of the cell-to-cell interaction.
Also, MDMA was integrated into the MD program SIgMA to demonstrate the performance of the program over several simulation timesteps. One MD application described here highlights the utility of including long range contributions to Lennard-Jones potential in constant pressure simulations. Another application shows the time dependence of long range forces in a multiple time step MD simulation.
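The idea behind all multipole methods, including MDMA, is that a distant cluster of particles can be summarized by a few expansion terms instead of being summed particle by particle. A minimal sketch of this (our toy example at lowest order only, not the MDMA series expansions):

```python
import math
import random

random.seed(7)

# A tight cluster of N unit charges near the origin.
N = 200
cluster = [(random.uniform(-0.1, 0.1),
            random.uniform(-0.1, 0.1),
            random.uniform(-0.1, 0.1)) for _ in range(N)]

far_point = (10.0, 0.0, 0.0)  # observation point far from the cluster

def dist(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# Exact 1/r potential: one term per charge, O(N) per evaluation point
# (hence O(N^2) for all-pairs interactions).
exact = sum(1.0 / dist(far_point, c) for c in cluster)

# Lowest-order multipole (monopole) approximation: replace the whole
# cluster by its total charge at the center of charge -- a single term.
center = tuple(sum(coord) / N for coord in zip(*cluster))
approx = N / dist(far_point, center)

rel_error = abs(exact - approx) / exact
```

The dipole term vanishes because the expansion is taken about the center of charge, so the leading error is quadrupole-order, roughly (cluster radius / distance)^2. Higher-order expansions, as in FMA and MDMA, drive this error down to arbitrary accuracy.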
77 FR 39690 - State Energy Advisory Board (STEAB)
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-05
... DEPARTMENT OF ENERGY State Energy Advisory Board (STEAB) AGENCY: Energy Efficiency and Renewable... teleconference call of the State Energy Advisory Board (STEAB). The Federal Advisory Committee Act (Pub. L. 92... Energy, Office of Energy Efficiency and Renewable Energy, 1000 Independence Ave. SW., Washington, DC...
2009-01-01
Background: Marginal posterior genotype probabilities need to be computed for genetic analyses such as genetic counseling in humans and selective breeding in animal and plant species. Methods: In this paper, we describe a peeling-based, deterministic, exact algorithm to efficiently compute genotype probabilities for every member of a pedigree with loops, without recourse to junction-tree methods from graph theory. The efficiency in computing the likelihood by peeling comes from storing intermediate results in multidimensional tables called cutsets. Computing marginal genotype probabilities for individual i requires recomputing the likelihood for each of the possible genotypes of individual i. This can be done efficiently by storing intermediate results in two types of cutsets, called anterior and posterior cutsets, and reusing these intermediate results to compute the likelihood. Examples: A small example is used to illustrate the theoretical concepts discussed in this paper, and marginal genotype probabilities are computed at a monogenic disease locus for every member of a real cattle pedigree. PMID:19958551
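The anterior/posterior-cutset idea can be illustrated on the simplest possible pedigree structure, a chain, where the cutsets reduce to familiar forward and backward tables. All numbers below are invented for the example, and a real peeler handles parent-pair transmission and loops rather than a single-parent chain:

```python
# Minimal sketch of the anterior/posterior-cutset idea on a chain
# X1 -> X2 -> ... -> Xn of binary genotype states (toy model).
STATES = (0, 1)
prior = {0: 0.5, 1: 0.5}                      # founder genotype prior
trans = {(0, 0): 0.9, (0, 1): 0.1,            # transmission probabilities
         (1, 0): 0.2, (1, 1): 0.8}
# P(data_i | X_i) for three individuals (invented penetrance values):
evidence = [{0: 0.9, 1: 0.1}, {0: 0.5, 1: 0.5}, {0: 0.1, 1: 0.9}]
n = len(evidence)

# Anterior cutsets: peel from the front, storing partial likelihoods.
anterior = [{s: prior[s] * evidence[0][s] for s in STATES}]
for i in range(1, n):
    anterior.append({s: evidence[i][s] *
                     sum(anterior[i - 1][t] * trans[(t, s)] for t in STATES)
                     for s in STATES})

# Posterior cutsets: peel from the back.
posterior = [{s: 1.0 for s in STATES}]
for i in range(n - 2, -1, -1):
    posterior.insert(0, {s: sum(trans[(s, t)] * evidence[i + 1][t] *
                                posterior[0][t] for t in STATES)
                         for s in STATES})

# Marginal genotype probabilities reuse both stored cutsets, so no
# likelihood is recomputed from scratch for each individual.
marginals = []
for i in range(n):
    unnorm = {s: anterior[i][s] * posterior[i][s] for s in STATES}
    z = sum(unnorm.values())
    marginals.append({s: unnorm[s] / z for s in STATES})
```

Because the anterior and posterior tables are stored, the marginal for each individual is a single product and normalization rather than a full likelihood recomputation per candidate genotype.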
Towards a flexible middleware for context-aware pervasive and wearable systems.
Muro, Marco; Amoretti, Michele; Zanichelli, Francesco; Conte, Gianni
2012-11-01
Ambient intelligence and wearable computing call for innovative hardware and software technologies, including a highly capable, flexible, and efficient middleware that allows existing pervasive applications to be reused when developing new ones. In the considered application domain, middleware should also support self-management, interoperability among different platforms, efficient communication, and context awareness. In the ongoing "everything is networked" scenario, scalability appears to be a very important issue, for which the peer-to-peer (P2P) paradigm emerges as an appealing solution for connecting software components in an overlay network, allowing for efficient and balanced data distribution mechanisms. In this paper, we illustrate how all these concepts can be combined into a theoretical tool, called the networked autonomic machine (NAM), implemented in a NAM-based middleware, and evaluated against practical problems of pervasive computing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoffman, Ian M.; Goldman, Charles A.; Murphy, Sean
The average cost to utilities to save a kilowatt-hour (kWh) in the United States is 2.5 cents, according to the most comprehensive assessment to date of the cost performance of energy efficiency programs funded by electricity customers. These costs are similar to those documented earlier. Cost-effective efficiency programs help ensure electricity system reliability at the most affordable cost as part of utility planning and implementation activities for resource adequacy. Building on prior studies, Berkeley Lab analyzed the cost performance of 8,790 electricity efficiency programs between 2009 and 2015 for 116 investor-owned utilities and other program administrators in 41 states. The Berkeley Lab database includes programs representing about three-quarters of total spending on electricity efficiency programs in the United States.
Automatic computer subprogram selection from application program libraries
NASA Technical Reports Server (NTRS)
Drozdowski, J. M.
1972-01-01
The program ALTLIB (ALTernate LIBrary) which allows a user access to an alternate subprogram library with a minimum effort is discussed. The ALTLIB program selects subprograms from an alternate library file and merges them with the user's program load file. Only subprograms that are called for (directly or indirectly) by the user's programs and that are available on the alternate library file will be selected. ALTLIB eliminates the need for elaborate control-card manipulations to add subprograms from a subprogram file. ALTLIB returns to the user his binary file and the selected subprograms in correct order for a call to the loader. The user supplies the alternate library file. Subprogram requests which are not satisfied from the alternate library file will be satisfied at load time from the system library.
LAMDA programmer's manual. [Final report, Part 1]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hughes, T.P.; Clark, R.M.; Mostrom, M.A.
This report discusses the following topics on the LAMDA program: General maintenance; CTSS FCL script; DOS batch files; Macintosh MPW scripts; UNICOS FCL script; VAX/MS command file; LINC calling tree; and LAMDA calling tree.
NASA Astrophysics Data System (ADS)
Iskin, Ibrahim
Energy efficiency stands out with its potential to address a number of challenges that today's electric utilities face, including increasing and changing electricity demand, shrinking operating capacity, and decreasing system reliability and flexibility. Being the least cost and least risky alternative, the share of energy efficiency programs in utilities' energy portfolios has been on the rise since the 1980s, and their increasing importance is expected to continue in the future. Despite holding great promise, the ability to determine and invest in only the most promising program alternatives plays a key role in the successful use of energy efficiency as a utility-wide resource. This issue becomes even more significant considering the availability of a vast number of potential energy efficiency programs, the rapidly changing business environment, and the existence of multiple stakeholders. This dissertation introduces hierarchical decision modeling as the framework for energy efficiency program planning in electric utilities. The model focuses on the assessment of emerging energy efficiency programs and proposes to bridge the gap between technology screening and cost/benefit evaluation practices. This approach is expected to identify emerging technology alternatives which have the highest potential to pass cost/benefit ratio testing procedures and contribute to the effectiveness of decision practices in energy efficiency program planning. The model also incorporates rank order analysis and sensitivity analysis for testing the robustness of results from different stakeholder perspectives and future uncertainties in an attempt to enable more informed decision-making practices. The model was applied to the case of 13 high priority emerging energy efficiency program alternatives identified in the Pacific Northwest, U.S.A. 
The results of this study reveal that energy savings potential is the most important program management consideration in selecting emerging energy efficiency programs. Market dissemination potential and program development and implementation potential are the second and third most important, whereas ancillary benefits potential is the least important program management consideration. The results imply that program value considerations, comprised of energy savings potential and ancillary benefits potential; and program feasibility considerations, comprised of program development and implementation potential and market dissemination potential, have almost equal impacts on assessment of emerging energy efficiency programs. Considering the overwhelming number of value-focused studies and the few feasibility-focused studies in the literature, this finding clearly shows that feasibility-focused studies are greatly understudied. The hierarchical decision model developed in this dissertation is generalizable. Thus, other utilities or power systems can adopt the research steps employed in this study as guidelines and conduct similar assessment studies on emerging energy efficiency programs of their interest.
A Global Review of Incentive Programs to Accelerate Energy-Efficient Appliances and Equipment
DOE Office of Scientific and Technical Information (OSTI.GOV)
de la Rue du Can, Stephane; Phadke, Amol; Leventis, Greg
Incentive programs are an essential policy tool to move the market toward energy-efficient products. They offer a favorable complement to mandatory standards and labeling policies by accelerating the market penetration of energy-efficient products above equipment standard requirements and by preparing the market for increased future mandatory requirements. They sway purchase decisions and in some cases production decisions and retail stocking decisions toward energy-efficient products. Incentive programs are structured according to their regulatory environment, the way they are financed, by how the incentive is targeted, and by who administers them. This report categorizes the main elements of incentive programs, using case studies from the Major Economies Forum to illustrate their characteristics. To inform future policy and program design, it seeks to recognize design advantages and disadvantages through a qualitative overview of the variety of programs in use around the globe. Examples range from rebate programs administered by utilities under an Energy-Efficiency Resource Standards (EERS) regulatory framework (California, USA) to the distribution of Eco-Points that reward customers for buying efficient appliances under a government recovery program (Japan). We found that evaluations have demonstrated that financial incentives programs have greater impact when they target highly efficient technologies that have a small market share. We also found that the benefits and drawbacks of different program design aspects depend on the market barriers addressed, the target equipment, and the local market context and that no program design surpasses the others. The key to successful program design and implementation is a thorough understanding of the market and effective identification of the most important local factors hindering the penetration of energy-efficient technologies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
John, C.J.; Maciasz, G.; Harder, B.J.
1998-06-01
The US Department of Energy established a geopressured-geothermal energy program in the mid-1970s as one response to America's need to develop alternative energy resources in view of the increasing dependence on imported fossil fuel energy. This program continued for 17 years, and approximately two hundred million dollars were expended for various types of research and well testing to thoroughly investigate this alternative energy source. This volume describes the following studies: Design well program; LaFourche Crossing; MG-T/DOE Amoco Fee No. 1 (Sweet Lake); Environmental monitoring at Sweet Lake; Air quality; Water quality; Microseismic monitoring; Subsidence; Dow/DOE L.R. Sweezy No. 1 well; Reservoir testing; Environmental monitoring at Parcperdue; Air monitoring; Water runoff; Groundwater; Microseismic events; Subsidence; Environmental consideration at site; Gladys McCall No. 1 well; Test results of Gladys McCall; Hydrocarbons in production gas and brine; Environmental monitoring at the Gladys McCall site; Pleasant Bayou No. 2 well; Pleasant Bayou hybrid power system; Environmental monitoring at Pleasant Bayou; and Plug abandonment and well site restoration of three geopressured-geothermal test sites. 197 figs., 64 tabs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burger, J.R.
Loss control, both as a phrase and a concept, isn't used very widely in the U.S. coal industry, although a U.S. manufacturer has cut accidents 71% and increased productivity 40% using the system. Safety is a part of the loss control concept, but it goes beyond traditional accident and illness prevention to become management control of anything that can result in loss or property damage. This includes what ILCI calls incidents, that is, "any undesired or unwanted event that could (or does) degrade the efficiency of the business operation." These incidents could be accidents, quality or production problems, or even security breaches (such as thefts). So while safety is always a basic element, loss control also includes absenteeism control, security, fire prevention, and industrial hygiene, since they're all interrelated disciplines for reducing loss. A baseline evaluation is followed by recommendations and guidance in self-sustaining corrective measures. This program would cost about $3,500 the first year. Possibly this approach is not used in the U.S. because miners feel that, with all the legislation and regulation of the industry, no further program is needed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campbell, D. B.
2015-01-30
The Adversary & Interdiction Methods (AIM) program provides training and capability assessment services to government agencies around the country. Interdisciplinary teams equipped with gear and radioactive sources are repeatedly fielded to offsite events to collaborate with law enforcement agencies at all levels of government. AIM has grown rapidly over the past three years. A knowledge management system has evolved along with the program, but it has failed to keep pace. A new system is needed. The new system must comply with cybersecurity and information technology solutions already in place at an institutional level. The offsite nature of AIM activities must also be accommodated. Cost and schedule preclude the commissioning of new software and the procurement of expensive hardware. The new system must exploit in-house capabilities and be established quickly. A novel system is proposed. This solution centers on a recently introduced institutional file sharing capability called Syncplicity. AIM-authored software will be combined with a dedicated institutional account to vastly extend the capability of this resource. The new knowledge management system will reduce error and increase efficiency through automation and be accessible offsite via mobile devices.
Certainty grids for mobile robots
NASA Technical Reports Server (NTRS)
Moravec, H. P.
1987-01-01
A numerical representation of uncertain and incomplete sensor knowledge called Certainty Grids has been used successfully in several mobile robot control programs, and has proven itself to be a powerful and efficient unifying solution for sensor fusion, motion planning, landmark identification, and many other central problems. Researchers propose to build a software framework running on processors onboard the new Uranus mobile robot that will maintain a probabilistic, geometric map of the robot's surroundings as it moves. The certainty grid representation will allow this map to be incrementally updated in a uniform way from various sources including sonar, stereo vision, proximity and contact sensors. The approach can correctly model the fuzziness of each reading, while at the same time combining multiple measurements to produce sharper map features, and it can deal correctly with uncertainties in the robot's motion. The map will be used by planning programs to choose clear paths, identify locations (by correlating maps), identify well-known and insufficiently sensed terrain, and perhaps identify objects by shape. The certainty grid representation can be extended in the same dimension and used to detect and track moving objects.
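A certainty-grid update of this kind is commonly implemented with per-cell log-odds, so that multiple fuzzy readings combine by simple addition. The sketch below is our own minimal model; the sensor probabilities and grid size are assumptions for illustration, not details of the Uranus system:

```python
import math

# Each cell stores the log-odds of being occupied; prior P(occ) = 0.5.
GRID_W, GRID_H = 8, 8
log_odds = [[0.0] * GRID_W for _ in range(GRID_H)]

P_HIT = 0.7    # assumed sensor model: P(reads "occupied" | cell occupied)
P_FALSE = 0.2  # P(reads "occupied" | cell empty)

def update(cell, says_occupied):
    """Incorporate one fuzzy sensor reading into a cell, Bayes-style."""
    x, y = cell
    if says_occupied:
        log_odds[y][x] += math.log(P_HIT / P_FALSE)
    else:
        log_odds[y][x] += math.log((1 - P_HIT) / (1 - P_FALSE))

def probability(cell):
    x, y = cell
    return 1.0 / (1.0 + math.exp(-log_odds[y][x]))

# Multiple measurements sharpen the map: two sensors report cell (3, 4)
# occupied, one noisy reading says empty; another cell is reported empty.
for reading in (True, True, False):
    update((3, 4), reading)
update((5, 5), False)
```

Repeated "occupied" readings sharpen a cell toward occupancy while a single contradictory reading only partially erodes it, which is exactly the multiple-measurement sharpening described above; never-sensed cells stay at the 0.5 prior.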
The MAP Autonomous Mission Control System
NASA Technical Reports Server (NTRS)
Breed, Julie; Coyle, Steven; Blahut, Kevin; Dent, Carolyn; Shendock, Robert; Rowe, Roger
2000-01-01
The Microwave Anisotropy Probe (MAP) mission is the second mission in NASA's Office of Space Science low-cost, Medium-class Explorers (MIDEX) program. The Explorers Program is designed to accomplish frequent, low cost, high quality space science investigations utilizing innovative, streamlined, efficient management, design and operations approaches. The MAP spacecraft will produce an accurate full-sky map of the cosmic microwave background temperature fluctuations with high sensitivity and angular resolution. The MAP spacecraft is planned for launch in early 2001, and will be staffed by only single-shift operations. During the rest of the time the spacecraft must be operated autonomously, with personnel available only on an on-call basis. Four (4) innovations will work cooperatively to enable a significant reduction in operations costs for the MAP spacecraft. First, the use of a common ground system for Spacecraft Integration and Test (I&T) as well as Operations. Second, the use of Finite State Modeling for intelligent autonomy. Third, the integration of a graphical planning engine to drive the autonomous systems without an intermediate manual step. And fourth, the ability for distributed operations via Web and pager access.
Formulating face verification with semidefinite programming.
Yan, Shuicheng; Liu, Jianzhuang; Tang, Xiaoou; Huang, Thomas S
2007-11-01
This paper presents a unified solution to three unsolved problems existing in face verification with subspace learning techniques: selection of verification threshold, automatic determination of subspace dimension, and deducing feature fusing weights. In contrast to previous algorithms which search for the projection matrix directly, our new algorithm investigates a similarity metric matrix (SMM). With a certain verification threshold, this matrix is learned by a semidefinite programming approach, along with the constraints of the kindred pairs with similarity larger than the threshold, and inhomogeneous pairs with similarity smaller than the threshold. Then, the subspace dimension and the feature fusing weights are simultaneously inferred from the singular value decomposition of the derived SMM. In addition, the weighted and tensor extensions are proposed to further improve the algorithmic effectiveness and efficiency, respectively. Essentially, the verification is conducted within an affine subspace in this new algorithm and is, hence, called the affine subspace for verification (ASV). Extensive experiments show that the ASV can achieve encouraging face verification accuracy in comparison to other subspace algorithms, even without the need to explore any parameters.
Non-Formal Educator Use of Evaluation Results
ERIC Educational Resources Information Center
Baughman, Sarah; Boyd, Heather H.; Franz, Nancy K.
2012-01-01
Increasing demands for accountability in educational programming have resulted in increasing calls for program evaluation in educational organizations. Many organizations include conducting program evaluations as part of the job responsibilities of program staff. Cooperative Extension is a complex organization offering non-formal educational…
Achieving environmental excellence through a multidisciplinary grassroots movement.
Herechuk, Bryan; Gosse, Carolyn; Woods, John N
2010-01-01
St. Joseph's Healthcare Hamilton (SJHH) supports a grassroots green team, called Environmental Vision and Action (EVA). Since the creation of EVA, a healthy balance between corporate projects led by corporate leaders and grassroots initiatives led by informal leaders has resulted in many successful environmental initiatives. Over a relatively short period of time, environmental successes at SJHH have included waste diversion programs, energy efficiency and reduction initiatives, alternative commuting programs, green purchasing practices, clinical and pharmacy greening and increased staff engagement and awareness. Knowledge of social movements theory helped EVA leaders to understand the internal processes of a grassroots movement and helped to guide it. Social movements theory may also have broader applicability in health care by understanding the passionate engagement that people bring to a common cause and how to evolve sources of opposition into engines for positive change. After early successes, as the limitations of a grassroots movement began to surface, the EVA team revived the concept of evolving the grassroots green program into a corporate program for environmental stewardship. It is hard to quantify the importance of allowing our staff, physicians, volunteers and patients to engage in changes that they feel passionately about. However, at SJHH, the transformation of a group of people unsatisfied with the organization's environmental performance into an 'engine for change' has led to a rapid improvement in environmental stewardship at SJHH that is now regarded as a success.
ERIC Educational Resources Information Center
Griffin, William H.; Carter, James D.
The strategy used in evaluating an out-of-doors resident camping program for emotionally disturbed children is outlined. This strategy calls for examining the following elements in the program: (1) program goals and objectives; (2) collection and processing program data; (3) camper progress assessment; (4) program audit; (5) assessment of past…
Foshee, Vangie A; Reyes, Luz McNaughton; Agnew-Brune, Christine B; Simon, Thomas R; Vagi, Kevin J; Lee, Rosalyn D; Suchindran, Chiravath
2014-12-01
In response to recent calls for programs that can prevent multiple types of youth violence, the current study examined whether Safe Dates, an evidence-based dating violence prevention program, was effective in preventing other forms of youth violence. Using data from the original Safe Dates randomized controlled trial, this study examined (1) the effectiveness of Safe Dates in preventing peer violence victimization and perpetration and school weapon carrying 1 year after the intervention phase was completed and (2) moderation of program effects by the sex or race/ethnicity of the adolescent. Ninety percent (n = 1,690) of the eighth and ninth graders who completed baseline questionnaires completed the 1-year follow-up assessment. The sample was 51% female and 26% minority (of whom 69% were black and 31% were of another minority race/ethnicity). There were no baseline treatment group differences in violence outcomes. Treatment condition was significantly associated with peer violence victimization and school weapon carrying at follow-up; there was 12% less victimization and 31% less weapon carrying among those exposed to Safe Dates than among controls. Treatment condition was significantly associated with perpetration among the minority but not among the white adolescents; there was 23% less violence perpetration among minority adolescents exposed to Safe Dates than among controls. The observed effect sizes were comparable with those of other universal school-based youth violence prevention programs. Implementing Safe Dates may be an efficient way of preventing multiple types of youth violence.
LDRD final report on massively-parallel linear programming : the parPCx system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parekh, Ojas; Phillips, Cynthia Ann; Boman, Erik Gunnar
2005-02-01
This report summarizes the research and development performed from October 2002 to September 2004 at Sandia National Laboratories under the Laboratory-Directed Research and Development (LDRD) project ''Massively-Parallel Linear Programming''. We developed a linear programming (LP) solver designed to use a large number of processors. LP is the optimization of a linear objective function subject to linear constraints. Companies and universities have expended huge efforts over decades to produce fast, stable serial LP solvers. Previous parallel codes run on shared-memory systems and have little or no distribution of the constraint matrix. We have seen no reports of general LP solver runs on large numbers of processors. Our parallel LP code is based on an efficient serial implementation of Mehrotra's interior-point predictor-corrector algorithm (PCx). The computational core of this algorithm is the assembly and solution of a sparse linear system. We have substantially rewritten the PCx code and based it on Trilinos, the parallel linear algebra library developed at Sandia. Our interior-point method can use either direct or iterative solvers for the linear system. To achieve a good parallel data distribution of the constraint matrix, we use a (pre-release) version of a hypergraph partitioner from the Zoltan partitioning library. We describe the design and implementation of our new LP solver called parPCx and give preliminary computational results. We summarize a number of issues related to efficient parallel solution of LPs with interior-point methods, including data distribution, numerical stability, and solving the core linear system using both direct and iterative methods. We describe a number of applications of LP specific to US Department of Energy mission areas, and we summarize our efforts to integrate parPCx (and parallel LP solvers in general) into Sandia's massively-parallel integer programming solver PICO (Parallel Integer and Combinatorial Optimizer). We conclude with directions for long-term future algorithmic research and for near-term development that could improve the performance of parPCx.
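The computational core the abstract describes — assembling and solving a linear system of the form (A D A^T) dy = r at each interior-point iteration — can be sketched in miniature. This is an illustrative toy, not the parPCx implementation: real codes use sparse, distributed matrix storage, whereas this version uses dense Python lists and a 2x2 Cramer's-rule solve as a stand-in for a sparse solver.

```python
# Sketch of one interior-point kernel step: build the "normal equations"
# matrix M = A * diag(d) * A^T and solve M dy = r. The scaling vector d
# (roughly x_i / z_i at the current iterate) changes every iteration,
# so M must be re-assembled and re-factored each time.

def assemble_normal_equations(A, d):
    """Return M = A * diag(d) * A^T for a dense matrix A (list of rows)."""
    m, n = len(A), len(A[0])
    M = [[0.0] * m for _ in range(m)]
    for i in range(m):
        for j in range(m):
            M[i][j] = sum(A[i][k] * d[k] * A[j][k] for k in range(n))
    return M

def solve_2x2(M, r):
    """Cramer's rule for a 2x2 system M dy = r (stand-in for a sparse solver)."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(r[0] * M[1][1] - r[1] * M[0][1]) / det,
            (M[0][0] * r[1] - M[1][0] * r[0]) / det]

A = [[1.0, 1.0, 0.0],          # toy 2x3 constraint matrix
     [0.0, 1.0, 1.0]]
d = [2.0, 1.0, 3.0]            # diagonal scaling from the current iterate
M = assemble_normal_equations(A, d)
dy = solve_2x2(M, [1.0, 1.0])
```

The hypergraph partitioning mentioned in the abstract addresses exactly this step: distributing the columns of A so that the parallel assembly of M incurs little communication.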
Boal, Ashley L; Abroms, Lorien C; Simmens, Samuel; Graham, Amanda L; Carpenter, Kelly M
2016-05-01
This study seeks to determine whether comprehensive quitline services combined with text messaging improve smoking cessation rates beyond those achieved by offering comprehensive quitline services alone. The study sample consisted of callers to the Alere Wellbeing, Inc, commercial quitline in 2012. A quasi-experimental design was implemented using propensity score matching to create the intervention and control groups. The intervention group consisted of those who were offered and accepted a text message intervention in addition to usual quitline services, while the control group consisted of those who were not offered the text message intervention. Analyses utilized baseline data collected at intake, program use data (eg, call history and text message use), and reports of smoking behaviors and program satisfaction collected 6 months after intake. Similar rates of 7-day abstinence were reported regardless of whether participants received combined multi-call quitline services plus text messaging (25.3%) or multi-call quitline services in isolation (25.5%), though those who received combined services reported higher treatment satisfaction (P < .05). Among those who received combined services, the number of text messages sent to the text message program predicted 7-day abstinence such that those who sent more text messages were less likely to report 7-day abstinence. Text messaging may not confer additional benefits over and above those received through multi-modal, multi-call quitline programs. Future research should investigate whether text messaging programs improve quit rates when combined with less intensive services such as single-call phone counseling. While the impact of quitline and text messaging services for smoking cessation has been examined in isolation, no study has explored the impact of combined services on smoking outcomes.
This study examines the role of text messaging in combination with comprehensive quitline services including multi-call phone counseling, access to an interactive website and nicotine replacement therapy. © The Author 2015. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
ERIC Educational Resources Information Center
Shea, Kevin P.
1975-01-01
A new means of irrigation, called the drip or trickle system, has been proven more efficient and less wasteful than the current system of flood irrigation. As a result of this drip system, fertilizer-use efficiency is improved, and crop yield is sometimes increased in some crops and never decreased. (MA)
Does Competition Improve Public School Efficiency? A Spatial Analysis
ERIC Educational Resources Information Center
Misra, Kaustav; Grimes, Paul W.; Rogers, Kevin E.
2012-01-01
Advocates for educational reform frequently call for policies to increase competition between schools because it is argued that market forces naturally lead to greater efficiencies, including improved student learning, when schools face competition. Researchers examining this issue are confronted with difficulties in defining reasonable measures…
Model Checker for Java Programs
NASA Technical Reports Server (NTRS)
Visser, Willem
2007-01-01
Java Pathfinder (JPF) is a verification and testing environment for Java that integrates model checking, program analysis, and testing. JPF consists of a custom-made Java Virtual Machine (JVM) that interprets bytecode, combined with a search interface to allow the complete behavior of a Java program to be analyzed, including interleavings of concurrent programs. JPF is implemented in Java, and its architecture is highly modular to support rapid prototyping of new features. JPF is an explicit-state model checker, because it enumerates all visited states and, therefore, suffers from the state-explosion problem inherent in analyzing large programs. It is best suited to analyzing programs of less than 10 kLOC, but has been successfully applied to finding errors in concurrent programs up to 100 kLOC. When an error is found, a trace from the initial state to the error is produced to guide the debugging. JPF works at the bytecode level, meaning that all of Java can be model-checked. By default, the software checks for all runtime errors (uncaught exceptions), assertion violations (supporting Java's assert), and deadlocks. JPF uses garbage collection and symmetry reductions of the heap during model checking to reduce state explosion, as well as dynamic partial order reductions to lower the number of interleavings analyzed. JPF is capable of symbolic execution of Java programs, including symbolic execution of complex data such as linked lists and trees. JPF is extensible as it allows for the creation of listeners that can subscribe to events during searches. The creation of dedicated code to be executed in place of regular classes is supported and allows users to easily handle native calls and to improve the efficiency of the analysis.
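The explicit-state strategy the abstract describes — enumerate every reachable state, remember visited states, and return a trace when a property is violated — can be sketched compactly. This is an illustrative toy in Python, not JPF itself (which interprets Java bytecode); the system and property below are invented.

```python
# Minimal explicit-state model-checking sketch: breadth-first exploration
# with a visited set (state matching) and counterexample-trace production.

from collections import deque

def model_check(initial, successors, violates):
    """Explore all reachable states; return a trace to the first bad state."""
    visited = {initial}
    queue = deque([(initial, [initial])])
    while queue:
        state, trace = queue.popleft()
        if violates(state):
            return trace                   # counterexample to guide debugging
        for nxt in successors(state):
            if nxt not in visited:         # state matching curbs explosion
                visited.add(nxt)
                queue.append((nxt, trace + [nxt]))
    return None                            # property holds on all states

# Toy system: a counter that nondeterministically advances by 1 or 2;
# the (hypothetical) property is that it never hits exactly 5.
trace = model_check(0,
                    successors=lambda s: [s + 1, s + 2] if s < 6 else [],
                    violates=lambda s: s == 5)
```

JPF's real state space is vastly larger (heap contents, thread interleavings), which is why the abstract stresses heap symmetry and partial order reductions on top of this basic scheme.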
Nishiyama, Yuichiro; Iwanami, Akio; Kohyama, Jun; Itakura, Go; Kawabata, Soya; Sugai, Keiko; Nishimura, Soraya; Kashiwagi, Rei; Yasutake, Kaori; Isoda, Miho; Matsumoto, Morio; Nakamura, Masaya; Okano, Hideyuki
2016-06-01
Stem cells represent a potential cellular resource in the development of regenerative medicine approaches to the treatment of pathologies in which specific cells are degenerated or damaged by genetic abnormality, disease, or injury. Securing sufficient supplies of cells suited to the demands of cell transplantation, however, remains challenging, and the establishment of safe and efficient cell banking procedures is an important goal. Cryopreservation allows the storage of stem cells for prolonged time periods while maintaining them in adequate condition for use in clinical settings. Conventional cryopreservation systems include slow-freezing and vitrification; both have advantages and disadvantages in terms of cell viability and/or scalability. In the present study, we developed an advanced slow-freezing technique using a programmed freezer with a magnetic field called Cells Alive System (CAS) and examined its effectiveness on human induced pluripotent stem cell-derived neural stem/progenitor cells (hiPSC-NS/PCs). This system significantly increased cell viability after thawing and had less impact on cellular proliferation and differentiation. We further found that frozen-thawed hiPSC-NS/PCs were comparable with non-frozen ones at the transcriptome level. Given these findings, we suggest that the CAS is useful for hiPSC-NS/PCs banking for clinical uses involving neural disorders and may open new avenues for future regenerative medicine. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
76 FR 16297 - Drawbridge Operation Regulation; Cerritos Channel, Long Beach, CA
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-23
... Federal holidays. FOR FURTHER INFORMATION CONTACT: If you have questions on this rule, call or e-mail... [email protected] . If you have questions on viewing the docket, call Renee V. Wright, Program...
NASA Technical Reports Server (NTRS)
Hall, Philip; Whitfield, Susan
2011-01-01
As NASA undertakes increasingly complex projects, the need for expert systems engineers and leaders in systems engineering is becoming more pronounced. As a result of this issue, the Agency has undertaken an initiative to develop more systems engineering leaders through its Systems Engineering Leadership Development Program; however, the NASA Office of the Chief Engineer has also called on the field Centers to develop mechanisms to strengthen their expertise in systems engineering locally. In response to this call, Marshall Space Flight Center (MSFC) has developed a comprehensive development program for aspiring systems engineers and systems engineering leaders. This presentation will summarize the two-level program, which consists of a combination of training courses and on-the-job, developmental training assignments at the Center to help develop stronger expertise in systems engineering and technical leadership. In addition, it will focus on the success the program has had in its pilot year. The program hosted a formal kickoff event for Level I on October 13, 2009. The first class includes 42 participants from across MSFC and Michoud Assembly Facility (MAF). A formal call for Level II is forthcoming. With the new Agency focus on research and development of new technologies, having a strong pool of well-trained systems engineers is becoming increasingly more critical. Programs such as the Marshall Systems Engineering Leadership Development Program, as well as those developed at other Centers, help ensure that there is an upcoming generation of trained systems engineers and systems engineering leaders to meet future design challenges.
Instructional Design: Its Relevance for CALL.
ERIC Educational Resources Information Center
England, Elaine
1989-01-01
Describes an interdisciplinary (language and educational technology departments) instructional design program that is intended to develop back-up computer programs for students taking supplementary English as a second language classes. The program encompasses training programs, the psychology of screen reading, task analysis, and color cueing.…
Software Review: Welcome to the World of Delta Drawing.
ERIC Educational Resources Information Center
King, Charles
1983-01-01
Provided is a review of an educational software program called "Delta Drawing." Included are comments on how the graphics program works, programing features, comparison with LOGO, educational value, and availability. Indicates that as a powerful learning program, it is innovative and imaginative. (JN)
Sterling, Lynn; McCaffrey, Carmen; Secter, Michael; Rich, Rebecca; Green, Jessica; Shirreff, Lindsay; Steele, Donna
2016-11-01
The 2013 pan-Canadian consensus Report on Resident Duty Hours identified that traditional 24-hour duty periods pose risks to the well-being of residents and should be avoided. In anticipation of duty-hour restrictions, the Obstetrics and Gynaecology Residency Program at the University of Toronto developed and implemented a night float (NF) call model over a three-year span. Quarterly resident surveys have consistently shown that the NF system is preferred to traditional 24-hour call and has resulted in reduced fatigue and improved continuity of patient care. Through many iterations, the NF model achieved levels of resident morale, surgical experience, and impact on family relationships that are comparable to the 24-hour call system. We review here our process for developing an NF call model and the perceptions and experiences of residents, with the goal of providing insight for other residency programs that are considering or instituting NF call systems. Copyright © 2016 The Society of Obstetricians and Gynaecologists of Canada/La Société des obstétriciens et gynécologues du Canada. Published by Elsevier Inc. All rights reserved.
Answering the Call of the Web: UVA Crafts an Innovative Web Certification Program for Its Staff.
ERIC Educational Resources Information Center
Lee, Sandra T.
2000-01-01
Describes the development of a Web Certification Program at the University of Virginia. This program offers certificates at three levels: Web Basics, Web Designer, and Web Master. The paper focuses on: determination of criteria for awarding certificates; program status; program evaluation and program effectiveness; and future plans for the Web…
ERIC Educational Resources Information Center
Grammatikopoulos, Vasilis
2012-01-01
The current study attempts to integrate parts of program theory and systems-based procedures in educational program evaluation. The educational program that was implemented, called the "Early Steps" project, proposed that physical education can contribute to various educational goals apart from the usual motor skills improvement. Basic…
VENVAL : a plywood mill cost accounting program
Henry Spelter
1991-01-01
This report documents a package of computer programs called VENVAL. These programs prepare plywood mill data for a linear programming (LP) model that, in turn, calculates the optimum mix of products to make, given a set of technologies and market prices. (The software to solve a linear program is not provided and must be obtained separately.) Linear programming finds...
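The product-mix optimization VENVAL prepares data for can be illustrated with a toy linear program (the products, coefficients, and prices below are invented, and this is not the VENVAL code). Because an LP optimum lies at a vertex of the feasible region, a tiny two-product instance can be solved by enumerating vertices directly, which is what this sketch does in place of a real LP solver.

```python
# Toy plywood product-mix LP: maximize revenue from two panel products that
# compete for press hours and veneer supply. Vertices of the feasible region
# are found by intersecting constraint boundaries pairwise.

from itertools import combinations

prices = (40.0, 30.0)                       # hypothetical $/panel, products 1 and 2
constraints = [                             # (coef_x, coef_y, limit): a*x + b*y <= limit
    (1.0, 1.0, 100.0),                      # press hours available
    (3.0, 1.0, 240.0),                      # veneer sheets available
]

def vertices(cons):
    """Intersect boundary lines pairwise (including the axes x=0 and y=0)."""
    lines = list(cons) + [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
    pts = []
    for (a1, b1, c1), (a2, b2, c2) in combinations(lines, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < 1e-12:
            continue                        # parallel boundaries: no vertex
        x = (c1 * b2 - c2 * b1) / det       # Cramer's rule
        y = (a1 * c2 - a2 * c1) / det
        if x >= -1e-9 and y >= -1e-9 and all(a * x + b * y <= c + 1e-9
                                             for a, b, c in cons):
            pts.append((x, y))              # keep only feasible vertices
    return pts

best = max(vertices(constraints),
           key=lambda p: prices[0] * p[0] + prices[1] * p[1])
```

A real mill model has far too many products and constraints for vertex enumeration, which is why VENVAL hands the formulation off to a dedicated LP solver.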
CRAVE: a database, middleware and visualization system for phenotype ontologies.
Gkoutos, Georgios V; Green, Eain C J; Greenaway, Simon; Blake, Andrew; Mallon, Ann-Marie; Hancock, John M
2005-04-01
A major challenge in modern biology is to link genome sequence information to organismal function. In many organisms this is being done by characterizing phenotypes resulting from mutations. Efficiently expressing phenotypic information requires combinatorial use of ontologies. However, tools are not currently available to visualize combinations of ontologies. Here we describe CRAVE (Concept Relation Assay Value Explorer), a package allowing storage, active updating and visualization of multiple ontologies. CRAVE is a web-accessible JAVA application that accesses an underlying MySQL database of ontologies via a JAVA persistent middleware layer (Chameleon). This maps the database tables into discrete JAVA classes and creates memory-resident, interlinked objects corresponding to the ontology data. These JAVA objects are accessed via calls through the middleware's application programming interface. CRAVE allows simultaneous display and linking of multiple ontologies and searching using Boolean and advanced searches.
Newly emerging resource efficiency manager programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wolf, S.; Howell, C.
1997-12-31
Many facilities in the northwest such as K-12 schools, community colleges, and military installations are implementing resource-efficiency awareness programs. These programs are generally referred to as resource efficiency manager (REM) or resource conservation manager (RCM) programs. Resource efficiency management is a systems approach to managing a facility's energy, water, and solid waste. Its aim is to reduce utility budgets by focusing on behavioral changes, maintenance and operation procedures, resource accounting, education and training, and a comprehensive awareness campaign that involves everyone in the organization.
An Efficiency Comparison of MBA Programs: Top 10 versus Non-Top 10
ERIC Educational Resources Information Center
Hsu, Maxwell K.; James, Marcia L.; Chao, Gary H.
2009-01-01
The authors compared the cohort group of the top-10 MBA programs in the United States with their lower-ranking counterparts on their value-added efficiency. The findings reveal that the top-10 MBA programs in the United States are associated with statistically higher average "technical and scale efficiency" and "scale efficiency", but not with a…
2010-06-15
Frequency Partitioning: Application to a Cicada Mating Call
Nuttall, Albert H. (Adaptive Methods Inc.); Hughes, Derke R. (NUWC Division Newport, NAVSEA Warfare Centers)
…cicada mating call with a distinctly non-white and non-Gaussian excitation gives good results for the estimated first- and second-order kernels and…
Mabu, Shingo; Hirasawa, Kotaro; Hu, Jinglu
2007-01-01
This paper proposes a graph-based evolutionary algorithm called Genetic Network Programming (GNP). Our goal is to develop GNP, which can deal with dynamic environments efficiently and effectively, based on the distinguished expression ability of the graph (network) structure. The characteristics of GNP are as follows. 1) GNP programs are composed of a number of nodes which execute simple judgment/processing, and these nodes are connected by directed links to each other. 2) The graph structure enables GNP to re-use nodes, thus the structure can be very compact. 3) The node transition of GNP is executed according to its node connections without any terminal nodes, thus the past history of the node transition affects the current node to be used and this characteristic works as an implicit memory function. These structural characteristics are useful for dealing with dynamic environments. Furthermore, we propose an extended algorithm, "GNP with Reinforcement Learning (GNPRL)" which combines evolution and reinforcement learning in order to create effective graph structures and obtain better results in dynamic environments. In this paper, we applied GNP to the problem of determining agents' behavior to evaluate its effectiveness. Tileworld was used as the simulation environment. The results show some advantages for GNP over conventional methods.
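The node-transition mechanics the abstract lists — simple judgment/processing nodes joined by directed links, no terminal node, nodes freely re-used — can be sketched as follows. This is an illustrative simplification, not the authors' GNP system; the toy agent and node set are invented.

```python
# Sketch of GNP-style execution: follow directed links between judgment
# nodes (which branch on what they sense) and processing nodes (which act),
# for a fixed number of steps, revisiting nodes as the links dictate.

def run_gnp(nodes, start, env, steps):
    """Execute the node graph; return the sequence of visited node names."""
    log, current = [], start
    for _ in range(steps):
        kind, func, links = nodes[current]
        log.append(current)
        if kind == "judgment":
            current = links[func(env)]      # branch index chosen by the judgment
        else:
            func(env)                       # processing node acts on the environment
            current = links[0]              # single outgoing link
    return log

# Toy agent: judge whether energy is depleted, then recharge or work.
env = {"energy": 1}
nodes = {
    "judge":    ("judgment",  lambda e: 0 if e["energy"] <= 0 else 1,
                 ["recharge", "work"]),
    "recharge": ("processing", lambda e: e.update(energy=e["energy"] + 2),
                 ["judge"]),
    "work":     ("processing", lambda e: e.update(energy=e["energy"] - 1),
                 ["judge"]),
}
visits = run_gnp(nodes, "judge", env, steps=6)
```

Note how the visit log cycles through the same nodes: this re-use is what lets GNP structures stay compact, and the path taken depends on earlier transitions, which is the implicit-memory effect the abstract describes.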
Impact of the Intensive Program of Emotional Intelligence (IPEI) on work supervisors.
Rodríguez García, Gustavo A; López-Pérez, Belén; Férreo Cruzado, Manuel A; Fernández Carrascoso, María E; Fernández, Juan
2017-11-01
This study aimed to evaluate the effect of the Intensive Program of Emotional Intelligence (IPEI; Fernández, 2016; Férreo, 2016) on middle managers’ emotional intelligence, as this variable may have a significant impact on personal satisfaction, task performance, and the work environment. The intervention was applied to work team supervisors in a large call center, as it is an overlooked sector in this topic. Two-hundred and eighty-two supervisors from a Madrid-based, Spanish multinational (51.4% men and 48.6% women) participated in this study. Participants were assigned to the experimental group (n = 190) or the control group (n = 92) by availability, according to management decision. All supervisors filled in two questionnaires to evaluate the different components of intrapersonal emotional intelligence (i.e., attention, clarity, and repair; TMMS-24; Fernández-Berrocal, Extremera, & Ramos, 2004) and cognitive and affective empathy (i.e., perspective taking, emotion understanding, empathic joy, and personal distress; TECA; López-Pérez, Fernández, & Abad, 2008). The findings showed an increase in the studied variables for the experimental group. The results obtained support middle managers’ training in emotional competences through short, efficient, economic programs. Potential limitations and implications of the results are discussed.
Random Testing and Model Checking: Building a Common Framework for Nondeterministic Exploration
NASA Technical Reports Server (NTRS)
Groce, Alex; Joshi, Rajeev
2008-01-01
Two popular forms of dynamic analysis, random testing and explicit-state software model checking, are perhaps best viewed as search strategies for exploring the state spaces introduced by nondeterminism in program inputs. We present an approach that enables this nondeterminism to be expressed in the SPIN model checker's PROMELA language, and then lets users generate either model checkers or random testers from a single harness for a tested C program. Our approach makes it easy to compare model checking and random testing for models with precisely the same input ranges and probabilities and allows us to mix random testing with model checking's exhaustive exploration of non-determinism. The PROMELA language, as intended in its design, serves as a convenient notation for expressing nondeterminism and mixing random choices with nondeterministic choices. We present and discuss a comparison of random testing and model checking. The results derive from using our framework to test a C program with an effectively infinite state space, a module in JPL's next Mars rover mission. More generally, we show how the ability of the SPIN model checker to call C code can be used to extend SPIN's features, and hope to inspire others to use the same methods to implement dynamic analyses that can make use of efficient state storage, matching, and backtracking.
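The framework's central idea — one harness whose nondeterministic choices are resolved either randomly or exhaustively — can be sketched without SPIN or PROMELA. This Python toy is an illustration of the concept only; the tested "program" and its bug are invented.

```python
# One harness, two exploration strategies: pass in a choice function that is
# either a random picker (random testing) or an exhaustive enumerator
# (model checking over all combinations of choices).

import itertools
import random

def harness(choose):
    """A tested 'program' with two nondeterministic inputs; True means a bug."""
    a = choose(range(4))
    b = choose(range(4))
    return a * b == 6              # the injected "error" we are searching for

def random_test(trials, seed=0):
    """Resolve choices randomly; report whether any trial hit the bug."""
    rng = random.Random(seed)
    return any(harness(rng.choice) for _ in range(trials))

def model_check():
    """Resolve choices exhaustively; return the first counterexample found."""
    for combo in itertools.product(range(4), repeat=2):
        it = iter(combo)
        if harness(lambda opts: next(it)):
            return combo
    return None
```

Because both strategies drive the identical harness, input ranges and probabilities match exactly, which is the comparison property the abstract emphasizes; mixing the two (random choices for some variables, exhaustive for others) falls out naturally.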
Does Competition Improve Public School Efficiency? A Spatial Analysis
ERIC Educational Resources Information Center
Misra, Kaustav
2010-01-01
Proponents of educational reform often call for policies to increase competition between schools. It is argued that market forces naturally lead to greater efficiencies, including improved student learning, when schools face competition. In many parts of the country, public schools experience significant competition from private schools; however,…
Catching errors with patient-specific pretreatment machine log file analysis.
Rangaraj, Dharanipathy; Zhu, Mingyao; Yang, Deshan; Palaniswaamy, Geethpriya; Yaddanapudi, Sridhar; Wooten, Omar H; Brame, Scott; Mutic, Sasa
2013-01-01
A robust, efficient, and reliable quality assurance (QA) process is highly desired for modern external beam radiation therapy treatments. Here, we report the results of a semiautomatic, pretreatment, patient-specific QA process based on dynamic machine log file analysis clinically implemented for intensity modulated radiation therapy (IMRT) treatments delivered by high energy linear accelerators (Varian 2100/2300 EX, Trilogy, iX-D, Varian Medical Systems Inc, Palo Alto, CA). The multileaf collimator (MLC) machine log files are called Dynalog by Varian. Using an in-house developed computer program called "Dynalog QA," we automatically compare the beam delivery parameters in the log files that are generated during pretreatment point dose verification measurements, with the treatment plan to determine any discrepancies in IMRT deliveries. Fluence maps are constructed and compared between the delivered and planned beams. Since clinical introduction in June 2009, 912 machine log file QA analyses were performed by the end of 2010. Among these, 14 errors causing dosimetric deviation were detected and required further investigation and intervention. These errors were the result of human operating mistakes, flawed treatment planning, and data modification during plan file transfer. Minor errors were also reported in 174 other log file analyses, some of which stemmed from false positives and unreliable results; the origins of these are discussed herein. It has been demonstrated that the machine log file analysis is a robust, efficient, and reliable QA process capable of detecting errors originating from human mistakes, flawed planning, and data transfer problems. The possibility of detecting these errors is low using point and planar dosimetric measurements. Copyright © 2013 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
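The comparison step at the heart of such a process — delivered machine parameters from the log file checked against planned values, with out-of-tolerance entries flagged — can be sketched as below. The "Dynalog QA" program is in-house software, so everything here (field layout, tolerance, numbers) is a hypothetical illustration, not its actual code or file format.

```python
# Hypothetical log-vs-plan check: compare delivered MLC leaf positions
# recorded in a machine log against the planned positions and report any
# leaf whose deviation exceeds a tolerance.

def check_delivery(planned, delivered, tolerance_mm=1.0):
    """Return (leaf_index, planned, delivered) for each out-of-tolerance leaf."""
    errors = []
    for i, (p, d) in enumerate(zip(planned, delivered)):
        if abs(p - d) > tolerance_mm:
            errors.append((i, p, d))
    return errors

planned_mm   = [10.0, 12.5, 15.0, 20.0]   # planned leaf positions (mm), invented
delivered_mm = [10.2, 12.4, 17.5, 20.1]   # positions as read back from the log
discrepancies = check_delivery(planned_mm, delivered_mm)
```

A clinical implementation would repeat this over every control point and leaf pair and then reconstruct fluence maps from the same data, as the abstract describes; the per-parameter tolerance check above is the primitive those steps build on.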
Who watches the watchers?: preventing fault in a fault tolerance library
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stanavige, C. D.
The Scalable Checkpoint/Restart library (SCR) was developed and is used by researchers at Lawrence Livermore National Laboratory to provide a fast and efficient method of saving and recovering large applications during runtime on high-performance computing (HPC) systems. Though SCR protects other programs, up until June 2017, nothing was actively protecting SCR. The goal of this project was to automate the building and testing of this library on the varying HPC architectures on which it is used. Our methods centered around the use of a continuous integration tool called Bamboo that allowed for automation agents to be installed on the HPC systems themselves. These agents provided a way for us to establish a new and unique way to automate and customize the allocation of resources and running of tests with CMake's unit testing framework, CTest, as well as integration testing scripts through an HPC package manager called Spack. These methods provided a parallel environment in which to test the more complex features of SCR. As a result, SCR is now automatically built and tested on several HPC architectures any time changes are made by developers to the library's source code. The results of these tests are then communicated back to the developers for immediate feedback, allowing them to fix functionality of SCR that may have broken. Hours of developers' time are now being saved from the tedious process of manually testing and debugging, which saves money and allows the SCR project team to focus their efforts towards development. Thus, HPC system users can use SCR in conjunction with their own applications to efficiently and effectively checkpoint and restart as needed with the assurance that SCR itself is functioning properly.
Superconductors Enable Lower Cost MRI Systems
NASA Technical Reports Server (NTRS)
2013-01-01
The future looks bright, light, and green, especially where aircraft are concerned. The division of NASA's Fundamental Aeronautics Program called the Subsonic Fixed Wing Project is aiming to reach new heights by 2025-2035, improving the efficiency and environmental impact of air travel by developing new capabilities for cleaner, quieter, and more fuel efficient aircraft. One of the many ways NASA plans to reach its aviation goals is by combining new aircraft configurations with an advanced turboelectric distributed propulsion (TeDP) system. Jeff Trudell, an engineer at Glenn Research Center, says, "The TeDP system consists of gas turbines generating electricity to power a large number of distributed motor-driven fans embedded into the airframe." The combined effect increases the effective bypass ratio and reduces drag to meet future goals. "While room temperature components may help reduce emissions and noise in a TeDP system, cryogenic superconducting electric motors and generators are essential to reduce fuel burn," says Trudell. Superconductors provide significantly higher current densities and smaller and lighter designs than room temperature equivalents. Superconductors are also able to conduct direct current without resistance (loss of energy) below a critical temperature and applied field. Unfortunately, alternating current (AC) losses represent the major part of the heat load and depend on the frequency of the current and applied field. A refrigeration system is necessary to remove the losses, and its weight increases with decreasing temperature. In 2001, a material called magnesium diboride (MgB2) was discovered to be superconducting. The challenge, however, has been learning to manufacture MgB2 inexpensively and in long lengths to wind into large coils while meeting the application requirements.
SeqHBase: a big data toolset for family based sequencing data analysis.
He, Min; Person, Thomas N; Hebbring, Scott J; Heinzen, Ethan; Ye, Zhan; Schrodi, Steven J; McPherson, Elizabeth W; Lin, Simon M; Peissig, Peggy L; Brilliant, Murray H; O'Rawe, Jason; Robison, Reid J; Lyon, Gholson J; Wang, Kai
2015-04-01
Whole-genome sequencing (WGS) and whole-exome sequencing (WES) technologies are increasingly used to identify disease-contributing mutations in human genomic studies. It can be a significant challenge to process such data, especially when a large family or cohort is sequenced. Our objective was to develop a big data toolset to efficiently manipulate genome-wide variants, functional annotations and coverage, together with conducting family based sequencing data analysis. Hadoop is a framework for reliable, scalable, distributed processing of large data sets using MapReduce programming models. Based on Hadoop and HBase, we developed SeqHBase, a big data-based toolset for analysing family based sequencing data to detect de novo, inherited homozygous, or compound heterozygous mutations that may contribute to disease manifestations. SeqHBase takes as input BAM files (for coverage at every site), variant call format (VCF) files (for variant calls) and functional annotations (for variant prioritisation). We applied SeqHBase to a 5-member nuclear family and a 10-member 3-generation family with WGS data, as well as a 4-member nuclear family with WES data. Analysis times were almost linearly scalable with the number of data nodes. With 20 data nodes, SeqHBase took about 5 seconds to analyse WES familial data and approximately 1 min to analyse WGS familial data. These results demonstrate SeqHBase's high efficiency and scalability, which is necessary as WGS and WES are rapidly becoming standard methods to study the genetics of familial disorders. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
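The de novo detection logic SeqHBase runs at scale can be shown in miniature: a candidate de novo variant is a site where the child carries an allele absent from both parents. This toy works on in-memory genotype dictionaries rather than Hadoop/HBase tables, and the sites and alleles are invented; real pipelines also filter on coverage and quality before calling a candidate.

```python
# Sketch of trio-based de novo candidate detection. Genotypes are dicts
# mapping a site to the set of alleles observed there, e.g.
# {"chr1:100": {"A", "G"}}.

def de_novo_candidates(child, father, mother):
    """Return (site, novel_alleles) pairs where the child has an allele
    seen in neither parent at that site."""
    hits = []
    for site, alleles in child.items():
        inherited = father.get(site, set()) | mother.get(site, set())
        novel = alleles - inherited
        if novel:
            hits.append((site, sorted(novel)))
    return sorted(hits)

father = {"chr1:100": {"A"}, "chr2:50": {"C", "T"}}
mother = {"chr1:100": {"A"}, "chr2:50": {"C"}}
child  = {"chr1:100": {"A", "G"}, "chr2:50": {"C", "T"}}
candidates = de_novo_candidates(child, father, mother)
```

The same per-site set logic extends to the other modes the abstract mentions: inherited homozygous calls check that both parents carry the allele, and compound heterozygous calls pair two variants in one gene inherited from different parents.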
Pilot study of a multidisciplinary gout patient education and monitoring program.
Fields, Theodore R; Rifaat, Adam; Yee, Arthur M F; Ashany, Dalit; Kim, Katherine; Tobin, Matthew; Oliva, Nicole; Fields, Kara; Richey, Monica; Kasturi, Shanthini; Batterman, Adena
2017-04-01
Gout patient self-management knowledge and adherence to treatment regimens are poor. Our objective was to assess the feasibility and acceptability of a multidisciplinary team-based pilot program for the education and monitoring of gout patients. Subjects completed a gout self-management knowledge exam, along with gout flare history and compliance questionnaires, at enrollment and at 6 and 12 months. Each exam was followed by a nursing educational intervention via a structured gout curriculum. Structured monthly follow-up calls from pharmacists emphasized adherence to management programs. Primary outcomes were subject and provider program evaluation questionnaires at 6 and 12 months, program retention rate and success in reaching patients via monthly calls. Overall, 40/45 subjects remained in the study at 12 months. At 12 months, on a scale of 1 (most) to 5 (least), ratings of 3 or better were given by 84.6% of subjects evaluating the usefulness of the overall program in understanding and managing their gout, 81.0% of subjects evaluating the helpfulness of the nursing education program, and 50.0% of subjects evaluating the helpfulness of the calls from the pharmacists. Knowledge exam questions that were most frequently answered incorrectly on repeat testing concerned bridge therapy, the possibility of being flare-free, and the genetic component of gout. Our multidisciplinary program of gout patient education and monitoring demonstrates feasibility and acceptability. We identified variability in patient preference for components of the program and persistent patient knowledge gaps. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Benedict, Richard; Rochon, Angela
1987-01-01
The authors describe vocational experiential learning programs, called "Enterprise Programs," at St. Clair County Skill Center in Michigan. These programs feature small groups of vocational students engaged in profit-making businesses that allow them to apply what they have learned and earn some money. The authors claim the program helps with…
Student Assistance Program Implementation and Evaluation.
ERIC Educational Resources Information Center
Dykeman, Cass
Recently, educators have initiated programs to help students address the social and emotional problems which can impair academic performance. This paper reviews current knowledge on one such program called a Student Assistance Program (SAP). SAPs were initially designed to intervene with chemically-dependent high school students, but more…
Management Training for Directors.
ERIC Educational Resources Information Center
Yaptinchay, Karen
1998-01-01
Describes a management program for Head Start directors called the Head Start-Johnson & Johnson Management Fellows program that focuses on issues and problems encountered by directors in implementing and operating programs at the local level. Notes that the management program represents a response to increasing need for cost-effective and…
F-Nets and Software Cabling: Deriving a Formal Model and Language for Portable Parallel Programming
NASA Technical Reports Server (NTRS)
DiNucci, David C.; Saini, Subhash (Technical Monitor)
1998-01-01
Parallel programming is still being based upon antiquated sequence-based definitions of the terms "algorithm" and "computation", resulting in programs which are architecture dependent and difficult to design and analyze. By focusing on obstacles inherent in existing practice, a more portable model is derived here, which is then formalized into a model called Soviets which utilizes a combination of imperative and functional styles. This formalization suggests more general notions of algorithm and computation, as well as insights into the meaning of structured programming in a parallel setting. To illustrate how these principles can be applied, a very-high-level graphical architecture-independent parallel language, called Software Cabling, is described, with many of the features normally expected from today's computer languages (e.g. data abstraction, data parallelism, and object-based programming constructs).
A new framework for comprehensive, robust, and efficient global sensitivity analysis: 2. Application
NASA Astrophysics Data System (ADS)
Razavi, Saman; Gupta, Hoshin V.
2016-01-01
Based on the theoretical framework for sensitivity analysis called "Variogram Analysis of Response Surfaces" (VARS), developed in the companion paper, we develop and implement a practical "star-based" sampling strategy (called STAR-VARS) for the application of VARS to real-world problems. We also develop a bootstrap approach to provide confidence level estimates for the VARS sensitivity metrics and to evaluate the reliability of inferred factor rankings. The effectiveness, efficiency, and robustness of STAR-VARS are demonstrated via two real-data hydrological case studies (a 5-parameter conceptual rainfall-runoff model and a 45-parameter land surface scheme hydrology model), and a comparison with the "derivative-based" Morris and "variance-based" Sobol approaches is provided. Our results show that STAR-VARS provides reliable and stable assessments of "global" sensitivity across the full range of scales in the factor space, while being 1-2 orders of magnitude more efficient than the Morris or Sobol approaches.
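The directional variogram underlying VARS, half the mean squared response difference at a given perturbation scale, can be sketched on a toy two-factor model. The model function, step size, and span below are illustrative assumptions; the star-sampling and bootstrap machinery of STAR-VARS is omitted:

```python
# Hedged sketch of a directional variogram of a model response, the core
# quantity in VARS. The toy model stands in for a hydrologic model.
import numpy as np

def model(x):
    # Factor 0 drives the response strongly; factor 1 only weakly.
    return np.sin(x[0]) + 0.1 * x[1] ** 2

def directional_variogram(f, center, dim, h, span=3.0):
    """gamma(h) along one factor: 0.5 * E[(f(x + h) - f(x))^2]."""
    xs = np.linspace(0.0, span, 10, endpoint=False)
    sq_diffs = []
    for x in xs:
        a, b = center.copy(), center.copy()
        a[dim], b[dim] = x, x + h
        sq_diffs.append((f(b) - f(a)) ** 2)
    return 0.5 * np.mean(sq_diffs)

c = np.zeros(2)
g0 = directional_variogram(model, c, dim=0, h=0.3)  # sensitive factor
g1 = directional_variogram(model, c, dim=1, h=0.3)  # weakly sensitive factor
print(g0 > g1)  # factor 0 varies the response more at this scale
```

Ranking factors by such variograms across a range of scales h is what gives VARS its multi-scale view of sensitivity.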
The assessment of ride service programs as an alcohol countermeasure
DOT National Transportation Integrated Search
1988-01-01
Ride Service Programs, frequently called safe ride or dial-a-ride programs, attempt to reduce alcohol-related crashes by providing alternative transportation to drinkers who would otherwise drive while intoxicated. This study identified 325 Ride Serv...
Efficient Double Auction Mechanisms in the Energy Grid with Connected and Islanded Microgrids
NASA Astrophysics Data System (ADS)
Faqiry, Mohammad Nazif
The future energy grid is expected to operate in a decentralized fashion as a network of autonomous microgrids that are coordinated by a Distribution System Operator (DSO), which should allocate energy to them in an efficient manner. Each microgrid operating in either islanded or grid-connected mode may be considered to manage its own resources. This can take place through auctions with individual units of the microgrid as the agents. This research proposes efficient auction mechanisms for the energy grid, with islanded and connected microgrids. The microgrid-level auction is carried out by means of an intermediate agent called an aggregator. The individual consumer and producer units are modeled as selfish agents. With the microgrid in islanded mode, two aggregator-level auction classes are analyzed: (i) price-heterogeneous, and (ii) price-homogeneous. Under the price heterogeneity paradigm, this research extends earlier work on the well-known, single-sided Kelly mechanism to double auctions. As in Kelly auctions, the proposed algorithm implements the bidding without using any agent-level private information (i.e., generation capacity and utility functions). The proposed auction is shown to be an efficient mechanism that maximizes the social welfare, i.e., the sum of the utilities of all the agents. Furthermore, the research considers the situation where a subset of agents act as a coalition to redistribute the allocated energy and price using any other specific fairness criterion. The price-homogeneous double auction algorithm proposed in this research addresses the problem of price anticipation, where each agent tries to influence the equilibrium price of energy by placing strategic bids. As a result of this behavior, the auction's efficiency is lowered. This research proposes a novel approach that is implemented by the aggregator, called virtual bidding, where the efficiency can be asymptotically maximized, even in the presence of price-anticipatory bidders.
Next, an auction mechanism for the energy grid with multiple connected microgrids is considered. A globally efficient bi-level auction algorithm is proposed. At the upper level, the algorithm takes into account physical grid constraints in allocating energy to the microgrids. It is implemented by the DSO as a linear-objective quadratic-constraint problem that allows price heterogeneity across the aggregators. In parallel, each aggregator implements its own lower-level price-homogeneous auction with virtual bidding. The research concludes with a preliminary study on extending the DSO-level auction to multi-period day-ahead scheduling. It takes into account storage units and conventional generators that are present in the grid by formulating the auction as a mixed-integer linear programming problem.
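The single-sided Kelly proportional allocation rule that the dissertation's double auctions extend can be sketched as follows. The agent names, bids, and the `kelly_allocate` helper are illustrative assumptions; the double-auction and virtual-bidding mechanics themselves are not shown:

```python
# Kelly-style proportional allocation: each agent submits a monetary bid,
# the clearing price is p = sum(bids) / capacity, and agent i receives
# x_i = b_i / p units. Note no private utility or capacity data is needed.

def kelly_allocate(bids, capacity):
    """Return (clearing price, per-agent allocation) for total `capacity`."""
    price = sum(bids.values()) / capacity
    return price, {agent: b / price for agent, b in bids.items()}

bids = {"consumer_A": 30.0, "consumer_B": 20.0, "consumer_C": 50.0}
price, allocation = kelly_allocate(bids, capacity=10.0)
print(price)                     # -> 10.0
print(allocation["consumer_C"])  # -> 5.0 (half the bid mass, half the energy)
```

Price anticipation arises because an agent can shade its bid to move `price` in its favor, which is the inefficiency the virtual-bidding scheme is designed to remove.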
Park, Haesuk; Adeyemi, Ayoade; Wang, Wei; Roane, Teresa E
To determine the impact of a telephone call reminder program provided by a campus-based medication therapy management call center on medication adherence in Medicare Advantage Part D (MAPD) beneficiaries with hypertension. The reminder call services were offered to eligible MAPD beneficiaries, and they included a live interactive conversation with patients to assess the use of their medications. This study used a quasi-experimental design for comparing the change in medication adherence between the intervention and matched control groups. Adherence, defined by proportion of days covered (PDC), was measured using incurred medication claims 6 months before and after the adherence program was implemented. A difference-in-differences approach with propensity score matching was used. After propensity score matching, paired samples included 563 patients in each of the intervention and control groups. The mean PDC (standard deviation) increased significantly during postintervention period by 17.3% (33.6; P <0.001) and 13.8% (32.3; P <0.001) for the intervention and the control groups, respectively; the greater difference-in-differences increase of 3.5% (36.3) in the intervention group over the control group was statistically significant (P = 0.022). A generalized estimating equation model adjusting for covariates further confirmed that the reminder call group had a significant increase in pre-post PDC (P = 0.021), as compared with the control group. Antihypertensive medication adherence increased in both reminder call and control groups, but the increase was significantly higher in the intervention group. A telephonic outreach program was effective in improving antihypertensive medication adherence in MAPD beneficiaries. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
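The two quantities behind the reported results, proportion of days covered (PDC) and the difference-in-differences (DiD) contrast, can be sketched with illustrative numbers. The inputs below are made-up, not the study's claims data; propensity matching and the GEE model are omitted:

```python
# PDC: fraction of days in a period with medication on hand, capped at 1.0.
# DiD: the intervention group's pre-post change minus the control group's.

def pdc(days_covered, days_in_period):
    """Proportion of days covered, capped at 1.0."""
    return min(days_covered / days_in_period, 1.0)

def did(pre_treat, post_treat, pre_ctrl, post_ctrl):
    """Difference-in-differences: (treated change) - (control change)."""
    return (post_treat - pre_treat) - (post_ctrl - pre_ctrl)

print(round(pdc(150, 180), 3))  # -> 0.833
# Mirroring the reported pattern: both groups improve, intervention more so.
print(round(did(0.60, 0.773, 0.62, 0.758), 3))  # -> 0.035
```

The DiD structure is what lets the study attribute the extra 3.5-point gain to the reminder calls rather than to the background improvement seen in both groups.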
Viewing the Impact of Shared Services through the Four Frames of Bolman and Deal
ERIC Educational Resources Information Center
Schumacher, Kyle A.
2011-01-01
On March 31, 2011, Governor Quinn of Illinois called for schools to consolidate in order to become more financially and administratively efficient. This call for massive school reform is not new. Although consolidation, or reducing the number of school districts to save administrative costs, seemed radical to some, the idea of sharing services to…
Experience with a "hotline" service for outpatients on a ventricular assist device.
Biefer, Hector Rodriguez Cetina; Sündermann, Simon Harald; Emmert, Maximilian Yosri; Hasenclever, Peter; Lachat, Mario Louis; Falk, Volkmar; Wilhelm, Markus Johannes
2014-08-01
With the growing number of outpatients on ventricular assist devices (VADs), there is an increasing need for "home discharge programs." One important feature is a 24-hour telephone service. In our center, the perfusionists run a so-called "hotline" for all of our VAD patients. This study analyzes the hotline calls with regard to frequency, the reason for calling, and the type of action undertaken. Over a period of 5 years, 16 (12 EXCOR and 4 INCOR; Berlin Heart, Berlin, Germany) of 33 VAD patients (48%) were discharged and instructed to use the "hotline" service. All the calls received by the perfusionists were reviewed. We classified the calls into three levels according to the severity of the problem: Level (L) 1 = assistance provided by the perfusionist alone; L2 = calls requiring discussion with the surgeon on duty and/or visit to the outpatient clinic ahead of time; and L3 = immediate action and/or admission to the hospital. Over a period of 2,890 outpatient days (7.9 years), a total of 26 calls were registered. There were 0.9 calls per 100 patient days and 1.6 calls per discharged patient. Out of the 26 calls, 14 calls (54%) were classified as L1, 8 (31%) as L2, and 4 (15%) as L3. The most frequent reasons for L1 or L2 calls were fibrin deposits in the EXCOR pump chamber (39%), followed by battery dysfunction (19%). L3 calls were related to dysfunction of the EXCOR driving units in three cases and to an EXCOR pump chamber disconnection, which the patient did not survive. The institution of a hotline is an essential component of a VAD outpatient program. It provides a certain level of safety for the patient, although a residual risk remains. Georg Thieme Verlag KG Stuttgart · New York.
Projection methods for line radiative transfer in spherical media.
NASA Astrophysics Data System (ADS)
Anusha, L. S.; Nagendra, K. N.
An efficient numerical method called the Preconditioned Bi-Conjugate Gradient (Pre-BiCG) method is presented for the solution of the radiative transfer equation in spherical geometry. A variant of this method, called the Stabilized Preconditioned Bi-Conjugate Gradient (Pre-BiCG-STAB) method, is also presented. These methods are based on projections onto subspaces of the n-dimensional Euclidean space R^n called Krylov subspaces. The methods are shown to be faster in terms of convergence rate than contemporary iterative methods such as Jacobi, Gauss-Seidel, and Successive Over-Relaxation (SOR).
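The stationary methods the paper benchmarks against are simple to sketch. Below is a minimal Jacobi iteration on a toy diagonally dominant system, for illustration only (not the spherical transfer-equation discretization); Krylov methods like Pre-BiCG typically reach the same accuracy in far fewer iterations:

```python
# Jacobi iteration: solve Ax = b by repeatedly updating each unknown from
# the previous iterate, x_i <- (b_i - sum_{j != i} A_ij x_j) / A_ii.

def jacobi(A, b, iters=100):
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        x = [(b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
             for i in range(n)]
    return x

A = [[4.0, 1.0], [1.0, 3.0]]   # diagonally dominant, so Jacobi converges
b = [9.0, 7.0]
x = jacobi(A, b)
print([round(v, 6) for v in x])  # -> [1.818182, 1.727273], i.e. [20/11, 19/11]
```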
Lean NOx Trap Catalysis for Lean Natural Gas Engine Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parks, II, James E; Storey, John Morse; Theiss, Timothy J
Distributed energy is an approach for meeting energy needs that has several advantages. Distributed energy improves energy security during natural disasters or terrorist actions, improves transmission grid reliability by reducing grid load, and enhances power quality through voltage support and reactive power. In addition, distributed energy can be efficient since transmission losses are minimized. One prime mover for distributed energy is the natural gas reciprocating engine generator set. Natural gas reciprocating engines are flexible and scalable solutions for many distributed energy needs. The engines can be run continuously or occasionally as peak demand requires, and their operation and maintenance is straightforward. Furthermore, system efficiencies can be maximized when natural gas reciprocating engines are combined with thermal energy recovery for cooling, heating, and power applications. Expansion of natural gas reciprocating engines for distributed energy is dependent on several factors, but two prominent factors are efficiency and emissions. Efficiencies must be high enough to enable low operating costs, and emissions must be low enough to permit significant operation hours, especially in non-attainment areas where emissions are stringently regulated. To address these issues, the U.S. Department of Energy and the California Energy Commission launched research and development programs called Advanced Reciprocating Engine Systems (ARES) and Advanced Reciprocating Internal Combustion Engines (ARICE), respectively. Fuel efficiency and low emissions are two primary goals of these programs. The work presented here was funded by the ARES program and, thus, addresses the ARES 2010 goals of 50% thermal efficiency (fuel efficiency) and <0.1 g/bhp-hr emissions of oxides of nitrogen (NOx). A summary of the goals for the ARES program is given in Table 1-1. ARICE 2007 goals are 45% thermal efficiency and <0.015 g/bhp-hr NOx.
Several approaches for improving the efficiency and emissions of natural gas reciprocating engines are being pursued. Approaches include: stoichiometric engine operation with exhaust gas recirculation and three-way catalysis, advanced combustion modes such as homogeneous charge compression ignition, and extension of the lean combustion limit with advanced ignition concepts and/or hydrogen mixing. The research presented here addresses the technical approach of combining efficient lean spark-ignited natural gas combustion with low emissions obtained from a lean NOx trap catalyst aftertreatment system. This approach can be applied to current lean engine technology or advanced lean engines that may result from related efforts in lean limit extension. Furthermore, the lean NOx trap technology has synergy with hydrogen-assisted lean limit extension since hydrogen is produced from natural gas during the lean NOx trap catalyst system process. The approach is also applicable to other lean engines such as diesel engines, natural gas turbines, and lean gasoline engines; other research activities have focused on those applications. Some commercialization of the technology has occurred for automotive applications (both diesel and lean gasoline engine vehicles) and natural gas turbines for stationary power. The research here specifically addresses barriers to commercialization of the technology for large lean natural gas reciprocating engines for stationary power. The report presented here is a comprehensive collection of research conducted by Oak Ridge National Laboratory (ORNL) on lean NOx trap catalysis for lean natural gas reciprocating engines. The research was performed in the Department of Energy's ARES program from 2003 to 2007 and covers several aspects of the technology. All studies were conducted at ORNL on a Cummins C8.3G+ natural gas engine chosen based on industry input to simulate large lean natural gas engines. 
Specific technical areas addressed by the research include: NOx reduction efficiency, partial oxidation and reforming chemistry, and the effects of sulfur poisons on the partial oxidation, reformer, and lean NOx trap catalysts. The initial work on NOx reduction efficiency demonstrated that NOx emissions <0.1 g/bhp-hr (the ARES goal) can be achieved with the lean NOx trap catalyst technology. Subsequent work focused on cost and size optimization and durability issues which addressed two specific ARES areas of interest to industry ('Cost of Power' and 'Availability, Reliability, and Maintainability', respectively). Thus, the research addressed the approach of the lean NOx trap catalyst technology toward the ARES goals as shown in Table 1-1.
Dynamic Flow Management Problems in Air Transportation
NASA Technical Reports Server (NTRS)
Patterson, Sarah Stock
1997-01-01
In 1995, over six hundred thousand licensed pilots flew nearly thirty-five million flights into over eighteen thousand U.S. airports, logging more than 519 billion passenger miles. Since demand for air travel has increased by more than 50% in the last decade while capacity has stagnated, congestion is a problem of undeniable practical significance. In this thesis, we will develop optimization techniques that reduce the impact of congestion on the national airspace. We start by determining the optimal release times for flights into the airspace and the optimal speed adjustment while airborne taking into account the capacitated airspace. This is called the Air Traffic Flow Management Problem (TFMP). We address the complexity, showing that it is NP-hard. We build an integer programming formulation that is quite strong as some of the proposed inequalities are facet defining for the convex hull of solutions. For practical problems, the solutions of the LP relaxation of the TFMP are very often integral. In essence, we reduce the problem to efficiently solving large scale linear programming problems. Thus, the computation times are reasonably small for large scale, practical problems involving thousands of flights. Next, we address the problem of determining how to reroute aircraft in the airspace system when faced with dynamically changing weather conditions. This is called the Air Traffic Flow Management Rerouting Problem (TFMRP) We present an integrated mathematical programming approach for the TFMRP, which utilizes several methodologies, in order to minimize delay costs. In order to address the high dimensionality, we present an aggregate model, in which we formulate the TFMRP as a multicommodity, integer, dynamic network flow problem with certain side constraints. Using Lagrangian relaxation, we generate aggregate flows that are decomposed into a collection of flight paths using a randomized rounding heuristic. 
This collection of paths is used in a packing integer programming formulation, the solution of which generates feasible and near-optimal routes for individual flights. The algorithm, termed the Lagrangian Generation Algorithm, is used to solve practical problems in the southwestern portion of United States in which the solutions are within 1% of the corresponding lower bounds.
Drawert, Brian; Lawson, Michael J; Petzold, Linda; Khammash, Mustafa
2010-02-21
We have developed a computational framework for accurate and efficient simulation of stochastic spatially inhomogeneous biochemical systems. The new computational method employs a fractional step hybrid strategy. A novel formulation of the finite state projection (FSP) method, called the diffusive FSP method, is introduced for the efficient and accurate simulation of diffusive transport. Reactions are handled by the stochastic simulation algorithm.
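The stochastic simulation algorithm (SSA) used for the reaction step can be sketched minimally for a single degradation reaction. The rate constant, initial population, and helper name below are illustrative assumptions; the diffusive-FSP transport step of the hybrid scheme is not shown:

```python
# Gillespie SSA for A -> 0 with rate constant k: the waiting time to the
# next reaction is exponential with rate equal to the total propensity k*n.
import random

def ssa_decay(n0, k, t_end, seed=1):
    """Exact stochastic simulation of A -> 0; returns the count at t_end."""
    random.seed(seed)
    t, n = 0.0, n0
    while n > 0:
        propensity = k * n
        t += random.expovariate(propensity)  # time to next reaction event
        if t > t_end:
            break
        n -= 1                               # fire the degradation reaction
    return n

final = ssa_decay(n0=100, k=1.0, t_end=2.0)
print(0 <= final <= 100)  # True; on average n(t) decays like 100 * exp(-k*t)
```

In the hybrid method, such exact reaction steps alternate with diffusive-FSP transport steps within each fractional time step.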
78 FR 68377 - Approval and Promulgation of Air Quality Implementation Plans; Ohio; Ohio NOX
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-14
...On November 15, 2010, Ohio EPA submitted to EPA revisions to Ohio OAC 3745-14. EPA is proposing to approve these revisions under the Clean Air Act, which allows for Ohio's Clean Air Interstate Rule (CAIR) NOX Ozone Season Trading Program rules to supersede Ohio's nitrogen oxides (NOX) State Implementation Plan (SIP) Call Budget Trading Program rules, but leave other requirements of the NOX SIP Call in place for units not covered by CAIR.
NASA Technical Reports Server (NTRS)
Estabrook, Polly; Moon, Todd; Spade, Rob
1996-01-01
This paper will discuss some of the challenges in connecting mobile satellite users and mobile terrestrial users in a cost-efficient manner and with a grade of service comparable to that of satellite-to-fixed-user calls. Issues arising from the translation between the mobility management protocols resident at the satellite Earth station and those resident at cellular switches, either GSM (Group Special Mobile) or IS-41 (used by U.S. digital cellular systems) type, will be discussed. The impact of GSM call routing procedures on the call setup of a satellite call to a roaming GSM user will be described. Challenges facing provision of seamless call handoff between satellite and cellular systems will be given. The issues explored in the paper are summarized and future work is outlined.
Pairing call-response surveys and distance sampling for a mammalian carnivore
Hansen, Sara J. K.; Frair, Jacqueline L.; Underwood, Harold B.; Gibbs, James P.
2015-01-01
Density estimates accounting for differential animal detectability are difficult to acquire for wide-ranging and elusive species such as mammalian carnivores. Pairing distance sampling with call-response surveys may provide an efficient means of tracking changes in populations of coyotes (Canis latrans), a species of particular interest in the eastern United States. Blind field trials in rural New York State indicated 119-m linear error for triangulated coyote calls, and a 1.8-km distance threshold for call detectability, which was sufficient to estimate a detection function with precision using distance sampling. We conducted statewide road-based surveys with sampling locations spaced ≥6 km apart from June to August 2010. Each detected call (be it a single or group) counted as a single object, representing 1 territorial pair, because of uncertainty in the number of vocalizing animals. From 524 survey points and 75 detections, we estimated the probability of detecting a calling coyote to be 0.17 ± 0.02 SE, yielding a detection-corrected index of 0.75 pairs/10 km2 (95% CI: 0.52–1.1, 18.5% CV) for a minimum of 8,133 pairs across rural New York State. Importantly, we consider this an index rather than true estimate of abundance given the unknown probability of coyote availability for detection during our surveys. Even so, pairing distance sampling with call-response surveys provided a novel, efficient, and noninvasive means of monitoring populations of wide-ranging and elusive, albeit reliably vocal, mammalian carnivores. Our approach offers an effective new means of tracking species like coyotes, one that is readily extendable to other species and geographic extents, provided key assumptions of distance sampling are met.
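The detection-corrected index arithmetic in the abstract can be roughly reproduced. The effective survey area per point is an assumption here (a circle of the 1.8-km detectability radius), so the result only approximately matches the reported 0.75 pairs/10 km2; the actual distance-sampling fit is not reproduced:

```python
# Detection-corrected index: detections / (surveyed area * detection prob.).
import math

def corrected_index(n_detections, area_km2, p_detect):
    """Pairs per km^2 after correcting for imperfect detection."""
    return n_detections / (area_km2 * p_detect)

# 75 detections over 524 points, each assumed to survey a circle of the
# 1.8-km call-detectability radius, with estimated p = 0.17.
area_per_point = math.pi * 1.8 ** 2          # ~10.2 km^2 per survey point
index = corrected_index(75, 524 * area_per_point, 0.17)
print(round(index * 10, 2))  # -> 0.83 pairs/10 km^2, near the reported 0.75
```

The correction matters: dividing by p = 0.17 scales the raw count up roughly sixfold, which is why uncorrected call counts would badly understate abundance.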
77 FR 54839 - Energy Efficiency and Conservation Loan Program
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-06
... CFR Parts 1710, 1717, 1721, 1724, and 1730 RIN 0572-AC19 Energy Efficiency and Conservation Loan..., proposing policies and procedures for loan and guarantee financial assistance in support of energy efficiency programs (EE Programs) sponsored and implemented by electric utilities for the benefit of rural...
Discovering Knowledge from Noisy Databases Using Genetic Programming.
ERIC Educational Resources Information Center
Wong, Man Leung; Leung, Kwong Sak; Cheng, Jack C. Y.
2000-01-01
Presents a framework that combines Genetic Programming and Inductive Logic Programming, two approaches in data mining, to induce knowledge from noisy databases. The framework is based on a formalism of logic grammars and is implemented as a data mining system called LOGENPRO (Logic Grammar-based Genetic Programming System). (Contains 34…
Debugging a high performance computing program
Gooding, Thomas M.
2014-08-19
Methods, apparatus, and computer program products are disclosed for debugging a high performance computing program by gathering lists of addresses of calling instructions for a plurality of threads of execution of the program, assigning the threads to groups in dependence upon the addresses, and displaying the groups to identify defective threads.
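The grouping step the patent describes, assigning threads to groups according to the address of the calling instruction each is stopped at, can be sketched as below. The addresses and thread ids are made-up, and this is a toy illustration rather than the patented implementation:

```python
# Bucket threads by call-instruction address so that, among thousands of
# threads, a small group stuck at an unusual address stands out as defective.
from collections import defaultdict

def group_threads(thread_call_addrs):
    """Map call-instruction address -> list of thread ids at that address."""
    groups = defaultdict(list)
    for tid, addr in thread_call_addrs.items():
        groups[addr].append(tid)
    return dict(groups)

threads = {0: 0x40123A, 1: 0x40123A, 2: 0x40123A, 3: 0x40FFE0}
groups = group_threads(threads)
print(sorted(groups[0x40123A]))  # -> [0, 1, 2]  the healthy majority
print(groups[0x40FFE0])          # -> [3]        the outlier worth inspecting
```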
Status Report on the Program for Effective Teaching (PET).
ERIC Educational Resources Information Center
Arkansas State Dept. of Education, Little Rock.
During the 1979-80 school year, the Arkansas Department of Education, in cooperation with institutions of higher education and local education agencies, initiated a comprehensive staff development and instructional supervision program in school districts throughout the state. The purpose of the program, called Program for Effective Teaching (PET),…
Debugging a high performance computing program
Gooding, Thomas M.
2013-08-20
Methods, apparatus, and computer program products are disclosed for debugging a high performance computing program by gathering lists of addresses of calling instructions for a plurality of threads of execution of the program, assigning the threads to groups in dependence upon the addresses, and displaying the groups to identify defective threads.
How the EWD Program Aims to Meet Workforce Needs. Policy Brief
ERIC Educational Resources Information Center
Jez, Su Jin; Nodine, Thad
2016-01-01
The Economic and Workforce Development Program (EWD) aims to support California's economy by aligning community college educational programs with workforce development needs. The program connects employers and community college educators through a network of workforce training resources and partnerships called "Doing What Matters for Jobs and…
NASA Technical Reports Server (NTRS)
Jefferys, S.; Johnson, W.; Lewis, R.; Rich, R.
1981-01-01
The software modules which comprise the IGDS/TRAP Interface Program are described. A hierarchical input-process-output (HIPO) chart for each user command is given. The description consists of: (1) the function of the user command; (2) the calling sequence; (3) modules which call this user command; (4) modules called by this user command; (5) IGDS commands used by this user command; and (6) local usage of global registers. Each HIPO contains the principal functions performed within the module. Also included with each function are a list of the inputs which may be required to perform the function and a list of the outputs which may be created as a result of performing the function.
Sembower, Mark A.; Ertischek, Michelle D.; Buchholtz, Chloe; Dasgupta, Nabarun; Schnoll, Sidney H.
2013-01-01
This article examines rates of nonmedical use and diversion of extended-release amphetamine and extended-release oral methylphenidate in the United States. Prescription dispensing data were sourced from retail pharmacies. Nonmedical use data were collected from the Researched Abuse, Diversion and Addiction-Related Surveillance (RADARS) System Drug Diversion Program and Poison Center Program. Drug diversion trends nearly overlapped for extended-release amphetamine and extended-release oral methylphenidate. Calls to poison centers were generally similar; however, calls regarding extended-release amphetamine trended slightly lower than those for extended-release oral methylphenidate. Data suggest similar diversion and poison center call rates for extended-release amphetamine and extended-release oral methylphenidate. PMID:23480245
Activity-Centric Approach to Distributed Programming
NASA Technical Reports Server (NTRS)
Levy, Renato; Satapathy, Goutam; Lang, Jun
2004-01-01
The first phase of an effort to develop a NASA version of the Cybele software system has been completed. To give meaning to even a highly abbreviated summary of the modifications to be embodied in the NASA version, it is necessary to present the following background information on Cybele: Cybele is a proprietary software infrastructure for use by programmers in developing agent-based application programs [complex application programs that contain autonomous, interacting components (agents)]. Cybele provides support for event handling from multiple sources, multithreading, concurrency control, migration, and load balancing. A Cybele agent follows a programming paradigm, called activity-centric programming, that enables an abstraction over system-level thread mechanisms. Activity-centric programming relieves application programmers of the complex tasks of thread management, concurrency control, and event management. In order to provide such functionality, activity-centric programming demands support of other layers of software. This concludes the background information. In the first phase of the present development, a new architecture for Cybele was defined. In this architecture, Cybele follows a modular service-based approach to coupling of the programming and service layers of software architecture. In a service-based approach, the functionalities supported by activity-centric programming are apportioned, according to their characteristics, among several groups called services. A well-defined interface among all such services serves as a path that facilitates the maintenance and enhancement of such services without adverse effect on the whole software framework. The activity-centric application-program interface (API) is part of a kernel. The kernel API calls the services by use of their published interface. This approach makes it possible for any application code written exclusively under the API to be portable for any configuration of Cybele.
Computer codes for thermal analysis of a solid rocket motor nozzle
NASA Technical Reports Server (NTRS)
Chauhan, Rajinder Singh
1988-01-01
A number of computer codes are available for performing thermal analysis of solid rocket motor nozzles. The Aerotherm Chemical Equilibrium (ACE) computer program can be used to perform one-dimensional gas expansion to determine the state of the gas at each location of a nozzle. The ACE outputs can be used as input to a computer program called the Momentum/Energy Integral Technique (MEIT) for predicting boundary layer development, shear, and heating on the surface of the nozzle. The output from MEIT can be used as input to another computer program called the Aerotherm Charring Material Thermal Response and Ablation Program (CMA). This program is used to calculate the ablation or decomposition response of the nozzle material. A code called the Failure Analysis Nonlinear Thermal and Structural Integrated Code (FANTASTIC) is also likely to be used for performing thermal analysis of solid rocket motor nozzles after the program is duly verified. Part of the verification work on FANTASTIC was done by using one- and two-dimensional heat transfer examples with known answers. An attempt was made to prepare input for performing thermal analysis of the CCT nozzle using the FANTASTIC computer code. The CCT nozzle problem will first be solved by using ACE, MEIT, and CMA. The same problem will then be solved using FANTASTIC, and the results will be compared for verification of FANTASTIC.
Energy efficiency in nonprofit agencies: Creating effective program models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, M.A.; Prindle, B.; Scherr, M.I.
Nonprofit agencies are a critical component of the health and human services system in the US. Programs that offer energy efficiency services to nonprofits have clearly demonstrated that, with minimal investment, these agencies can reduce their energy consumption by ten to thirty percent. This energy conservation potential motivated the Department of Energy and Oak Ridge National Laboratory to conceive a project to help states develop energy efficiency programs for nonprofits. The purpose of the project was two-fold: (1) to analyze existing programs to determine which design and delivery mechanisms are particularly effective, and (2) to create model programs for states to follow in tailoring their own plans for helping nonprofits with energy efficiency programs. Twelve existing programs were reviewed, and three model programs were devised and put into operation. The model programs provide various forms of financial assistance to nonprofits and serve as a source of information on energy efficiency as well. After examining the results from the model programs (which are still ongoing) and from the existing programs, several "replicability factors" were developed for use in the implementation of programs by other states. These factors -- some concrete and practical, others more generalized -- serve as guidelines for states devising programs based on their own particular needs and resources.
2013-07-05
In this document, the Commission adopts further measures to improve the structure, efficiency, and quality of the video relay service (VRS) program, reducing the inefficiencies in the program, as well as reducing the risk of waste, fraud, and abuse, and ensuring that the program makes full use of advances in commercially available technology. These measures involve a fundamental restructuring of the program to support innovation and competition, drive down ratepayer and provider costs, eliminate incentives for waste that have burdened the Telecommunications Relay Services (TRS) Fund in the past, and further protect consumers. The Commission adopts several measures in order to: ensure that VRS users can easily select their provider of choice by promoting the development of interoperability and portability standards; enable consumers to use off-the-shelf devices by deploying a VRS application that works with these devices; create a centralized TRS User Registration Database to ensure VRS user eligibility; encourage competition and innovation in VRS call handling services; spur research and development on VRS services by entering into a Memorandum of Understanding with the National Science Foundation; and pilot a National Outreach Program to educate the general public about relay services. In this document, the Commission also adopts new VRS compensation rates that will move toward actual costs over the next four years; these rates will better approximate the actual, reasonable costs of providing VRS and will reduce the costs of operating the program. The Commission takes these steps to ensure the integrity of the TRS Fund while providing stability and certainty to providers.
Ramirez, A Susana; Leyva, Bryan; Graff, Kaitlin; Nelson, David E; Huerta, Elmer
2015-07-01
Spanish-monolingual Latinos account for 13% of U.S. residents and experience multiple barriers to effective health communication. Information intermediaries/proxies mediate between the linguistically isolated and health care providers. This study characterizes the information needs of surrogate callers to a U.S.-based Spanish-language radio health program and of the people on whose behalf they called, using content analysis of the calls placed (N = 281 calls). Women made 70% of calls; 39.1% of calls were on behalf of children, 11.0% on behalf of parents/older adults, and 18.5% on behalf of spouses/siblings/contemporary adults. The most common topics were disease symptoms/conditions (19.6%), cancer (13.9%), and reproduction/sexuality (12.9%). Calls for children were more likely than those for parents/other adults to pertain to current illness symptoms or conditions; calls for parents were more likely to be about cancer/chronic conditions. Half of all calls sought clarification about a previous medical encounter. Information-seeking surrogates may represent a useful strategy for linguistic minorities to overcome structural and individual barriers to health information access. Results suggest that Latinos are willing to seek information on behalf of friends and family and highlight the need for improved, culturally and linguistically appropriate health communication sources. Leveraging Latinos' natural familial social networks and willingness to share information may improve dissemination of culturally and linguistically appropriate health information. Further implications for patient activation and doctor-patient communication are discussed. © 2015 Society for Public Health Education.
STC Synthesis of Best Practices for Determining Value of Research Results : Research Project Capsule
DOT National Transportation Integrated Search
2012-09-01
The RAC Region II has initiated a collaborative research program consortium : through the Transportation Pooled Fund (TPF) Program. The research program : is called the Southeast Transportation Consortium (STC) and is intended to : encourage coordina...
DOT National Transportation Integrated Search
2012-07-01
The RAC Region II has initiated a collaborative research program consortium through the : Transportation Pooled Fund (TPF) Program. The research program is called the Southeast : Transportation Consortium (STC) and is intended to encourage coordinati...
The fuel tax compliance unit : an evaluation and analysis of results.
DOT National Transportation Integrated Search
2004-01-01
Kentucky utilized TEA-21 federal funds to create an innovative pilot program to identify the best practices and methods for auditing taxpayers of transportation related taxes. This program involved a four-year experimental program called the Fuel Tax...
TRIP : The Transportation Remuneration and Incentive Program in West Virginia, 1974-1979
DOT National Transportation Integrated Search
1982-07-01
Between July 1974 and June 1979, the State of West Virginia was host to the largest Federal demonstration program for improving rural transit service called Transportation Remuneration Incentive Program (TRIP). The remuneration part of TRIP (ticket s...
ERIC Educational Resources Information Center
Tesler, Lawrence G.
1984-01-01
Discusses the nature of programming languages, considering the features of BASIC, LOGO, PASCAL, COBOL, FORTH, APL, and LISP. Also discusses machine/assembly codes, the operation of a compiler, and trends in the evolution of programming languages (including interest in notational systems called object-oriented languages). (JN)
NASA Astrophysics Data System (ADS)
Gerber, Florian; Mösinger, Kaspar; Furrer, Reinhard
2017-07-01
Software packages for spatial data often implement a hybrid approach of interpreted and compiled programming languages. The compiled parts are usually written in C, C++, or Fortran, and are efficient in terms of computational speed and memory usage. Conversely, the interpreted part serves as a convenient user interface and calls the compiled code for computationally demanding operations. The price paid for the user friendliness of the interpreted component is, besides performance, the limited access to low-level and optimized code. An example of such a restriction is the 64-bit vector support of the widely used statistical language R. On the R side, users do not need to change existing code and may not even notice the extension. On the other hand, interfacing 64-bit compiled code efficiently is challenging. Since many R packages for spatial data could benefit from 64-bit vectors, we investigate strategies to efficiently pass 64-bit vectors to compiled languages. More precisely, we show how existing R packages can be extended through the foreign function interface to seamlessly support 64-bit vectors. This extension is demonstrated with the sparse matrix algebra R package spam. The new capabilities are illustrated with an example of GIMMS NDVI3g data featuring a parametric modeling approach for a non-stationary covariance matrix.
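One common strategy for the problem described, passing indices or lengths beyond the 32-bit integer range through an interface that lacks a native 64-bit integer type, is to encode them as IEEE-754 doubles, which represent every integer exactly up to 2**53. A minimal Python sketch of that encoding trick (illustrative only; this is not the spam package's actual code):

```python
# Encode a large index as an 8-byte IEEE-754 double and recover it exactly.
# Doubles have a 53-bit significand, so every integer below 2**53 survives
# the round trip, comfortably covering indices beyond the 2**31 - 1 limit
# of 32-bit signed integers.
import struct

def encode_index(i: int) -> bytes:
    assert 0 <= i < 2**53, "doubles are exact only up to 2**53"
    return struct.pack("d", float(i))  # 8-byte little/native-endian double

def decode_index(buf: bytes) -> int:
    return int(struct.unpack("d", buf)[0])

big = 2**40 + 12345  # far beyond the 32-bit range
assert decode_index(encode_index(big)) == big  # round-trips exactly
```

The same idea lets an interpreted front end hand 64-bit positions to compiled code through a channel that only transports doubles, at the cost of an explicit cast on the compiled side and the 2**53 exactness ceiling.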
Software for Automated Reading of STEP Files by I-DEAS(trademark)
NASA Technical Reports Server (NTRS)
Pinedo, John
2003-01-01
A program called "readstep" enables the I-DEAS(tm) computer-aided-design (CAD) software to automatically read Standard for the Exchange of Product Model Data (STEP) files. (The STEP format is one of several used to transfer data between dissimilar CAD programs.) Prior to the development of "readstep," it was necessary to read STEP files into I-DEAS(tm) one at a time in a slow process that required repeated intervention by the user. In operation, "readstep" prompts the user for the location of the desired STEP files and the names of the I-DEAS(tm) project and model file, then generates an I-DEAS(tm) program file called "readstep.prg" and two Unix shell programs called "runner" and "controller." The program "runner" runs I-DEAS(tm) sessions that execute readstep.prg, while "controller" controls the execution of "runner" and edits readstep.prg if necessary. The user sets "runner" and "controller" into execution simultaneously, and then no further intervention by the user is required. When "runner" has finished, the user should see only parts from successfully read STEP files present in the model file. STEP files that could not be read successfully (e.g., because of format errors) should be regenerated before attempting to read them again.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-06
... Approved Information Collection for the Energy Efficiency and Conservation Block Grant Program Status... guidance concerning the Energy Efficiency and Conservation Block Grant (EECBG) Program is available for... Conservation Block Grant (EECBG) Program Status Report''; (3) Type of Review: Revision of currently approved...
Organizational determinants of efficiency and effectiveness in mental health partial care programs.
Schinnar, A P; Kamis-Gould, E; Delucia, N; Rothbard, A B
1990-01-01
The use of partial care as a treatment modality for mentally ill patients, particularly the chronically mentally ill, has greatly increased. However, research into what constitutes a "good" program has been scant. This article reports on an evaluation study of staff productivity, cost efficiency, and service effectiveness of adult partial care programs carried out in New Jersey in fiscal year 1984/1985. Five program performance indexes are developed based on comparisons of multiple measures of resources, service activities, and client outcomes. These are used to test various hypotheses regarding the effect of organizational and fiscal variables on partial care program efficiency and effectiveness. The four issues explored are: auspices, organizational complexity, service mix, and fiscal control by the state. These were found to explain about half of the variance in program performance. In addition, partial care programs demonstrating midlevel performance with regard to productivity and efficiency were observed to be the most effective, implying a possible optimal level of efficiency at which effectiveness is maximized. PMID:2113046
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moren, Richard J.; Grindstaff, Keith D.
Hanford's Long-Term Stewardship (LTS) Program has evolved from a small, informal process with minimal support into a robust program that provides comprehensive transitions from cleanup contractors to long-term stewardship for post-cleanup requirements specified in the associated cleanup decision documents. The LTS Program is responsible for almost 100,000 acres of land and over 200 waste sites, and will soon have six cocooned reactors. Close to 2,600 documents have been identified and tagged for storage in the LTS document library. The program has successfully completed six consecutive transitions over the last two years in support of the U.S. DOE Richland Operations Office's (DOE-RL) near-term cleanup objectives of significantly reducing the footprint of active cleanup operations for the River Corridor. The program has evolved from one that was initially responsible for defining and measuring Institutional Controls for the Hanford Site into a comprehensive post-remediation surveillance and maintenance program that begins early in the transition process. In 2013, the first reactor area -- the cocooned 105-F Reactor and its surrounding 1,100 acres, called the F Area -- was transitioned. In another first, the program is expected to transition the five remaining cocooned reactors using a Transition and Turnover Package (TTP). As Hanford's LTS Program moves into the next few years, it will continue to build on a collaborative approach. The program has built strong relationships among contractors, regulators, tribes, and stakeholders, and with the U.S. Department of Energy's Office of Legacy Management (LM). The LTS Program has been working with LM since its inception; the transition process utilized LM's Site Transition Framework as one of the initial requirements documents, and the Hanford Program continues to collaborate with LM today.
One example of this collaboration is the development of the LTS Program's records management system, in which LM has been instrumental. The development of rigorous data collection and records management systems has been influenced by and built on LM's success, which also ensures compatibility between what Hanford's LTS Program develops and LM's systems. In another example, we are exploring a pilot project to ship records from the Hanford Site directly to LM for long-term storage. This pilot would gain program efficiencies because records would be handled only once: rather than being stored on-site and then shipped to an interim Federal Records Center in Seattle, records would be shipped directly to LM. The Hanford LTS Program is working to align programmatic processes, find efficiencies, and benchmark site transition requirements. Involving the Hanford LTS Program early in the transition process, with an integrated contractor and DOE team, helps ensure that there is time to work through details on the completed remediation of transitioning areas. It also allows for record documentation and storage for the future, and it is an opportunity for the program to mature through the experience gained by implementing LTS Program activities over time.
ERIC Educational Resources Information Center
Dalbey, John; Linn, Marcia
Spider World is an interactive program designed to help individuals with no previous computer experience to learn the fundamentals of programming. The program emphasizes cognitive tasks which are central to programming and provides significant problem-solving opportunities. In Spider World, the user commands a hypothetical robot (called the…