Science.gov

Sample records for additional computational costs

  1. Quantum ring-polymer contraction method: Including nuclear quantum effects at no additional computational cost in comparison to ab initio molecular dynamics

    NASA Astrophysics Data System (ADS)

    John, Christopher; Spura, Thomas; Habershon, Scott; Kühne, Thomas D.

    2016-04-01

    We present a simple and accurate computational method which facilitates ab initio path-integral molecular dynamics simulations, where the quantum-mechanical nature of the nuclei is explicitly taken into account, at essentially no additional computational cost in comparison to the corresponding calculation using classical nuclei. The predictive power of the proposed quantum ring-polymer contraction method is demonstrated by computing various static and dynamic properties of liquid water at ambient conditions using density functional theory. This development will enable routine inclusion of nuclear quantum effects in ab initio molecular dynamics simulations of condensed-phase systems.
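The contraction idea behind this record can be sketched in a few lines: the ring polymer's bead coordinates are transformed to normal modes along the imaginary-time axis, the high-frequency modes are discarded, and the surviving modes are mapped onto a smaller polymer. This is a generic illustration of ring-polymer contraction, not the authors' code; the function name and array layout are our assumptions.

```python
import numpy as np

def contract_ring_polymer(q, n_contracted):
    """Contract ring-polymer bead positions q (shape: n_beads, ...) onto
    n_contracted beads by truncating high-frequency normal modes.
    Generic sketch of the contraction idea, not the authors' implementation."""
    n_beads = q.shape[0]
    # Fourier transform along the imaginary-time (bead) axis
    modes = np.fft.rfft(q, axis=0)
    # Keep only the lowest-frequency modes that fit on the contracted polymer
    n_keep = n_contracted // 2 + 1
    modes = modes[:n_keep]
    # Back-transform onto the smaller ring polymer, rescaling amplitudes
    # so that the centroid (bead average) is preserved exactly
    return np.fft.irfft(modes, n=n_contracted, axis=0) * (n_contracted / n_beads)
```

Contracting all the way to a single bead recovers the centroid, which is why the expensive ab initio forces can be evaluated on far fewer beads than the full path integral requires.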

  2. Calculators and Computers: Graphical Addition.

    ERIC Educational Resources Information Center

    Spero, Samuel W.

    1978-01-01

    A computer program is presented that generates problem sets involving sketching graphs of trigonometric functions using graphical addition. The students use calculators to sketch the graphs, and a computer solution is used to check them. (MP)

  3. Cost of Computer Searching

    ERIC Educational Resources Information Center

    Chenery, Peter J.

    1973-01-01

    The program described has the primary objective of making Federally generated technology and research information available to public and private agencies. Cost analysis, data banks, and search strategies are explained. (Author/DH)

  4. Computational Process Modeling for Additive Manufacturing (OSU)

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the aerospace industry to "print" parts that traditionally are very complex, high-cost, or long-lead-schedule items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in the desired shape. The next layer of powder is applied and the process is repeated until, layer by layer, a very complex part is built. This reduces cost and schedule by eliminating the very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for spaceflight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments that would take years and high material cost to run empirically can be run quickly in a model. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  5. Computational Process Modeling for Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2014-01-01

    This project concerns computational process and material modeling of powder-bed additive manufacturing of IN 718. Its goals are to optimize material build parameters with reduced time and cost through modeling, increase understanding of build properties, increase the reliability of builds, decrease the time to adoption of the process for critical hardware, and potentially decrease post-build heat treatments. Tasks: conduct single-track and coupon builds at various build parameters; record build-parameter information and QM Meltpool data; refine the Applied Optimization powder-bed AM process model using the data; report thermal modeling results; conduct metallography of build samples; calibrate STK models using the metallography findings; run STK models using AO thermal profiles and report the STK modeling results; and validate the modeling with an additional build. Findings to date: photodiode intensity measurements are highly linear with power input; melt pool intensity is highly correlated with melt pool size; and melt pool size and intensity increase with power. Applied Optimization will use the data to develop a powder-bed additive manufacturing process model.

  6. Cutting Costs on Computer Forms.

    ERIC Educational Resources Information Center

    Rupp, Robert V., Jr.

    1989-01-01

    Using the experience of Ford Motor Company, Oscar Mayer, and IBM, this article shows that companies are enjoying high-quality product performance and substantially lower costs by converting from premium white bond computer stock forms to blended bond forms. School administrators are advised to do likewise. (MLH)

  7. A Study of Additional Costs of Second Language Instruction.

    ERIC Educational Resources Information Center

    McEwen, Nelly

    A study was conducted whose primary aim was to identify and explain additional costs incurred by Alberta, Canada school jurisdictions providing second language instruction in 1980. Additional costs were defined as those which would not have been incurred had the second language program not been in existence. Three types of additional costs were…

  8. Computer Maintenance Operations Center (CMOC), additional computer support equipment ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Computer Maintenance Operations Center (CMOC), additional computer support equipment - Beale Air Force Base, Perimeter Acquisition Vehicle Entry Phased-Array Warning System, Technical Equipment Building, End of Spencer Paul Road, north of Warren Shingle Road (14th Street), Marysville, Yuba County, CA

  9. Cost and Effectiveness of Computer Based Training.

    ERIC Educational Resources Information Center

    Fletcher, J. D.

    The methodology and results of cost-effectiveness evaluations of computer based instruction used in military training are discussed. Methodological issues center on decisions about what cost elements and what effectiveness measures should be included, and how they should be combined. Preliminary results suggest that, in general, computer based…

  10. Cut Costs with Thin Client Computing.

    ERIC Educational Resources Information Center

    Hartley, Patrick H.

    2001-01-01

    Discusses how school districts can considerably increase the number of administrative computers in their districts without a corresponding increase in costs by using the "Thin Client" component of the Total Cost of Ownership (TCO) model. TCO and Thin Client are described, including their software and hardware components. An example of a Thin Client…

  11. Computer-Controlled HVAC -- at Low Cost

    ERIC Educational Resources Information Center

    American School and University, 1974

    1974-01-01

    By tying into a computerized building-automation network, Schaumburg High School, Illinois, slashed its energy consumption by one-third. The remotely connected computer controls the mechanical system for the high school as well as other buildings in the community, with the cost being shared by all. (Author)

  12. The Hidden Costs of Wireless Computer Labs

    ERIC Educational Resources Information Center

    Daly, Una

    2005-01-01

    Various elementary schools and middle schools across the U.S. have purchased one or more mobile laboratories. Although the wireless labs have provided more classroom computing, teachers and technology aides still have mixed views about their cost-benefit ratio. This is because the proliferation of viruses and spyware has dramatically increased…

  13. Additive Manufacturing of Low Cost Upper Stage Propulsion Components

    NASA Technical Reports Server (NTRS)

    Protz, Christopher; Bowman, Randy; Cooper, Ken; Fikes, John; Taminger, Karen; Wright, Belinda

    2014-01-01

    NASA is currently developing Additive Manufacturing (AM) technologies and design tools aimed at reducing the costs and manufacturing time of regeneratively cooled rocket engine components. These Low Cost Upper Stage Propulsion (LCUSP) tasks are funded through NASA's Game Changing Development Program in the Space Technology Mission Directorate. The LCUSP project will develop a copper alloy additive manufacturing design process and develop and optimize the Electron Beam Freeform Fabrication (EBF3) manufacturing process to direct deposit a nickel alloy structural jacket and manifolds onto an SLM manufactured GRCop chamber and Ni-alloy nozzle. In order to develop these processes, the project will characterize both the microstructural and mechanical properties of the SLM-produced GRCop-84, and will explore and document novel design techniques specific to AM combustion devices components. These manufacturing technologies will be used to build a 25K-class regenerative chamber and nozzle (to be used with tested DMLS injectors) that will be tested individually and as a system in hot fire tests to demonstrate the applicability of the technologies. These tasks are expected to bring costs and manufacturing time down as spacecraft propulsion systems typically comprise more than 70% of the total vehicle cost and account for a significant portion of the development schedule. Additionally, high pressure/high temperature combustion chambers and nozzles must be regeneratively cooled to survive their operating environment, causing their design to be time consuming and costly to build. LCUSP presents an opportunity to develop and demonstrate a process that can infuse these technologies into industry, build competition, and drive down costs of future engines.

  14. Cost Estimation of Laser Additive Manufacturing of Stainless Steel

    NASA Astrophysics Data System (ADS)

    Piili, Heidi; Happonen, Ari; Väistö, Tapio; Venkataramanan, Vijaikrishnan; Partanen, Jouni; Salminen, Antti

    Laser additive manufacturing (LAM) is a layer-wise fabrication method in which a laser beam melts metallic powder to form solid objects. Although 3D printing was invented 30 years ago, its industrial use remains quite limited, even as the introduction of cheap consumer 3D printers in recent years has popularized 3D printing; interest is focusing more and more on the manufacturing of functional parts. The aim of this study is to define and discuss the current economic opportunities and restrictions of the LAM process. Manufacturing costs were studied for different build scenarios, each with an estimated cost structure derived from the calculated build time and the costs of machine, material and energy, assuming optimized machine utilization. All manufacturing and time simulations in this study were carried out with a research machine equivalent to commercial EOS M series equipment. The study shows that the main expense in LAM is the investment cost of the LAM machine, compared to which the relative proportions of the energy and material costs are very low. The manufacturing time per part is therefore the key factor in optimizing the costs of LAM.
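The cost structure this study describes (machine investment dominating, material and energy minor) can be sketched as a simple per-part model. The function and every figure in the usage note below are illustrative assumptions, not values from the study.

```python
def lam_part_cost(build_time_h, machine_price_eur, annual_hours,
                  depreciation_years, material_kg, material_eur_per_kg,
                  power_kw, energy_eur_per_kwh):
    """Rough per-part cost split for laser additive manufacturing.
    Amortizes the machine over its utilized hours and adds material
    and energy costs for one build."""
    machine_rate = machine_price_eur / (depreciation_years * annual_hours)  # EUR/h
    machine = machine_rate * build_time_h
    material = material_kg * material_eur_per_kg
    energy = power_kw * build_time_h * energy_eur_per_kwh
    return {"machine": machine, "material": material, "energy": energy,
            "total": machine + material + energy}
```

With invented numbers (a EUR 500,000 machine depreciated over 5 years at 5,000 utilized hours per year, a 10 h build, 0.5 kg of powder at EUR 80/kg, 3 kW at EUR 0.15/kWh), machine time costs EUR 200 against roughly EUR 45 of material and energy combined, so shortening build time per part is the main cost lever, in line with the study's conclusion.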

  15. 2 CFR 200.453 - Materials and supplies costs, including costs of computing devices.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... costs of computing devices. 200.453 Section 200.453 Grants and Agreements Office of Management and... Provisions for Selected Items of Cost § 200.453 Materials and supplies costs, including costs of computing... performance of a Federal award may be charged as direct costs. In the specific case of computing...

  16. Cost-Benefit Analysis of Computer Resources for Machine Learning

    USGS Publications Warehouse

    Champion, Richard A.

    2007-01-01

    Machine learning describes pattern-recognition algorithms - in this case, probabilistic neural networks (PNNs). These can be computationally intensive, in part because of the nonlinear optimizer, a numerical process that calibrates the PNN by minimizing a sum of squared errors. This report suggests efficiencies that are expressed as cost and benefit. The cost is computer time needed to calibrate the PNN, and the benefit is goodness-of-fit, how well the PNN learns the pattern in the data. There may be a point of diminishing returns where a further expenditure of computer resources does not produce additional benefits. Sampling is suggested as a cost-reduction strategy. One consideration is how many points to select for calibration and another is the geometric distribution of the points. The data points may be nonuniformly distributed across space, so that sampling at some locations provides additional benefit while sampling at other locations does not. A stratified sampling strategy can be designed to select more points in regions where they reduce the calibration error and fewer points in regions where they do not. Goodness-of-fit tests ensure that the sampling does not introduce bias. This approach is illustrated by statistical experiments for computing correlations between measures of roadless area and population density for the San Francisco Bay Area. The alternative to training efficiencies is to rely on high-performance computer systems. These may require specialized programming and algorithms that are optimized for parallel performance.
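The stratified-sampling strategy the report suggests can be sketched as follows. Stratifying along a single coordinate and allocating samples in proportion to each stratum's mean calibration error are our assumptions for illustration, not the report's exact procedure.

```python
import numpy as np

def stratified_sample(points, errors, n_strata, n_total, rng=None):
    """Select n_total calibration points, drawing more from spatial strata
    where the calibration error is high (a cost-reduction sketch)."""
    rng = np.random.default_rng(rng)
    # Bin points into equal-population strata along the first coordinate
    edges = np.quantile(points[:, 0], np.linspace(0, 1, n_strata + 1))
    idx = np.clip(np.searchsorted(edges, points[:, 0], side="right") - 1,
                  0, n_strata - 1)
    # Allocate samples proportionally to each stratum's mean error
    weights = np.array([errors[idx == s].mean() if np.any(idx == s) else 0.0
                        for s in range(n_strata)])
    alloc = np.maximum(1, np.round(n_total * weights / weights.sum()).astype(int))
    chosen = []
    for s in range(n_strata):
        members = np.flatnonzero(idx == s)
        take = min(alloc[s], members.size)
        chosen.extend(rng.choice(members, size=take, replace=False))
    return np.array(chosen)
```

Goodness-of-fit checks on the resulting subsample, as the report notes, are still needed to confirm the allocation has not introduced bias.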

  17. Cost Considerations in Nonlinear Finite-Element Computing

    NASA Technical Reports Server (NTRS)

    Utku, S.; Melosh, R. J.; Islam, M.; Salama, M.

    1985-01-01

    Conference paper discusses computational requirements for finite-element analysis using a quasi-linear approach to nonlinear problems. Paper evaluates computational efficiency of different computer architectural types in terms of relative cost and computing time.

  18. 48 CFR 352.216-70 - Additional cost principles.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... indirect costs on the same basis as the allocation of indirect costs to sponsored research and development. (3) The cost of IR & D, including its proportionate share of indirect costs, is unallowable. (End of... are allowable as indirect costs. (3) B & P costs of past accounting periods are unallowable in...

  19. 48 CFR 3452.216-70 - Additional cost principles.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... costs of the current accounting period are allowable as indirect costs; bid and proposal costs of past... indirect costs on the same basis as the allocations of indirect costs of sponsored research and development. The costs of independent research and development, including its proportionate share of indirect...

  20. 48 CFR 3452.216-70 - Additional cost principles.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... costs of the current accounting period are allowable as indirect costs; bid and proposal costs of past... indirect costs on the same basis as the allocations of indirect costs of sponsored research and development. The costs of independent research and development, including its proportionate share of indirect...

  1. 48 CFR 3452.216-70 - Additional cost principles.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... costs of the current accounting period are allowable as indirect costs; bid and proposal costs of past... indirect costs on the same basis as the allocations of indirect costs of sponsored research and development. The costs of independent research and development, including its proportionate share of indirect...

  2. Computed tomography: cost and efficacy implications.

    PubMed

    Abrams, H L; McNeil, B J

    1978-07-01

    Because CT is unique, it has been accepted by physicians with unrestrained enthusiasm. However, the capital investment and cost of maintenance are high, and there has been no orderly program of dispersion despite the profound interest of the regulatory agencies in cost containment. Although the diagnostic accuracy of CT in both the head and body is high, its information gain over other competing imaging methods, particularly those in the abdomen (ultrasound, nuclear medicine), has not been fully documented. In evaluating the cost effectiveness of CT, long-term outcome, while the most important criterion, requires carefully controlled studies over many years. Short-term value may be measured by assessing the degree to which CT furnishes new diagnostic information, its accuracy, its effect on the morbidity and mortality of diagnostic and therapeutic procedures, its impact on treatment planning, and changes in cost and savings incident to its use. Prospective studies must relate the contribution of CT to that of competing methods and document the impact of additional diagnostic information. PMID:97990

  3. Can Additional Homeopathic Treatment Save Costs? A Retrospective Cost-Analysis Based on 44500 Insured Persons

    PubMed Central

    Ostermann, Julia K.; Reinhold, Thomas; Witt, Claudia M.

    2015-01-01

    Objectives The aim of this study was to compare the health care costs for patients using additional homeopathic treatment (homeopathy group) with the costs for those receiving usual care (control group). Methods Cost data provided by a large German statutory health insurance company were retrospectively analysed from the societal perspective (primary outcome) and from the statutory health insurance perspective. Patients in both groups were matched using a propensity score matching procedure based on socio-demographic variables as well as costs, number of hospital stays and sick leave days in the previous 12 months. Total cumulative costs over 18 months were compared between the groups with an analysis of covariance (adjusted for baseline costs) across diagnoses and for six specific diagnoses (depression, migraine, allergic rhinitis, asthma, atopic dermatitis, and headache). Results Data from 44,550 patients (67.3% females) were available for analysis. From the societal perspective, total costs after 18 months were higher in the homeopathy group (adj. mean: EUR 7,207.72 [95% CI 7,001.14–7,414.29]) than in the control group (EUR 5,857.56 [5,650.98–6,064.13]; p<0.0001) with the largest differences between groups for productivity loss (homeopathy EUR 3,698.00 [3,586.48–3,809.53] vs. control EUR 3,092.84 [2,981.31–3,204.37]) and outpatient care costs (homeopathy EUR 1,088.25 [1,073.90–1,102.59] vs. control EUR 867.87 [853.52–882.21]). Group differences decreased over time. For all diagnoses, costs were higher in the homeopathy group than in the control group, although this difference was not always statistically significant. Conclusion Compared with usual care, additional homeopathic treatment was associated with significantly higher costs. These analyses did not confirm previously observed cost savings resulting from the use of homeopathy in the health care system. PMID:26230412

  4. Estimating the additional cost of disability: beyond budget standards.

    PubMed

    Wilkinson-Meyers, Laura; Brown, Paul; McNeill, Robert; Patston, Philip; Dylan, Sacha; Baker, Ronelle

    2010-11-01

    Disabled people have long advocated for sufficient resources to live a life with the same rights and responsibilities as non-disabled people. Identifying the unique resource needs of disabled people relative to the population as a whole and understanding the source of these needs is critical for determining adequate levels of income support and for prioritising service provision. Previous attempts to identify the resources and costs associated with disability have tended to rely on surveys of current resource use. These approaches have been criticised as being inadequate for identifying the resources that would be required to achieve a similar standard of living to non-disabled people and for not using methods that are acceptable to and appropriate for the disabled community. The challenge is therefore to develop a methodology that accurately identifies these unique resource needs, uses an approach that is acceptable to the disabled community, enables all disabled people to participate, and distinguishes 'needs' from 'wants.' This paper describes and presents the rationale for a mixed methodology for identifying and prioritising the resource needs of disabled people. The project is a partnership effort between disabled researchers, a disability support organisation and academic researchers in New Zealand. The method integrates a social model of disability framework and an economic cost model using a budget standards approach to identify additional support, equipment, travel and time required to live an 'ordinary life' in the community. A survey is then used to validate the findings and identify information gaps and resource priorities of the community. Both the theoretical basis of the approach and the practical challenges of designing and implementing a methodology that is acceptable to the disabled community, service providers and funding agencies are discussed. PMID:20933315

  5. 48 CFR 3452.216-70 - Additional cost principles.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... scientific, cost and other data needed to support the bids, proposals and applications. Bid and proposal... practice is to treat these costs by some other method, they may be accepted if they are found to...

  6. 48 CFR 352.216-70 - Additional cost principles.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...-Federal contracts, grants, and agreements, including the development of scientific, cost, and other data... method, they may be accepted if they are found to be reasonable and equitable. (4) B & P costs do...

  7. 48 CFR 3452.216-70 - Additional cost principles.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... scientific, cost, and other data needed to support the bids, proposals, and applications. Bid and proposal... practice is to treat these costs by some other method, they may be accepted if they are found to...

  8. X-ray computed tomography for additive manufacturing: a review

    NASA Astrophysics Data System (ADS)

    Thompson, A.; Maskery, I.; Leach, R. K.

    2016-07-01

    In this review, the use of x-ray computed tomography (XCT) is examined, identifying the requirement for volumetric dimensional measurements in industrial verification of additively manufactured (AM) parts. The XCT technology and AM processes are summarised, and their historical use is documented. The use of XCT and AM as tools for medical reverse engineering is discussed, and the transition of XCT from a tool used solely for imaging to a vital metrological instrument is documented. The current states of the combined technologies are then examined in detail, separated into porosity measurements and general dimensional measurements. In the conclusions of this review, the limitation of resolution on improvement of porosity measurements and the lack of research regarding the measurement of surface texture are identified as the primary barriers to ongoing adoption of XCT in AM. The limitations of both AM and XCT regarding slow speeds and high costs, when compared to other manufacturing and measurement techniques, are also noted as general barriers to continued adoption of XCT and AM.

  9. 48 CFR 352.216-70 - Additional cost principles.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... include independent research and development (IR & D) costs covered by the following paragraph, or pre-award costs covered by paragraph 36 of Attachment B to OMB Circular A-122. (b) IR & D costs. (1) IR & D...-Federal contracts, grants, or other agreements. (2) IR & D shall be allocated its proportionate share...

  10. 48 CFR 352.216-70 - Additional cost principles.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... include independent research and development (IR & D) costs covered by the following paragraph, or pre-award costs covered by paragraph 36 of Attachment B to OMB Circular A-122. (b) IR & D costs. (1) IR & D...-Federal contracts, grants, or other agreements. (2) IR & D shall be allocated its proportionate share...

  11. 48 CFR 352.216-70 - Additional cost principles.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... include independent research and development (IR & D) costs covered by the following paragraph, or pre-award costs covered by paragraph 36 of Attachment B to OMB Circular A-122. (b) IR & D costs. (1) IR & D...-Federal contracts, grants, or other agreements. (2) IR & D shall be allocated its proportionate share...

  12. 38 CFR 36.4404 - Computation of cost.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... cost of adaptations. Under section 2101(b) of Chapter 21, for the purpose of computing the amount of... market value of the adaptations, including installation costs, determined to be reasonably necessary,...

  13. Additional development of the XTRAN3S computer program

    NASA Technical Reports Server (NTRS)

    Borland, C. J.

    1989-01-01

    Additional developments and enhancements to the XTRAN3S computer program, a code for calculating steady and unsteady aerodynamics and associated aeroelastic solutions for 3-D wings in the transonic flow regime, are described. Algorithm improvements to the XTRAN3S program were provided, including an implicit finite difference scheme to enhance the allowable time step and vectorization for improved computational efficiency. The code was modified to treat configurations with a fuselage, multiple stores/nacelles/pylons, and winglets. Computer program changes (updates) for error corrections and updates for version control are provided.

  14. Computed Tomography Inspection and Analysis for Additive Manufacturing Components

    NASA Technical Reports Server (NTRS)

    Beshears, Ronald D.

    2016-01-01

    Computed tomography (CT) inspection was performed on test articles additively manufactured from metallic materials. Metallic AM and machined wrought alloy test articles with programmed flaws were inspected using a 2MeV linear accelerator based CT system. Performance of CT inspection on identically configured wrought and AM components and programmed flaws was assessed using standard image analysis techniques to determine the impact of additive manufacturing on inspectability of objects with complex geometries.

  15. Cutting Technology Costs with Refurbished Computers

    ERIC Educational Resources Information Center

    Dessoff, Alan

    2010-01-01

    Many district administrators are finding that they can save money on computers by buying preowned ones instead of new ones. The practice has other benefits as well: It allows districts to give more computers to more students who need them, and it also promotes good environmental practices by keeping the machines out of landfills, where they…

  16. On the capabilities and computational costs of neuron models.

    PubMed

    Skocik, Michael J; Long, Lyle N

    2014-08-01

    We review the Hodgkin-Huxley, Izhikevich, and leaky integrate-and-fire neuron models in regular spiking modes, solved with the forward Euler, fourth-order Runge-Kutta, and exponential Euler methods, and determine the time steps and corresponding computational costs required to make the solutions accurate. We conclude that the leaky integrate-and-fire model requires the fewest computations, and that the Hodgkin-Huxley and Izhikevich models are comparable in computational cost. PMID:25050945
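The trade-off this paper quantifies (smaller time steps buy accuracy at the price of more update computations) shows up already in the simplest case: the input-free leaky integrate-and-fire membrane equation dV/dt = -V/tau integrated with forward Euler. A minimal sketch, with parameter values of our choosing:

```python
import math

def lif_forward_euler(dt, t_end, tau=0.02, v0=1.0):
    """Integrate the input-free leaky integrate-and-fire membrane equation
    dV/dt = -V/tau with forward Euler; returns the final voltage and the
    number of update steps (a proxy for computational cost)."""
    n_steps = round(t_end / dt)
    v = v0
    for _ in range(n_steps):
        v += dt * (-v / tau)  # one forward-Euler update
    return v, n_steps

exact = math.exp(-0.1 / 0.02)                       # closed-form V(0.1 s)
coarse, n_coarse = lif_forward_euler(dt=5e-3, t_end=0.1)
fine, n_fine = lif_forward_euler(dt=1e-4, t_end=0.1)
# The fine step is far closer to the exact decay, but costs 50x the updates.
```

The same experiment extends naturally to fourth-order Runge-Kutta and exponential Euler, which is essentially the comparison the paper carries out across the three neuron models.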

  17. Cost-Justification of Computers in General Practice in Canada

    PubMed Central

    McAlister, Neil Harding; Covvey, H. Dominic; McAlister, Nazlin K.

    1978-01-01

    In General Practice, computers might assist clinical decision-making, perform business procedures, and support health care delivery research. Before being used, however, computers first must be economically justifiable. The cost of computer systems is known. One can estimate their potential dollar benefit in primary care. Computer technology was therefore assessed for its potential to save money in a model General Practice. Information processing needs were noted, functional specifications were developed, and typical costs for systems appropriate to practices of varying size were calculated. Computers might improve primary care in many ways, but savings accrue only from support of billing and accounting. Savings might equal or exceed the cost of a computer system in groups of practitioners, optimally composed of between six and eight doctors. If computers could pay for themselves by performing essential business functions, they would then be readily available for other purposes in General Practice.

  18. Estimating boiling water reactor decommissioning costs. A user's manual for the BWR Cost Estimating Computer Program (CECP) software: Draft report for comment

    SciTech Connect

    Bierschbach, M.C.

    1994-12-01

    With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit to the U.S. Nuclear Regulatory Commission (NRC) for review decommissioning plans and cost estimates. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning BWR power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning.

  19. Estimating pressurized water reactor decommissioning costs: A user's manual for the PWR Cost Estimating Computer Program (CECP) software. Draft report for comment

    SciTech Connect

    Bierschbach, M.C.; Mencinsky, G.J.

    1993-10-01

    With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit to the US Nuclear Regulatory Commission (NRC) for review decommissioning plans and cost estimates. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning PWR power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning.

  20. Estimating boiling water reactor decommissioning costs: A user's manual for the BWR Cost Estimating Computer Program (CECP) software. Final report

    SciTech Connect

    Bierschbach, M.C.

    1996-06-01

    Nuclear power plant licensees are required to submit to the US Nuclear Regulatory Commission (NRC) for review their decommissioning cost estimates. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning boiling water reactor (BWR) power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning.

  1. Computers and Media Centers: Services, Satisfaction, and Cost Effectiveness.

    ERIC Educational Resources Information Center

    Givens, Patsy B.

    A survey was conducted of school media centers throughout the United States to determine: (1) how computers are being utilized by these centers, (2) the levels of satisfaction with present services, and (3) whether or not the services being provided by the computer are cost effective. Responses to survey forms returned by 20 school districts and…

  2. Thermodynamic cost of computation, algorithmic complexity and the information metric

    NASA Technical Reports Server (NTRS)

    Zurek, W. H.

    1989-01-01

    Algorithmic complexity is discussed as a computational counterpart to the second law of thermodynamics. It is shown that algorithmic complexity, which is a measure of randomness, sets limits on the thermodynamic cost of computations and casts a new light on the limitations of Maxwell's demon. Algorithmic complexity can also be used to define distance between binary strings.
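
    The thermodynamic cost of computation referred to above is commonly quantified through Landauer's bound: erasing one bit of information dissipates at least kT ln 2 of energy. A minimal sketch of that arithmetic (the choice of temperature and bit count is illustrative):

```python
import math

BOLTZMANN_K = 1.380649e-23  # J/K (exact value in the 2019 SI)

def landauer_min_energy(bits_erased: float, temperature_k: float = 300.0) -> float:
    """Minimum dissipation (joules) for erasing `bits_erased` bits at temperature T,
    per Landauer's bound E >= N * k * T * ln(2)."""
    return bits_erased * BOLTZMANN_K * temperature_k * math.log(2)

# Erasing one gigabyte (8e9 bits) at room temperature, 300 K:
print(landauer_min_energy(8e9))  # on the order of 2.3e-11 J
```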

  3. Costs incurred by applying computer-aided design/computer-aided manufacturing techniques for the reconstruction of maxillofacial defects.

    PubMed

    Rustemeyer, Jan; Melenberg, Alex; Sari-Rieger, Aynur

    2014-12-01

    This study aims to evaluate the additional costs incurred by using a computer-aided design/computer-aided manufacturing (CAD/CAM) technique for reconstructing maxillofacial defects by analyzing typical cases. The medical charts of 11 consecutive patients who were subjected to the CAD/CAM technique were considered, and invoices from the companies providing the CAD/CAM devices were reviewed for every case. The number of devices used was significantly correlated with cost (r = 0.880; p < 0.001). Significant differences in mean costs were found between cases in which prebent reconstruction plates were used (€3346.00 ± €29.00) and cases in which they were not (€2534.22 ± €264.48; p < 0.001). Significant differences were also obtained between the costs of two, three and four devices, even when ignoring the cost of reconstruction plates. Additional fees provided by statutory health insurance covered a mean of 171.5% ± 25.6% of the cost of the CAD/CAM devices. Since the additional fees provide financial compensation, we believe that the CAD/CAM technique is suited for wide application and not restricted to complex cases. Where additional fees/funds are not available, the CAD/CAM technique might be unprofitable, so the decision whether or not to use it remains a case-to-case decision with respect to cost versus benefit. PMID:25459375
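
    The device-count/cost correlation reported above (r = 0.880) is a Pearson correlation. A self-contained sketch of the computation, using hypothetical per-case data rather than the study's actual invoices:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-case data: number of CAD/CAM devices vs. invoice total (EUR)
devices = [2, 2, 3, 3, 4, 4]
costs = [2100, 2300, 2700, 2900, 3300, 3400]
print(round(pearson_r(devices, costs), 3))
```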

  4. Cost analysis for computer supported multiple-choice paper examinations

    PubMed Central

    Mandel, Alexander; Hörnlein, Alexander; Ifland, Marianus; Lüneburg, Edeltraud; Deckert, Jürgen; Puppe, Frank

    2011-01-01

    Introduction: Multiple-choice examinations remain fundamental for assessment in medical degree programs. In addition to content-related research, optimization of the technical procedure is an important question. Medical examiners face three options: paper-based examinations with or without computer support, or completely electronic examinations. Critical aspects are the effort of formatting, the logistic effort during the actual examination, the quality, promptness, and effort of the correction, the time for making the documents available for inspection by the students, and the statistical analysis of the examination results. Methods: For the past three semesters, a computer program for the input and formatting of MC questions in medical and other paper-based examinations has been in use and continuously improved at Wuerzburg University. In the winter semester (WS) 2009/10 eleven, in the summer semester (SS) 2010 twelve, and in WS 2010/11 thirteen medical examinations were conducted with the program and automatically evaluated. For the last two semesters the remaining manual workload was recorded. Results: The effort of formatting and subsequent analysis, including adjustments of the analysis, for an average examination with about 140 participants and about 35 questions was 5-7 hours for exams without complications in WS 2009/10, about 2 hours in SS 2010, and about 1.5 hours in WS 2010/11. Including exams with complications, the average time was about 3 hours per exam in SS 2010 and 2.67 hours in WS 2010/11. Discussion: For conventional multiple-choice exams, computer-based formatting and evaluation of paper-based exams offers a significant time reduction for lecturers compared with manual correction of paper-based exams; compared with purely electronically conducted exams, it requires a much simpler technological infrastructure and fewer staff during the exam. PMID:22205913
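
    The automated evaluation step described above reduces, at its core, to comparing each scanned answer sheet against an answer key. A minimal sketch (the key and student answers are hypothetical):

```python
# Minimal sketch of automated MC evaluation: one point per answer that
# matches the key. Key and student responses are hypothetical.
def score_sheet(answer_key: list, answers: list) -> int:
    """Count how many given answers match the answer key, position by position."""
    return sum(1 for key, given in zip(answer_key, answers) if key == given)

key = ["A", "C", "B", "D", "A"]
sheets = {
    "student_01": ["A", "C", "B", "A", "A"],
    "student_02": ["B", "C", "B", "D", "A"],
}
results = {sid: score_sheet(key, ans) for sid, ans in sheets.items()}
print(results)  # → {'student_01': 4, 'student_02': 4}
```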

  5. 47 CFR 25.111 - Additional information and ITU cost recovery.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 2 2014-10-01 2014-10-01 false Additional information and ITU cost recovery. 25.111 Section 25.111 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER....111 Additional information and ITU cost recovery. (a) The Commission may request from any party at...

  6. Accurate charge capture and cost allocation: cost justification for bedside computing.

    PubMed Central

    Grewal, R.; Reed, R. L.

    1993-01-01

    This paper shows that cost justification for bedside clinical computing can be made by recouping charges with accurate charge capture. Twelve months' worth of professional charges for a sixteen-bed surgical intensive care unit were computed from charted data in a bedside clinical database and compared to the professional charges actually billed by the unit. A substantial difference between predicted charges and billed charges was found. This paper also discusses the concept of appropriate cost allocation in the inpatient environment and the feasibility of appropriate allocation as a by-product of bedside computing. PMID:8130444

  7. CHARMM additive and polarizable force fields for biophysics and computer-aided drug design

    PubMed Central

    Vanommeslaeghe, K.

    2014-01-01

    Background Molecular Mechanics (MM) is the method of choice for computational studies of biomolecular systems owing to its modest computational cost, which makes it possible to routinely perform molecular dynamics (MD) simulations on chemical systems of biophysical and biomedical relevance. Scope of Review As one of the main factors limiting the accuracy of MD results is the empirical force field used, the present paper offers a review of recent developments in the CHARMM additive force field, one of the most popular biomolecular force fields. Additionally, we present a detailed discussion of the CHARMM Drude polarizable force field, anticipating a growth in the importance and utilization of polarizable force fields in the near future. Throughout the discussion emphasis is placed on the force fields’ parametrization philosophy and methodology. Major Conclusions Recent improvements in the CHARMM additive force field are mostly related to newly found weaknesses in the previous generation of additive force fields. Beyond the additive approximation is the newly available CHARMM Drude polarizable force field, which allows for MD simulations of up to 1 microsecond on proteins, DNA, lipids and carbohydrates. General Significance Addressing the limitations ensures the reliability of the new CHARMM36 additive force field for the types of calculations that are presently coming into routine computational reach, while the availability of the Drude polarizable force field offers an inherently more accurate model of the underlying physical forces driving macromolecular structures and dynamics. PMID:25149274
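
    A representative piece of the additive force fields discussed above is the 12-6 Lennard-Jones term for nonbonded van der Waals interactions (used alongside Coulomb electrostatics). A sketch with hypothetical, not CHARMM, parameter values:

```python
def lennard_jones(r: float, epsilon: float, sigma: float) -> float:
    """12-6 Lennard-Jones potential V(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6),
    the nonbonded van der Waals term of an additive force field."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 ** 2 - sr6)

# The potential crosses zero at r = sigma and reaches its minimum of
# -epsilon at r = 2**(1/6) * sigma.
eps, sig = 0.5, 3.4  # hypothetical well depth and size parameters
print(lennard_jones(sig, eps, sig))                  # → 0.0
print(lennard_jones(2 ** (1 / 6) * sig, eps, sig))   # ≈ -epsilon
```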

  8. Software Requirements for a System to Compute Mean Failure Cost

    SciTech Connect

    Aissa, Anis Ben; Abercrombie, Robert K; Sheldon, Frederick T; Mili, Ali

    2010-01-01

    In earlier works, we presented a computational infrastructure that allows an analyst to estimate the security of a system in terms of the loss that each stakeholder stands to incur. We also demonstrated this infrastructure through the results of security breakdowns for an e-commerce case. In this paper, we illustrate this infrastructure with an application that supports the computation of the Mean Failure Cost (MFC) for each stakeholder.
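
    In the MFC literature the per-stakeholder mean failure cost is typically expressed as a chain of matrix products (a stakes matrix, a dependency matrix, an impact matrix, and a threat probability vector). The sketch below follows that structure with small hypothetical matrices; the values are illustrative, not taken from the paper.

```python
def mat_vec(matrix, vec):
    """Multiply a matrix (list of rows) by a column vector."""
    return [sum(a * b for a, b in zip(row, vec)) for row in matrix]

# Hypothetical: 2 stakeholders x 2 requirements stakes matrix ST ($/h of
# lost service), requirements x components dependency matrix DP,
# components x threats impact matrix IM, threat probability vector PT.
ST = [[100.0, 40.0],
      [20.0,  80.0]]
DP = [[0.9, 0.1],
      [0.2, 0.8]]
IM = [[0.5, 0.0],
      [0.1, 0.6]]
PT = [0.01, 0.005]

# MFC = ST . DP . IM . PT, evaluated right to left
mfc = mat_vec(ST, mat_vec(DP, mat_vec(IM, PT)))
print(mfc)  # mean failure cost per stakeholder, $/h
```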

  9. Additional support for the TDK/MABL computer program

    NASA Technical Reports Server (NTRS)

    Nickerson, G. R.; Dunn, Stuart S.

    1993-01-01

    An advanced version of the Two-Dimensional Kinetics (TDK) computer program was developed under contract and released to the propulsion community in early 1989. Exposure of the code to this community indicated a need for improvements in certain areas. In particular, the TDK code needed to be adapted to the special requirements imposed by the Space Transportation Main Engine (STME) development program. This engine utilizes injection of the gas generator exhaust into the primary nozzle by means of a set of slots. The subsequent mixing of this secondary stream with the primary stream with finite rate chemical reaction can have a major impact on the engine performance and the thermal protection of the nozzle wall. In attempting to calculate this reacting boundary layer problem, the Mass Addition Boundary Layer (MABL) module of TDK was found to be deficient in several respects. For example, when finite rate chemistry was used to determine gas properties, (MABL-K option) the program run times became excessive because extremely small step sizes were required to maintain numerical stability. A robust solution algorithm was required so that the MABL-K option could be viable as a rocket propulsion industry design tool. Solving this problem was a primary goal of the phase 1 work effort.

  10. Low cost spacecraft computers: Oxymoron or future trend?

    NASA Technical Reports Server (NTRS)

    Manning, Robert M.

    1993-01-01

    Over the last few decades, application of current terrestrial computer technology in embedded spacecraft control systems has been expensive and fraught with many technical challenges. These challenges have centered on overcoming the extreme environmental constraints (protons, neutrons, gamma radiation, cosmic rays, temperature, vibration, etc.) that often preclude direct use of commercial off-the-shelf computer technology. Reliability, fault tolerance and power have also greatly constrained the selection of spacecraft control system computers. More recently, new constraints are being felt, cost and mass in particular, that have again narrowed the degrees of freedom spacecraft designers once enjoyed. This paper discusses these challenges, how they were previously overcome, how future trends in commercial computer technology will simplify (or hinder) selection of computer technology for spacecraft control applications, and what spacecraft electronic system designers can do now to circumvent them.

  11. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 11 2013-01-01 2013-01-01 false Engineering and cost studies-addition of generation... TO ELECTRIC LOANS AND GUARANTEES Construction Work Plans and Related Studies § 1710.253 Engineering... engineering and cost studies as specified by RUS. The studies shall cover a period from the beginning of...

  12. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 11 2010-01-01 2010-01-01 false Engineering and cost studies-addition of generation... TO ELECTRIC LOANS AND GUARANTEES Construction Work Plans and Related Studies § 1710.253 Engineering... engineering and cost studies as specified by RUS. The studies shall cover a period from the beginning of...

  13. Prospects for cost reductions from relaxing additional cross-border measures related to livestock trade.

    PubMed

    Hop, G E; Mourits, M C M; Slager, R; Oude Lansink, A G J M; Saatkamp, H W

    2013-05-01

    Compared with the domestic trade in livestock, intra-communal trade across the European Union (EU) is subject to costly, additional veterinary measures. Short-distance transportation just across a border requires more measures than long-distance domestic transportation, while the need for such additional cross-border measures can be questioned. This study examined the prospects for cost reductions from relaxing additional cross-border measures related to trade within the cross-border region of the Netherlands (NL) and Germany (GER); that is, North Rhine Westphalia and Lower Saxony. The study constructed a deterministic spreadsheet cost model to calculate the costs of both routine veterinary measures (standard measures that apply to both domestic and cross-border transport) and additional cross-border measures (extra measures that only apply to cross-border transport) as applied in 2010. This model determined costs by stakeholder, region and livestock sector, and studied the prospects for cost reduction by calculating the costs after the relaxation of additional cross-border measures. The selection criteria for relaxing these measures were (1) a low expected added value on preventing contagious livestock diseases, (2) no expected additional veterinary risks in case of relaxation of measures and (3) reasonable cost-saving possibilities. The total cost of routine veterinary measures and additional cross-border measures for the cross-border region was €22.1 million, 58% (€12.7 million) of which came from additional cross-border measures. Two-thirds of this €12.7 million resulted from the trade in slaughter animals. The main cost items were veterinary checks on animals (twice in the case of slaughter animals), export certification and control of export documentation. Four additional cross-border measures met the selection criteria for relaxation. The relaxation of these measures could save €8.2 million (€5.0 million for NL and €3.2 million for GER) annually.
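
    A deterministic cost model of the kind described above reduces to per-transport measure costs multiplied by transport volumes, with savings computed by summing only the relaxable measures. All figures in this sketch are hypothetical, not the study's.

```python
# Deterministic spreadsheet-style cost model sketch:
# total cost = sum over measures of (cost per transport x transports/year).
# Measure names follow the abstract; all figures are hypothetical.
transports_per_year = 50_000

measure_costs = {  # EUR per cross-border transport
    "veterinary_check": 85.0,
    "export_certification": 60.0,
    "document_control": 25.0,
}
relaxable = {"export_certification", "document_control"}

total = sum(c * transports_per_year for c in measure_costs.values())
saving = sum(measure_costs[m] * transports_per_year for m in relaxable)
print(f"annual cost EUR {total:,.0f}, potential saving EUR {saving:,.0f}")
```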

  14. 42 CFR 413.355 - Additional payment: QIO photocopy and mailing costs.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... RENAL DISEASE SERVICES; OPTIONAL PROSPECTIVELY DETERMINED PAYMENT RATES FOR SKILLED NURSING FACILITIES Prospective Payment for Skilled Nursing Facilities § 413.355 Additional payment: QIO photocopy and mailing costs. An additional payment is made to a skilled nursing facility in accordance with § 476.78 of...

  15. 42 CFR 413.355 - Additional payment: QIO photocopy and mailing costs.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... RENAL DISEASE SERVICES; OPTIONAL PROSPECTIVELY DETERMINED PAYMENT RATES FOR SKILLED NURSING FACILITIES Prospective Payment for Skilled Nursing Facilities § 413.355 Additional payment: QIO photocopy and mailing costs. An additional payment is made to a skilled nursing facility in accordance with § 476.78 of...

  16. 42 CFR 413.355 - Additional payment: QIO photocopy and mailing costs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... RENAL DISEASE SERVICES; OPTIONAL PROSPECTIVELY DETERMINED PAYMENT RATES FOR SKILLED NURSING FACILITIES Prospective Payment for Skilled Nursing Facilities § 413.355 Additional payment: QIO photocopy and mailing costs. An additional payment is made to a skilled nursing facility in accordance with § 476.78 of...

  17. The cost-effectiveness of HPV vaccination in addition to screening: a Dutch perspective.

    PubMed

    Setiawan, Didik; Luttjeboer, Jos; Westra, Tjalke Arend; Wilschut, Jan C; Suwantika, Auliya A; Daemen, Toos; Atthobari, Jarir; Wilffert, Bob; Postma, Maarten J

    2015-04-01

    Addition of the HPV vaccine to available cytological screening has been proposed to increase HPV-related cancer prevention. A comprehensive review on this combined strategy implemented in the Netherlands is lacking. For this review, we therefore analyzed all relevant studies on cost-effectiveness of HPV vaccines in combination with cervical screening in the Netherlands. Most of the studies agree that vaccination in pre-sexual-activity periods of life is cost-effective. Based on published sensitivity analyses, the incremental cost-effectiveness ratio was found to be mainly driven by vaccine cost and discount rates. Fewer vaccine doses, inclusion of additional benefits of these vaccines to prevent HPV-related non-cervical cancers and vaccination of males to further reduce the burden of HPV-induced cancers are three relevant options suggested to be investigated in upcoming economic evaluations. PMID:25482311

  18. Additional extensions to the NASCAP computer code, volume 3

    NASA Technical Reports Server (NTRS)

    Mandell, M. J.; Cooke, D. L.

    1981-01-01

    The ION computer code is designed to calculate charge exchange ion densities, electric potentials, plasma temperatures, and current densities external to a neutralized ion engine in R-Z geometry. The present version assumes the beam ion current and density to be known and specified, and the neutralizing electrons to originate from a hot-wire ring surrounding the beam orifice. The plasma is treated as being resistive, with an electron relaxation time comparable to the plasma frequency. Together with the thermal and electrical boundary conditions described below and other straightforward engine parameters, these assumptions suffice to determine the required quantities. The ION code, written in ASCII FORTRAN for UNIVAC 1100 series computers, is designed to be run interactively, although it can also be run in batch mode. The input is free-format, and the output is mainly graphical, using the machine-independent graphics developed for the NASCAP code. The executive routine calls the code's major subroutines in user-specified order, and the code allows great latitude for restart and parameter change.

  19. Additional extensions to the NASCAP computer code, volume 1

    NASA Technical Reports Server (NTRS)

    Mandell, M. J.; Katz, I.; Stannard, P. R.

    1981-01-01

    Extensions and revisions to a computer code that comprehensively analyzes problems of spacecraft charging (NASCAP) are documented. Using a fully three dimensional approach, it can accurately predict spacecraft potentials under a variety of conditions. Among the extensions are a multiple electron/ion gun test tank capability, and the ability to model anisotropic and time dependent space environments. Also documented are a greatly extended MATCHG program and the preliminary version of NASCAP/LEO. The interactive MATCHG code was developed into an extremely powerful tool for the study of material-environment interactions. The NASCAP/LEO, a three dimensional code to study current collection under conditions of high voltages and short Debye lengths, was distributed for preliminary testing.

  20. Cost-effectiveness of additional catheter-directed thrombolysis for deep vein thrombosis

    PubMed Central

    ENDEN, T.; RESCH, S.; WHITE, C.; WIK, H. S.; KLØW, N. E.; SANDSET, P. M.

    2013-01-01

    Summary Background Additional treatment with catheter-directed thrombolysis (CDT) has recently been shown to reduce post-thrombotic syndrome (PTS). Objectives To estimate the cost effectiveness of additional CDT compared with standard treatment alone. Methods Using a Markov decision model, we compared the two treatment strategies in patients with a high proximal deep vein thrombosis (DVT) and a low risk of bleeding. The model captured the development of PTS, recurrent venous thromboembolism and treatment-related adverse events within a lifetime horizon and the perspective of a third-party payer. Uncertainty was assessed with one-way and probabilistic sensitivity analyzes. Model inputs from the CaVenT study included PTS development, major bleeding from CDT and utilities for post DVT states including PTS. The remaining clinical inputs were obtained from the literature. Costs obtained from the CaVenT study, hospital accounts and the literature are expressed in US dollars ($); effects in quality adjusted life years (QALY). Results In base case analyzes, additional CDT accumulated 32.31 QALYs compared with 31.68 QALYs after standard treatment alone. Direct medical costs were $64 709 for additional CDT and $51 866 for standard treatment. The incremental cost-effectiveness ratio (ICER) was $20 429/QALY gained. One-way sensitivity analysis showed model sensitivity to the clinical efficacy of both strategies, but the ICER remained < $55 000/QALY over the full range of all parameters. The probability that CDT is cost effective was 82% at a willingness to pay threshold of $50 000/QALY gained. Conclusions Additional CDT is likely to be a cost-effective alternative to the standard treatment for patients with a high proximal DVT and a low risk of bleeding. PMID:23452204
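
    The ICER reported above is incremental cost divided by incremental effect. Plugging in the rounded figures from the abstract reproduces approximately the reported $20,429/QALY (the small difference comes from rounding of the published inputs):

```python
def icer(cost_new, cost_std, qaly_new, qaly_std):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return (cost_new - cost_std) / (qaly_new - qaly_std)

# Rounded figures from the abstract: $64,709 vs $51,866 and
# 32.31 vs 31.68 QALYs; result is close to the reported $20,429/QALY.
print(round(icer(64_709, 51_866, 32.31, 31.68)))
```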

  1. A Web-Based Computer-Tailored Alcohol Prevention Program for Adolescents: Cost-Effectiveness and Intersectoral Costs and Benefits

    PubMed Central

    2016-01-01

    Background Preventing excessive alcohol use among adolescents is important not only to foster individual and public health, but also to reduce alcohol-related costs inside and outside the health care sector. Computer tailoring can be both effective and cost-effective for working with many lifestyle behaviors, yet the available information on the cost-effectiveness of computer tailoring for reducing alcohol use by adolescents is limited as is information on the costs and benefits pertaining to sectors outside the health care sector, also known as intersectoral costs and benefits (ICBs). Objective The aim was to assess the cost-effectiveness of a Web-based computer-tailored intervention for reducing alcohol use and binge drinking by adolescents from a health care perspective (excluding ICBs) and from a societal perspective (including ICBs). Methods Data used were from the Alcoholic Alert study, a cluster randomized controlled trial with randomization at the level of schools into two conditions. Participants either played a game with tailored feedback on alcohol awareness after the baseline assessment (intervention condition) or received care as usual (CAU), meaning that they had the opportunity to play the game subsequent to the final measurement (waiting list control condition). Data were recorded at baseline (T0=January/February 2014) and after 4 months (T1=May/June 2014) and were used to calculate incremental cost-effectiveness ratios (ICERs), both from a health care perspective and a societal perspective. Stochastic uncertainty in the data was dealt with by using nonparametric bootstraps (5000 simulated replications). Additional sensitivity analyses were conducted based on excluding cost outliers. Subgroup cost-effectiveness analyses were conducted based on several background variables, including gender, age, educational level, religion, and ethnicity. Results From both the health care perspective and the societal perspective for both outcome measures, the

  2. Additive Manufacturing of Anatomical Models from Computed Tomography Scan Data.

    PubMed

    Gür, Y

    2014-12-01

    The purpose of the study presented here was to investigate the manufacturability of human anatomical models from Computed Tomography (CT) scan data via a 3D desktop printer which uses fused deposition modelling (FDM) technology. First, Digital Imaging and Communications in Medicine (DICOM) CT scan data were converted to 3D Standard Triangle Language (STL) format using the InVesalius digital imaging program. Once this STL file is obtained, a 3D physical version of the anatomical model can be fabricated by a desktop 3D FDM printer. As a case study, a patient's skull CT scan data was considered, and a tangible version of the skull was manufactured by a 3D FDM desktop printer. During the 3D printing process, the skull was built using acrylonitrile-butadiene-styrene (ABS) co-polymer plastic. The printed model showed that 3D FDM printing technology is able to fabricate anatomical models with high accuracy. As a result, the skull model can be used for preoperative surgical planning, medical training activities, and implant design and simulation, demonstrating the potential of FDM technology in the medical field. It will also improve communication between medical staff and patients. The current results indicate that a 3D desktop printer which uses FDM technology can be used to obtain accurate anatomical models. PMID:26336695
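
    The first step of the CT-to-STL pipeline described above is segmentation, commonly done by thresholding voxel intensities in Hounsfield units (bone is roughly above +300 HU, though the cutoff is scanner- and protocol-dependent). A toy sketch on a synthetic volume; real pipelines read a DICOM series and run a surface-extraction algorithm such as marching cubes before STL export:

```python
# Segmentation sketch: threshold a tiny synthetic "CT volume" in
# Hounsfield units. The threshold and data are illustrative only.
BONE_THRESHOLD_HU = 300  # common rule of thumb; scanner-dependent

volume = [  # 2 slices of a 3x3 scan, values in HU (air ~ -1000, soft tissue ~ 20-60)
    [[-1000, 40, 350], [30, 500, 45], [-1000, 35, 900]],
    [[-1000, 50, 320], [25, 480, 60], [-1000, 20, 700]],
]

mask = [[[v >= BONE_THRESHOLD_HU for v in row] for row in sl] for sl in volume]
bone_voxels = sum(v for sl in mask for row in sl for v in row)
print(bone_voxels)  # → 6
```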

  3. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 11 2014-01-01 2014-01-01 false Engineering and cost studies-addition of generation capacity. 1710.253 Section 1710.253 Agriculture Regulations of the Department of Agriculture (Continued... TO ELECTRIC LOANS AND GUARANTEES Construction Work Plans and Related Studies § 1710.253...

  4. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 11 2011-01-01 2011-01-01 false Engineering and cost studies-addition of generation capacity. 1710.253 Section 1710.253 Agriculture Regulations of the Department of Agriculture (Continued) RURAL UTILITIES SERVICE, DEPARTMENT OF AGRICULTURE GENERAL AND PRE-LOAN POLICIES AND PROCEDURES COMMON TO ELECTRIC LOANS AND...

  5. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 11 2012-01-01 2012-01-01 false Engineering and cost studies-addition of generation capacity. 1710.253 Section 1710.253 Agriculture Regulations of the Department of Agriculture (Continued) RURAL UTILITIES SERVICE, DEPARTMENT OF AGRICULTURE GENERAL AND PRE-LOAN POLICIES AND PROCEDURES COMMON TO ELECTRIC LOANS AND...

  6. A low cost human computer interface based on eye tracking.

    PubMed

    Hiley, Jonathan B; Redekopp, Andrew H; Fazel-Rezai, Reza

    2006-01-01

    This paper describes the implementation of a human computer interface based on eye tracking. Commercial systems are currently available but have seen limited use, mainly due to their high cost. The system described in this paper was designed to be low cost and unobtrusive. The technique was video-oculography assisted by corneal reflections. An off-the-shelf CCD webcam was used to capture images. The images were analyzed in software to extract key features of the eye. The user's gaze point was then calculated based on the relative positions of these features. The system is capable of calculating eye gaze in real time to provide responsive interaction. A throughput of eight gaze points per second was achieved. The accuracy of the fixations based on the calculated eye gazes was within 1 cm of the on-screen gaze location. By developing a low-cost system, this technology is made accessible to a wider range of applications. PMID:17946167
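
    In corneal-reflection video-oculography, the gaze point is typically recovered by mapping the pupil-to-glint offset in the camera image onto screen coordinates via a calibration fit. A minimal sketch using a per-axis affine map fit from two calibration samples; all coordinates are hypothetical, and real systems use more calibration points and higher-order mappings:

```python
def affine_1d(p1, p2):
    """Return f(x) = a*x + b mapping a feature coordinate to a screen
    coordinate, fit from two (feature, screen) calibration samples."""
    (x1, s1), (x2, s2) = p1, p2
    a = (s2 - s1) / (x2 - x1)
    b = s1 - a * x1
    return lambda x: a * x + b

# Hypothetical calibration: pupil-to-corneal-reflection offsets (pixels
# in the camera image) observed while the user fixates known screen points.
gaze_x = affine_1d((-10.0, 0.0), (10.0, 1280.0))  # horizontal axis
gaze_y = affine_1d((-8.0, 0.0), (8.0, 1024.0))    # vertical axis

print(gaze_x(0.0), gaze_y(4.0))  # → 640.0 768.0
```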

  7. Additive Manufacturing and High-Performance Computing: a Disruptive Latent Technology

    NASA Astrophysics Data System (ADS)

    Goodwin, Bruce

    2015-03-01

    This presentation will discuss the relationship between recent advances in Additive Manufacturing (AM) technology, High-Performance Computing (HPC) simulation and design capabilities, and related advances in Uncertainty Quantification (UQ), and then examine their impacts upon national and international security. The presentation surveys how AM accelerates the fabrication process, while HPC combined with UQ provides a fast track for the engineering design cycle. The combination of AM and HPC/UQ almost eliminates the iterative engineering design and prototype cycle, thereby dramatically reducing cost of production and time-to-market. These methods thereby present significant benefits for US national interests, both civilian and military, in an age of austerity. Finally, considering cyber security issues and the advent of the "cloud," these disruptive, currently latent technologies may well enable proliferation and so challenge both nuclear and non-nuclear aspects of international security.

  8. Additives

    NASA Technical Reports Server (NTRS)

    Smalheer, C. V.

    1973-01-01

    The chemistry of lubricant additives is discussed to show what the additives are chemically and what functions they perform in the lubrication of various kinds of equipment. Current theories regarding the mode of action of lubricant additives are presented. The additive groups discussed include the following: (1) detergents and dispersants, (2) corrosion inhibitors, (3) antioxidants, (4) viscosity index improvers, (5) pour point depressants, and (6) antifouling agents.

  9. Low Cost Injection Mold Creation via Hybrid Additive and Conventional Manufacturing

    SciTech Connect

    Dehoff, Ryan R.; Watkins, Thomas R.; List, III, Frederick Alyious; Carver, Keith; England, Roger

    2015-12-01

    The purpose of the proposed project between Cummins and ORNL is to significantly reduce the cost of the tooling (machining and materials) required to create injection molds for plastic components. Presently, the high cost of this tooling forces the design decision to make cast aluminum parts, because Cummins' typical production volumes are too low for injection-molded plastic parts to be cost effective once the amortized cost of the injection molding tooling is included. In addition to reducing component weight, polymer injection molding opens the opportunity for alternative cooling methods via nitrogen gas. Nitrogen gas cooling offers an environmentally and economically attractive cooling option, if the mold can be manufactured economically. In this project, a current injection molding design was optimized for cooling using nitrogen gas. The various components of the injection mold tooling were fabricated using the Renishaw powder bed laser additive manufacturing technology. Subsequent machining was performed on the as-deposited components to form a working assembly. The injection mold is scheduled to be tested in a production setting at a commercial vendor selected by Cummins.

  10. SideRack: A Cost-Effective Addition to Commercial Zebrafish Housing Systems

    PubMed Central

    Burg, Leonard; Gill, Ryan; Balciuniene, Jorune

    2014-01-01

    Abstract Commercially available aquatic housing systems provide excellent and relatively trouble-free hardware for rearing and housing juvenile as well as adult zebrafish. However, the cost of such systems is quite high and potentially prohibitive for smaller educational and research institutions. The need for tank space prompted us to experiment with various additions to our existing Aquaneering system. We also noted that high water exchange rates typical in commercial systems are suboptimal for quick growth of juvenile fish. We devised a housing system we call “SideRack,” which contains 20 large tanks with air supply and slow water circulation. It enables cost-effective expansion of existing fish facility, with a key additional benefit of increased growth and maturation rates of juvenile fish. PMID:24611601

  11. The performance of low-cost commercial cloud computing as an alternative in computational chemistry.

    PubMed

    Thackston, Russell; Fortenberry, Ryan C

    2015-05-01

    The growth of commercial cloud computing (CCC) as a viable means of computational infrastructure is largely unexplored for the purposes of quantum chemistry. In this work, the PSI4 suite of computational chemistry programs is installed on five different types of Amazon Web Services CCC platforms. The performance for a set of electronically excited state single-point energies is compared between these CCC platforms and typical, "in-house" physical machines. Further considerations are made for the number of cores or virtual CPUs (vCPUs, for the CCC platforms), but no considerations are made for full parallelization of the program (even though parallelization of the BLAS library is implemented), complete high-performance computing cluster utilization, or steal time. Even with this most pessimistic view of the computations, CCC resources are shown to be more cost effective for significant numbers of typical quantum chemistry computations. Large numbers of large computations are still best utilized by more traditional means, but smaller-scale research may be more effectively undertaken through CCC services. PMID:25753841
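
    The cost-effectiveness comparison above amounts to a cost-per-job calculation: an on-demand hourly rate on the cloud side versus a purchase price amortized over the machine's service life on the in-house side. A sketch with entirely hypothetical prices and runtimes:

```python
# Break-even sketch: cost per batch of computations on an on-demand
# cloud instance vs. an amortized in-house workstation. All prices,
# runtimes, and lifetimes are hypothetical.
def cloud_cost(jobs, hours_per_job, hourly_rate):
    """Total cost of running `jobs` computations on an on-demand instance."""
    return jobs * hours_per_job * hourly_rate

def inhouse_cost(jobs, hours_per_job, purchase_price, lifetime_hours):
    """Same workload on a purchased machine, costed at its amortized hourly rate."""
    return jobs * hours_per_job * (purchase_price / lifetime_hours)

jobs, hrs = 200, 3.0
print(cloud_cost(jobs, hrs, 0.40))                # on-demand at $0.40/h
print(inhouse_cost(jobs, hrs, 6000.0, 3 * 8760))  # $6000 machine over 3 years
```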

  12. Manual of phosphoric acid fuel cell power plant cost model and computer program

    NASA Technical Reports Server (NTRS)

    Lu, C. Y.; Alkasab, K. A.

    1984-01-01

    Cost analysis of a phosphoric acid fuel cell power plant comprises two parts: a method for estimating system capital costs, and an economic analysis that determines the levelized annual cost of operating the system used in the capital cost estimation. A FORTRAN computer program has been developed for this cost analysis.

  13. High-performance, cost-effective computer and video interfacing to megapixel flat panel displays (FPDs)

    NASA Astrophysics Data System (ADS)

    Wedding, Carol A.; Stoller, Ray A.

    1994-04-01

    In the Information Age of multimedia computing and presentation, video-telephone conferencing, and interactive television, display designs face demands for ever higher resolution, greater definition, an increased color palette, higher speed, and larger size. These performance demands are required of equipment that is space efficient, transportable, and usable in commercial, industrial, and military applications. FPD designs are flourishing because the CRT is neither space efficient nor easily transportable, nor cost effective in larger sizes and in ruggedized form. In addition, communications and computing systems for high-definition imaging information are based on digital rather than analog interfacing. A video digital interface (VDI) is therefore more cost effective and higher performance for FPDs, which use data directly from the computer system and circumvent the analog and rescan conversions required for CRTs. Photonics has developed and produces high-resolution AC plasma FPDs that can accept imaging information in either analog or digital form at up to 75 frames per second, 1280 X 1024 full-color pixel resolution, and 8 bits of gray scale per color channel. This paper explores the cost-effective and high-performance capabilities of the FPD-VDI and how it integrates with high-definition computer and communications systems.

  14. Computer program to perform cost and weight analysis of transport aircraft. Volume 1: Summary

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A digital computer program for evaluating the weight and costs of advanced transport designs was developed. The resultant program, intended for use at the preliminary design level, incorporates both batch mode and interactive graphics run capability. The basis of the weight and cost estimation method developed is a unique way of predicting the physical design of each detail part of a vehicle structure at a time when only configuration concept drawings are available. In addition, the technique relies on methods to predict the precise manufacturing processes and the associated material required to produce each detail part. Weight data are generated in four areas of the program. Overall vehicle system weights are derived on a statistical basis as part of the vehicle sizing process. Theoretical weights, actual weights, and the weight of the raw material to be purchased are derived as part of the structural synthesis and part definition processes based on the computed part geometry.

  15. Optimizing breast cancer follow-up: diagnostic value and costs of additional routine breast ultrasound.

    PubMed

    Wojcinski, Sebastian; Farrokh, Andre; Hille, Ursula; Hirschauer, Elke; Schmidt, Werner; Hillemanns, Peter; Degenhardt, Friedrich

    2011-02-01

    A total of 2,546,325 breast cancer survivors are estimated to live in the United States. Organized breast cancer follow-up programs do not generally include breast ultrasound in asymptomatic women. The purpose of our prospective study was to investigate the efficacy of breast ultrasound in detecting previously occult recurrences. A total of 735 eligible patients with a history of breast cancer were recruited. We assessed the same patient population before (routine follow-up program) and after (study follow-up program) the introduction of an additional ultrasound examination. In the routine follow-up program, 245 of 735 patients (33.3% [95% confidence interval (CI): 29.9-36.7]) had an ultrasound due to abnormal local or mammographic findings. 490 of 735 patients (66.7% [95% CI: 63.3-70.1]) were initially considered asymptomatic and received an additional ultrasound exclusively within the study follow-up program. All positive examination results were followed by accelerated core needle biopsy. The routine follow-up program led to a biopsy in 66 of 735 patients (9.0%), revealing a recurrent cancer in 27 cases (3.7%). The study follow-up program with the additional ultrasound led to another 21 biopsies, raising the proportion of patients who had to undergo a biopsy from 9.0% (95% CI: 6.9-11.1) to 11.8% (95% CI: 9.5-14.2). Finally, we diagnosed a previously occult malignant lesion in an additional six patients following this protocol. Therefore, the rate of detected recurrences rose from 3.7% (95% CI: 2.3-5.0) in the routine follow-up program to 4.5% (95% CI: 3.0-6.0) in the study follow-up program (p = 0.041). Negative side effects were the additional costs (the costs per detected malignancy in the routine follow-up program were $2455.69; the costs for each additionally detected malignancy in the study follow-up program were $7580.30), the higher overall biopsy rate (9.0% vs. 11.8%) and the elevated benign biopsy rate (59.1% vs. 71.4%). 
Regarding these results, the
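The proportions and intervals quoted in the abstract above (e.g., 245/735 = 33.3% [95% CI: 29.9-36.7]) are reproducible with a simple Wald normal-approximation interval for a binomial proportion; attributing exactly this method to the authors is an assumption.

```python
# Wald (normal-approximation) confidence interval for a proportion,
# checked against the 245/735 figure quoted in the abstract.
import math

def wald_ci(successes, n, z=1.96):
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p - half, p + half

lo, hi = wald_ci(245, 735)
print(f"245/735 = {245/735:.1%}, 95% CI: {lo:.1%}-{hi:.1%}")
```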

  16. Teaching with Technology: The Classroom Manager. Cost-Conscious Computing.

    ERIC Educational Resources Information Center

    Smith, Rhea; And Others

    1992-01-01

    Teachers discuss how to make the most of technology in the classroom during a tight economy. Ideas include recycling computer printer ribbons, buying replacement batteries for computer power supply packs, upgrading via software, and soliciting donated computer equipment. (SM)

  17. Pupillary dynamics reveal computational cost in sentence planning.

    PubMed

    Sevilla, Yamila; Maldonado, Mora; Shalóm, Diego E

    2014-01-01

    This study investigated the computational cost associated with grammatical planning in sentence production. We measured people's pupillary responses as they produced spoken descriptions of depicted events. We manipulated the syntactic structure of the target by training subjects to use different types of sentences following a colour cue. The results showed higher increase in pupil size for the production of passive and object dislocated sentences than for active canonical subject-verb-object sentences, indicating that more cognitive effort is associated with more complex noncanonical thematic order. We also manipulated the time at which the cue that triggered structure-building processes was presented. Differential increase in pupil diameter for more complex sentences was shown to rise earlier as the colour cue was presented earlier, suggesting that the observed pupillary changes are due to differential demands in relatively independent structure-building processes during grammatical planning. Task-evoked pupillary responses provide a reliable measure to study the cognitive processes involved in sentence production. PMID:24712982

  18. 42 CFR 417.588 - Computation of adjusted average per capita cost (AAPCC).

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 3 2011-10-01 2011-10-01 false Computation of adjusted average per capita cost... MEDICAL PLANS, AND HEALTH CARE PREPAYMENT PLANS Medicare Payment: Risk Basis § 417.588 Computation of adjusted average per capita cost (AAPCC). (a) Basic data. In computing the AAPCC, CMS uses the U.S....

  19. 42 CFR 417.588 - Computation of adjusted average per capita cost (AAPCC).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Computation of adjusted average per capita cost... MEDICAL PLANS, AND HEALTH CARE PREPAYMENT PLANS Medicare Payment: Risk Basis § 417.588 Computation of adjusted average per capita cost (AAPCC). (a) Basic data. In computing the AAPCC, CMS uses the U.S....

  20. 25 CFR 170.602 - If a tribe incurs unforeseen construction costs, can it get additional funds?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Funding Process § 170.602 If a tribe incurs unforeseen construction costs, can it get additional funds... circumstances of the construction process (i.e., cost overruns). If the Secretary is unable to fund the... sufficient additional funds are awarded. (See 25 CFR 900.130(e).) Miscellaneous Provisions...

  1. 25 CFR 170.602 - If a tribe incurs unforeseen construction costs, can it get additional funds?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Funding Process § 170.602 If a tribe incurs unforeseen construction costs, can it get additional funds... circumstances of the construction process (i.e., cost overruns). If the Secretary is unable to fund the... sufficient additional funds are awarded. (See 25 CFR 900.130(e).) Miscellaneous Provisions...

  2. 25 CFR 170.602 - If a tribe incurs unforeseen construction costs, can it get additional funds?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... sufficient additional funds are awarded. (See 25 CFR 900.130(e).) Miscellaneous Provisions ... 25 Indians 1 2010-04-01 2010-04-01 false If a tribe incurs unforeseen construction costs, can it... Funding Process § 170.602 If a tribe incurs unforeseen construction costs, can it get additional...

  3. 25 CFR 170.602 - If a tribe incurs unforeseen construction costs, can it get additional funds?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... sufficient additional funds are awarded. (See 25 CFR 900.130(e).) Miscellaneous Provisions ... 25 Indians 1 2013-04-01 2013-04-01 false If a tribe incurs unforeseen construction costs, can it... Funding Process § 170.602 If a tribe incurs unforeseen construction costs, can it get additional...

  4. 19 CFR 201.14 - Computation of time, additional hearings, postponements, continuances, and extensions of time.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Computation of time, additional hearings, postponements, continuances, and extensions of time. 201.14 Section 201.14 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION GENERAL RULES OF GENERAL APPLICATION Initiation and Conduct of Investigations § 201.14 Computation of time,...

  5. Cost-Effective Additive Manufacturing in Space: HELIOS Technology Challenge Guide

    NASA Technical Reports Server (NTRS)

    DeVieneni, Alayna; Velez, Carlos Andres; Benjamin, David; Hollenbeck, Jay

    2012-01-01

    Welcome to the HELIOS Technology Challenge Guide. This document is intended to serve as a general road map for participants of the HELIOS Technology Challenge [HTC] Program and the associated inaugural challenge: HTC-01: Cost-Effective Additive Manufacturing in Space. Please note that this guide is not a rule book and is not meant to hinder the development of innovative ideas. Its primary goal is to highlight the objectives of the HTC-01 Challenge and to describe possible solution routes and pitfalls that such technology may encounter in space. Please also note that participants wishing to demonstrate any hardware developed under this program during any future HELIOS Technology Challenge showcase event(s) may be subject to event regulations to be published separately at a later date.

  6. Reduction of computer usage costs in predicting unsteady aerodynamic loadings caused by control surface motions: Analysis and results

    NASA Technical Reports Server (NTRS)

    Rowe, W. S.; Sebastian, J. D.; Petrarca, J. R.

    1979-01-01

    Results of theoretical and numerical investigations conducted to develop economical computing procedures were applied to an existing computer program that predicts unsteady aerodynamic loadings caused by leading- and trailing-edge control surface motions in subsonic compressible flow. Large reductions in computing costs were achieved by removing the spanwise singularity of the downwash integrand and evaluating its effect separately in closed form. Additional reductions were obtained by modifying the incremental pressure term that accounts for downwash singularities at control surface edges. Accuracy of theoretical predictions of unsteady loading at high reduced frequencies was increased by applying new pressure expressions that exactly satisfied the high-frequency boundary conditions of an oscillating control surface. Comparative computer results indicated that the revised procedures provide more accurate predictions of unsteady loadings as well as reductions of 50 to 80 percent in computer usage costs.

  7. Additive Manufacturing for Cost Efficient Production of Compact Ceramic Heat Exchangers and Recuperators

    SciTech Connect

    Shulman, Holly; Ross, Nicole

    2015-10-30

    An additive manufacture technique known as laminated object manufacturing (LOM) was used to fabricate compact ceramic heat exchanger prototypes. LOM uses precision CO2 laser cutting of ceramic green tapes, which are then precision stacked to build a 3D object with fine internal features. Modeling was used to develop prototype designs and predict the thermal response, stress, and efficiency in the ceramic heat exchangers. Build testing and materials analyses were used to provide feedback for the design selection. During this development process, laminated object manufacturing protocols were established. This included laser optimization, strategies for fine feature integrity, lamination fluid control, green handling, and firing profile. Three full size prototypes were fabricated using two different designs. One prototype was selected for performance testing. During testing, cross talk leakage prevented the application of a high pressure differential, however, the prototype was successful at withstanding the high temperature operating conditions (1300 °F). In addition, analysis showed that the bulk of the part did not have cracks or leakage issues. This led to the development of a module method for next generation LOM heat exchangers. A scale-up cost analysis showed that given a purpose built LOM system, these ceramic heat exchangers would be affordable for the applications.

  8. Reduction of computer usage costs in predicting unsteady aerodynamic loadings caused by control surface motion. Addendum to computer program description

    NASA Technical Reports Server (NTRS)

    Rowe, W. S.; Petrarca, J. R.

    1980-01-01

    Changes that provide increased accuracy and increased user flexibility in the prediction of unsteady loadings caused by control surface motions are described. Analysis flexibility is increased by reducing the restrictions on the location of the downwash stations relative to the leading edge and the edges of the control surface boundaries. Analysis accuracy is increased in predicting unsteady loading at high Mach number analysis conditions through the use of additional chordwise downwash stations. User guidelines are presented to enlarge the analysis capabilities for unusual wing-control-surface configurations. Comparative results indicate that the revised procedures provide accurate predictions of unsteady loadings as well as reductions of 40 to 75 percent in the computer usage cost required by previous versions of this program.

  9. The Processing Cost of Reference Set Computation: Acquisition of Stress Shift and Focus

    ERIC Educational Resources Information Center

    Reinhart, Tanya

    2004-01-01

    Reference set computation -- the construction of a (global) comparison set to determine whether a given derivation is appropriate in context -- comes with a processing cost. I argue that this cost is directly visible at the acquisition stage: In those linguistic areas in which it has been independently established that such computation is indeed…

  10. Some Useful Cost-Benefit Criteria for Evaluating Computer-Based Test Delivery Models and Systems

    ERIC Educational Resources Information Center

    Luecht, Richard M.

    2005-01-01

    Computer-based testing (CBT) is typically implemented using one of three general test delivery models: (1) multiple fixed testing (MFT); (2) computer-adaptive testing (CAT); or (3) multistage testing (MST). This article reviews some of the real cost drivers associated with CBT implementation--focusing on item production costs, the costs…

  11. COMPUTER PROGRAM FOR CALCULATING THE COST OF DRINKING WATER TREATMENT SYSTEMS

    EPA Science Inventory

    This FORTRAN computer program calculates the construction and operation/maintenance costs for 45 centralized unit treatment processes for water supply. The calculated costs are based on various design parameters and raw water quality. These cost data are applicable to small size ...

  12. Comparing the Cost-Effectiveness of Tutoring and Computer-Based Instruction.

    ERIC Educational Resources Information Center

    Niemiec, Richard P.; And Others

    1989-01-01

    Compares the effects of peer tutoring and computer-based instruction on student achievement and motivation through a meta-analysis of research. Cost effectiveness is also investigated via ratios which combine the effects of the intervention with estimates of implementation costs, and an appendix includes reviews of 10 cost-effectiveness research…

  13. Energy Drain by Computers Stifles Efforts at Cost Control

    ERIC Educational Resources Information Center

    Keller, Josh

    2009-01-01

    The high price of storing and processing data is hurting colleges and universities across the country. In response, some institutions are embracing greener technologies to keep costs down and help the environment. But compared with other industries, colleges and universities have been slow to understand the problem and to adopt energy-saving…

  14. Low-Budget Computer Programming in Your School (An Alternative to the Cost of Large Computers). Illinois Series on Educational Applications of Computers. No. 14.

    ERIC Educational Resources Information Center

    Dennis, J. Richard; Thomson, David

    This paper is concerned with a low cost alternative for providing computer experience to secondary school students. The brief discussion covers the programmable calculator and its relevance for teaching the concepts and the rudiments of computer programming and for computer problem solving. A list of twenty-five programming activities related to…

  15. Commodity CPU-GPU System for Low-Cost , High-Performance Computing

    NASA Astrophysics Data System (ADS)

    Wang, S.; Zhang, S.; Weiss, R. M.; Barnett, G. A.; Yuen, D. A.

    2009-12-01

    We have put together a desktop computer system for under $2,500 from commodity components, consisting of one quad-core CPU (Intel Core 2 Quad Q6600 Kentsfield 2.4GHz) and two high-end GPUs (nVidia's GeForce GTX 295 and Tesla C1060). A 1200 watt power supply is required. On this commodity system, we have constructed an easy-to-use hybrid computing environment, in which Message Passing Interface (MPI) is used for managing the workloads, for transferring the data among different GPU devices, and for minimizing the demand on CPU memory. The test runs using the MAGMA (Matrix Algebra on GPU and Multicore Architectures) library show that the speed-ups for double precision calculations can be greater than 10 (GPU vs. CPU) and are larger still (> 20) for single precision calculations. In addition we have enabled the combination of Matlab with CUDA for interactive visualization through MPI, i.e., two GPU devices are used for simulation and one GPU device is used for visualizing the computing results as the simulation runs. Our experience with this commodity system has shown that running multiple applications on one GPU device or running one application across multiple GPU devices can be done as conveniently as on CPUs. With NVIDIA CEO Jen-Hsun Huang's claim that over the next 6 years GPU processing power will increase by 570x compared to the 3x for CPUs, future low-cost commodity computers such as ours may be a remedy for the long wait queues of the world's supercomputers, especially for small- and mid-scale computation. Our goal here is to explore the limits and capabilities of this emerging technology and to get ourselves ready to run large-scale simulations on the next generation of computing environment, which we believe will hybridize CPU and GPU architectures.

  16. 19 CFR 210.6 - Computation of time, additional hearings, postponements, continuances, and extensions of time.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Computation of time, additional hearings, postponements, continuances, and extensions of time. 210.6 Section 210.6 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION INVESTIGATIONS OF UNFAIR PRACTICES IN IMPORT TRADE ADJUDICATION AND ENFORCEMENT Rules of General Applicability §...

  17. Computers and Social Knowledge; Opportunities and Opportunity Cost.

    ERIC Educational Resources Information Center

    Hartoonian, Michael

    Educators must use computers to move society beyond the information age and toward the age of wisdom. The movement toward knowledge and wisdom constitutes an evolution beyond the "third wave" or electronic/information age, the phase of history in which, according to Alvin Toffler, we are now living. We are already moving into a fourth wave, the…

  18. COMPUTER PROGRAMS FOR ESTIMATING THE COST OF PARTICULATE CONTROL EQUIPMENT

    EPA Science Inventory

    The report describes an interactive computer program, written to estimate the capital and operating expenses of electrostatic precipitators, fabric filters, and venturi scrubbers used on coal-fired boilers. The program accepts as input the current interest rate, coal analysis, em...

  19. Estimating development cost for a tailored interactive computer program to enhance colorectal cancer screening compliance.

    PubMed

    Lairson, David R; Chang, Yu-Chia; Bettencourt, Judith L; Vernon, Sally W; Greisinger, Anthony

    2006-01-01

    The authors used an actual-work estimate method to estimate the cost of developing a tailored interactive computer education program to improve compliance with colorectal cancer screening guidelines in a large multi-specialty group medical practice. Resource use was prospectively collected from time logs, administrative records, and a design and computing subcontract. Sensitivity analysis was performed to examine the uncertainty of the overhead cost rate and other parameters. The cost of developing the system was $328,866. The development cost was $52.79 per patient when amortized over a 7-year period with a cohort of 1,000 persons. About 20% of the cost was incurred in defining the theoretic framework and supporting literature, constructing the variables and survey, and conducting focus groups. About 41% of the cost was for developing the messages, algorithms, and constructing program elements, and the remaining cost was to create and test the computer education program. About 69% of the cost was attributable to personnel expenses. Development cost is rarely estimated but is important for feasibility studies and ex-ante economic evaluations of alternative interventions. The findings from this study may aid decision makers in planning, assessing, budgeting, and pricing development of tailored interactive computer-based interventions. PMID:16799126
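The stated per-patient figure is consistent with annuitizing the $328,866 development cost over 7 years via a capital-recovery factor at roughly a 3% discount rate, then dividing the annual payment by the 1,000-person cohort. The 3% rate is inferred from the quoted numbers, not stated in the abstract.

```python
# Annuitized (capital-recovery) amortization of a one-time development cost.
# The 3% discount rate is an assumption that reproduces the quoted figure.

def annualized_cost(principal, rate, years):
    """Annual annuity payment that repays `principal` over `years`
    at discount rate `rate` (capital-recovery factor)."""
    crf = rate / (1 - (1 + rate) ** -years)
    return principal * crf

per_patient = annualized_cost(328_866, 0.03, 7) / 1000
print(f"approx. ${per_patient:.2f} per patient per year")
```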

  20. Estimating Development Cost for a Tailored Interactive Computer Program to Enhance Colorectal Cancer Screening Compliance

    PubMed Central

    Lairson, David R.; Chang, Yu-Chia; Bettencourt, Judith L.; Vernon, Sally W.; Greisinger, Anthony

    2006-01-01

    The authors used an actual-work estimate method to estimate the cost of developing a tailored interactive computer education program to improve compliance with colorectal cancer screening guidelines in a large multi-specialty group medical practice. Resource use was prospectively collected from time logs, administrative records, and a design and computing subcontract. Sensitivity analysis was performed to examine the uncertainty of the overhead cost rate and other parameters. The cost of developing the system was $328,866. The development cost was $52.79 per patient when amortized over a 7-year period with a cohort of 1,000 persons. About 20% of the cost was incurred in defining the theoretic framework and supporting literature, constructing the variables and survey, and conducting focus groups. About 41% of the cost was for developing the messages, algorithms, and constructing program elements, and the remaining cost was to create and test the computer education program. About 69% of the cost was attributable to personnel expenses. Development cost is rarely estimated but is important for feasibility studies and ex-ante economic evaluations of alternative interventions. The findings from this study may aid decision makers in planning, assessing, budgeting, and pricing development of tailored interactive computer-based interventions. PMID:16799126

  1. Method for computing marginal costs associated with on-site energy technologies

    SciTech Connect

    Bright, R.; Davitian, H.

    1980-08-01

    A method for calculating long-run marginal costs for an electric utility is described. The method is especially suitable for computing the marginal costs associated with the use of small on-site energy technologies, i.e., cogenerators, solar heating and hot water systems, wind generators, etc., which are interconnected with electric utilities. In particular, both the costs a utility avoids when power is delivered to it from a facility with an on-site generator and the marginal cost to the utility of supplementary power sold to the facility can be calculated. A utility capacity expansion model is used to compute changes in the utility's costs when loads are modified by the use of the on-site technology. Changes in capacity-related costs and production costs are thus computed in an internally consistent manner. The variable nature of the generation/load pattern of the on-site technology is treated explicitly. The method yields several measures of utility costs that can be used to develop rates based on marginal avoided costs for on-site technologies, as well as marginal-cost rates for conventional utility customers.
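The avoided-cost idea above can be sketched as the difference between the utility's cost with the original load and with the load net of on-site generation. The quadratic cost curve below is a hypothetical stand-in for the capacity-expansion model the report actually uses.

```python
# Toy illustration of avoided cost: run the cost model on the base load and
# on the load net of on-site output; the difference is the avoided cost.
# The convex cost function is illustrative, not the report's model.

def production_cost(load_mwh):
    # rising marginal cost with load (toy convex curve, $)
    return 20.0 * load_mwh + 0.001 * load_mwh ** 2

def avoided_cost(base_load, onsite_output):
    """Cost the utility avoids when on-site generation offsets load."""
    return production_cost(base_load) - production_cost(base_load - onsite_output)

saved = avoided_cost(10_000, 500)   # dollars avoided by 500 MWh of on-site output
print(saved)
```

Because the cost curve is convex, the avoided cost per MWh exceeds the average cost, which is why marginal (not average) costing matters for on-site technology rates.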

  2. A low computation cost method for seizure prediction.

    PubMed

    Zhang, Yanli; Zhou, Weidong; Yuan, Qi; Wu, Qi

    2014-10-01

    The dynamic changes of electroencephalograph (EEG) signals in the period prior to epileptic seizures play a major role in seizure prediction. This paper proposes a low-computation-cost seizure prediction algorithm that combines a fractal dimension with a machine learning algorithm. The presented algorithm extracts the Higuchi fractal dimension (HFD) of EEG signals as a feature and classifies the patient's preictal or interictal state with Bayesian linear discriminant analysis (BLDA) as the classifier. The outputs of BLDA are smoothed by a Kalman filter to reduce possible sporadic and isolated false alarms, and the final prediction results are then produced using a thresholding procedure. The algorithm was evaluated on the intracranial EEG recordings of 21 patients in the Freiburg EEG database. For seizure occurrence periods of 30 min and 50 min, our algorithm obtained an average sensitivity of 86.95% and 89.33%, an average false prediction rate of 0.20/h, and an average prediction time of 24.47 min and 39.39 min, respectively. The results confirm that changes in HFD can serve as a precursor of ictal activity and be used to distinguish between interictal and preictal epochs. Both HFD and the BLDA classifier have low computational complexity. All of this makes the proposed algorithm suitable for real-time seizure prediction. PMID:25062892
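The feature extractor named above, the Higuchi fractal dimension, is a standard algorithm; a minimal sketch follows, with `kmax` as a tuning parameter the abstract does not specify.

```python
# Higuchi fractal dimension (HFD): slope of log L(k) vs. log(1/k), where
# L(k) is the mean normalized curve length at subsampling interval k.
import math

def higuchi_fd(x, kmax=8):
    n = len(x)
    log_k, log_l = [], []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            num = (n - 1 - m) // k          # number of increments at offset m
            if num < 1:
                continue
            dist = sum(abs(x[m + i * k] - x[m + (i - 1) * k])
                       for i in range(1, num + 1))
            # normalize by the number of increments and the interval k
            lengths.append(dist * (n - 1) / (num * k) / k)
        log_k.append(math.log(1.0 / k))
        log_l.append(math.log(sum(lengths) / len(lengths)))
    # least-squares slope of log L(k) against log(1/k) is the HFD estimate
    mk = sum(log_k) / len(log_k)
    ml = sum(log_l) / len(log_l)
    return (sum((a - mk) * (b - ml) for a, b in zip(log_k, log_l))
            / sum((a - mk) ** 2 for a in log_k))

# Sanity check: a straight line is one-dimensional, so HFD should be near 1.
line = [0.01 * i for i in range(1000)]
print(higuchi_fd(line))
```

The per-epoch cost is a few passes over the samples, which is why the abstract can describe the overall method as low in computational complexity.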

  3. Performance, Agility and Cost of Cloud Computing Services for NASA GES DISC Giovanni Application

    NASA Astrophysics Data System (ADS)

    Pham, L.; Chen, A.; Wharton, S.; Winter, E. L.; Lynnes, C.

    2013-12-01

    The NASA Goddard Earth Science Data and Information Services Center (GES DISC) is investigating the performance, agility and cost of Cloud computing for GES DISC applications. Giovanni (Geospatial Interactive Online Visualization ANd aNalysis Infrastructure), one of the core applications at the GES DISC for online climate-related Earth science data access, subsetting, analysis, visualization, and downloading, was used to evaluate the feasibility and effort of porting an application to the Amazon Cloud Services platform. The performance and the cost of running Giovanni on the Amazon Cloud were compared to similar parameters for the GES DISC local operational system. A Giovanni time-series analysis of aerosol absorption optical depth (388 nm) from OMI (Ozone Monitoring Instrument)/Aura was selected for these comparisons. All required data were pre-cached in both the Cloud and the local system to avoid data transfer delays. The 3-, 6-, 12-, and 24-month data were used for analysis on the Cloud and local system respectively, and the processing times for the analysis were used to evaluate system performance. To investigate application agility, Giovanni was installed and tested on multiple Cloud platforms. The cost of using a Cloud computing platform mainly consists of computing, storage, data requests, and data transfer in/out. The Cloud computing cost is calculated from an hourly rate, and the storage cost from a rate per Gigabyte per month. Incoming data transfer is free, while outgoing data transfer is charged per Gigabyte. The costs for a local server system consist of buying hardware/software, system maintenance/updating, and operating costs. The results showed that the Cloud platform delivered 38% better performance and cost 36% less than the local system. This investigation shows the potential of cloud computing to increase system performance and lower the overall cost of system management.
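The cost components the abstract enumerates (hourly compute, per-Gigabyte-month storage, free ingress, per-Gigabyte egress) reduce to simple arithmetic; a sketch with placeholder rates, not actual Amazon pricing:

```python
# Monthly cloud bill from the cost components listed in the abstract.
# All rates are illustrative placeholders.

def monthly_cloud_cost(instance_hours, hourly_rate,
                       storage_gb, storage_rate_gb_month,
                       egress_gb, egress_rate_gb):
    compute = instance_hours * hourly_rate
    storage = storage_gb * storage_rate_gb_month
    egress = egress_gb * egress_rate_gb     # inbound transfer is billed at 0
    return compute + storage + egress

# One instance running all month (720 h) at $0.10/h, 500 GB stored at
# $0.03/GB-month, 40 GB transferred out at $0.09/GB:
cost = monthly_cloud_cost(720, 0.10, 500, 0.03, 40, 0.09)
print(f"${cost:.2f}/month")
```

The comparable local-system figure would fold amortized hardware purchase, maintenance, and operations into a monthly equivalent, which is the comparison the study makes.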

  4. The Ruggedized STD Bus Microcomputer - A low cost computer suitable for Space Shuttle experiments

    NASA Technical Reports Server (NTRS)

    Budney, T. J.; Stone, R. W.

    1982-01-01

    Previous space flight computers have been costly in terms of both hardware and software. The Ruggedized STD Bus Microcomputer is based on the commercial Mostek/Pro-Log STD Bus. Ruggedized PC cards can be based on commercial cards from more than 60 manufacturers, reducing hardware cost and design time. Software costs are minimized by using standard 8-bit microprocessors and by debugging code using commercial versions of the ruggedized flight boards while the flight hardware is being fabricated.

  5. 10 CFR Appendix I to Part 504 - Procedures for the Computation of the Real Cost of Capital

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Procedures for the Computation of the Real Cost of Capital... POWERPLANTS Pt. 504, App. I Appendix I to Part 504—Procedures for the Computation of the Real Cost of Capital (a) The firm's real after-tax weighted average marginal cost of capital (K) is computed with...
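Appendix I defines K as the firm's real after-tax weighted average marginal cost of capital; since the excerpt above is truncated, the sketch below uses generic textbook forms (after-tax weighting of debt, Fisher conversion from nominal to real terms) rather than the regulation's exact procedure.

```python
# Generic real after-tax weighted average cost of capital (WACC) sketch.
# This is a textbook approximation, not the computation prescribed by
# 10 CFR Part 504 Appendix I; inputs are illustrative.

def real_after_tax_wacc(debt_frac, debt_rate, equity_rate, tax_rate, inflation):
    # nominal after-tax WACC: debt interest is tax-deductible, equity is not
    nominal = debt_frac * debt_rate * (1 - tax_rate) + (1 - debt_frac) * equity_rate
    # Fisher relation converts the nominal rate to a real rate
    return (1 + nominal) / (1 + inflation) - 1

# 40% debt at 8%, equity at 12%, 35% tax rate, 3% expected inflation:
k = real_after_tax_wacc(0.4, 0.08, 0.12, 0.35, 0.03)
print(f"K = {k:.4f}")
```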

  6. Conceptual HALT (Hydrate Addition at Low Temperature) scaleup design: Capital and operating costs: Part 5. [Hydrate addition at low temperature for the removal of SO/sub 2/

    SciTech Connect

    Babu, M.; Kerivan, D.; Hendrick, C.; Kosek, B.; Tackett, D.; Golightley, M.

    1988-12-01

    Hydrate addition at low temperature (or the HALT process) is a retrofit option for moderate SO/sub 2/ removal efficiency in coal burning utility plants. This dry FGD process involves injecting calcium-based dry hydrate particles into flue gas ducting downstream of the air preheater, where the flue gas temperature is typically in the range of 280-325/degree/F. This report comprises the conceptual scale-up design of the HALT process for a 180 MW and a 500 MW coal-fired utility station, followed by detailed capital and operating cost estimates. A cost sensitivity analysis of major process variables for the 500 MW unit is also included. 1 fig.

  7. Government regulation and public opposition create high additional costs for field trials with GM crops in Switzerland.

    PubMed

    Bernauer, Thomas; Tribaldos, Theresa; Luginbühl, Carolin; Winzeler, Michael

    2011-12-01

    Field trials with GM crops are not only plant science experiments. They are also social experiments concerning the implications of government imposed regulatory constraints and public opposition for scientific activity. We assess these implications by estimating additional costs due to government regulation and public opposition in a recent set of field trials in Switzerland. We find that for every Euro spent on research, an additional 78 cents were spent on security, an additional 31 cents on biosafety, and an additional 17 cents on government regulatory supervision. Hence the total additional spending due to government regulation and public opposition was around 1.26 Euros for every Euro spent on the research per se. These estimates are conservative; they do not include additional costs that are hard to monetize (e.g. stakeholder information and dialogue activities, involvement of various government agencies). We conclude that further field experiments with GM crops in Switzerland are unlikely unless protected sites are set up to reduce these additional costs. PMID:21279684

  8. Transport Sector Marginal Abatement Cost Curves in Computable General Equilibrium Model

    NASA Astrophysics Data System (ADS)

    Tippichai, Atit; Fukuda, Atsushi; Morisugi, Hisayoshi

    In the last decade, computable general equilibrium (CGE) models have emerged as a standard tool for climate policy evaluation due to their ability to prospectively elucidate the character and magnitude of the economic impacts of energy and environmental policies. Furthermore, marginal abatement cost (MAC) curves, which represent GHG emission reduction potentials and costs, can be derived from these top-down economic models. However, most studies have not addressed MAC curves for a specific sector with the broad country coverage needed for the allocation of optimal emission reductions. This paper aims to explicitly describe the meaning and character of MAC curves for the transport sector in a CGE context, using the AIM/CGE Model developed by Toshihiko Masui. It is found that the MAC curves derived in this study are the inverse of the general equilibrium reduction function for CO2 emissions. Moreover, the transport sector MAC curves for six regions (USA, EU-15, Japan, China, India, and Brazil) derived in this study are compared to the reduction potentials under 100 USD/tCO2 in 2020 from a bottom-up study. The results show that the rankings of the regional reduction potentials in the transport sector from this study are almost the same as in the bottom-up study, except for the ranks of the EU-15 and China. In addition, the range of the reduction potentials from this study is wider, and only the USA has higher potentials than those derived from the bottom-up study.

  9. 78 FR 32224 - Availability of Version 3.1.2 of the Connect America Fund Phase II Cost Model; Additional...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-29

    ... as part of the Model Design PN, 77 FR 38804, June 29, 2012, of the possible significant economic...). See Electronic Filing of Documents in Rulemaking Proceedings, 63 FR 24121, May 1, 1998. Electronic...; Additional Discussion Topics in Connect America Cost Model Virtual Workshop AGENCY: Federal...

  10. Development and Validation of a Fast, Accurate and Cost-Effective Aeroservoelastic Method on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Goodwin, Sabine A.; Raj, P.

    1999-01-01

    Progress to date towards the development and validation of a fast, accurate and cost-effective aeroelastic method for advanced parallel computing platforms such as the IBM SP2 and the SGI Origin 2000 is presented in this paper. The ENSAERO code, developed at the NASA-Ames Research Center has been selected for this effort. The code allows for the computation of aeroelastic responses by simultaneously integrating the Euler or Navier-Stokes equations and the modal structural equations of motion. To assess the computational performance and accuracy of the ENSAERO code, this paper reports the results of the Navier-Stokes simulations of the transonic flow over a flexible aeroelastic wing body configuration. In addition, a forced harmonic oscillation analysis in the frequency domain and an analysis in the time domain are done on a wing undergoing a rigid pitch and plunge motion. Finally, to demonstrate the ENSAERO flutter-analysis capability, aeroelastic Euler and Navier-Stokes computations on an L-1011 wind tunnel model including pylon, nacelle and empennage are underway. All computational solutions are compared with experimental data to assess the level of accuracy of ENSAERO. As the computations described above are performed, a meticulous log of computational performance in terms of wall clock time, execution speed, memory and disk storage is kept. Code scalability is also demonstrated by studying the impact of varying the number of processors on computational performance on the IBM SP2 and the Origin 2000 systems.

  11. Addition of flexible body option to the TOLA computer program, part 1

    NASA Technical Reports Server (NTRS)

    Dick, J. W.; Benda, B. J.

    1975-01-01

    This report describes a flexible body option that was developed and added to the Takeoff and Landing Analysis (TOLA) computer program. The addition of the flexible body option allows TOLA to be used to study essentially any conventional type of airplane in the ground operating environment. It provides the capability to predict the total motion of selected points on the airplane. The analytical methods incorporated in the program and operating instructions for the option are described. A program listing is included along with several example problems to aid in interpretation of the operating instructions and to illustrate program usage.

  12. Municipal Rebate Programs for Environmental Retrofits: An Evaluation of Additionality and Cost-Effectiveness

    ERIC Educational Resources Information Center

    Bennear, Lori S.; Lee, Jonathan M.; Taylor, Laura O.

    2013-01-01

    When policies incentivize voluntary activities that also take place in the absence of the incentive, it is critical to identify the additionality of the policy--that is, the degree to which the policy results in actions that would not have occurred otherwise. Rebate programs have become a common conservation policy tool for local municipalities…

  13. 78 FR 12271 - Wireline Competition Bureau Seeks Additional Comment In Connect America Cost Model Virtual Workshop

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-22

    ... Design PN, 77 FR 38804, June 29, 2012, of the possible significant economic impact on a substantial... Documents in Rulemaking Proceedings, 63 FR 24121, May 1, 1998. Electronic Filers: Comments may be filed... document, the Wireline Competition Bureau seeks public input on additional questions relating to...

  14. Computational study of the rate constants and free energies of intramolecular radical addition to substituted anilines

    PubMed Central

    Seddiqzai, Meriam; Dahmen, Tobias; Sure, Rebecca

    2013-01-01

    Summary The intramolecular radical addition to aniline derivatives was investigated by DFT calculations. The computational methods were benchmarked by comparing the calculated values of the rate constant for the 5-exo cyclization of the hexenyl radical with the experimental values. The dispersion-corrected PW6B95-D3 functional provided very good results with deviations for the free activation barrier compared to the experimental values of only about 0.5 kcal mol−1 and was therefore employed in further calculations. Corrections for intramolecular London dispersion and solvation effects in the quantum chemical treatment are essential to obtain consistent and accurate theoretical data. For the investigated radical addition reaction it turned out that the polarity of the molecules is important and that a combination of electrophilic radicals with preferably nucleophilic arenes results in the highest rate constants. This is opposite to the Minisci reaction where the radical acts as nucleophile and the arene as electrophile. The substitution at the N-atom of the aniline is crucial. Methyl substitution leads to slower addition than phenyl substitution. Carbamates as substituents are suitable only when the radical center is not too electrophilic. No correlations between free reaction barriers and energies (ΔG‡ and ΔG_R) are found. Addition reactions leading to indanes or dihydrobenzofurans are too slow to be useful synthetically. PMID:24062821

  15. Computational study of the rate constants and free energies of intramolecular radical addition to substituted anilines.

    PubMed

    Gansäuer, Andreas; Seddiqzai, Meriam; Dahmen, Tobias; Sure, Rebecca; Grimme, Stefan

    2013-01-01

    The intramolecular radical addition to aniline derivatives was investigated by DFT calculations. The computational methods were benchmarked by comparing the calculated values of the rate constant for the 5-exo cyclization of the hexenyl radical with the experimental values. The dispersion-corrected PW6B95-D3 functional provided very good results with deviations for the free activation barrier compared to the experimental values of only about 0.5 kcal mol(-1) and was therefore employed in further calculations. Corrections for intramolecular London dispersion and solvation effects in the quantum chemical treatment are essential to obtain consistent and accurate theoretical data. For the investigated radical addition reaction it turned out that the polarity of the molecules is important and that a combination of electrophilic radicals with preferably nucleophilic arenes results in the highest rate constants. This is opposite to the Minisci reaction where the radical acts as nucleophile and the arene as electrophile. The substitution at the N-atom of the aniline is crucial. Methyl substitution leads to slower addition than phenyl substitution. Carbamates as substituents are suitable only when the radical center is not too electrophilic. No correlations between free reaction barriers and energies (ΔG‡ and ΔG_R) are found. Addition reactions leading to indanes or dihydrobenzofurans are too slow to be useful synthetically. PMID:24062821
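
The abstract above quotes deviations of about 0.5 kcal/mol in the free activation barrier. The practical significance of such a deviation can be gauged with the standard Eyring equation from transition-state theory, k = (kB·T/h)·exp(−ΔG‡/RT) — a general relation, not the paper's own computational protocol. The barrier values below are illustrative, not taken from the study.

```python
import math

def eyring_rate(dG_kcal, T=298.15):
    """Rate constant from the Eyring equation, k = (kB*T/h) * exp(-dG/(R*T)).
    dG_kcal is the free activation barrier in kcal/mol."""
    kB = 1.380649e-23   # Boltzmann constant, J/K
    h = 6.62607015e-34  # Planck constant, J*s
    R = 1.987204e-3     # gas constant, kcal/(mol*K)
    return (kB * T / h) * math.exp(-dG_kcal / (R * T))

# A 0.5 kcal/mol shift in the barrier changes k by roughly a factor of 2.3
# at 298 K, regardless of the barrier's absolute height (10.0 is assumed).
ratio = eyring_rate(10.0) / eyring_rate(10.5)
print(ratio)
```

This is why sub-kcal/mol accuracy in the benchmarked functional matters: rate constants depend exponentially on the barrier.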

  16. Computer program to perform cost and weight analysis of transport aircraft. Volume 2: Technical volume

    NASA Technical Reports Server (NTRS)

    1973-01-01

    An improved method for estimating aircraft weight and cost using a unique and fundamental approach was developed. The results of this study were integrated into a comprehensive digital computer program, which is intended for use at the preliminary design stage of aircraft development. The program provides a means of computing absolute values for weight and cost, and enables the user to perform trade studies with a sensitivity to detail design and overall structural arrangement. Both batch and interactive graphics modes of program operation are available.

  17. A low-cost vector processor boosting compute-intensive image processing operations

    NASA Technical Reports Server (NTRS)

    Adorf, Hans-Martin

    1992-01-01

    Low-cost vector processing (VP) is within reach of everyone seriously engaged in scientific computing. The advent of affordable add-on VP-boards for standard workstations, complemented by mathematical/statistical libraries, is beginning to impact compute-intensive tasks such as image processing. A case in point is the restoration of distorted images from the Hubble Space Telescope. A low-cost implementation is presented of the standard Tarasko-Richardson-Lucy restoration algorithm on an Intel i860-based VP-board which is seamlessly interfaced to a commercial, interactive image processing system. First experience is reported (including some benchmarks for standalone FFTs) and some conclusions are drawn.

  18. The economics of time shared computing: Congestion, user costs and capacity

    NASA Technical Reports Server (NTRS)

    Agnew, C. E.

    1982-01-01

    Time shared systems permit the fixed costs of computing resources to be spread over large numbers of users. However, bottleneck results in the theory of closed queueing networks can be used to show that this economy of scale will be offset by the increased congestion that results as more users are added to the system. If one considers the total costs, including the congestion cost, there is an optimal number of users for a system which equals the saturation value usually used to define system capacity.
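
The saturation value mentioned in the abstract comes from the standard asymptotic (bottleneck) bounds of closed queueing network theory: throughput X(N) ≤ min(N/(D+Z), 1/Dmax), and the two bounds cross at the saturation point N* = (D+Z)/Dmax. A minimal sketch, with illustrative numbers that are not from the paper:

```python
# Bottleneck (asymptotic-bound) analysis for a closed queueing network:
#   D    = total service demand per interaction (s), summed over all devices
#   Dmax = service demand at the slowest (bottleneck) device (s)
#   Z    = user think time (s)
# Throughput is bounded by X(N) <= min(N/(D+Z), 1/Dmax); the bounds cross
# at the saturation point N* = (D+Z)/Dmax, the "capacity" the abstract cites.

def saturation_point(D, Dmax, Z):
    """Number of users at which the bottleneck device saturates."""
    return (D + Z) / Dmax

def throughput_bound(N, D, Dmax, Z):
    """Upper bound on system throughput (interactions/s) with N users."""
    return min(N / (D + Z), 1.0 / Dmax)

# Assumed example: 2 s total demand, 0.5 s at the bottleneck, 18 s think
# time -> the system saturates at (2 + 18) / 0.5 = 40 users.
print(saturation_point(2.0, 0.5, 18.0))
```

Beyond N* users, added users only add congestion delay, which is the offsetting cost the abstract describes.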

  19. Computing the acoustic radiation force exerted on a sphere using the translational addition theorem.

    PubMed

    Silva, Glauber T; Baggio, André L; Lopes, J Henrique; Mitri, Farid G

    2015-03-01

    In this paper, the translational addition theorem for spherical functions is employed to calculate the acoustic radiation force produced by an arbitrary shaped beam on a sphere arbitrarily suspended in an inviscid fluid. The procedure is also based on the partial-wave expansion method, which depends on the beam-shape and scattering coefficients. Given a set of beam-shape coefficients (BSCs) for an acoustic beam relative to a reference frame, the translational addition theorem can be used to obtain the BSCs relative to the sphere positioned anywhere in the medium. The scattering coefficients are obtained from the acoustic boundary conditions across the sphere's surface. The method based on the addition theorem is particularly useful to avoid quadrature schemes to obtain the BSCs. We use it to compute the acoustic radiation force exerted by a spherically focused beam (in the paraxial approximation) on a silicone-oil droplet (compressible fluid sphere). The analysis is carried out in the Rayleigh (i.e., the particle diameter is much smaller than the wavelength) and Mie (i.e., the particle diameter is of the order of the wavelength or larger) scattering regimes. The obtained results show that the paraxial focused beam can only trap particles in the Rayleigh scattering regime. PMID:25768823

  20. A nearly-linear computational-cost scheme for the forward dynamics of an N-body pendulum

    NASA Technical Reports Server (NTRS)

    Chou, Jack C. K.

    1989-01-01

    The dynamic equations of motion of an n-body pendulum with spherical joints are derived as a mixed system of differential and algebraic equations (DAEs). The DAEs are kept in implicit form to save arithmetic and preserve the sparsity of the system, and are solved by a robust implicit integration method. At each solution point, the predicted solution is corrected to its exact solution within a given tolerance using Newton's iterative method. For each iteration, a linear system of the form J ΔX = E has to be solved. The computational cost for solving this linear system directly by LU factorization is O(n^3), and it can be reduced significantly by exploiting the structure of J. It is shown that by recognizing the recursive patterns and exploiting the sparsity of the system, the multiplicative and additive computational costs for solving J ΔX = E are O(n) and O(n^2), respectively. The formulation and solution method for an n-body pendulum is presented. The computational cost is shown to be nearly linearly proportional to the number of bodies.
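
The corrector step described above — refining a predicted solution by repeatedly solving a linear system J ΔX = E until the residual is within tolerance — is ordinary Newton iteration. A minimal sketch on an assumed 2×2 constraint system (not the paper's n-body formulation, where the recursive structure of J is what reduces the solve from O(n^3) to O(n)):

```python
# Newton corrector as used in implicit DAE integration: starting from a
# predicted solution, solve J @ dx = -F(x) until |F| is within tolerance.
# Toy system (assumed): the circle constraint x^2 + y^2 = 1 with y = x.

def F(v):
    x, y = v
    return [x * x + y * y - 1.0, y - x]

def J(v):
    x, y = v
    return [[2 * x, 2 * y], [-1.0, 1.0]]

def solve2(A, b):
    # Direct 2x2 solve by Cramer's rule (stand-in for the LU step that the
    # paper's structured method replaces with an O(n) recursion).
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - b[1] * A[0][1]) / det,
            (A[0][0] * b[1] - A[1][0] * b[0]) / det]

def newton_correct(v, tol=1e-12, max_iter=20):
    for _ in range(max_iter):
        f = F(v)
        if max(abs(c) for c in f) < tol:
            break
        dx = solve2(J(v), [-c for c in f])
        v = [v[0] + dx[0], v[1] + dx[1]]
    return v

# Correct a rough predicted point to the exact solution (1/sqrt(2), 1/sqrt(2))
x, y = newton_correct([1.0, 0.5])
```

In the paper's setting, J is large and sparse, and each corrector iteration exploits that sparsity rather than using a dense solve as above.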

  1. Radial subsampling for fast cost function computation in intensity-based 3D image registration

    NASA Astrophysics Data System (ADS)

    Boettger, Thomas; Wolf, Ivo; Meinzer, Hans-Peter; Celi, Juan Carlos

    2007-03-01

    Image registration is always a trade-off between accuracy and speed. Looking towards clinical scenarios, the time for bringing two or more images into registration should be only a few seconds. We present a new scheme for subsampling 3D-image data to allow for efficient computation of cost functions in intensity-based image registration. Starting from an arbitrary center point, voxels are sampled along scan lines which extend radially from the center point. We analyzed the characteristics of different cost functions computed on the subsampled data and compared them to known cost functions with respect to local optima. Results show that the cost functions are smooth and give high peaks at the expected optima. Furthermore, we investigated the capture range of cost functions computed under the new subsampling scheme. Capture range was remarkably better for the new scheme compared to metrics using all voxels or different subsampling schemes, and high registration accuracy was achieved as well. The most important result is the improvement in terms of speed, making this scheme very interesting for clinical scenarios. We conclude that, using the new subsampling scheme, intensity-based 3D image registration can be performed much faster than with other approaches while maintaining high accuracy. A variety of extensions of the new approach is conceivable, e.g. a non-regular distribution of the scan lines, or letting the scan lines start not only from a center point but, for example, from the surface of an organ model.
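
A sketch of the sampling geometry the abstract describes, under stated assumptions (quasi-uniform Fibonacci-sphere directions and a fixed step length; the paper does not specify how the scan-line directions are distributed):

```python
import math

def radial_samples(center, n_lines=64, n_steps=16, step=2.0):
    """Sample positions along scan lines extending radially from a center
    point; directions are spread quasi-uniformly via a Fibonacci sphere."""
    points = []
    golden = math.pi * (3.0 - math.sqrt(5.0))  # golden-angle increment
    for i in range(n_lines):
        z = 1.0 - 2.0 * (i + 0.5) / n_lines
        r = math.sqrt(max(0.0, 1.0 - z * z))
        theta = golden * i
        d = (r * math.cos(theta), r * math.sin(theta), z)  # unit direction
        for k in range(1, n_steps + 1):
            points.append(tuple(c + k * step * dc
                                for c, dc in zip(center, d)))
    return points

# Cost functions are then evaluated over ~n_lines * n_steps voxels
# instead of the full volume (center coordinates here are assumed).
pts = radial_samples((64.0, 64.0, 32.0))
```

In a real implementation each sample position would be rounded or interpolated into the voxel grid of the moving and fixed images.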

  2. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    SciTech Connect

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Kamath, C.; Khairallah, S. A.; Rubenchik, A. M.

    2015-12-29

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this study, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  3. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    SciTech Connect

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Khairallah, S. A.; Kamath, C.; Rubenchik, A. M.

    2015-12-15

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  4. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    NASA Astrophysics Data System (ADS)

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Kamath, C.; Khairallah, S. A.; Rubenchik, A. M.

    2015-12-01

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  5. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    DOE PAGESBeta

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Kamath, C.; Khairallah, S. A.; Rubenchik, A. M.

    2015-12-29

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this study, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  6. Effectiveness of Multimedia Elements in Computer Supported Instruction: Analysis of Personalization Effects, Students' Performances and Costs

    ERIC Educational Resources Information Center

    Zaidel, Mark; Luo, XiaoHui

    2010-01-01

    This study investigates the efficiency of multimedia instruction at the college level by comparing the effectiveness of multimedia elements used in the computer supported learning with the cost of their preparation. Among the various technologies that advance learning, instructors and students generally identify interactive multimedia elements as…

  7. BICYCLE: a computer code for calculating levelized life-cycle costs

    SciTech Connect

    Hardie, R.W.

    1980-08-01

    This report serves as a user's manual for the BICYCLE computer code. BICYCLE was specifically designed to calculate levelized life-cycle costs for plants that produce electricity, heat, gaseous fuels, or liquid fuels. Included in this report are (1) derivations of the equations used by BICYCLE, (2) input instructions, (3) sample case input, and (4) sample case output.
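
BICYCLE's exact equations are derived in the report itself; as a generic illustration of levelization, a levelized life-cycle cost is the ratio of discounted lifetime costs to discounted lifetime output. The sketch below uses that generic formula with assumed numbers, not BICYCLE's input format:

```python
# Generic levelized cost: present value of all costs divided by present
# value of all output, over the plant life. All numbers are assumed.

def levelized_cost(costs, outputs, rate):
    """costs[t] and outputs[t] for years t = 0..N-1; rate = discount rate."""
    pv_cost = sum(c / (1.0 + rate) ** t for t, c in enumerate(costs))
    pv_out = sum(q / (1.0 + rate) ** t for t, q in enumerate(outputs))
    return pv_cost / pv_out

# Assumed plant: $1000 capital in year 0, $50/yr O&M and 100 MWh/yr of
# output for 10 years, 8% discount rate -> cost in $/MWh.
costs = [1000.0] + [50.0] * 10
output = [0.0] + [100.0] * 10
lc = levelized_cost(costs, output, 0.08)
print(lc)
```

Selling the output at exactly this levelized price would recover all discounted costs over the plant life, which is the defining property of a levelized cost.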

  8. BICYCLE II: a computer code for calculating levelized life-cycle costs

    SciTech Connect

    Hardie, R.W.

    1981-11-01

    This report describes the BICYCLE computer code. BICYCLE was specifically designed to calculate levelized life-cycle costs for plants that produce electricity, heat, gaseous fuels, or liquid fuels. Included are (1) derivations of the equations used by BICYCLE, (2) input instructions, (3) sample case input, and (4) sample case output.

  9. Cost-Effective Computing: Making the Most of Your PC Dollars.

    ERIC Educational Resources Information Center

    Crawford, Walt

    1992-01-01

    Lists 27 suggestions for making cost-effective decisions when buying personal computers. Topics covered include physical comfort; modem speed; color graphics; institutional discounts; direct-order firms; brand names; replacing versus upgrading; expanding hard disk capacity; printers; software; wants versus needs; and RLIN (Research Libraries…

  10. Resources and Costs for Microbial Sequence Analysis Evaluated Using Virtual Machines and Cloud Computing

    PubMed Central

    Angiuoli, Samuel V.; White, James R.; Matalka, Malcolm; White, Owen; Fricke, W. Florian

    2011-01-01

    Background The widespread popularity of genomic applications is threatened by the “bioinformatics bottleneck” resulting from uncertainty about the cost and infrastructure needed to meet increasing demands for next-generation sequence analysis. Cloud computing services have been discussed as potential new bioinformatics support systems but have not been evaluated thoroughly. Results We present benchmark costs and runtimes for common microbial genomics applications, including 16S rRNA analysis, microbial whole-genome shotgun (WGS) sequence assembly and annotation, WGS metagenomics and large-scale BLAST. Sequence dataset types and sizes were selected to correspond to outputs typically generated by small- to midsize facilities equipped with 454 and Illumina platforms, except for WGS metagenomics where sampling of Illumina data was used. Automated analysis pipelines, as implemented in the CloVR virtual machine, were used in order to guarantee transparency, reproducibility and portability across different operating systems, including the commercial Amazon Elastic Compute Cloud (EC2), which was used to attach real dollar costs to each analysis type. We found considerable differences in computational requirements, runtimes and costs associated with different microbial genomics applications. While all 16S analyses completed on a single-CPU desktop in under three hours, microbial genome and metagenome analyses utilized multi-CPU support of up to 120 CPUs on Amazon EC2, where each analysis completed in under 24 hours for less than $60. Representative datasets were used to estimate maximum data throughput on different cluster sizes and to compare costs between EC2 and comparable local grid servers. Conclusions Although bioinformatics requirements for microbial genomics depend on dataset characteristics and the analysis protocols applied, our results suggest that smaller sequencing facilities (up to three Roche/454 or one Illumina GAIIx sequencer) invested in 16S r

  11. Cost effectiveness of a computer-delivered intervention to improve HIV medication adherence

    PubMed Central

    2013-01-01

    Background High levels of adherence to medications for HIV infection are essential for optimal clinical outcomes and to reduce viral transmission, but many patients do not achieve required levels. Clinician-delivered interventions can improve patients’ adherence, but usually require substantial effort by trained individuals and may not be widely available. Computer-delivered interventions can address this problem by reducing required staff time for delivery and by making the interventions widely available via the Internet. We previously developed a computer-delivered intervention designed to improve patients’ level of health literacy as a strategy to improve their HIV medication adherence. The intervention was shown to increase patients’ adherence, but it was not clear that the benefits resulting from the increase in adherence could justify the costs of developing and deploying the intervention. The purpose of this study was to evaluate the relation of development and deployment costs to the effectiveness of the intervention. Methods Costs of intervention development were drawn from accounting reports for the grant under which its development was supported, adjusted for costs primarily resulting from the project’s research purpose. Effectiveness of the intervention was drawn from results of the parent study. The relation of the intervention’s effects to changes in health status, expressed as utilities, was also evaluated in order to assess the net cost of the intervention in terms of quality adjusted life years (QALYs). Sensitivity analyses evaluated ranges of possible intervention effectiveness and durations of its effects, and costs were evaluated over several deployment scenarios. Results The intervention’s cost effectiveness depends largely on the number of persons using it and the duration of its effectiveness. Even with modest effects for a small number of patients the intervention was associated with net cost savings in some scenarios and for
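
The cost-per-QALY comparison described above is conventionally summarized as an incremental cost-effectiveness ratio (ICER): the extra cost of the intervention divided by the extra QALYs it produces. A minimal sketch with assumed numbers (the study's own figures are in the parent paper):

```python
# Incremental cost-effectiveness ratio: extra cost per extra QALY gained,
# comparing an intervention against a comparator. All numbers are assumed.

def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Cost per quality-adjusted life year gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Assumed: the intervention adds $300 per patient and 0.5 QALYs over
# usual care -> $600 per QALY gained.
print(icer(1300.0, 10.5, 1000.0, 10.0))
```

When the intervention is both cheaper and more effective (a negative incremental cost with positive QALY gain), it is cost-saving outright, which is the "net cost savings" scenario the abstract mentions.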

  12. Formation of gold nanostructures on copier paper surface for cost effective SERS active substrate - Effect of halide additives

    NASA Astrophysics Data System (ADS)

    Desmonda, Christa; Kar, Sudeshna; Tai, Yian

    2016-03-01

    In this study, we report the simple fabrication of an active substrate assisted by gold nanostructures (AuNS) for application in surface-enhanced Raman scattering (SERS) using copier paper, which is a biodegradable and cost-effective material. As cellulose is the main component of paper, it can behave as a reducing agent and as a capping molecule for the synthesis of AuNS on the paper substrate. AuNS can be directly generated on the surface of the copier paper by addition of halides. The AuNS thus synthesized were characterized by ultraviolet-visible spectroscopy, SEM, XRD, and XPS. In addition, the SERS effect of the AuNS-paper substrates synthesized by using various halides was investigated by using rhodamine 6G and melamine as probe molecules.

  13. Reducing metal alloy powder costs for use in powder bed fusion additive manufacturing: Improving the economics for production

    NASA Astrophysics Data System (ADS)

    Medina, Fransisco

    Titanium and its associated alloys have been used in industry for over 50 years and have become more popular in recent decades. Titanium has been most successful in areas where the high strength to weight ratio provides an advantage over aluminum and steels. Other advantages of titanium include biocompatibility and corrosion resistance. Electron Beam Melting (EBM) is an additive manufacturing (AM) technology that has been successfully applied in the manufacturing of titanium components for the aerospace and medical industry with equivalent or better mechanical properties than parts fabricated via more traditional casting and machining methods. As the demand for titanium powder continues to increase, the price also increases. Titanium spheroidized powder from different vendors has a price range from $260/kg to $450/kg; other spheroidized alloys such as niobium can cost as much as $1,200/kg. Alternative titanium powders produced from methods such as the Titanium Hydride-Dehydride (HDH) process and the Armstrong Commercially Pure Titanium (CPTi) process can be fabricated at a fraction of the cost of powders fabricated via gas atomization. The alternative powders can be spheroidized and blended. Current sectors in additive manufacturing such as the medical industry are concerned that there will not be enough spherical powder for production and are seeking other powder options. It is believed the EBM technology can use a blend of spherical and angular powder to build fully dense parts with equal mechanical properties to those produced using traditional powders. Some of the challenges with angular and irregular powders are overcoming their poor flow characteristics and attaining the same or better packing densities as spherical powders. The goal of this research is to demonstrate the feasibility of utilizing alternative and lower cost powders in the EBM process.
As a result, reducing the cost of the raw material to reduce the overall cost of the product produced with

  14. A Performance/Cost Evaluation for a GPU-Based Drug Discovery Application on Volunteer Computing

    PubMed Central

    Guerrero, Ginés D.; Imbernón, Baldomero; García, José M.

    2014-01-01

    Bioinformatics is an interdisciplinary research field that develops tools for the analysis of large biological databases, and, thus, the use of high performance computing (HPC) platforms is mandatory for the generation of useful biological knowledge. The latest generation of graphics processing units (GPUs) has democratized the use of HPC as they push desktop computers to cluster-level performance. Many applications within this field have been developed to leverage these powerful and low-cost architectures. However, these applications still need to scale to larger GPU-based systems to enable remarkable advances in the fields of healthcare, drug discovery, genome research, etc. The inclusion of GPUs in HPC systems exacerbates power and temperature issues, increasing the total cost of ownership (TCO). This paper explores the benefits of volunteer computing to scale bioinformatics applications as an alternative to owning large GPU-based local infrastructures. We use as a benchmark a GPU-based drug discovery application called BINDSURF, whose computational requirements go beyond a single desktop machine. Volunteer computing is presented as a cheap and valid HPC system for those bioinformatics applications that need to process huge amounts of data and where the response time is not a critical factor. PMID:25025055

  15. On Training Efficiency and Computational Costs of a Feed Forward Neural Network: A Review

    PubMed Central

    Laudani, Antonino; Lozito, Gabriele Maria; Riganti Fulginei, Francesco; Salvini, Alessandro

    2015-01-01

    The problem of choosing a suitable activation function for the hidden layer of a feed forward neural network has been widely investigated and is comprehensively reviewed here. Since the nonlinear component of a neural network is the main contributor to the network mapping capabilities, the different choices that may lead to enhanced performances, in terms of training, generalization, or computational costs, are analyzed, both in general-purpose and in embedded computing environments. Finally, a strategy to convert a network configuration between different activation functions without altering the network mapping capabilities will be presented. PMID:26417368

  17. 12 CFR Appendix L to Part 226 - Assumed Loan Periods for Computations of Total Annual Loan Cost Rates

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 3 2013-01-01 2013-01-01 false Assumed Loan Periods for Computations of Total Annual Loan Cost Rates L Appendix L to Part 226 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED..., App. L Appendix L to Part 226—Assumed Loan Periods for Computations of Total Annual Loan Cost Rates...

  18. 12 CFR Appendix L to Part 226 - Assumed Loan Periods for Computations of Total Annual Loan Cost Rates

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 3 2014-01-01 2014-01-01 false Assumed Loan Periods for Computations of Total Annual Loan Cost Rates L Appendix L to Part 226 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED..., App. L Appendix L to Part 226—Assumed Loan Periods for Computations of Total Annual Loan Cost Rates...

  19. Computers in Secondary Schools in Developing Countries: Costs and Other Issues (Including Original Data from South Africa and Zimbabwe).

    ERIC Educational Resources Information Center

    Cawthera, Andy

    This research is mainly concerned with the costs of computers in schools in developing countries. It starts with a brief overview of the information revolution and its consequences. It then briefly examines some of the arguments for the use of computers in schools in developing countries, before reviewing previous work undertaken on the costs of…

  20. POPCYCLE: a computer code for calculating nuclear and fossil plant levelized life-cycle power costs

    SciTech Connect

    Hardie, R.W.

    1982-02-01

    POPCYCLE, a computer code designed to calculate levelized life-cycle power costs for nuclear and fossil electrical generating plants, is described. Included are (1) derivations of the equations and a discussion of the methodology used by POPCYCLE, (2) a description of the input required by the code, (3) a listing of the input for a sample case, and (4) the output for a sample case.
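
    Levelized life-cycle cost calculations of the kind POPCYCLE performs can be sketched with the standard present-value formula: discount each year's costs and generated energy, then take the ratio. The function below is a generic textbook sketch under assumed inputs, not POPCYCLE's actual implementation.

```python
# Generic levelized power cost: ratio of discounted lifetime costs to
# discounted lifetime energy output. Inputs are illustrative, not from
# the POPCYCLE code.
def levelized_cost(costs, energies, rate):
    """costs[t] ($) and energies[t] (MWh) for each year t; returns $/MWh."""
    pv_cost = sum(c / (1 + rate) ** t for t, c in enumerate(costs))
    pv_energy = sum(e / (1 + rate) ** t for t, e in enumerate(energies))
    return pv_cost / pv_energy

# A plant spending $100/year while producing 10 MWh/year has a levelized
# cost of $10/MWh at a zero discount rate.
lcoe = levelized_cost(costs=[100.0, 100.0], energies=[10.0, 10.0], rate=0.0)
```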

  1. Fermilab Central Computing Facility: Energy conservation report and mechanical systems design optimization and cost analysis study

    SciTech Connect

    Krstulovich, S.F.

    1986-11-12

    This report is developed as part of the Fermilab Central Computing Facility Project Title II Design Documentation Update under the provisions of DOE Document 6430.1, Chapter XIII-21, Section 14, paragraph a. As such, it concentrates primarily on HVAC mechanical systems design optimization and cost analysis and should be considered as a supplement to the Title I Design Report dated March 1986, wherein energy-related issues are discussed pertaining to building envelope and orientation as well as electrical systems design.

  2. Application of a single-board computer as a low-cost pulse generator

    NASA Astrophysics Data System (ADS)

    Fedrizzi, Marcus; Soria, Julio

    2015-09-01

    A BeagleBone Black (BBB) single-board open-source computer was implemented as a low-cost fully programmable pulse generator. The pulse generator makes use of the BBB Programmable Real-Time Unit (PRU) subsystem to achieve a deterministic temporal resolution of 5 ns, an RMS jitter of 290 ps and a timebase stability on the order of 10 ppm. A Python-based software framework has also been developed to simplify the usage of the pulse generator.

  3. Predicting Cost/Performance Trade-Offs for Whitney: A Commodity Computing Cluster

    NASA Technical Reports Server (NTRS)

    Becker, Jeffrey C.; Nitzberg, Bill; VanderWijngaart, Rob F.; Kutler, Paul (Technical Monitor)

    1997-01-01

    Recent advances in low-end processor and network technology have made it possible to build a "supercomputer" out of commodity components. We develop simple models of the NAS Parallel Benchmarks version 2 (NPB 2) to explore the cost/performance trade-offs involved in building a balanced parallel computer supporting a scientific workload. We develop closed form expressions detailing the number and size of messages sent by each benchmark. Coupling these with measured single processor performance, network latency, and network bandwidth, our models predict benchmark performance to within 30%. A comparison based on total system cost reveals that current commodity technology (200 MHz Pentium Pros with 100baseT Ethernet) is well balanced for the NPBs up to a total system cost of around $1,000,000.

  4. Predicting Cost/Performance Trade-offs For Whitney: A Commodity Computing Cluster

    NASA Technical Reports Server (NTRS)

    Becker, Jeffrey C.; Nitzberg, Bill; VanDerWijngaart, Rob F.; Tweten, Dave (Technical Monitor)

    1998-01-01

    Recent advances in low-end processor and network technology have made it possible to build a "supercomputer" out of commodity components. We develop simple models of the NAS Parallel Benchmarks version 2 (NPB 2) to explore the cost/performance trade-offs involved in building a balanced parallel computer supporting a scientific workload. By measuring single processor benchmark performance, network latency, and network bandwidth, and using closed form expressions detailing the number and size of messages sent by each benchmark, our models predict benchmark performance to within 30%. A comparison based on total system cost reveals that current commodity technology (200 MHz Pentium Pros with 100baseT Ethernet) is well balanced for the NPBs up to a total system cost of around $ 1,000,000.
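
    The modeling approach described in these two reports combines measured single-processor time with a latency-bandwidth communication term derived from closed-form message counts. A minimal sketch of that kind of model follows; all node and network figures are illustrative assumptions, not measured NPB values.

```python
# Predicted benchmark time = compute time + message count * latency
#                            + total bytes / bandwidth.
def predicted_time(flops, flop_rate, n_msgs, msg_bytes, latency_s, bandwidth_bps):
    """Simple latency-bandwidth performance model for one benchmark run."""
    compute = flops / flop_rate
    comm = n_msgs * latency_s + (n_msgs * msg_bytes) / bandwidth_bps
    return compute + comm

# Hypothetical 200 MHz Pentium Pro-class node on 100baseT Ethernet:
t = predicted_time(
    flops=1e9,            # 1 GFLOP of work per process (assumed)
    flop_rate=50e6,       # ~50 MFLOP/s sustained (assumed)
    n_msgs=1000,          # messages sent (from a closed-form count)
    msg_bytes=64_000,     # average message size in bytes (assumed)
    latency_s=100e-6,     # 100 us network latency (assumed)
    bandwidth_bps=10e6,   # ~10 MB/s effective bandwidth (assumed)
)
```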

  5. Reducing annotation cost and uncertainty in computer-aided diagnosis through selective iterative classification

    NASA Astrophysics Data System (ADS)

    Riely, Amelia; Sablan, Kyle; Xiaotao, Thomas; Furst, Jacob; Raicu, Daniela

    2015-03-01

    Medical imaging technology has always provided radiologists with the opportunity to view and keep records of anatomy of the patient. With the development of machine learning and intelligent computing, these images can be used to create Computer-Aided Diagnosis (CAD) systems, which can assist radiologists in analyzing image data in various ways to provide better health care to patients. This paper looks at increasing accuracy and reducing cost in creating CAD systems, specifically in predicting the malignancy of lung nodules in the Lung Image Database Consortium (LIDC). Much of the cost in creating an accurate CAD system stems from the need for multiple radiologist diagnoses or annotations of each image, since there is rarely a ground truth diagnosis and even different radiologists' diagnoses of the same nodule often disagree. To resolve this issue, this paper outlines a method of selective iterative classification that predicts lung nodule malignancy by using multiple radiologist diagnoses only for cases that can benefit from them. Our method achieved 81% accuracy while costing only 46% of the method that indiscriminately used all annotations, which achieved a lower accuracy of 70%, while costing more.

  6. Cost-Effectiveness of Computed Tomographic Colonography Screening for Colorectal Cancer in the Medicare Population

    PubMed Central

    Lansdorp-Vogelaar, Iris; Rutter, Carolyn M.; Savarino, James E.; van Ballegooijen, Marjolein; Kuntz, Karen M.; Zauber, Ann G.

    2010-01-01

    Background The Centers for Medicare and Medicaid Services (CMS) considered whether to reimburse computed tomographic colonography (CTC) for colorectal cancer screening of Medicare enrollees. To help inform its decision, we evaluated the reimbursement rate at which CTC screening could be cost-effective compared with the colorectal cancer screening tests that are currently reimbursed by CMS and are included in most colorectal cancer screening guidelines, namely annual fecal occult blood test (FOBT), flexible sigmoidoscopy every 5 years, flexible sigmoidoscopy every 5 years in conjunction with annual FOBT, and colonoscopy every 10 years. Methods We used three independently developed microsimulation models to assess the health outcomes and costs associated with CTC screening and with currently reimbursed colorectal cancer screening tests among the average-risk Medicare population. We assumed that CTC was performed every 5 years (using test characteristics from either a Department of Defense CTC study or the National CTC Trial) and that individuals with findings of 6 mm or larger were referred to colonoscopy. We computed incremental cost-effectiveness ratios for the currently reimbursed screening tests and calculated the maximum cost per scan (ie, the threshold cost) for the CTC strategy to lie on the efficient frontier. Sensitivity analyses were performed on key parameters and assumptions. Results Assuming perfect adherence with all tests, the undiscounted number of life-years gained from CTC screening ranged from 143 to 178 per 1000 65-year-olds, which was slightly less than the number of life-years gained from 10-yearly colonoscopy (152–185 per 1000 65-year-olds) and comparable to that from 5-yearly sigmoidoscopy with annual FOBT (149–177 per 1000 65-year-olds). If CTC screening was reimbursed at $488 per scan (slightly less than the reimbursement for a colonoscopy without polypectomy), it would be the most costly strategy. CTC screening could be cost-effective at

  7. Dosimetry and cost of imaging osseointegrated implants with film-based and computed tomography.

    PubMed

    Scaf, G; Lurie, A G; Mosier, K M; Kantor, M L; Ramsby, G R; Freedman, M L

    1997-01-01

    Thermoluminescent dosimeters were used to measure radiation doses at craniofacial sites in a tissue-equivalent phantom during film-based multidirectional tomography with the Tomax Ultrascan (Incubation Industries, Ivyland, Pa.) and during computed tomography with the Elscint Excel 2400 (Elscint Corp., Tel Aviv, Israel). Mean absorbed doses for presurgical mandibular and maxillary canine and molar implant assessments were converted to equivalent doses, which were then multiplied by published weighting factors and summed to give effective doses. The computed tomography device consistently delivered higher doses than the Tomax Ultrascan to all anatomic locations; the differences were most pronounced when only one or two implant sites were evaluated. The reasons for the dose disparities are considered both anatomically and procedurally. A survey of examination cost revealed film-based multidirectional tomography to be less expensive than computed tomography. PMID:9007922

  8. A unified RANS–LES model: Computational development, accuracy and cost

    SciTech Connect

    Gopalan, Harish; Heinz, Stefan; Stöllinger, Michael K.

    2013-09-15

    Large eddy simulation (LES) is computationally extremely expensive for the investigation of wall-bounded turbulent flows at high Reynolds numbers. A way to reduce the computational cost of LES by orders of magnitude is to combine LES equations with Reynolds-averaged Navier–Stokes (RANS) equations used in the near-wall region. A large variety of such hybrid RANS–LES methods are currently in use such that there is the question of which hybrid RANS–LES method represents the optimal approach. The properties of an optimal hybrid RANS–LES model are formulated here by taking reference to fundamental properties of fluid flow equations. It is shown that unified RANS–LES models derived from an underlying stochastic turbulence model have the properties of optimal hybrid RANS–LES models. The rest of the paper is organized in two parts. First, a priori and a posteriori analyses of channel flow data are used to find the optimal computational formulation of the theoretically derived unified RANS–LES model and to show that this computational model, which is referred to as linear unified model (LUM), does also have all the properties of an optimal hybrid RANS–LES model. Second, a posteriori analyses of channel flow data are used to study the accuracy and cost features of the LUM. The following conclusions are obtained. (i) Compared to RANS, which require evidence for their predictions, the LUM has the significant advantage that the quality of predictions is relatively independent of the RANS model applied. (ii) Compared to LES, the significant advantage of the LUM is a cost reduction of high-Reynolds number simulations by a factor of 0.07 Re^0.46. For coarse grids, the LUM has a significant accuracy advantage over corresponding LES. (iii) Compared to other usually applied hybrid RANS–LES models, it is shown that the LUM provides significantly improved predictions.
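
    The quoted LES cost reduction of 0.07 Re^0.46 can be evaluated directly; the sample Reynolds number below is illustrative.

```python
# Cost-reduction factor of the LUM relative to pure LES, as quoted in the
# abstract: 0.07 * Re**0.46. Larger factors mean bigger savings at higher Re.
def lum_cost_reduction(re):
    return 0.07 * re ** 0.46

factor = lum_cost_reduction(1e5)  # roughly 14x cheaper than LES at Re = 10^5
```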

  9. Evolutionary adaptive eye tracking for low-cost human computer interaction applications

    NASA Astrophysics Data System (ADS)

    Shen, Yan; Shin, Hak Chul; Sung, Won Jun; Khim, Sarang; Kim, Honglak; Rhee, Phill Kyu

    2013-01-01

    We present an evolutionary adaptive eye-tracking framework aiming for low-cost human computer interaction. The main focus is to guarantee eye-tracking performance without using high-cost devices and strongly controlled situations. The performance optimization of eye tracking is formulated into the dynamic control problem of deciding on an eye tracking algorithm structure and associated thresholds/parameters, where the dynamic control space is denoted by genotype and phenotype spaces. The evolutionary algorithm is responsible for exploring the genotype control space, and the reinforcement learning algorithm organizes the evolved genotype into a reactive phenotype. The evolutionary algorithm encodes an eye-tracking scheme as a genetic code based on image variation analysis. Then, the reinforcement learning algorithm defines internal states in a phenotype control space limited by the perceived genetic code and carries out interactive adaptations. The proposed method can achieve optimal performance by balancing the difficulty of running the evolutionary algorithm in real time against the drawback of the huge search space of the reinforcement learning algorithm. Extensive experiments were carried out using webcam image sequences and yielded very encouraging results. The framework can be readily applied to other low-cost vision-based human computer interactions in solving their intrinsic brittleness in unstable operational environments.

  10. Experiments with a low-cost system for computer graphics material model acquisition

    NASA Astrophysics Data System (ADS)

    Rushmeier, Holly; Lockerman, Yitzhak; Cartwright, Luke; Pitera, David

    2015-03-01

    We consider the design of an inexpensive system for acquiring material models for computer graphics rendering applications in animation, games and conceptual design. To be useful in these applications a system must be able to model a rich range of appearances in a computationally tractable form. The range of appearance of interest in computer graphics includes materials that have spatially varying properties, directionality, small-scale geometric structure, and subsurface scattering. To be computationally tractable, material models for graphics must be compact, editable, and efficient to numerically evaluate for ray tracing importance sampling. To construct appropriate models for a range of interesting materials, we take the approach of separating out directly and indirectly scattered light using high spatial frequency patterns introduced by Nayar et al. in 2006. To acquire the data at low cost, we use a set of Raspberry Pi computers and cameras clamped to miniature projectors. We explore techniques to separate out surface and subsurface indirect lighting. This separation would allow the fitting of simple, and so tractable, analytical models to features of the appearance model. The goal of the system is to provide models for physically accurate renderings that are visually equivalent to viewing the original physical materials.

  11. Addition of flexible body option to the TOLA computer program. Part 2: User and programmer documentation

    NASA Technical Reports Server (NTRS)

    Dick, J. W.; Benda, B. J.

    1975-01-01

    User and programmer oriented documentation for the flexible body option of the Takeoff and Landing Analysis (TOLA) computer program are provided. The user information provides sufficient knowledge of the development and use of the option to enable the engineering user to successfully operate the modified program and understand the results. The programmer's information describes the option structure and logic enabling a programmer to make major revisions to this part of the TOLA computer program.

  12. Versatile, low-cost, computer-controlled, sample positioning system for vacuum applications

    NASA Technical Reports Server (NTRS)

    Vargas-Aburto, Carlos; Liff, Dale R.

    1991-01-01

    A versatile, low-cost, easy-to-implement, microprocessor-based motorized positioning system (MPS) suitable for accurate sample manipulation in a Secondary Ion Mass Spectrometry (SIMS) system, and for other ultra-high vacuum (UHV) applications, was designed and built at NASA LeRC. The system can be operated manually or under computer control. In the latter case, local as well as remote operation is possible via the IEEE-488 bus. The position of the sample can be controlled in three linear orthogonal coordinates and one angular coordinate.

  13. Can Computer-Assisted Discovery Learning Foster First Graders' Fluency with the Most Basic Addition Combinations?

    ERIC Educational Resources Information Center

    Baroody, Arthur J.; Eiland, Michael D.; Purpura, David J.; Reid, Erin E.

    2013-01-01

    In a 9-month training experiment, 64 first graders with a risk factor were randomly assigned to computer-assisted structured discovery of the add-1 rule (e.g., the sum of 7 + 1 is the number after "seven" when we count), unstructured discovery learning of this regularity, or an active-control group. Planned contrasts revealed that the add-1…

  14. Demonstration of Cost-Effective, High-Performance Computing at Performance and Reliability Levels Equivalent to a 1994 Vector Supercomputer

    NASA Technical Reports Server (NTRS)

    Babrauckas, Theresa

    2000-01-01

    The Affordable High Performance Computing (AHPC) project demonstrated that high-performance computing based on a distributed network of computer workstations is a cost-effective alternative to vector supercomputers for running CPU and memory intensive design and analysis tools. The AHPC project created an integrated system called a Network Supercomputer. By connecting computer workstations through a network and utilizing the workstations when they are idle, the resulting distributed-workstation environment has the same performance and reliability levels as the Cray C90 vector Supercomputer at less than 25 percent of the C90 cost. In fact, the cost comparison between a Cray C90 Supercomputer and Sun workstations showed that the number of distributed networked workstations equivalent to a C90 costs approximately 8 percent of the C90.

  15. Engineering and environmental properties of thermally treated mixtures containing MSWI fly ash and low-cost additives.

    PubMed

    Polettini, A; Pomi, R; Trinci, L; Muntoni, A; Lo Mastro, S

    2004-09-01

    An experimental work was carried out to investigate the feasibility of application of a sintering process to mixtures composed of Municipal Solid Waste Incinerator (MSWI) fly ash and low-cost additives (waste from feldspar production and cullet). The proportions of the three constituents were varied to adjust the mixture compositions to within the optimal range for sintering. The material was compacted in cylindrical specimens and treated at 1100 and 1150 degrees C for 30 and 60 min. Engineering and environmental characteristics including weight loss, dimensional changes, density, open porosity, mechanical strength, chemical stability and leaching behavior were determined for the treated material, allowing the relationship between the degree of sintering and both mixture composition and treatment conditions to be singled out. Mineralogical analyses detected the presence of neo-formation minerals from the pyroxene group. Estimation of the extent of metal loss from the samples indicated that the potential for volatilization of species of Pb, Cd and Zn is still a matter of major concern when dealing with thermal treatment of incinerator ash. PMID:15268956

  16. Low-cost computer-controlled current stimulator for the student laboratory.

    PubMed

    Güçlü, Burak

    2007-06-01

    Electrical stimulation of nerve and muscle tissues is frequently used for teaching core concepts in physiology. It is usually expensive to provide every student group in the laboratory with an individual stimulator. This article presents the design and application of a low-cost [about $100 (U.S.)] isolated stimulator that can be controlled by two analog-output channels (e.g., output channels of a data-acquisition card or onboard audio channels) of a computer. The device is based on a voltage-to-current converter circuit and can produce accurate monopolar and bipolar current pulses, pulse trains, arbitrary current waveforms, and a trigger output. The compliance of the current source is +/-15 V, and the maximum available current is +/-1.5 mA. The device was electrically tested by using the audio output of a personal computer. In this condition, the device had a dynamic range of 46 dB and the available pulse-width range was 0.1-10 ms. The device is easily programmable, and a freeware MATLAB script is posted on the World Wide Web. The practical use of the device was demonstrated by electrically stimulating the sciatic nerve of a frog and recording compound action potentials. The newly designed current stimulator is a flexible and effective tool for teaching in the physiology laboratory, and it can increase the efficiency of learning by maximizing performance-to-cost ratio. PMID:17562915

  17. Identification of Students' Intuitive Mental Computational Strategies for 1, 2 and 3 Digits Addition and Subtraction: Pedagogical and Curricular Implications

    ERIC Educational Resources Information Center

    Ghazali, Munirah; Alias, Rohana; Ariffin, Noor Asrul Anuar; Ayub, Ayminsyadora

    2010-01-01

    This paper reports on a study to examine mental computation strategies used by Year 1, Year 2, and Year 3 students to solve addition and subtraction problems. The participants in this study were twenty five 7 to 9 year-old students identified as excellent, good and satisfactory in their mathematics performance from a school in Penang, Malaysia.…

  18. The Effects of Computer-Assisted Instruction on Student Achievement in Addition and Subtraction at First Grade Level.

    ERIC Educational Resources Information Center

    Spivey, Patsy M.

    This study was conducted to determine whether the traditional classroom approach to instruction involving the addition and subtraction of number facts (digits 0-6) is more or less effective than the traditional classroom approach plus a commercially-prepared computer game. A pretest-posttest control group design was used with two groups of first…

  19. 17 CFR Appendix B to Part 4 - Adjustments for Additions and Withdrawals in the Computation of Rate of Return

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 17 Commodity and Securities Exchanges 1 2010-04-01 2010-04-01 false Adjustments for Additions and Withdrawals in the Computation of Rate of Return B Appendix B to Part 4 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION COMMODITY POOL OPERATORS AND COMMODITY TRADING ADVISORS Pt. 4, App....

  20. A Comprehensive and Cost-Effective Computer Infrastructure for K-12 Schools

    NASA Technical Reports Server (NTRS)

    Warren, G. P.; Seaton, J. M.

    1996-01-01

    Since 1993, NASA Langley Research Center has been developing and implementing a low-cost Internet connection model, including system architecture, training, and support, to provide Internet access for an entire network of computers. This infrastructure allows local area networks which exceed 50 machines per school to independently access the complete functionality of the Internet by connecting to a central site, using state-of-the-art commercial modem technology, through a single standard telephone line. By locating high-cost resources at this central site and sharing these resources and their costs among the school districts throughout a region, a practical, efficient, and affordable infrastructure for providing scalable Internet connectivity has been developed. As the demand for faster Internet access grows, the model has a simple expansion path that eliminates the need to replace major system components and re-train personnel. Observations of typical Internet usage within an environment, particularly school classrooms, have shown that after an initial period of 'surfing,' the Internet traffic becomes repetitive. By automatically storing requested Internet information on a high-capacity networked disk drive at the local site (network based disk caching), then updating this information only when it changes, well over 80 percent of the Internet traffic that leaves a location can be eliminated by retrieving the information from the local disk cache.

  1. A Novel Cost Based Model for Energy Consumption in Cloud Computing

    PubMed Central

    Horri, A.; Dastghaibyfard, Gh.

    2015-01-01

    Cloud data centers consume enormous amounts of electrical energy. To support green cloud computing, providers also need to minimize cloud infrastructure energy consumption while maintaining quality of service (QoS). In this study, an energy consumption model is proposed for the time-shared policy in the virtualization layer of cloud environments. The cost and energy usage of the time-shared policy were modeled in the CloudSim simulator based upon results obtained from a real system, and the proposed model was then evaluated under different scenarios. In the proposed model, cache interference costs were considered; these costs were based upon the size of the data. The proposed model was implemented in the CloudSim simulator, and the related simulation results indicate that the energy consumption may be considerable and that it can vary with different parameters such as the quantum parameter, data size, and the number of VMs on a host. Measured results validate the model and demonstrate that there is a tradeoff between energy consumption and QoS in the cloud environment. PMID:25705716
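
    Host-level energy models in CloudSim-style studies are commonly linear in CPU utilization. The sketch below uses that standard linear form with hypothetical power figures; it omits the cache-interference costs that the paper's time-shared model adds.

```python
# Standard linear host power model often used with CloudSim: power scales
# linearly from idle to peak with CPU utilization. All figures hypothetical.
def host_power(p_idle_w, p_max_w, utilization):
    """Instantaneous host power draw in watts for utilization in [0, 1]."""
    return p_idle_w + (p_max_w - p_idle_w) * utilization

def energy_joules(p_idle_w, p_max_w, utilization, seconds):
    """Energy consumed over an interval at constant utilization."""
    return host_power(p_idle_w, p_max_w, utilization) * seconds

# A 100 W idle / 250 W peak host at 50% utilization for 10 s:
e = energy_joules(100.0, 250.0, 0.5, 10.0)
```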

  3. 12 CFR Appendix L to Part 226 - Assumed Loan Periods for Computations of Total Annual Loan Cost Rates

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Annual Loan Cost Rates L Appendix L to Part 226 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM TRUTH IN LENDING (REGULATION Z) Pt. 226, App. L Appendix L to Part 226—Assumed Loan Periods for Computations of Total Annual Loan Cost Rates (a)...

  4. 12 CFR Appendix L to Part 226 - Assumed Loan Periods for Computations of Total Annual Loan Cost Rates

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Annual Loan Cost Rates L Appendix L to Part 226 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM TRUTH IN LENDING (REGULATION Z) Pt. 226, App. L Appendix L to Part 226—Assumed Loan Periods for Computations of Total Annual Loan Cost Rates (a)...

  5. 12 CFR Appendix L to Part 226 - Assumed Loan Periods for Computations of Total Annual Loan Cost Rates

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Annual Loan Cost Rates L Appendix L to Part 226 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM TRUTH IN LENDING (REGULATION Z) Pt. 226, App. L Appendix L to Part 226—Assumed Loan Periods for Computations of Total Annual Loan Cost Rates (a)...

  6. Subsonic flutter analysis addition to NASTRAN. [for use with CDC 6000 series digital computers

    NASA Technical Reports Server (NTRS)

    Doggett, R. V., Jr.; Harder, R. L.

    1973-01-01

    A subsonic flutter analysis capability has been developed for NASTRAN, and a developmental version of the program has been installed on the CDC 6000 series digital computers at the Langley Research Center. The flutter analysis is of the modal type, uses doublet lattice unsteady aerodynamic forces, and solves the flutter equations by using the k-method. Surface and one-dimensional spline functions are used to transform from the aerodynamic degrees of freedom to the structural degrees of freedom. Some preliminary applications of the method to a beamlike wing, a platelike wing, and a platelike wing with a folded tip are compared with existing experimental and analytical results.

  7. Restructuring the introductory physics lab with the addition of computer-based laboratories

    PubMed Central

    Pierri-Galvao, Monica

    2011-01-01

    Nowadays, data acquisition software and sensors are widely used in introductory physics laboratories. This allows students to spend more time exploring the data collected by the computer and hence to focus more on the physical concepts. Very often, a faculty member is faced with the challenge of updating or introducing a microcomputer-based laboratory (MBL) at his or her institution. This article provides a list of experiments and equipment needed to convert about half of the traditional labs of a 1-year introductory physics lab course into MBLs. PMID:22346229

  8. Addition of higher order plate and shell elements into NASTRAN computer program

    NASA Technical Reports Server (NTRS)

    Narayanaswami, R.; Goglia, G. L.

    1976-01-01

    Two higher order plate elements, the linear strain triangular membrane element and the quintic bending element, along with a shallow shell element, suitable for inclusion into the NASTRAN (NASA Structural Analysis) program are described. Additions to the NASTRAN Theoretical Manual, Users' Manual, Programmers' Manual and the NASTRAN Demonstration Problem Manual, for inclusion of these elements into the NASTRAN program are also presented.

  9. Assessing Tax Form Distribution Costs: A Proposed Method for Computing the Dollar Value of Tax Form Distribution in a Public Library.

    ERIC Educational Resources Information Center

    Casey, James B.

    1998-01-01

    Explains how a public library can compute the actual cost of distributing tax forms to the public by listing all direct and indirect costs and demonstrating the formulae and necessary computations. Supplies directions for calculating costs involved for all levels of staff as well as associated public relations efforts, space, and utility costs.…
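
    The kind of computation the article proposes can be sketched directly: sum direct labor across staff levels, then add prorated space, utility, and public relations costs. A minimal sketch; every figure below is hypothetical, not taken from the article:

```python
# Hypothetical figures for one tax season at a single branch library.
staff_hours = {"librarian": 40.0, "clerk": 120.0, "page": 60.0}
hourly_rate = {"librarian": 28.50, "clerk": 16.00, "page": 11.00}

# Direct labor: hours spent on tax-form work at each staff level.
labor = sum(staff_hours[s] * hourly_rate[s] for s in staff_hours)

space_cost = 250.0 * 4   # display-area cost per month, over a 4-month season
utilities = 35.0 * 4     # prorated utilities for the same period
publicity = 180.0        # flyers and public notices

total = labor + space_cost + utilities + publicity
```

    The same structure extends to any cost category the library tracks; only the line items and proration periods change.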

  10. Indolyne Experimental and Computational Studies: Synthetic Applications and Origins of Selectivities of Nucleophilic Additions

    PubMed Central

    Im, G-Yoon J.; Bronner, Sarah M.; Goetz, Adam E.; Paton, Robert S.; Cheong, Paul H.-Y.; Houk, K. N.; Garg, Neil K.

    2010-01-01

    Efficient syntheses of 4,5-, 5,6-, and 6,7-indolyne precursors beginning from commercially available hydroxyindole derivatives are reported. The synthetic routes are versatile and allow access to indolyne precursors that remain unsubstituted on the pyrrole ring. Indolynes can be generated under mild fluoride-mediated conditions, trapped by a variety of nucleophilic reagents, and used to access a number of novel substituted indoles. Nucleophilic addition reactions to indolynes proceed with varying degrees of regioselectivity; distortion energies control regioselectivity and provide a simple model to predict the regioselectivity in the nucleophilic additions to indolynes and other unsymmetrical arynes. This model has led to the design of a substituted 4,5-indolyne that exhibits enhanced nucleophilic regioselectivity. PMID:21114321

  11. Addition of visual noise boosts evoked potential-based brain-computer interface.

    PubMed

    Xie, Jun; Xu, Guanghua; Wang, Jing; Zhang, Sicong; Zhang, Feng; Li, Yeping; Han, Chengcheng; Li, Lili

    2014-01-01

    Although noise has a proven beneficial role in brain function, there have been no attempts to exploit the stochastic resonance effect in neural engineering applications, especially in research on brain-computer interfaces (BCIs). In our study, a steady-state motion visual evoked potential (SSMVEP)-based BCI with periodic visual stimulation plus moderate spatiotemporal noise achieved better offline and online performance due to enhancement of periodic components in brain responses, accompanied by suppression of high harmonics. Offline results exhibited a bell-shaped, resonance-like dependence on noise intensity, and online performance improvements of 7-36% were achieved when identical visual noise was adopted for different stimulation frequencies. Using neural encoding modeling, these phenomena can be explained as noise-induced input-output synchronization in human sensory systems, which commonly possess a low-pass property. Our work demonstrates that noise can boost BCIs in addressing human needs. PMID:24828128
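
    The noise benefit invoked here can be demonstrated with a toy model rather than real EEG: a subthreshold periodic input passed through a hard threshold becomes detectable only at intermediate noise levels. A minimal sketch of stochastic resonance; all signal parameters and noise levels are illustrative, not taken from the study:

```python
import math
import random

def threshold_detector(signal, noise_sd, threshold, rng):
    """A crude sensory unit: outputs 1 when input-plus-noise crosses threshold."""
    return [1.0 if s + rng.gauss(0.0, noise_sd) > threshold else 0.0
            for s in signal]

def correlation(x, y):
    """Pearson correlation; 0 if either sequence is constant."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy) if vx > 0 and vy > 0 else 0.0

rng = random.Random(7)
# Subthreshold 2 Hz drive sampled at 100 Hz: it never crosses threshold alone.
signal = [0.9 * math.sin(2 * math.pi * 2 * i / 100) for i in range(20000)]

# Input-output correlation at low, moderate, and excessive noise levels.
scores = [correlation(signal,
                      threshold_detector(signal, sd, threshold=1.0, rng=rng))
          for sd in (0.02, 0.2, 5.0)]
```

    With near-zero noise the detector stays silent, and with excessive noise the periodicity is drowned out, so the correlation peaks at the intermediate noise level, a resonance-like curve of the kind the abstract reports.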

  12. Large Advanced Space Systems (LASS) computer-aided design program additions

    NASA Technical Reports Server (NTRS)

    Farrell, C. E.

    1982-01-01

    The LSS preliminary and conceptual design requires extensive iterative analysis because of the effects of structural, thermal, and control intercoupling. A computer-aided design program that permits integrating and interfacing of the required large space system (LSS) analyses is discussed. The primary objective of this program is the implementation of modeling techniques and analysis algorithms that permit interactive design and tradeoff studies of LSS concepts. Eight software modules were added to the program. The existing rigid-body controls module was modified to include solar pressure effects. The new model generator modules and appendage synthesizer module are integrated (interfaced) to permit interactive definition and generation of LSS concepts. The mass properties module permits interactive specification of discrete masses and their locations. The other modules permit interactive analysis of orbital transfer requirements, antenna primary beam, and attitude control requirements.

  13. Cost justification for an interactive Computer-Aided Design Drafting/Manufacturing system

    SciTech Connect

    Norton, F.J.

    1980-09-23

    Many factors influence the capital investment decision. System costs and benefits are weighed by methods of financial analysis to determine the advisability of an investment. Capital, expense, and benefits as related to Interactive Computer-Aided Design Drafting/Manufacturing (CADD/M) Systems are discussed and model calculations are included. An example is treated by the simple payback method and the more sophisticated methods of Net Present Value (NPV) and Internal Rate of Return (IRR). The NPV and IRR approaches include in the calculation the time value of money and provide a sounder foundation on which to base the purchase decision. It is hoped that an understanding of these techniques by technical personnel will make an optimum system purchase more likely.
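
    The three appraisal methods named above reduce to a few lines of code each. A minimal sketch; the purchase price, benefit stream, and discount rate below are hypothetical, not figures from the report:

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] occurs now, later entries yearly."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal rate of return via bisection on the sign change of NPV."""
    f_lo = npv(lo, cashflows)
    for _ in range(200):
        mid = (lo + hi) / 2.0
        f_mid = npv(mid, cashflows)
        if abs(f_mid) < tol:
            return mid
        if (f_lo < 0) == (f_mid < 0):
            lo, f_lo = mid, f_mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def simple_payback(cashflows):
    """Years until cumulative cashflow turns non-negative (None if never)."""
    total = 0.0
    for t, cf in enumerate(cashflows):
        total += cf
        if total >= 0:
            return t
    return None

# Hypothetical CADD/M purchase: $100k outlay, $30k annual benefit for 5 years.
flows = [-100_000] + [30_000] * 5
```

    Simple payback ignores the time value of money entirely; NPV and IRR discount each year's benefit, which is why the report recommends them as the sounder basis for the purchase decision.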

  14. A simple, low-cost, data logging pendulum built from a computer mouse

    SciTech Connect

    Gintautas, Vadas; Hubler, Alfred

    2009-01-01

    Lessons and homework problems involving a pendulum are often a big part of introductory physics classes and laboratory courses from high school to undergraduate levels. Although laboratory equipment for pendulum experiments is commercially available, it is often expensive and may not be affordable for teachers on fixed budgets, particularly in developing countries. We present a low-cost, easy-to-build rotary sensor pendulum using the existing hardware in a ball-type computer mouse. We demonstrate how this apparatus may be used to measure both the frequency and coefficient of damping of a simple physical pendulum. This easily constructed laboratory equipment makes it possible for all students to have hands-on experience with one of the most important simple physical systems.
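
    Extracting both quantities the authors mention reduces to peak detection plus the logarithmic decrement. A minimal sketch on synthetic encoder samples; the sampling rate, frequency, and damping constant are illustrative, not measurements from the apparatus:

```python
import math

def damped_pendulum_samples(f0=1.0, gamma=0.1, rate=100, seconds=10):
    """Synthetic rotary-encoder readout: angle samples of a damped pendulum."""
    w = 2 * math.pi * f0
    return [math.exp(-gamma * n / rate) * math.cos(w * n / rate)
            for n in range(seconds * rate)]

def estimate_freq_damping(theta, rate):
    """Frequency from peak spacing; damping from the log decrement."""
    peaks = [(n, theta[n]) for n in range(1, len(theta) - 1)
             if theta[n] > theta[n - 1] and theta[n] > theta[n + 1]
             and theta[n] > 0]
    (n0, a0), (n1, a1) = peaks[0], peaks[1]
    period = (n1 - n0) / rate
    freq = 1.0 / period
    gamma = math.log(a0 / a1) / period   # log decrement over one period
    return freq, gamma

theta = damped_pendulum_samples()
freq, gamma = estimate_freq_damping(theta, rate=100)
```

    With real mouse-encoder data the same two-peak computation applies; averaging over many successive peak pairs reduces the effect of quantization noise.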

  15. First- and Second-Line Bevacizumab in Addition to Chemotherapy for Metastatic Colorectal Cancer: A United States–Based Cost-Effectiveness Analysis

    PubMed Central

    Goldstein, Daniel A.; Chen, Qiushi; Ayer, Turgay; Howard, David H.; Lipscomb, Joseph; El-Rayes, Bassel F.; Flowers, Christopher R.

    2015-01-01

    Purpose The addition of bevacizumab to fluorouracil-based chemotherapy is a standard of care for previously untreated metastatic colorectal cancer. Continuation of bevacizumab beyond progression is an accepted standard of care based on a 1.4-month increase in median overall survival observed in a randomized trial. No United States–based cost-effectiveness modeling analyses are currently available addressing the use of bevacizumab in metastatic colorectal cancer. Our objective was to determine the cost effectiveness of bevacizumab in the first-line setting and when continued beyond progression from the perspective of US payers. Methods We developed two Markov models to compare the cost and effectiveness of fluorouracil, leucovorin, and oxaliplatin with or without bevacizumab in the first-line treatment and subsequent fluorouracil, leucovorin, and irinotecan with or without bevacizumab in the second-line treatment of metastatic colorectal cancer. Model robustness was addressed by univariable and probabilistic sensitivity analyses. Health outcomes were measured in life-years and quality-adjusted life-years (QALYs). Results Using bevacizumab in first-line therapy provided an additional 0.10 QALYs (0.14 life-years) at a cost of $59,361. The incremental cost-effectiveness ratio was $571,240 per QALY. Continuing bevacizumab beyond progression provided an additional 0.11 QALYs (0.16 life-years) at a cost of $39,209. The incremental cost-effectiveness ratio was $364,083 per QALY. In univariable sensitivity analyses, the variables with the greatest influence on the incremental cost-effectiveness ratio were bevacizumab cost, overall survival, and utility. Conclusion Bevacizumab provides minimal incremental benefit at high incremental cost per QALY in both the first- and second-line settings of metastatic colorectal cancer treatment. PMID:25691669
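
    The headline numbers reduce to a one-line computation: the incremental cost-effectiveness ratio is the extra cost divided by the extra QALYs. A minimal sketch; the absolute arm totals below are hypothetical, only the increments echo the abstract, and the published $571,240/QALY figure reflects unrounded QALY increments rather than the rounded 0.10 shown here:

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio: extra dollars per extra QALY."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical comparator arm: 1.20 QALYs at $60,000. Adding bevacizumab
# costs $59,361 more for 0.10 more QALYs (increments from the abstract).
ratio = icer(cost_new=119_361, qaly_new=1.30, cost_old=60_000, qaly_old=1.20)

# A ratio far above a typical willingness-to-pay threshold (e.g. $150,000
# per QALY) is the basis for the "minimal benefit at high cost" conclusion.
```
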

  16. Low cost, highly effective parallel computing achieved through a Beowulf cluster.

    PubMed

    Bitner, Marc; Skelton, Gordon

    2003-01-01

    A Beowulf cluster is a means of bringing together several computers and using software and network components to make the cluster appear and function as a single computer with multiple parallel processors. A cluster of computers can provide computing power comparable to that usually found only in very expensive supercomputers or servers. PMID:12724866

  17. Low-cost, high-performance and efficiency computational photometer design

    NASA Astrophysics Data System (ADS)

    Siewert, Sam B.; Shihadeh, Jeries; Myers, Randall; Khandhar, Jay; Ivanov, Vitaly

    2014-05-01

    Researchers at the University of Alaska Anchorage and the University of Colorado Boulder have built a low-cost, high-performance, energy-efficient, drop-in-place Computational Photometer (CP) to test in field applications ranging from port security and safety monitoring to environmental compliance monitoring and surveying. The CP integrates off-the-shelf visible-spectrum cameras with near- to long-wavelength infrared detectors and high-resolution digital snapshots in a single device. The proof of concept combines three or more detectors into a single multichannel imaging system that can time-correlate read-out, capture, and image-process all of the channels concurrently with high performance and energy efficiency. The dual-channel continuous read-out is combined with a third high-definition digital snapshot capability and has been designed using an FPGA (Field Programmable Gate Array) to capture, decimate, down-convert, re-encode, and transform images from two standard-definition CCD (Charge Coupled Device) cameras at 30 Hz. The continuous stereo vision can be time-correlated to megapixel high-definition snapshots. This proof of concept has been fabricated as a four-layer PCB (Printed Circuit Board) suitable for use in education and research for low-cost, high-efficiency field monitoring applications that need multispectral and three-dimensional imaging capabilities. Initial testing is in progress and includes field testing in ports, potential test flights in unmanned aerial systems, and future planned missions to image harsh environments in the arctic, including volcanic plumes, ice formation, and arctic marine life.

  18. 25 CFR 171.555 - What additional costs will I incur if I am granted a Payment Plan?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... following costs: (a) An administrative fee to process your Payment Plan, as required by 31 CFR 901.9. (b... AND WATER IRRIGATION OPERATION AND MAINTENANCE Financial Matters: Assessments, Billing,...

  19. 25 CFR 171.555 - What additional costs will I incur if I am granted a Payment Plan?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... following costs: (a) An administrative fee to process your Payment Plan, as required by 31 CFR 901.9. (b... AND WATER IRRIGATION OPERATION AND MAINTENANCE Financial Matters: Assessments, Billing,...

  20. 25 CFR 171.555 - What additional costs will I incur if I am granted a Payment Plan?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... following costs: (a) An administrative fee to process your Payment Plan, as required by 31 CFR 901.9. (b... AND WATER IRRIGATION OPERATION AND MAINTENANCE Financial Matters: Assessments, Billing,...

  1. 25 CFR 171.555 - What additional costs will I incur if I am granted a Payment Plan?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... following costs: (a) An administrative fee to process your Payment Plan, as required by 31 CFR 901.9. (b... AND WATER IRRIGATION OPERATION AND MAINTENANCE Financial Matters: Assessments, Billing,...

  2. Least-squares reverse-time migration with cost-effective computation and memory storage

    NASA Astrophysics Data System (ADS)

    Liu, Xuejian; Liu, Yike; Huang, Xiaogang; Li, Peng

    2016-06-01

    Least-squares reverse-time migration (LSRTM), which involves several iterations of reverse-time migration (RTM) and Born modeling procedures, can provide subsurface images with better-balanced amplitudes, higher resolution and fewer artifacts than standard migration. However, the same source wavefield is repetitively computed during the Born modeling and RTM procedures of different iterations. We developed a new LSRTM method with modified excitation-amplitude imaging conditions, where the source wavefield for RTM is forward propagated only once while the maximum amplitude and its excitation time at each grid point are stored. The RTM procedure of each iteration then involves only: (1) backward propagation of the residual between Born-modeled and acquired data, and (2) implementation of the modified excitation-amplitude imaging condition by multiplying the maximum amplitude by the back-propagated data residuals, only at the grid points that satisfy the imaging time at each time step. For a complex model, 2 or 3 local peak amplitudes and corresponding traveltimes should be confirmed and stored for all grid points so that multiarrival information of the source wavefield can be utilized for imaging. Numerical experiments on a three-layer model and the Marmousi2 model demonstrate that the proposed LSRTM method saves substantial computation and memory cost.
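
    The storage-saving idea lends itself to a compact sketch: one forward pass keeps only each grid point's peak amplitude and the time step at which it occurs, and the imaging step samples the back-propagated residual at exactly those times. A single-arrival numpy toy; the wavefields below are synthetic stand-ins, not wave-equation propagations:

```python
import numpy as np

def forward_excitation_scan(src_wavefield):
    """One forward pass: keep only each grid point's peak amplitude and the
    time step at which it occurs (the 'excitation time')."""
    # src_wavefield: (nt, nx) array of the source wavefield over time.
    exc_time = np.argmax(np.abs(src_wavefield), axis=0)
    max_amp = src_wavefield[exc_time, np.arange(src_wavefield.shape[1])]
    return max_amp, exc_time

def excitation_amplitude_image(max_amp, exc_time, residual_wavefield):
    """Imaging condition: stored peak amplitude times the back-propagated
    residual, sampled only at each point's excitation time."""
    nx = max_amp.shape[0]
    return max_amp * residual_wavefield[exc_time, np.arange(nx)]

# Toy wavefields: a pulse sweeping across 5 grid points over 10 time steps.
nt, nx = 10, 5
src = np.zeros((nt, nx))
res = np.zeros((nt, nx))
for x in range(nx):
    src[2 * x, x] = 1.0 + 0.1 * x   # peak arrives later at larger x
    res[2 * x, x] = 0.5             # residual energy at the same step

amp, t_exc = forward_excitation_scan(src)
image = excitation_amplitude_image(amp, t_exc, res)
```

    The point of the construction is the memory footprint: instead of the full (nt, nx) source wavefield, each iteration needs only two length-nx arrays per stored arrival.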

  3. Low-Dose Chest Computed Tomography for Lung Cancer Screening Among Hodgkin Lymphoma Survivors: A Cost-Effectiveness Analysis

    SciTech Connect

    Wattson, Daniel A.; Hunink, M.G. Myriam; DiPiro, Pamela J.; Das, Prajnan; Hodgson, David C.; Mauch, Peter M.; Ng, Andrea K.

    2014-10-01

    Purpose: Hodgkin lymphoma (HL) survivors face an increased risk of treatment-related lung cancer. Screening with low-dose computed tomography (LDCT) may allow detection of early stage, resectable cancers. We developed a Markov decision-analytic and cost-effectiveness model to estimate the merits of annual LDCT screening among HL survivors. Methods and Materials: Population databases and HL-specific literature informed key model parameters, including lung cancer rates and stage distribution, cause-specific survival estimates, and utilities. Relative risks accounted for radiation therapy (RT) technique, smoking status (>10 pack-years or current smokers vs not), age at HL diagnosis, time from HL treatment, and excess radiation from LDCTs. LDCT assumptions, including expected stage-shift, false-positive rates, and likely additional workup were derived from the National Lung Screening Trial and preliminary results from an internal phase 2 protocol that performed annual LDCTs in 53 HL survivors. We assumed a 3% discount rate and a willingness-to-pay (WTP) threshold of $50,000 per quality-adjusted life year (QALY). Results: Annual LDCT screening was cost effective for all smokers. A male smoker treated with mantle RT at age 25 achieved maximum QALYs by initiating screening 12 years post-HL, with a life expectancy benefit of 2.1 months and an incremental cost of $34,841/QALY. Among nonsmokers, annual screening produced a QALY benefit in some cases, but the incremental cost was not below the WTP threshold for any patient subsets. As age at HL diagnosis increased, earlier initiation of screening improved outcomes. Sensitivity analyses revealed that the model was most sensitive to the lung cancer incidence and mortality rates and expected stage-shift from screening. Conclusions: HL survivors are an important high-risk population that may benefit from screening, especially those treated in the past with large radiation fields including mantle or involved-field RT. Screening

  4. Improving the precision and speed of Euler angles computation from low-cost rotation sensor data.

    PubMed

    Janota, Aleš; Šimák, Vojtech; Nemec, Dušan; Hrbček, Jozef

    2015-01-01

    This article compares three different algorithms used to compute Euler angles from data obtained by an angular rate sensor (e.g., a MEMS gyroscope): algorithms based on a rotational matrix, on transforming angular velocity to time derivatives of the Euler angles, and on a unit quaternion expressing rotation. The algorithms are compared by their computational efficiency and the accuracy of Euler angle estimation. If the attitude of the object is computed only from gyroscope data, the quaternion-based algorithm seems most suitable (having similar accuracy to the matrix-based algorithm, but taking approx. 30% fewer clock cycles on an 8-bit microcomputer). Integration of the Euler angles' time derivatives has a singularity and therefore is not accurate over the full range of the object's attitude. Since the error in every real gyroscope system tends to increase with time due to offset and thermal drift, we also propose compensation measures based on additional sensors (a magnetic compass and accelerometer). Vector data from these secondary sensors has to be transformed into the inertial frame of reference. While transformation of a vector by a matrix is slightly faster than by a quaternion, the compensated sensor system utilizing a matrix-based algorithm can be approximately 10% faster than the system utilizing quaternions (depending on implementation and hardware). PMID:25806874

  5. Improving the Precision and Speed of Euler Angles Computation from Low-Cost Rotation Sensor Data

    PubMed Central

    Janota, Aleš; Šimák, Vojtech; Nemec, Dušan; Hrbček, Jozef

    2015-01-01

    This article compares three different algorithms used to compute Euler angles from data obtained by an angular rate sensor (e.g., a MEMS gyroscope): algorithms based on a rotational matrix, on transforming angular velocity to time derivatives of the Euler angles, and on a unit quaternion expressing rotation. The algorithms are compared by their computational efficiency and the accuracy of Euler angle estimation. If the attitude of the object is computed only from gyroscope data, the quaternion-based algorithm seems most suitable (having similar accuracy to the matrix-based algorithm, but taking approx. 30% fewer clock cycles on an 8-bit microcomputer). Integration of the Euler angles' time derivatives has a singularity and therefore is not accurate over the full range of the object's attitude. Since the error in every real gyroscope system tends to increase with time due to offset and thermal drift, we also propose compensation measures based on additional sensors (a magnetic compass and accelerometer). Vector data from these secondary sensors has to be transformed into the inertial frame of reference. While transformation of a vector by a matrix is slightly faster than by a quaternion, the compensated sensor system utilizing a matrix-based algorithm can be approximately 10% faster than the system utilizing quaternions (depending on implementation and hardware). PMID:25806874
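
    The quaternion approach the authors favor can be sketched compactly: accumulate each gyro sample as an incremental rotation quaternion, and convert to Euler angles only when needed, avoiding the Euler-rate singularity. A minimal sketch on simulated constant-rate gyro data, not the paper's implementation:

```python
import math

def quat_mul(a, b):
    """Hamilton product of two (w, x, y, z) quaternions."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def integrate_gyro(q, omega, dt):
    """Advance attitude quaternion q by body rates omega (rad/s) over dt."""
    wx, wy, wz = (w * dt for w in omega)
    angle = math.sqrt(wx*wx + wy*wy + wz*wz)
    if angle < 1e-12:
        return q
    s = math.sin(angle / 2) / angle
    dq = (math.cos(angle / 2), wx * s, wy * s, wz * s)
    return quat_mul(q, dq)

def quat_to_euler(q):
    """Quaternion to aerospace (roll, pitch, yaw) Euler angles in radians."""
    w, x, y, z = q
    roll = math.atan2(2*(w*x + y*z), 1 - 2*(x*x + y*y))
    pitch = math.asin(max(-1.0, min(1.0, 2*(w*y - z*x))))
    yaw = math.atan2(2*(w*z + x*y), 1 - 2*(y*y + z*z))
    return roll, pitch, yaw

# Simulated gyro: rotate about the body z-axis at 0.5 rad/s for 2 s.
q = (1.0, 0.0, 0.0, 0.0)
for _ in range(200):
    q = integrate_gyro(q, (0.0, 0.0, 0.5), 0.01)
roll, pitch, yaw = quat_to_euler(q)
```

    On a small microcontroller, periodic renormalization of q keeps floating-point drift from accumulating; the conversion to Euler angles is done only for output, as the paper recommends.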

  6. Protecting child health and nutrition status with ready-to-use food in addition to food assistance in urban Chad: a cost-effectiveness analysis

    PubMed Central

    2013-01-01

    Background Despite growing interest in the use of lipid nutrient supplements for preventing child malnutrition and morbidity, there is inconclusive evidence on the effectiveness, and no evidence on the cost-effectiveness, of this strategy. Methods A cost-effectiveness analysis was conducted comparing costs and outcomes of two arms of a cluster randomized controlled trial implemented in eastern Chad during the 2010 hunger gap by Action contre la Faim France and Ghent University. This trial assessed the effect on child malnutrition and morbidity of a 5-month general distribution of staple rations, or staple rations plus a ready-to-use supplementary food (RUSF). RUSF was distributed to households with a child aged 6–36 months who was not acutely malnourished (weight-for-height >= 80% of the NCHS reference median, and absence of bilateral pitting edema), to prevent acute malnutrition in these children. While the addition of RUSF to a staple ration did not result in a significant reduction in wasting rates, cost-effectiveness was assessed using successful secondary outcomes of cases of diarrhea and anemia (hemoglobin <110 g/L) averted among children receiving RUSF. Total costs of the program and incremental costs of RUSF and related management and logistics were estimated using accounting records and key informant interviews, and include costs to institutions and communities. An activity-based costing methodology was applied and incremental costs were calculated per episode of diarrhea and case of anemia averted. Results Adding RUSF to a general food distribution increased total costs by 23%, resulting in an additional cost per child of 374 EUR, and an incremental cost per episode of diarrhea averted of 1,083 EUR and per case of anemia averted of 3,627 EUR. Conclusions Adding RUSF to a staple ration was less cost-effective than other standard intervention options for averting diarrhea and anemia. This strategy holds potential to address a broad array of health and

  7. A low-computational-cost inverse heat transfer technique for convective heat transfer measurements in hypersonic flows

    NASA Astrophysics Data System (ADS)

    Avallone, F.; Greco, C. S.; Schrijer, F. F. J.; Cardone, G.

    2015-04-01

    The measurement of the convective wall heat flux in hypersonic flows may be particularly challenging in the presence of high-temperature gradients and when using high-thermal-conductivity materials. In this case, the solution of multidimensional problems is necessary, but it considerably increases the computational cost. In this paper, a low-computational-cost inverse data reduction technique is presented. It uses a recursive least-squares approach in combination with the trust-region-reflective algorithm as optimization procedure. The computational cost is reduced by performing the discrete Fourier transform on the discrete convective heat flux function and by identifying the most relevant coefficients as objects of the optimization algorithm. In the paper, the technique is validated by means of both synthetic data, built in order to reproduce physical conditions, and experimental data, carried out in the Hypersonic Test Facility Delft at Mach 7.5 on two wind tunnel models having different thermal properties.
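
    The cost-reduction idea, optimizing a handful of Fourier coefficients of the heat-flux history instead of every time sample, can be illustrated with a linear toy problem. The sensor model below is an invented causal smoothing kernel, not the paper's conduction model, and the fit is plain linear least squares rather than the trust-region-reflective optimizer:

```python
import numpy as np

rng = np.random.default_rng(0)
nt, n_coef = 200, 7                  # time samples; retained Fourier coefficients
t = np.linspace(0.0, 1.0, nt, endpoint=False)

# Truncated real Fourier basis for the unknown heat-flux history q(t).
cols = [np.ones(nt)]
for k in range(1, (n_coef - 1) // 2 + 1):
    cols += [np.cos(2 * np.pi * k * t), np.sin(2 * np.pi * k * t)]
B = np.column_stack(cols)            # shape (nt, n_coef)

# Toy causal sensor model: measured temperature is an exponentially
# weighted running sum of past flux (a stand-in for 1-D conduction).
i = np.arange(nt)
L = np.where(i[:, None] >= i[None, :],
             np.exp(-0.05 * (i[:, None] - i[None, :])), 0.0) / nt

q_true = 1.0 + 0.8 * np.sin(2 * np.pi * t) + 0.3 * np.cos(4 * np.pi * t)
T_meas = L @ q_true + rng.normal(0.0, 1e-4, nt)

# Inverse step: fit the few Fourier coefficients, not all nt flux samples,
# then reconstruct the flux history from them.
c, *_ = np.linalg.lstsq(L @ B, T_meas, rcond=None)
q_est = B @ c
```

    Shrinking the unknowns from nt samples to n_coef coefficients is what cuts the cost of each inverse iteration; the paper applies the same idea inside a nonlinear optimization loop.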

  8. Grading Multiple Choice Exams with Low-Cost and Portable Computer-Vision Techniques

    NASA Astrophysics Data System (ADS)

    Fisteus, Jesus Arias; Pardo, Abelardo; García, Norberto Fernández

    2013-08-01

    Although technology for automatic grading of multiple choice exams has existed for several decades, it is not yet as widely available or affordable as it should be. The main reasons preventing this adoption are the cost and the complexity of the setup procedures. In this paper, Eyegrade, a system for automatic grading of multiple choice exams is presented. While most current solutions are based on expensive scanners, Eyegrade offers a truly low-cost solution requiring only a regular off-the-shelf webcam. Additionally, Eyegrade performs both mark recognition as well as optical character recognition of handwritten student identification numbers, which avoids the use of bubbles in the answer sheet. When compared with similar webcam-based systems, the user interface in Eyegrade has been designed to provide a more efficient and error-free data collection procedure. The tool has been validated with a set of experiments that show the ease of use (both setup and operation), the reduction in grading time, and an increase in the reliability of the results when compared with conventional, more expensive systems.

  9. Computations on the primary photoreaction of Br2 with CO2: stepwise vs concerted addition of Br atoms.

    PubMed

    Xu, Kewei; Korter, Timothy M; Braiman, Mark S

    2015-04-01

    It was proposed previously that Br2-sensitized photolysis of liquid CO2 proceeds through a metastable primary photoproduct, CO2Br2. Possible mechanisms for such a photoreaction are explored here computationally. First, it is shown that the CO2Br radical is not stable in any geometry. This rules out a free-radical mechanism, for example, photochemical splitting of Br2 followed by stepwise addition of Br atoms to CO2, which in turn accounts for the lack of previously observed Br2 + CO2 photochemistry in the gas phase. A possible alternative mechanism in the liquid phase is formation of a weakly bound CO2:Br2 complex, followed by concerted photoaddition of Br2. This hypothesis is suggested by the previously published spectroscopic detection of a binary CO2:Br2 complex in the supersonically cooled gas phase. We compute a global binding-energy minimum of -6.2 kJ mol(-1) for such complexes, in a linear geometry. Two additional local minima were computed for perpendicular (C2v) and nearly parallel asymmetric planar geometries, both with binding energies near -5.4 kJ mol(-1). In these two latter geometries, C-Br and O-Br bond distances are simultaneously in the range of 3.5-3.8 Å, that is, perhaps suitable for a concerted photoaddition under the temperature and pressure conditions where Br2 + CO2 photochemistry has been observed. PMID:25767936

  10. Phase Transition in Computing Cost of Overconstrained NP-Complete 3-SAT Problems

    NASA Astrophysics Data System (ADS)

    Woodson, Adam; O'Donnell, Thomas; Maniloff, Peter

    2002-03-01

    Many intractable, NP-Complete problems such as Traveling Salesman (TSP) and 3-Satisfiability (3-Sat), which arise in hundreds of computer science, industrial and commercial applications, are now known to exhibit phase transitions in computational cost. While these problems appear to not have any structure which would make them amenable to attack with quantum computing, their critical behavior may allow physical insights derived from statistical mechanics and critical theory to shed light on these computationally ``hardest" of problems. While computational theory indicates that ``the intractability of the NP-Complete class resides solely in the exponential growth of the possible solutions" with the number of variables, n, the present work instead investigates the complex patterns of ``overlap" amongst 3-SAT clauses (their combined effects) when n-tuples of these act in succession to reduce the space of valid solutions. An exhaustive-search algorithm was used to eliminate `bad' states from amongst the `good' states residing within the spaces of all 2^n possible solutions of randomly generated 3-Sat problems. No backtracking nor optimization heuristics were employed, nor was problem structure exploited (i.e., typical cases were generated), and the (k=3)-Sat propositional logic problems generated were in standard, conjunctive normal form (CNF). Each problem had an effectively infinite number of clauses, m (i.e., with r = m/n >= 10), to ensure that every problem would not be satisfiable (i.e., that each would fail), and duplicate clauses were not permitted. This process was repeated for each of several low values of n (i.e., 4 <= n <= 20). The entire history of solution-state elimination as successive clauses were applied was archived until, in each instance, sufficient clauses were applied to kill all possible solutions. An asymmetric, sigmoid-shaped phase transition is observed in F_g = F_g(m'/n), the fraction of the original 2^n ``good" solutions remaining valid as a
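
    The clause-by-clause elimination experiment is easy to reproduce in miniature. A sketch under stated assumptions: unlike the study, this version permits duplicate clauses, uses a single small n, and records the surviving fraction after every clause:

```python
import itertools
import random

def random_3sat_clause(n, rng):
    """A random 3-SAT clause: three distinct variables, each possibly negated."""
    vars_ = rng.sample(range(n), 3)
    return tuple((v, rng.random() < 0.5) for v in vars_)  # (variable, negated?)

def satisfies(assignment, clause):
    """True if at least one literal of the clause is true under assignment."""
    return any((not assignment[v]) if neg else assignment[v]
               for v, neg in clause)

def survival_curve(n, rng):
    """Fraction of the 2^n assignments still satisfying every clause applied
    so far, recorded after each new clause, until none survive."""
    alive = set(itertools.product([False, True], repeat=n))
    curve = []
    while alive:
        clause = random_3sat_clause(n, rng)
        alive = {a for a in alive if satisfies(a, clause)}
        curve.append(len(alive) / 2 ** n)
    return curve

rng = random.Random(42)
curve = survival_curve(n=10, rng=rng)
```

    Each random clause falsifies exactly the assignments that set all three of its literals false, so the very first clause always removes exactly 1/8 of the 2^n assignments; later clauses remove smaller and overlapping fractions, producing the sigmoid-shaped decay toward zero survivors that the study characterizes.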

  11. Financial Quality Control of In-Patient Chemotherapy in Germany: Are Additional Payments Cost-Covering for Pharmaco-Oncological Expenses?

    PubMed Central

    Jacobs, Volker R.; Mallmann, Peter

    2011-01-01

    Summary Background Cost-covering in-patient care is increasingly important for hospital providers in Germany, especially with regard to expensive oncological pharmaceuticals. Additional payments (Zusatzentgelte; ZE) on top of flat rate diagnose-related group (DRG) reimbursement can be claimed by hospitals for in-patient use of selected medications. To verify cost coverage of in-patient chemotherapies, the costs of medication were compared to their revenues. Method From January to June 2010, a retrospective cost-revenue study was performed at a German obstetrics/gynecology university clinic. The hospital's pharmacy list of inpatient oncological therapies for breast and gynecological cancer was checked for accuracy and compared with the documented ZEs and the costs and revenues for each oncological application. Results N = 45 in-patient oncological therapies were identified in n = 18 patients, as well as n = 7 bisphosphonate applications; n = 11 ZEs were documented. Costs for oncological medication were € 33,752. The corresponding ZE revenues amounted to only € 13,980, resulting in a loss of € 19,772. All in-patient oncological therapies performed were not cost-covering. Data discrepancy, incorrect documentation and cost attribution, and process aborts were identified. Conclusions Routine financial quality control at the medicine-pharmacy administration interface is implemented, with monthly comparison of costs and revenues, as well as admission status. Non-cost-covering therapies for in-patients should be converted to out-patient therapies. Necessary adjustments of clinic processes are made according to these results, to avoid future losses. PMID:21673822

  12. Low cost, high resolution x-ray detector system for digital radiography and computed tomography

    SciTech Connect

    Smith, C.R.; Erker, J.W.

    1993-12-31

    The authors have designed and evaluated a novel design of line array x-ray detector for use with digital radiography (DR) and computed tomography (CT) systems. The Radiographic Line Scan (RLS) detector is less than half the cost of discrete multi-channel line array detectors, yet provides the potential for resolution to less than 25 µm at energies of 420 kV. The RLS detector consists of a scintillator fiber-optically coupled to a thermo-electrically cooled line array CCD. Gadolinium oxysulfide screen material has been used as the scintillator, in thicknesses up to 250 µm. Scintillating glass, which is formed into a fiber optic bundle, has also been used in thicknesses up to 2 mm. The large 2.5 mm by 25 µm CCD cells provide high dynamic range while preserving high resolution; the 2.5 mm dimension is oriented in the x-ray absorption direction while the 25 µm dimension is oriented in the resolution direction. Servo-controlled thermo-electric cooling of the CCD to a fixed temperature provides reduction of dark current and stabilization of the output. Greater dynamic range is achieved by reducing the dark current, while output stabilization reduces the need for frequent calibration of the detector. Measured performance characteristics are presented along with DR and CT images produced using the RLS detector.

  13. EPA evaluation of the SYNERGY-1 fuel additive under Section 511 of the Motor Vehicle Information and Cost Savings Act. Technical report

    SciTech Connect

    Syria, S.L.

    1981-06-01

    This document announces the conclusions of the EPA evaluation of the 'SYNERGY-1' device under provisions of Section 511 of the Motor Vehicle Information and Cost Savings Act. This additive is intended to improve fuel economy and exhaust emission levels of two and four cycle gasoline fueled engines.

  14. 10 CFR Appendix I to Part 504 - Procedures for the Computation of the Real Cost of Capital

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 4 2012-01-01 2012-01-01 false Procedures for the Computation of the Real Cost of Capital I Appendix I to Part 504 Energy DEPARTMENT OF ENERGY (CONTINUED) ALTERNATE FUELS EXISTING...+B×R_m where: R_f = the risk-free interest rate—the average of the most recent auction rates of...
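The truncated formula fragment in this record appears to be the capital asset pricing model that the appendix uses to estimate the cost of equity capital. A minimal sketch under that assumption, pairing CAPM with the Fisher relation to convert a nominal rate into a real one; the function names and all input figures are illustrative, not taken from the regulation:

```python
def cost_of_equity(risk_free, beta, market_premium):
    """CAPM: expected return = R_f + B * (market risk premium)."""
    return risk_free + beta * market_premium

def real_rate(nominal, inflation):
    """Fisher relation: convert a nominal rate to a real (inflation-adjusted) rate."""
    return (1.0 + nominal) / (1.0 + inflation) - 1.0

# Illustrative inputs: 4% risk-free rate, beta of 1.2, 6% market premium, 3% inflation.
nominal_k = cost_of_equity(0.04, 1.2, 0.06)   # nominal cost of equity
real_k = real_rate(nominal_k, 0.03)           # real cost of equity
```

The regulation's actual procedure averages recent auction rates to obtain R_f; the sketch simply takes that rate as a given input.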

  15. A Low-Cost Computer-Controlled Arduino-Based Educational Laboratory System for Teaching the Fundamentals of Photovoltaic Cells

    ERIC Educational Resources Information Center

    Zachariadou, K.; Yiasemides, K.; Trougkakos, N.

    2012-01-01

    We present a low-cost, fully computer-controlled, Arduino-based, educational laboratory (SolarInsight) to be used in undergraduate university courses concerned with electrical engineering and physics. The major goal of the system is to provide students with the necessary instrumentation, software tools and methodology in order to learn fundamental…

  16. 12 CFR Appendix L to Part 1026 - Assumed Loan Periods for Computations of Total Annual Loan Cost Rates

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 8 2013-01-01 2013-01-01 false Assumed Loan Periods for Computations of Total Annual Loan Cost Rates L Appendix L to Part 1026 Banks and Banking BUREAU OF CONSUMER FINANCIAL PROTECTION TRUTH IN LENDING (REGULATION Z) Pt. 1026, App. L Appendix L to Part 1026—Assumed Loan Periods...

  17. 12 CFR Appendix L to Part 1026 - Assumed Loan Periods for Computations of Total Annual Loan Cost Rates

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 8 2012-01-01 2012-01-01 false Assumed Loan Periods for Computations of Total Annual Loan Cost Rates L Appendix L to Part 1026 Banks and Banking BUREAU OF CONSUMER FINANCIAL PROTECTION TRUTH IN LENDING (REGULATION Z) Pt. 1026, App. L Appendix L to Part 1026—Assumed Loan Periods...

  18. 12 CFR Appendix K to Part 226 - Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Reverse Mortgage Transactions K Appendix K to Part 226 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM TRUTH IN LENDING (REGULATION Z) Pt. 226, App. K Appendix K to Part 226—Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions...
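The total annual loan cost rate described in this appendix is, in essence, the annualized rate at which the consumer's net loan advances grow into the total amount owed over an assumed loan period. A minimal sketch of that idea for the simplest single-advance case; the function name and figures are illustrative and omit the regulation's full iterative procedure:

```python
def total_annual_loan_cost_rate(net_advance, projected_balance, years):
    """Annual rate at which a single net advance compounds to the projected balance."""
    return (projected_balance / net_advance) ** (1.0 / years) - 1.0

# Illustrative: a $50,000 net advance projected to become a $100,000 balance in 10 years.
rate = total_annual_loan_cost_rate(50_000.0, 100_000.0, 10)
```

The actual Regulation Z computation handles multiple advances over several assumed loan periods, so a root-finding step over the payment stream would replace the closed-form expression above.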

  19. Resource Utilization and Costs during the Initial Years of Lung Cancer Screening with Computed Tomography in Canada

    PubMed Central

    Lam, Stephen; Tammemagi, Martin C.; Evans, William K.; Leighl, Natasha B.; Regier, Dean A.; Bolbocean, Corneliu; Shepherd, Frances A.; Tsao, Ming-Sound; Manos, Daria; Liu, Geoffrey; Atkar-Khattra, Sukhinder; Cromwell, Ian; Johnston, Michael R.; Mayo, John R.; McWilliams, Annette; Couture, Christian; English, John C.; Goffin, John; Hwang, David M.; Puksa, Serge; Roberts, Heidi; Tremblay, Alain; MacEachern, Paul; Burrowes, Paul; Bhatia, Rick; Finley, Richard J.; Goss, Glenwood D.; Nicholas, Garth; Seely, Jean M.; Sekhon, Harmanjatinder S.; Yee, John; Amjadi, Kayvan; Cutz, Jean-Claude; Ionescu, Diana N.; Yasufuku, Kazuhiro; Martel, Simon; Soghrati, Kamyar; Sin, Don D.; Tan, Wan C.; Urbanski, Stefan; Xu, Zhaolin; Peacock, Stuart J.

    2014-01-01

    Background: It is estimated that millions of North Americans would qualify for lung cancer screening and that billions of dollars of national health expenditures would be required to support population-based computed tomography lung cancer screening programs. The decision to implement such programs should be informed by data on resource utilization and costs. Methods: Resource utilization data were collected prospectively from 2059 participants in the Pan-Canadian Early Detection of Lung Cancer Study using low-dose computed tomography (LDCT). Participants who had 2% or greater lung cancer risk over 3 years using a risk prediction tool were recruited from seven major cities across Canada. A cost analysis was conducted from the Canadian public payer’s perspective for resources that were used for the screening and treatment of lung cancer in the initial years of the study. Results: The average per-person cost for screening individuals with LDCT was $453 (95% confidence interval [CI], $400–$505) for the initial 18-months of screening following a baseline scan. The screening costs were highly dependent on the detected lung nodule size, presence of cancer, screening intervention, and the screening center. The mean per-person cost of treating lung cancer with curative surgery was $33,344 (95% CI, $31,553–$34,935) over 2 years. This was lower than the cost of treating advanced-stage lung cancer with chemotherapy, radiotherapy, or supportive care alone, ($47,792; 95% CI, $43,254–$52,200; p = 0.061). Conclusion: In the Pan-Canadian study, the average cost to screen individuals with a high risk for developing lung cancer using LDCT and the average initial cost of curative intent treatment were lower than the average per-person cost of treating advanced stage lung cancer which infrequently results in a cure. PMID:25105438

  20. Additive manufacturing of liquid/gas diffusion layers for low-cost and high-efficiency hydrogen production

    DOE PAGES

    Mo, Jingke; Zhang, Feng -Yuan; Dehoff, Ryan R.; Peter, William H.; Toops, Todd J.; Green, Jr., Johney Boyd

    2016-01-14

    The electron beam melting (EBM) additive manufacturing technology was used for the first time to fabricate titanium liquid/gas diffusion media with high corrosion resistance and well-controllable multifunctional parameters, including two-phase transport and excellent electric/thermal conductivities. Their applications in proton exchange membrane electrolyzer cells were explored in-situ in a cell and characterized ex-situ with SEM and XRD. Compared with conventional woven liquid/gas diffusion layers (LGDLs), much better performance was obtained with EBM-fabricated LGDLs due to their significant reduction of ohmic loss. The EBM technology exhibited several distinct advantages in fabricating gas diffusion layers: well-controllable pore morphology and structure, rapid prototyping, fast manufacturing, high customizability, and economy. In addition, by taking advantage of additive manufacturing, it is possible to fabricate complicated three-dimensional designs of virtually any shape from a digital model as one single solid object faster, cheaper, and easier, especially in titanium. More importantly, this development provides LGDLs with control of pore size, pore shape, and pore distribution, and therefore porosity and permeability, which will be very valuable for developing models and validating simulations of electrolyzers with optimal and repeatable performance. Further, it points to a manufacturing solution that greatly simplifies PEMEC/fuel cell components and allows the LGDLs to be coupled with other parts, since they can be easily integrated together with this advanced manufacturing process.

  1. Low cost computer subsystem for the Solar Electric Propulsion Stage (SEPS)

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The Solar Electric Propulsion Stage (SEPS) subsystem which consists of the computer, custom input/output (I/O) unit, and tape recorder for mass storage of telemetry data was studied. Computer software and interface requirements were developed along with computer and I/O unit design parameters. Redundancy implementation was emphasized. Reliability analysis was performed for the complete command computer sybsystem. A SEPS fault tolerant memory breadboard was constructed and its operation demonstrated.

  2. Turbulence computations with 3-D small-scale additive turbulent decomposition and data-fitting using chaotic map combinations

    SciTech Connect

    Mukerji, S.

    1997-12-31

    Although the equations governing turbulent fluid flow, the Navier-Stokes (N.-S.) equations, have been known for well over a century and there is a clear technological necessity in obtaining solutions to these equations, turbulence remains one of the principal unsolved problems in physics today. It is still not possible to make accurate quantitative predictions about turbulent flows without relying heavily on empirical data. In principle, it is possible to obtain turbulent solutions from a direct numerical simulation (DNS) of the N.-S. equations. The author first provides a brief introduction to the dynamics of turbulent flows. The N.-S. equations, which govern fluid flow, are described thereafter. Then he gives a brief overview of DNS calculations and where they stand at present. He next introduces the two most popular approaches for doing turbulent computations currently in use, namely, Reynolds averaging of the N.-S. equations (RANS) and large-eddy simulation (LES). Approximations, often ad hoc ones, are present in these methods because use is made of heuristic models for turbulence quantities (the Reynolds stresses) which are otherwise unknown. He then introduces a new computational method called additive turbulent decomposition (ATD), the small-scale version of which is the topic of this research. The rest of the thesis is organized as follows. In Chapter 2 he describes the ATD procedure in greater detail: how dependent variables are split and the decomposition into large- and small-scale sets of equations. In Chapter 3 the spectral projection of the small-scale momentum equations is derived in detail. In Chapter 4 results of the computations with the small-scale ATD equations are presented. In Chapter 5 he describes the data-fitting procedure which can be used to directly specify the parameters of a chaotic-map turbulence model.

  3. Low-cost computing and network communication for a point-of-care device to perform a 3-part leukocyte differential

    NASA Astrophysics Data System (ADS)

    Powless, Amy J.; Feekin, Lauren E.; Hutcheson, Joshua A.; Alapat, Daisy V.; Muldoon, Timothy J.

    2016-03-01

    Point-of-care approaches for 3-part leukocyte differentials (granulocyte, monocyte, and lymphocyte), traditionally performed using a hematology analyzer within a panel of tests called a complete blood count (CBC), are essential not only to reduce cost but to provide faster results in low resource areas. Recent developments in lab-on-a-chip devices have shown promise in reducing the size and reagents used, relating to a decrease in overall cost. Furthermore, smartphone diagnostic approaches have shown much promise in the area of point-of-care diagnostics, but the relatively high per-unit cost may limit their utility in some settings. We present here a method to reduce computing cost of a simple epi-fluorescence imaging system using a Raspberry Pi (single-board computer, <$40) to perform a 3-part leukocyte differential comparable to results from a hematology analyzer. This system uses a USB color camera in conjunction with a leukocyte-selective vital dye (acridine orange) in order to determine a leukocyte count and differential from a low volume (<20 microliters) of whole blood obtained via fingerstick. Additionally, the system utilizes a "cloud-based" approach to send image data from the Raspberry Pi to a main server and return results back to the user, offloading the bulk of the computational work. Six images were acquired per minute with up to 200 cells per field of view. Preliminary results showed that the differential count varied significantly in monocytes with a 1 minute time difference, indicating the importance of time-gating to produce an accurate/consistent differential.
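The cloud-based flow described above (device packages an image, server returns counts) can be sketched client-side in a few lines. This is a minimal illustration, not the paper's implementation; the JSON schema, field names, and cell counts are all assumptions:

```python
import base64
import json

def build_payload(image_bytes, device_id):
    """Package a captured field image for upload to the analysis server.
    (Schema and field names are illustrative, not from the paper.)"""
    return json.dumps({
        "device": device_id,
        "image": base64.b64encode(image_bytes).decode("ascii"),
    })

def parse_result(response_text):
    """Decode the server's leukocyte counts and normalize to fractions."""
    counts = json.loads(response_text)
    total = sum(counts.values())
    return {cell_type: n / total for cell_type, n in counts.items()}

# Illustrative round trip: the payload would be POSTed over HTTP in practice.
payload = build_payload(b"\x89PNG...", "rpi-01")
fractions = parse_result('{"granulocytes": 60, "monocytes": 8, "lymphocytes": 32}')
```

Keeping only image capture and payload assembly on the Raspberry Pi is what lets a sub-$40 board drive the analysis.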

  4. Selecting an Architecture for a Safety-Critical Distributed Computer System with Power, Weight and Cost Considerations

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2014-01-01

    This report presents an example of the application of multi-criteria decision analysis to the selection of an architecture for a safety-critical distributed computer system. The design problem includes constraints on minimum system availability and integrity, and the decision is based on the optimal balance of power, weight and cost. The analysis process includes the generation of alternative architectures, evaluation of individual decision criteria, and the selection of an alternative based on overall value. In the example presented here, iterative application of the quantitative evaluation process made it possible to deliberately generate an alternative architecture that is superior to all others regardless of the relative importance of cost.
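The overall-value selection described above is commonly computed as a weighted sum of per-criterion scores. A minimal sketch, assuming that form; the weights and scores below are illustrative, not the report's figures:

```python
def overall_value(scores, weights):
    """Weighted-sum multi-criteria score; weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(w * scores[criterion] for criterion, w in weights.items())

# Illustrative criterion weights and normalized scores (higher is better).
weights = {"power": 0.3, "weight": 0.3, "cost": 0.4}
alternatives = {
    "arch_a": {"power": 0.8, "weight": 0.6, "cost": 0.7},
    "arch_b": {"power": 0.9, "weight": 0.9, "cost": 0.9},  # dominates arch_a
}
best = max(alternatives, key=lambda name: overall_value(alternatives[name], weights))
```

An alternative that scores at least as well on every criterion, as arch_b does here, wins under any weighting, which mirrors the report's finding of an architecture superior regardless of the relative importance of cost.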

  5. Neural Correlates of Task Cost for Stance Control with an Additional Motor Task: Phase-Locked Electroencephalogram Responses

    PubMed Central

    Hwang, Ing-Shiou; Huang, Cheng-Ya

    2016-01-01

    With appropriate reallocation of central resources, the ability to maintain an erect posture is not necessarily degraded by a concurrent motor task. This study investigated the neural control of a particular postural-suprapostural procedure involving brain mechanisms to solve crosstalk between posture and motor subtasks. Participants completed a single posture task and a dual-task while concurrently conducting force-matching and maintaining a tilted stabilometer stance at a target angle. Stabilometer movements and event-related potentials (ERPs) were recorded. The added force-matching task increased the irregularity of postural response rather than the size of postural response prior to force-matching. In addition, the added force-matching task during stabilometer stance led to marked topographic ERP modulation, with greater P2 positivity in the frontal and sensorimotor-parietal areas of the N1-P2 transitional phase and in the sensorimotor-parietal area of the late P2 phase. The time-frequency distribution of the ERP primary principal component revealed that the dual-task condition manifested more pronounced delta (1–4 Hz) and beta (13–35 Hz) synchronizations but suppressed theta activity (4–8 Hz) before force-matching. The dual-task condition also manifested coherent fronto-parietal delta activity in the P2 period. In addition to a decrease in postural regularity, this study reveals spatio-temporal and temporal-spectral reorganizations of ERPs in the fronto-sensorimotor-parietal network due to the added suprapostural motor task. For this particular set of postural and suprapostural tasks, the behavioral and neural data suggest a facilitatory role of autonomous postural response and central resource expansion with increasing interregional interactions for task-shift and planning the motor-suprapostural task. PMID:27010634

  6. Neural Correlates of Task Cost for Stance Control with an Additional Motor Task: Phase-Locked Electroencephalogram Responses.

    PubMed

    Hwang, Ing-Shiou; Huang, Cheng-Ya

    2016-01-01

    With appropriate reallocation of central resources, the ability to maintain an erect posture is not necessarily degraded by a concurrent motor task. This study investigated the neural control of a particular postural-suprapostural procedure involving brain mechanisms to solve crosstalk between posture and motor subtasks. Participants completed a single posture task and a dual-task while concurrently conducting force-matching and maintaining a tilted stabilometer stance at a target angle. Stabilometer movements and event-related potentials (ERPs) were recorded. The added force-matching task increased the irregularity of postural response rather than the size of postural response prior to force-matching. In addition, the added force-matching task during stabilometer stance led to marked topographic ERP modulation, with greater P2 positivity in the frontal and sensorimotor-parietal areas of the N1-P2 transitional phase and in the sensorimotor-parietal area of the late P2 phase. The time-frequency distribution of the ERP primary principal component revealed that the dual-task condition manifested more pronounced delta (1-4 Hz) and beta (13-35 Hz) synchronizations but suppressed theta activity (4-8 Hz) before force-matching. The dual-task condition also manifested coherent fronto-parietal delta activity in the P2 period. In addition to a decrease in postural regularity, this study reveals spatio-temporal and temporal-spectral reorganizations of ERPs in the fronto-sensorimotor-parietal network due to the added suprapostural motor task. For this particular set of postural and suprapostural tasks, the behavioral and neural data suggest a facilitatory role of autonomous postural response and central resource expansion with increasing interregional interactions for task-shift and planning the motor-suprapostural task. PMID:27010634

  7. Low-Cost Magnetic Stirrer from Recycled Computer Parts with Optional Hot Plate

    ERIC Educational Resources Information Center

    Guidote, Armando M., Jr.; Pacot, Giselle Mae M.; Cabacungan, Paul M.

    2015-01-01

    Magnetic stirrers and hot plates are key components of science laboratories. However, these are not readily available in many developing countries due to their high cost. This article describes the design of a low-cost magnetic stirrer with hot plate from recycled materials. Some of the materials used are neodymium magnets and CPU fans from…

  8. Linking process, structure, property, and performance for metal-based additive manufacturing: computational approaches with experimental support

    NASA Astrophysics Data System (ADS)

    Smith, Jacob; Xiong, Wei; Yan, Wentao; Lin, Stephen; Cheng, Puikei; Kafka, Orion L.; Wagner, Gregory J.; Cao, Jian; Liu, Wing Kam

    2016-04-01

    Additive manufacturing (AM) methods for rapid prototyping of 3D materials (3D printing) have become increasingly popular with a particular recent emphasis on those methods used for metallic materials. These processes typically involve an accumulation of cyclic phase changes. The widespread interest in these methods is largely stimulated by their unique ability to create components of considerable complexity. However, modeling such processes is exceedingly difficult due to the highly localized and drastic material evolution that often occurs over the course of the manufacture time of each component. Final product characterization and validation are currently driven primarily by experimental means as a result of the lack of robust modeling procedures. In the present work, the authors discuss primary detrimental hurdles that have plagued effective modeling of AM methods for metallic materials while also providing logical speculation into preferable research directions for overcoming these hurdles. The primary focus of this work encompasses the specific areas of high-performance computing, multiscale modeling, materials characterization, process modeling, experimentation, and validation for final product performance of additively manufactured metallic components.

  9. Usability of a Low-Cost Head Tracking Computer Access Method following Stroke.

    PubMed

    Mah, Jasmine; Jutai, Jeffrey W; Finestone, Hillel; Mckee, Hilary; Carter, Melanie

    2015-01-01

    Assistive technology devices for computer access can facilitate social reintegration and promote independence for people who have had a stroke. This work describes the exploration of the usefulness and acceptability of a new computer access device called the Nouse™ (Nose-as-mouse). The device uses standard webcam and video recognition algorithms to map the movement of the user's nose to a computer cursor, thereby allowing hands-free computer operation. Ten participants receiving in- or outpatient stroke rehabilitation completed a series of standardized and everyday computer tasks using the Nouse™ and then completed a device usability questionnaire. Task completion rates were high (90%) for computer activities only in the absence of time constraints. Most of the participants were satisfied with ease of use (70%) and liked using the Nouse™ (60%), indicating they could resume most of their usual computer activities apart from word-processing using the device. The findings suggest that hands-free computer access devices like the Nouse™ may be an option for people who experience upper motor impairment caused by stroke and are highly motivated to resume personal computing. More research is necessary to further evaluate the effectiveness of this technology, especially in relation to other computer access assistive technology devices. PMID:26427744

  10. Low cost and high performance on-board computer for picosatellite

    NASA Astrophysics Data System (ADS)

    Rajkowski, T.; Graczyk, R.; Palau, M. C.; Orleański, P.

    2012-05-01

    This work presents a new design of an on-board computer utilizing COTS, non-space-qualified components. Common attributes of computers already used in pico- and nanosatellites are presented as well, and the need for new on-board computer solutions for such satellites (concentrating on CubeSat satellites) is explained. Requirements for electronic devices that are sent to low Earth orbit in CubeSats are described in detail. Finally, the first version of the architecture of an on-board computer for CubeSat is presented (PICARD project - Picosatellite Computer Architecture Development). The computer utilizes two processing units - a primary low-power unit (ATmega128 microcontroller) and a secondary high-performance unit (Spartan-6 SRAM FPGA). Basic features of these devices are presented, clarifying the choice of these units for the project.

  11. Reduction of computer usage costs in predicting unsteady aerodynamic loadings caused by control surface motions: Computer program description

    NASA Technical Reports Server (NTRS)

    Petrarca, J. R.; Harrison, B. A.; Redman, M. C.; Rowe, W. S.

    1979-01-01

    A digital computer program was developed to calculate unsteady loadings caused by motions of lifting surfaces with leading edge and trailing edge controls based on the subsonic kernel function approach. The pressure singularities at hinge line and side edges were extracted analytically as a preliminary step to solving the integral equation of collocation. The program calculates generalized aerodynamic forces for user supplied deflection modes. Optional intermediate output includes pressure at an array of points, and sectional generalized forces. From one to six controls on the half span can be accomodated.

  12. Can low-cost VOR and Omega receivers suffice for RNAV - A new computer-based navigation technique

    NASA Technical Reports Server (NTRS)

    Hollaar, L. A.

    1978-01-01

    It is shown that although RNAV is particularly valuable for the personal transportation segment of general aviation, it has not gained complete acceptance. This is due, in part, to its high cost and the special handling it requires from air traffic control. VOR/DME RNAV calculations are ideally suited to analog computers, and the use of microprocessor technology has been suggested for reducing RNAV costs. Three navigation systems, VOR, Omega, and DR, are compared with respect to common navigational difficulties, such as station geometry, siting errors, ground disturbances, and terminal-area coverage. The Kalman filtering technique is described with reference to its disadvantages when used in a system built from standard microprocessors. An integrated navigation system, using input data from various low-cost sensor systems, is presented, and current simulation studies are noted.
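The sensor fusion at the heart of such an integrated navigation system reduces, in its simplest scalar form, to the Kalman measurement update: blend the current estimate with a new fix in proportion to their variances. A minimal 1-D sketch; the sensor labels and variances are illustrative:

```python
def kalman_update(x, p, z, r):
    """Scalar Kalman measurement update.

    x, p: current state estimate and its variance
    z, r: new measurement and its variance
    Returns the fused estimate and its (reduced) variance.
    """
    k = p / (p + r)                  # Kalman gain: trust the lower-variance source more
    return x + k * (z - x), (1.0 - k) * p

# Dead-reckoning prior fused with a VOR fix of equal variance (illustrative numbers):
x, p = 10.0, 4.0                     # DR along-track position estimate, variance
x, p = kalman_update(x, p, 12.0, 4.0)  # equal variances -> fused estimate at midpoint
```

With equal variances the gain is 0.5, so the fused estimate lands midway between the two sources and the variance halves, which is why combining several low-cost sensors can outperform any one of them.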

  13. Model implementation for dynamic computation of system cost for advanced life support

    NASA Technical Reports Server (NTRS)

    Levri, J. A.; Vaccari, D. A.

    2004-01-01

    Life support system designs for long-duration space missions have a multitude of requirements drivers, such as mission objectives, political considerations, cost, crew wellness, inherent mission attributes, as well as many other influences. Evaluation of requirements satisfaction can be difficult, particularly at an early stage of mission design. Because launch cost is a critical factor and relatively easy to quantify, it is a point of focus in early mission design. The method used to determine launch cost influences the accuracy of the estimate. This paper discusses the appropriateness of dynamic mission simulation in estimating the launch cost of a life support system. This paper also provides an abbreviated example of a dynamic simulation life support model and possible ways in which such a model might be utilized for design improvement. ©2004 COSPAR. Published by Elsevier Ltd. All rights reserved.

  14. Evaluating Computer-Assisted Career Guidance Systems: A Critique of the Differential Feature-Cost Approach.

    ERIC Educational Resources Information Center

    Oliver, Laurel W.

    1990-01-01

    Finds the feature-cost analysis method (Sampson et al., CE 521 972) a useful tool, but suggests that users need to determine which criteria are most important to them on the basis of a needs assessment. (SK)

  15. Cost-effectiveness of rituximab in addition to fludarabine and cyclophosphamide (R-FC) for the first-line treatment of chronic lymphocytic leukemia.

    PubMed

    Müller, Dirk; Fischer, Kirsten; Kaiser, Peter; Eichhorst, Barbara; Walshe, Ronald; Reiser, Marcel; Kellermann, Lenka; Borsi, Lisa; Civello, Daniele; Mensch, Alexander; Bahlo, Jasmin; Hallek, Michael; Stock, Stephanie; Fingerle-Rowson, Günter

    2016-05-01

    The cost-effectiveness of rituximab in combination with fludarabine/cyclophosphamide (R-FC) for the first line treatment of chronic lymphocytic leukemia (CLL) was evaluated. Based on long-term clinical data (follow-up of 5.9 years) from the CLL8-trial, a Markov-model with three health states (Free from disease progression, Progressive disease, Death) was used to evaluate the cost per quality-adjusted life-year (QALY) and cost per life years gained (LYG) of R-FC from the perspective of the German statutory health insurance (SHI). The addition of rituximab to FC chemotherapy results in a gain of 1.1 quality-adjusted life-years. The incremental cost-effectiveness ratio (ICER) of R-FC compared with FC was €17 979 per QALY (€15 773 per LYG). Results were robust in deterministic and probabilistic sensitivity analyses. From the German SHI perspective, rituximab in combination with FC chemotherapy represents good value for first-line treatment of patients with CLL and compares favorably with chemotherapy alone. PMID:26584689
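The incremental cost-effectiveness ratio reported above is the extra cost of the new regimen divided by the extra health benefit it delivers. A minimal sketch of that arithmetic; the absolute cost inputs below are illustrative (the abstract reports only the 1.1 QALY gain and the resulting ratio, not the underlying arm costs):

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Illustrative arm costs and QALYs consistent with a 1.1 QALY gain.
ratio = icer(cost_new=52_000.0, cost_old=32_223.1,
             effect_new=4.6, effect_old=3.5)   # EUR per QALY gained
```

A treatment is judged good value when this ratio falls below the payer's willingness-to-pay threshold, which is the comparison behind the abstract's conclusion.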

  16. Near DC eddy current measurement of aluminum multilayers using MR sensors and commodity low-cost computer technology

    NASA Astrophysics Data System (ADS)

    Perry, Alexander R.

    2002-06-01

    Low Frequency Eddy Current (EC) probes are capable of measurement from 5 MHz down to DC through the use of Magnetoresistive (MR) sensors. Choosing components with appropriate electrical specifications allows them to be matched to the power and impedance characteristics of standard computer connectors. This permits direct attachment of the probe to inexpensive computers, thereby eliminating external power supplies, amplifiers and modulators that have heretofore precluded very low system purchase prices. Such price reduction is key to increased market penetration in General Aviation maintenance and consequent reduction in recurring costs. This paper examines our computer software CANDETECT, which implements this approach and permits effective probe operation. Results are presented to show the intrinsic sensitivity of the software and demonstrate its practical performance when seeking cracks in the underside of a thick aluminum multilayer structure. The majority of the General Aviation light aircraft fleet uses rivets and screws to attach sheet aluminum skin to the airframe, resulting in similar multilayer lap joints.

  17. Computer simulation of energy use, greenhouse gas emissions, and costs for alternative methods of processing fluid milk.

    PubMed

    Tomasula, P M; Datta, N; Yee, W C F; McAloon, A J; Nutter, D W; Sampedro, F; Bonnaillie, L M

    2014-07-01

    Computer simulation is a useful tool for benchmarking electrical and fuel energy consumption and water use in a fluid milk plant. In this study, a computer simulation model of the fluid milk process based on high temperature, short time (HTST) pasteurization was extended to include models for processes for shelf-stable milk and extended shelf-life milk that may help prevent the loss or waste of milk that leads to increases in the greenhouse gas (GHG) emissions for fluid milk. The models were for UHT processing, crossflow microfiltration (MF) without HTST pasteurization, crossflow MF followed by HTST pasteurization (MF/HTST), crossflow MF/HTST with partial homogenization, and pulsed electric field (PEF) processing, and were incorporated into the existing model for the fluid milk process. Simulation trials were conducted assuming a production rate for the plants of 113.6 million liters of milk per year to produce only whole milk (3.25%) and 40% cream. Results showed that GHG emissions in the form of process-related CO₂ emissions, defined as CO₂ equivalents (e)/kg of raw milk processed (RMP), and specific energy consumptions (SEC) for electricity and natural gas use for the HTST process alone were 37.6 g of CO₂e/kg of RMP, 0.14 MJ/kg of RMP, and 0.13 MJ/kg of RMP, respectively. Emissions of CO₂e and SEC for electricity and natural gas use were highest for the PEF process, with values of 99.1 g of CO₂e/kg of RMP, 0.44 MJ/kg of RMP, and 0.10 MJ/kg of RMP, respectively, and lowest for the UHT process at 31.4 g of CO₂e/kg of RMP, 0.10 MJ/kg of RMP, and 0.17 MJ/kg of RMP. Estimated unit production costs associated with the various processes were lowest for the HTST process and MF/HTST with partial homogenization at $0.507/L and highest for the UHT process at $0.60/L. The increase in shelf life associated with the UHT and MF processes may eliminate some of the supply chain product and consumer losses and waste of milk and compensate for the small increases in GHG
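The per-kilogram GHG figures quoted above combine specific energy consumption with fuel-specific emission factors. A minimal sketch of that conversion, using the abstract's HTST SEC values; the emission factors here are illustrative assumptions, not the factors used in the study:

```python
def process_co2e_g_per_kg(sec_elec_mj, sec_gas_mj, ef_elec_g_per_mj, ef_gas_g_per_mj):
    """Process CO2-equivalent per kg of raw milk processed (RMP),
    from specific energy consumptions (MJ/kg) and emission factors (g CO2e/MJ)."""
    return sec_elec_mj * ef_elec_g_per_mj + sec_gas_mj * ef_gas_g_per_mj

# HTST SEC values from the abstract; emission factors are illustrative only.
htst_co2e = process_co2e_g_per_kg(sec_elec_mj=0.14, sec_gas_mj=0.13,
                                  ef_elec_g_per_mj=180.0, ef_gas_g_per_mj=56.0)
```

This structure explains why the electricity-heavy PEF process carries the highest emissions despite modest natural gas use: grid electricity typically has a much larger emission factor per MJ than on-site gas combustion.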

  18. Low-Cost Computer-Controlled Current Stimulator for the Student Laboratory

    ERIC Educational Resources Information Center

    Guclu, Burak

    2007-01-01

    Electrical stimulation of nerve and muscle tissues is frequently used for teaching core concepts in physiology. It is usually expensive to provide every student group in the laboratory with an individual stimulator. This article presents the design and application of a low-cost [about $100 (U.S.)] isolated stimulator that can be controlled by two…

  19. Grading Multiple Choice Exams with Low-Cost and Portable Computer-Vision Techniques

    ERIC Educational Resources Information Center

    Fisteus, Jesus Arias; Pardo, Abelardo; García, Norberto Fernández

    2013-01-01

    Although technology for automatic grading of multiple choice exams has existed for several decades, it is not yet as widely available or affordable as it should be. The main reasons preventing this adoption are the cost and the complexity of the setup procedures. In this paper, "Eyegrade," a system for automatic grading of multiple…

  20. Robotic lower limb prosthesis design through simultaneous computer optimizations of human and prosthesis costs

    PubMed Central

    Handford, Matthew L.; Srinivasan, Manoj

    2016-01-01

    Robotic lower limb prostheses can improve the quality of life for amputees. Development of such devices, currently dominated by long prototyping periods, could be sped up by predictive simulations. In contrast to some amputee simulations which track experimentally determined non-amputee walking kinematics, here, we explicitly model the human-prosthesis interaction to produce a prediction of the user’s walking kinematics. We obtain simulations of an amputee using an ankle-foot prosthesis by simultaneously optimizing human movements and prosthesis actuation, minimizing a weighted sum of human metabolic and prosthesis costs. The resulting Pareto optimal solutions predict that increasing prosthesis energy cost, decreasing prosthesis mass, and allowing asymmetric gaits all decrease human metabolic rate for a given speed and alter human kinematics. The metabolic rates increase monotonically with speed. Remarkably, by performing an analogous optimization for a non-amputee human, we predict that an amputee walking with an appropriately optimized robotic prosthesis can have a lower metabolic cost – even lower than assuming that the non-amputee’s ankle torques are cost-free. PMID:26857747
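The Pareto optimal solutions mentioned above are the candidate designs for which neither cost (human metabolic nor prosthesis) can be reduced without increasing the other. A minimal dominance-filter sketch over two costs; the candidate points are illustrative, not the paper's data:

```python
def pareto_front(points):
    """Return points (human_cost, prosthesis_cost) not dominated by any other point.
    Assumes distinct points; both costs are minimized."""
    front = []
    for p in points:
        dominated = any(q != p and q[0] <= p[0] and q[1] <= p[1] for q in points)
        if not dominated:
            front.append(p)
    return front

# Illustrative (human metabolic cost, prosthesis cost) pairs for candidate designs.
candidates = [(3.0, 1.0), (2.5, 2.0), (2.0, 4.0), (3.5, 3.0)]
front = pareto_front(candidates)   # (3.5, 3.0) is dominated and drops out
```

In the paper's setting the front is traced by sweeping the weight in the weighted sum of human metabolic and prosthesis costs; the dominance filter above is the check that a given design belongs on that front.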

  1. Virtual Grower: Estimating Greenhouse Energy Costs and Plant Growth Using New Computer Software

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Greenhouse crop production is a complex, integrated system wherein a change in one component inevitably influences different, sometimes seemingly disparate components. For example, growers may modify their heating schedules to reduce energy costs, but a cooler temperature set-point can delay crop d...

  2. Computer analysis of effects of altering jet fuel properties on refinery costs and yields

    NASA Technical Reports Server (NTRS)

    Breton, T.; Dunbar, D.

    1984-01-01

This study was undertaken to evaluate the adequacy of future U.S. jet fuel supplies, the potential for large increases in the cost of jet fuel, and the extent to which a relaxation of jet fuel properties would remedy these potential problems. The results of the study indicate that refiners should be able to meet jet fuel output requirements in all regions of the country within the current Jet A specifications during the 1990-2010 period. The results also indicate that it will be more difficult to meet Jet A specifications on the West Coast, because the feedstock quality is worse and the required jet fuel yield (jet fuel/crude refined) is higher than in the East. The results show that jet fuel production costs could be reduced by relaxing fuel properties. Potential cost savings in the East (PADDs I-IV) through property relaxation were found to be about 1.3 cents/liter (5 cents/gallon) in January 1, 1981 dollars between 1990 and 2010. However, the savings from property relaxation were all obtained within the range of current Jet A specifications, so there is no financial incentive to relax Jet A fuel specifications in the East. In the West (PADD V) the potential cost savings from lowering fuel quality were considerably greater than in the East. Cost savings from 2.7 to 3.7 cents/liter (10-14 cents/gallon) were found. In contrast to the East, on the West Coast a significant part of the savings was obtained through relaxation of the current Jet A fuel specifications.

  3. Enhancing simulation of efficiency with analytical tools. [combining computer simulation and analytical techniques for cost reduction

    NASA Technical Reports Server (NTRS)

    Seltzer, S. M.

    1974-01-01

Some means of combining both computer simulation and analytical techniques are indicated in order to mutually enhance their efficiency as design tools and to motivate those involved in engineering design to consider using such combinations. While the idea is not new, heavy reliance on computers often seems to overshadow the potential utility of analytical tools. Although the example used is drawn from the area of dynamics and control, the principles espoused are applicable to other fields. In the example the parameter plane stability analysis technique is described briefly and extended beyond that reported in the literature to increase its utility (through a simple set of recursive formulas) and its applicability (through the portrayal of the effect of varying the sampling period of the computer). The numerical values that were rapidly selected by analysis were found to be correct for the hybrid computer simulation for which they were needed. This obviated the need for cut-and-try methods to choose the numerical values, thereby saving both time and computer utilization.

  4. The Effect of Emphasizing Mathematical Structure in the Acquisition of Whole Number Computation Skills (Addition and Subtraction) By Seven- and Eight-Year Olds: A Clinical Investigation.

    ERIC Educational Resources Information Center

    Uprichard, A. Edward; Collura, Carolyn

    This investigation sought to determine the effect of emphasizing mathematical structure in the acquisition of computational skills by seven- and eight-year-olds. The meaningful development-of-structure approach emphasized closure, commutativity, associativity, and the identity element of addition; the inverse relationship between addition and…

  5. Low cost SCR lamp driver indicates contents of digital computer registers

    NASA Technical Reports Server (NTRS)

    Cliff, R. A.

    1967-01-01

    Silicon Controlled Rectifier /SCR/ lamp driver is adapted for use in integrated circuit digital computers where it indicates the contents of the various registers. The threshold voltage at which visual indication begins is very sharply defined and can be adjusted to suit particular system requirements.

  6. Computer program to assess impact of fatigue and fracture criteria on weight and cost of transport aircraft

    NASA Technical Reports Server (NTRS)

    Tanner, C. J.; Kruse, G. S.; Oman, B. H.

    1975-01-01

A preliminary design analysis tool for rapidly performing trade-off studies involving fatigue, fracture, static strength, weight, and cost is presented. Analysis subprograms were developed for fatigue life, crack growth life, and residual strength, and linked to a structural synthesis module, which in turn was integrated into a computer program. The part definition module of a cost and weight analysis program was expanded to be compatible with the upgraded structural synthesis capability. The resultant vehicle design and evaluation program is named VDEP-2. It is an accurate and useful tool for estimating purposes at the preliminary design stage of airframe development. A sample case along with an explanation of program applications and input preparation is presented.

  7. Cost Study of Educational Media Systems and Their Equipment Components. Volume III, A Supplementary Report: Computer Assisted Instruction. Final Report.

    ERIC Educational Resources Information Center

    General Learning Corp., Washington, DC.

    The COST-ED model (Costs of Schools, Training, and Education) of the instructional process encourages the recognition of management alternatives and potential cost-savings. It is used to calculate the minimum cost of performing specified instructional tasks. COST-ED components are presented as cost modules in a flowchart format for manpower,…

  8. Low-cost digital image processing on a university mainframe computer. [considerations in selecting and/or designing instructional systems

    NASA Technical Reports Server (NTRS)

    Williams, T. H. L.

    1981-01-01

The advantages and limitations of using university mainframe computers in digital image processing instruction are listed. Aspects to be considered when designing software for this purpose include not only the intended audience, but also the capabilities of the system regarding the size of the image/subimage; preprocessing and enhancement functions; geometric correction and registration techniques; classification strategy; classification algorithm; multitemporal analysis; and ancillary data and geographic information systems. The user/software/hardware interaction as well as acquisition and operating costs must also be considered.

  9. Development and implementation of a low cost micro computer system for LANDSAT analysis and geographic data base applications

    NASA Technical Reports Server (NTRS)

    Faust, N.; Jordon, L.

    1981-01-01

Since the implementation of the GRID and IMGRID computer programs for multivariate spatial analysis in the early 1970's, geographic data analysis has moved from large computers to minicomputers and now to microcomputers, with a radical reduction in the costs associated with planning analyses. Once NIMGRID (new IMGRID), a raster-oriented geographic information system, was implemented on the microcomputer, programs designed to process LANDSAT data for use as one element in a geographic data base were added, including programs for training field selection, supervised and unsupervised classification, and image enhancement. Enhancements to the color graphics capabilities of the microsystem allow display of three channels of LANDSAT data in color infrared format. The basic microcomputer hardware needed to perform NIMGRID and most LANDSAT analyses is listed, as well as the software available for LANDSAT processing.

  10. Low-cost reconfigurable DSP-based parallel image processing computer

    NASA Astrophysics Data System (ADS)

    Murphy, Ciaron W.; Harvey, David M.; Nicolson, Laurence J.

    1998-10-01

To develop a cost-effective re-configurable DSP engine, it has been proposed to upgrade an existing custom designed TMS320C40 based multi-processing architecture with run-time configuration capabilities. The upgrade will consist of four Xilinx XC6200 series field programmable gate arrays which will enable concurrent algorithm structures to be efficiently mapped onto the system. Furthermore, the upgraded architecture will provide a platform for the development of adaptive routing structures and self-configuration techniques, and will facilitate the merging of instruction and hardware based parallelism.

  11. Yeast hydrolysate as a low-cost additive to serum-free medium for the production of human thrombopoietin in suspension cultures of Chinese hamster ovary cells.

    PubMed

    Sung, Y H; Lim, S W; Chung, J Y; Lee, G M

    2004-02-01

To enhance the performance of a serum-free medium (SFM) for human thrombopoietin (hTPO) production in suspension cultures of recombinant Chinese hamster ovary (rCHO) cells, several low-cost hydrolysates such as yeast hydrolysate (YH), soy hydrolysate, wheat gluten hydrolysate and rice hydrolysate were tested as medium additives. Among the various hydrolysates tested, the positive effect of YH on hTPO production was most significant. When 5 g l(-1) YH was added to SFM, the maximum hTPO concentration in batch culture was 40.41 microg ml(-1), which is 11.5 times higher than that in SFM without YH supplementation. This enhanced hTPO production in YH-supplemented SFM was the combined effect of an enhanced q(hTPO) (the specific rate of hTPO production) and extended culture longevity: the supplementation of YH in SFM increased q(hTPO) by 294% and extended culture longevity by >2 days if the culture was terminated at a cell viability of 50%. Furthermore, cell viability throughout the culture using YH-supplemented SFM was higher than that using any other hydrolysate-supplemented SFM tested, thereby minimizing the proteolytic degradation to which hTPO is susceptible. In addition, YH supplementation did not affect the in vivo biological activity of hTPO. Taken together, the results obtained demonstrate the potential of YH as a medium additive for hTPO production in serum-free suspension cultures of rCHO cells. PMID:12856163

  12. Matched filtering of gravitational waves from inspiraling compact binaries: Computational cost and template placement

    NASA Astrophysics Data System (ADS)

    Owen, Benjamin J.; Sathyaprakash, B. S.

    1999-07-01

We estimate the number of templates, computational power, and storage required for a one-step matched filtering search for gravitational waves from inspiraling compact binaries. Our estimates for the one-step search strategy should serve as benchmarks for the evaluation of more sophisticated strategies such as hierarchical searches. We use a discrete family of two-parameter waveform templates based on the second post-Newtonian approximation for binaries composed of nonspinning compact bodies in circular orbits. We present estimates for all of the large- and mid-scale interferometers now under construction: LIGO (three configurations), VIRGO, GEO600, and TAMA. To search for binaries with components more massive than m_min = 0.2 M_solar while losing no more than 10% of events due to coarseness of template spacing, the initial LIGO interferometers will require about 1.0×10^11 flops (floating point operations per second) for data analysis to keep up with data acquisition. This is several times higher than estimated in previous work by Owen, in part because of the improved family of templates and in part because we use more realistic (higher) sampling rates. Enhanced LIGO, GEO600, and TAMA will require computational power similar to initial LIGO. Advanced LIGO will require 7.8×10^11 flops, and VIRGO will require 4.8×10^12 flops to take full advantage of its broad target noise spectrum. If the templates are stored rather than generated as needed, storage requirements range from 1.5×10^11 real numbers for TAMA to 6.2×10^14 for VIRGO. The computational power required scales roughly as m_min^(-8/3) and the storage as m_min^(-13/3). Since these scalings are perturbed by the curvature of the parameter space at second post-Newtonian order, we also provide estimates for a search with m_min = 1 M_solar. Finally, we sketch and discuss an algorithm for placing the templates in the parameter space.
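The quoted power-law scalings lend themselves to a quick back-of-the-envelope check. A minimal sketch, using the initial-LIGO benchmark figure from the abstract; the helper function and its name are illustrative, not from the paper:

```python
def scaled_cost(cost_ref, m_ref, m_min, exponent):
    """Scale a benchmark cost to a new minimum component mass via a
    power law, e.g. required flops ~ m_min**(-8/3)."""
    return cost_ref * (m_min / m_ref) ** exponent

# Initial LIGO benchmark from the abstract: ~1.0e11 flops at m_min = 0.2 M_solar.
flops_at_0p2 = 1.0e11

# Raising m_min to 1.0 M_solar cuts the requirement by 5**(8/3), roughly 73x.
flops_at_1p0 = scaled_cost(flops_at_0p2, 0.2, 1.0, -8 / 3)
print(f"{flops_at_1p0:.2e} flops")
```

The same helper applied with exponent -13/3 sketches how template storage grows even faster as the minimum mass is lowered.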

  13. Computer-based word-processing system speeds nuke plant startup, cuts cost

    SciTech Connect

    Not Available

    1980-08-01

    The start-up time for Louisiana Power and Light's Waterford III nuclear power plant will be shortened from 26 to 12 months by using the Automated Document Production System (ADOPS), a text management system that will pay for itself in the first half-day of reduced start-up time. With up to a billion megabytes of disc storage, the computer serves as a daytime word processor and as a remote job entry terminal at night. Start-up functions include critical path scheduling, compliance demonstration, and safety tracking and reporting. (DCK)

  14. A visual probe localization and calibration system for cost-effective computer-aided 3D ultrasound.

    PubMed

    Ali, Aziah; Logeswaran, Rajasvaran

    2007-08-01

3D ultrasound systems produce much better reproductions than 2D ultrasound, but their prohibitively high cost deprives many less affluent organizations of this benefit. This paper proposes using the conventional 2D ultrasound equipment readily available in most hospitals, along with a single conventional digital camera, to construct 3D ultrasound images. The proposed system applies computer vision to extract position information of the ultrasound probe while the scanning takes place. The probe, calibrated in order to calculate the offset of the ultrasound scan from the position of the marker attached to it, is used to scan a number of geometrical objects. Using the proposed system, the 3D volumes of the objects were successfully reconstructed. The system was tested in clinical situations where human body parts were scanned. The results presented, and confirmed by medical staff, are very encouraging for cost-effective implementation of computer-aided 3D ultrasound using a simple setup with 2D ultrasound equipment and a conventional digital camera. PMID:17126314
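The calibration idea described here, recovering the scan plane's pose by composing the camera-tracked marker pose with a fixed marker-to-scan offset, can be sketched with homogeneous transforms. A 2D toy version with hypothetical values, for illustration only:

```python
import numpy as np

def pose2d(x, y, theta):
    """Homogeneous 2D transform: rotate by theta, then translate by (x, y)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0.0, 0.0, 1.0]])

# Marker pose as tracked by the camera (hypothetical values) and the fixed
# marker-to-scan-plane offset recovered once by calibration.
T_marker = pose2d(10.0, 5.0, np.pi / 2)
T_offset = pose2d(0.0, 2.0, 0.0)

# Position of the scan-plane origin in world coordinates: compose the two
# transforms and apply them to the homogeneous origin point.
scan_origin = T_marker @ T_offset @ np.array([0.0, 0.0, 1.0])
print(scan_origin[:2])
```

A real system would use 3D poses (4x4 matrices) estimated per video frame, but the composition step is the same.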

  15. Taming the Electronic Structure of Diradicals through the Window of Computationally Cost Effective Multireference Perturbation Theory.

    PubMed

    Sinha Ray, Suvonil; Ghosh, Anirban; Chattopadhyay, Sudip; Chaudhuri, Rajat K

    2016-07-28

Recently a state-specific multireference perturbation theory (SSMRPT) with an improved virtual orbitals complete active space configuration interaction (IVO-CASCI) reference function has been proposed for treating the electronic structures of radicals such as methylene, m-benzyne, pyridyne, and the pyridynium cation. This new development in MRPT, termed IVO-SSMRPT, is able to describe the structure of radicaloids with reasonable accuracy even with small reference spaces. IVO-SSMRPT is also capable of predicting the correct ordering of the lowest singlet-triplet gaps. Investigation of the first three electronic states of the oxygen molecule has also been used to assess our method. Our estimates agree creditably with the available, far more expensive, benchmark state-of-the-art ab initio calculations. The IVO-SSMRPT method provides an effective avenue with a manageable cost/accuracy ratio for accurately dealing with radicaloid systems possessing varying degrees of quasidegeneracy. PMID:27355260

  16. Computational cost of image registration with a parallel binary array processor

    SciTech Connect

    Reeves, A.P.; Rostampour, A.

    1982-07-01

The application of a simulated binary array processor (BAP) to the rapid analysis of a sequence of images has been studied. Several algorithms have been developed which may be implemented on many existing parallel processing machines. The characteristic operations of a BAP are discussed and analyzed. A set of preprocessing algorithms is described which is designed to register two images of TV-type video data in real time. These algorithms illustrate the potential uses of a BAP, and their cost is analyzed in detail. The results of applying these algorithms to FLIR data and to noisy optical data are given. An analysis of these algorithms illustrates the importance of efficient global feature extraction hardware for image understanding applications.

  17. Computational evaluation of cellular metabolic costs successfully predicts genes whose expression is deleterious

    PubMed Central

    Wagner, Allon; Zarecki, Raphy; Reshef, Leah; Gochev, Camelia; Sorek, Rotem; Gophna, Uri; Ruppin, Eytan

    2013-01-01

    Gene suppression and overexpression are both fundamental tools in linking genotype to phenotype in model organisms. Computational methods have proven invaluable in studying and predicting the deleterious effects of gene deletions, and yet parallel computational methods for overexpression are still lacking. Here, we present Expression-Dependent Gene Effects (EDGE), an in silico method that can predict the deleterious effects resulting from overexpression of either native or foreign metabolic genes. We first test and validate EDGE’s predictive power in bacteria through a combination of small-scale growth experiments that we performed and analysis of extant large-scale datasets. Second, a broad cross-species analysis, ranging from microorganisms to multiple plant and human tissues, shows that genes that EDGE predicts to be deleterious when overexpressed are indeed typically down-regulated. This reflects a universal selection force keeping the expression of potentially deleterious genes in check. Third, EDGE-based analysis shows that cancer genetic reprogramming specifically suppresses genes whose overexpression impedes proliferation. The magnitude of this suppression is large enough to enable an almost perfect distinction between normal and cancerous tissues based solely on EDGE results. We expect EDGE to advance our understanding of human pathologies associated with up-regulation of particular transcripts and to facilitate the utilization of gene overexpression in metabolic engineering. PMID:24198337

  18. Potentially Low Cost Solution to Extend Use of Early Generation Computed Tomography

    PubMed Central

    Tonna, Joseph E.; Balanoff, Amy M.; Lewin, Matthew R.; Saandari, Namjilmaa; Wintermark, Max

    2010-01-01

    In preparing a case report on Brown-Séquard syndrome for publication, we made the incidental finding that the inexpensive, commercially available three-dimensional (3D) rendering software we were using could produce high quality 3D spinal cord reconstructions from any series of two-dimensional (2D) computed tomography (CT) images. This finding raises the possibility that spinal cord imaging capabilities can be expanded where bundled 2D multi-planar reformats and 3D reconstruction software for CT are not available and in situations where magnetic resonance imaging (MRI) is either not available or appropriate (e.g. metallic implants). Given the worldwide burden of trauma and considering the limited availability of MRI and advanced generation CT scanners, we propose an alternative, potentially useful approach to imaging spinal cord that might be useful in areas where technical capabilities and support are limited. PMID:21293767

  19. Avoiding the Enumeration of Infeasible Elementary Flux Modes by Including Transcriptional Regulatory Rules in the Enumeration Process Saves Computational Costs

    PubMed Central

    Jungreuthmayer, Christian; Ruckerbauer, David E.; Gerstl, Matthias P.; Hanscho, Michael; Zanghellini, Jürgen

    2015-01-01

    Despite the significant progress made in recent years, the computation of the complete set of elementary flux modes of large or even genome-scale metabolic networks is still impossible. We introduce a novel approach to speed up the calculation of elementary flux modes by including transcriptional regulatory information into the analysis of metabolic networks. Taking into account gene regulation dramatically reduces the solution space and allows the presented algorithm to constantly eliminate biologically infeasible modes at an early stage of the computation procedure. Thereby, computational costs, such as runtime, memory usage, and disk space, are extremely reduced. Moreover, we show that the application of transcriptional rules identifies non-trivial system-wide effects on metabolism. Using the presented algorithm pushes the size of metabolic networks that can be studied by elementary flux modes to new and much higher limits without the loss of predictive quality. This makes unbiased, system-wide predictions in large scale metabolic networks possible without resorting to any optimization principle. PMID:26091045
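The core idea, discarding modes that violate transcriptional regulatory rules instead of enumerating them, can be illustrated with a toy filter. Reaction names and the rule below are hypothetical, and a real implementation prunes infeasible partial modes during enumeration rather than filtering a finished list, which is where the cost savings arise:

```python
# Candidate flux modes represented as sets of active reactions
# (hypothetical names, for illustration only).
candidates = [
    {"glycolysis", "fermentation"},
    {"glycolysis", "respiration"},
    {"fermentation", "respiration"},  # violates the rule below
]

def feasible(mode):
    """Toy transcriptional rule: fermentation genes are repressed
    whenever respiration is active, so both cannot appear together."""
    return not ({"fermentation", "respiration"} <= mode)

# Eliminating infeasible modes shrinks the solution space that the
# enumeration algorithm has to explore and store.
kept = [mode for mode in candidates if feasible(mode)]
print(len(kept))
```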

  20. A cost-effective and universal strategy for complete prokaryotic genomic sequencing proposed by computer simulation

    PubMed Central

    2012-01-01

Background: Pyrosequencing techniques allow scientists to perform prokaryotic genome sequencing and achieve draft genomic sequences within a few days. However, the assemblies from shotgun sequencing are usually composed of hundreds of contigs. A further multiplex PCR procedure is needed to fill all the gaps and link contigs into a complete chromosomal sequence, which is the basis for prokaryotic comparative genomic studies. In this article, we study various pyrosequencing strategies by simulated assembly of 100 prokaryotic genomes. Findings: The simulation study shows that a single end 454 Jr. run combined with a paired end 454 Jr. run (8 kb library) can produce: 1) ~90% of the 100 assemblies with < 10 scaffolds and ~95% of the 100 assemblies with < 150 contigs; 2) an average contig N50 size over 331 kb; 3) average single base accuracy > 99.99%; 4) an average false gene duplication rate < 0.7%; 5) an average false gene loss rate < 0.4%. Conclusions: A single end 454 Jr. run combined with a paired end 454 Jr. run (8 kb library) is a cost-effective way to perform prokaryotic whole genome sequencing. This strategy provides a solution to produce high quality draft assemblies for most prokaryotic organisms within days. Due to the small number of assembled scaffolds, the subsequent multiplex PCR procedure (for gap filling) would be easy. As a result, large scale prokaryotic whole genome sequencing projects may be finished within weeks. PMID:22289569

  1. Leveraging Cloud Computing to Improve Storage Durability, Availability, and Cost for MER Maestro

    NASA Technical Reports Server (NTRS)

    Chang, George W.; Powell, Mark W.; Callas, John L.; Torres, Recaredo J.; Shams, Khawaja S.

    2012-01-01

The Maestro for MER (Mars Exploration Rover) software is the premiere operation and activity planning software for the Mars rovers, and it is required to deliver all of the processed image products to scientists on demand. These data span multiple storage arrays sized at 2 TB, and a backup scheme ensures data is not lost. In a catastrophe, these data would currently recover at 20 GB/hour, taking several days for a restoration. A seamless solution provides access to highly durable, highly available, scalable, and cost-effective storage capabilities. This approach also employs a novel technique that enables storage of the majority of data on the cloud and some data locally. This feature is used to store the most recent data locally in order to guarantee utmost reliability in case of an outage or disconnect from the Internet. This also obviates any changes to the software that generates the most recent data set, as it still has the same interface to the file system as it did before the updates.

  2. Cost-effective and business-beneficial computer validation for bioanalytical laboratories.

    PubMed

    McDowall, Rd

    2011-07-01

Computerized system validation is often viewed as a burden and a waste of time to meet regulatory requirements. This article presents a different approach by looking at validation in a bioanalytical laboratory in terms of the business benefits that computer validation can bring. Ask yourself the question: have you ever bought a computerized system that did not meet your initial expectations? This article will look at understanding the process to be automated, the paper to be eliminated and the records to be signed to meet the requirements of the GLP or GCP and Part 11 regulations. This paper will only consider commercial nonconfigurable and configurable software such as plate readers and LC-MS/MS data systems, rather than LIMS or custom applications. Two streamlined life cycle models are presented. The first one consists of a single document for validation of nonconfigurable software. The second is for configurable software and is a five-stage model that avoids the need to write functional and design specifications. Both models are aimed at managing the risk each type of software poses whilst reducing the amount of documented evidence required for validation. PMID:21728773

  3. Modeling the economic costs of disasters and recovery: analysis using a dynamic computable general equilibrium model

    NASA Astrophysics Data System (ADS)

    Xie, W.; Li, N.; Wu, J.-D.; Hao, X.-L.

    2014-04-01

Disaster damages have negative effects on the economy, whereas reconstruction investment has positive effects. The aim of this study is to model the economic costs of disasters and recovery, taking into account the positive effects of reconstruction activities. A computable general equilibrium (CGE) model is a promising approach because it can incorporate these two kinds of shocks into a unified framework and furthermore avoid the double-counting problem. In order to factor both shocks into the CGE model, direct loss is set as the amount of capital stock reduced on the supply side of the economy; a portion of investments restores the capital stock in an existing period; an investment-driven dynamic model is formulated according to available reconstruction data, and the rest of a given country's saving is set as an endogenous variable to balance the fixed investment. The 2008 Wenchuan Earthquake is selected as a case study to illustrate the model, and three scenarios are constructed: S0 (no disaster occurs), S1 (disaster occurs with reconstruction investment) and S2 (disaster occurs without reconstruction investment). S0 is taken as business as usual, and the differences between S1 and S0 and between S2 and S0 can be interpreted as economic losses including and excluding reconstruction, respectively. The study showed that output from S1 is closer to the real data than that from S2. Economic loss under S2 is roughly 1.5 times that under S1. The gap in the economic aggregate between S1 and S0 is reduced to 3% at the end of the government-led reconstruction activity, a level that should take another four years to achieve under S2.
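The scenario accounting described above can be sketched as simple differences in output against the no-disaster baseline. The figures below are hypothetical and are not taken from the Wenchuan case study:

```python
# Annual output under the three scenarios (hypothetical values).
output_s0 = [100.0, 104.0, 108.0]  # S0: business as usual, no disaster
output_s1 = [95.0, 101.0, 107.0]   # S1: disaster with reconstruction investment
output_s2 = [95.0, 99.0, 103.0]    # S2: disaster without reconstruction

# Economic loss is the cumulative shortfall relative to the S0 baseline.
loss_with_recon = sum(s0 - s1 for s0, s1 in zip(output_s0, output_s1))
loss_without_recon = sum(s0 - s2 for s0, s2 in zip(output_s0, output_s2))
print(loss_with_recon, loss_without_recon)
```

Under these toy numbers the loss without reconstruction exceeds the loss with reconstruction, mirroring the qualitative finding of the study.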

  4. Cost-Benefit Analysis for ECIA Chapter 1 and State DPPF Programs Comparing Groups Receiving Regular Program Instruction and Groups Receiving Computer Assisted Instruction/Computer Management System (CAI/CMS). 1986-87.

    ERIC Educational Resources Information Center

    Chamberlain, Ed

    A cost benefit study was conducted to determine the effectiveness of a computer assisted instruction/computer management system (CAI/CMS) as an alternative to conventional methods of teaching reading within Chapter 1 and DPPF funded programs of the Columbus (Ohio) Public Schools. The Chapter 1 funded Compensatory Language Experiences and Reading…

  5. Towards a Low-Cost Real-Time Photogrammetric Landslide Monitoring System Utilising Mobile and Cloud Computing Technology

    NASA Astrophysics Data System (ADS)

    Chidburee, P.; Mills, J. P.; Miller, P. E.; Fieber, K. D.

    2016-06-01

Close-range photogrammetric techniques offer a potentially low-cost approach in terms of implementation and operation for initial assessment and monitoring of landslide processes over small areas. In particular, the Structure-from-Motion (SfM) pipeline is now extensively used to help overcome many constraints of traditional digital photogrammetry, offering increased user-friendliness to non-experts, as well as lower costs. However, a landslide monitoring approach based on the SfM technique also presents some potential drawbacks due to the difficulty in managing and processing a large volume of data in real-time. This research addresses the aforementioned issues by attempting to combine a mobile device with cloud computing technology to develop a photogrammetric measurement solution as part of a monitoring system for landslide hazard analysis. The research presented here focusses on (i) the development of an Android mobile application; (ii) the implementation of SfM-based open-source software in the Amazon cloud computing web service, and (iii) performance assessment through a simulated environment using data collected at a recognized landslide test site in North Yorkshire, UK. Whilst the landslide monitoring mobile application is under development, this paper describes experiments carried out to ensure effective performance of the system in the future. Investigations presented here describe the initial assessment of a cloud-implemented approach, which is developed around the well-known VisualSFM algorithm. Results are compared to point clouds obtained from alternative SfM 3D reconstruction approaches considering a commercial software solution (Agisoft PhotoScan) and a web-based system (Autodesk 123D Catch). Investigations demonstrate that the cloud-based photogrammetric measurement system is capable of providing results of centimeter-level accuracy, evidencing its potential to provide an effective approach for quantifying and analyzing landslide hazard at a local-scale.

  6. Cost-effectiveness of intensive multifactorial treatment compared with routine care for individuals with screen-detected Type 2 diabetes: analysis of the ADDITION-UK cluster-randomized controlled trial

    PubMed Central

    Tao, L; Wilson, E C F; Wareham, N J; Sandbæk, A; Rutten, G E H M; Lauritzen, T; Khunti, K; Davies, M J; Borch-Johnsen, K; Griffin, S J; Simmons, R K

    2015-01-01

Aims: To examine the short- and long-term cost-effectiveness of intensive multifactorial treatment compared with routine care among people with screen-detected Type 2 diabetes. Methods: Cost–utility analysis in ADDITION-UK, a cluster-randomized controlled trial of early intensive treatment in people with screen-detected diabetes in 69 UK general practices. Unit treatment costs and utility decrement data were taken from the published literature. Accumulated costs and quality-adjusted life years (QALYs) were calculated using ADDITION-UK data from 1 to 5 years (short-term analysis, n = 1024); trial data were extrapolated to 30 years using the UKPDS outcomes model (version 1.3) (long-term analysis; n = 999). All costs were transformed to the UK 2009/10 price level. Results: Adjusted incremental costs to the NHS were £285, £935, £1190 and £1745 over 1-, 5-, 10- and 30-year time horizons, respectively (discounted at 3.5%). Adjusted incremental QALYs were 0.0000, -0.0040, 0.0140 and 0.0465 over the same time horizons. Point estimate incremental cost-effectiveness ratios (ICERs) suggested that the intervention was not cost-effective, although the ratio improved over time: the ICER over 10 years was £82,250, falling to £37,500 over 30 years. The ICER fell below £30,000 only when the intervention cost was below £631 per patient: we estimated the cost at £981. Conclusion: Given conventional thresholds of cost-effectiveness, the intensive treatment delivered in ADDITION was not cost-effective compared with routine care for individuals with screen-detected diabetes in the UK. The intervention may be cost-effective if it can be delivered at reduced cost. PMID:25661661
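The point estimates are consistent with the standard ICER definition, incremental cost divided by incremental QALYs; for instance, the 30-year figures reported in the abstract reproduce the quoted ~£37,500:

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
    return delta_cost / delta_qaly

# 30-year figures from the abstract: +GBP 1745 and +0.0465 QALYs.
print(round(icer(1745, 0.0465)))
```

Note the 10-year ratio computed this way (1190 / 0.0140) differs slightly from the reported £82,250, presumably because the published point estimate was adjusted beyond the simple quotient.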

  7. A low-cost EEG system-based hybrid brain-computer interface for humanoid robot navigation and recognition.

    PubMed

    Choi, Bongjae; Jo, Sungho

    2013-01-01

This paper describes a hybrid brain-computer interface (BCI) technique that combines the P300 potential, the steady-state visually evoked potential (SSVEP), and event-related desynchronization (ERD) to solve a complicated multi-task problem consisting of humanoid robot navigation and control along with object recognition using a low-cost BCI system. Our approach enables subjects to control the navigation and exploration of a humanoid robot and recognize a desired object among candidates. This study aims to demonstrate the possibility of a hybrid BCI based on a low-cost system for a realistic and complex task. It also shows that the use of a simple image processing technique, combined with BCI, can further aid in making these complex tasks simpler. An experimental scenario is proposed in which a subject remotely controls a humanoid robot in a properly sized maze. The subject sees what the surrogate robot sees through visual feedback and can navigate the surrogate robot. While navigating, the robot encounters objects located in the maze. It then recognizes whether the encountered object is of interest to the subject. The subject communicates with the robot through the SSVEP- and ERD-based BCIs to navigate and explore with the robot, and through the P300-based BCI to allow the surrogate robot to recognize their favorites. Using several evaluation metrics, the performances of five subjects navigating the robot were quite comparable to manual keyboard control. During object recognition mode, favorite objects were successfully selected from two to four choices. Subjects conducted humanoid navigation and recognition tasks as if they embodied the robot. Analysis of the data supports the potential usefulness of the proposed hybrid BCI system for extended applications. This work presents an important implication for future work: a hybridization of simple BCI protocols provides extended controllability to carry out complicated tasks even with a low-cost system. PMID:24023953

  8. Computed tomography for preoperative planning in minimal-invasive total hip arthroplasty: radiation exposure and cost analysis.

    PubMed

    Huppertz, Alexander; Radmer, Sebastian; Asbach, Patrick; Juran, Ralf; Schwenke, Carsten; Diederichs, Gerd; Hamm, Bernd; Sparmann, Martin

    2011-06-01

    Computed tomography (CT) was used for preoperative planning of minimal-invasive total hip arthroplasty (THA). 92 patients (50 males, 42 females, mean age 59.5 years) with a mean body-mass-index (BMI) of 26.5 kg/m(2) underwent 64-slice CT to depict the pelvis, the knee and the ankle in three independent acquisitions using combined x-, y-, and z-axis tube current modulation. Arthroplasty planning was performed using 3D-Hip Plan(®) (Symbios, Switzerland) and patient radiation dose exposure was determined. The effects of BMI, gender, and contralateral THA on the effective dose were evaluated by an analysis-of-variance. A process-cost-analysis from the hospital perspective was done. All CT examinations were of sufficient image quality for 3D-THA planning. A mean effective dose of 4.0 mSv (SD 0.9 mSv) modeled by the BMI (p<0.0001) was calculated. The presence of a contralateral THA (9/92 patients; p=0.15) and the difference between males and females were not significant (p=0.08). Personnel involved were the radiologist (4 min), the surgeon (16 min), the radiographer (12 min), and administrative personnel (4 min). A CT operation time of 11 min and direct per-patient costs of 52.80 € were recorded. Preoperative CT for THA was associated with a slight and justifiable increase of radiation exposure in comparison to conventional radiographs and low per-patient costs. PMID:20022723

  9. Computer Vision Tools for Low-Cost and Noninvasive Measurement of Autism-Related Behaviors in Infants

    PubMed Central

    Vallin Spina, Thiago; Papanikolopoulos, Nikolaos; Egger, Helen

    2014-01-01

    The early detection of developmental disorders is key to child outcome, allowing interventions to be initiated which promote development and improve prognosis. Research on autism spectrum disorder (ASD) suggests that behavioral signs can be observed late in the first year of life. Many of these studies involve extensive frame-by-frame video observation and analysis of a child's natural behavior. Although nonintrusive, these methods are extremely time-intensive and require a high level of observer training; thus, they are burdensome for clinical and large population research purposes. This work is a first milestone in a long-term project on non-invasive early observation of children in order to aid in risk detection and research of neurodevelopmental disorders. We focus on providing low-cost computer vision tools to measure and identify ASD behavioral signs based on components of the Autism Observation Scale for Infants (AOSI). In particular, we develop algorithms to measure responses to general ASD risk assessment tasks and activities outlined by the AOSI which assess visual attention by tracking facial features. We show results, including comparisons with expert and nonexpert clinicians, which demonstrate that the proposed computer vision tools can capture critical behavioral observations and potentially augment the clinician's behavioral observations obtained from real in-clinic assessments. PMID:25045536

  10. Reduced computational cost, totally symmetric angular quadrature sets for discrete ordinates radiation transport. Master's thesis

    SciTech Connect

    Oder, J.M.

    1997-12-01

    Several new quadrature sets for use in the discrete ordinates method of solving the Boltzmann neutral particle transport equation are derived. These symmetric quadratures extend the traditional symmetric quadratures by allowing ordinates perpendicular to one or two of the coordinate axes. Comparable accuracy with fewer required ordinates is obtained. Quadratures up to seventh order are presented. The validity and efficiency of the quadratures are then tested and compared with the Sn level symmetric quadratures relative to a Monte Carlo benchmark solution. The criteria for comparison include current through the surface, scalar flux at the surface, volume average scalar flux, and time required for convergence. Appreciable computational cost was saved when the new quadratures were used in an unstructured tetrahedral cell code using highly accurate characteristic methods. However, no appreciable savings in computation time were found using the new quadratures compared with traditional Sn methods on a regular Cartesian mesh using the standard diamond difference method. These quadratures are recommended for use in three-dimensional calculations on an unstructured mesh.

  11. A low-cost computer-controlled Arduino-based educational laboratory system for teaching the fundamentals of photovoltaic cells

    NASA Astrophysics Data System (ADS)

    Zachariadou, K.; Yiasemides, K.; Trougkakos, N.

    2012-11-01

    We present a low-cost, fully computer-controlled, Arduino-based, educational laboratory (SolarInsight) to be used in undergraduate university courses concerned with electrical engineering and physics. The major goal of the system is to provide students with the necessary instrumentation, software tools and methodology in order to learn fundamental concepts of semiconductor physics by exploring the process of an experimental physics inquiry. The system runs under the Windows operating system and is composed of a data acquisition/control board, a power supply and processing boards, sensing elements, a graphical user interface and data analysis software. The data acquisition/control board is based on the Arduino open source electronics prototyping platform. The graphical user interface and communication with the Arduino are developed in C# and C++ programming languages respectively, by using IDE Microsoft Visual Studio 2010 Professional, which is freely available to students. Finally, the data analysis is performed by using the open source, object-oriented framework ROOT. Currently the system supports five teaching activities, each one corresponding to an independent tab in the user interface. SolarInsight has been partially developed in the context of a diploma thesis conducted within the Technological Educational Institute of Piraeus under the co-supervision of the Physics and Electronic Computer Systems departments’ academic staff.

  12. Costs Associated with Implementation of Computer-Assisted Clinical Decision Support System for Antenatal and Delivery Care: Case Study of Kassena-Nankana District of Northern Ghana

    PubMed Central

    Dalaba, Maxwell Ayindenaba; Akweongo, Patricia; Williams, John; Saronga, Happiness Pius; Tonchev, Pencho; Sauerborn, Rainer; Mensah, Nathan; Blank, Antje; Kaltschmidt, Jens; Loukanova, Svetla

    2014-01-01

    Objective This study analyzed the cost of implementing a computer-assisted Clinical Decision Support System (CDSS) in selected health care centres in Ghana. Methods A descriptive cross-sectional study was conducted in the Kassena-Nankana district (KND). CDSS was deployed in selected health centres in KND as an intervention to manage patients attending antenatal clinics and the labour ward. The CDSS users were mainly nurses who were trained. Activities and associated costs involved in the implementation of CDSS (pre-intervention and intervention) were collected for the period 2009–2013 from the provider perspective. The ingredients approach was used for the cost analysis. Costs were grouped into personnel, training, overheads (recurrent costs) and equipment costs (capital cost). We calculated cost without annualizing capital cost to represent financial cost, and cost with annualized capital costs to represent economic cost. Results Twenty-two trained CDSS users (at least 2 users per health centre) participated in the study. Between April 2012 and March 2013, users managed 5,595 antenatal clients and 872 labour clients using the CDSS. We observed a decrease in the proportion of complications during delivery (pre-intervention 10.74% versus post-intervention 9.64%) and a reduction in the number of maternal deaths (pre-intervention 4 deaths versus post-intervention 1 death). The overall financial cost of CDSS implementation was US$23,316, approximately US$1,060 per CDSS user trained. Of the total cost of implementation, 48% (US$11,272) was pre-intervention cost and intervention cost was 52% (US$12,044). Equipment costs accounted for the largest proportion of financial cost: 34% (US$7,917). When economic cost was considered, the total cost of implementation was US$17,128, 26.5% lower than the financial cost. Conclusions The study provides useful information for the implementation of CDSS at health facilities to enhance health workers' adherence to practice guidelines.
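The abstract's distinction between financial and economic cost hinges on annualizing capital (equipment) costs over their useful life. A hedged sketch of the standard annualization arithmetic; the 3% discount rate and 5-year equipment life below are illustrative assumptions, not values taken from the study:

```python
def annualization_factor(rate, years):
    """Present value of an annuity paying 1 per year for `years` at `rate`."""
    return (1 - (1 + rate) ** -years) / rate

def equivalent_annual_cost(capital_cost, rate, years):
    """Spread a one-off capital cost into equal annual amounts (economic costing)."""
    return capital_cost / annualization_factor(rate, years)

# Illustrative only: the abstract's US$7,917 equipment cost, annualized
# over an assumed 5-year life at an assumed 3% discount rate.
print(round(equivalent_annual_cost(7917, 0.03, 5), 2))
```

Because only one year's worth of the annualized capital cost enters the economic total, the economic cost comes out lower than the financial cost, as the study reports.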

  13. Improved operating scenarios of the DIII-D tokamak as a result of the addition of UNIX computer systems

    SciTech Connect

    Henline, P.A.

    1995-10-01

    The increased use of UNIX-based computer systems for machine control, data handling and analysis has greatly enhanced the operating scenarios and operating efficiency of the DIII-D tokamak. This paper will describe some of these UNIX systems and their specific uses. These include the plasma control system, the electron cyclotron heating control system, the analysis of electron temperature and density measurements and the general data acquisition system (which is collecting over 130 Mbytes of data). The speed and total capability of these systems have dramatically affected the ability to operate DIII-D. The improved operating scenarios include better plasma shape control due to the more thorough MHD calculations done between shots and the new ability to see the time dependence of profile data as it relates across different spatial locations in the tokamak. Other analyses that engender improved operating capabilities will also be described.

  14. The clinical significance and management of patients with incomplete coronary angiography and the value of additional computed tomography coronary angiography.

    PubMed

    Pregowski, Jerzy; Kepka, Cezary; Kruk, Mariusz; Mintz, Gary S; Kalinczuk, Lukasz; Ciszewski, Michal; Kochanowski, Lukasz; Wolny, Rafal; Chmielak, Zbigniew; Jastrzębski, Jan; Klopotowski, Mariusz; Zalewska, Joanna; Demkow, Marcin; Karcz, Maciej; Witkowski, Adam

    2014-04-01

    To assess the anatomical background and significance of incomplete invasive coronary angiography (ICA) and to evaluate the value of coronary computed tomography angiography (CTA) in this scenario. The current study is an analysis of a high-volume center's experience with a prospective registry of coronary CTA and ICA. The target population was identified through a review of the electronic database. We included consecutive patients referred for coronary CTA after ICA that did not visualize at least one native coronary artery or by-pass graft. Between January 2009 and April 2013, 13,603 diagnostic ICA were performed. There were 45 (0.3 %) patients referred for coronary CTA after incomplete ICA. Patients were divided into 3 groups: angina symptoms without previous coronary artery by-pass grafting (CABG) (n = 11,212), angina symptoms with previous CABG (n = 986), and patients prior to valvular surgery (n = 925). ICA did not identify by-pass grafts in 21 (2.2 %) patients and native arteries in 24 (0.2 %) cases. The explanations for an incomplete ICA included: 11 ostium anomalies, 2 left main spasms, 5 access site problems, 5 ascending aorta aneurysms, and 2 tortuous take-offs of a subclavian artery. However, in 20 (44 %) patients no specific reason for the incomplete ICA was identified. After coronary CTA, revascularization was performed in 11 (24 %) patients: 6 successful repeat ICA and percutaneous intervention and 5 CABG. Incomplete ICA constitutes a rare but significant clinical problem. Coronary CTA provides adequate clinical information in these patients. PMID:24623270

  15. Advanced space power requirements and techniques. Task 1: Mission projections and requirements. Volume 3: Appendices. [cost estimates and computer programs

    NASA Technical Reports Server (NTRS)

    Wolfe, M. G.

    1978-01-01

    Contents: (1) general study guidelines and assumptions; (2) launch vehicle performance and cost assumptions; (3) satellite programs 1959 to 1979; (4) initiative mission and design characteristics; (5) satellite listing; (6) spacecraft design model; (7) spacecraft cost model; (8) mission cost model; and (9) nominal and optimistic budget program cost summaries.

  16. Novel low-cost 2D/3D switchable autostereoscopic system for notebook computers and other portable devices

    NASA Astrophysics Data System (ADS)

    Eichenlaub, Jesse B.

    1995-03-01

    Mounting a lenticular lens in front of a flat panel display is a well-known, inexpensive, and easy way to create an autostereoscopic system. Such a lens produces half resolution 3D images because half the pixels on the LCD are seen by the left eye and half by the right eye. This may be acceptable for graphics, but it makes full resolution text, as displayed by common software, nearly unreadable. Very fine alignment tolerances normally preclude the possibility of removing and replacing the lens in order to switch between 2D and 3D applications. Lenticular lens based displays are therefore limited to use as dedicated 3D devices. DTI has devised a technique which removes this limitation, allowing switching between full resolution 2D and half resolution 3D imaging modes. A second element, in the form of a concave lenticular lens array whose shape is exactly the negative of the first lens, is mounted on a hinge so that it can be swung down over the first lens array. When so positioned, the two lenses cancel optically, allowing the user to see full resolution 2D for text or numerical applications. The two lenses, having complementary shapes, naturally tend to nestle together and snap into perfect alignment when pressed together, thus obviating any need for user-operated alignment mechanisms. This system represents an ideal solution for laptop and notebook computer applications. It was devised to meet the stringent requirements of a laptop computer manufacturer, including very compact size, very low cost, little impact on existing manufacturing or assembly procedures, and compatibility with existing full resolution 2D text-oriented software as well as 3D graphics. Similar requirements apply to electronic calculators, several models of which now use LCDs for the display of graphics.

  17. Development of ANFIS models for air quality forecasting and input optimization for reducing the computational cost and time

    NASA Astrophysics Data System (ADS)

    Prasad, Kanchan; Gorai, Amit Kumar; Goyal, Pramila

    2016-03-01

    This study aims to develop adaptive neuro-fuzzy inference system (ANFIS) models for forecasting daily concentrations of five air pollutants [sulphur dioxide (SO2), nitrogen dioxide (NO2), carbon monoxide (CO), ozone (O3) and particulate matter (PM10)] in the atmosphere of a megacity (Howrah). Air pollution in the city is rising in parallel with its economy, and thus observing, forecasting and controlling air pollution becomes increasingly important because of its health impact. ANFIS serves as a basis for constructing a set of fuzzy IF-THEN rules, with appropriate membership functions, to generate the stipulated input-output pairs. The ANFIS model predictor takes the values of meteorological factors (pressure, temperature, relative humidity, dew point, visibility, wind speed, and precipitation) and the previous day's pollutant concentration, in different combinations, as inputs to predict same-day and 1-day-ahead pollutant concentrations. Concentration values of the five air pollutants and seven meteorological parameters for Howrah during the period 2009 to 2011 were used to develop the ANFIS model. Collinearity tests were conducted to eliminate redundant input variables. A forward selection (FS) method was used for selecting different subsets of input variables. Applying collinearity tests and FS techniques reduces the number of input variables and subsets, which helps reduce computational cost and time. The performance of the models was evaluated on the basis of four statistical indices (coefficient of determination, normalized mean square error, index of agreement, and fractional bias).
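The forward selection step described in the record greedily adds the input variable that most improves a model score until no addition helps. A generic sketch of that loop, assuming a caller-supplied scoring function (higher is better) rather than the study's actual selection criterion; the variable names and toy score below are illustrative:

```python
def forward_select(candidates, score):
    """Greedy forward selection: repeatedly add the candidate variable that
    most improves score(subset); stop when no candidate improves the score."""
    selected, best = [], float("-inf")
    remaining = list(candidates)
    while remaining:
        # Score every one-variable extension of the current subset.
        gains = [(score(selected + [v]), v) for v in remaining]
        top_score, top_var = max(gains)
        if top_score <= best:
            break  # no candidate improves the model; stop growing the subset
        selected.append(top_var)
        remaining.remove(top_var)
        best = top_score
    return selected

# Toy score for illustration: reward two 'useful' inputs, penalize subset size.
useful = {"temperature", "wind_speed"}
score = lambda s: len(useful & set(s)) - 0.1 * len(s)
print(forward_select(["pressure", "temperature", "wind_speed"], score))
```

With this toy score the loop picks the two useful variables and stops before adding the third, mirroring how FS shrinks the input set (and hence computational cost) in the study.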

  18. JPL Energy Consumption Program (ECP) documentation: A computer model simulating heating, cooling and energy loads in buildings. [low cost solar array efficiency

    NASA Technical Reports Server (NTRS)

    Lansing, F. L.; Chai, V. W.; Lascu, D.; Urbenajo, R.; Wong, P.

    1978-01-01

    The engineering manual provides complete companion documentation on the structure of the main program and subroutines, the preparation of input data, the interpretation of output results, access and use of the program, and a detailed description of all the analytic and logical expressions and flow charts used in the computations and program structure. A numerical example is provided and solved completely to show the sequence of computations followed. The program is carefully structured to reduce both the user's time and costs without sacrificing accuracy. The user can expect a CPU-time cost of approximately $5.00 per building zone, excluding printing costs. Accuracy, measured by the deviation of simulated consumption from watt-hour meter readings, was found in many simulation tests not to exceed a ±10 percent margin.

  19. Potlining Additives

    SciTech Connect

    Rudolf Keller

    2004-08-10

    In this project, a concept to improve the performance of aluminum production cells by introducing potlining additives was examined and tested. Boron oxide was added to cathode blocks, and titanium was dissolved in the metal pool; this resulted in the formation of titanium diboride and caused the molten aluminum to wet the carbonaceous cathode surface. Such wetting reportedly leads to operational improvements and extended cell life. In addition, boron oxide suppresses cyanide formation. This final report presents and discusses the results of this project. Substantial economic benefits for the practical implementation of the technology are projected, especially for modern cells with graphitized blocks. For example, with an energy savings of about 5% and an increase in pot life from 1500 to 2500 days, a cost savings of $0.023 per pound of aluminum produced is projected for a 200 kA pot.

  20. The role of additional computed tomography in the decision-making process on the secondary prevention in patients after systemic cerebral thrombolysis

    PubMed Central

    Sobolewski, Piotr; Kozera, Grzegorz; Szczuchniak, Wiktor; Nyka, Walenty M

    2016-01-01

    Introduction Patients with ischemic stroke undergoing intravenous (iv)-thrombolysis are routinely controlled with computed tomography on the second day to assess stroke evolution and hemorrhagic transformation (HT). However, the benefits of an additional computed tomography (aCT) performed over the next days after iv-thrombolysis have not been determined. Methods We retrospectively screened 287 Caucasian patients with ischemic stroke who were consecutively treated with iv-thrombolysis from 2008 to 2012. The results of computed tomography performed on the second (control computed tomography) and seventh (aCT) day after iv-thrombolysis were compared in 274 patients (95.5%); 13 subjects (4.5%) who died before the seventh day from admission were excluded from the analysis. Results aCTs revealed a higher incidence of HT than control computed tomographies (14.2% vs 6.6%; P=0.003). Patients with HT on aCT showed a higher median National Institutes of Health Stroke Scale score on admission than those without HT (13.0 vs 10.0; P=0.01) and a higher prevalence of ischemic changes >1/3 of the middle cerebral artery territory (66.7% vs 35.2%; P<0.01). The presence of HT on aCT correlated with the National Institutes of Health Stroke Scale score on admission (rpbi 0.15; P<0.01) and with ischemic changes >1/3 of the middle cerebral artery territory (phi=0.03), and was associated with 3-month mortality (phi=0.03). Conclusion aCT after iv-thrombolysis enables higher detection of HT, which is related to higher 3-month mortality. Thus, patients with severe middle cerebral artery infarction may benefit from aCT in the decision-making process on secondary prophylaxis. PMID:26730196

  1. Strapdown cost trend study and forecast

    NASA Technical Reports Server (NTRS)

    Eberlein, A. J.; Savage, P. G.

    1975-01-01

    The potential cost advantages offered by advanced strapdown inertial technology in future commercial short-haul aircraft are summarized. The initial procurement cost and six year cost-of-ownership, which includes spares and direct maintenance cost were calculated for kinematic and inertial navigation systems such that traditional and strapdown mechanization costs could be compared. Cost results for the inertial navigation systems showed that initial costs and the cost of ownership for traditional triple redundant gimbaled inertial navigators are three times the cost of the equivalent skewed redundant strapdown inertial navigator. The net cost advantage for the strapdown kinematic system is directly attributable to the reduction in sensor count for strapdown. The strapdown kinematic system has the added advantage of providing a fail-operational inertial navigation capability for no additional cost due to the use of inertial grade sensors and attitude reference computers.

  2. The cumulative cost of additional wakefulness: dose-response effects on neurobehavioral functions and sleep physiology from chronic sleep restriction and total sleep deprivation

    NASA Technical Reports Server (NTRS)

    Van Dongen, Hans P A.; Maislin, Greg; Mullington, Janet M.; Dinges, David F.

    2003-01-01

    were near-linearly related to the cumulative duration of wakefulness in excess of 15.84 h (s.e. 0.73 h). CONCLUSIONS: Since chronic restriction of sleep to 6 h or less per night produced cognitive performance deficits equivalent to up to 2 nights of total sleep deprivation, it appears that even relatively moderate sleep restriction can seriously impair waking neurobehavioral functions in healthy adults. Sleepiness ratings suggest that subjects were largely unaware of these increasing cognitive deficits, which may explain why the impact of chronic sleep restriction on waking cognitive functions is often assumed to be benign. Physiological sleep responses to chronic restriction did not mirror waking neurobehavioral responses, but cumulative wakefulness in excess of 15.84 h predicted performance lapses across all four experimental conditions. This suggests that sleep debt is perhaps best understood as resulting in additional wakefulness that has a neurobiological "cost" which accumulates over time.

  3. Additivity of factor effects in reading tasks is still a challenge for computational models: Reply to Ziegler, Perry, and Zorzi (2009).

    PubMed

    Besner, Derek; O'Malley, Shannon

    2009-01-01

    J. C. Ziegler, C. Perry, and M. Zorzi (2009) have claimed that their connectionist dual process model (CDP+) can simulate the data reported by S. O'Malley and D. Besner. Most centrally, they have claimed that the model simulates additive effects of stimulus quality and word frequency on the time to read aloud when words and nonwords are randomly intermixed. This work represents an important attempt given that computational models of reading processes have to date largely ignored the issue of whether it is possible to simulate additive effects. Despite CDP+'s success at capturing many other phenomena, it is clear that CDP+ fails to capture the full pattern seen with skilled readers in these experiments. PMID:19210105

  4. ANL/RBC: A computer code for the analysis of Rankine bottoming cycles, including system cost evaluation and off-design performance

    NASA Technical Reports Server (NTRS)

    Mclennan, G. A.

    1986-01-01

    This report describes, and is a User's Manual for, a computer code (ANL/RBC) which calculates cycle performance for Rankine bottoming cycles extracting heat from a specified source gas stream. The code calculates cycle power and efficiency and the sizes for the heat exchangers, using tabular input of the properties of the cycle working fluid. An option is provided to calculate the costs of system components from user defined input cost functions. These cost functions may be defined in equation form or by numerical tabular data. A variety of functional forms have been included for these functions and they may be combined to create very general cost functions. An optional calculation mode can be used to determine the off-design performance of a system when operated away from the design-point, using the heat exchanger areas calculated for the design-point.

  5. Would school closure for the 2009 H1N1 influenza epidemic have been worth the cost?: a computational simulation of Pennsylvania

    PubMed Central

    2011-01-01

    Background During the 2009 H1N1 influenza epidemic, policy makers debated over whether, when, and how long to close schools. While closing schools could have reduced influenza transmission thereby preventing cases, deaths, and health care costs, it may also have incurred substantial costs from increased childcare needs and lost productivity by teachers and other school employees. Methods A combination of agent-based and Monte Carlo economic simulation modeling was used to determine the cost-benefit of closing schools (vs. not closing schools) for different durations (range: 1 to 8 weeks) and symptomatic case incidence triggers (range: 1 to 30) for the state of Pennsylvania during the 2009 H1N1 epidemic. Different scenarios varied the basic reproductive rate (R0) from 1.2, 1.6, to 2.0 and used case-hospitalization and case-fatality rates from the 2009 epidemic. Additional analyses determined the cost per influenza case averted of implementing school closure. Results For all scenarios explored, closing schools resulted in substantially higher net costs than not closing schools. For R0 = 1.2, 1.6, and 2.0 epidemics, closing schools for 8 weeks would have resulted in median net costs of $21.0 billion (95% Range: $8.0 - $45.3 billion). The median cost per influenza case averted would have been $14,185 ($5,423 - $30,565) for R0 = 1.2, $25,253 ($9,501 - $53,461) for R0 = 1.6, and $23,483 ($8,870 - $50,926) for R0 = 2.0. Conclusions Our study suggests that closing schools during the 2009 H1N1 epidemic could have resulted in substantial costs to society as the potential costs of lost productivity and childcare could have far outweighed the cost savings in preventing influenza cases. PMID:21599920
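The "cost per influenza case averted" metric quoted in the abstract is the net societal cost of closure divided by the cases it prevents. A minimal sketch of that arithmetic; the input numbers below are placeholders for illustration, not the study's actual simulation outputs:

```python
def cost_per_case_averted(closure_cost, healthcare_savings, cases_averted):
    """Net cost of school closure (productivity/childcare costs minus the
    health care costs saved) per influenza case prevented."""
    net_cost = closure_cost - healthcare_savings
    return net_cost / cases_averted

# Placeholder inputs for illustration only.
print(round(cost_per_case_averted(1_000_000, 200_000, 50)))  # 16000
```

The study's finding that closure was not worthwhile corresponds to the first term (productivity and childcare costs) dwarfing the second (health care savings), driving this ratio into the tens of thousands of dollars per case.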

  6. Price and cost estimation

    NASA Technical Reports Server (NTRS)

    Stewart, R. D.

    1979-01-01

    Price and Cost Estimating Program (PACE II) was developed to prepare man-hour and material cost estimates. Versatile and flexible tool significantly reduces computation time and errors and reduces typing and reproduction time involved in preparation of cost estimates.

  7. Dataset of calcified plaque condition in the stenotic coronary artery lesion obtained using multidetector computed tomography to indicate the addition of rotational atherectomy during percutaneous coronary intervention.

    PubMed

    Akutsu, Yasushi; Hamazaki, Yuji; Sekimoto, Teruo; Kaneko, Kyouichi; Kodama, Yusuke; Li, Hui-Ling; Suyama, Jumpei; Gokan, Takehiko; Sakai, Koshiro; Kosaki, Ryota; Yokota, Hiroyuki; Tsujita, Hiroaki; Tsukamoto, Shigeto; Sakurai, Masayuki; Sambe, Takehiko; Oguchi, Katsuji; Uchida, Naoki; Kobayashi, Shinichi; Aoki, Atsushi; Kobayashi, Youichi

    2016-06-01

    Our data show the regional coronary artery calcium scores (lesion CAC) on multidetector computed tomography (MDCT) and the cross-sectional imaging on MDCT angiography (CTA) for the target lesion in patients with stable angina pectoris who were scheduled for percutaneous coronary intervention (PCI). CAC and CTA data were measured using a 128-slice scanner (Somatom Definition AS+; Siemens Medical Solutions, Forchheim, Germany) before PCI. CAC was measured in a non-contrast-enhanced scan, quantified using the Calcium Score module of SYNAPSE VINCENT software (Fujifilm Co., Tokyo, Japan), and expressed in Agatston units. CTA was then performed with contrast-enhanced ECG gating to measure the severity of the calcified plaque condition. We present both CAC and CTA data as a benchmark for considering the addition of rotational atherectomy during PCI for severely calcified plaque lesions. PMID:26977441

  8. 10 CFR Appendix I to Part 504 - Procedures for the Computation of the Real Cost of Capital

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... common equity, an alternate methodology to predict the firm's real after-tax marginal cost of capital may... Inflation, Charlottesville, Va.: The Financial Analysts Research Foundation, 1977, cited by Ernst &...

  9. 12 CFR Appendix K to Part 226 - Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... sample form—(1) Model form. Total Annual Loan Cost Rate Loan Terms Age of youngest borrower: Appraised... Age of youngest borrower: 75 Appraised property value: $100,000 Interest rate: 9% Monthly...

  10. 12 CFR Appendix K to Part 226 - Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... sample form—(1) Model form. Total Annual Loan Cost Rate Loan Terms Age of youngest borrower: Appraised... Age of youngest borrower: 75 Appraised property value: $100,000 Interest rate: 9% Monthly...

  11. Low-cost computer classification of land cover in the Portland area, Oregon, by signature extension techniques

    USGS Publications Warehouse

    Gaydos, Leonard

    1978-01-01

    The cost of classifying 5,607 square kilometers (2,165 sq. mi.) in the Portland area was less than 8 cents per square kilometer ($0.0788, or $0.2041 per square mile). Besides saving costs, this and other signature extension techniques may be useful for completing land use and land cover mapping in other large areas where multispectral and multitemporal Landsat data are available in digital form but other source materials are generally lacking.
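
    The quoted per-unit costs can be cross-checked with a few lines of arithmetic; the conversion factor 1 sq mi = 2.58999 km² is the only outside fact used:

```python
# Sanity-check the quoted figures from the abstract.
KM2_PER_MI2 = 2.58999

area_km2 = 5607
cost_per_km2 = 0.0788
total_cost = area_km2 * cost_per_km2       # about $442 for the whole area
cost_per_mi2 = cost_per_km2 * KM2_PER_MI2  # about $0.2041, matching the abstract
```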

  12. Computer simulation to predict energy use, greenhouse gas emissions and costs for production of fluid milk using alternative processing methods

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Computer simulation is a useful tool for benchmarking the electrical and fuel energy consumption and water use in a fluid milk plant. In this study, a computer simulation model of the fluid milk process based on high temperature short time (HTST) pasteurization was extended to include models for pr...

  13. Experimental and computational mechanistic investigation of chlorocarbene additions to bridgehead carbene-anti-Bredt systems: noradamantylcarbene-adamantene and adamantylcarbene-homoadamantene.

    PubMed

    Hare, Stephanie R; Orman, Marina; Dewan, Faizunnahar; Dalchand, Elizabeth; Buzard, Camilla; Ahmed, Sadia; Tolentino, Julia C; Sethi, Ulweena; Terlizzi, Kelly; Houferak, Camille; Stein, Aliza M; Stedronsky, Alexandra; Thamattoor, Dasan M; Tantillo, Dean J; Merrer, Dina C

    2015-05-15

    Cophotolysis of noradamantyldiazirine with the phenanthride precursor of dichlorocarbene or phenylchlorodiazirine in pentane at room temperature produces noradamantylethylenes in 11% yield with slight diastereoselectivity. Cophotolysis of adamantyldiazirine with phenylchlorodiazirine in pentane at room temperature generates adamantylethylenes in 6% yield with no diastereoselectivity. (1)H NMR showed the reaction of noradamantyldiazirine + phenylchlorodiazirine to be independent of solvent, and the rate of noradamantyldiazirine consumption correlated with the rate of ethylene formation. Laser flash photolysis showed that reaction of phenylchlorocarbene + adamantene was independent of adamantene concentration. The reaction of phenylchlorocarbene + homoadamantene produces the ethylene products with k = 9.6 × 10(5) M(-1) s(-1). Calculations at the UB3LYP/6-31+G(d,p) and UM062X/6-31+G(d,p)//UB3LYP/6-31+G(d,p) levels show the formation of exocyclic ethylenes to proceed (a) on the singlet surface via stepwise addition of phenylchlorocarbene (PhCCl) to bridgehead alkenes adamantene and homoadamantene, respectively, producing an intermediate singlet diradical in each case, or (b) via addition of PhCCl to the diazo analogues of noradamantyl- and adamantyldiazirine. Preliminary direct dynamics calculations on adamantene + PhCCl show a high degree of recrossing (68%), indicative of a flat transition state surface. Overall, 9% of the total trajectories formed noradamantylethylene product, each proceeding via the computed singlet diradical. PMID:25902301

  14. ECG boy: low-cost medical instrumentation using mass-produced, hand-held entertainment computers. A preliminary report.

    PubMed

    Rohde, M M; Bement, S L; Lupa, R S

    1998-01-01

    A prototype low-cost, portable ECG monitor, the "ECG Boy," is described. A mass-produced hand-held video game platform is the basis for a complete three-lead, driven right-leg electrocardiogram (ECG). The ECG circuitry is planned to fit in a standard modular cartridge that is inserted into a production Nintendo "Gameboy." The combination is slightly smaller than a paperback book and weighs less than 500 g. The unit contains essential safety features such as optical isolation and is powered by 9-V and AA batteries. Functionally, the ECG Boy permits viewing ECG recordings in real time on the integrated screen. The user can select both the lead displayed on the screen and the time scale used. A 1-mV reference allows for calibration. Other ECG enhancements such as data transmission via telephone can be easily and inexpensively added to this system. The ECG Boy is intended as a proof of concept for a new class of low-cost biomedical instruments. Rising health care costs coupled with tightened funding have created an acute demand for low-cost medical equipment that satisfies safety and quality standards. A mass-produced microprocessor-based platform designed for the entertainment market can keep costs low while providing a functional basis for a biomedical instrument. PMID:9800006

  15. [The cost of quality assurance].

    PubMed

    Materson, B J; Quintana, O

    1993-01-01

    This paper views quality assurance costs as appraisal costs. We used cost accounting techniques to estimate the cost of quality assurance activities in a large university-affiliated Veterans Administration Medical Center. In addition to the personnel employed full-time for quality assurance activities, all other employees in or directly in support of clinical services were interviewed to determine the percent of their work time devoted to specific quality assurance activities. The percent time committed was multiplied by the salary and benefits package for each employee, and the total was computed for the facility. In addition, non-salary overhead expenses were estimated by multiplying the salary and fringe benefit costs by the ratio of total medical center non-personnel costs to total medical center costs. We found that 3.39 percent of the total budget, or $4,884,775, was devoted to quality assurance activities. The highest costs aside from the designated quality assurance personnel were for pharmacy, laboratory, extended care (including nursing home), psychiatry, and nursing services. We did not attempt a formal benefit analysis. We concluded that quality assurance activities in a major medical center are not free. Careful cost accounting studies should be performed both to determine the cost of quality assurance and to identify its specific benefits. PMID:8322107
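
    The costing method described above, percent time multiplied by salary plus benefits, with overhead scaled by the non-personnel cost ratio, can be sketched as follows (the figures below are illustrative, not from the study):

```python
def qa_cost(employees, nonpersonnel_ratio):
    """Estimate quality assurance cost.

    employees: list of (salary_plus_benefits, fraction_of_time_on_QA).
    nonpersonnel_ratio: medical-center non-personnel costs / total costs.
    """
    salary_cost = sum(pay * frac for pay, frac in employees)
    overhead = salary_cost * nonpersonnel_ratio  # non-salary overhead estimate
    return salary_cost + overhead

# Two hypothetical employees: 10% and 5% of their time on QA.
total = qa_cost([(60000, 0.10), (45000, 0.05)], nonpersonnel_ratio=0.30)
```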

  16. 12 CFR Appendix K to Part 1026 - Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...%. (ii) In using these iteration procedures, it is expected that calculators or computers will be... programmable calculator and the iteration procedure described in appendix J of this part. (9) Assumption...

  17. 12 CFR Appendix K to Part 226 - Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... decimals, is 48.53%. (ii) In using these iteration procedures, it is expected that calculators or computers... using a 10-digit programmable calculator and the iteration procedure described in appendix J of...

  18. 12 CFR Appendix K to Part 1026 - Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...) In using these iteration procedures, it is expected that calculators or computers will be programmed... programmable calculator and the iteration procedure described in Appendix J of this part. (9) Assumption...
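
    The iteration procedure these appendices refer to amounts to solving for the rate that makes the compounded advances equal the amount owed. A minimal sketch using bisection on a hypothetical single-advance case (this is not the regulation's exact schedule of monthly advances):

```python
def solve_rate(advance, years, amount_owed, lo=0.0, hi=2.0, tol=1e-9):
    """Bisection on the annual rate i such that advance*(1+i)**years == amount_owed."""
    def f(i):
        return advance * (1 + i) ** years - amount_owed  # monotonically increasing in i
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) < 0:
            lo = mid  # rate too low: future value still below amount owed
        else:
            hi = mid
    return (lo + hi) / 2

# A $1,000 advance growing to the balance implied by a 9% annual rate over 10 years.
rate = solve_rate(1000, 10, 1000 * 1.09 ** 10)
```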

  19. Middleware enabling computational self-reflection: exploring the need for and some costs of self-reflecting networks with application to homeland defense

    NASA Astrophysics Data System (ADS)

    Kramer, Michael J.; Bellman, Kirstie L.; Landauer, Christopher

    2002-07-01

    This paper reviews and examines the definitions of Self-Reflection and Active Middleware. It then illustrates a conceptual framework for understanding and enumerating the costs of Self-Reflection and Active Middleware at increasing levels of application, and reviews some applications of Self-Reflection and Active Middleware to simulations. Finally, it considers the application, and the additional kinds of costs, of applying Self-Reflection and Active Middleware to sharing information among the organizations expected to participate in Homeland Defense.

  20. INTEGRATED AIR POLLUTION CONTROL FOR COAL-FIRED UTILITY BOILERS: A COMPUTER MODEL APPROACH FOR DESIGN AND COST-ESTIMATING

    EPA Science Inventory

    The paper describes the Integrated Air Pollution Control System (IAPCS), a computerized program that can be used to estimate the cost and performance of pre-combustion, in situ, and post-combustion air pollution control configurations in pulverized-coal-fired utility boilers of 1...

  1. User manual for GEOCITY: a computer model for cost analysis of geothermal district-heating-and-cooling systems. Volume II. Appendices

    SciTech Connect

    Huber, H.D.; Fassbender, L.L.; Bloomster, C.H.

    1982-09-01

    The purpose of this model is to calculate the costs of residential space heating, space cooling, and sanitary water heating or process heating (cooling) using geothermal energy from a hydrothermal reservoir. The model can calculate geothermal heating and cooling costs for residential developments, a multi-district city, or a point demand such as an industrial factory or commercial building. Volume II contains all the appendices, including cost equations and models for the reservoir and fluid transmission system and the distribution system, descriptions of predefined residential district types for the distribution system, key equations for the cooling degree hour methodology, and a listing of the sample case output. Both volumes include the complete table of contents and lists of figures and tables. In addition, both volumes include the indices for the input parameters and subroutines defined in the user manual.

  2. Cost-Estimation Program

    NASA Technical Reports Server (NTRS)

    Cox, Brian

    1995-01-01

    The COSTIT computer program estimates the cost of an electronic design by reading an item-list file and a file containing the cost of each item. The accuracy of the cost estimate depends on the accuracy of the cost-list file. COSTIT was written using the AWK utility for Sun4-series computers running SunOS 4.x and for IBM PC-series and compatible computers running MS-DOS. The Sun version is NPO-19587; the PC version is NPO-19157.
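
    COSTIT's approach, joining an item list against a cost list and summing, can be sketched in a few lines (Python stands in for the original AWK, and the whitespace-separated record formats are assumptions):

```python
def estimate_cost(item_lines, price_lines):
    """item_lines: 'part qty' records; price_lines: 'part unit_cost' records."""
    prices = {}
    for line in price_lines:
        part, cost = line.split()
        prices[part] = float(cost)  # build the cost-list lookup table
    total = 0.0
    for line in item_lines:
        part, qty = line.split()
        total += int(qty) * prices[part]  # quantity times unit cost
    return total

# Two resistors at $0.10 and three capacitors at $0.05 each.
total = estimate_cost(["R1 2", "C1 3"], ["R1 0.10", "C1 0.05"])
```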

  3. Design and implementation of a medium speed communications interface and protocol for a low cost, refreshed display computer

    NASA Technical Reports Server (NTRS)

    Phyne, J. R.; Nelson, M. D.

    1975-01-01

    The design and implementation of hardware and software systems involved in using a 40,000 bit/second communication line as the connecting link between an IMLAC PDS 1-D display computer and a Univac 1108 computer system were described. The IMLAC consists of two independent processors sharing a common memory. The display processor generates the deflection and beam control currents as it interprets a program contained in the memory; the minicomputer has a general instruction set and is responsible for starting and stopping the display processor and for communicating with the outside world through the keyboard, teletype, light pen, and communication line. The processing time associated with each data byte was minimized by designing the input and output processes as finite state machines which automatically sequence from each state to the next. Several tests of the communication link and the IMLAC software were made using a special low capacity computer grade cable between the IMLAC and the Univac.
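
    The input and output processes described above were designed as finite state machines that sequence automatically from state to state, keeping the per-byte processing time minimal. A small illustration of the idea, using hypothetical STX/ETX framing rather than the actual IMLAC protocol:

```python
# Minimal receive-side state machine: each incoming byte drives one
# transition, so the per-byte work is a single comparison or append.
STX, ETX = 0x02, 0x03  # hypothetical frame delimiters

def receive(stream):
    state, payload = "IDLE", []
    for byte in stream:
        if state == "IDLE" and byte == STX:
            state = "DATA"          # frame start seen
        elif state == "DATA":
            if byte == ETX:
                state = "IDLE"      # frame complete
                yield bytes(payload)
                payload = []
            else:
                payload.append(byte)

frames = list(receive([0x02, 0x41, 0x42, 0x03]))  # -> [b"AB"]
```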

  4. Computer architecture providing high-performance and low-cost solutions for fast fMRI reconstruction

    NASA Astrophysics Data System (ADS)

    Chao, Hui; Goddard, J. Iain

    1998-07-01

    Due to the dynamic nature of brain studies in functional magnetic resonance imaging (fMRI), fast pulse sequences such as echo planar imaging (EPI) and spiral are often used for higher temporal resolution. Hundreds of frames of two-dimensional (2-D) images or multiple three-dimensional (3-D) images are often acquired to cover a larger space and time range. Therefore, fMRI often requires much larger data storage, a faster data transfer rate, and higher processing power than conventional MRI. In Mercury Computer Systems' PCI-based embedded computer system, the architecture allows the concurrent use of a DMA engine for data transfer and a CPU for data processing. This architecture allows a multicomputer to distribute processing and data with minimal time spent transferring data. Different types and numbers of processors are available to optimize system performance for the application. The fMRI reconstruction was first implemented in Mercury's PCI-based embedded computer system using one digital signal processing (DSP) chip, with the host computer running under the Windows NT platform. Double buffers in SRAM or cache were created for concurrent I/O and processing. The fMRI reconstruction was then implemented in parallel using multiple DSP chips. Data transfer and interprocessor synchronization were carefully managed to optimize algorithm efficiency. The image reconstruction times were measured with different numbers of processors ranging from one to 10. With one DSP chip, the time for reconstructing 100 fMRI images measuring 128 X 64 pixels was 1.24 seconds, which is already faster than most existing commercial MRI systems. This PCI-based embedded multicomputer architecture, which shows a nearly linear improvement in performance, provides high performance for fMRI processing. In summary, this embedded multicomputer system allows the choice of computer topologies to fit the specific application and achieve maximum system performance.
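
    The double-buffering pattern described, concurrent DMA transfer and CPU processing, can be sketched with a bounded queue standing in for the two SRAM buffers (an illustration of the pattern only, not Mercury's API):

```python
import queue
import threading

def run(blocks, process):
    """Process data blocks while the next blocks are still being 'transferred'."""
    q = queue.Queue(maxsize=2)  # at most two buffers in flight, as in double buffering
    results = []

    def producer():
        for b in blocks:
            q.put(b)        # stands in for a DMA transfer into a free buffer
        q.put(None)         # end-of-stream sentinel

    t = threading.Thread(target=producer)
    t.start()
    while (b := q.get()) is not None:
        results.append(process(b))  # CPU work overlaps with the next transfer
    t.join()
    return results

results = run([[1, 2], [3]], sum)  # -> [3, 3]
```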

  5. A computational study of the addition of ReO3L (L = Cl(-), CH3, OCH3 and Cp) to ethenone.

    PubMed

    Aniagyei, Albert; Tia, Richard; Adei, Evans

    2016-01-01

    The periselectivity and chemoselectivity of the addition of transition metal oxides of the type ReO3L (L = Cl(-), CH3, OCH3 and Cp) to ethenone have been explored at the M06 and B3LYP/LACVP* levels of theory. The activation barriers and reaction energies for the stepwise and concerted addition pathways involving multiple spin states have been computed. In the reaction of ReO3L (L = Cl(-), OCH3, CH3 and Cp) with ethenone, the concerted [2 + 2] addition of the metal oxide across the C=C and C=O double bonds to form either metalla-2-oxetane-3-one or metalla-2,4-dioxolane is the most kinetically favored over the formation of metalla-2,5-dioxolane-3-one from the direct [3 + 2] addition pathway. The trends in activation barriers for the formation of metalla-2-oxetane-3-one and metalla-2,4-dioxolane are Cp < Cl(-) < OCH3 < CH3 and Cp < OCH3 < CH3 < Cl(-), respectively, while the corresponding trends in reaction energies are Cp < OCH3 < Cl(-) < CH3 and Cp < CH3 < OCH3 < Cl(-). The concerted [3 + 2] addition of the metal oxide across the C=C double bond of ethenone to form metalla-2,5-dioxolane-3-one is thermodynamically the most favored for the ligand L = Cp. The direct [2 + 2] addition pathways leading to the formation of metalla-2-oxetane-3-one and metalla-2,4-dioxolane are thermodynamically the most favored for the ligands L = OCH3 and Cl(-). The differences between the calculated [2 + 2] activation barriers for the addition of the metal oxide LReO3 across the C=C and C=O functionalities of ethenone are small except for L = Cl(-) and OCH3. The rearrangements of metalla-2-oxetane-3-one to metalla-2,5-dioxolane-3-one, even though feasible, are unfavorable due to the high activation energies of their rate-determining steps. For the rearrangement of metalla-2-oxetane-3-one to metalla-2,5-dioxolane-3-one, the trend in activation barriers is found to follow the order OCH3 < Cl(-) < CH3 < Cp. The trends in the activation energies for

  6. Additive Manufacturing of Single-Crystal Superalloy CMSX-4 Through Scanning Laser Epitaxy: Computational Modeling, Experimental Process Development, and Process Parameter Optimization

    NASA Astrophysics Data System (ADS)

    Basak, Amrita; Acharya, Ranadip; Das, Suman

    2016-06-01

    This paper focuses on additive manufacturing (AM) of single-crystal (SX) nickel-based superalloy CMSX-4 through scanning laser epitaxy (SLE). SLE, a powder bed fusion-based AM process, was explored for the purpose of producing crack-free, dense deposits of CMSX-4 on top of similar-chemistry investment-cast substrates. Optical microscopy and scanning electron microscopy (SEM) investigations revealed the presence of dendritic microstructures that consisted of fine γ' precipitates within the γ matrix in the deposit region. Computational fluid dynamics (CFD)-based process modeling, statistical design of experiments (DoE), and microstructural characterization techniques were combined to produce metallurgically bonded single-crystal deposits of more than 500 μm height in a single pass along the entire length of the substrate. A customized quantitative metallography-based image analysis technique was employed for automatic extraction of various deposit quality metrics from the digital cross-sectional micrographs. The processing parameters were varied, and optimal processing windows were identified to obtain good quality deposits. The results reported here represent one of the few successes obtained in producing single-crystal epitaxial deposits through a powder bed fusion-based metal AM process and thus demonstrate the potential of SLE to repair and manufacture single-crystal hot section components of gas turbine systems from nickel-based superalloy powders.

  7. ECG-Based Detection of Early Myocardial Ischemia in a Computational Model: Impact of Additional Electrodes, Optimal Placement, and a New Feature for ST Deviation

    PubMed Central

    Loewe, Axel; Schulze, Walther H. W.; Jiang, Yuan; Wilhelms, Mathias; Luik, Armin; Dössel, Olaf; Seemann, Gunnar

    2015-01-01

    In case of chest pain, immediate diagnosis of myocardial ischemia is required to respond with an appropriate treatment. The diagnostic capability of the electrocardiogram (ECG), however, is strongly limited for ischemic events that do not lead to ST elevation. This computational study investigates the potential of different electrode setups in detecting early ischemia at 10 minutes after onset: standard 3-channel and 12-lead ECG as well as body surface potential maps (BSPMs). Further, it was assessed if an additional ECG electrode with optimized position or the right-sided Wilson leads can improve sensitivity of the standard 12-lead ECG. To this end, a simulation study was performed for 765 different locations and sizes of ischemia in the left ventricle. Improvements by adding a single, subject specifically optimized electrode were similar to those of the BSPM: 2–11% increased detection rate depending on the desired specificity. Adding right-sided Wilson leads had negligible effect. Absence of ST deviation could not be related to specific locations of the ischemic region or its transmurality. As alternative to the ST time integral as a feature of ST deviation, the K point deviation was introduced: the baseline deviation at the minimum of the ST-segment envelope signal, which increased 12-lead detection rate by 7% for a reasonable threshold. PMID:26587538

  8. Additive Manufacturing of Single-Crystal Superalloy CMSX-4 Through Scanning Laser Epitaxy: Computational Modeling, Experimental Process Development, and Process Parameter Optimization

    NASA Astrophysics Data System (ADS)

    Basak, Amrita; Acharya, Ranadip; Das, Suman

    2016-08-01

    This paper focuses on additive manufacturing (AM) of single-crystal (SX) nickel-based superalloy CMSX-4 through scanning laser epitaxy (SLE). SLE, a powder bed fusion-based AM process, was explored for the purpose of producing crack-free, dense deposits of CMSX-4 on top of similar-chemistry investment-cast substrates. Optical microscopy and scanning electron microscopy (SEM) investigations revealed the presence of dendritic microstructures that consisted of fine γ' precipitates within the γ matrix in the deposit region. Computational fluid dynamics (CFD)-based process modeling, statistical design of experiments (DoE), and microstructural characterization techniques were combined to produce metallurgically bonded single-crystal deposits of more than 500 μm height in a single pass along the entire length of the substrate. A customized quantitative metallography-based image analysis technique was employed for automatic extraction of various deposit quality metrics from the digital cross-sectional micrographs. The processing parameters were varied, and optimal processing windows were identified to obtain good quality deposits. The results reported here represent one of the few successes obtained in producing single-crystal epitaxial deposits through a powder bed fusion-based metal AM process and thus demonstrate the potential of SLE to repair and manufacture single-crystal hot section components of gas turbine systems from nickel-based superalloy powders.

  9. Tandem β-elimination/hetero-michael addition rearrangement of an N-alkylated pyridinium oxime to an O-alkylated pyridine oxime ether: an experimental and computational study.

    PubMed

    Picek, Igor; Vianello, Robert; Šket, Primož; Plavec, Janez; Foretić, Blaženka

    2015-02-20

    A novel OH(-)-promoted tandem reaction involving C(β)-N(+)(pyridinium) cleavage and ether C(β)-O(oxime) bond formation in aqueous media has been presented. The study fully elucidates the fascinating reaction behavior of N-benzoylethylpyridinium-4-oxime chloride in aqueous media under mild reaction conditions. The reaction journey begins with the exclusive β-elimination and formation of pyridine-4-oxime and phenyl vinyl ketone and ends with the formation of O-alkylated pyridine oxime ether. A combination of experimental and computational studies enabled the introduction of a new type of rearrangement process that involves a unique tandem reaction sequence. We showed that (E)-O-benzoylethylpyridine-4-oxime is formed in aqueous solution by a base-induced tandem β-elimination/hetero-Michael addition rearrangement of (E)-N-benzoylethylpyridinium-4-oximate, the novel synthetic route to this engaging target class of compounds. The complete mechanistic picture of this rearrangement process was presented and discussed in terms of the E1cb reaction scheme within the rate-limiting β-elimination step. PMID:25562471

  10. OPTIM: Computer program to generate a vertical profile which minimizes aircraft fuel burn or direct operating cost. User's guide

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The program generates a profile of altitude, airspeed, and flight path angle as a function of range between a given origin and destination for particular models of transport aircraft provided by NASA. Inputs to the program include the vertical wind profile, the aircraft takeoff weight, the costs of time and fuel, certain constraint parameters, and control flags. The profile can be near optimum in the sense of minimizing: (1) fuel, (2) time, or (3) a combination of fuel and time (direct operating cost (DOC)). The user can also, as an option, specify the length of time the flight is to span. The theory behind the technical details of this program is also presented.
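
    The cost index being minimized combines fuel and time, weighted by their respective unit costs; setting either cost to zero recovers the pure minimum-fuel or minimum-time case. A sketch with illustrative numbers (not OPTIM's actual cost model):

```python
def direct_operating_cost(fuel_kg, time_hr, cost_fuel_per_kg, cost_time_per_hr):
    # DOC = fuel cost + time cost; the optimizer trades one against the other.
    return cost_fuel_per_kg * fuel_kg + cost_time_per_hr * time_hr

# Illustrative trip: 5000 kg of fuel burned over a 2-hour flight.
doc = direct_operating_cost(5000, 2.0, cost_fuel_per_kg=0.8, cost_time_per_hr=900)
```

    With the cost of time set to zero the index reduces to fuel burn alone, which is the minimum-fuel option listed above.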

  11. User manual for AQUASTOR: a computer model for cost analysis of aquifer thermal energy storage coupled with district heating or cooling systems. Volume I. Main text

    SciTech Connect

    Huber, H.D.; Brown, D.R.; Reilly, R.W.

    1982-04-01

    A computer model called AQUASTOR was developed for calculating the cost of district heating (cooling) using thermal energy supplied by an aquifer thermal energy storage (ATES) system. The AQUASTOR model can simulate ATES district heating systems using stored hot water or ATES district cooling systems using stored chilled water. AQUASTOR simulates the complete ATES district heating (cooling) system, which consists of two principal parts: the ATES supply system and the district heating (cooling) distribution system. The supply system submodel calculates the life-cycle cost of thermal energy supplied to the distribution system by simulating the technical design and cash flows for the exploration, development, and operation of the ATES supply system. The distribution system submodel calculates the life-cycle cost of heat (chill) delivered by the distribution system to the end-users by simulating the technical design and cash flows for the construction and operation of the distribution system. The model combines the technical characteristics of the supply system and the technical characteristics of the distribution system with financial and tax conditions for the entities operating the two systems into one techno-economic model. This provides the flexibility to individually or collectively evaluate the impact of different economic and technical parameters, assumptions, and uncertainties on the cost of providing district heating (cooling) with an ATES system. This volume contains the main text, including introduction, program description, input data instruction, a description of the output, and Appendix H, which contains the indices for supply input parameters, distribution input parameters, and AQUASTOR subroutines.
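
    The life-cycle costing that both submodels perform reduces, in its simplest form, to dividing the present value of costs by the present value of delivered energy. A generic sketch under that simplification (not AQUASTOR's actual cash-flow model; all parameter names are our own):

```python
def levelized_cost(capital, annual_om, annual_energy, rate, years):
    """Levelized cost of delivered heat: PV of costs / PV of delivered energy."""
    pv_costs = capital + sum(annual_om / (1 + rate) ** t for t in range(1, years + 1))
    pv_energy = sum(annual_energy / (1 + rate) ** t for t in range(1, years + 1))
    return pv_costs / pv_energy

# Illustrative system: $100 capital, $10/yr O&M, 50 units of heat/yr.
cost_per_unit = levelized_cost(100.0, 10.0, 50.0, rate=0.1, years=20)
```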

  12. 18F-Fluorodeoxyglucose Positron Emission Tomography/Computed Tomography Accuracy in the Staging of Non-Small Cell Lung Cancer: Review and Cost-Effectiveness

    PubMed Central

    Gómez León, Nieves; Escalona, Sofía; Bandrés, Beatriz; Belda, Cristobal; Callejo, Daniel; Blasco, Juan Antonio

    2014-01-01

    The aim of this clinical study was to compare the accuracy and cost-effectiveness of PET/CT in the staging of non-small cell lung cancer (NSCLC). Material and Methods. This was a cross-sectional and prospective study including 103 patients with histologically confirmed NSCLC. All patients were examined using PET/CT with intravenous contrast medium. Those with disease stage ≤IIB underwent surgery (n = 40). Disease stage was confirmed based on histology results, which were compared with those of PET/CT and of positron emission tomography (PET) and computed tomography (CT) separately. The 63 patients classified with ≥IIIA disease stage by PET/CT did not undergo surgery. The cost-effectiveness of PET/CT for disease classification was examined using a decision tree analysis. Results. Compared with histology, PET/CT staging had a positive predictive value of 80%, a negative predictive value of 95%, a sensitivity of 94%, and a specificity of 82%. For PET alone, these values were 53%, 66%, 60%, and 50%; for CT alone they were 68%, 86%, 76%, and 72%, respectively. The incremental cost-effectiveness of PET/CT over CT alone was €17,412 per quality-adjusted life-year (QALY). Conclusion. In our clinical study, PET/CT using intravenous contrast medium was an accurate and cost-effective method for staging patients with NSCLC. PMID:25431665
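
    The four accuracy figures reported are all derived from one 2x2 table of test results versus histology. A small helper makes the definitions explicit (the counts below are illustrative, not the study's data):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard accuracy measures from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),  # true positives among all diseased
        "specificity": tn / (tn + fp),  # true negatives among all healthy
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

m = diagnostic_metrics(tp=9, fp=2, tn=8, fn=1)  # sensitivity 0.9, specificity 0.8
```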

  13. Setting up a Low-Cost Lab Management System for a Multi-Purpose Computing Laboratory Using Virtualisation Technology

    ERIC Educational Resources Information Center

    Mok, Heng Ngee; Lee, Yeow Leong; Tan, Wee Kiat

    2012-01-01

    This paper describes how a generic computer laboratory equipped with 52 workstations is set up for teaching IT-related courses and other general purpose usage. The authors have successfully constructed a lab management system based on decentralised, client-side software virtualisation technology using Linux and free software tools from VMware that…

  14. Accuracy of a Low-Cost Novel Computer-Vision Dynamic Movement Assessment: Potential Limitations and Future Directions

    NASA Astrophysics Data System (ADS)

    McGroarty, M.; Giblin, S.; Meldrum, D.; Wetterling, F.

    2016-04-01

    The aim of the study was to perform a preliminary validation of a low-cost markerless motion capture system (CAPTURE) against an industry gold standard (Vicon). Measurements of knee valgus and flexion during the performance of a countermovement jump (CMJ) were compared between CAPTURE and Vicon. After correction algorithms were applied to the raw CAPTURE data, acceptable levels of accuracy and precision were achieved. The knee flexion angle measured for three trials using CAPTURE deviated by -3.8° ± 3° (left) and 1.7° ± 2.8° (right) compared to Vicon. The findings suggest that low-cost markerless motion capture has the potential to provide an objective method for assessing lower-limb jump and landing mechanics in an applied sports setting. Furthermore, the outcome of the study warrants future research to examine more fully the potential implications of using low-cost markerless motion capture in the evaluation of dynamic movement for injury prevention.

  15. Reducing Communication in Algebraic Multigrid Using Additive Variants

    SciTech Connect

    Vassilevski, Panayot S.; Yang, Ulrike Meier

    2014-02-12

    Algebraic multigrid (AMG) has proven to be an effective scalable solver on many high-performance computers. However, its increasing communication complexity on coarser levels has been shown to seriously impact its performance on computers with high communication cost. Additive AMG variants provide increased parallelism and fewer messages per cycle, but they generally exhibit slower convergence. Here we present several new additive variants with convergence rates that are significantly improved over the classical additive algebraic multigrid method, and we investigate their potential for decreased communication and improved communication-computation overlap, features that are essential for good performance on future exascale architectures.

  16. Money for Research, Not for Energy Bills: Finding Energy and Cost Savings in High Performance Computer Facility Designs

    SciTech Connect

    Drewmark Communications; Sartor, Dale; Wilson, Mark

    2010-07-01

    High-performance computing facilities in the United States consume an enormous amount of electricity, cutting into research budgets and challenging public- and private-sector efforts to reduce energy consumption and meet environmental goals. However, these facilities can greatly reduce their energy demand through energy-efficient design of the facility itself. Using a case study of a facility under design, this article discusses strategies and technologies that can be used to help achieve energy reductions.

  17. User manual for AQUASTOR: a computer model for cost analysis of aquifer thermal-energy storage coupled with district-heating or cooling systems. Volume II. Appendices

    SciTech Connect

    Huber, H.D.; Brown, D.R.; Reilly, R.W.

    1982-04-01

    A computer model called AQUASTOR was developed for calculating the cost of district heating (cooling) using thermal energy supplied by an aquifer thermal energy storage (ATES) system. The AQUASTOR model can simulate ATES district heating systems using stored hot water or ATES district cooling systems using stored chilled water. AQUASTOR simulates the complete ATES district heating (cooling) system, which consists of two principal parts: the ATES supply system and the district heating (cooling) distribution system. The supply system submodel calculates the life-cycle cost of thermal energy supplied to the distribution system by simulating the technical design and cash flows for the exploration, development, and operation of the ATES supply system. The distribution system submodel calculates the life-cycle cost of heat (chill) delivered by the distribution system to the end-users by simulating the technical design and cash flows for the construction and operation of the distribution system. The model combines the technical characteristics of the supply system and the technical characteristics of the distribution system with financial and tax conditions for the entities operating the two systems into one techno-economic model. This provides the flexibility to individually or collectively evaluate the impact of different economic and technical parameters, assumptions, and uncertainties on the cost of providing district heating (cooling) with an ATES system. This volume contains all the appendices, including supply and distribution system cost equations and models, descriptions of predefined residential districts, key equations for the cooling degree-hour methodology, a listing of the sample case output, and Appendix H, which contains the indices for supply input parameters, distribution input parameters, and AQUASTOR subroutines.

  18. Cost-effective pediatric head and body phantoms for computed tomography dosimetry and its evaluation using pencil ion chamber and CT dose profiler

    PubMed Central

    Saravanakumar, A.; Vaideki, K.; Govindarajan, K. N.; Jayakumar, S.; Devanand, B.

    2015-01-01

    In the present work, a pediatric head and body phantom was fabricated using polymethyl methacrylate (PMMA) at a low cost when compared to commercially available phantoms for the purpose of computed tomography (CT) dosimetry. The dimensions of the head and body phantoms were 10 cm diameter, 15 cm length and 16 cm diameter, 15 cm length, respectively. The doses received by the head and body phantoms at the center and periphery from a 128-slice CT machine were measured using a 100 mm pencil ion chamber and a 150 mm CT dose profiler (CTDP). Using these values, the weighted computed tomography dose index (CTDIw) and in turn the volumetric CTDI (CTDIv) were calculated for various combinations of tube voltage and current-time product. A similar study was carried out using a standard calibrated phantom, and the results were compared with those of the fabricated phantoms to ascertain that the performance of the latter is equivalent to that of the former. Finally, CTDIv measured using the fabricated and standard phantoms was compared with the respective values displayed on the console. The difference between the values was well within the limits specified by the Atomic Energy Regulatory Board (AERB), India. These results indicate that the cost-effective pediatric phantom can be employed for CT dosimetry. PMID:26500404
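    The CTDI quantities used in this record follow the standard definitions: the weighted index combines one third of the central chamber reading with two thirds of the peripheral reading, and the volumetric index divides the weighted value by the helical pitch. A minimal sketch in Python, with hypothetical chamber readings (not values from the study):

```python
def ctdi_w(center_mgy, periphery_mgy):
    """Weighted CTDI: one third of the central reading plus
    two thirds of the (average) peripheral reading, in mGy."""
    return center_mgy / 3.0 + 2.0 * periphery_mgy / 3.0

def ctdi_vol(ctdi_w_mgy, pitch):
    """Volumetric CTDI: weighted CTDI divided by the helical pitch
    (for axial scans the pitch factor is effectively 1)."""
    return ctdi_w_mgy / pitch

# hypothetical chamber readings, not values from the study
w = ctdi_w(10.0, 13.0)      # 10/3 + 2*13/3 = 12 mGy
v = ctdi_vol(w, 1.0)
```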

  19. Construction and field test of a programmable and self-cleaning auto-sampler controlled by a low-cost one-board computer

    NASA Astrophysics Data System (ADS)

    Stadler, Philipp; Farnleitner, Andreas H.; Zessner, Matthias

    2016-04-01

    This presentation describes in-depth how a low cost micro-computer was used for substantial improvement of established measuring systems due to the construction and implementation of a purposeful complementary device for on-site sample pretreatment. A fully automated on-site device was developed and field-tested, that enables water sampling with simultaneous filtration as well as effective cleaning procedure of the devicés components. The described auto-sampler is controlled by a low-cost one-board computer and designed for sample pre-treatment, with minimal sample alteration, to meet requirements of on-site measurement devices that cannot handle coarse suspended solids within the measurement procedure or -cycle. The automated sample pretreatment was tested for over one year for rapid and on-site enzymatic activity (beta-D-glucuronidase, GLUC) determination in sediment laden stream water. The formerly used proprietary sampling set-up was assumed to lead to a significant damping of the measurement signal due to its susceptibility to clogging, debris- and bio film accumulation. Results show that the installation of the developed apparatus considerably enhanced error-free running time of connected measurement devices and increased the measurement accuracy to an up-to-now unmatched quality.

  20. Development and implementation of a low-cost phantom for quality control in cone beam computed tomography.

    PubMed

    Batista, W O; Navarro, M V T; Maia, A F

    2013-12-01

    A phantom for quality control in cone beam computed tomography (CBCT) scanners was designed and constructed, and a methodology for testing was developed. The phantom had a polymethyl methacrylate structure filled with water and plastic objects that allowed the assessment of parameters related to quality control. The phantom allowed the evaluation of essential parameters in CBCT as well as the evaluation of linear and angular dimensions. The plastics used in the phantom were chosen so that their density and linear attenuation coefficient were similar to those of human facial structures. Three types of CBCT equipment, with two different technological concepts, were evaluated. The results of the assessment of the accuracy of linear and angular dimensions agreed with the existing standards. However, other parameters such as computed tomography number accuracy, uniformity and high-contrast detail did not meet the tolerances established in current regulations or the manufacturer's specifications. The results demonstrate the importance of establishing specific protocols and phantoms, which meet the specificities of CBCT. The practicality of implementation, the quality control test results for the proposed phantom and the consistency of the results using different equipment demonstrate its adequacy. PMID:23838096
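    The uniformity parameter mentioned above is commonly evaluated as the largest difference between the mean CT number of a central region of interest and the means of peripheral ones. A small illustrative sketch (the ROI values are hypothetical, and the applicable tolerance depends on the regulation in force):

```python
def uniformity_hu(center_roi, periphery_rois):
    """CT-number uniformity: largest absolute difference (in HU)
    between the central ROI mean and each peripheral ROI mean."""
    mean = lambda vals: sum(vals) / len(vals)
    c = mean(center_roi)
    return max(abs(mean(p) - c) for p in periphery_rois)

# hypothetical water-phantom ROI samples (HU), not data from the study
u = uniformity_hu([0.5, 1.0, -0.5], [[2.0, 3.0], [-1.0, 0.0]])
```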

  1. Performance, throughput, and cost of in-home training for the Army Reserve: Using asynchronous computer conferencing as an alternative to resident training

    SciTech Connect

    Hahn, H.A.; Ashworth, R.L. Jr.; Phelps, R.H.; Byers, J.C.

    1990-01-01

    Asynchronous computer conferencing (ACC) was investigated as an alternative to resident training for the Army Reserve Component (RC). Specifically, the goals were to (1) evaluate the performance and throughput of ACC as compared with traditional Resident School instruction and (2) determine the cost-effectiveness of developing and implementing ACC. Fourteen RC students took a module of the Army Engineer Officer Advanced Course (EOAC) via ACC. Course topics included Army doctrine, technical engineering subjects, leadership, and presentation skills. Resident content was adapted for presentation via ACC. The programs of instruction for ACC and the equivalent resident course were identical; only the media used for presentation were changed. Performance on tests, homework, and practical exercises; self-assessments of learning; throughput; and cost data were the measures of interest. Comparison data were collected on RC students taking the course in residence. Results indicated that there were no performance differences between the two groups. Students taking the course via ACC perceived greater learning benefit than did students taking the course in residence. Resident throughput was superior to ACC throughput, both in terms of the number of students completing the course and the time to complete it. In spite of this, however, ACC was more cost-effective than resident training.

  2. Troubleshooting Costs

    NASA Astrophysics Data System (ADS)

    Kornacki, Jeffrey L.

    Seventy-six million cases of foodborne disease occur each year in the United States alone. Medical and lost productivity costs of the most common pathogens are estimated to be $5.6-9.4 billion. Product recalls, whether from foodborne illness or spoilage, result in added costs to manufacturers in a variety of ways. These may include expenses associated with lawsuits from real or allegedly stricken individuals and lawsuits from shorted customers. Other costs include those associated with finding the source of the contamination and eliminating it: time when lines are shut down and therefore non-productive, additional non-routine testing, consultant fees, time and personnel required to overhaul the entire food safety system, lost market share to competitors, and the cost associated with redesign of the factory and redesign or acquisition of more hygienic equipment. The cost associated with an effective quality assurance plan is well worth the effort to prevent the situations described.

  3. GME: at what cost?

    PubMed

    Young, David W

    2003-11-01

    Current computing methods impede determination of the real cost of graduate medical education. However, a more accurate estimate could be obtained if policy makers allowed the application of basic cost-accounting principles, including consideration of department-level costs, unbundling of joint costs, and other factors. PMID:14626704

  4. What Constitutes a "Good" Sensitivity Analysis? Elements and Tools for a Robust Sensitivity Analysis with Reduced Computational Cost

    NASA Astrophysics Data System (ADS)

    Razavi, Saman; Gupta, Hoshin; Haghnegahdar, Amin

    2016-04-01

    Global sensitivity analysis (GSA) is a systems theoretic approach to characterizing the overall (average) sensitivity of one or more model responses across the factor space, by attributing the variability of those responses to different controlling (but uncertain) factors (e.g., model parameters, forcings, and boundary and initial conditions). GSA can be very helpful to improve the credibility and utility of Earth and Environmental System Models (EESMs), as these models are continually growing in complexity and dimensionality with continuous advances in understanding and computing power. However, conventional approaches to GSA suffer from (1) an ambiguous characterization of sensitivity, and (2) poor computational efficiency, particularly as the problem dimension grows. Here, we identify several important sensitivity-related characteristics of response surfaces that must be considered when investigating and interpreting the ''global sensitivity'' of a model response (e.g., a metric of model performance) to its parameters/factors. Accordingly, we present a new and general sensitivity and uncertainty analysis framework, Variogram Analysis of Response Surfaces (VARS), based on an analogy to 'variogram analysis', that characterizes a comprehensive spectrum of information on sensitivity. We prove, theoretically, that Morris (derivative-based) and Sobol (variance-based) methods and their extensions are special cases of VARS, and that their SA indices are contained within the VARS framework. We also present a practical strategy for the application of VARS to real-world problems, called STAR-VARS, including a new sampling strategy, called "star-based sampling". Our results across several case studies show the STAR-VARS approach to provide reliable and stable assessments of "global" sensitivity, while being at least 1-2 orders of magnitude more efficient than the benchmark Morris and Sobol approaches.
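    The Morris (derivative-based) screening that VARS generalizes can be illustrated with a crude one-at-a-time elementary-effects estimate. This is a toy sketch of the general idea, not the STAR-VARS algorithm itself:

```python
import random

def elementary_effects(f, n_params, delta=0.1, n_paths=50, seed=0):
    """Crude Morris-style screening: mean absolute one-at-a-time
    finite differences (mu*) over random base points in [0, 1-delta)."""
    rng = random.Random(seed)
    mu_star = [0.0] * n_params
    for _ in range(n_paths):
        x = [rng.random() * (1 - delta) for _ in range(n_params)]
        fx = f(x)
        for i in range(n_params):
            xp = list(x)
            xp[i] += delta          # perturb one factor at a time
            mu_star[i] += abs((f(xp) - fx) / delta)
    return [m / n_paths for m in mu_star]

# Toy model y = 4*x0 + x1 (x2 inert): x0 should dominate, x2 vanish.
effects = elementary_effects(lambda x: 4 * x[0] + x[1], 3)
```

    For this linear toy model the estimated effects recover the coefficients exactly; real response surfaces need many more trajectories and the variance measure sigma as well.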

  5. Food additives

    MedlinePlus

    Food additives are substances that become part of a food product when they are added during the processing or making of that food. "Direct" food additives are often added during processing to: Add nutrients ...

  6. Food additives

    PubMed Central

    Spencer, Michael

    1974-01-01

    Food additives are discussed from the food technology point of view. The reasons for their use are summarized: (1) to protect food from chemical and microbiological attack; (2) to even out seasonal supplies; (3) to improve their eating quality; (4) to improve their nutritional value. The various types of food additives are considered, e.g. colours, flavours, emulsifiers, bread and flour additives, preservatives, and nutritional additives. The paper concludes with consideration of those circumstances in which the use of additives is (a) justified and (b) unjustified. PMID:4467857

  7. Application of off-line image processing for optimization in chest computed radiography using a low cost system.

    PubMed

    Muhogora, Wilbroad E; Msaki, Peter; Padovani, Renato

    2015-01-01

    The objective of this study was to improve the visibility of anatomical details by applying off-line image processing in chest computed radiography (CR). Four spatial-domain-based external image processing techniques were developed using MATLAB software version 7.0.0.19920 (R14) and image processing tools. The developed techniques were applied to sample images, and their visual appearance was confirmed by two consultant radiologists to be clinically adequate. The techniques were then applied to 200 chest clinical images and randomized with another 100 images previously processed online. These 300 images were presented to three experienced radiologists for image quality assessment using standard quality criteria. The mean and ranges of the average scores for the three radiologists were characterized for each of the developed techniques and imaging systems. The Mann-Whitney U-test was used to test the difference in the visibility of details between the images processed using each of the developed techniques and the corresponding images processed using default algorithms. The results show that the visibility of anatomical features improved significantly (0.005 ≤ p ≤ 0.02) with combinations of intensity value adjustment and/or spatial linear filtering techniques for images acquired using 60 ≤ kVp ≤ 70. However, there was no improvement for images acquired using 102 ≤ kVp ≤ 107 (0.127 ≤ p ≤ 0.48). In conclusion, the use of external image processing for optimization can be effective in chest CR, but it should be implemented in consultation with radiologists. PMID:26103165
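    The Mann-Whitney U-test used in the study compares two independent samples via ranks. A self-contained sketch of the statistic itself (with average ranks for ties), not the authors' analysis code:

```python
def mann_whitney_u(a, b):
    """Two-sample Mann-Whitney U statistic (rank-sum form with
    average ranks for ties); returns U for the first sample."""
    combined = sorted((v, 0 if i < len(a) else 1)
                      for i, v in enumerate(list(a) + list(b)))
    ranks = [0.0] * len(combined)
    i = 0
    while i < len(combined):                      # assign average ranks
        j = i
        while j + 1 < len(combined) and combined[j + 1][0] == combined[i][0]:
            j += 1
        avg = (i + j) / 2.0 + 1.0                 # ranks are 1-based
        for k in range(i, j + 1):
            ranks[k] = avg
        i = j + 1
    rank_sum_a = sum(r for r, (_, grp) in zip(ranks, combined) if grp == 0)
    return rank_sum_a - len(a) * (len(a) + 1) / 2.0

# U = 0 when every value in the first sample is below every value in the second
u_stat = mann_whitney_u([1, 2, 3], [4, 5, 6])
```

    In practice one would use a library routine that also supplies the p-value; this sketch only shows how the statistic arises from the ranks.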

  8. A Low-Cost, Computer-Interfaced Drawing Pad for fMRI Studies of Dysgraphia and Dyslexia

    PubMed Central

    Reitz, Frederick; Richards, Todd; Wu, Kelvin; Boord, Peter; Askren, Mary; Lewis, Thomas; Berninger, Virginia

    2013-01-01

    We have developed a pen and writing tablet for use by subjects during fMRI scanning. The pen consists of two jacketed, multi-mode optical fibers routed to the tip of a hollowed-out ball-point pen. The pen has been further modified by addition of a plastic plate to maintain a perpendicular pen-tablet orientation. The tablet is simply a non-metallic frame holding a paper print of continuously varying color gradients. The optical fibers are routed out of the MRI bore to a light-tight box in an adjacent control room. Within the box, light from a high intensity LED is coupled into one of the fibers, while the other fiber abuts a color sensor. Light from the LED exits the pen tip, illuminating a small spot on the tablet, and the resulting reflected light is routed to the color sensor. Given a lookup table of position for each color on the tablet, the coordinates of the pen on the tablet may be displayed and digitized in real-time. While simple and inexpensive, the system achieves sufficient resolution to grade writing tasks testing dysgraphic and dyslexic phenomena. PMID:23595203
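    The position lookup described above (mapping each sensed color back to tablet coordinates) can be sketched as a nearest-color search over a calibration table; the table entries here are hypothetical, standing in for the print's continuously varying gradients:

```python
def nearest_position(rgb, lookup):
    """Map a sensed RGB triple to tablet coordinates by finding the
    closest calibrated color (squared Euclidean distance in RGB)."""
    def dist2(c1, c2):
        return sum((u - v) ** 2 for u, v in zip(c1, c2))
    best = min(lookup, key=lambda color: dist2(color, rgb))
    return lookup[best]

# hypothetical 2x2 corner calibration of a color-gradient sheet
table = {(0, 0, 0): (0, 0), (255, 0, 0): (1, 0),
         (0, 255, 0): (0, 1), (255, 255, 0): (1, 1)}
pos = nearest_position((250, 10, 5), table)
```

    A real calibration table would sample the printed gradient densely enough that nearest-color matching resolves position to the required writing resolution.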

  9. Breaking Barriers in Polymer Additive Manufacturing

    SciTech Connect

    Love, Lonnie J; Duty, Chad E; Post, Brian K; Lind, Randall F; Lloyd, Peter D; Kunc, Vlastimil; Peter, William H; Blue, Craig A

    2015-01-01

    Additive Manufacturing (AM) enables the creation of complex structures directly from a computer-aided design (CAD). There are limitations that prevent the technology from realizing its full potential. AM has been criticized for being slow and expensive with limited build size. Oak Ridge National Laboratory (ORNL) has developed a large scale AM system that improves upon each of these areas by more than an order of magnitude. The Big Area Additive Manufacturing (BAAM) system directly converts low cost pellets into a large, three-dimensional part at a rate exceeding 25 kg/h. By breaking these traditional barriers, it is possible for polymer AM to penetrate new manufacturing markets.

  10. Role of information systems in controlling costs: the electronic medical record (EMR) and the high-performance computing and communications (HPCC) efforts

    NASA Astrophysics Data System (ADS)

    Kun, Luis G.

    1994-12-01

    On October 18, 1991, the IEEE-USA produced an entity statement which endorsed the vital importance of the High Performance Computer and Communications Act of 1991 (HPCC) and called for the rapid implementation of all its elements. Efforts are now underway to develop a Computer Based Patient Record (CBPR), the National Information Infrastructure (NII) as part of the HPCC, and the so-called `Patient Card'. Multiple legislative initiatives which address these and related information technology issues are pending in Congress. Clearly, a national information system will greatly affect the way health care delivery is provided to the United States public. Timely and reliable information represents a critical element in any initiative to reform the health care system as well as to protect and improve the health of every person. Appropriately used, information technologies offer a vital means of improving the quality of patient care, increasing access to universal care and lowering overall costs within a national health care program. Health care reform legislation should reflect increased budgetary support and a legal mandate for the creation of a national health care information system by: (1) constructing a National Information Infrastructure; (2) building a Computer Based Patient Record System; (3) bringing the collective resources of our National Laboratories to bear in developing and implementing the NII and CBPR, as well as a security system with which to safeguard the privacy rights of patients and the physician-patient privilege; and (4) utilizing Government (e.g. DOD, DOE) capabilities (technology and human resources) to maximize resource utilization, create new jobs and accelerate technology transfer to address health care issues.

  11. ESTIMATING IRRIGATION COSTS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Having accurate estimates of the cost of irrigation is important when making irrigation decisions. Estimates of fixed costs are critical for investment decisions. Operating cost estimates can assist in decisions regarding additional irrigations. This fact sheet examines the costs associated with ...

  12. Food additives.

    PubMed

    Berglund, F

    1978-01-01

    The use of additives in food fulfils many purposes, as shown by the index issued by the Codex Committee on Food Additives: Acids, bases and salts; Preservatives; Antioxidants and antioxidant synergists; Anticaking agents; Colours; Emulsifiers; Thickening agents; Flour-treatment agents; Extraction solvents; Carrier solvents; Flavours (synthetic); Flavour enhancers; Non-nutritive sweeteners; Processing aids; Enzyme preparations. Many additives occur naturally in foods, but this does not exclude toxicity at higher levels. Some food additives are nutrients, or even essential nutrients, e.g. NaCl. Examples are known of food additives causing toxicity in man even when used according to regulations, e.g. cobalt in beer. In other instances, poisoning has been due to carry-over, e.g. by nitrate in cheese whey when used for artificial feed for infants. Poisonings also occur as the result of the permitted substance being added at too high levels, by accident or carelessness, e.g. nitrite in fish. Finally, there are examples of hypersensitivity to food additives, e.g. to tartrazine and other food colours. The toxicological evaluation, based on animal feeding studies, may be complicated by impurities, e.g. orthotoluene-sulfonamide in saccharin; by transformation or disappearance of the additive in food processing or storage, e.g. bisulfite in raisins; by reaction products with food constituents, e.g. formation of ethylurethane from diethyl pyrocarbonate; and by metabolic transformation products, e.g. formation in the gut of cyclohexylamine from cyclamate. Metabolic end products may differ in experimental animals and in man: guanylic acid and inosinic acid are metabolized to allantoin in the rat but to uric acid in man. The magnitude of the safety margin in man of the Acceptable Daily Intake (ADI) is not identical to the "safety factor" used when calculating the ADI. The symptoms of Chinese Restaurant Syndrome, although not hazardous, furthermore illustrate that the whole ADI
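    The ADI calculation referred to at the end of the abstract is conventionally the no-observed-adverse-effect level from animal studies divided by a safety factor (commonly 100). A minimal illustrative sketch with a hypothetical NOAEL:

```python
def adi(noael_mg_per_kg, safety_factor=100):
    """Acceptable Daily Intake: NOAEL from animal feeding studies
    divided by a safety factor (conventionally 100, i.e. 10 for
    interspecies x 10 for interindividual variation)."""
    return noael_mg_per_kg / safety_factor

# hypothetical NOAEL of 50 mg/kg bw/day -> ADI of 0.5 mg/kg bw/day
example_adi = adi(50.0)
```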

  13. Computer proposals

    NASA Astrophysics Data System (ADS)

    Richman, Barbara T.

    To expand the research community's access to supercomputers, the National Science Foundation (NSF) has begun a program to match researchers who require the capabilities of a supercomputer with those facilities that have such computer resources available. Recent studies on computer needs in scientific and engineering research underscore the need for greater access to supercomputers (Eos, July 6, 1982, p. 562), especially those categorized as “Class VI” machines. Complex computer models for research on astronomy, the oceans, and the atmosphere often require such capabilities. In addition, similar needs are emerging in the earth sciences: a Union session at the AGU Fall Meeting in San Francisco this week will focus on the research computing needs of the geosciences. A Class VI supercomputer has a memory capacity of at least 1 megaword, a speed of upwards of 100 MFLOPS (million floating point operations per second), and both scalar and vector registers in the CPU (central processing unit). Examples of Class VI machines are the CRAY-1 and the CYBER 205. The high costs of these machines, the most powerful ones available, preclude most research facilities from owning one.

  14. Additivity of Factor Effects in Reading Tasks Is Still a Challenge for Computational Models: Reply to Ziegler, Perry, and Zorzi (2009)

    ERIC Educational Resources Information Center

    Besner, Derek; O'Malley, Shannon

    2009-01-01

    J. C. Ziegler, C. Perry, and M. Zorzi (2009) have claimed that their connectionist dual process model (CDP+) can simulate the data reported by S. O'Malley and D. Besner. Most centrally, they have claimed that the model simulates additive effects of stimulus quality and word frequency on the time to read aloud when words and nonwords are randomly…

  15. 24 CFR 908.108 - Cost.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... computer hardware or software, or both, the cost of contracting for those services, or the cost of... operating budget. At the HA's option, the cost of the computer software may include service contracts...

  16. 24 CFR 908.108 - Cost.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... computer hardware or software, or both, the cost of contracting for those services, or the cost of... operating budget. At the HA's option, the cost of the computer software may include service contracts...

  17. 24 CFR 908.108 - Cost.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... computer hardware or software, or both, the cost of contracting for those services, or the cost of... operating budget. At the HA's option, the cost of the computer software may include service contracts...

  18. Cost-effectiveness of postural exercise therapy versus physiotherapy in computer screen-workers with early non-specific work-related upper limb disorders (WRULD); a randomized controlled trial

    PubMed Central

    2009-01-01

    Background Exercise therapies generate substantial costs in computer workers with non-specific work-related upper limb disorders (WRULD). Aims To study whether postural exercise therapy is cost-effective compared to regular physiotherapy in screen-workers with early complaints, from both a health care and a societal perspective. Methods Prospective randomized trial including cost-effectiveness analysis; one year follow-up. Participants: Eighty-eight screen-workers with early non-specific WRULD; six drop-outs. Interventions: A ten-week postural exercise program versus regular physiotherapy. Outcome measures: Effectiveness measures: Pain: visual analogue scale (VAS), self-perceived WRULD (yes/no). Functional outcome: Disabilities of Arm, Shoulder and Hand - Dutch Language Version (DASH-DLV). Quality of life outcome: EQ-5D. Economic measures: health care costs, including patient and family costs and productivity costs, resulting in societal costs. Cost-effectiveness measures: health care costs and societal costs related to the effectiveness measures. Outcome measures were assessed at baseline and at three, six and twelve months after baseline. Results At baseline both groups were comparable for baseline characteristics, except for scores on the Pain Catastrophizing Scale, and comparable for costs. No significant differences between the groups concerning effectiveness at one year follow-up were found. Effectiveness scores slightly improved over time. After one year 55% of participants were free of complaints. After one year the postural exercise group had higher mean total health care costs but lower productivity costs compared to the physiotherapy group. Mean societal costs after one year were therefore in favor of postural exercise therapy (-€622; 95% CI: -€2087 to +€590).
After one year, only self-perceived WRULD seemed to result in acceptable cost-effectiveness of the postural exercise strategy over physiotherapy; however, the probability of acceptable cost-effectiveness did not exceed
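    Cost-effectiveness comparisons of this kind are typically summarized by an incremental cost-effectiveness ratio (ICER). A generic sketch with hypothetical figures, not tied to this trial's data:

```python
def icer(cost_new, cost_ref, effect_new, effect_ref):
    """Incremental cost-effectiveness ratio: extra cost per extra
    unit of effect of the new strategy versus the reference."""
    return (cost_new - cost_ref) / (effect_new - effect_ref)

# hypothetical: 500 extra cost for 0.05 extra QALY, i.e. about 10,000 per QALY
ratio = icer(10500.0, 10000.0, 0.80, 0.75)
```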

  19. Magnetic fusion energy and computers

    SciTech Connect

    Killeen, J.

    1982-01-01

    The application of computers to magnetic fusion energy research is essential. In the last several years the use of computers in the numerical modeling of fusion systems has increased substantially. There are several categories of computer models used to study the physics of magnetically confined plasmas. A comparable number of types of models for engineering studies are also in use. To meet the needs of the fusion program, the National Magnetic Fusion Energy Computer Center has been established at the Lawrence Livermore National Laboratory. A large central computing facility is linked to smaller computer centers at each of the major MFE laboratories by a communication network. In addition to providing cost-effective computing services, the NMFECC environment stimulates collaboration and the sharing of computer codes among the various fusion research groups.

  20. Computer modelling integrated with micro-CT and material testing provides additional insight to evaluate bone treatments: Application to a beta-glycan derived whey protein mice model.

    PubMed

    Sreenivasan, D; Tu, P T; Dickinson, M; Watson, M; Blais, A; Das, R; Cornish, J; Fernandez, J

    2016-01-01

    The primary aim of this study was to evaluate the influence of a whey protein diet on computationally predicted mechanical strength of murine bones in both trabecular and cortical regions of the femur. There was no significant influence on mechanical strength in cortical bone observed with increasing whey protein treatment, consistent with cortical tissue mineral density (TMD) and bone volume changes observed. Trabecular bone showed a significant decline in strength with increasing whey protein treatment when nanoindentation derived Young's moduli were used in the model. When microindentation, micro-CT phantom density or normalised Young's moduli were included in the model a non-significant decline in strength was exhibited. These results for trabecular bone were consistent with both trabecular bone mineral density (BMD) and micro-CT indices obtained independently. The secondary aim of this study was to characterise the influence of different sources of Young's moduli on computational prediction. This study aimed to quantify the predicted mechanical strength in 3D from these sources and evaluate if trends and conclusions remained consistent. For cortical bone, predicted mechanical strength behaviour was consistent across all sources of Young's moduli. There was no difference in treatment trend observed when Young's moduli were normalised. In contrast, trabecular strength due to whey protein treatment significantly reduced when material properties from nanoindentation were introduced. Other material property sources were not significant but emphasised the strength trend over normalised material properties. This shows strength at the trabecular level was attributed to both changes in bone architecture and material properties. PMID:26599826

  1. New Federal Cost Accounting Regulations

    ERIC Educational Resources Information Center

    Wolff, George J.; Handzo, Joseph J.

    1973-01-01

    Discusses a new set of indirect cost accounting procedures which must be followed by school districts wishing to recover any indirect costs of administering federal grants and contracts. Also discusses the amount of indirect costs that may be recovered, computing indirect costs, classifying project costs, and restricted grants. (Author/DN)
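    Recovering indirect costs under a grant typically reduces to applying an indirect cost rate (an indirect-cost pool divided by a direct-cost base) to the grant's direct costs. An illustrative sketch with hypothetical figures, not the federal procedure itself:

```python
def indirect_cost_recovery(indirect_pool, direct_base, grant_direct_costs):
    """Recoverable indirect cost on a grant under a simple rate
    allocation: rate = pool / base, applied to the grant's direct
    costs (illustrative only, not the regulation's exact rules)."""
    rate = indirect_pool / direct_base
    return rate * grant_direct_costs

# hypothetical district figures: a 20% rate applied to $50,000 of direct costs
recovered = indirect_cost_recovery(200_000.0, 1_000_000.0, 50_000.0)
```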

  2. Phosphazene additives

    SciTech Connect

    Harrup, Mason K; Rollins, Harry W

    2013-11-26

    An additive comprising a phosphazene compound that has at least two reactive functional groups and at least one capping functional group bonded to phosphorus atoms of the phosphazene compound. One of the at least two reactive functional groups is configured to react with cellulose and the other of the at least two reactive functional groups is configured to react with a resin, such as an amine resin or a polycarboxylic acid resin. The at least one capping functional group is selected from the group consisting of a short chain ether group, an alkoxy group, or an aryloxy group. Also disclosed are an additive-resin admixture, a method of treating a wood product, and a wood product.

  3. Automated Data Handling And Instrument Control Using Low-Cost Desktop Computers And An IEEE 488 Compatible Version Of The ODETA V.

    NASA Astrophysics Data System (ADS)

    van Leunen, J. A. J.; Dreessen, J.

    1984-05-01

    The result of a measurement of the modulation transfer function is only useful as long as it is accompanied by a complete description of all relevant measuring conditions involved. For this reason it is necessary to file a full description of the relevant measuring conditions together with the results. In earlier times some of our results were rendered useless because some of the relevant measuring conditions were accidentally not written down and were forgotten. This was mainly due to the lack of consensus about which measuring conditions had to be filed together with the result of a measurement. One way to secure uniform and complete archiving of measuring conditions and results is to automate the data handling. An attendant advantage of automating the data handling is that it does away with the time-consuming correction of rough measuring results. The automation of the data handling was accomplished with rather cheap desktop computers, which were powerful enough, however, to allow us to automate the measurement as well. After automation of the data handling we started with automatic collection of rough measurement data. Step by step we extended the automation by letting the desktop computer control more and more of the measuring set-up. At present the desktop computer controls all the electrical and most of the mechanical measuring conditions. Further, it controls and reads the MTF measuring instrument. Focussing and orientation optimization can be fully automatic, semi-automatic or completely manual. MTF measuring results can be collected automatically, but they can also be typed in by hand. Due to the automation we are able to implement proper archiving of measuring results together with all necessary measuring conditions. The improved measuring efficiency made it possible to increase the number of routine measurements done in the same time period by an order of magnitude. To our surprise the measuring accuracy also improved by a factor of two. This was due to

  4. Real-space finite-difference calculation method of generalized Bloch wave functions and complex band structures with reduced computational cost.

    PubMed

    Tsukamoto, Shigeru; Hirose, Kikuji; Blügel, Stefan

    2014-07-01

    Generalized Bloch wave functions of bulk structures, which are composed of not only propagating waves but also decaying and growing evanescent waves, are known to be essential for defining the open boundary conditions in the calculations of the electronic surface states and scattering wave functions of surface and junction structures. Electronic complex band structures derived from the generalized Bloch wave functions are also essential for studying bound states of the surface and junction structures, which do not appear in conventional band structures. We present a novel calculation method to obtain the generalized Bloch wave functions of periodic bulk structures by solving a generalized eigenvalue problem, whose dimension is drastically reduced in comparison with the conventional generalized eigenvalue problem derived by Fujimoto and Hirose [Phys. Rev. B 67, 195315 (2003)]. The generalized eigenvalue problem derived in this work is mathematically equivalent to the conventional one, and thus we reduce the computational cost of solving the eigenvalue problem considerably, without any approximation or loss of rigor in the formulation. To exhibit the performance of the present method, we demonstrate practical calculations of electronic complex band structures and electron transport properties of Al and Cu nanoscale systems. Moreover, employing atom-structured electrodes and jellium-approximated ones for both the Al and Si monatomic chains, we investigate how much the electron transport properties are unphysically affected by the jellium parts. PMID:25122409
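    The generalized eigenvalue problem A v = λ B v at the heart of such methods can be illustrated in miniature for 2x2 matrices by solving det(A - λB) = 0 directly; complex roots play the role the evanescent (decaying or growing) solutions play in the full problem. A toy sketch, not the paper's reduced formulation:

```python
import cmath

def gen_eig_2x2(A, B):
    """Eigenvalues of the 2x2 generalized problem A v = lambda B v,
    obtained as the roots of det(A - lambda*B) = 0, a quadratic in
    lambda; roots may be complex (the analogue of evanescent modes)."""
    (a, b), (c, d) = A
    (e, f), (g, h) = B
    # det(A - xB) = (eh - fg) x^2 - (ah + ed - bg - fc) x + (ad - bc)
    p2 = e * h - f * g
    p1 = -(a * h + e * d - b * g - f * c)
    p0 = a * d - b * c
    disc = cmath.sqrt(p1 * p1 - 4 * p2 * p0)
    return ((-p1 + disc) / (2 * p2), (-p1 - disc) / (2 * p2))

# With B = identity this reduces to an ordinary eigenvalue problem.
roots = gen_eig_2x2([[2, 0], [0, 3]], [[1, 0], [0, 1]])
```

    Production codes solve the same equation for large sparse matrices with dedicated eigensolvers; the quadratic form above only works in the 2x2 case.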

  5. Do We Really Need Additional Contrast-Enhanced Abdominal Computed Tomography for Differential Diagnosis in Triage of Middle-Aged Subjects With Suspected Biliary Pain

    PubMed Central

    Hwang, In Kyeom; Lee, Yoon Suk; Kim, Jaihwan; Lee, Yoon Jin; Park, Ji Hoon; Hwang, Jin-Hyeok

    2015-01-01

    Abstract Enhanced computed tomography (CT) is widely used for evaluating acute biliary pain in the emergency department (ED). However, concern about radiation exposure from CT has also increased. We investigated the usefulness of pre-contrast CT for differential diagnosis in middle-aged subjects with suspected biliary pain. A total of 183 subjects, who visited the ED for suspected biliary pain from January 2011 to December 2012, were included. Retrospectively, pre-contrast phase and multiphase CT findings were reviewed, and the detection rate of findings suggesting disease requiring significant treatment on noncontrast CT (NCCT) was compared with that on multiphase CT. Approximately 70% of the subjects had a significant condition, including 1 case of gallbladder cancer and 126 (68.8%) cases requiring intervention (122 biliary stone-related diseases, 3 liver abscesses, and 1 liver hemangioma). The rate of overlooking malignancy without contrast enhancement was calculated to be 0% to 1.5%. Biliary stones and liver space-occupying lesions were found equally on NCCT and multiphase CT. The calculated probable rates of overlooking acute cholecystitis and biliary obstruction were at most 6.8% and 4.2%, respectively. The only incidental significant finding unrelated to pain was 1 case of adrenal incidentaloma, which was also observed on NCCT. NCCT might be sufficient to detect life-threatening or significant disease requiring early treatment in young adults with biliary pain. PMID:25700321

  6. Computer simulation for the growing probability of additional offspring with an advantageous reversal allele in the decoupled continuous-time mutation-selection model

    NASA Astrophysics Data System (ADS)

    Gill, Wonpyong

    2016-01-01

    This study calculated the growing probability of additional offspring with the advantageous reversal allele in an asymmetric sharply-peaked landscape using the decoupled continuous-time mutation-selection model. The growing probability was calculated for various population sizes, N, sequence lengths, L, selective advantages, s, fitness parameters, k, and measuring parameters, C. The saturated growing probability in the stochastic region was approximately the effective selective advantage, s*, when C ≫ 1/(Ns*) and s* ≪ 1. The present study suggests that the growing probability in the stochastic region in the decoupled continuous-time mutation-selection model can be described using the theoretical formula for the growing probability in the Moran two-allele model. The selective advantage ratio, which represents the ratio of the effective selective advantage to the selective advantage, does not depend on the population size, selective advantage, measuring parameter, or fitness parameter; instead, the selective advantage ratio decreases with increasing sequence length.
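    The Moran two-allele growing (fixation) probability mentioned above has a standard closed form, ρ = (1 − 1/r) / (1 − 1/r^N) with r = 1 + s. A minimal Python sketch (parameter values are illustrative, not taken from the study) comparing the closed form against a direct simulation of the birth-death process:

```python
import random

def moran_fixation_probability(N, s):
    """Closed-form fixation probability of a single advantageous mutant
    (relative fitness r = 1 + s) in the Moran two-allele model."""
    r = 1.0 + s
    return (1.0 - 1.0 / r) / (1.0 - 1.0 / r ** N)

def simulate_fixation(N, s, trials=20000, seed=1):
    """Direct stochastic simulation of the same birth-death process:
    one individual reproduces (chosen fitness-proportionally), one dies
    (chosen uniformly), until the mutant count hits 0 or N."""
    rng = random.Random(seed)
    r = 1.0 + s
    fixed = 0
    for _ in range(trials):
        i = 1  # start from a single mutant
        while 0 < i < N:
            # probability the reproducing individual is a mutant
            p_birth = r * i / (r * i + (N - i))
            if rng.random() < p_birth:
                # mutant offspring replaces a uniformly chosen individual
                if rng.random() < (N - i) / N:
                    i += 1
            else:
                if rng.random() < i / N:
                    i -= 1
        fixed += (i == N)
    return fixed / trials

print(moran_fixation_probability(N=100, s=0.05))  # ≈ 0.048
```

For large N and small s the closed form approaches s itself, which is consistent with the abstract's observation that the saturated growing probability is approximately the effective selective advantage.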

  7. Indirect Costs of Health Research--How They are Computed, What Actions are Needed. Report by the Comptroller General of the United States.

    ERIC Educational Resources Information Center

    Comptroller General of the U.S., Washington, DC.

    A review by the General Accounting Office of various aspects of indirect costs associated with federal health research grants is presented. After an introduction detailing the scope of the review and defining indirect costs and federal participation, the report focuses on the causes of the rapid increase of indirect costs. Among findings was that…

  8. Computer Recreations.

    ERIC Educational Resources Information Center

    Dewdney, A. K.

    1989-01-01

    Reviews the performance of computer programs for writing poetry and prose, including MARK V. SHANEY, MELL, POETRY GENERATOR, THUNDER THOUGHT, and ORPHEUS. Discusses the writing principles of the programs. Provides additional information on computer magnification techniques. (YP)

  9. Making maps with computers

    USGS Publications Warehouse

    Guptill, S.C.; Starr, L.E.

    1988-01-01

    Soon after their introduction in the 1950s, digital computers were used for various phases of the mapping process, especially for trigonometric calculations of survey data and for orientation of aerial photographs on map manuscripts. In addition, computer-controlled plotters were used to draw simple outline maps. The process of collecting data for the plotters was slow, and the resulting maps were not as precise as those produced by the best manual cartography. Only during the 1980s has it become technologically feasible and cost-effective to assemble and use the data required to automate the mapping process. -from Authors

  10. The Hidden Costs of Owning a Microcomputer.

    ERIC Educational Resources Information Center

    McDole, Thomas L.

    Before purchasing computer hardware, individuals must consider the costs associated with the setup and operation of a microcomputer system. Included among the initial costs of purchasing a computer are the costs of the computer, one or more disk drives, a monitor, and a printer as well as the costs of such optional peripheral devices as a plotter…

  11. Space shuttle solid rocket booster cost-per-flight analysis technique

    NASA Technical Reports Server (NTRS)

    Forney, J. A.

    1979-01-01

    A cost per flight computer model is described which considers: traffic model, component attrition, hardware useful life, turnaround time for refurbishment, manufacturing rates, learning curves on the time to perform tasks, cost improvement curves on quantity hardware buys, inflation, spares philosophy, long lead, hardware funding requirements, and other logistics and scheduling constraints. Additional uses of the model include assessing the cost per flight impact of changing major space shuttle program parameters and searching for opportunities to make cost effective management decisions.
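    The learning-curve and cost-improvement-curve ingredients listed above are conventionally modeled with Wright's law, under which each doubling of cumulative quantity multiplies unit cost (or task time) by a fixed rate. A minimal sketch, assuming a 90% curve for illustration (the model's actual curves and parameters are not given here):

```python
import math

def unit_cost(first_unit_cost, n, learning_rate=0.9):
    """Wright's law: cost of the n-th unit on a learning curve where each
    doubling of cumulative output multiplies unit cost by learning_rate
    (e.g. 0.9 for a 90% curve)."""
    b = math.log(learning_rate, 2)  # slope exponent, negative for rates < 1
    return first_unit_cost * n ** b

def lot_cost(first_unit_cost, units, learning_rate=0.9):
    """Total cost of a production lot of the given size."""
    return sum(unit_cost(first_unit_cost, n, learning_rate)
               for n in range(1, units + 1))

# On a 90% curve, unit 2 costs 90% of unit 1, unit 4 costs 81%, and so on.
print(unit_cost(1000.0, 2), unit_cost(1000.0, 4))
```

The same exponent form serves for "learning curves on the time to perform tasks" by substituting task time for cost.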

  12. Systems engineering and integration: Cost estimation and benefits analysis

    NASA Technical Reports Server (NTRS)

    Dean, ED; Fridge, Ernie; Hamaker, Joe

    1990-01-01

    Space Transportation Avionics hardware and software cost has traditionally been estimated in Phase A and B using cost techniques which predict cost as a function of various cost predictive variables such as weight, lines of code, functions to be performed, quantities of test hardware, quantities of flight hardware, design and development heritage, complexity, etc. The output of such analyses has been life cycle costs, economic benefits and related data. The major objectives of Cost Estimation and Benefits analysis are twofold: (1) to play a role in the evaluation of potential new space transportation avionics technologies, and (2) to benefit from emerging technological innovations. Both aspects of cost estimation and technology are discussed here. The role of cost analysis in the evaluation of potential technologies should be one of offering additional quantitative and qualitative information to aid decision-making. The cost analysis process needs to be fully integrated into the design process in such a way that cost trades, optimizations and sensitivities are understood. Current hardware cost models tend to primarily use weights, functional specifications, quantities, design heritage and complexity as metrics to predict cost. Software models mostly use functionality, volume of code, heritage and complexity as cost descriptive variables. Basic research needs to be initiated to develop metrics more responsive to the trades which are required for future launch vehicle avionics systems. These would include cost estimating capabilities that are sensitive to technological innovations such as improved materials and fabrication processes, computer aided design and manufacturing, self checkout and many others. In addition to basic cost estimating improvements, the process must be sensitive to the fact that no cost estimate can be quoted without also quoting a confidence associated with the estimate. In order to achieve this, better cost risk evaluation techniques are

  13. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING: APPLICATION OF COMPUTATIONAL BIOPHYSICAL TRANSPORT, COMPUTATIONAL CHEMISTRY, AND COMPUTATIONAL BIOLOGY

    EPA Science Inventory

    Computational toxicology (CompTox) leverages the significant gains in computing power and computational techniques (e.g., numerical approaches, structure-activity relationships, bioinformatics) realized over the last few years, thereby reducing costs and increasing efficiency i...

  14. Managing Information On Costs

    NASA Technical Reports Server (NTRS)

    Taulbee, Zoe A.

    1990-01-01

    Cost Management Model, CMM, software tool for planning, tracking, and reporting costs and information related to costs. Capable of estimating costs, comparing estimated to actual costs, performing "what-if" analyses on estimates of costs, and providing mechanism to maintain data on costs in format oriented to management. Number of supportive cost methods built in: escalation rates, production-learning curves, activity/event schedules, unit production schedules, set of spread distributions, tables of rates and factors defined by user, and full arithmetic capability. Import/export capability possible with 20/20 Spreadsheet available on Data General equipment. Program requires AOS/VS operating system available on Data General MV series computers. Written mainly in FORTRAN 77 but uses SGU (Screen Generation Utility).

  15. Reducing Communication in Algebraic Multigrid Using Additive Variants

    DOE PAGESBeta

    Vassilevski, Panayot S.; Yang, Ulrike Meier

    2014-02-12

    Algebraic multigrid (AMG) has proven to be an effective scalable solver on many high-performance computers. However, its increasing communication complexity on coarser levels has been shown to seriously impact its performance on computers with high communication cost. Additive AMG variants provide increased parallelism and fewer messages per cycle, but generally exhibit slower convergence. Here we present various new additive variants with convergence rates that are significantly improved compared to the classical additive algebraic multigrid method, and investigate their potential for decreased communication and improved communication-computation overlap, features that are essential for good performance on future exascale architectures.

  16. The UCLA MEDLARS computer system.

    PubMed

    Garvis, F J

    1966-01-01

    Under a subcontract with UCLA the Planning Research Corporation has changed the MEDLARS system to make it possible to use the IBM 7094/7040 direct-couple computer instead of the Honeywell 800 for demand searches. The major tasks were the rewriting of the programs in COBOL and copying of the stored information on the narrower tapes that IBM computers require. (In the future NLM will copy the tapes for IBM computer users.) The differences in the software required by the two computers are noted. Major and costly revisions would be needed to adapt the large MEDLARS system to the smaller IBM 1401 and 1410 computers. In general, MEDLARS is transferable to other computers of the IBM 7000 class, the new IBM 360, and those of like size, such as the CDC 1604 or UNIVAC 1108, although additional changes are necessary. Potential future improvements are suggested. PMID:5901355

  17. Scheduling and Estimating the Cost of Crew Time

    NASA Technical Reports Server (NTRS)

    Jones, Harry; Levri, Julie A.; Vaccari, David A.; Luna, Bernadette (Technical Monitor)

    2000-01-01

    In a previous paper, Theory and Application of the Equivalent System Mass Metric, Julie Levri, David Vaccari, and Alan Drysdale developed a method for computing the Equivalent System Mass (ESM) of crew time. ESM is an analog of cost. The suggested approach has been applied but seems to impose too high a cost for small additional requirements for crew time. The proposed method is based on the minimum average cost of crew time. In this work, the scheduling of crew time is examined in more detail, using suggested crew time allocations and daily work schedules. Crew tasks are typically assigned using priorities, which can also be used to construct a crew time demand curve mapping the value or cost per hour versus the total number of hours worked. The cost of additional crew time can be estimated by considering the intersection and shapes of the demand and supply curves. If we assume a mathematical form for the demand curve, a revised method can be developed for computing the cost or ESM of crew time. This method indicates a low cost per hour for small additional requirements for crew time and an increasing cost per hour for larger requirements.
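    As a toy illustration of that idea (the functional form and every parameter below are assumptions for illustration, not the paper's revised method), an exponential marginal-value curve for displaced tasks yields a total cost that grows faster than linearly in the hours requested, so the average cost per hour rises with the size of the requirement:

```python
import math

def crew_time_cost(extra_hours, v0=1.0, k=0.05):
    """Cost of an additional crew-time requirement when the value per hour
    of the marginal displaced task rises exponentially with how deep into
    the priority-ordered schedule the requirement cuts.

    Integrating an assumed marginal-value curve v(h) = v0 * exp(k * h)
    over h in [0, extra_hours] gives (v0 / k) * (exp(k * extra) - 1)."""
    return (v0 / k) * (math.exp(k * extra_hours) - 1.0)

# Small requirements cost about v0 per hour; large ones cost more per hour.
print(crew_time_cost(1.0) / 1.0, crew_time_cost(20.0) / 20.0)
```

Any increasing demand curve reproduces the paper's qualitative conclusion; the exponential is chosen only because it integrates in closed form.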

  18. Results of an Experimental Program to Provide Low Cost Computer Searches of the NASA Information File to University Graduate Students in the Southeast. Final Report.

    ERIC Educational Resources Information Center

    Smetana, Frederick O.; Phillips, Dennis M.

    In an effort to increase dissemination of scientific and technological information, a program was undertaken whereby graduate students in science and engineering could request a computer-produced bibliography and/or abstracts of documents identified by the computer. The principal resource was the National Aeronautics and Space Administration…

  19. FeO2/MgO(1 0 0) supported cluster: Computational pursual for a low-cost and low-temperature CO nanocatalyst

    NASA Astrophysics Data System (ADS)

    Zamora, A. Y.; Reveles, J. U.; Mejia-Olvera, R.; Baruah, T.; Zope, R. R.

    2014-09-01

    CO oxidation of only 0.23 eV. Regarding the CO catalytic activity of iron oxide species at low temperatures, it has been experimentally observed that thin oxide films on metals may indeed exhibit greatly enhanced catalytic activity compared to the underlying metal substrate under the same conditions [24]. In addition to the above studies, low-temperature CO oxidation over Ag-supported ultrathin MgO films was recently reported [17]. In this case, the activation barrier (0.7 eV) was lower than the corresponding barrier of CO oxidation on Pt(1 1 1) of 0.9 eV. The gas-phase reaction (½O2 + CO → CO2) was calculated to present an overall exothermicity of 3.2 eV. Importantly, this study showed the possibility of generating a catalyst in which the CO adsorption energy of only 0.4 eV is low enough to prevent CO poisoning, therefore enabling a low-temperature CO oxidation route and addressing the cold-start problem [25]. Despite the above-mentioned studies, the development of active and stable catalysts, without noble metals, for low-temperature CO oxidation under an ambient atmosphere remains a significant challenge. Earlier reports, as mentioned above, indicate that Fe2O3 is the most active iron oxide surface toward CO oxidation at high temperatures (∼300 °C) [8]. Furthermore, a number of theoretical and experimental cluster studies have also shown that selected iron oxide compositions and charge states are the most reactive toward CO oxidation, i.e., FeO2, Fe2O3, FeO2−, Fe2O3−, FeO+, FeO2+, Fe2O+, Fe2O2+, and Fe2O3+ [26,27]. The aim of the proposed work is to carry out a detailed investigation that will provide information about the electronic, geometrical, and catalytic properties of the iron oxide FeO2 cluster adsorbed on defect-free MgO(1 0 0) surfaces in the quest for a low-cost and low-temperature CO nanocatalyst. Note that thin oxide films have been found to be more active at low temperature [24] than the iron oxide surfaces [14]. Our objective is to show

  20. User manual for GEOCITY: a computer model for cost analysis of geothermal district-heating-and-cooling systems. Volume I. Main text

    SciTech Connect

    Huber, H.D.; Fassbender, L.L.; Bloomster, C.H.

    1982-09-01

    The purpose of this model is to calculate the costs of residential space heating, space cooling, and sanitary water heating or process heating (cooling) using geothermal energy from a hydrothermal reservoir. The model can calculate geothermal heating and cooling costs for residential developments, a multi-district city, or a point demand such as an industrial factory or commercial building. GEOCITY simulates the complete geothermal heating and cooling system, which consists of two principal parts: the reservoir and fluid transmission system and the distribution system. The reservoir and fluid transmission submodel calculates the life-cycle cost of thermal energy supplied to the distribution system by simulating the technical design and cash flows for the exploration, development, and operation of the reservoir and fluid transmission system. The distribution system submodel calculates the life-cycle cost of heat (chill) delivered by the distribution system to the end-users by simulating the technical design and cash flows for the construction and operation of the distribution system. Geothermal space heating is assumed to be provided by circulating hot water through radiators, convectors, fan-coil units, or other in-house heating systems. Geothermal process heating is provided by directly using the hot water or by circulating it through a process heat exchanger. Geothermal space or process cooling is simulated by circulating hot water through lithium bromide/water absorption chillers located at each building. Retrofit costs for both heating and cooling applications can be input by the user. The life-cycle cost of thermal energy from the reservoir and fluid transmission system to the distribution system and the life-cycle cost of heat (chill) to the end-users are calculated using discounted cash flow analysis.
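    The discounted cash flow calculation underlying both submodels reduces to a levelized unit cost: discounted lifetime costs divided by discounted lifetime output. A minimal sketch (function name and input figures are illustrative, not GEOCITY's actual interface):

```python
def levelized_cost(annual_costs, annual_output, discount_rate):
    """Life-cycle (levelized) unit cost via discounted cash flow.

    annual_costs  -- cost in each year t = 0..n-1 (capital + O&M)
    annual_output -- delivered thermal energy in each year, e.g. MWh
    Returns cost per unit of output."""
    pv_costs = sum(c / (1.0 + discount_rate) ** t
                   for t, c in enumerate(annual_costs))
    pv_output = sum(q / (1.0 + discount_rate) ** t
                    for t, q in enumerate(annual_output))
    return pv_costs / pv_output

# 30-year system: heavy capital cost up front, then steady O&M and output.
costs = [5_000_000.0] + [150_000.0] * 30
output = [0.0] + [20_000.0] * 30   # MWh/year, none during construction
print(levelized_cost(costs, output, 0.07))
```

Because the capital cost is front-loaded while the output is deferred, a higher discount rate raises the levelized cost, which is why the choice of discount rate matters so much in models like this one.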

  1. Theoretical effect of modifications to the upper surface of two NACA airfoils using smooth polynomial additional thickness distributions which emphasize leading edge profile and which vary quadratically at the trailing edge. [using flow equations and a CDC 7600 computer

    NASA Technical Reports Server (NTRS)

    Merz, A. W.; Hague, D. S.

    1975-01-01

    An investigation was conducted on a CDC 7600 digital computer to determine the effects of additional thickness distributions to the upper surface of the NACA 64-206 and 64 sub 1 - 212 airfoils. The additional thickness distribution had the form of a continuous mathematical function which disappears at both the leading edge and the trailing edge. The function behaves as a polynomial of order epsilon sub 1 at the leading edge, and a polynomial of order epsilon sub 2 at the trailing edge. Epsilon sub 2 is a constant and epsilon sub 1 is varied over a range of practical interest. The magnitude of the additional thickness, y, is a second input parameter, and the effect of varying epsilon sub 1 and y on the aerodynamic performance of the airfoil was investigated. Results were obtained at a Mach number of 0.2 with an angle-of-attack of 6 degrees on the basic airfoils, and all calculations employ the full potential flow equations for two dimensional flow. The relaxation method of Jameson was employed for solution of the potential flow equations.
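    One simple family with the stated properties is Δy(x) = y · x^ε₁ · (1 − x)^ε₂ on the unit chord: it vanishes at both edges, behaves as a polynomial of order ε₁ at the leading edge, and varies quadratically at the trailing edge when ε₂ = 2. This is an illustrative form only, not necessarily the exact function used in the study:

```python
def additional_thickness(x, y_max, eps1, eps2=2.0):
    """Illustrative additional-thickness distribution on the unit chord:
    vanishes at x = 0 (leading edge) and x = 1 (trailing edge), behaves
    like x**eps1 near the leading edge and like (1 - x)**eps2 near the
    trailing edge (eps2 = 2 matches the quadratic trailing-edge behavior).
    Normalized so the maximum added thickness equals y_max."""
    x_peak = eps1 / (eps1 + eps2)          # location of the maximum
    peak = x_peak ** eps1 * (1.0 - x_peak) ** eps2
    return y_max * x ** eps1 * (1.0 - x) ** eps2 / peak
```

Sweeping eps1 at fixed y_max mirrors the study's parameter variation: smaller eps1 concentrates the added thickness toward the leading edge.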

  2. Automated Water Analyser Computer Supported System (AWACSS) Part II: Intelligent, remote-controlled, cost-effective, on-line, water-monitoring measurement system.

    PubMed

    Tschmelak, Jens; Proll, Guenther; Riedt, Johannes; Kaiser, Joachim; Kraemmer, Peter; Bárzaga, Luis; Wilkinson, James S; Hua, Ping; Hole, J Patrick; Nudd, Richard; Jackson, Michael; Abuknesha, Ram; Barceló, Damià; Rodriguez-Mozaz, Sara; de Alda, Maria J López; Sacher, Frank; Stien, Jan; Slobodník, Jaroslav; Oswald, Peter; Kozmenko, Helena; Korenková, Eva; Tóthová, Lívia; Krascsenits, Zoltan; Gauglitz, Guenter

    2005-02-15

    A novel analytical system AWACSS (Automated Water Analyser Computer Supported System) based on immunochemical technology has been evaluated that can measure several organic pollutants at low nanogram per litre level in a single few-minutes analysis without any prior sample pre-concentration or pre-treatment steps. Having in mind actual needs of water-sector managers related to the implementation of the Drinking Water Directive (DWD) [98/83/EC, 1998. Council Directive (98/83/EC) of 3 November 1998 relating to the quality of water intended for human consumption. Off. J. Eur. Commun. L330, 32-54] and Water Framework Directive (WFD) [2000/60/EC, 2000. Directive 2000/60/EC of the European Parliament and of the Council of 23 October 2000 establishing a framework for Community action in the field of water policy. Off. J. Eur. Commun. L327, 1-72], drinking, ground, surface, and waste waters were the major media used for the evaluation of the system performance. The first part of this article gave the reader an overview of the aims and scope of the AWACSS project as well as details about the basic technology, immunoassays, software, and networking developed and utilised within the research project. The second part reports on the system performance, first real sample measurements, and an international collaborative trial (inter-laboratory tests) to compare the biosensor with conventional analytical methods. The system's capability for analysing a wide range of environmental organic micro-pollutants, such as modern pesticides, endocrine disrupting compounds and pharmaceuticals in surface, ground, drinking and waste water is shown. In addition, a protocol using reconstitution of extracts of solid samples, developed and applied for analysis of river sediments and food samples, is presented. Finally, the overall performance of the AWACSS system in comparison to the conventional analytical techniques, which included liquid and gas chromatographic systems with diode-array UV and mass

  3. Coping with distributed computing

    SciTech Connect

    Cormell, L.

    1992-09-01

    The rapid increase in the availability of high performance, cost-effective RISC/UNIX workstations has been both a blessing and a curse. The blessing of having extremely powerful computing engines available on the desk top is well-known to many users. The user has tremendous freedom, flexibility, and control of his environment. That freedom can, however, become the curse of distributed computing. The user must become a system manager to some extent, he must worry about backups, maintenance, upgrades, etc. Traditionally these activities have been the responsibility of a central computing group. The central computing group, however, may find that it can no longer provide all of the traditional services. With the plethora of workstations now found on so many desktops throughout the entire campus or lab, the central computing group may be swamped by support requests. This talk will address several of these computer support and management issues by providing some examples of the approaches taken at various HEP institutions. In addition, a brief review of commercial directions or products for distributed computing and management will be given.

  4. Overview of Computer Hardware.

    ERIC Educational Resources Information Center

    Tidball, Charles S.

    1980-01-01

    Reviews development in electronics technology of digital computers, considering the binary number representation, miniaturization of electronic components, cost and space requirements of computers, ways in which computers are used, and types of computers appropriate for teaching computer literacy and demonstrating physiological simulation. (CS)

  5. Magnetic-fusion energy and computers

    SciTech Connect

    Killeen, J.

    1982-01-01

    The application of computers to magnetic fusion energy research is essential. In the last several years the use of computers in the numerical modeling of fusion systems has increased substantially. There are several categories of computer models used to study the physics of magnetically confined plasmas. A comparable number of types of models for engineering studies are also in use. To meet the needs of the fusion program, the National Magnetic Fusion Energy Computer Center has been established at the Lawrence Livermore National Laboratory. A large central computing facility is linked to smaller computer centers at each of the major MFE laboratories by a communication network. In addition to providing cost effective computing services, the NMFECC environment stimulates collaboration and the sharing of computer codes among the various fusion research groups.

  6. Additive Manufacturing Infrared Inspection

    NASA Technical Reports Server (NTRS)

    Gaddy, Darrell

    2014-01-01

    Additive manufacturing is a rapid prototyping technology that allows parts to be built in a series of thin layers from plastic, ceramics, and metallics. Metallic additive manufacturing is an emerging form of rapid prototyping that allows complex structures to be built using various metallic powders. Significant time and cost savings have also been observed using metallic additive manufacturing compared with traditional techniques. Development of the metallic additive manufacturing technology has advanced significantly over the last decade, although many of the techniques to inspect parts made from these processes have not advanced significantly or have limitations. Several external geometry inspection techniques exist, such as Coordinate Measurement Machines (CMM), Laser Scanners, Structured Light Scanning Systems, or even traditional calipers and gages. All of the aforementioned techniques are limited to external geometry and contours or must use a contact probe to inspect limited internal dimensions. This presentation will document the development of a process for a real-time dimensional inspection technique and digital quality record of the additive manufacturing process using infrared camera imaging and processing techniques.

  7. User manual for GEOCITY: A computer model for cost analysis of geothermal district-heating-and-cooling systems. Volume 2: Appendices

    NASA Astrophysics Data System (ADS)

    Huber, H. D.; Fassbender, L. L.; Bloomster, C. H.

    1982-09-01

    A model to calculate the costs of residential space heating, space cooling, and sanitary water heating or process heating (cooling) using geothermal energy from a hydrothermal reservoir is discussed. The model can calculate geothermal heating and cooling costs for residential developments, a multi-district city, or a point demand such as an industrial factory or commercial building. All the appendices, including cost equations and models for the reservoir and fluid transmission system and the distribution system, descriptions of predefined residential district types for the distribution system, key equations for the cooling degree hour methodology, and a listing of the sample case output are included. The indices for the input parameters and subroutines defined in the user manual are given.

  8. User manual for GEOCITY: A computer model for cost analysis of geothermal district-heating-and-cooling systems. Volume 1: Main text

    NASA Astrophysics Data System (ADS)

    Huber, H. D.; Fassbender, L. L.; Bloomster, C. H.

    1982-09-01

    The cost of residential space heating, space cooling, and sanitary water heating or process heating (cooling) using geothermal energy from a hydrothermal reservoir was calculated. GEOCITY simulates the complete geothermal heating and cooling system, which consists of two principal parts: the reservoir and fluid transmission system and the distribution system. Geothermal space heating is provided by circulating hot water through radiators, convectors, and fan-coil units. Geothermal process heating is provided by directly using the hot water or by circulating it through a process heat exchanger. The life-cycle cost of thermal energy from the reservoir and fluid transmission system to the distribution system and the life-cycle cost of heat (chill) to the end users are calculated by discounted cash flow analysis.

  9. Computation Directorate 2008 Annual Report

    SciTech Connect

    Crawford, D L

    2009-03-25

    Whether a computer is simulating the aging and performance of a nuclear weapon, the folding of a protein, or the probability of rainfall over a particular mountain range, the necessary calculations can be enormous. Our computers help researchers answer these and other complex problems, and each new generation of system hardware and software widens the realm of possibilities. Building on Livermore's historical excellence and leadership in high-performance computing, Computation added more than 331 trillion floating-point operations per second (teraFLOPS) of power to LLNL's computer room floors in 2008. In addition, Livermore's next big supercomputer, Sequoia, advanced ever closer to its 2011-2012 delivery date, as architecture plans and the procurement contract were finalized. Hyperion, an advanced technology cluster test bed that teams Livermore with 10 industry leaders, made a big splash when it was announced during Michael Dell's keynote speech at the 2008 Supercomputing Conference. The Wall Street Journal touted Hyperion as a 'bright spot amid turmoil' in the computer industry. Computation continues to measure and improve the costs of operating LLNL's high-performance computing systems by moving hardware support in-house, by measuring causes of outages to apply resources asymmetrically, and by automating most of the account and access authorization and management processes. These improvements enable more dollars to go toward fielding the best supercomputers for science, while operating them at less cost and greater responsiveness to the customers.

  10. Tackifier for addition polyimides

    NASA Technical Reports Server (NTRS)

    Butler, J. M.; St.clair, T. L.

    1980-01-01

    A modification to the addition polyimide, LaRC-160, was prepared to improve tack and drape and increase prepreg out-time. The essentially solventless, high-viscosity laminating resin is synthesized from low-cost liquid monomers. The modified version takes advantage of a reactive, liquid plasticizer which is used in place of solvent and helps solve a major problem of maintaining good prepreg tack and drape, or the ability of the prepreg to adhere to adjacent plies and conform to a desired shape during the lay-up process. This alternate solventless approach allows both longer life of the polymer prepreg and the processing of low-void laminates. This approach appears to be applicable to all addition polyimide systems.

  12. Educational Costs.

    ERIC Educational Resources Information Center

    Arnold, Robert

    Problems in educational cost accounting and a new cost accounting approach are described in this paper. The limitations of the individualized cost (student units) approach and the comparative cost approach (in the form of fund-function-object) are illustrated. A new strategy, an activity-based system of accounting, is advocated. Borrowed from…

  13. Benefits and Costs of Ultraviolet Fluorescent Lighting

    PubMed Central

    Lestina, Diane C.; Miller, Ted R.; Knoblauch, Richard; Nitzburg, Marcia

    1999-01-01

    Objective To demonstrate the improvements in detection and recognition distances using fluorescent roadway delineation and auxiliary ultraviolet (UVA) headlights, and to determine the reduction in crashes needed to recover the increased costs of the UVA/fluorescent technology. Methods Field study comparisons with and without UVA headlights. Crash types potentially reduced by UVA/fluorescent technology were estimated using annual crash and injury incidence data from the General Estimates System (1995–1996) and the 1996 Fatality Analysis Reporting System. Crash costs were computed based on body region and threat-to-life injury severity. Results Detection and recognition distances improved significantly for pedestrian scenarios, with gains ranging from 34% to 117%. A 19% reduction in nighttime motor vehicle crashes involving pedestrians or pedal-cycles would pay for the additional UVA headlight costs. Alternatively, a 5.5% reduction in all relevant nighttime crashes would pay for the additional costs of UVA headlights and fluorescent highway paint combined. Conclusions If the increased detection and recognition distances resulting from using UVA/fluorescent technology as shown in this field study reduce relevant crashes by even small percentages, the benefit-cost ratios will still be greater than 2; thus, the UVA/fluorescent technology is very cost-effective and a definite priority for crash reductions.
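
    The break-even reasoning in this abstract (a given percentage crash reduction paying for the added technology cost) reduces to a simple benefit-cost calculation. The sketch below illustrates that arithmetic; the dollar figures are illustrative placeholders, not the study's data.

```python
def benefit_cost_ratio(annual_crash_cost, reduction_rate, annual_tech_cost):
    """Ratio of crash-cost savings to the added annual cost of the technology."""
    savings = annual_crash_cost * reduction_rate
    return savings / annual_tech_cost

def break_even_reduction(annual_crash_cost, annual_tech_cost):
    """Crash-reduction fraction at which savings exactly cover the added cost."""
    return annual_tech_cost / annual_crash_cost

# Illustrative numbers only (not from the study):
crash_cost = 10_000_000  # annual cost of relevant nighttime crashes
tech_cost = 550_000      # annual added cost of UVA headlights + fluorescent paint
print(break_even_reduction(crash_cost, tech_cost))       # 0.055 -> a 5.5% reduction breaks even
print(benefit_cost_ratio(crash_cost, 0.11, tech_cost))   # 2.0 at twice the break-even reduction
```

    A technology is conventionally judged cost-effective when this ratio exceeds 1; the abstract's conclusion corresponds to ratios above 2.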

  14. Advanced Crew Personal Support Computer (CPSC) task

    NASA Technical Reports Server (NTRS)

    Muratore, Debra

    1991-01-01

    The topics are presented in view graph form and include: background; objectives of task; benefits to the Space Station Freedom (SSF) Program; technical approach; baseline integration; and growth and evolution options. The objective is to: (1) introduce new computer technology into the SSF Program; (2) augment core computer capabilities to meet additional mission requirements; (3) minimize risk in upgrading technology; and (4) provide a low cost way to enhance crew and ground operations support.

  15. Cost Validation Using PRICE H

    NASA Technical Reports Server (NTRS)

    Jack, John; Kwan, Eric; Wood, Milana

    2011-01-01

    PRICE H was introduced into the JPL cost estimation tool set circa 2003. It became more widely available at JPL when IPAO funded the NASA-wide site license for all NASA centers. PRICE H was mainly used as one of the cost tools to validate proposal grassroots cost estimates. Program offices at JPL view PRICE H as an additional crosscheck to Team X (JPL Concurrent Engineering Design Center) estimates. PRICE H became widely accepted circa 2007 at JPL when the program offices moved away from grassroots cost estimation for Step 1 proposals. PRICE H is now one of the key cost tools used for cost validation, cost trades, and independent cost estimates.

  16. Outline of cost-benefit analysis and a case study

    NASA Technical Reports Server (NTRS)

    Kellizy, A.

    1978-01-01

    The methodology of cost-benefit analysis is reviewed and a case study involving solar cell technology is presented. Emphasis is placed on simplifying the technique in order to permit a technical person not trained in economics to undertake a cost-benefit study comparing alternative approaches to a given problem. The role of economic analysis in management decision making is discussed. In simplifying the methodology it was necessary to restrict the scope and applicability of this report. Additional considerations and constraints are outlined. Examples are worked out to demonstrate the principles. A computer program which performs the computational aspects appears in the appendix.
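
    The simplified cost-benefit methodology described here, comparing alternative approaches for a decision-maker not trained in economics, typically reduces to discounting each alternative's cost and benefit streams to present value. The generic sketch below shows that computation; it is not the appendix program itself, and the alternatives compared are hypothetical.

```python
def npv(cash_flows, rate):
    """Present value of a stream of annual cash flows at a given discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def benefit_cost(benefits, costs, rate):
    """Net present value and benefit-cost ratio for one alternative."""
    pv_b, pv_c = npv(benefits, rate), npv(costs, rate)
    return pv_b - pv_c, pv_b / pv_c

# Compare two hypothetical alternatives over 3 years at a 10% discount rate:
# alternative A pays its full cost up front, B spreads cost but earns more later.
alt_a = benefit_cost([0, 600, 600], [1000, 0, 0], 0.10)
alt_b = benefit_cost([0, 500, 700], [900, 50, 0], 0.10)
best = max((alt_a, "A"), (alt_b, "B"))  # rank by (NPV, ratio)
```

    The choice of discount rate drives such comparisons, which is why a sensitivity sweep over `rate` is standard practice.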

  17. ENERGY COSTS OF IAQ CONTROL THROUGH INCREASED VENTILATION IN A SMALL OFFICE IN A WARM, HUMID CLIMATE: PARAMETRIC ANALYSIS USING THE DOE-2 COMPUTER MODEL

    EPA Science Inventory

    The report gives results of a series of computer runs using the DOE-2.1E building energy model, simulating a small office in a hot, humid climate (Miami). These simulations assessed the energy and relative humidity (RH) penalties when the outdoor air (OA) ventilation rate is inc...

  18. What does an MRI scan cost?

    PubMed

    Young, David W

    2015-11-01

    Historically, hospital departments have computed the costs of individual tests or procedures using the ratio of cost to charges (RCC) method, which can produce inaccurate results. To determine a more accurate cost of a test or procedure, the activity-based costing (ABC) method must be used. Accurate cost calculations will ensure reliable information about the profitability of a hospital's DRGs. PMID:26685437
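
    The distinction drawn in this abstract can be made concrete: RCC scales a department-wide cost-to-charge ratio by each procedure's charge, while ABC sums the cost of each activity the procedure actually consumes. The schematic comparison below uses hypothetical figures, not actual MRI cost data.

```python
def rcc_cost(procedure_charge, dept_total_cost, dept_total_charges):
    """Ratio-of-cost-to-charges: assumes cost is proportional to charge."""
    return procedure_charge * (dept_total_cost / dept_total_charges)

def abc_cost(activities, rates):
    """Activity-based costing: sum resource use times per-activity cost rate."""
    return sum(qty * rates[name] for name, qty in activities.items())

# Hypothetical MRI example
rcc = rcc_cost(procedure_charge=1200, dept_total_cost=4_000_000,
               dept_total_charges=10_000_000)
rates = {"scanner_minute": 8.0, "tech_minute": 1.5, "contrast_dose": 60.0}
abc = abc_cost({"scanner_minute": 30, "tech_minute": 45, "contrast_dose": 1}, rates)
# rcc = 480.0, while abc = 240 + 67.5 + 60 = 367.5: RCC overstates this scan's cost
```

    The gap between the two figures is exactly the inaccuracy the article attributes to the RCC method.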

  19. A computer simulation model of the cost-effectiveness of routine Staphylococcus aureus screening and decolonization among lung and heart-lung transplant recipients.

    PubMed

    Clancy, C J; Bartsch, S M; Nguyen, M H; Stuckey, D R; Shields, R K; Lee, B Y

    2014-06-01

    Our objective was to model the cost-effectiveness and economic value of routine peri-operative Staphylococcus aureus screening and decolonization of lung and heart-lung transplant recipients from hospital and third-party payer perspectives. We used clinical data from 596 lung and heart-lung transplant recipients to develop a model in TreeAge Pro 2009 (Williamsport, MA, USA). Sensitivity analyses varied S. aureus colonization rate (5-15 %), probability of infection if colonized (10-30 %), and decolonization efficacy (25-90 %). Data were collected from the Cardiothoracic Transplant Program at the University of Pittsburgh Medical Center. Consecutive lung and heart-lung transplant recipients from January 2006 to December 2010 were enrolled retrospectively. Baseline rates of S. aureus colonization, infection and decolonization efficacy were 9.6 %, 36.7 %, and 31.9 %, respectively. Screening and decolonization was economically dominant for all scenarios tested, providing more cost savings and health benefits than no screening. Savings per case averted (2012 $US) ranged from $73,567 to $133,157 (hospital perspective) and $10,748 to $16,723 (third party payer perspective), varying with the probability of colonization, infection, and decolonization efficacy. Using our clinical data, screening and decolonization led to cost savings per case averted of $240,602 (hospital perspective) and averted 6.7 S. aureus infections (4.3 MRSA and 2.4 MSSA); 89 patients needed to be screened to prevent one S. aureus infection. Our data support routine S. aureus screening and decolonization of lung and heart-lung transplant patients. The economic value of screening and decolonization was greater than in previous models of other surgical populations. PMID:24500598
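
    The decision-tree logic behind a screening model of this kind compares expected per-patient cost with and without the intervention. The sketch below is a minimal version of that comparison; the cost figures are stand-ins (only the probabilities echo the baseline rates quoted above), and it is not the TreeAge model itself.

```python
def expected_cost(p_colonized, p_infection_if_colonized, decol_efficacy,
                  screen_cost, decol_cost, infection_cost, screened=True):
    """Expected per-patient cost under screen-and-decolonize vs. no screening."""
    if not screened:
        return p_colonized * p_infection_if_colonized * infection_cost
    # Decolonization reduces the infection probability among the colonized.
    p_infection = p_colonized * p_infection_if_colonized * (1 - decol_efficacy)
    return screen_cost + p_colonized * decol_cost + p_infection * infection_cost

# Baseline rates from the abstract (9.6% colonized, 36.7% infected if colonized,
# 31.9% decolonization efficacy); dollar values are hypothetical.
no_screen = expected_cost(0.096, 0.367, 0.319, 25, 200, 100_000, screened=False)
screen = expected_cost(0.096, 0.367, 0.319, 25, 200, 100_000, screened=True)
# Screening "economically dominates" whenever screen < no_screen.
```

    Sensitivity analysis, as in the study, means re-running this comparison across ranges of the three probabilities.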

  20. Cost goals

    NASA Technical Reports Server (NTRS)

    Hoag, J.

    1981-01-01

    Cost goal activities for the point focusing parabolic dish program are reported. Cost goals involve three tasks: (1) determination of the value of the dish systems to potential users; (2) the cost targets of the dish system are set out; (3) the value side and cost side are integrated to provide information concerning the potential size of the market for parabolic dishes. The latter two activities are emphasized.

  1. Tracking Costs

    ERIC Educational Resources Information Center

    Erickson, Paul W.

    2010-01-01

    Even though there's been a slight reprieve in energy costs, the reality is that the cost of non-renewable energy is increasing, and state education budgets are shrinking. One way to keep energy and operations costs from overshadowing education budgets is to develop a 10-year energy audit plan to eliminate waste. First, facility managers should…

  2. Performance Boosting Additive

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Mainstream Engineering Corporation was awarded Phase I and Phase II contracts from Goddard Space Flight Center's Small Business Innovation Research (SBIR) program in early 1990. With support from the SBIR program, Mainstream Engineering Corporation has developed a unique low cost additive, QwikBoost (TM), that increases the performance of air conditioners, heat pumps, refrigerators, and freezers. Because of the energy and environmental benefits of QwikBoost, Mainstream received the Tibbetts Award at a White House Ceremony on October 16, 1997. QwikBoost was introduced at the 1998 International Air Conditioning, Heating, and Refrigeration Exposition. QwikBoost is packaged in a handy 3-ounce can (pressurized with R-134a) and will be available for automotive air conditioning systems in summer 1998.

  3. Designers' unified cost model

    NASA Technical Reports Server (NTRS)

    Freeman, W.; Ilcewicz, L.; Swanson, G.; Gutowski, T.

    1992-01-01

    The Structures Technology Program Office (STPO) at NASA LaRC has initiated development of a conceptual and preliminary designers' cost prediction model. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. This paper presents the team members, approach, goals, plans, and progress to date for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).

  4. Designer's unified cost model

    NASA Technical Reports Server (NTRS)

    Freeman, William T.; Ilcewicz, L. B.; Swanson, G. D.; Gutowski, T.

    1992-01-01

    A conceptual and preliminary designers' cost prediction model has been initiated. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer-aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. The approach, goals, plans, and progress are presented for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).
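
    A theoretical cost function of the kind both abstracts describe, relating geometric design features to summed material cost and labor content, can be sketched in generic form. The feature names and coefficients below are hypothetical illustrations, not COSTADE's actual formulation.

```python
def part_cost(area_m2, perimeter_m, n_plies, material_rate, labor_rate,
              layup_hours_per_ply_m2=0.2, trim_hours_per_m=0.05):
    """First-order composite part cost: material scales with a laminate volume
    proxy (area x plies); labor scales with layup work plus edge trimming."""
    material = material_rate * area_m2 * n_plies
    labor_hours = (layup_hours_per_ply_m2 * area_m2 * n_plies
                   + trim_hours_per_m * perimeter_m)
    return material + labor_rate * labor_hours

# Trade two hypothetical skin-panel designs on cost: a thicker cheap laminate
# versus a thinner laminate in a more expensive material.
design_a = part_cost(area_m2=2.0, perimeter_m=6.0, n_plies=16,
                     material_rate=40, labor_rate=80)
design_b = part_cost(area_m2=2.0, perimeter_m=6.0, n_plies=12,
                     material_rate=55, labor_rate=80)
```

    In a real designers' model such a function would be one module among many, so the same design variables can feed both cost and weight objectives during optimization.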

  5. ''When Cost Measures Contradict''

    SciTech Connect

    Montgomery, W. D.; Smith, A. E.; Biggar, S. L.; Bernstein, P. M.

    2003-05-09

    When regulators put forward new economic or regulatory policies, there is a need to compare the costs and benefits of these new policies to existing policies and other alternatives to determine which policy is most cost-effective. For command-and-control policies it is quite difficult to compute costs, but for more market-based policies, economists have had a great deal of success employing general equilibrium models to assess a policy's costs. Not all cost measures, however, arrive at the same ranking. Furthermore, cost measures can produce contradictory results for a specific policy. These problems make it difficult for a policy-maker to determine the best policy. For a cost measure to be of value, one would like to be confident of two things. First, one wants to be sure whether the policy is a winner or a loser. Second, one wants to be confident that the measure produces the correct policy ranking, that is, to have confidence in a cost measure's ability to correctly rank policies from most beneficial to most harmful. This paper empirically analyzes these two properties of different cost measures as they pertain to assessing the costs of carbon abatement policies, especially the Kyoto Protocol, under alternative assumptions about implementation.

  6. Computer-assisted assignment of functional domains in the nonstructural polyprotein of hepatitis E virus: delineation of an additional group of positive-strand RNA plant and animal viruses.

    PubMed

    Koonin, E V; Gorbalenya, A E; Purdy, M A; Rozanov, M N; Reyes, G R; Bradley, D W

    1992-09-01

    Computer-assisted comparison of the nonstructural polyprotein of hepatitis E virus (HEV) with proteins of other positive-strand RNA viruses allowed the identification of the following putative functional domains: (i) RNA-dependent RNA polymerase, (ii) RNA helicase, (iii) methyltransferase, (iv) a domain of unknown function ("X" domain) flanking the papain-like protease domains in the polyproteins of animal positive-strand RNA viruses, and (v) papain-like cysteine protease domain distantly related to the putative papain-like protease of rubella virus (RubV). Comparative analysis of the polymerase and helicase sequences of positive-strand RNA viruses belonging to the so-called "alpha-like" supergroup revealed grouping between HEV, RubV, and beet necrotic yellow vein virus (BNYVV), a plant furovirus. Two additional domains have been identified: one showed significant conservation between HEV, RubV, and BNYVV, and the other showed conservation specifically between HEV and RubV. The large nonstructural proteins of HEV, RubV, and BNYVV retained similar domain organization, with the exceptions of relocation of the putative protease domain in HEV as compared to RubV and the absence of the protease and X domains in BNYVV. These observations show that HEV, RubV, and BNYVV encompass partially conserved arrays of distinctive putative functional domains, suggesting that these viruses constitute a distinct monophyletic group within the alpha-like supergroup of positive-strand RNA viruses. PMID:1518855

  7. Cutting Transportation Costs.

    ERIC Educational Resources Information Center

    Lewis, Barbara

    1982-01-01

    Beginning on the front cover, this article tells how school districts are reducing their transportation costs. Particularly effective measures include the use of computers for bus maintenance and scheduling, school board ownership of buses, and the conversion of gasoline-powered buses to alternative fuels. (Author/MLF)

  8. On Convergence of Development Costs and Cost Models for Complex Spaceflight Instrument Electronics

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Patel, Umeshkumar D.; Kasa, Robert L.; Hestnes, Phyllis; Brown, Tammy; Vootukuru, Madhavi

    2008-01-01

    Development costs of a few recent spaceflight instrument electrical and electronics subsystems have diverged from respective heritage cost model predictions. The cost models used are Grass Roots, Price-H and Parametric Model. These cost models originated in the military and industry around 1970 and were successfully adopted and patched by NASA on a mission-by-mission basis for years. However, the complexity of new instruments has recently changed rapidly by orders of magnitude. This is most obvious in the complexity of a representative spaceflight instrument's electronics data system. It is now required to perform intermediate processing of digitized data apart from conventional processing of science phenomenon signals from multiple detectors. This involves on-board instrument formatting of computational operands from raw data (for example, images), multi-million operations per second on large volumes of data in reconfigurable hardware (in addition to processing on a general-purpose embedded or standalone instrument flight computer), as well as making decisions for on-board system adaptation and resource reconfiguration. The instrument data system is now tasked to perform more functions, such as forming packets and instrument-level data compression of more than one data stream, which are traditionally performed by the spacecraft command and data handling system. It is furthermore required that the electronics box for new complex instruments be developed for single-digit-watt power consumption and small size, be lightweight, and deliver super-computing capabilities. The conflict between the actual development cost of newer complex instruments and the heritage cost model predictions for their electronics components seems to be irreconcilable. This conflict and an approach to its resolution are addressed in this paper by determining complexity parameters and a complexity index, and their use in an enhanced cost model.

  9. The Economics of Computers.

    ERIC Educational Resources Information Center

    Sharpe, William F.

    A microeconomic theory is applied in this book to computer services and costs and for the benefit of those who are decision-makers in the selection, financing, and use of computers. Subtopics of the theory discussed include value and demand; revenue and profits; time and risk; and costs, inputs, and outputs. Application of the theory is explained…

  10. Sensitivity study of six public health risk computation cases from the US Department of Energy risk- and cost-estimate process pilot study

    SciTech Connect

    Chamberlain, P.J. II; Droppo, J.G. Jr.; Castleton, K.J.; Eslinger, P.W.

    1993-09-01

    This report describes the results of an analysis of the sensitivity of estimated public health risks to changes in model parameters relating to contaminant source releases, contaminant transport, and human exposure to contaminants from six waste sites. Estimated public health risks associated with these and other sites at US Department of Energy (DOE) compounds were reported in a pilot study done by the Oak Ridge National Laboratory (ORNL) for the DOE (ORNL 1992). The objective of the sensitivity analysis was to identify the subset of model input parameters whose variations accounted for the majority of the variation in the computed public health risk values. All environmental modeling in this study and the ORNL pilot study (ORNL 1992) was based on the Multimedia Environmental Pollutant Assessment System (Whelan et al. 1992). The results of the sensitivity analysis for the atmospheric case indicate that the most influential variables were emission rate and, to a lesser extent, population size. For the groundwater cases, there was no consistent ordering of the influential variables. Depending on the case considered, influential variables include the equilibrium partition coefficient (Kd), population size, pore water velocity, constituent inventory, contaminant flux rate from the source, and thickness of the saturated zone. For the overland transport case, the regression model fit was not adequate for a reliable identification of the influential variables.

  11. Computational Tracking of Mental Health in Youth: Latin American Contributions to a Low-Cost and Effective Solution for Early Psychiatric Diagnosis.

    PubMed

    Mota, Natália Bezerra; Copelli, Mauro; Ribeiro, Sidarta

    2016-06-01

    The early onset of mental disorders can lead to serious cognitive damage, and timely interventions are needed in order to prevent them. In patients of low socioeconomic status, as is common in Latin America, it can be hard to identify children at risk. Here, we briefly introduce the problem by reviewing the scarce epidemiological data from Latin America regarding the onset of mental disorders, and discussing the difficulties associated with early diagnosis. Then we present computational psychiatry, a new field to which we and other Latin American researchers have contributed methods particularly relevant for the quantitative investigation of psychopathologies manifested during childhood. We focus on new technologies that help to identify mental disease and provide prodromal evaluation, so as to promote early differential diagnosis and intervention. To conclude, we discuss the application of these methods to clinical and educational practice. A comprehensive and quantitative characterization of verbal behavior in children, from hospitals and laboratories to homes and schools, may lead to more effective pedagogical and medical intervention. PMID:27254827

  12. Assessing the Costs of Adequacy in California Public Schools: A Cost Function Approach

    ERIC Educational Resources Information Center

    Imazeki, Jennifer

    2008-01-01

    In this study, a cost function is used to estimate the costs for California districts to meet the achievement goals set out for them by the state. I calculate estimates of base costs (i.e., per pupil costs in a district with relatively low levels of student need) and marginal costs (i.e., the additional costs associated with specific student…
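
    The base-plus-marginal-cost structure this abstract describes can be expressed directly: per-pupil cost equals a base cost plus marginal weights for each student-need category. The coefficients below are illustrative, not the study's estimates for California districts.

```python
def district_cost_per_pupil(base_cost, weights, shares):
    """Per-pupil cost: base cost plus marginal cost for each student-need category.
    `weights` is extra cost per pupil in a category; `shares` is that category's
    enrollment fraction in the district."""
    return base_cost + sum(weights[k] * shares.get(k, 0.0) for k in weights)

# Illustrative: base cost plus marginal weights for poverty and English learners
weights = {"poverty": 2000.0, "english_learner": 1500.0}
cost = district_cost_per_pupil(6000.0, weights,
                               {"poverty": 0.4, "english_learner": 0.2})
# 6000 + 2000*0.4 + 1500*0.2 = 7100.0 per pupil
```

    In the cost-function literature, the weights themselves are estimated by regressing district spending on achievement targets and student characteristics; this sketch only shows how estimated weights are applied.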

  13. The costs of asthma.

    PubMed

    Barnes, P J; Jonsson, B; Klim, J B

    1996-04-01

    At present, asthma represents a substantial burden on health care resources in all countries so far studied. The costs of asthma are largely due to uncontrolled disease, and are likely to rise as its prevalence and severity increase. Costs could be significantly reduced if disease control is improved. A large proportion of the total cost of illness is derived from treating the consequences of poor asthma control: direct costs such as emergency room use and hospitalizations. Indirect costs, which include time off work or school and early retirement, are incurred when the disease is not fully controlled and becomes severe enough to have an effect on daily life. In addition, quality of life assessments show that asthma has a significant socioeconomic impact, not only on the patients themselves, but on the whole family. Underuse of prescribed therapy, which includes poor compliance, significantly contributes to the poor control of asthma. The consequences of poor compliance in asthma include increased morbidity and sometimes mortality, and increased health care expenditure. To improve asthma management, international guidelines have been introduced which recommend an increase in the use of prophylactic therapy. The resulting improvements in the control of asthma will reduce the number of hospitalizations associated with asthma, and may ultimately produce a shift within direct costs, with subsequent reductions in indirect costs. In addition, costs may be reduced by improving therapeutic interventions and through effective patient education programmes. This paper reviews current literature on the costs of asthma to assess how effectively money is spent and, by estimating the proportion of the cost attributable to uncontrolled disease, will identify where financial savings might be made. PMID:8726924

  14. Indirect Costs in Universities. ACE Special Report.

    ERIC Educational Resources Information Center

    Woodrow, Raymond J.

    Indirect costs of sponsored research projects and educational programs are as necessary as are the direct costs. This report demonstrates that they are real costs and that sponsors such as the Federal Government receive more than equitable treatment in the computation and application of indirect costs. The areas discussed include: the computation…

  15. Multi-tasking computer control of video related equipment

    NASA Technical Reports Server (NTRS)

    Molina, Rod; Gilbert, Bob

    1989-01-01

    The flexibility, cost-effectiveness and widespread availability of personal computers now make it possible to completely integrate the previously separate elements of video post-production into a single device. Specifically, a personal computer such as the Commodore-Amiga can perform multiple and simultaneous tasks from an individual unit. Relatively low cost, minimal space requirements, and user-friendliness provide the most favorable environment for the many phases of video post-production. Computers are well known for their basic abilities to process numbers, text and graphics and to reliably perform repetitive and tedious functions efficiently. These capabilities can now apply as either additions or alternatives to existing video post-production methods. A present example of computer-based video post-production technology is the RGB CVC (Computer and Video Creations) WorkSystem. A wide variety of integrated functions are made possible with an Amiga computer existing at the heart of the system.

  16. Costing imaging procedures.

    PubMed

    Bretland, P M

    1988-01-01

    The existing National Health Service financial system makes comprehensive costing of any service very difficult. A method of costing using modern commercial methods has been devised, classifying costs into variable, semi-variable and fixed and using the principle of overhead absorption for expenditure not readily allocated to individual procedures. It proved possible to establish a cost spectrum over the financial year 1984-85. The cheapest examinations were plain radiographs outside normal working hours, followed by plain radiographs, ultrasound, special procedures, fluoroscopy, nuclear medicine, angiography and angiographic interventional procedures in normal working hours. This differs from some published figures, particularly those in the Körner report. There was some overlap between fluoroscopic interventional and the cheaper nuclear medicine procedures, and between some of the more expensive nuclear medicine procedures and the cheaper angiographic ones. Only angiographic and the few more expensive nuclear medicine procedures exceed the cost of the inpatient day. The total cost of the imaging service to the district was about 4% of total hospital expenditure. It is shown that where more procedures are undertaken, the semi-variable and fixed (including capital) elements of the cost decrease (and vice versa) so that careful study is required to assess the value of proposed economies. The method is initially time-consuming and requires a computer system with 512 Kb of memory, but once the basic costing system is established in a department, detailed financial monitoring should become practicable. The necessity for a standard comprehensive costing procedure of this nature, based on sound cost accounting principles, appears inescapable, particularly in view of its potential application to management budgeting. PMID:3349241
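
    The costing scheme this abstract describes, classifying costs as variable, semi-variable and fixed and absorbing overheads across procedures, can be sketched as follows. The rates and volumes are hypothetical, and absorption is shown on a simple per-procedure basis rather than the article's full method.

```python
def procedure_cost(variable_cost, semi_variable_cost, overhead_pool, total_volume):
    """Full cost of one procedure: direct costs plus an absorbed share of the
    fixed overhead pool, spread evenly over annual procedure volume."""
    absorption_rate = overhead_pool / total_volume
    return variable_cost + semi_variable_cost + absorption_rate

# Hypothetical department: fixed overheads spread over annual throughput
plain_film = procedure_cost(variable_cost=3.0, semi_variable_cost=5.0,
                            overhead_pool=400_000, total_volume=50_000)
angiogram = procedure_cost(variable_cost=250.0, semi_variable_cost=120.0,
                           overhead_pool=400_000, total_volume=50_000)
```

    Raising `total_volume` lowers the absorbed fixed element per procedure, mirroring the observation above that unit costs fall as more procedures are undertaken (and vice versa).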

  17. Computer security in DOE distributed computing systems

    SciTech Connect

    Hunteman, W.J.

    1990-01-01

    The modernization of DOE facilities amid limited funding is creating pressure on DOE facilities to find innovative approaches to their daily activities. Distributed computing systems are becoming cost-effective solutions to improved productivity. This paper defines and describes typical distributed computing systems in the DOE. The special computer security problems present in distributed computing systems are identified and compared with traditional computer systems. The existing DOE computer security policy supports only basic networks and traditional computer systems and does not address distributed computing systems. A review of the existing policy requirements is followed by an analysis of the policy as it applies to distributed computing systems. Suggested changes in the DOE computer security policy are identified and discussed. The long lead time in updating DOE policy will require guidelines for applying the existing policy to distributed systems. Some possible interim approaches are identified and discussed. 2 refs.

  18. The True Cost of Ownership.

    ERIC Educational Resources Information Center

    McKenzie, Jamie

    2002-01-01

    Discussion of technology planning in schools focuses on the total cost of ownership, a model from business that can be helpful in determining the real costs of educational technology. Highlights include learning resources; organizational impacts and management; computer network resources, including software; research and development; and spirit…

  19. Progress Toward Automated Cost Estimation

    NASA Technical Reports Server (NTRS)

    Brown, Joseph A.

    1992-01-01

    Report discusses efforts to develop standard system of automated cost estimation (ACE) and computer-aided design (CAD). Advantage of system is time saved and accuracy enhanced by automating extraction of quantities from design drawings, consultation of price lists, and application of cost and markup formulas.
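
    The automation described, extracting quantities from drawings, consulting price lists, and applying cost and markup formulas, amounts to a short pipeline once the quantity take-off exists. The item names and rates below are hypothetical, not from the ACE system.

```python
def estimate(quantities, price_list, markup=0.15):
    """Automated cost estimate: a quantity take-off priced from a price list,
    with a markup formula applied to the direct cost."""
    direct = sum(qty * price_list[item] for item, qty in quantities.items())
    return direct * (1 + markup)

# Hypothetical quantity take-off extracted from a design drawing
prices = {"steel_kg": 2.5, "concrete_m3": 120.0, "labor_hr": 45.0}
total = estimate({"steel_kg": 800, "concrete_m3": 12, "labor_hr": 60}, prices)
# direct = 2000 + 1440 + 2700 = 6140; a 15% markup gives about 7061
```

    The time saving the report cites comes from the take-off step being read from CAD data rather than measured by hand; the pricing and markup arithmetic is the easy part.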

  20. Program Tracks Cost Of Travel

    NASA Technical Reports Server (NTRS)

    Mauldin, Lemuel E., III

    1993-01-01

    Travel Forecaster is menu-driven, easy-to-use computer program that plans, forecasts cost, and tracks actual vs. planned cost of business-related travel of division or branch of organization and compiles information into data base to aid travel planner. Ability of program to handle multiple trip entries makes it valuable time-saving device.

  1. Cost Determinants in Canadian Universities.

    ERIC Educational Resources Information Center

    Dickson, Vaughan

    1994-01-01

    A study investigated the relationship between costs per student (overall, instruction, library, computing, administration, physical plant) and enrollment in 61 Canadian universities. Also examined was the influence on cost of student mix, faculty-student ratio, faculty wages, program and discipline enrollment rates, and research intensity. (MSE)

  2. Computer Vision Systems

    NASA Astrophysics Data System (ADS)

    Gunasekaran, Sundaram

    Food quality is of paramount consideration for all consumers, and its importance is perhaps second only to food safety. By some definitions, food safety is also incorporated into the broad categorization of food quality. Hence, the need for careful and accurate evaluation of food quality is at the forefront of research and development in both academia and industry. Among the many available methods for food quality evaluation, computer vision has proven to be the most powerful, especially for nondestructively extracting and quantifying many features that have direct relevance to food quality assessment and control. Furthermore, computer vision systems serve to rapidly evaluate the most readily observable food quality attributes - the external characteristics such as color, shape, size, and surface texture. In addition, it is now possible, using advanced computer vision technologies, to “see” inside a food product and/or package to examine important quality attributes ordinarily unavailable to human evaluators. With rapid advances in electronic hardware and other associated imaging technologies, the cost-effectiveness and speed of computer vision systems have greatly improved, and many practical systems are already in place in the food industry.

  3. A Lightweight Distributed Framework for Computational Offloading in Mobile Cloud Computing

    PubMed Central

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

    The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity, and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds to mitigate resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time-consuming and resource-intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability, and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading with the proposed framework and with the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81%, and the turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC. PMID:25127245

  5. Cloud Computing for radiologists.

    PubMed

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources, such as computer software and hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It frees radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues, which need to be addressed to ensure the success of Cloud computing in the future. PMID:23599560

  6. Reducing reconditioning costs using computerized CP technology

    SciTech Connect

    Rizzo, M.E.; Wildman, T.A.

    1997-12-01

    New data collection technology and improved data interpretation diminish the need to spend hundreds of thousands or even millions of dollars to recondition poorly coated pipelines, without compromising safety. Application of alternative cathodic protection criteria rewards companies with additional resources to remain competitive. This paper examines the results of applying a combination of technologies that matured throughout the 1980s: Global Positioning Satellites, rugged field computers, fast analog-to-digital converters, solid state interruption devices, and interpretation of oscillographic cathodic protection waveprints. Cost-effective application of sound engineering principles assures safe pipeline operation, exceeds the letter and the spirit of NACE and DOT requirements, and yields significant financial returns.

  7. Identification of cost effective energy conservation measures

    NASA Technical Reports Server (NTRS)

    Bierenbaum, H. S.; Boggs, W. H.

    1978-01-01

    In addition to a successful program of readily implemented conservation actions for reducing building energy consumption at Kennedy Space Center, recent detailed analyses have identified further substantial savings for buildings representative of technical facilities designed when energy costs were low. The techniques employed to determine these energy savings consisted of facility configuration analysis, power and lighting measurements, detailed computer simulations, and simulation verifications. Use of these methods resulted in identification of projected energy savings as large as $330,000 a year (approximately a two-year break-even period) in a single building. Application of these techniques to other commercial buildings is discussed.

  8. Distributed computing at the SSCL

    SciTech Connect

    Cormell, L.; White, R.

    1993-05-01

    The rapid increase in the availability of high-performance, cost-effective RISC/UNIX workstations has been both a blessing and a curse. The blessing of having extremely powerful computing engines available on the desktop is well-known to many users. The user has tremendous freedom, flexibility, and control of his environment. That freedom can, however, become the curse of distributed computing. The user must become a system manager to some extent; he must worry about backups, maintenance, upgrades, etc. Traditionally these activities have been the responsibility of a central computing group. The central computing group, however, may find that it can no longer provide all of the traditional services. With the plethora of workstations now found on so many desktops throughout the entire campus or lab, the central computing group may be swamped by support requests. This talk will address several of these computer support and management issues by discussing the approach taken at the Superconducting Super Collider Laboratory. In addition, a brief review of the future directions of commercial products for distributed computing and management will be given.

  9. 49 CFR 1139.3 - Cost study.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... carrier (study carrier) in order to demonstrate the procedures by which the computer program distributes.... (d) Where cost studies are developed through the use of computer processing techniques, there...

  10. 49 CFR 1139.3 - Cost study.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... carrier (study carrier) in order to demonstrate the procedures by which the computer program distributes.... (d) Where cost studies are developed through the use of computer processing techniques, there...

  11. 48 CFR 246.470-1 - Assessment of additional costs.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CONTRACT MANAGEMENT QUALITY ASSURANCE Government Contract Quality... Supplies—Fixed-Price, after considering the factors in paragraph (c) of this subsection, the...

  12. 48 CFR 246.470-1 - Assessment of additional costs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CONTRACT MANAGEMENT QUALITY ASSURANCE Government Contract Quality... Supplies—Fixed-Price, after considering the factors in paragraph (c) of this subsection, the...

  13. 34 CFR 643.30 - What are allowable costs?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... rented space is not owned by the grantee. (f) Purchase of computer hardware, computer software, or other... allowable costs? The cost principles that apply to the Talent Search program are in 34 CFR part 74,...

  14. General aviation design synthesis utilizing interactive computer graphics

    NASA Technical Reports Server (NTRS)

    Galloway, T. L.; Smith, M. R.

    1976-01-01

    Interactive computer graphics is a fast growing area of computer application, due to such factors as substantial cost reductions in hardware, general availability of software, and expanded data communication networks. In addition to allowing faster and more meaningful input/output, computer graphics permits the use of data in graphic form to carry out parametric studies for configuration selection and for assessing the impact of advanced technologies on general aviation designs. The incorporation of interactive computer graphics into a NASA developed general aviation synthesis program is described, and the potential uses of the synthesis program in preliminary design are demonstrated.

  15. 15 CFR 990.66 - Additional considerations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... NATURAL RESOURCE DAMAGE ASSESSMENTS Restoration Implementation Phase § 990.66 Additional considerations... restoration success and the need for corrective action. (b) The reasonable costs of such actions are...

  16. 15 CFR 990.66 - Additional considerations.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... NATURAL RESOURCE DAMAGE ASSESSMENTS Restoration Implementation Phase § 990.66 Additional considerations... restoration success and the need for corrective action. (b) The reasonable costs of such actions are...

  17. 15 CFR 990.66 - Additional considerations.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... NATURAL RESOURCE DAMAGE ASSESSMENTS Restoration Implementation Phase § 990.66 Additional considerations... restoration success and the need for corrective action. (b) The reasonable costs of such actions are...

  18. Computer-assisted learning in critical care: from ENIAC to HAL.

    PubMed

    Tegtmeyer, K; Ibsen, L; Goldstein, B

    2001-08-01

    Computers are commonly used to serve many functions in today's modern intensive care unit. One of the most intriguing, and perhaps most challenging, applications of computers has been the attempt to improve medical education. With the introduction of the first computer, medical educators began looking for ways to incorporate its use into the modern curriculum. The cost and complexity of computers have consistently decreased since their introduction, making it increasingly feasible to incorporate computers into medical education. Simultaneously, the capabilities and capacities of computers have increased. Combining the computer with other modern digital technology has allowed the development of more intricate and realistic educational tools. The purpose of this article is to briefly describe the history and use of computers in medical education, with special reference to critical care medicine. In addition, we will examine the role of computers in teaching and learning and discuss the types of interaction between the computer user and the computer. PMID:11496040

  19. Textbooks: Costs and Issues

    ERIC Educational Resources Information Center

    Mize, Rita

    2004-01-01

    As community colleges seek to be as accessible as possible to students and attempt to retain low enrollment fees, manageable parking fees, and waivers of fees for those with financial needs, an additional and significant cost - for textbooks and supplies - has not been addressed systematically. While fees for a full-time student are $390 per…

  20. Mobile Computing for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Alena, Richard; Swietek, Gregory E. (Technical Monitor)

    1994-01-01

    The use of commercial computer technology in specific aerospace mission applications can reduce the cost and project cycle time required for the development of special-purpose computer systems. Additionally, the pace of technological innovation in the commercial market has made new computer capabilities available for demonstrations and flight tests. Three areas of research and development being explored by the Portable Computer Technology Project at NASA Ames Research Center are the application of commercial client/server network computing solutions to crew support and payload operations, the analysis of requirements for portable computing devices, and testing of wireless data communication links as extensions to the wired network. This paper will present computer architectural solutions to portable workstation design including the use of standard interfaces, advanced flat-panel displays and network configurations incorporating both wired and wireless transmission media. It will describe the design tradeoffs used in selecting high-performance processors and memories, interfaces for communication and peripheral control, and high resolution displays. The packaging issues for safe and reliable operation aboard spacecraft and aircraft are presented. The current status of wireless data links for portable computers is discussed from a system design perspective. An end-to-end data flow model for payload science operations from the experiment flight rack to the principal investigator is analyzed using capabilities provided by the new generation of computer products. A future flight experiment on-board the Russian MIR space station will be described in detail including system configuration and function, the characteristics of the spacecraft operating environment, the flight qualification measures needed for safety review, and the specifications of the computing devices to be used in the experiment. The software architecture chosen shall be presented. An analysis of the

  1. How to Bill Your Computer Services.

    ERIC Educational Resources Information Center

    Dooskin, Herbert P.

    1981-01-01

    A computer facility billing procedure should be designed so that the full costs of a computer center operation are equitably charged to the users. Design criteria, costing methods, and management's role are discussed. (Author/MLF)

  2. Integration of High-Performance Computing into Cloud Computing Services

    NASA Astrophysics Data System (ADS)

    Vouk, Mladen A.; Sills, Eric; Dreher, Patrick

    High-Performance Computing (HPC) projects span a spectrum of computer hardware implementations ranging from peta-flop supercomputers, high-end tera-flop facilities running a variety of operating systems and applications, to mid-range and smaller computational clusters used for HPC application development, pilot runs and prototype staging clusters. What they all have in common is that they operate as a stand-alone system rather than a scalable and shared user re-configurable resource. The advent of cloud computing has changed the traditional HPC implementation. In this article, we will discuss a very successful production-level architecture and policy framework for supporting HPC services within a more general cloud computing infrastructure. This integrated environment, called Virtual Computing Lab (VCL), has been operating at NC State since fall 2004. Nearly 8,500,000 HPC CPU-Hrs were delivered by this environment to NC State faculty and students during 2009. In addition, we present and discuss operational data that show that integration of HPC and non-HPC (or general VCL) services in a cloud can substantially reduce the cost of delivering cloud services (down to cents per CPU hour).

  3. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    PubMed

    Yazar, Seyhan; Gooden, George E C; Mackey, David A; Hewitt, Alex W

    2014-01-01

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (an E. coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E. coli and 53.5% (95% CI: 34.4-72.6) for the human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) higher for the E. coli and human assemblies, respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE. PMID:25247298
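
    The relative-cost figures above follow the standard percentage-difference convention. A minimal sketch of that calculation, using hypothetical placeholder prices rather than the study's actual billing data:

```python
# Illustrative sketch: the "X% more expensive" comparison used when
# benchmarking two providers. The dollar figures are hypothetical
# placeholders, not the study's measured costs.

def percent_more_expensive(cost_a: float, cost_b: float) -> float:
    """Return how much more expensive cost_a is than cost_b, in percent."""
    return (cost_a - cost_b) / cost_b * 100.0

emr_cost = 35.70  # assumed EMR cost per assembly, USD
gce_cost = 10.00  # assumed GCE cost per assembly, USD

print(f"EMR is {percent_more_expensive(emr_cost, gce_cost):.1f}% more expensive")
```

    With these assumed prices, EMR at 3.573 times the GCE price comes out 257.3% more expensive, matching the scale of the figures quoted above.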

  4. Cost Control

    ERIC Educational Resources Information Center

    Foreman, Phillip

    2009-01-01

    Education administrators involved in construction initiatives unanimously agree that when it comes to change orders, less is more. Change orders have a negative rippling effect of driving up building costs and producing expensive project delays that often interfere with school operations and schedules. Some change orders are initiated by schools…

  5. Low Cost Hydrogen Production Platform

    SciTech Connect

    Timothy M. Aaron, Jerome T. Jankowiak

    2009-10-16

    A technology and design evaluation was carried out for the development of a turnkey hydrogen production system in the range of 2.4 - 12 kg/h of hydrogen. The design is based on existing SMR technology and existing chemical processes and technologies to meet the design objectives. Consequently, the system design consists of a steam methane reformer, PSA system for hydrogen purification, natural gas compression, steam generation and all components and heat exchangers required for the production of hydrogen. The focus of the program is on packaging, system integration and an overall step change in the cost of capital required for the production of hydrogen at small scale. To assist in this effort, subcontractors were brought in to evaluate the design concepts and to assist in meeting the overall goals of the program. Praxair supplied the overall system and process design and the subcontractors were used to evaluate the components and system from a manufacturing and overall design optimization viewpoint. Design for manufacturing and assembly (DFMA) techniques, computer models and laboratory/full-scale testing of components were utilized to optimize the design during all phases of the design development. Early in the program evaluation, a review of existing Praxair hydrogen facilities showed that over 50% of the installed cost of a SMR based hydrogen plant is associated with the high temperature components (reformer, shift, steam generation, and various high temperature heat exchange). The main effort of the initial phase of the program was to develop an integrated high temperature component for these related functions. Initially, six independent concepts were developed and the processes were modeled to determine overall feasibility. The six concepts were eventually narrowed down to the highest potential concept. A US patent was awarded in February 2009 for the Praxair integrated high temperature component design. A risk analysis of the high temperature component was

  6. Probabilistic Fatigue: Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2002-01-01

    Fatigue is a primary consideration in the design of aerospace structures for long-term durability and reliability. Several types of fatigue must be considered in the design, including low cycle, high cycle, and combined fatigue for different cyclic loading conditions - for example, mechanical, thermal, erosion, etc. The traditional approach to evaluating fatigue has been to conduct many tests in the various service-environment conditions that the component will be subjected to in a specific design. This approach is reasonable and robust for that specific design. However, it is time-consuming, costly, and in general must be repeated for designs in different operating conditions. Recent research has demonstrated that fatigue of structural components/structures can be evaluated by computational simulation based on a novel paradigm. The main features of this novel paradigm are progressive telescoping scale mechanics, progressive scale substructuring, and progressive structural fracture, encompassed with probabilistic simulation. The generic features of this approach are to probabilistically telescope local material point damage all the way up to the structural component and to probabilistically decompose structural loads and boundary conditions all the way down to the material point. Additional features include a multifactor interaction model that probabilistically describes material property evolution and any changes due to various cyclic loads and other mutually interacting effects. The objective of the proposed paper is to describe this novel paradigm of computational simulation and present typical fatigue results for structural components. Additionally, the advantages, versatility, and inclusiveness of computational simulation versus testing are discussed. Guidelines for complementing simulated results with strategic testing are outlined. Typical results are shown for computational simulation of fatigue in metallic composite structures to demonstrate the

  7. SEASAT economic assessment. Volume 10: The SATIL 2 program (a program for the evaluation of the costs of an operational SEASAT system as a function of operational requirements and reliability) [computer programs for economic analysis and systems analysis of SEASAT satellite systems]

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The SATIL 2 computer program was developed to assist with the programmatic evaluation of alternative approaches to establishing and maintaining a specified mix of operational sensors on spacecraft in an operational SEASAT system. The program computes the probability distributions of events (i.e., number of launch attempts, number of spacecraft purchased, etc.), annual recurring cost, and present value of recurring cost. This is accomplished for the specific task of placing a desired mix of sensors in orbit in an optimal fashion in order to satisfy a specified sensor demand function. Flow charts are shown, and printouts of the programs are given.

  8. Additive Manufacturing of Hybrid Circuits

    NASA Astrophysics Data System (ADS)

    Sarobol, Pylin; Cook, Adam; Clem, Paul G.; Keicher, David; Hirschfeld, Deidre; Hall, Aaron C.; Bell, Nelson S.

    2016-07-01

    There is a rising interest in developing functional electronics using additively manufactured components. Considerations in materials selection and pathways to forming hybrid circuits and devices must demonstrate useful electronic function; must enable integration; and must complement the complex shape, low cost, high volume, and high functionality of structural but generally electronically passive additively manufactured components. This article reviews several emerging technologies being used in industry and research/development to provide integration advantages of fabricating multilayer hybrid circuits or devices. First, we review a maskless, noncontact, direct write (DW) technology that excels in the deposition of metallic colloid inks for electrical interconnects. Second, we review a complementary technology, aerosol deposition (AD), which excels in the deposition of metallic and ceramic powder as consolidated, thick conformal coatings and is additionally patternable through masking. Finally, we show examples of hybrid circuits/devices integrated beyond 2-D planes, using combinations of DW or AD processes and conventional, established processes.

  9. Cost considerations for interstellar missions

    NASA Astrophysics Data System (ADS)

    Andrews, Dana G.

    This paper examines the technical and economic feasibility of interstellar exploration. Three candidate interstellar propulsion systems are evaluated with respect to technical viability and compared on an estimated cost basis. Two of the systems, the laser-propelled lightsail (LPL) and the particle-beam propelled magsail (PBPM), appear to be technically feasible and capable of supporting one-way probes to nearby star systems within the lifetime of the principal investigators, if enough energy is available. The third propulsion system, the antimatter rocket, requires additional proof-of-concept demonstrations before its feasibility can be evaluated. Computer simulations of the acceleration and deceleration interactions of the LPL and PBPM were completed, and spacecraft configurations optimized for minimum energy usage are noted. The optimum LPL transfers about ten percent of the laser beam energy into kinetic energy of the spacecraft, while the optimum PBPM transfers about thirty percent. Since particle beam generators are roughly twice as energy efficient as large lasers, the PBPM propulsion system requires roughly one-sixth the busbar electrical energy an LPL system would require to launch an identical payload. The total beam energy requirement for an interstellar probe mission is roughly 10^20 joules, which would require the complete fissioning of one thousand tons of uranium, assuming thirty-five percent powerplant efficiency. This is roughly equivalent to a recurring cost per flight of $3.0 billion in reactor-grade enriched uranium at today's prices. Therefore, interstellar flight is an expensive proposition, but not an unaffordable one, if the nonrecurring costs of building the powerplant can be minimized.
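
    The quoted energy budget can be sanity-checked with a short back-of-the-envelope calculation. The fission constants below are standard physics values; the check itself is an illustration, not a figure from the paper:

```python
# Order-of-magnitude check (assumptions, not the paper's numbers):
# complete fission of 1000 metric tons of U-235 at ~200 MeV released
# per fission, converted to busbar electricity at 35% plant efficiency.

AVOGADRO = 6.022e23             # atoms per mole
MEV_TO_J = 1.602e-13            # joules per MeV
ENERGY_PER_FISSION_MEV = 200.0  # assumed average release per fission
MOLAR_MASS_U235 = 235.0         # g/mol

mass_g = 1000 * 1e6             # 1000 metric tons in grams
atoms = mass_g / MOLAR_MASS_U235 * AVOGADRO
thermal_j = atoms * ENERGY_PER_FISSION_MEV * MEV_TO_J
electric_j = 0.35 * thermal_j

# Both results land within an order of magnitude of the quoted 10^20 J.
print(f"thermal ~ {thermal_j:.1e} J, electric ~ {electric_j:.1e} J")
```

    The thermal release comes out near 8 x 10^19 J, consistent to within an order of magnitude with the 10^20 J beam-energy figure cited above.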

  10. Consumer Security Perceptions and the Perceived Influence on Adopting Cloud Computing: A Quantitative Study Using the Technology Acceptance Model

    ERIC Educational Resources Information Center

    Paquet, Katherine G.

    2013-01-01

    Cloud computing may provide cost benefits for organizations by eliminating the overhead costs of software, hardware, and maintenance (e.g., license renewals, upgrading software, servers and their physical storage space, administration along with funding a large IT department). In addition to the promised savings, the organization may require…

  11. Considering Thin Client Computing for Higher Education.

    ERIC Educational Resources Information Center

    Sheehan, Mark

    1998-01-01

    In response to concerns about the cost of keeping up with individual desktop computing technology, several new solutions have emerged. Referred to as "thin clients," or network-centric computers, they include two types of desktop device: the network computer and the Windows terminal. Purchase cost, life span, support costs, and overall total cost…

  12. Distributed computing environments for future space control systems

    NASA Technical Reports Server (NTRS)

    Viallefont, Pierre

    1993-01-01

    The aim of this paper is to present the results of a CNES research project on distributed computing systems. The purpose of this research was to study the impact of the use of new computer technologies in the design and development of future space applications. The first part of this study was a state-of-the-art review of distributed computing systems. One of the interesting ideas arising from this review is the concept of a 'virtual computer' allowing the distributed hardware architecture to be hidden from a software application. The 'virtual computer' can improve system performance by adapting the best architecture (addition of computers) to the software application without having to modify its source code. This concept can also decrease the cost and obsolescence of the hardware architecture. In order to verify the feasibility of the 'virtual computer' concept, a prototype representative of a distributed space application is being developed independently of the hardware architecture.

  13. The updated algorithm of the Energy Consumption Program (ECP): A computer model simulating heating and cooling energy loads in buildings

    NASA Technical Reports Server (NTRS)

    Lansing, F. L.; Strain, D. M.; Chai, V. W.; Higgins, S.

    1979-01-01

    The Energy Consumption Computer Program was developed to simulate building heating and cooling loads and to compute thermal and electric energy consumption and cost. This article reports on the new additional algorithms and modifications made in an effort to widen the areas of application. The program structure was rewritten accordingly to refine and advance the building model and to further reduce the processing time and cost. The program is noted for its very low cost and ease of use compared to other available codes. The accuracy of computations is not sacrificed, however, since the results are expected to lie within ±10% of actual energy meter readings.
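
    A minimal degree-day estimate conveys the flavor of a heating-load simulation. This sketch is NOT the ECP algorithm; every constant (heat-loss coefficient, balance-point temperature, energy price) is an assumed placeholder:

```python
# Illustrative degree-day heating-load sketch (not the ECP algorithm;
# all constants are assumed for illustration).

BASE_TEMP_C = 18.0   # assumed balance-point (base) temperature, deg C
UA_W_PER_K = 250.0   # assumed overall building heat-loss coefficient, W/K

def heating_energy_kwh(daily_mean_temps_c):
    """Sum heating degree-days over the period and convert to kWh."""
    degree_days = sum(max(0.0, BASE_TEMP_C - t) for t in daily_mean_temps_c)
    return UA_W_PER_K * degree_days * 24.0 / 1000.0  # W * day -> kWh

temps = [2.0, 5.0, 10.0, 15.0, 20.0]  # hypothetical daily mean temps, deg C
energy = heating_energy_kwh(temps)
cost = energy * 0.12                  # assumed energy price, $/kWh
print(f"{energy:.0f} kWh, ${cost:.2f}")
```

    Real load simulators such as ECP go far beyond this, modeling building geometry, internal gains, and hourly weather, but the degree-day sum shows where a load-to-cost calculation starts.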

  14. Component Cost Analysis of Large Scale Systems

    NASA Technical Reports Server (NTRS)

    Skelton, R. E.; Yousuff, A.

    1982-01-01

    The idea of cost decomposition is summarized to aid in the determination of the relative cost (or 'price') of each component of a linear dynamic system using quadratic performance criteria. In addition to the insights into system behavior that are afforded by such a component cost analysis (CCA), these CCA ideas naturally lead to a theory for cost-equivalent realizations.
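
    A hedged numerical sketch of the decomposition idea: for a stable linear system dx/dt = Ax with quadratic cost J = ∫ xᵀQx dt, the total cost from initial state x0 is x0ᵀPx0 where AᵀP + PA + Q = 0, and it splits into per-state contributions V_i = x0_i (Px0)_i. The matrices below are toy examples; this illustrates the decomposition only, not the full CCA theory of the report:

```python
# Toy component cost decomposition for a stable linear system.
# Total cost x0' P x0 (P from the Lyapunov equation A'P + PA = -Q)
# is split into per-state contributions V_i = x0_i * (P x0)_i.
import numpy as np

def lyapunov_P(A, Q):
    """Solve A'P + PA = -Q via the Kronecker/vec identity."""
    n = A.shape[0]
    K = np.kron(np.eye(n), A.T) + np.kron(A.T, np.eye(n))
    return np.linalg.solve(K, -Q.flatten()).reshape(n, n)

A = np.array([[-1.0, 0.0], [0.0, -2.0]])  # toy stable system
Q = np.eye(2)                             # state weighting
x0 = np.array([1.0, 1.0])

P = lyapunov_P(A, Q)
component_costs = x0 * (P @ x0)  # V_i, one "price" per state component
total = x0 @ P @ x0
print(component_costs, total)    # components sum to the total cost
```

    By construction the component costs sum exactly to the total quadratic cost, which is what lets them be read as relative prices of the components.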

  15. 40 CFR 30.27 - Allowable costs.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... of appendix E of 45 CFR part 74, “Principles for Determining Costs Applicable to Research and... accordance with the provisions of the Federal Acquisition Regulation (FAR) at 48 CFR part 31. In addition... costs. Allowability of costs shall be determined in accordance with the cost principles applicable...

  16. 26 CFR 1.1250-2 - Additional depreciation defined.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... additional depreciation for the property is $1,123, as computed in the table below: Year Actual depreciation... January 1, 1970, the additional depreciation for the property is $567, as computed in the table below... computed in the table below: Year Actual depreciation Straight line Additional depreciation (deficit)...

  17. Using Technology to Control Costs

    ERIC Educational Resources Information Center

    Ho, Simon; Schoenberg, Doug; Richards, Dan; Morath, Michael

    2009-01-01

    In this article, the authors examine the use of technology to control costs in the child care industry. One of these technology solutions is Software-as-a-Service (SaaS). SaaS solutions can help child care providers save money in many aspects of center management. In addition to cost savings, SaaS solutions are also particularly appealing to…

  18. Computer viruses

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1988-01-01

    The worm, Trojan horse, bacterium, and virus are destructive programs that attack information stored in a computer's memory. Virus programs, which propagate by incorporating copies of themselves into other programs, are a growing menace in the late-1980s world of unprotected, networked workstations and personal computers. Limited immunity is offered by memory protection hardware, digitally authenticated object programs, and antibody programs that kill specific viruses. Additional immunity can be gained from the practice of digital hygiene, primarily the refusal to use software from untrusted sources. Full immunity requires attention in a social dimension, the accountability of programmers.

  19. Computer systems

    NASA Technical Reports Server (NTRS)

    Olsen, Lola

    1992-01-01

    In addition to the discussions, Ocean Climate Data Workshop hosts gave participants an opportunity to hear about, see, and test for themselves some of the latest computer tools now available for those studying climate change and the oceans. Six speakers described computer systems and their functions. The introductory talks were followed by demonstrations to small groups of participants and some opportunities for participants to get hands-on experience. After this familiarization period, attendees were invited to return during the course of the Workshop and have one-on-one discussions and further hands-on experience with these systems. Brief summaries or abstracts of introductory presentations are addressed.

  20. Soft computing and fuzzy logic

    SciTech Connect

    Zadeh, L.A.

    1994-12-31

    Soft computing is a collection of methodologies that aim to exploit the tolerance for imprecision and uncertainty to achieve tractability, robustness, and low solution cost. Its principal constituents are fuzzy logic, neuro-computing, and probabilistic reasoning. Soft computing is likely to play an increasingly important role in many application areas, including software engineering. The role model for soft computing is the human mind.
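    Zadeh's abstract names fuzzy logic as a principal constituent without giving formulas; as a minimal sketch using only the standard textbook ingredients (triangular membership functions, min/max connectives), with illustrative supports not taken from the source:

    ```python
    def tri(x: float, a: float, b: float, c: float) -> float:
        """Triangular membership function peaking at b on support [a, c]."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    # Fuzzy sets 'warm' and 'humid' (illustrative supports), combined with
    # the standard min (AND) and max (OR) connectives.
    warm = tri(24.0, 15.0, 25.0, 35.0)         # degree of membership 0.9
    humid = tri(60.0, 40.0, 80.0, 100.0)       # degree of membership 0.5
    print(min(warm, humid), max(warm, humid))  # AND = 0.5, OR = 0.9
    ```

    Graded membership of this kind is what lets soft-computing systems trade exactness for tractability and low solution cost.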

  1. Metal Additive Manufacturing: A Review

    NASA Astrophysics Data System (ADS)

    Frazier, William E.

    2014-06-01

    This paper reviews the state of the art of an important, rapidly emerging manufacturing technology variously called additive manufacturing (AM), direct digital manufacturing, free-form fabrication, or 3D printing. A broad contextual overview of metallic AM is provided. AM has the potential to revolutionize the global parts manufacturing and logistics landscape. It enables distributed manufacturing and the production of parts on demand while offering the potential to reduce cost, energy consumption, and carbon footprint. This paper explores the materials science, processes, and business considerations associated with achieving these performance gains. It is concluded that a paradigm shift is required in order to fully exploit the potential of AM.

  2. Opportunity Cost of Distance Education

    ERIC Educational Resources Information Center

    Turkoglu, Recep

    2004-01-01

    In this study, opportunity cost (OC) of distance education (DE) has been examined. In addition, factors which affect OC of DE have been investigated. (Contains 1 table.) [Abstract modified to meet ERIC guidelines.]

  3. Computing Aerodynamics Of Propfans

    NASA Technical Reports Server (NTRS)

    Chandrasekaran, B.

    1987-01-01

    Cost and duration of wind-tunnel tests reduced. Computer program developed to predict interference of slipstream of propfan on supercritical wing at subsonic speeds. Use of program reduces cost and time involved in wind-tunnel testing of newly-designed wing/nacelle configurations. Program written in FORTRAN V.

  4. Computing technology in the 1980's. [computers

    NASA Technical Reports Server (NTRS)

    Stone, H. S.

    1978-01-01

    Advances in computing technology have been led by consistently improving semiconductor technology. The semiconductor industry has turned out ever faster, smaller, and less expensive devices since transistorized computers were first introduced 20 years ago. For the next decade, there appear to be new advances possible, with the rate of introduction of improved devices at least equal to the historic trends. The implication of these projections is that computers will enter new markets and will truly be pervasive in business, home, and factory as their cost diminishes and their computational power expands to new levels. The computer industry as we know it today will be greatly altered in the next decade, primarily because the raw computer system will give way to computer-based turn-key information and control systems.

  5. Computational chemistry

    NASA Technical Reports Server (NTRS)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the Numerical Aerodynamic Simulation (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials, with and without the presence of gases. Computational chemistry also has applications in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  6. Additive Manufacturing: Making Imagination the Major Limitation

    NASA Astrophysics Data System (ADS)

    Zhai, Yuwei; Lados, Diana A.; LaGoy, Jane L.

    2014-05-01

    Additive manufacturing (AM) refers to an advanced technology used for the fabrication of three-dimensional, near-net-shaped functional components directly from computer models, using unit materials. The fundamentals and working principle of AM offer several advantages, including near-net-shape capabilities, superior design and geometrical flexibility, innovative multi-material fabrication, reduced tooling and fixturing, shorter cycle time for design and manufacturing, instant local production at a global scale, and material, energy, and cost efficiency. Well suited to the demands of the modern manufacturing climate, AM is viewed as the new industrial revolution, making its way into a continuously increasing number of industries, such as aerospace, defense, automotive, medical, architecture, art, jewelry, and food. This overview was created to relate the historical evolution of the AM technology to its state-of-the-art developments and emerging applications. Generic thoughts on the microstructural characteristics, properties, and performance of AM-fabricated materials will also be discussed, primarily related to metallic materials. This write-up will introduce the general reader to specifics of the AM field vis-à-vis advantages and common techniques, materials and properties, current applications, and future opportunities.

  7. 47 CFR 68.318 - Additional limitations.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ....708 of this chapter (47 CFR 64.708). ... TERMINAL EQUIPMENT TO THE TELEPHONE NETWORK Conditions for Terminal Equipment Approval § 68.318 Additional... activation. Note to paragraph (b)(1): Emergency alarm dialers and dialers under external computer control...

  8. 47 CFR 68.318 - Additional limitations.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ....708 of this chapter (47 CFR 64.708). ... TERMINAL EQUIPMENT TO THE TELEPHONE NETWORK Conditions for Terminal Equipment Approval § 68.318 Additional... activation. Note to paragraph (b)(1): Emergency alarm dialers and dialers under external computer control...

  9. 47 CFR 68.318 - Additional limitations.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ....708 of this chapter (47 CFR 64.708). ... TERMINAL EQUIPMENT TO THE TELEPHONE NETWORK Conditions for Terminal Equipment Approval § 68.318 Additional... activation. Note to paragraph (b)(1): Emergency alarm dialers and dialers under external computer control...

  10. 47 CFR 68.318 - Additional limitations.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ....708 of this chapter (47 CFR 64.708). ... TERMINAL EQUIPMENT TO THE TELEPHONE NETWORK Conditions for Terminal Equipment Approval § 68.318 Additional... activation. Note to paragraph (b)(1): Emergency alarm dialers and dialers under external computer control...

  11. 47 CFR 68.318 - Additional limitations.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ....708 of this chapter (47 CFR 64.708). ... TERMINAL EQUIPMENT TO THE TELEPHONE NETWORK Conditions for Terminal Equipment Approval § 68.318 Additional... activation. Note to paragraph (b)(1): Emergency alarm dialers and dialers under external computer control...

  12. Advanced Fuel Cycle Cost Basis

    SciTech Connect

    D. E. Shropshire; K. A. Williams; W. B. Boore; J. D. Smith; B. W. Dixon; M. Dunzik-Gougar; R. D. Adams; D. Gombert; E. Schneider

    2008-03-01

    This report, commissioned by the U.S. Department of Energy (DOE), provides a comprehensive set of cost data supporting a cost analysis for the relative economic comparison of options for use in the Advanced Fuel Cycle Initiative (AFCI) Program. The report describes the AFCI cost basis development process, reference information on AFCI cost modules, a procedure for estimating fuel cycle costs, economic evaluation guidelines, and a discussion on the integration of cost data into economic computer models. This report contains reference cost data for 25 cost modules—23 fuel cycle cost modules and 2 reactor modules. The cost modules were developed in the areas of natural uranium mining and milling, conversion, enrichment, depleted uranium disposition, fuel fabrication, interim spent fuel storage, reprocessing, waste conditioning, spent nuclear fuel (SNF) packaging, long-term monitored retrievable storage, near surface disposal of low-level waste (LLW), geologic repository and other disposal concepts, and transportation processes for nuclear fuel, LLW, SNF, transuranic, and high-level waste.

  13. Advanced Fuel Cycle Cost Basis

    SciTech Connect

    D. E. Shropshire; K. A. Williams; W. B. Boore; J. D. Smith; B. W. Dixon; M. Dunzik-Gougar; R. D. Adams; D. Gombert

    2007-04-01

    This report, commissioned by the U.S. Department of Energy (DOE), provides a comprehensive set of cost data supporting a cost analysis for the relative economic comparison of options for use in the Advanced Fuel Cycle Initiative (AFCI) Program. The report describes the AFCI cost basis development process, reference information on AFCI cost modules, a procedure for estimating fuel cycle costs, economic evaluation guidelines, and a discussion on the integration of cost data into economic computer models. This report contains reference cost data for 26 cost modules—24 fuel cycle cost modules and 2 reactor modules. The cost modules were developed in the areas of natural uranium mining and milling, conversion, enrichment, depleted uranium disposition, fuel fabrication, interim spent fuel storage, reprocessing, waste conditioning, spent nuclear fuel (SNF) packaging, long-term monitored retrievable storage, near surface disposal of low-level waste (LLW), geologic repository and other disposal concepts, and transportation processes for nuclear fuel, LLW, SNF, and high-level waste.

  14. Advanced Fuel Cycle Cost Basis

    SciTech Connect

    D. E. Shropshire; K. A. Williams; W. B. Boore; J. D. Smith; B. W. Dixon; M. Dunzik-Gougar; R. D. Adams; D. Gombert; E. Schneider

    2009-12-01

    This report, commissioned by the U.S. Department of Energy (DOE), provides a comprehensive set of cost data supporting a cost analysis for the relative economic comparison of options for use in the Advanced Fuel Cycle Initiative (AFCI) Program. The report describes the AFCI cost basis development process, reference information on AFCI cost modules, a procedure for estimating fuel cycle costs, economic evaluation guidelines, and a discussion on the integration of cost data into economic computer models. This report contains reference cost data for 25 cost modules—23 fuel cycle cost modules and 2 reactor modules. The cost modules were developed in the areas of natural uranium mining and milling, conversion, enrichment, depleted uranium disposition, fuel fabrication, interim spent fuel storage, reprocessing, waste conditioning, spent nuclear fuel (SNF) packaging, long-term monitored retrievable storage, near surface disposal of low-level waste (LLW), geologic repository and other disposal concepts, and transportation processes for nuclear fuel, LLW, SNF, transuranic, and high-level waste.

  15. Cost-effectiveness of colorectal cancer screening – an overview

    PubMed Central

    Lansdorp-Vogelaar, Iris; Knudsen, Amy; Brenner, Hermann

    2010-01-01

    There are several modalities available for a colorectal cancer (CRC) screening program. When determining which CRC screening program to implement, the costs of such programs should be considered in comparison to the health benefits they are expected to provide. Cost-effectiveness analysis provides a tool to do this. In this paper we review the evidence on the cost-effectiveness of CRC screening. Published studies universally indicate that when compared with no CRC screening, all screening modalities provide additional years of life at a cost that is deemed acceptable by most industrialized nations. Many recent studies even find CRC screening to be cost-saving. However, when the alternative CRC screening strategies are compared against each other in an incremental cost-effectiveness analysis, no single optimal strategy emerges across the studies. There is consensus that the new technologies of stool DNA testing, computed tomographic colonography and capsule endoscopy are not yet cost-effective compared with the established CRC screening tests. PMID:20833348

  16. [Food additives and healthiness].

    PubMed

    Heinonen, Marina

    2014-01-01

    Additives are used for improving food structure or preventing its spoilage, for example. Many substances used as additives are also naturally present in food. The safety of additives is evaluated according to commonly agreed principles. If high concentrations of an additive cause adverse health effects for humans, a limit of acceptable daily intake (ADI) is set for it. An additive is a risk only when ADI is exceeded. The healthiness of food is measured on the basis of nutrient density and scientifically proven effects. PMID:24772784

  17. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Pratt, J. R.; St. Clair, T. L.; Burks, H. D.; Stoakley, D. M.

    1987-01-01

    A method has been found for enhancing the melt flow of thermoplastic polyimides during processing. A high molecular weight 422 copoly(amic acid) or copolyimide was fused with approximately 0.05 to 5 pct by weight of a low molecular weight amic acid or imide additive, and this melt was studied by capillary rheometry. Excellent flow and improved composite properties on graphite resulted from the addition of a PMDA-aniline additive to LARC-TPI. Solution viscosity studies imply that amic acid additives temporarily lower molecular weight and, hence, enlarge the processing window. Thus, compositions containing the additive have a lower melt viscosity for a longer time than those unmodified.

  18. Total Cost of Ownership: Key Infrastructure Management Tool.

    ERIC Educational Resources Information Center

    Bolton, Denny G.

    2001-01-01

    Many school districts have planned only for upfront software and hardware costs (one-quarter of "real" costs). This article examines major cost components of client-server computing, discusses TCO (total cost of ownership) as a tool for managing investment in technology, and considers how to leverage cost-reduction strategies. (MLH)

  19. Computational Biology, Advanced Scientific Computing, and Emerging Computational Architectures

    SciTech Connect

    2007-06-27

    This CRADA was established at the start of FY02 with $200 K from IBM and matching funds from DOE to support post-doctoral fellows in collaborative research between International Business Machines and Oak Ridge National Laboratory to explore effective use of emerging petascale computational architectures for the solution of computational biology problems. 'No cost' extensions of the CRADA were negotiated with IBM for FY03 and FY04.

  20. Models for computing combat risk

    NASA Astrophysics Data System (ADS)

    Jelinek, Jan

    2002-07-01

    Combat always involves uncertainty, and uncertainty entails risk. To ensure that a combat task is prosecuted with the desired probability of success, the task commander has to devise an appropriate task force and then adjust it continuously in the course of battle. In order to do so, he has to evaluate how the probability of task success is related to the structure, capabilities, and numerical strengths of the combatants. For this purpose, predictive models of combat dynamics, for combats in which the combatants fire asynchronously at random instants, are developed from first principles. Combats involving forces with both unlimited and limited ammunition supply are studied and modeled by stochastic Markov processes. In addition to the Markov models, another class of models, first proposed by Brown, was explored. These models compute directly the probability of win, in which we are primarily interested, without integrating the state probability equations. Experiments confirm that they produce exactly the same results at much lower computational cost.
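    The abstract does not reproduce the paper's equations; as a hedged sketch of a Markov combat model of this general kind, the recursion below computes the probability that side A wins a stochastic duel in which each surviving A unit scores kills at rate alpha and each B unit at rate beta (exponential, asynchronous fire, unlimited ammunition). The state definition and rates are illustrative assumptions, not the paper's models.

    ```python
    from functools import lru_cache

    def p_a_wins(m: int, n: int, alpha: float, beta: float) -> float:
        """Probability that side A (m units, kill rate alpha each) annihilates
        side B (n units, kill rate beta each) first. In the embedded Markov
        chain, the next casualty is a B unit with probability
        m*alpha / (m*alpha + n*beta)."""
        @lru_cache(maxsize=None)
        def p(m: int, n: int) -> float:
            if n == 0:
                return 1.0  # B annihilated: A wins
            if m == 0:
                return 0.0  # A annihilated: A loses
            ra, rb = m * alpha, n * beta
            return (ra * p(m, n - 1) + rb * p(m - 1, n)) / (ra + rb)
        return p(m, n)

    print(p_a_wins(1, 1, 1.0, 1.0))   # symmetric duel: 0.5
    print(p_a_wins(10, 8, 1.0, 1.0))  # numerical advantage compounds
    ```

    Like the Brown-style models mentioned in the abstract, this recursion yields the win probability directly, without integrating the full state-probability equations over time.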

  1. Cincinnati Big Area Additive Manufacturing (BAAM)

    SciTech Connect

    Duty, Chad E.; Love, Lonnie J.

    2015-03-04

    Oak Ridge National Laboratory (ORNL) worked with Cincinnati Incorporated (CI) to demonstrate Big Area Additive Manufacturing which increases the speed of the additive manufacturing (AM) process by over 1000X, increases the size of parts by over 10X and shows a cost reduction of over 100X. ORNL worked with CI to transition the Big Area Additive Manufacturing (BAAM) technology from a proof-of-principle (TRL 2-3) demonstration to a prototype product stage (TRL 7-8).

  2. Computer Technology for Industry.

    ERIC Educational Resources Information Center

    Aviation/Space, 1982

    1982-01-01

    A special National Aeronautics and Space Administration (NASA) service is contributing to national productivity by providing industry with reusable, low-cost, government-developed computer programs. Located at the University of Georgia, NASA's Computer Software Management and Information Center (COSMIC) has developed programs for equipment…

  3. Using technology to support investigations in the electronic age: tracking hackers to large scale international computer fraud

    NASA Astrophysics Data System (ADS)

    McFall, Steve

    1994-03-01

    With the increase in business automation and the widespread availability and low cost of computer systems, law enforcement agencies have seen a corresponding increase in criminal acts involving computers. The examination of computer evidence is a new field of forensic science with numerous opportunities for research and development. Research is needed to develop new software utilities to examine computer storage media, expert systems capable of finding criminal activity in large amounts of data, and to find methods of recovering data from chemically and physically damaged computer storage media. In addition, defeating encryption and password protection of computer files is also a topic requiring more research and development.

  4. Additive usage levels.

    PubMed

    Langlais, R

    1996-01-01

    With the adoption of the European Parliament and Council Directives on sweeteners, colours and miscellaneous additives, the Commission is now embarking on the project of coordinating the activities of the European Union Member States in the collection of the data that are to make up the report on food additive intake requested by the European Parliament. This presentation looks at the inventory of available sources on additive use levels and concludes that, for the time being, national legislation is still the best source of information, considering that the directives have yet to be transposed into national legislation. Furthermore, this presentation covers the correlation of the food categories as found in the additives directives with those used by national consumption surveys, and finds that in a number of instances this correlation still leaves a lot to be desired. The intake of additives via food ingestion, and the intake of substances which are chemically identical to additives but which occur naturally in fruits and vegetables, is found in a number of cases to be higher than the intake of additives added during the manufacture of foodstuffs. While the difficulties in contributing to the compilation of food additive intake data are recognized, industry as a whole, i.e. the food manufacturing and food additive manufacturing industries, is confident that in a concerted effort, use data on food additives by industry can be made available. Lastly, the paper points out that several years will still go by before the additives directives are transposed into national legislation and the food industry is able to make use of the new food legislative environment; food additive use data from the food industry will thus have to be reviewed at the beginning of the next century. PMID:8792135

  5. An additional middle cuneiform?

    PubMed Central

    Brookes-Fazakerley, S.D.; Jackson, G.E.; Platt, S.R.

    2015-01-01

    Additional cuneiform bones of the foot have been described in reference to the medial bipartite cuneiform or as small accessory ossicles. An additional middle cuneiform has not been previously documented. We present the case of a patient with an additional ossicle that has the appearance and location of an additional middle cuneiform. Recognizing such an anatomical anomaly is essential for ruling out second metatarsal base or middle cuneiform fractures and for the preoperative planning of arthrodesis or open reduction and internal fixation procedures in this anatomical location. PMID:26224890

  6. Computer-controlled environmental test systems - Criteria for selection, installation, and maintenance.

    NASA Technical Reports Server (NTRS)

    Chapman, C. P.

    1972-01-01

    Applications for presently marketed, new computer-controlled environmental test systems are suggested. It is shown that capital costs of these systems follow an exponential cost-function curve that levels out as additional applications are implemented. Some test laboratory organization changes are recommended in terms of new personnel requirements, and facility modifications are considered in support of a computer-controlled test system. Software for computer-controlled test systems is discussed, and control loop speed constraints are defined for real-time control functions. Suitable input and output devices and memory storage device tradeoffs are also considered.

  7. Cost-Effectiveness of Clinical Decision Support System in Improving Maternal Health Care in Ghana

    PubMed Central

    Dalaba, Maxwell Ayindenaba; Akweongo, Patricia; Aborigo, Raymond Akawire; Saronga, Happiness Pius; Williams, John; Blank, Antje; Kaltschmidt, Jens; Sauerborn, Rainer; Loukanova, Svetla

    2015-01-01

    Objective This paper investigated the cost-effectiveness of a computer-assisted Clinical Decision Support System (CDSS) in the identification of maternal complications in Ghana. Methods A cost-effectiveness analysis was performed in a before- and after-intervention study. Analysis was conducted from the provider’s perspective. The intervention area was the Kassena-Nankana district, where computer-assisted CDSS was used by midwives in maternal care in six selected health centres. Six selected health centers in the Builsa district served as the non-intervention group, where the normal Ghana Health Service activities were being carried out. Results Computer-assisted CDSS increased the detection of pregnancy complications during antenatal care (ANC) in the intervention health centres (before-intervention = 9/1,000 ANC attendance; after-intervention = 12/1,000 ANC attendance; P-value = 0.010). In the intervention health centres, there was a decrease in the number of complications during labour by 1.1%, though the difference was not statistically significant (before-intervention = 107/1,000 labour clients; after-intervention = 96/1,000 labour clients; P-value = 0.305). Also, at the intervention health centres, the average cost per pregnancy complication detected during ANC (cost-effectiveness ratio) decreased from US$17,017.58 (before-intervention) to US$15,207.5 (after-intervention). The incremental cost-effectiveness ratio (ICER) was estimated at US$1,142. Considering only additional costs (cost of computer-assisted CDSS), cost per pregnancy complication detected was US$285. Conclusions Computer-assisted CDSS has the potential to identify complications during pregnancy and yields a marginal reduction in labour complications. Implementing computer-assisted CDSS is more costly but more effective in the detection of pregnancy complications compared to routine maternal care, hence making the decision to implement CDSS very complex. Policy makers should however be guided by whether
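    The abstract reports an ICER of US$1,142 without the underlying cost and effect totals; as a generic sketch of how an incremental cost-effectiveness ratio is computed (the totals below are made up for illustration, not the study's data):

    ```python
    def icer(cost_new: float, cost_old: float,
             effect_new: float, effect_old: float) -> float:
        """Incremental cost-effectiveness ratio: the extra cost paid per
        extra unit of effect (here, per additional complication detected)."""
        return (cost_new - cost_old) / (effect_new - effect_old)

    # Hypothetical totals: the intervention costs $34,260 more and detects
    # 30 more complications than routine care.
    print(icer(134_260.0, 100_000.0, 130.0, 100.0))  # 34260 / 30 = 1142.0
    ```

    A lower ICER means each additional complication detected comes at a lower incremental price, which is how the alternative screening and decision-support strategies in studies like this one are ranked.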

  8. Tracking and computing

    SciTech Connect

    Niederer, J.

    1983-01-01

    This note outlines several ways in which large scale simulation computing and programming support may be provided to the SSC design community. One aspect of the problem is getting supercomputer power without the high cost and long lead times of large scale institutional computing. Another aspect is the blending of modern programming practices with more conventional accelerator design programs in ways that do not also swamp designers with the details of complicated computer technology.

  9. Debugging embedded computer programs. [tactical missile computers

    NASA Technical Reports Server (NTRS)

    Kemp, G. H.

    1980-01-01

    Every embedded computer program must complete its debugging cycle using some system that will allow real time debugging. Many of the common items addressed during debugging are listed. Seven approaches to debugging are analyzed to evaluate how well they treat those items. Cost evaluations are also included in the comparison. The results indicate that the best collection of capabilities to cover the common items present in the debugging task occurs in the approach where a minicomputer handles the environment simulation with an emulation of some kind representing the embedded computer. This approach can be taken at a reasonable cost. The case study chosen is an embedded computer in a tactical missile. Several choices of computer for the environment simulation are discussed as well as different approaches to the embedded emulator.

  10. Computational mechanics

    SciTech Connect

    Goudreau, G.L.

    1993-03-01

    The Computational Mechanics thrust area sponsors research into the underlying solid, structural and fluid mechanics and heat transfer necessary for the development of state-of-the-art general purpose computational software. The scale of computational capability spans office workstations, departmental computer servers, and Cray-class supercomputers. The DYNA, NIKE, and TOPAZ codes have achieved world fame through our broad collaborators program, in addition to their strong support of on-going Lawrence Livermore National Laboratory (LLNL) programs. Several technology transfer initiatives have been based on these established codes, teaming LLNL analysts and researchers with counterparts in industry, extending code capability to specific industrial interests of casting, metalforming, and automobile crash dynamics. The next-generation solid/structural mechanics code, ParaDyn, is targeted toward massively parallel computers, which will extend performance from gigaflop to teraflop power. Our work for FY-92 is described in the following eight articles: (1) Solution Strategies: New Approaches for Strongly Nonlinear Quasistatic Problems Using DYNA3D; (2) Enhanced Enforcement of Mechanical Contact: The Method of Augmented Lagrangians; (3) ParaDyn: New Generation Solid/Structural Mechanics Codes for Massively Parallel Processors; (4) Composite Damage Modeling; (5) HYDRA: A Parallel/Vector Flow Solver for Three-Dimensional, Transient, Incompressible Viscous Flow; (6) Development and Testing of the TRIM3D Radiation Heat Transfer Code; (7) A Methodology for Calculating the Seismic Response of Critical Structures; and (8) Reinforced Concrete Damage Modeling.

  11. Carbamate deposit control additives

    SciTech Connect

    Honnen, L.R.; Lewis, R.A.

    1980-11-25

    Deposit control additives for internal combustion engines are provided which maintain cleanliness of intake systems without contributing to combustion chamber deposits. The additives are poly(oxyalkylene) carbamates comprising a hydrocarbyloxy-terminated poly(oxyalkylene) chain of 2-5 carbon oxyalkylene units bonded through an oxycarbonyl group to a nitrogen atom of ethylenediamine.

  12. Impact of Classroom Computer Use on Computer Anxiety.

    ERIC Educational Resources Information Center

    Lambert, Matthew E.; And Others

    Increasing use of computer programs for undergraduate psychology education has raised concern over the impact of computer anxiety on educational performance. Additionally, some researchers have indicated that classroom computer use can exacerbate pre-existing computer anxiety. To evaluate the relationship between in-class computer use and computer…

  13. Artist meets computer

    NASA Astrophysics Data System (ADS)

    Faggin, Marzia

    1997-04-01

    I would like to share my experience of using the computer for creating art. I am a graphic designer originally trained without any exposure to the computer. I graduated in July of 1994 from a four-year curriculum of graphic design at the Istituto Europeo di Design in Milan, Italy. Italy is famous for its excellent design capability. Art and beauty influence the life of nearly every Italian. Everywhere you look on the streets there is art, from grandiose architecture to the displays in shop windows. A keen esthetic sense and a search for and appreciation of quality permeate all aspects of Italian life, manifesting in the way people cut their hair, the style of the clothes, and how furniture and everyday objects are designed. Italian taste is fine-tuned to the appreciation of refined textiles, and quality materials are often enhanced by simple design. The Italian culture has a long history of excellent artisanship, and good craftsmanship is highly appreciated. Gadgets have never been popular in Italian society. Gadgets are considered useless objects which add nothing to a person's life, and since they cost money they are actually viewed as a waste. The same is true for food; with the exception of the big cities filled with tourists, fast-food chains have never survived. Genuine and simple food is what people truly desire. A typical Italian sandwich, for example, is minimalist: the essential ingredients are left alone without additional sauces, because if something is delicious by itself why would anyone want to disguise its taste?

  14. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Fletcher, James C. (Inventor); Pratt, J. Richard (Inventor); St.clair, Terry L. (Inventor); Stoakley, Diane M. (Inventor); Burks, Harold D. (Inventor)

    1992-01-01

    A process for preparing polyimides having enhanced melt flow properties is described. The process consists of heating a mixture of a high molecular weight poly-(amic acid) or polyimide with a low molecular weight amic acid or imide additive in the range of 0.05 to 15 percent by weight of additive. The polyimide powders so obtained show improved processability, as evidenced by lower melt viscosity by capillary rheometry. Likewise, films prepared from mixtures of polymers with additives show improved processability with earlier onset of stretching by TMA.

  15. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Pratt, J. Richard (Inventor); St.clair, Terry L. (Inventor); Stoakley, Diane M. (Inventor); Burks, Harold D. (Inventor)

    1993-01-01

    A process for preparing polyimides having enhanced melt flow properties is described. The process consists of heating a mixture of a high molecular weight poly-(amic acid) or polyimide with a low molecular weight amic acid or imide additive in the range of 0.05 to 15 percent by weight of the additive. The polyimide powders so obtained show improved processability, as evidenced by lower melt viscosity by capillary rheometry. Likewise, films prepared from mixtures of polymers with additives show improved processability with earlier onset of stretching by TMA.

  16. Transmetalation from B to Rh in the course of the catalytic asymmetric 1,4-addition reaction of phenylboronic acid to enones: a computational comparison of diphosphane and diene ligands.

    PubMed

    Li, You-Gui; He, Gang; Qin, Hua-Li; Kantchev, Eric Assen B

    2015-02-14

    Transmetalation is a key elementary reaction of many important catalytic reactions. Among these, 1,4-addition of arylboronic acids to organic acceptors such as α,β-unsaturated ketones has emerged as one of the most important methods for asymmetric C-C bond formation. A key intermediate for the B-to-Rh transfer arising from quaternization on a boronic acid by a Rh-bound hydroxide (the active catalyst) has been proposed. Herein, DFT calculations (IEFPCM/PBE0/DGDZVP level of theory) establish the viability of this proposal, and characterize the associated pathways. The delivery of phenylboronic acid in the orientation suited for the B-to-Rh transfer from the very beginning is energetically preferable, and occurs with expulsion of Rh-coordinated water molecules. For the bulkier binap ligand, the barriers are higher (particularly for the phenylboronic acid activation step) due to a less favourable entropy term to the free energy, in accordance with the experimentally observed slower transmetalation rate. PMID:25422851

  17. 26 CFR 1.1250-2 - Additional depreciation defined.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... is $1,123, as computed in the table below: Year Actual depreciation Straight line Additional... depreciation for the property is $567, as computed in the table below: Years Depreciation Straight line... additional depreciation for the property is $29,000, as computed in the table below: Year Actual...

  18. 26 CFR 1.1250-2 - Additional depreciation defined.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... is $1,123, as computed in the table below: Year Actual depreciation Straight line Additional... depreciation for the property is $567, as computed in the table below: Years Depreciation Straight line... additional depreciation for the property is $29,000, as computed in the table below: Year Actual...

  19. 26 CFR 1.1250-2 - Additional depreciation defined.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... is $1,123, as computed in the table below: Year Actual depreciation Straight line Additional... depreciation for the property is $567, as computed in the table below: Years Depreciation Straight line... additional depreciation for the property is $29,000, as computed in the table below: Year Actual...

  20. Additional Security Considerations for Grid Management

    NASA Technical Reports Server (NTRS)

    Eidson, Thomas M.

    2003-01-01

    The use of Grid computing environments is growing in popularity. A Grid computing environment is primarily a wide area network that encompasses multiple local area networks, where some of the local area networks are managed by different organizations. A Grid computing environment also includes common interfaces for distributed computing software so that the heterogeneous set of machines that make up the Grid can be used more easily. The other key feature of a Grid is that the distributed computing software includes appropriate security technology. The focus of most Grid software is on the security involved with application execution, file transfers, and other remote computing procedures. However, there are other important security issues related to the management of a Grid and the users who use that Grid. This note discusses these additional security issues and makes several suggestions as to how they can be managed.

  1. Smog control fuel additives

    SciTech Connect

    Lundby, W.

    1993-06-29

    A method is described for controlling, reducing, or eliminating ozone and related smog resulting from photochemical reactions between ozone and automotive or industrial gases, comprising the addition of iodine or compounds of iodine to hydrocarbon-base fuels prior to or during combustion in an amount of about 1 part iodine per 240 to 10,000,000 parts fuel, by weight, to be accomplished by: (a) the addition of these inhibitors during or after the refining or manufacturing process of liquid fuels; (b) the production of these inhibitors for addition into fuel tanks, such as automotive or industrial tanks; or (c) the addition of these inhibitors into combustion chambers of equipment utilizing solid fuels for the purpose of reducing ozone.

  2. Food Additives and Hyperkinesis

    ERIC Educational Resources Information Center

    Wender, Ester H.

    1977-01-01

    The hypothesis that food additives are causally associated with hyperkinesis and learning disabilities in children is reviewed, and available data are summarized. Available from: American Medical Association 535 North Dearborn Street Chicago, Illinois 60610. (JG)

  3. Additional Types of Neuropathy

    MedlinePlus

    ... A A Listen En Español Additional Types of Neuropathy Charcot's Joint Charcot's Joint, also called neuropathic arthropathy, ... can stop bone destruction and aid healing. Cranial Neuropathy Cranial neuropathy affects the 12 pairs of nerves ...

  4. Software for Tracking Costs of Mars Projects

    NASA Technical Reports Server (NTRS)

    Wong, Alvin; Warfield, Keith

    2003-01-01

    The Mars Cost Tracking Model is a computer program that administers a system set up for tracking the costs of future NASA projects that pertain to Mars. Previously, no such tracking system existed, and documentation was written in a variety of formats and scattered in various places. It was difficult to justify costs or even track the history of costs of a spacecraft mission to Mars. The present software enables users to maintain all cost-model definitions, documentation, and justifications of cost estimates in one computer system that is accessible via the Internet. The software provides sign-off safeguards to ensure the reliability of information entered into the system. This system may eventually be used to track the costs of projects other than only those that pertain to Mars.

  5. Parametric Cost Deployment

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.

    1995-01-01

    Parametric cost analysis is a mathematical approach to estimating cost. Parametric cost analysis uses non-cost parameters, such as quality characteristics, to estimate the cost to bring forth, sustain, and retire a product. This paper reviews parametric cost analysis and shows how it can be used within the cost deployment process.
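    A common parametric technique is the cost estimating relationship (CER), a curve fit of cost against a non-cost parameter. The sketch below fits a power-law CER (cost = a * mass^b) by least squares in log-log space; the historical (mass, cost) pairs are hypothetical illustration data, not figures from the paper.

    ```python
    import math

    # Hypothetical historical data: (mass_kg, cost_M$) pairs for past units.
    history = [(100, 2.1), (250, 4.0), (400, 5.8), (800, 9.5)]

    # Fit cost = a * mass**b by ordinary least squares in log-log space.
    xs = [math.log(m) for m, _ in history]
    ys = [math.log(c) for _, c in history]
    n = len(history)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
    a = math.exp(ybar - b * xbar)

    def estimate_cost(mass_kg: float) -> float:
        """Estimated cost (M$) for a new unit, from the fitted CER."""
        return a * mass_kg ** b

    print(f"CER: cost = {a:.3f} * mass^{b:.3f}")
    print(f"Estimated cost at 500 kg: {estimate_cost(500):.2f} M$")
    ```

    The exponent b < 1 in this illustration captures the economy of scale a parametric model can express: doubling mass less than doubles estimated cost.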

  6. THE COST OF AUDIOVISUAL INSTRUCTION.

    ERIC Educational Resources Information Center

    1964

    A REPORT OF A SURVEY ON THE COST OF AUDIOVISUAL INSTRUCTION IN THE NATION'S PUBLIC ELEMENTARY AND SECONDARY SCHOOLS DURING 1962-63 AND 1963-64 WAS PRESENTED. INCLUDED WERE THE TOTAL EXPENDITURES FOR AUDIOVISUAL INSTRUCTION AND SPECIFIC EXPENDITURES FOR AUDIOVISUAL SALARIES, AUDIOVISUAL EQUIPMENT, AND FILM RENTALS. MEDIANS WERE COMPUTED FOR (1) THE…

  7. Commodity-Based Computing Clusters at PPPL.

    NASA Astrophysics Data System (ADS)

    Wah, Darren; Davis, Steven L.; Johansson, Marques; Klasky, Scott; Tang, William; Valeo, Ernest

    2002-11-01

    In order to cost-effectively facilitate mid-scale serial and parallel computations and code development, a number of commodity-based clusters have been built at PPPL. A recent addition is the PETREL cluster, consisting of 100 dual-processor machines, both Intel and AMD, interconnected by a 100Mbit switch. Sixteen machines have an additional Myrinet 2000 interconnect. Also underway is the implementation of a Prototype Topical Computing Facility which will explore the effectiveness and scaling of cluster computing for larger scale fusion codes, specifically including those being developed under the SCIDAC auspices. This facility will consist of two parts: a 64 dual-processor node cluster, with high speed interconnect, and a 16 dual-processor node cluster, utilizing gigabit networking, built for the purpose of exploring grid-enabled computing. The initial grid explorations will be in collaboration with the Princeton University Institute for Computational Science and Engineering (PICSciE), where a 16 processor cluster dedicated to investigation of grid computing is being built. The initial objectives are to (1) grid-enable the GTC code and an MHD code, making use of MPICH-G2 and (2) implement grid-enabled interactive visualization using DXMPI and the Chromium API.

  8. Costing climate change

    NASA Astrophysics Data System (ADS)

    Reay, David S.

    2002-12-01

    Debate over how, when, and even whether man-made greenhouse-gas emissions should be controlled has grown in intensity even faster than the levels of greenhouse gas in our atmosphere. Many argue that the costs involved in reducing emissions outweigh the potential economic damage of human-induced climate change. Here, existing cost-benefit analyses of greenhouse-gas reduction policies are examined, with a view to establishing whether any such global reductions are currently worthwhile. Potential for, and cost of, cutting our own individual greenhouse-gas emissions is then assessed. I find that many abatement strategies are able to deliver significant emission reductions at little or no net cost. Additionally, I find that there is huge potential for individuals to simultaneously cut their own greenhouse-gas emissions and save money. I conclude that cuts in global greenhouse-gas emissions, such as those of the Kyoto Protocol, cannot be justifiably dismissed as posing too large an economic burden.

  9. Additive manufacturing of hybrid circuits

    DOE PAGESBeta

    Bell, Nelson S.; Sarobol, Pylin; Cook, Adam; Clem, Paul G.; Keicher, David M.; Hirschfeld, Deidre; Hall, Aaron Christopher

    2016-03-26

    There is a rising interest in developing functional electronics using additively manufactured components. Considerations in materials selection and pathways to forming hybrid circuits and devices must demonstrate useful electronic function; must enable integration; and must complement the complex shape, low cost, high volume, and high functionality of structural but generally electronically passive additively manufactured components. This article reviews several emerging technologies being used in industry and research/development to provide integration advantages of fabricating multilayer hybrid circuits or devices. First, we review a maskless, noncontact, direct write (DW) technology that excels in the deposition of metallic colloid inks for electrical interconnects. Second, we review a complementary technology, aerosol deposition (AD), which excels in the deposition of metallic and ceramic powder as consolidated, thick conformal coatings and is additionally patternable through masking. As a result, we show examples of hybrid circuits/devices integrated beyond 2-D planes, using combinations of DW or AD processes and conventional, established processes.

  10. Computers and Computer Resources.

    ERIC Educational Resources Information Center

    Bitter, Gary

    1980-01-01

    This resource directory provides brief evaluative descriptions of six popular home computers and lists selected sources of educational software, computer books, and magazines. For a related article on microcomputers in the schools, see p53-58 of this journal issue. (SJL)

  11. Realistic costs of carbon capture

    SciTech Connect

    Al Juaied, Mohammed (Belfer Center for Science and International Affairs); Whitmore, Adam

    2009-07-01

    There is a growing interest in carbon capture and storage (CCS) as a means of reducing carbon dioxide (CO2) emissions. However there are substantial uncertainties about the costs of CCS. Costs for pre-combustion capture with compression (i.e. excluding costs of transport and storage and any revenue from EOR associated with storage) are examined in this discussion paper for First-of-a-Kind (FOAK) plant and for more mature technologies, or Nth-of-a-Kind plant (NOAK). For FOAK plant using solid fuels the levelised cost of electricity on a 2008 basis is approximately 10 cents/kWh higher with capture than for conventional plants (with a range of 8-12 cents/kWh). Costs of abatement are found typically to be approximately US$150/tCO2 avoided (with a range of US$120-180/tCO2 avoided). For NOAK plants the additional cost of electricity with capture is approximately 2-5 cents/kWh, with costs of the range of US$35-70/tCO2 avoided. Costs of abatement with carbon capture for other fuels and technologies are also estimated for NOAK plants. The costs of abatement are calculated with reference to conventional SCPC plant for both emissions and costs of electricity. Estimates for both FOAK and NOAK are mainly based on cost data from 2008, which was at the end of a period of sustained escalation in the costs of power generation plant and other large capital projects. There are now indications of costs falling from these levels. This may reduce the costs of abatement and costs presented here may be 'peak of the market' estimates. If general cost levels return, for example, to those prevailing in 2005 to 2006 (by which time significant cost escalation had already occurred from previous levels), then costs of capture and compression for FOAK plants are expected to be US$110/tCO2 avoided (with a range of US$90-135/tCO2 avoided). For NOAK plants costs are expected to be US$25-50/tCO2. Based on these considerations a likely representative range of costs of abatement from CCS excluding
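    The abatement figures above follow from a standard definition: cost per tCO2 avoided = (LCOE with capture − reference LCOE) / (reference emission intensity − capture-plant emission intensity). A minimal sketch of that arithmetic, using the abstract's FOAK premium of ~10 cents/kWh and assumed emission intensities (an SCPC reference of ~0.8 kg CO2/kWh and 90% capture — these intensities are illustrative assumptions, not values from the paper):

    ```python
    def cost_per_tco2_avoided(delta_lcoe_usd_per_kwh: float,
                              ref_tco2_per_kwh: float,
                              cap_tco2_per_kwh: float) -> float:
        # tCO2 avoided per kWh of generation
        avoided = ref_tco2_per_kwh - cap_tco2_per_kwh
        # USD per tCO2 avoided
        return delta_lcoe_usd_per_kwh / avoided

    # FOAK case: ~10 cents/kWh LCOE premium with capture.
    # Assumed intensities: SCPC ~0.8 kg/kWh; 90% capture -> ~0.12 kg/kWh
    # (capture plants burn extra fuel, so residual emissions exceed 10%).
    foak = cost_per_tco2_avoided(0.10, 0.8e-3, 0.12e-3)
    print(f"FOAK abatement cost: ~{foak:.0f} USD/tCO2 avoided")
    ```

    With these assumptions the result lands near the US$150/tCO2 figure quoted above, showing how the per-kWh premium and the avoided-emissions denominator combine.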

  12. Ubiquitous Computing: The Universal Use of Computers on College Campuses.

    ERIC Educational Resources Information Center

    Brown, David G., Ed.

    This book is a collection of vignettes from 13 universities where everyone on campus has his or her own computer. These 13 institutions have instituted "ubiquitous computing" in very different ways at very different costs. The chapters are: (1) "Introduction: The Ubiquitous Computing Movement" (David G. Brown); (2) "Dartmouth College" (Malcolm…

  13. A new application for food customization with additive manufacturing technologies

    NASA Astrophysics Data System (ADS)

    Serenó, L.; Vallicrosa, G.; Delgado, J.; Ciurana, J.

    2012-04-01

    Additive Manufacturing (AM) technologies have emerged as a freeform approach capable of producing almost any complete three dimensional (3D) objects from computer-aided design (CAD) data by successively adding material layer by layer. Despite the broad range of possibilities, commercial AM technologies remain complex and expensive, making them suitable only for niche applications. The developments of the Fab@Home system as an open AM technology discovered a new range of possibilities of processing different materials such as edible products. The main objective of this work is to analyze and optimize the manufacturing capacity of this system when producing 3D edible objects. A new heated syringe deposition tool was developed and several process parameters were optimized to adapt this technology to consumers' needs. The results revealed in this study show the potential of this system to produce customized edible objects without qualified personnel knowledge, therefore saving manufacturing costs compared to traditional technologies.

  14. Automatic Computer Mapping of Terrain

    NASA Technical Reports Server (NTRS)

    Smedes, H. W.

    1971-01-01

    Computer processing of 17 wavelength bands of visible, reflective infrared, and thermal infrared scanner spectrometer data, and of three wavelength bands derived from color aerial film has resulted in successful automatic computer mapping of eight or more terrain classes in a Yellowstone National Park test site. The tests involved: (1) supervised and non-supervised computer programs; (2) special preprocessing of the scanner data to reduce computer processing time and cost, and improve the accuracy; and (3) studies of the effectiveness of the proposed Earth Resources Technology Satellite (ERTS) data channels in the automatic mapping of the same terrain, based on simulations, using the same set of scanner data. The following terrain classes have been mapped with greater than 80 percent accuracy in a 12-square-mile area with 1,800 feet of relief: (1) bedrock exposures, (2) vegetated rock rubble, (3) talus, (4) glacial kame meadow, (5) glacial till meadow, (6) forest, (7) bog, and (8) water. In addition, shadows of clouds and cliffs are depicted, but were greatly reduced by using preprocessing techniques.

  15. Computational study of the transition state for H2 addition to Vaska-type complexes (trans-Ir(L)2(CO)X). Substituent effects on the energy barrier and the origin of the small H2/D2 kinetic isotope effect

    SciTech Connect

    Abu-Hasanayn, F.; Goldman, A.S.; Krogh-Jespersen, K. )

    1993-06-03

    Ab initio molecular orbital methods have been used to study transition state properties for the concerted addition reaction of H2 to Vaska-type complexes, trans-Ir(L)2(CO)X, 1 (L = PH3 and X = F, Cl, Br, I, CN, or H; L = NH3 and X = Cl). Stationary points on the reaction path retaining the trans-L2 arrangement were located at the Hartree-Fock level using relativistic effective core potentials and valence basis sets of double-ζ quality. The identities of the stationary points were confirmed by normal mode analysis. Activation energy barriers were calculated with electron correlation effects included via Moller-Plesset perturbation theory carried fully through fourth order, MP4(SDTQ). The more reactive complexes feature structurally earlier transition states and larger reaction exothermicities, in accord with the Hammond postulate. The experimentally observed increase in reactivity of Ir(PPh3)2(CO)X complexes toward H2 addition upon going from X = F to X = I is reproduced well by the calculations and is interpreted to be a consequence of diminished halide-to-Ir π-donation by the heavier halogens. Computed activation barriers (L = PH3) range from 6.1 kcal/mol (X = H) to 21.4 kcal/mol (X = F). Replacing PH3 by NH3 when X = Cl increases the barrier from 14.1 to 19.9 kcal/mol. Using conventional transition state theory, the kinetic isotope effects for H2/D2 addition are computed to lie between 1.1 and 1.7, with larger values corresponding to earlier transition states. Judging from the computational data presented here, tunneling appears to be unimportant for H2 addition to these iridium complexes. 51 refs., 4 tabs.
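    Conventional transition state theory, as used in the abstract, relates a barrier to a rate constant via the Eyring equation, k = (kB·T/h)·exp(−ΔG‡/RT). The sketch below applies it to the reported barrier range (6.1 vs. 21.4 kcal/mol), treating the activation energies as free-energy barriers — an approximation for illustration only, not the paper's actual rate calculation.

    ```python
    import math

    KB = 1.380649e-23   # Boltzmann constant, J/K
    H = 6.62607015e-34  # Planck constant, J*s
    R = 1.987204e-3     # gas constant, kcal/(mol*K)

    def tst_rate(barrier_kcal_mol: float, temp_k: float = 298.15) -> float:
        """Eyring rate constant (1/s) for a unimolecular barrier crossing."""
        return (KB * temp_k / H) * math.exp(-barrier_kcal_mol / (R * temp_k))

    # Barriers from the abstract: X = H (6.1) vs. X = F (21.4) kcal/mol
    ratio = tst_rate(6.1) / tst_rate(21.4)
    print(f"k(X=H)/k(X=F) at 298 K ~ {ratio:.2e}")
    ```

    The exponential dependence is the point: a ~15 kcal/mol barrier difference translates into many orders of magnitude in rate, consistent with the strong substituent effects the calculations predict.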

  16. 34 CFR 645.40 - What are allowable costs?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... activities such as field trips. (n) Purchase of computer hardware, computer software, or other equipment for... What are allowable costs? The cost principles that apply to the Upward Bound Program are in 34 CFR part... participants in an Upward Bound residential summer component, room and board—computed on a weekly basis—not...

  17. Amorphous Computing

    NASA Astrophysics Data System (ADS)

    Sussman, Gerald

    2002-03-01

    Digital computers have always been constructed to behave as precise arrangements of reliable parts, and our techniques for organizing computations depend upon this precision and reliability. Two emerging technologies, however, are beginning to undercut these assumptions about constructing and programming computers. These technologies -- microfabrication and bioengineering -- will make it possible to assemble systems composed of myriad information-processing units at almost no cost, provided: 1) that not all the units need to work correctly; and 2) that there is no need to manufacture precise geometrical arrangements or interconnection patterns among them. Microelectronic mechanical components are becoming so inexpensive to manufacture that we can anticipate combining logic circuits, microsensors, actuators, and communications devices integrated on the same chip to produce particles that could be mixed with bulk materials, such as paints, gels, and concrete. Imagine coating bridges or buildings with smart paint that can sense and report on traffic and wind loads and monitor structural integrity of the bridge. A smart paint coating on a wall could sense vibrations, monitor the premises for intruders, or cancel noise. Even more striking, there has been such astounding progress in understanding the biochemical mechanisms in individual cells that it appears we'll be able to harness these mechanisms to construct digital-logic circuits. Imagine a discipline of cellular engineering that could tailor-make biological cells that function as sensors and actuators, as programmable delivery vehicles for pharmaceuticals, as chemical factories for the assembly of nanoscale structures. Fabricating such systems seems to be within our reach, even if it is not yet within our grasp. Fabrication, however, is only part of the story. We can envision producing vast quantities of individual computing elements, whether microfabricated particles, engineered cells, or macromolecular computing

  18. Phenylethynyl Containing Reactive Additives

    NASA Technical Reports Server (NTRS)

    Connell, John W. (Inventor); Smith, Joseph G., Jr. (Inventor); Hergenrother, Paul M. (Inventor)

    2002-01-01

    Phenylethynyl containing reactive additives were prepared from aromatic diamines containing phenylethynyl groups and various ratios of phthalic anhydride and 4-phenylethynylphthalic anhydride, in glacial acetic acid to form the imide in one step or in N-methyl-2-pyrrolidinone to form the amide acid intermediate. The reactive additives were mixed in various amounts (10% to 90%) with oligomers containing either terminal or pendent phenylethynyl groups (or both) to reduce the melt viscosity and thereby enhance processability. Upon thermal cure, the additives react and become chemically incorporated into the matrix and effect an increase in crosslink density relative to that of the host resin. This resultant increase in crosslink density has advantageous consequences on the cured resin properties such as higher glass transition temperature and higher modulus as compared to that of the host resin.

  19. HEALTH COSTS OF AIR POLLUTION DAMAGES: A STUDY OF HOSPITALIZATION COSTS

    EPA Science Inventory

    An investigation of the hospitalization costs of exposure to air pollution in Allegheny County, Pennsylvania was conducted to determine whether persons exposed to air pollution incurred higher incidences of hospitalization or additional costs for treatment. A hospitalization data...

  20. Additives in plastics.

    PubMed Central

    Deanin, R D

    1975-01-01

    The polymers used in plastics are generally harmless. However, they are rarely used in pure form. In almost all commercial plastics, they are "compounded" with monomeric ingredients to improve their processing and end-use performance. In order of total volume used, these monomeric additives may be classified as follows: reinforcing fibers, fillers, and coupling agents; plasticizers; colorants; stabilizers (halogen stabilizers, antioxidants, ultraviolet absorbers, and biological preservatives); processing aids (lubricants, others, and flow controls); flame retardants, peroxides; and antistats. Some information is already available, and much more is needed, on potential toxicity and safe handling of these additives during processing and manufacture of plastics products. PMID:1175566

  1. Assessing the Cost Efficiency of Italian Universities

    ERIC Educational Resources Information Center

    Agasisti, Tommaso; Salerno, Carlo

    2007-01-01

    This study uses Data Envelopment Analysis to evaluate the cost efficiency of 52 Italian public universities. In addition to being one of the first such cost studies of the Italian system, it explicitly takes into account the internal cost structure of institutions' education programs; a task not prevalent in past Data Envelopment Analysis studies…

  2. 34 CFR 304.21 - Allowable costs.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Grantee § 304.21 Allowable costs. In addition to the allowable costs established in the Education Department General Administrative Regulations in 34 CFR 75.530 through 75.562, the following items are... 34 Education 2 2010-07-01 2010-07-01 false Allowable costs. 304.21 Section 304.21...

  3. 34 CFR 304.21 - Allowable costs.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Grantee § 304.21 Allowable costs. In addition to the allowable costs established in the Education Department General Administrative Regulations in 34 CFR 75.530 through 75.562, the following items are... 34 Education 2 2012-07-01 2012-07-01 false Allowable costs. 304.21 Section 304.21...

  4. 34 CFR 304.21 - Allowable costs.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Grantee § 304.21 Allowable costs. In addition to the allowable costs established in the Education Department General Administrative Regulations in 34 CFR 75.530 through 75.562, the following items are... 34 Education 2 2011-07-01 2010-07-01 true Allowable costs. 304.21 Section 304.21...

  5. Marginal Costing Techniques for Higher Education.

    ERIC Educational Resources Information Center

    Allen, Richard; Brinkman, Paul

    The techniques for calculating marginal costs in higher education are examined in detail. Marginal costs, as defined in economics, is the change in total cost associated with producing one additional unit of output. In higher education, the most frequently selected unit of output is a full-time-equivalent student or, alternatively, a student…
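    The definition above — marginal cost as the change in total cost from one additional unit of output — can be sketched directly. The quadratic total-cost function and its coefficients below are hypothetical illustration values, not figures from the report; output is measured in full-time-equivalent students, as the report suggests.

    ```python
    # Hypothetical institutional cost model: fixed costs plus a per-student
    # term plus a mild congestion (quadratic) term.
    def total_cost(students: int) -> float:
        fixed = 5_000_000.0  # fixed institutional costs, $
        return fixed + 8_000.0 * students + 0.05 * students ** 2

    # Marginal cost: change in total cost from enrolling one more student.
    def marginal_cost(students: int) -> float:
        return total_cost(students + 1) - total_cost(students)

    for n in (1_000, 5_000, 10_000):
        print(f"Marginal cost of student {n + 1:>6,}: ${marginal_cost(n):,.2f}")
    ```

    Note how marginal cost diverges from average cost: the fixed term cancels out of the difference, which is why marginal costing and average costing can give very different answers for enrollment decisions.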

  6. The cost of waste: Coatings

    SciTech Connect

    Rice, S.

    1996-06-01

    Some of the greatest opportunities for tapping into hidden profit potential at industrial coatings manufacturing plants may be in their waste or, rather, in their ability to eliminate the root causes of waste generation. This occurs because the total cost of waste (TCOW) does not appear only in a plant's cost to dispose of or recycle its waste. TCOW has four principal components, each of which is shown in a different line of the monthly financial accounting report. An additional potential component--the production plant capacity and personnel that are utilized producing controllable waste instead of product for sale and profit--fails to show up at all. Expanding the focus of waste reduction from merely reducing an individual component's costs to eliminating the root causes of controllable waste generation provides significant additional profits and frees plant production equipment and people to: make more product for sale and profit, and reduce per-unit manufacturing costs.

  7. Biobased lubricant additives

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Fully biobased lubricants are those formulated using all biobased ingredients, i.e. biobased base oils and biobased additives. Such formulations provide the maximum environmental, safety, and economic benefits expected from a biobased product. Currently, there are a number of biobased base oils that...

  8. Multifunctional fuel additives

    SciTech Connect

    Baillargeon, D.J.; Cardis, A.B.; Heck, D.B.

    1991-03-26

    This paper discusses a composition comprising a major amount of a liquid hydrocarbyl fuel and a minor low-temperature flow properties improving amount of an additive product of the reaction of a suitable diol and product of a benzophenone tetracarboxylic dianhydride and a long-chain hydrocarbyl aminoalcohol.

  9. Health and economic costs of physical inactivity.

    PubMed

    Kruk, Joanna

    2014-01-01

    Physical inactivity has reached epidemic levels in developed countries and is being recognized as a serious public health problem. Recent evidence shows that a high percentage of individuals worldwide are physically inactive, i.e. do not achieve the WHO's present recommendation of 150 minutes of moderate to vigorous intensity per week in addition to usual activities. A sedentary lifestyle is one of the leading causes of death and a high risk factor for several chronic diseases, such as cancer, cardiovascular disease, type 2 diabetes, and osteoporosis. This article summarizes evidence for the relative risk of the civilization diseases attributable to physical inactivity and the most important conclusions available from recent investigations computing the economic costs specific to physical inactivity. The findings provide the health and economic arguments needed for people to understand the consequences of a sedentary lifestyle. This may also be useful for public health policy in the creation of programmes for the prevention of physical inactivity. PMID:25292019

  10. Neutron Characterization for Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Watkins, Thomas; Bilheux, Hassina; An, Ke; Payzant, Andrew; DeHoff, Ryan; Duty, Chad; Peter, William; Blue, Craig; Brice, Craig A.

    2013-01-01

Oak Ridge National Laboratory (ORNL) is leveraging decades of experience in neutron characterization of advanced materials together with resources such as the Spallation Neutron Source (SNS) and the High Flux Isotope Reactor (HFIR) shown in Fig. 1 to solve challenging problems in additive manufacturing (AM). Additive manufacturing, or three-dimensional (3-D) printing, is a rapidly maturing technology wherein components are built by selectively adding feedstock material at locations specified by a computer model. The majority of these technologies use thermally driven phase change mechanisms to convert the feedstock into functioning material. As the molten material cools and solidifies, the component is subjected to significant thermal gradients, generating significant internal stresses throughout the part (Fig. 2). As layers are added, inherent residual stresses cause warping and distortions that lead to geometrical differences between the final part and the original computer-generated design. This effect also limits geometries that can be fabricated using AM, such as thin-walled, high-aspect-ratio, and overhanging structures. Distortion may be minimized by intelligent toolpath planning or strategic placement of support structures, but these approaches are not well understood and often "Edisonian" in nature. Residual stresses can also impact component performance during operation. For example, in a thermally cycled environment such as a high-pressure turbine engine, residual stresses can cause components to distort unpredictably. Different thermal treatments on as-fabricated AM components have been used to minimize residual stress, but components still retain a nonhomogeneous stress state and/or demonstrate a relaxation-derived geometric distortion. Industry, federal laboratory, and university collaboration is needed to address these challenges and enable the U.S. to compete in the global market. Work is currently being conducted on AM technologies at the ORNL

  11. Unraveling Higher Education's Costs.

    ERIC Educational Resources Information Center

    Gordon, Gus; Charles, Maria

    1998-01-01

    The activity-based costing (ABC) method of analyzing institutional costs in higher education involves four procedures: determining the various discrete activities of the organization; calculating the cost of each; determining the cost drivers; tracing cost to the cost objective or consumer of each activity. Few American institutions have used the…
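The four ABC procedures can be sketched numerically; the activities, cost drivers, and dollar amounts below are hypothetical illustrations, not figures from the article:

```python
# Activity-based costing in four steps: (1) identify the discrete
# activities, (2) calculate the cost of each, (3) determine the cost
# drivers, (4) trace cost to the cost objective. All numbers invented.
activities = {
    # activity: (total activity cost, total driver volume, cost driver)
    "advising":    (120_000,  2_000, "advising hours"),
    "instruction": (900_000, 30_000, "credit hours"),
}

# Steps 1-3: a cost rate per unit of each activity's driver.
rates = {name: cost / volume for name, (cost, volume, _) in activities.items()}

# Step 4: trace cost to one cost objective, e.g. a degree program that
# consumes 150 advising hours and 4,500 credit hours.
consumption = {"advising": 150, "instruction": 4_500}
program_cost = sum(rates[name] * qty for name, qty in consumption.items())
print(program_cost)  # 144000.0
```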

  12. A Policy "Pothole" in Instructional Computer Planning.

    ERIC Educational Resources Information Center

    Davis, Matthew D.

    1995-01-01

    Although educational leaders occasionally develop instructional technology plans, many plans omit financial provisions for continuing costs, including the ongoing repair and eventual replacement of instructional computers. Computer costs are not allocated clearly to their functions: productivity and curriculum delivery. Computers are not…

  13. 25 CFR 700.81 - Monthly housing cost.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 2 2010-04-01 2010-04-01 false Monthly housing cost. 700.81 Section 700.81 Indians THE... Policies and Instructions Definitions § 700.81 Monthly housing cost. (a) General. The term monthly housing...) Computation of monthly housing cost for replacement dwelling. A person's monthly housing cost for...

  14. 25 CFR 700.81 - Monthly housing cost.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 25 Indians 2 2011-04-01 2011-04-01 false Monthly housing cost. 700.81 Section 700.81 Indians THE... Policies and Instructions Definitions § 700.81 Monthly housing cost. (a) General. The term monthly housing...) Computation of monthly housing cost for replacement dwelling. A person's monthly housing cost for...

  15. Situational Awareness from a Low-Cost Camera System

    NASA Technical Reports Server (NTRS)

    Freudinger, Lawrence C.; Ward, David; Lesage, John

    2010-01-01

A method gathers scene information from a low-cost camera system. Existing surveillance systems that use enough cameras for continuous coverage of a large field necessarily generate enormous amounts of raw data. Digitizing and channeling that data to a central computer and processing it in real time is difficult with low-cost, commercially available components. In the newly developed system, cameras are located along a combined power-and-data wire to form a string-of-lights camera system. Each camera is accessible through this network interface using standard TCP/IP networking protocols. The cameras more closely resemble cell-phone cameras than traditional security cameras. Processing capabilities are built directly onto the camera backplane, which helps maintain a low cost, and the low power requirements of each camera allow a single imaging system to comprise over 100 cameras. Each camera has built-in processing capabilities to detect events and cooperatively share this information with neighboring cameras. The location of an event is reported to the host computer in Cartesian coordinates computed from data correlated across multiple cameras. In this way, events in the field of view present low-bandwidth information to the host rather than high-bandwidth bitmap data constantly generated by the cameras. The approach offers greater flexibility than conventional systems without compromising performance, because the many small, low-cost cameras have overlapping fields of view: viewing increases significantly without surveillance areas being ignored, as can occur when pan, tilt, and zoom cameras look away. Additionally, because a single cable carries both power and data, installation costs are lower. The technology is targeted toward 3D scene extraction and automatic target tracking for military and commercial applications. Security systems and environmental/vehicular monitoring systems are also potential applications.
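As a sketch of the coordinate computation the abstract describes, two cameras with known positions and overlapping fields of view can locate an event by intersecting bearing rays. The function and geometry below are illustrative assumptions, not the system's actual interface:

```python
import math

def locate(cam_a, bearing_a, cam_b, bearing_b):
    """Intersect two bearing rays (angles in radians, CCW from the +x axis)
    cast from two camera positions; returns the event's (x, y) location."""
    ax, ay = cam_a
    bx, by = cam_b
    dax, day = math.cos(bearing_a), math.sin(bearing_a)
    dbx, dby = math.cos(bearing_b), math.sin(bearing_b)
    # Solve ax + t*dax = bx + s*dbx (and likewise for y) for t.
    denom = dax * dby - day * dbx
    if abs(denom) < 1e-12:
        raise ValueError("bearings are parallel; no unique intersection")
    t = ((bx - ax) * dby - (by - ay) * dbx) / denom
    return (ax + t * dax, ay + t * day)

# A camera at the origin sees the event at 45 degrees; a camera at (10, 0)
# sees it at 135 degrees; the rays intersect at (5, 5).
print(locate((0, 0), math.radians(45), (10, 0), math.radians(135)))
```

In the real system each camera would report only a bearing and a timestamp, keeping per-camera bandwidth low while the host (or a neighboring camera) performs this correlation.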

  16. Introduction to Cost Analysis in IR: Challenges and Opportunities.

    PubMed

    Roudsari, Bahman; McWilliams, Justin; Bresnahan, Brian; Padia, Siddharth A

    2016-04-01

    Demonstration of value has become increasingly important in the current health care system. This review summarizes four of the most commonly used cost analysis methods relevant to IR that could be adopted to demonstrate the value of IR interventions: the cost minimization study, cost-effectiveness assessment, cost-utility analysis, and cost-benefit analysis. In addition, the issues of true cost versus hospital charges, modeling in cost studies, and sensitivity analysis are discussed. PMID:26922978
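Of the four methods, cost-effectiveness assessment reduces to a simple ratio, the incremental cost-effectiveness ratio (ICER). The figures below are hypothetical, not from the review:

```python
# ICER: incremental cost of a new intervention divided by its incremental
# effect, e.g. dollars per quality-adjusted life year (QALY) gained.
def icer(cost_new, effect_new, cost_old, effect_old):
    """(delta cost) / (delta effect) versus a comparator intervention."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# A hypothetical new IR procedure costing $12,000 yielding 2.0 QALYs,
# versus a comparator costing $8,000 yielding 1.5 QALYs:
print(icer(12_000, 2.0, 8_000, 1.5))  # 8000.0 dollars per QALY gained
```

Cost-utility analysis uses the same ratio with utility-weighted outcomes, while cost-benefit analysis converts the effect itself into monetary units.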

  17. Additive manufacturing of RF absorbers

    NASA Astrophysics Data System (ADS)

    Mills, Matthew S.

The ability of additive manufacturing techniques to fabricate integrated electromagnetic absorbers, tuned for specific radio frequency bands, within structural composites allows for unique combinations of mechanical and electromagnetic properties. These composites and films can be used for RF shielding of sensitive electromagnetic components through in-plane and out-of-plane RF absorption. Structural composites are a common building block of many commercial platforms, and these platforms may be placed in situations that call for embedded RF-absorbing properties alongside structural properties. Instead of adding radar-absorbing treatments to the external surface of existing structures, which adds size, weight, and cost, it could prove advantageous to integrate the microwave-absorbing properties directly into the composite during the fabrication process. In this thesis, a method based on additive manufacturing techniques for composite structures with prescribed electromagnetic loss, within the frequency range 1 to 26 GHz, is presented. This method utilizes screen printing and nScrypt micro-dispensing to pattern a carbon-based ink onto low-loss substrates. The materials chosen for this study are presented, along with the fabrication technique used to create the RF-absorbing structures, the calibration methods, the modeling of the RF structures, and the applications in which this technology can be utilized.

  18. Boron addition to alloys

    SciTech Connect

    Coad, B. C.

    1985-08-20

A process for adding boron to an alloy: a melt of the alloy is formed, and boric oxide together with a reactive metal, selected from the group consisting of aluminum, titanium, zirconium, and mixtures thereof, is added to the melt. The resulting reactive mixture is maintained in the molten state so that the boric oxide reacts with the reactive metal, converting at least a portion of the boric oxide to boron, which dissolves in the melt, and at least a portion of the reactive metal to its oxide, which remains with the melt. The melt is then poured into a gas stream to form a first atomized powder, which is subsequently remelted with further addition of boric oxide, re-atomized, and reprocessed until essentially all the reactive metal is converted to metal oxide, producing a powdered alloy containing specified amounts of boron.

  19. Vinyl capped addition polyimides

    NASA Technical Reports Server (NTRS)

    Vannucci, Raymond D. (Inventor); Malarik, Diane C. (Inventor); Delvigs, Peter (Inventor)

    1991-01-01

Polyimide resins (PMR) are generally useful where high strength and temperature capability are required (at temperatures up to about 700 F). Polyimide resins are particularly useful in applications such as jet engine compressor components, for example blades, vanes, air seals, air splitters, and engine casing parts. Aromatic vinyl-capped addition polyimides are obtained by reacting a diamine, an ester of a tetracarboxylic acid, and an aromatic vinyl compound. Low-void materials with improved oxidative stability when exposed to 700 F air may be fabricated as fiber-reinforced, high-molecular-weight, capped polyimide composites. The aromatic vinyl-capped polyimides have a more aromatic nature and are more thermally stable than the highly aliphatic, norbornenyl-type end-capped polyimides employed in PMR resins. The substitution of aromatic vinyl end-caps for norbornenyl end-caps in addition polyimides results in polymers with improved oxidative stability.

  20. [Biologically active food additives].

    PubMed

    Velichko, M A; Shevchenko, V P

    1998-07-01

More than half of the 40 projects for medical science development through the year 2000 involve bioactive food additives, called "the food of the XXI century": non-pharmacological remedies for many diseases. Most of these additives, nutraceuticals and parapharmaceuticals, are intended to enrich the food rations of sick or healthy people. The ecologically safest and most effective are combined domestic adaptogens with immunomodulating and antioxidant action that give an anabolic and stimulating effect: "leveton", "phytoton", and "adapton". The MKTs-229 tablets are a residue-discharge agent. For atherosclerosis and general adiposis, the "tsar tablets" and "aiconol (ikhtien)", based on cod-liver oil, or "splat", made from seaweed (algae), are recommended. All of these preparations have been clinically tested and have received hygiene certificates from the Institute of Dietology of the Russian Academy of Medical Sciences. PMID:9752776