Science.gov

Sample records for additional computational cost

  1. Quantum ring-polymer contraction method: Including nuclear quantum effects at no additional computational cost in comparison to ab initio molecular dynamics

    NASA Astrophysics Data System (ADS)

    John, Christopher; Spura, Thomas; Habershon, Scott; Kühne, Thomas D.

    2016-04-01

    We present a simple and accurate computational method which facilitates ab initio path-integral molecular dynamics simulations, where the quantum-mechanical nature of the nuclei is explicitly taken into account, at essentially no additional computational cost in comparison to the corresponding calculation using classical nuclei. The predictive power of the proposed quantum ring-polymer contraction method is demonstrated by computing various static and dynamic properties of liquid water at ambient conditions using density functional theory. This development will enable routine inclusion of nuclear quantum effects in ab initio molecular dynamics simulations of condensed-phase systems.

  2. Calculators and Computers: Graphical Addition.

    ERIC Educational Resources Information Center

    Spero, Samuel W.

    1978-01-01

    A computer program is presented that generates problem sets involving sketching graphs of trigonometric functions using graphical addition. The students use calculators to sketch the graphs, and a computer solution is used to check them. (MP)

  3. Energetic costs of cellular computation.

    PubMed

    Mehta, Pankaj; Schwab, David J

    2012-10-30

    Cells often perform computations in order to respond to environmental cues. A simple example is the classic problem, first considered by Berg and Purcell, of determining the concentration of a chemical ligand in the surrounding media. On general theoretical grounds, it is expected that such computations require cells to consume energy. In particular, Landauer's principle states that energy must be consumed in order to erase the memory of past observations. Here, we explicitly calculate the energetic cost of steady-state computation of ligand concentration for a simple two-component cellular network that implements a noisy version of the Berg-Purcell strategy. We show that learning about external concentrations necessitates the breaking of detailed balance and consumption of energy, with greater learning requiring more energy. Our calculations suggest that the energetic costs of cellular computation may be an important constraint on networks designed to function in resource poor environments, such as the spore germination networks of bacteria.
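Landauer's principle, invoked in the abstract above, sets a concrete floor on this energy cost: erasing one bit at temperature T dissipates at least kT ln 2. A quick back-of-the-envelope calculation (a sketch using the standard constants; the helper function is ours, not the paper's):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def landauer_bound_joules(bits, temperature_kelvin=300.0):
    """Minimum energy dissipated when erasing `bits` bits at the given temperature."""
    return bits * K_B * temperature_kelvin * math.log(2)

# Erasing a single bit at roughly room temperature costs on the order of 3e-21 J.
print(landauer_bound_joules(1))
```

Tiny as that number is, the paper's point is that it is strictly positive: any cellular network that learns (and forgets) external concentrations must pay it.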

  4. Energetic costs of cellular computation

    PubMed Central

    Mehta, Pankaj; Schwab, David J.

    2012-01-01

    Cells often perform computations in order to respond to environmental cues. A simple example is the classic problem, first considered by Berg and Purcell, of determining the concentration of a chemical ligand in the surrounding media. On general theoretical grounds, it is expected that such computations require cells to consume energy. In particular, Landauer’s principle states that energy must be consumed in order to erase the memory of past observations. Here, we explicitly calculate the energetic cost of steady-state computation of ligand concentration for a simple two-component cellular network that implements a noisy version of the Berg–Purcell strategy. We show that learning about external concentrations necessitates the breaking of detailed balance and consumption of energy, with greater learning requiring more energy. Our calculations suggest that the energetic costs of cellular computation may be an important constraint on networks designed to function in resource poor environments, such as the spore germination networks of bacteria. PMID:23045633

  5. Computational Process Modeling for Additive Manufacturing (OSU)

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part has been built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model that would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  6. Computational Process Modeling for Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2014-01-01

    Computational Process and Material Modeling of Powder Bed additive manufacturing of IN 718. Optimize material build parameters with reduced time and cost through modeling. Increase understanding of build properties. Increase reliability of builds. Decrease time to adoption of process for critical hardware. Potential to decrease post-build heat treatments. Conduct single-track and coupon builds at various build parameters. Record build parameter information and QM Meltpool data. Refine Applied Optimization powder bed AM process model using data. Report thermal modeling results. Conduct metallography of build samples. Calibrate STK models using metallography findings. Run STK models using AO thermal profiles and report STK modeling results. Validate modeling with additional build. Photodiode Intensity measurements highly linear with power input. Melt Pool Intensity highly correlated to Melt Pool Size. Melt Pool size and intensity increase with power. Applied Optimization will use data to develop powder bed additive manufacturing process model.

  7. 48 CFR 3452.216-70 - Additional cost principles.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... scientific, cost, and other data needed to support the bids, proposals, and applications. Bid and proposal... as prescribed in 3416.307(b): Additional Cost Principles (MAR 2011) (a) Bid and Proposal Costs. Bid and proposal costs are the immediate costs of preparing bids, proposals, and applications...

  8. 48 CFR 352.216-70 - Additional cost principles.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... clause: Additional Cost Principles (January 2006) (a) Bid and proposal (B & P) costs. (1) B & P costs are the immediate costs of preparing bids, proposals, and applications for potential Federal and non-Federal contracts, grants, and agreements, including the development of scientific, cost, and other...

  9. 48 CFR 3452.216-70 - Additional cost principles.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... scientific, cost, and other data needed to support the bids, proposals, and applications. Bid and proposal... as prescribed in 3416.307(b): Additional Cost Principles (MAR 2011) (a) Bid and Proposal Costs. Bid and proposal costs are the immediate costs of preparing bids, proposals, and applications...

  10. Computer Maintenance Operations Center (CMOC), additional computer support equipment ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Computer Maintenance Operations Center (CMOC), additional computer support equipment - Beale Air Force Base, Perimeter Acquisition Vehicle Entry Phased-Array Warning System, Technical Equipment Building, End of Spencer Paul Road, north of Warren Shingle Road (14th Street), Marysville, Yuba County, CA

  11. Low Cost Computer Graphics in Engineering Education.

    ERIC Educational Resources Information Center

    Phillips, Richard L.

    1981-01-01

    An evaluation of a personal computer was conducted to determine its potential for applications normally assumed to require higher resolution, higher cost graphics devices. Its resolution, stand-alone computing power, and capability to function as a distributed computing element in a time-sharing system are discussed. Six references are listed.…

  12. Computers, Costs, and Civil Liberties

    ERIC Educational Resources Information Center

    Kelley, Verne R.; Weston, Hanna B.

    1975-01-01

    The need to control costs has led many state mental health systems to set up automated data banks in which a patient's name is directly linked with his record. Many consider this a primary threat to confidentiality and civil liberties. (Author)

  13. Computer/PERT technique monitors actual versus allocated costs

    NASA Technical Reports Server (NTRS)

    Houry, E.; Walker, J. D.

    1967-01-01

    A computer method measures the user's performance in cost-type contracts utilizing the existing NASA Program Evaluation Review Technique (PERT) without imposing any additional reporting requirements. Progress is measured by comparing actual costs with the value of work performed in a specific period.

  14. Cut Costs with Thin Client Computing.

    ERIC Educational Resources Information Center

    Hartley, Patrick H.

    2001-01-01

    Discusses how school districts can considerably increase the number of administrative computers in their districts without a corresponding increase in costs by using the "Thin Client" component of the Total Cost of Ownership (TCO) model. TCO and Thin Client are described, including its software and hardware components. An example of a…

  15. Pension Costs on DOD Contracts: Additional Guidance Needed to Ensure Costs Are Consistent and Reasonable

    DTIC Science & Technology

    2013-01-01

    support from a team of DOD actuaries. DOD audits projected and actual costs for contracts, including pension costs, to ensure they are allowable... qualified and credentialed actuaries) and collected contractor data on incurred CAS pension costs from 2002 to 2011. To understand how pension costs... Actuary of the GAO for actuarial soundness. We also gathered contractor projections of CAS pension costs for 2012 to 2016. See appendix I for additional

  16. Computer-Controlled HVAC -- at Low Cost

    ERIC Educational Resources Information Center

    American School and University, 1974

    1974-01-01

    By tying into a computerized building-automation network, Schaumburg High School, Illinois, slashed its energy consumption by one-third. The remotely connected computer controls the mechanical system for the high school as well as other buildings in the community, with the cost being shared by all. (Author)

  17. The Hidden Costs of Wireless Computer Labs

    ERIC Educational Resources Information Center

    Daly, Una

    2005-01-01

    Various elementary schools and middle schools across the U.S. have purchased one or more mobile laboratories. Although the wireless labs have provided more classroom computing, teachers and technology aides still have mixed views about their cost-benefit ratio. This is because the proliferation of viruses and spyware has dramatically increased…

  18. Computed tomography characterisation of additive manufacturing materials.

    PubMed

    Bibb, Richard; Thompson, Darren; Winder, John

    2011-06-01

    Additive manufacturing, covering processes frequently referred to as rapid prototyping and rapid manufacturing, provides new opportunities in the manufacture of highly complex and custom-fitting medical devices and products. Whilst many medical applications of AM have been explored and physical properties of the resulting parts have been studied, the characterisation of AM materials in computed tomography has not been explored. The aim of this study was to determine the CT number of commonly used AM materials. There are many potential applications of the information resulting from this study in the design and manufacture of wearable medical devices, implants, prostheses and medical imaging test phantoms. A selection of 19 AM material samples were CT scanned and the resultant images analysed to ascertain the materials' CT number and appearance in the images. It was found that some AM materials have CT numbers very similar to human tissues; that FDM, SLA and SLS produce samples that appear uniform on CT images; and that 3D-printed materials show a variation in internal structure.

  19. 48 CFR 246.470-1 - Assessment of additional costs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Assessment of additional costs. 246.470-1 Section 246.470-1 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CONTRACT MANAGEMENT QUALITY ASSURANCE Government Contract...

  20. Additive Manufacturing of Low Cost Upper Stage Propulsion Components

    NASA Technical Reports Server (NTRS)

    Protz, Christopher; Bowman, Randy; Cooper, Ken; Fikes, John; Taminger, Karen; Wright, Belinda

    2014-01-01

    NASA is currently developing Additive Manufacturing (AM) technologies and design tools aimed at reducing the costs and manufacturing time of regeneratively cooled rocket engine components. These Low Cost Upper Stage Propulsion (LCUSP) tasks are funded through NASA's Game Changing Development Program in the Space Technology Mission Directorate. The LCUSP project will develop a copper alloy additive manufacturing design process and develop and optimize the Electron Beam Freeform Fabrication (EBF3) manufacturing process to direct deposit a nickel alloy structural jacket and manifolds onto an SLM-manufactured GRCop chamber and Ni-alloy nozzle. In order to develop these processes, the project will characterize both the microstructural and mechanical properties of the SLM-produced GRCop-84, and will explore and document novel design techniques specific to AM combustion device components. These manufacturing technologies will be used to build a 25K-class regenerative chamber and nozzle (to be used with tested DMLS injectors) that will be tested individually and as a system in hot fire tests to demonstrate the applicability of the technologies. These tasks are expected to bring costs and manufacturing time down, as spacecraft propulsion systems typically comprise more than 70% of the total vehicle cost and account for a significant portion of the development schedule. Additionally, high pressure/high temperature combustion chambers and nozzles must be regeneratively cooled to survive their operating environment, causing their design to be time consuming and costly to build. LCUSP presents an opportunity to develop and demonstrate a process that can infuse these technologies into industry, build competition, and drive down costs of future engines.

  1. Cost Estimation of Laser Additive Manufacturing of Stainless Steel

    NASA Astrophysics Data System (ADS)

    Piili, Heidi; Happonen, Ari; Väistö, Tapio; Venkataramanan, Vijaikrishnan; Partanen, Jouni; Salminen, Antti

    Laser additive manufacturing (LAM) is a layer-wise fabrication method in which a laser beam melts metallic powder to form solid objects. Although 3D printing was invented 30 years ago, its industrial use remains quite limited, whereas the introduction of cheap consumer 3D printers in recent years has popularized 3D printing. Interest is focusing more and more on the manufacturing of functional parts. The aim of this study is to define and discuss the current economic opportunities and restrictions of the LAM process. Manufacturing costs were studied for different build scenarios, each with a cost structure estimated from the calculated build time and the costs of the machine, material and energy at optimized machine utilization. All manufacturing and time simulations in this study were carried out with a research machine equivalent to commercial EOS M series equipment. The study shows that the main expense in LAM is the investment cost of the LAM machine, compared to which the relative proportions of the energy and material costs are very low. The manufacturing time per part is the key factor in optimizing the costs of LAM.
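The cost structure this study describes can be sketched as a simple per-part model. All figures below (machine price, lifetime, utilization, material and energy rates) are illustrative assumptions, not values from the study:

```python
def lam_part_cost(build_time_h,
                  machine_price=500_000.0,        # assumed machine investment, EUR
                  machine_life_h=5 * 8760 * 0.5,  # assumed 5 years at 50% utilization
                  material_kg=0.5, material_eur_per_kg=80.0,
                  power_kw=3.0, energy_eur_per_kwh=0.10):
    """Per-part LAM cost split into machine amortization, material and energy."""
    machine = build_time_h * machine_price / machine_life_h
    material = material_kg * material_eur_per_kg
    energy = build_time_h * power_kw * energy_eur_per_kwh
    return {"machine": round(machine, 2), "material": round(material, 2),
            "energy": round(energy, 2)}

# For a 10-hour build, machine amortization dwarfs material and energy,
# consistent with the study's conclusion that machine investment dominates.
print(lam_part_cost(10.0))
```

Because the machine term scales with build time while the others are small, shortening the build time per part is the main cost lever, which matches the abstract's conclusion.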

  2. Additively Manufactured Low Cost Upper Stage Combustion Chamber

    NASA Technical Reports Server (NTRS)

    Protz, Christopher; Cooper, Ken; Ellis, David; Fikes, John; Jones, Zachary; Kim, Tony; Medina, Cory; Taminger, Karen; Willingham, Derek

    2016-01-01

    Over the past two years NASA's Low Cost Upper Stage Propulsion (LCUSP) project has developed Additive Manufacturing (AM) technologies and design tools aimed at reducing the costs and manufacturing time of regeneratively cooled rocket engine components. High pressure/high temperature combustion chambers and nozzles must be regeneratively cooled to survive their operating environment, causing their design and fabrication to be costly and time consuming due to the number of individual steps and different processes required. Under LCUSP, AM technologies in Selective Laser Melting (SLM) GRCop-84 and Electron Beam Freeform Fabrication (EBF3) Inconel 625 have been significantly advanced, allowing the team to successfully fabricate a 25k-class regenerative chamber. Estimates of the costs and schedule of future builds indicate cost reductions and significant schedule reductions will be enabled by this technology. Characterization of the microstructural and mechanical properties of the SLM-produced GRCop-84, EBF3 Inconel 625 and the interface layer between the two has been performed and indicates the properties will meet the design requirements. The LCUSP chamber is to be tested with a previously demonstrated SLM injector in order to advance the Technology Readiness Level (TRL) and demonstrate the capability of the application of these processes. NASA is advancing these technologies to reduce cost and schedule for future engine applications and commercial needs.

  3. Cost Considerations in Nonlinear Finite-Element Computing

    NASA Technical Reports Server (NTRS)

    Utku, S.; Melosh, R. J.; Islam, M.; Salama, M.

    1985-01-01

    This conference paper discusses computational requirements for finite-element analysis using a quasi-linear approach to nonlinear problems. It evaluates the computational efficiency of different computer architecture types in terms of relative cost and computing time.

  4. Cost-Benefit Analysis of Computer Resources for Machine Learning

    USGS Publications Warehouse

    Champion, Richard A.

    2007-01-01

    Machine learning describes pattern-recognition algorithms - in this case, probabilistic neural networks (PNNs). These can be computationally intensive, in part because of the nonlinear optimizer, a numerical process that calibrates the PNN by minimizing a sum of squared errors. This report suggests efficiencies that are expressed as cost and benefit. The cost is computer time needed to calibrate the PNN, and the benefit is goodness-of-fit, how well the PNN learns the pattern in the data. There may be a point of diminishing returns where a further expenditure of computer resources does not produce additional benefits. Sampling is suggested as a cost-reduction strategy. One consideration is how many points to select for calibration and another is the geometric distribution of the points. The data points may be nonuniformly distributed across space, so that sampling at some locations provides additional benefit while sampling at other locations does not. A stratified sampling strategy can be designed to select more points in regions where they reduce the calibration error and fewer points in regions where they do not. Goodness-of-fit tests ensure that the sampling does not introduce bias. This approach is illustrated by statistical experiments for computing correlations between measures of roadless area and population density for the San Francisco Bay Area. The alternative to training efficiencies is to rely on high-performance computer systems. These may require specialized programming and algorithms that are optimized for parallel performance.
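The diminishing-returns behavior described here can be illustrated with a toy calibration experiment (a stdlib-only sketch; a simple linear model stands in for the PNN, and all data are synthetic assumptions):

```python
import random

def fit_error(n_samples, seed=0):
    """Calibrate a 1-D linear model on n noisy samples of y = 2x + 1 and
    return the RMS error on a fixed held-out grid."""
    rng = random.Random(seed)
    xs = [rng.uniform(0, 10) for _ in range(n_samples)]
    ys = [2 * x + 1 + rng.gauss(0, 1.0) for x in xs]
    mx = sum(xs) / n_samples
    my = sum(ys) / n_samples
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    grid = [i / 10 for i in range(101)]
    sq_errs = [(slope * x + intercept - (2 * x + 1)) ** 2 for x in grid]
    return (sum(sq_errs) / len(sq_errs)) ** 0.5

# Error shrinks roughly like 1/sqrt(n): each added order of magnitude of
# computer time buys a smaller improvement in goodness-of-fit.
for n in (10, 100, 1_000, 10_000):
    print(n, round(fit_error(n), 4))
```

The same logic motivates the report's stratified sampling: spend the calibration budget where additional points still reduce error, not where the curve has already flattened.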

  5. 48 CFR 352.216-70 - Additional cost principles.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... needed to support the bids, proposals, and applications. (2) B & P costs of the current accounting period are allowable as indirect costs. (3) B & P costs of past accounting periods are unallowable in the current period. However, if the organization's established practice is to treat these costs by some...

  6. 48 CFR 3452.216-70 - Additional cost principles.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... ACQUISITION REGULATION CLAUSES AND FORMS SOLICITATION PROVISIONS AND CONTRACT CLAUSES Texts of Provisions and... costs of the current accounting period are allowable as indirect costs; bid and proposal costs of past accounting periods are unallowable as costs of the current period. However, if the organization's...

  7. 48 CFR 352.216-70 - Additional cost principles.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... include independent research and development (IR & D) costs covered by the following paragraph, or pre-award costs covered by paragraph 36 of Attachment B to OMB Circular A-122. (b) IR & D costs. (1) IR & D...-Federal contracts, grants, or other agreements. (2) IR & D shall be allocated its proportionate share...

  8. 48 CFR 352.216-70 - Additional cost principles.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... include independent research and development (IR & D) costs covered by the following paragraph, or pre-award costs covered by paragraph 36 of Attachment B to OMB Circular A-122. (b) IR & D costs. (1) IR & D...-Federal contracts, grants, or other agreements. (2) IR & D shall be allocated its proportionate share...

  9. 48 CFR 352.216-70 - Additional cost principles.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... include independent research and development (IR & D) costs covered by the following paragraph, or pre-award costs covered by paragraph 36 of Attachment B to OMB Circular A-122. (b) IR & D costs. (1) IR & D...-Federal contracts, grants, or other agreements. (2) IR & D shall be allocated its proportionate share...

  10. 38 CFR 36.4404 - Computation of cost.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... cost of adaptations. Under section 2101(b) of Chapter 21, for the purpose of computing the amount of... market value of the adaptations, including installation costs, determined to be reasonably necessary,...

  11. X-ray computed tomography for additive manufacturing: a review

    NASA Astrophysics Data System (ADS)

    Thompson, A.; Maskery, I.; Leach, R. K.

    2016-07-01

    In this review, the use of x-ray computed tomography (XCT) is examined, identifying the requirement for volumetric dimensional measurements in industrial verification of additively manufactured (AM) parts. The XCT technology and AM processes are summarised, and their historical use is documented. The use of XCT and AM as tools for medical reverse engineering is discussed, and the transition of XCT from a tool used solely for imaging to a vital metrological instrument is documented. The current states of the combined technologies are then examined in detail, separated into porosity measurements and general dimensional measurements. In the conclusions of this review, the limitation of resolution on improvement of porosity measurements and the lack of research regarding the measurement of surface texture are identified as the primary barriers to ongoing adoption of XCT in AM. The limitations of both AM and XCT regarding slow speeds and high costs, when compared to other manufacturing and measurement techniques, are also noted as general barriers to continued adoption of XCT and AM.

  12. Computers in Education: Their Use and Cost.

    ERIC Educational Resources Information Center

    1970

    Part one of this document consists of the findings and recommendations of the President's Science Advisory Committee. The report considers the use of computers in undergraduate, secondary, and higher education. It also discusses the needs of the computer science student, interaction between research and educational uses of computers, computer…

  13. Monte Carlo simulation by computer for life-cycle costing

    NASA Technical Reports Server (NTRS)

    Gralow, F. H.; Larson, W. J.

    1969-01-01

    Predicting behavior and support requirements during the entire life cycle of a system enables accurate cost estimates using Monte Carlo simulation by computer. The method reduces the ultimate cost to the procuring agency because it takes into consideration the costs of initial procurement, operation, and maintenance.
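A minimal Monte Carlo life-cycle-cost simulation in the spirit of this record might look like the following; the cost distributions and parameters are invented for illustration, not taken from the report:

```python
import random
import statistics

def simulate_life_cycle_costs(n_trials=20_000, service_years=10, seed=1):
    """Sample total life-cycle cost = procurement + operation + maintenance."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_trials):
        procurement = rng.gauss(1_000_000, 50_000)  # one-time purchase
        operation = sum(rng.gauss(80_000, 10_000)
                        for _ in range(service_years))
        maintenance = sum(rng.expovariate(1 / 20_000)
                          for _ in range(service_years))
        totals.append(procurement + operation + maintenance)
    return statistics.mean(totals), statistics.stdev(totals)

mean_cost, spread = simulate_life_cycle_costs()
print(f"mean life-cycle cost: {mean_cost:,.0f}, std dev: {spread:,.0f}")
```

Running many sampled "lifetimes" yields not just a point estimate but a spread, which is what lets a procuring agency weigh initial procurement against operation and maintenance costs.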

  14. Cutting Technology Costs with Refurbished Computers

    ERIC Educational Resources Information Center

    Dessoff, Alan

    2010-01-01

    Many district administrators are finding that they can save money on computers by buying preowned ones instead of new ones. The practice has other benefits as well: It allows districts to give more computers to more students who need them, and it also promotes good environmental practices by keeping the machines out of landfills, where they…

  15. Cost Computations for Cyber Fighter Associate

    DTIC Science & Technology

    2015-05-01


  16. Estimating boiling water reactor decommissioning costs: A user's manual for the BWR Cost Estimating Computer Program (CECP) software. Final report

    SciTech Connect

    Bierschbach, M.C.

    1996-06-01

    Nuclear power plant licensees are required to submit to the US Nuclear Regulatory Commission (NRC) for review their decommissioning cost estimates. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning boiling water reactor (BWR) power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning.

  17. Estimating boiling water reactor decommissioning costs. A user's manual for the BWR Cost Estimating Computer Program (CECP) software: Draft report for comment

    SciTech Connect

    Bierschbach, M.C.

    1994-12-01

    With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit to the U.S. Nuclear Regulatory Commission (NRC) for review their decommissioning plans and cost estimates. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning BWR power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning.

  18. Computed Tomography Inspection and Analysis for Additive Manufacturing Components

    NASA Technical Reports Server (NTRS)

    Beshears, Ronald D.

    2016-01-01

    Computed tomography (CT) inspection was performed on test articles additively manufactured from metallic materials. Metallic AM and machined wrought alloy test articles with programmed flaws were inspected using a 2MeV linear accelerator based CT system. Performance of CT inspection on identically configured wrought and AM components and programmed flaws was assessed using standard image analysis techniques to determine the impact of additive manufacturing on inspectability of objects with complex geometries.

  19. Preparing Rapid, Accurate Construction Cost Estimates with a Personal Computer.

    ERIC Educational Resources Information Center

    Gerstel, Sanford M.

    1986-01-01

    An inexpensive and rapid method for preparing accurate cost estimates of construction projects in a university setting, using a personal computer, purchased software, and one estimator, is described. The case against defined estimates, the rapid estimating system, and adjusting standard unit costs are discussed. (MLW)

  20. Thermodynamic cost of computation, algorithmic complexity and the information metric

    NASA Technical Reports Server (NTRS)

    Zurek, W. H.

    1989-01-01

    Algorithmic complexity is discussed as a computational counterpart to the second law of thermodynamics. It is shown that algorithmic complexity, which is a measure of randomness, sets limits on the thermodynamic cost of computations and casts a new light on the limitations of Maxwell's demon. Algorithmic complexity can also be used to define distance between binary strings.

  1. Low-Cost Computers for Education in Developing Countries

    ERIC Educational Resources Information Center

    James, Jeffrey

    2011-01-01

    This paper studies the distribution of computer use in a comparison between two of the most dominant suppliers of low-cost computers for education in developing countries (partly because they involve diametrically opposite ways of tackling the problem). The comparison is made in the context of an analytical framework which traces the changing…

  2. Fixed-point image orthorectification algorithms for reduced computational cost

    NASA Astrophysics Data System (ADS)

    French, Joseph Clinton

Imaging systems have been applied to many new applications in recent years. With the advent of low-cost, low-power focal planes and more powerful, lower-cost computers, remote sensing applications have become more widespread. Many of these applications require some form of geolocation, especially when relative distances are desired. However, when greater global positional accuracy is needed, orthorectification becomes necessary. Orthorectification is the process of projecting an image onto a Digital Elevation Map (DEM), which removes terrain distortions and corrects the perspective distortion by changing the viewing angle to be perpendicular to the projection plane. Orthorectification is used in disaster tracking, landscape management, wildlife monitoring and many other applications. However, orthorectification is a computationally expensive process due to the floating-point operations and divisions in the algorithm. To reduce the computational cost of on-board processing, two novel algorithm modifications are proposed. The first is projection using fixed-point arithmetic, which removes the floating-point operations and reduces processing time by operating only on integers. The second is replacement of the division inherent in projection with multiplication by the inverse. Because computing the inverse exactly requires an iterative procedure, the inverse is instead replaced with a linear approximation. As a result of these modifications, the processing time of projection is reduced by a factor of 1.3 with an average pixel position error of 0.2% of a pixel size for 128-bit integer processing, and by over 4x with an average pixel position error of less than 13% of a pixel size for 64-bit integer processing. A secondary inverse-function approximation is also developed that replaces the linear approximation with a quadratic. The quadratic approximation produces a more accurate approximation of the inverse, while still allowing an integer multiplication calculation.
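The division-free projection idea described above can be sketched as follows. This is an illustrative reconstruction, not the author's implementation; the Q16 scale factor and the approximation interval are assumptions.

```python
# Illustrative sketch of replacing division by z with multiplication by an
# approximate reciprocal, evaluated entirely in fixed-point (integer)
# arithmetic. The Q16 scale and the z-interval are assumptions, not values
# taken from the dissertation.

SHIFT = 16
SCALE = 1 << SHIFT          # Q16 fixed-point scale factor

def to_fixed(x: float) -> int:
    return int(round(x * SCALE))

def linear_inverse(z_fx: int, z_min: float, z_max: float) -> int:
    """Approximate SCALE/z for z in [z_min, z_max] by the line a - b*z
    through the interval endpoints, evaluated in integer arithmetic."""
    b = (1.0 / z_min - 1.0 / z_max) / (z_max - z_min)
    a = 1.0 / z_min + b * z_min
    a_fx, b_fx = to_fixed(a), to_fixed(b)
    # inv ~ a - b*z in fixed point; the product is rescaled by >> SHIFT.
    return a_fx - ((b_fx * z_fx) >> SHIFT)

def project(x_fx: int, z_fx: int, z_min: float, z_max: float) -> int:
    """Compute x/z without a division: multiply by the approximate inverse."""
    return (x_fx * linear_inverse(z_fx, z_min, z_max)) >> SHIFT

# Example: 3.0 / 2.0 with z restricted to [1.5, 2.5]; the linear
# approximation is exact at the endpoints and worst near the middle.
approx = project(to_fixed(3.0), to_fixed(2.0), 1.5, 2.5) / SCALE
```

At the interval endpoints the line reproduces 1/z exactly, so the error budget quoted in the abstract is governed by the interval width chosen for the approximation.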

  3. Costs incurred by applying computer-aided design/computer-aided manufacturing techniques for the reconstruction of maxillofacial defects.

    PubMed

    Rustemeyer, Jan; Melenberg, Alex; Sari-Rieger, Aynur

    2014-12-01

    This study aims to evaluate the additional costs incurred by using a computer-aided design/computer-aided manufacturing (CAD/CAM) technique for reconstructing maxillofacial defects by analyzing typical cases. The medical charts of 11 consecutive patients who were subjected to the CAD/CAM technique were considered, and invoices from the companies providing the CAD/CAM devices were reviewed for every case. The number of devices used was significantly correlated with cost (r = 0.880; p < 0.001). Significant differences in mean costs were found between cases in which prebent reconstruction plates were used (€3346.00 ± €29.00) and cases in which they were not (€2534.22 ± €264.48; p < 0.001). Significant differences were also obtained between the costs of two, three and four devices, even when ignoring the cost of reconstruction plates. Additional fees provided by statutory health insurance covered a mean of 171.5% ± 25.6% of the cost of the CAD/CAM devices. Since the additional fees provide financial compensation, we believe that the CAD/CAM technique is suited for wide application and not restricted to complex cases. Where additional fees/funds are not available, the CAD/CAM technique might be unprofitable, so the decision whether or not to use it remains a case-to-case decision with respect to cost versus benefit.

  4. A Low-Cost, Portable, Parallel Computing Cluster

    NASA Astrophysics Data System (ADS)

    Bullock, Daniel; Poppeliers, Christian; Allen, Charles

    2006-10-01

Research in modern physical sciences has placed an increasing demand on computers for complex algorithms that push the limits of consumer personal computers. Parallel supercomputers are often required for large-scale algorithms; however, the cost of these systems can be prohibitive. The purpose of this project is to construct a low-cost, portable, parallel computer system as an alternative to large-scale supercomputers, using Commercial Off The Shelf (COTS) components. These components can be networked together to allow processors to communicate with one another for faster computations. The overall design of this system is based on the development of "Little Fe" at Contra Costa College in San Pablo, California. Revisions to this design include improved design components, smaller physical size, easier transportation, less wiring, and a single AC power supply.

  5. 47 CFR 25.111 - Additional information and ITU cost recovery.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 2 2014-10-01 2014-10-01 false Additional information and ITU cost recovery....111 Additional information and ITU cost recovery. (a) The Commission may request from any party at any... interference caused by radio stations authorized by other Administrations is guaranteed unless ITU...

  6. Computer-generated fiscal reports for food cost accounting.

    PubMed

    Fromm, B; Moore, A N; Hoover, L W

    1980-08-01

To optimize resource utilization for the provision of health-care services, well-designed food cost accounting systems should facilitate effective decision-making. Fiscal reports reflecting the financial status of an organization at a given time must be current and representative so that managers have adequate data for planning and controlling. The computer-assisted food cost accounting discussed in this article can be integrated with other sub-systems and operations management techniques to provide the information needed to make decisions regarding revenues and expenses. Management information systems must be routinely evaluated and updated to meet the current needs of administrators. Further improvements in the food cost accounting system will be desirable whenever substantial changes occur within the foodservice operation at the University of Missouri-Columbia Medical Center or when advancements in computer technology provide more efficient methods for manipulating data and generating reports. Development of new systems and better applications of present systems could contribute significantly to the efficiency of operations in both health care and commercial foodservices. The computer-assisted food cost accounting system reported here might serve as a prototype for other management cost information systems.

  7. Low cost spacecraft computers: Oxymoron or future trend?

    NASA Technical Reports Server (NTRS)

    Manning, Robert M.

    1993-01-01

Over the last few decades, application of current terrestrial computer technology in embedded spacecraft control systems has been expensive and fraught with many technical challenges. These challenges have centered on overcoming the extreme environmental constraints (protons, neutrons, gamma radiation, cosmic rays, temperature, vibration, etc.) that often preclude direct use of commercial off-the-shelf computer technology. Reliability, fault tolerance and power have also greatly constrained the selection of spacecraft control system computers. More recently, new constraints are being felt, cost and mass in particular, that have again narrowed the degrees of freedom spacecraft designers once enjoyed. This paper discusses these challenges, how they were previously overcome, how future trends in commercial computer technology will simplify (or hinder) selection of computer technology for spacecraft control applications, and what spacecraft electronic system designers can do now to circumvent them.

  8. CHARMM additive and polarizable force fields for biophysics and computer-aided drug design

    PubMed Central

    Vanommeslaeghe, K.

    2014-01-01

Background Molecular Mechanics (MM) is the method of choice for computational studies of biomolecular systems owing to its modest computational cost, which makes it possible to routinely perform molecular dynamics (MD) simulations on chemical systems of biophysical and biomedical relevance. Scope of Review As one of the main factors limiting the accuracy of MD results is the empirical force field used, the present paper offers a review of recent developments in the CHARMM additive force field, one of the most popular biomolecular force fields. Additionally, we present a detailed discussion of the CHARMM Drude polarizable force field, anticipating a growth in the importance and utilization of polarizable force fields in the near future. Throughout the discussion emphasis is placed on the force fields’ parametrization philosophy and methodology. Major Conclusions Recent improvements in the CHARMM additive force field are mostly related to newly found weaknesses in the previous generation of additive force fields. Beyond the additive approximation is the newly available CHARMM Drude polarizable force field, which allows for MD simulations of up to 1 microsecond on proteins, DNA, lipids and carbohydrates. General Significance Addressing the limitations ensures the reliability of the new CHARMM36 additive force field for the types of calculations that are presently coming into routine computational reach, while the availability of the Drude polarizable force field offers an inherently more accurate model of the underlying physical forces driving macromolecular structures and dynamics. PMID:25149274

  9. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 11 2010-01-01 2010-01-01 false Engineering and cost studies-addition of generation... TO ELECTRIC LOANS AND GUARANTEES Construction Work Plans and Related Studies § 1710.253 Engineering... engineering and cost studies as specified by RUS. The studies shall cover a period from the beginning of...

  10. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 11 2011-01-01 2011-01-01 false Engineering and cost studies-addition of generation... TO ELECTRIC LOANS AND GUARANTEES Construction Work Plans and Related Studies § 1710.253 Engineering... engineering and cost studies as specified by RUS. The studies shall cover a period from the beginning of...

  11. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 11 2013-01-01 2013-01-01 false Engineering and cost studies-addition of generation... TO ELECTRIC LOANS AND GUARANTEES Construction Work Plans and Related Studies § 1710.253 Engineering... engineering and cost studies as specified by RUS. The studies shall cover a period from the beginning of...

  12. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 11 2014-01-01 2014-01-01 false Engineering and cost studies-addition of generation... TO ELECTRIC LOANS AND GUARANTEES Construction Work Plans and Related Studies § 1710.253 Engineering... engineering and cost studies as specified by RUS. The studies shall cover a period from the beginning of...

  13. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 11 2012-01-01 2012-01-01 false Engineering and cost studies-addition of generation... TO ELECTRIC LOANS AND GUARANTEES Construction Work Plans and Related Studies § 1710.253 Engineering... engineering and cost studies as specified by RUS. The studies shall cover a period from the beginning of...

  14. Computer-Aided Final Design Cost Estimating System Overview.

    DTIC Science & Technology

    1977-05-01

Computer-Aided Final Design Cost Estimating System Overview. U.S. Army Construction Engineering Research Laboratory (CERL), P.O. Box 4005, Champaign, IL 61820. The work was performed for the Construction Division (FA); the Principal Investigator was Mr. Michael …

  15. Prospects for cost reductions from relaxing additional cross-border measures related to livestock trade.

    PubMed

    Hop, G E; Mourits, M C M; Slager, R; Oude Lansink, A G J M; Saatkamp, H W

    2013-05-01

Compared with the domestic trade in livestock, intra-communal trade across the European Union (EU) is subject to costly, additional veterinary measures. Short-distance transportation just across a border requires more measures than long-distance domestic transportation, while the need for such additional cross-border measures can be questioned. This study examined the prospects for cost reductions from relaxing additional cross-border measures related to trade within the cross-border region of the Netherlands (NL) and Germany (GER); that is, North Rhine-Westphalia and Lower Saxony. The study constructed a deterministic spreadsheet cost model to calculate the costs of both routine veterinary measures (standard measures that apply to both domestic and cross-border transport) and additional cross-border measures (extra measures that apply only to cross-border transport) as applied in 2010. This model determined costs by stakeholder, region and livestock sector, and studied the prospects for cost reduction by calculating the costs after the relaxation of additional cross-border measures. The selection criteria for relaxing these measures were (1) a low expected added value in preventing contagious livestock diseases, (2) no expected additional veterinary risks in case of relaxation of measures and (3) reasonable cost-saving possibilities. The total cost of routine veterinary measures and additional cross-border measures for the cross-border region was €22.1 million, 58% (€12.7 million) of which came from additional cross-border measures. Two-thirds of this €12.7 million resulted from the trade in slaughter animals. The main cost items were veterinary checks on animals (twice in the case of slaughter animals), export certification and control of export documentation. Four additional cross-border measures met the selection criteria for relaxation. The relaxation of these measures could save €8.2 million (€5.0 million for NL and €3.2 million for GER) annually.

  16. Additional support for the TDK/MABL computer program

    NASA Technical Reports Server (NTRS)

    Nickerson, G. R.; Dunn, Stuart S.

    1993-01-01

    An advanced version of the Two-Dimensional Kinetics (TDK) computer program was developed under contract and released to the propulsion community in early 1989. Exposure of the code to this community indicated a need for improvements in certain areas. In particular, the TDK code needed to be adapted to the special requirements imposed by the Space Transportation Main Engine (STME) development program. This engine utilizes injection of the gas generator exhaust into the primary nozzle by means of a set of slots. The subsequent mixing of this secondary stream with the primary stream with finite rate chemical reaction can have a major impact on the engine performance and the thermal protection of the nozzle wall. In attempting to calculate this reacting boundary layer problem, the Mass Addition Boundary Layer (MABL) module of TDK was found to be deficient in several respects. For example, when finite rate chemistry was used to determine gas properties, (MABL-K option) the program run times became excessive because extremely small step sizes were required to maintain numerical stability. A robust solution algorithm was required so that the MABL-K option could be viable as a rocket propulsion industry design tool. Solving this problem was a primary goal of the phase 1 work effort.

  17. Low-Cost Computers for Education in Developing Countries.

    PubMed

    James, Jeffrey

    2011-09-01

This paper studies the distribution of computer use by comparing two of the most dominant suppliers of low-cost computers for education in developing countries (chosen partly because they take diametrically opposite approaches to the problem). The comparison is made in the context of an analytical framework that traces the changing characteristics of products as income rises over time. The crucial distinction turns out to be the way sharing is handled in the two cases: in one, no sharing is allowed, while in the other, sharing is the basis of the entire product design. Put somewhat differently, one computer is intensive in a high-income characteristic, whereas the other relies entirely on a low-income characteristic.

  18. 42 CFR 413.355 - Additional payment: QIO photocopy and mailing costs.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... RENAL DISEASE SERVICES; OPTIONAL PROSPECTIVELY DETERMINED PAYMENT RATES FOR SKILLED NURSING FACILITIES Prospective Payment for Skilled Nursing Facilities § 413.355 Additional payment: QIO photocopy and mailing costs. An additional payment is made to a skilled nursing facility in accordance with § 476.78 of...

  19. 42 CFR 413.355 - Additional payment: QIO photocopy and mailing costs.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... RENAL DISEASE SERVICES; OPTIONAL PROSPECTIVELY DETERMINED PAYMENT RATES FOR SKILLED NURSING FACILITIES Prospective Payment for Skilled Nursing Facilities § 413.355 Additional payment: QIO photocopy and mailing costs. An additional payment is made to a skilled nursing facility in accordance with § 476.78 of...

  20. 42 CFR 413.355 - Additional payment: QIO photocopy and mailing costs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... RENAL DISEASE SERVICES; OPTIONAL PROSPECTIVELY DETERMINED PAYMENT RATES FOR SKILLED NURSING FACILITIES Prospective Payment for Skilled Nursing Facilities § 413.355 Additional payment: QIO photocopy and mailing costs. An additional payment is made to a skilled nursing facility in accordance with § 476.78 of...

  1. Addressing the computational cost of large EIT solutions.

    PubMed

    Boyle, Alistair; Borsic, Andrea; Adler, Andy

    2012-05-01

Electrical impedance tomography (EIT) is a soft field tomography modality based on the application of electric current to a body and measurement of voltages through electrodes at the boundary. The interior conductivity is reconstructed on a discrete representation of the domain using a finite-element method (FEM) mesh and a parametrization of that domain. The reconstruction requires a sequence of numerically intensive calculations. There is strong interest in reducing the cost of these calculations. An improvement in the compute time for current problems would encourage further exploration of computationally challenging problems such as the incorporation of time series data, widespread adoption of three-dimensional simulations and correlation of other modalities such as CT and ultrasound. Multicore processors offer an opportunity to reduce EIT computation times but may require some restructuring of the underlying algorithms to maximize the use of available resources. This work profiles two EIT software packages (EIDORS and NDRM) to experimentally determine where the computational costs arise in EIT as problems scale. Sparse matrix solvers, a key component for the FEM forward problem and sensitivity estimates in the inverse problem, are shown to take a considerable portion of the total compute time in these packages. A sparse matrix solver performance measurement tool, Meagre-Crowd, is developed to interface with a variety of solvers and compare their performance over a range of two- and three-dimensional problems of increasing node density. Results show that distributed sparse matrix solvers that operate on multiple cores are advantageous up to a limit that increases as the node density increases. We recommend a selection procedure to find a solver and hardware arrangement matched to the problem and provide guidance and tools to perform that selection.
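The FEM forward solve that dominates these profiles is, at its core, a sparse linear system solve. As a minimal, self-contained illustration (not the EIDORS/NDRM code, which uses far more sophisticated direct and distributed solvers), a symmetric positive-definite FEM-style matrix in CSR storage can be solved with conjugate gradients:

```python
# Minimal sketch of the kind of sparse solve profiled above: a symmetric
# positive-definite system stored in CSR format, solved with conjugate
# gradients. Illustrative only; real EIT meshes are vastly larger and the
# packages discussed use specialized direct/distributed solvers.

def csr_matvec(data, indices, indptr, x):
    """y = A @ x for a CSR matrix (data, indices, indptr)."""
    y = [0.0] * (len(indptr) - 1)
    for i in range(len(y)):
        for k in range(indptr[i], indptr[i + 1]):
            y[i] += data[k] * x[indices[k]]
    return y

def conjugate_gradient(data, indices, indptr, b, tol=1e-10, max_iter=1000):
    n = len(b)
    x = [0.0] * n
    r = b[:]                      # residual r = b - A x, with x = 0
    p = r[:]
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = csr_matvec(data, indices, indptr, p)
        alpha = rs / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

# A 1-D Laplacian (tridiagonal, SPD) as a stand-in for an FEM stiffness matrix:
n = 5
data, indices, indptr = [], [], [0]
for i in range(n):
    for j, v in ((i - 1, -1.0), (i, 2.0), (i + 1, -1.0)):
        if 0 <= j < n:
            data.append(v)
            indices.append(j)
    indptr.append(len(data))

x = conjugate_gradient(data, indices, indptr, [1.0] * n)
```

The CSR layout is what makes each matrix-vector product scale with the number of nonzeros rather than n², which is exactly why node density drives the solver trade-offs reported in the abstract.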

  2. A Web-Based Computer-Tailored Alcohol Prevention Program for Adolescents: Cost-Effectiveness and Intersectoral Costs and Benefits

    PubMed Central

    2016-01-01

    Background Preventing excessive alcohol use among adolescents is important not only to foster individual and public health, but also to reduce alcohol-related costs inside and outside the health care sector. Computer tailoring can be both effective and cost-effective for working with many lifestyle behaviors, yet the available information on the cost-effectiveness of computer tailoring for reducing alcohol use by adolescents is limited as is information on the costs and benefits pertaining to sectors outside the health care sector, also known as intersectoral costs and benefits (ICBs). Objective The aim was to assess the cost-effectiveness of a Web-based computer-tailored intervention for reducing alcohol use and binge drinking by adolescents from a health care perspective (excluding ICBs) and from a societal perspective (including ICBs). Methods Data used were from the Alcoholic Alert study, a cluster randomized controlled trial with randomization at the level of schools into two conditions. Participants either played a game with tailored feedback on alcohol awareness after the baseline assessment (intervention condition) or received care as usual (CAU), meaning that they had the opportunity to play the game subsequent to the final measurement (waiting list control condition). Data were recorded at baseline (T0=January/February 2014) and after 4 months (T1=May/June 2014) and were used to calculate incremental cost-effectiveness ratios (ICERs), both from a health care perspective and a societal perspective. Stochastic uncertainty in the data was dealt with by using nonparametric bootstraps (5000 simulated replications). Additional sensitivity analyses were conducted based on excluding cost outliers. Subgroup cost-effectiveness analyses were conducted based on several background variables, including gender, age, educational level, religion, and ethnicity. Results From both the health care perspective and the societal perspective for both outcome measures, the

  3. Dissociating compatibility effects and distractor costs in the additional singleton paradigm.

    PubMed

    Folk, Charles L

    2013-01-01

    The interpretation of identity compatibility effects associated with irrelevant items outside the nominal focus of attention has fueled much of the debate over early versus late selection and perceptual load theory. However, compatibility effects have also played a role in the debate over the extent to which the involuntary allocation of spatial attention (i.e., attentional capture) is completely stimulus-driven or whether it is contingent on top-down control settings. For example, in the context of the additional singleton paradigm, irrelevant color singletons have been found to produce not only an overall cost in search performance but also significant compatibility effects. This combination of search costs and compatibility effects has been taken as evidence that spatial attention is indeed allocated in a bottom-up fashion to the salient but irrelevant singletons. However, it is possible that compatibility effects in the additional singleton paradigm reflect parallel processing of identity associated with low perceptual load rather than an involuntary shift of spatial attention. In the present experiments, manipulations of load were incorporated into the traditional additional singleton paradigm. Under low-load conditions, both search costs and compatibility effects were obtained, replicating previous studies. Under high-load conditions, search costs were still present, but compatibility effects were eliminated. This dissociation suggests that the costs associated with irrelevant singletons may reflect filtering processes rather than the allocation of spatial attention.

  4. Additional extensions to the NASCAP computer code, volume 3

    NASA Technical Reports Server (NTRS)

    Mandell, M. J.; Cooke, D. L.

    1981-01-01

    The ION computer code is designed to calculate charge exchange ion densities, electric potentials, plasma temperatures, and current densities external to a neutralized ion engine in R-Z geometry. The present version assumes the beam ion current and density to be known and specified, and the neutralizing electrons to originate from a hot-wire ring surrounding the beam orifice. The plasma is treated as being resistive, with an electron relaxation time comparable to the plasma frequency. Together with the thermal and electrical boundary conditions described below and other straightforward engine parameters, these assumptions suffice to determine the required quantities. The ION code, written in ASCII FORTRAN for UNIVAC 1100 series computers, is designed to be run interactively, although it can also be run in batch mode. The input is free-format, and the output is mainly graphical, using the machine-independent graphics developed for the NASCAP code. The executive routine calls the code's major subroutines in user-specified order, and the code allows great latitude for restart and parameter change.

  5. Resource Costs for Fault-Tolerant Linear Optical Quantum Computing

    NASA Astrophysics Data System (ADS)

    Li, Ying; Humphreys, Peter C.; Mendoza, Gabriel J.; Benjamin, Simon C.

    2015-10-01

Linear optical quantum computing (LOQC) seems attractively simple: Information is borne entirely by light and processed by components such as beam splitters, phase shifters, and detectors. However, this very simplicity leads to limitations, such as the lack of deterministic entangling operations, which are compensated for by using substantial hardware overheads. Here, we quantify the resource costs for full-scale LOQC by proposing a specific protocol based on the surface code. With the caveat that our protocol can be further optimized, we report that the required number of physical components is at least 5 orders of magnitude greater than in comparable matter-based systems. Moreover, the resource requirements grow further if the per-component photon-loss rate is worse than 10^-3 or the per-component noise rate is worse than 10^-5. We identify the performance of switches in the network as the single most important factor influencing resource scaling.

  6. Pupillary dynamics reveal computational cost in sentence planning.

    PubMed

    Sevilla, Yamila; Maldonado, Mora; Shalóm, Diego E

    2014-01-01

    This study investigated the computational cost associated with grammatical planning in sentence production. We measured people's pupillary responses as they produced spoken descriptions of depicted events. We manipulated the syntactic structure of the target by training subjects to use different types of sentences following a colour cue. The results showed higher increase in pupil size for the production of passive and object dislocated sentences than for active canonical subject-verb-object sentences, indicating that more cognitive effort is associated with more complex noncanonical thematic order. We also manipulated the time at which the cue that triggered structure-building processes was presented. Differential increase in pupil diameter for more complex sentences was shown to rise earlier as the colour cue was presented earlier, suggesting that the observed pupillary changes are due to differential demands in relatively independent structure-building processes during grammatical planning. Task-evoked pupillary responses provide a reliable measure to study the cognitive processes involved in sentence production.

  7. Additional extensions to the NASCAP computer code, volume 1

    NASA Technical Reports Server (NTRS)

    Mandell, M. J.; Katz, I.; Stannard, P. R.

    1981-01-01

Extensions and revisions to a computer code that comprehensively analyzes problems of spacecraft charging (NASCAP) are documented. Using a fully three-dimensional approach, it can accurately predict spacecraft potentials under a variety of conditions. Among the extensions are a multiple electron/ion gun test tank capability, and the ability to model anisotropic and time-dependent space environments. Also documented are a greatly extended MATCHG program and the preliminary version of NASCAP/LEO. The interactive MATCHG code was developed into an extremely powerful tool for the study of material-environment interactions. NASCAP/LEO, a three-dimensional code to study current collection under conditions of high voltages and short Debye lengths, was distributed for preliminary testing.

  8. Computer physician order entry: benefits, costs, and issues.

    PubMed

    Kuperman, Gilad J; Gibson, Richard F

    2003-07-01

    Several analyses have detected substantial quality problems throughout the health care system. Information technology has consistently been identified as an important component of any approach for improvement. Computerized physician order entry (CPOE) is a promising technology that allows physicians to enter orders into a computer instead of handwriting them. Because CPOE fundamentally changes the ordering process, it can substantially decrease the overuse, underuse, and misuse of health care services. Studies have documented that CPOE can decrease costs, shorten length of stay, decrease medical errors, and improve compliance with several types of guidelines. The costs of CPOE are substantial both in terms of technology and organizational process analysis and redesign, system implementation, and user training and support. Computerized physician order entry is a relatively new technology, and there is no consensus on the best approaches to many of the challenges it presents. This technology can yield many significant benefits and is an important platform for future changes to the health care system. Organizational leaders must advocate for CPOE as a critical tool in improving health care quality.

  9. Manual of phosphoric acid fuel cell power plant cost model and computer program

    NASA Technical Reports Server (NTRS)

    Lu, C. Y.; Alkasab, K. A.

    1984-01-01

Cost analysis of a phosphoric acid fuel cell power plant includes two parts: a method for estimating system capital costs, and an economic analysis that determines the levelized annual cost of operating the system used in the capital cost estimation. A FORTRAN computer program has been developed for this cost analysis.
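The levelized annual cost mentioned above is conventionally computed by annualizing the capital cost with a capital recovery factor, CRF = r(1+r)^n / ((1+r)^n - 1). The sketch below is a generic illustration of that calculation, not the manual's FORTRAN code; all rates, lifetimes, and cost figures are hypothetical.

```python
# Generic levelized-annual-cost sketch: spread a capital cost over the plant
# life with a capital recovery factor, then add yearly operating costs.
# All numbers below are hypothetical placeholders, not values from the manual.

def capital_recovery_factor(rate: float, years: int) -> float:
    """CRF = r(1+r)^n / ((1+r)^n - 1)."""
    f = (1 + rate) ** years
    return rate * f / (f - 1)

def levelized_annual_cost(capital: float, rate: float, years: int,
                          annual_om: float, annual_fuel: float) -> float:
    """Annualized capital charge plus yearly O&M and fuel costs."""
    return capital * capital_recovery_factor(rate, years) + annual_om + annual_fuel

# Hypothetical plant: $2M capital, 8% discount rate, 20-year life,
# $50k/yr O&M, $120k/yr fuel.
cost = levelized_annual_cost(capital=2.0e6, rate=0.08, years=20,
                             annual_om=5.0e4, annual_fuel=1.2e5)
```

With these placeholder inputs the annualized capital charge is roughly 10.2% of the capital cost per year, to which the recurring costs are simply added.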

  10. The performance of low-cost commercial cloud computing as an alternative in computational chemistry.

    PubMed

    Thackston, Russell; Fortenberry, Ryan C

    2015-05-05

The growth of commercial cloud computing (CCC) as a viable means of computational infrastructure is largely unexplored for the purposes of quantum chemistry. In this work, the PSI4 suite of computational chemistry programs is installed on five different types of Amazon Web Services CCC platforms. The performance for a set of electronically excited state single-point energies is compared between these CCC platforms and typical, "in-house" physical machines. Further considerations are made for the number of cores or virtual CPUs (vCPUs, for the CCC platforms), but no considerations are made for full parallelization of the program (even though parallelization of the BLAS library is implemented), complete high-performance computing cluster utilization, or steal time. Even with this most pessimistic view of the computations, CCC resources are shown to be more cost effective for significant numbers of typical quantum chemistry computations. Large numbers of large computations are still best utilized by more traditional means, but smaller-scale research may be more effectively undertaken through CCC services.
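The cost-effectiveness comparison described above ultimately reduces to per-job arithmetic: amortized hardware plus electricity versus metered instance hours. The sketch below frames that break-even calculation generically; the rates, machine price, and utilization figures are hypothetical placeholders, not the paper's measured AWS or PSI4 numbers.

```python
# Break-even sketch for cloud vs. in-house computation. All figures are
# hypothetical placeholders, not measurements from the paper.

def cost_in_house(n_jobs: int, hours_per_job: float,
                  machine_price: float, lifetime_hours: float,
                  power_kw: float, electricity_per_kwh: float) -> float:
    """Amortized hardware cost plus electricity for n_jobs."""
    hours = n_jobs * hours_per_job
    return (machine_price * hours / lifetime_hours
            + power_kw * hours * electricity_per_kwh)

def cost_cloud(n_jobs: int, hours_per_job: float, hourly_rate: float) -> float:
    """Metered cost: instance-hours times the on-demand rate."""
    return n_jobs * hours_per_job * hourly_rate

# Hypothetical: a $6000 workstation amortized over ~3 years of continuous
# use, versus a $0.40/hour cloud instance, for 100 two-hour jobs.
jobs = 100
inhouse = cost_in_house(jobs, hours_per_job=2.0, machine_price=6000.0,
                        lifetime_hours=26000.0, power_kw=0.5,
                        electricity_per_kwh=0.12)
cloud = cost_cloud(jobs, hours_per_job=2.0, hourly_rate=0.40)
```

Which side wins depends heavily on utilization: a fully loaded in-house machine amortizes cheaply, while idle time shifts the balance toward pay-per-hour cloud resources, consistent with the abstract's distinction between large sustained workloads and smaller-scale research.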

  11. Additive Manufacturing of Anatomical Models from Computed Tomography Scan Data.

    PubMed

    Gür, Y

    2014-12-01

The purpose of the study presented here was to investigate the manufacturability of human anatomical models from Computed Tomography (CT) scan data via a 3D desktop printer which uses fused deposition modelling (FDM) technology. First, Digital Imaging and Communications in Medicine (DICOM) CT scan data were converted to 3D Standard Triangle Language (STL) format by using the InVesalius digital imaging program. Once this STL file is obtained, a 3D physical version of the anatomical model can be fabricated by a desktop 3D FDM printer. As a case study, a patient's skull CT scan data was considered, and a tangible version of the skull was manufactured by a 3D FDM desktop printer. During the 3D printing process, the skull was built using acrylonitrile-butadiene-styrene (ABS) co-polymer plastic. The printed model showed that the 3D FDM printing technology is able to fabricate anatomical models with high accuracy. As a result, the skull model can be used for preoperative surgical planning, medical training activities, implant design and simulation, showing the potential of FDM technology in the medical field. It will also improve communication between medical staff and patients. These results indicate that a 3D desktop printer which uses FDM technology can be used to obtain accurate anatomical models.

  12. Computational calculation of equilibrium constants: addition to carbonyl compounds.

    PubMed

    Gómez-Bombarelli, Rafael; González-Pérez, Marina; Pérez-Prior, María Teresa; Calle, Emilio; Casado, Julio

    2009-10-22

    Hydration reactions are relevant for understanding many organic mechanisms. Since the experimental determination of hydration and hemiacetalization equilibrium constants is fairly complex, computational calculations now offer a useful alternative to experimental measurements. In this work, carbonyl hydration and hemiacetalization constants were calculated from the free energy differences between compounds in solution, using absolute and relative approaches. The following conclusions can be drawn: (i) The use of a relative approach in the calculation of hydration and hemiacetalization constants allows compensation of systematic errors in the solvation energies. (ii) On average, the methodology proposed here can predict hydration constants within +/- 0.5 log K(hyd) units for aldehydes. (iii) Hydration constants can be calculated for ketones and carboxylic acid derivatives within less than +/- 1.0 log K(hyd), on average, at the CBS-Q level of theory. (iv) The proposed methodology can predict hemiacetal formation constants accurately at the MP2 6-31++G(d,p) level using a common reference. If group references are used, the results obtained using the much cheaper DFT-B3LYP 6-31++G(d,p) level are almost as accurate. (v) In general, the best results are obtained if a common reference for all compounds is used. The use of group references improves the results at the lower levels of theory, but at higher levels, this becomes unnecessary.
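The constants discussed above follow from the standard thermodynamic relation between a free-energy difference in solution and the equilibrium constant, ΔG = -RT ln K. A minimal sketch of that conversion (the ΔG value used is an arbitrary illustration, not a computed hydration free energy from the paper):

```python
# Convert a free-energy difference in solution to log10 of an equilibrium
# constant via dG = -RT ln K. The example dG is an arbitrary illustration,
# not a hydration free energy computed in the paper.

import math

R = 8.314462618e-3   # molar gas constant, kJ/(mol*K)
T = 298.15           # K

def log10_K(delta_G_kJ_mol: float) -> float:
    """log10 of the equilibrium constant from a free-energy difference."""
    ln_K = -delta_G_kJ_mol / (R * T)
    return ln_K / math.log(10)

# A reaction with dG = -5.7 kJ/mol corresponds to log K close to 1,
# which gives a feel for what the quoted +/- 0.5 log K accuracy means
# in free-energy terms (roughly +/- 2.9 kJ/mol at 298 K).
lk = log10_K(-5.7)
```

This also clarifies why the relative approach helps: errors in the solvation free energies cancel in the difference before the exponentiation amplifies them.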

  13. Computer program to perform cost and weight analysis of transport aircraft. Volume 1: Summary

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A digital computer program for evaluating the weight and costs of advanced transport designs was developed. The resultant program, intended for use at the preliminary design level, incorporates both batch mode and interactive graphics run capability. The basis of the weight and cost estimation method developed is a unique way of predicting the physical design of each detail part of a vehicle structure at a time when only configuration concept drawings are available. In addition, the technique relies on methods to predict the precise manufacturing processes and the associated material required to produce each detail part. Weight data are generated in four areas of the program. Overall vehicle system weights are derived on a statistical basis as part of the vehicle sizing process. Theoretical weights, actual weights, and the weight of the raw material to be purchased are derived as part of the structural synthesis and part definition processes based on the computed part geometry.

  14. Low Cost Injection Mold Creation via Hybrid Additive and Conventional Manufacturing

    SciTech Connect

    Dehoff, Ryan R.; Watkins, Thomas R.; List, III, Frederick Alyious; Carver, Keith; England, Roger

    2015-12-01

The purpose of the proposed project between Cummins and ORNL is to significantly reduce the cost of the tooling (machining and materials) required to create injection molds to make plastic components. Presently, the high cost of this tooling forces the design decision to make cast aluminum parts, because Cummins' typical production volumes are too low for injection-molded plastic parts to be cost effective once the amortized cost of the injection molding tooling is included. In addition to reducing the weight of components, polymer injection molding opens the opportunity for alternative cooling methods, such as nitrogen gas. Nitrogen gas cooling offers an environmentally and economically attractive cooling option, if the mold can be manufactured economically. In this project, a current injection mold design was optimized for cooling using nitrogen gas. The various components of the injection mold tooling were fabricated using the Renishaw powder bed laser additive manufacturing technology. Subsequent machining was performed on the as-deposited components to form a working assembly. The injection mold is scheduled to be tested in a production setting at a commercial vendor selected by Cummins.

  15. Additive Manufacturing and High-Performance Computing: a Disruptive Latent Technology

    NASA Astrophysics Data System (ADS)

    Goodwin, Bruce

    2015-03-01

    This presentation will discuss the relationship between recent advances in Additive Manufacturing (AM) technology, High-Performance Computing (HPC) simulation and design capabilities, and related advances in Uncertainty Quantification (UQ), and then examines their impacts upon national and international security. The presentation surveys how AM accelerates the fabrication process, while HPC combined with UQ provides a fast track for the engineering design cycle. The combination of AM and HPC/UQ almost eliminates the engineering design and prototype iterative cycle, thereby dramatically reducing cost of production and time-to-market. These methods thereby present significant benefits for US national interests, both civilian and military, in an age of austerity. Finally, considering cyber security issues and the advent of the ``cloud,'' these disruptive, currently latent technologies may well enable proliferation and so challenge both nuclear and non-nuclear aspects of international security.

  16. The Cost of an Additional Disability-Free Life Year for Older Americans: 1992–2005

    PubMed Central

    Cai, Liming

    2013-01-01

Objective To estimate the cost of an additional disability-free life year for older Americans in 1992–2005. Data Source This study used the 1992–2005 Medicare Current Beneficiary Survey, a longitudinal survey of Medicare beneficiaries with a rotating panel design. Study Design This analysis used a multistate life table model to estimate probabilities of transition among a discrete set of health states (nondisabled, disabled, and dead) for two panels of older Americans in 1992 and 2002. Health spending incurred between annual health interviews was estimated by a generalized linear mixed model. Health status, including death, was simulated for each member of the panel using these transition probabilities; the associated health spending was cross-walked to the simulated health changes. Principal Findings Disability-free life expectancy (DFLE) increased significantly more than life expectancy during the study period. Assuming that 50 percent of the gains in DFLE between 1992 and 2005 were attributable to increases in spending, the average discounted cost per additional disability-free life year was $71,000. There were small differences between gender and racial/ethnic groups. Conclusions The cost of an additional disability-free life year was substantially below previous estimates based on mortality trends alone. PMID:22670874

  17. Additives

    NASA Technical Reports Server (NTRS)

    Smalheer, C. V.

    1973-01-01

    The chemistry of lubricant additives is discussed to show what the additives are chemically and what functions they perform in the lubrication of various kinds of equipment. Current theories regarding the mode of action of lubricant additives are presented. The additive groups discussed include the following: (1) detergents and dispersants, (2) corrosion inhibitors, (3) antioxidants, (4) viscosity index improvers, (5) pour point depressants, and (6) antifouling agents.

  18. Computation of electric power production cost with transmission constraints

    NASA Astrophysics Data System (ADS)

    Earle, Robert Leonard

    The production cost in operating an electric power system is the cost of generation to meet the customer load or demand. Production costing models are used in analysis of electric power systems to estimate this cost for various purposes such as evaluating long term investments in generating capacity, contracts for sales, purchases, or trades of power. A multi-area production costing model includes the effects of transmission constraints in calculating costs. Including transmission constraints in production costing models is important because the electric power industry is interconnected and trades or sales of power amongst systems can lower costs. This thesis develops an analytical model for multi-area production costing. The advantage of this approach is that it explicitly examines the underlying structure of the problem. The major contributions of our research are as follows. First, we develop the multivariate model not just for transportation type models of electric power network flows, but also for the direct current power flow model. Second, this thesis derives the multi-area production cost curve in the general case. This new result gives a simple formula for determination of system cost and the gradient of cost with respect to transmission capacities. Third, we give an algorithm for generating the non-redundant constraints from a Gale-Hoffman type region. The Gale-Hoffman conditions characterize feasibility of flow in a network. We also gather together some existing and new results on Gale-Hoffman regions and put them in a unified framework. Fourth, in order to derive the multi-area production cost curves and also to perform the integration of the multivariate Edgeworth series, we need wedge shaped regions (a wedge is the affine image of an orthant). We give an algorithm for decomposing any polyhedral set into wedges. Fifth, this thesis gives a new method for one dimensional numerical integration of the trivariate normal. The best methods previously known
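As a minimal illustration of the idea (a greedy merit-order toy, not the thesis's analytical multivariate model), a two-area dispatch shows how a transmission limit raises production cost, and how the gradient of cost with respect to line capacity equals the cost gap between areas. All numbers are hypothetical:

```python
def two_area_dispatch(load_a, load_b, cap_a, cap_b, cost_a, cost_b, line_limit):
    """Least-cost dispatch of two areas joined by one line (assumes cost_a <= cost_b)."""
    g_a = min(load_a, cap_a)                       # serve area A from its cheap unit
    export = min(cap_a - g_a, line_limit, load_b)  # ship surplus cheap power to B
    g_a += export
    g_b = load_b - export                          # remainder from B's costly unit
    if g_b > cap_b:
        raise ValueError("infeasible: area B load cannot be met")
    return g_a, g_b, export, cost_a * g_a + cost_b * g_b

# Hypothetical system: cheap area A, expensive area B, 30 MW tie line.
g_a, g_b, flow, cost = two_area_dispatch(load_a=40, load_b=80,
                                         cap_a=100, cap_b=100,
                                         cost_a=10.0, cost_b=30.0,
                                         line_limit=30)
print(g_a, g_b, flow, cost)  # 70 50 30 2200.0
```

Raising `line_limit` from 30 to 60 in this toy case lowers total cost from 2200 to 1600, i.e., each added MW of transmission capacity saves cost_b - cost_a = 20, which is the kind of sensitivity the thesis's cost gradient captures.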

  19. Reduction of computer usage costs in predicting unsteady aerodynamic loadings caused by control surface motions: Analysis and results

    NASA Technical Reports Server (NTRS)

    Rowe, W. S.; Sebastian, J. D.; Petrarca, J. R.

    1979-01-01

Results of theoretical and numerical investigations conducted to develop economical computing procedures were applied to an existing computer program that predicts unsteady aerodynamic loadings caused by leading- and trailing-edge control surface motions in subsonic compressible flow. Large reductions in computing costs were achieved by removing the spanwise singularity of the downwash integrand and evaluating its effect separately in closed form. Additional reductions were obtained by modifying the incremental pressure term that accounts for downwash singularities at control surface edges. Accuracy of theoretical predictions of unsteady loading at high reduced frequencies was increased by applying new pressure expressions that exactly satisfied the high-frequency boundary conditions of an oscillating control surface. Comparative computer results indicated that the revised procedures provide more accurate predictions of unsteady loadings as well as reductions of 50 to 80 percent in computer usage costs.

  20. 25 CFR 170.602 - If a tribe incurs unforeseen construction costs, can it get additional funds?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 25 Indians 1 2013-04-01 2013-04-01 false If a tribe incurs unforeseen construction costs, can it get additional funds? 170.602 Section 170.602 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE... Funding Process § 170.602 If a tribe incurs unforeseen construction costs, can it get additional...

  1. Some Useful Cost-Benefit Criteria for Evaluating Computer-Based Test Delivery Models and Systems

    ERIC Educational Resources Information Center

    Luecht, Richard M.

    2005-01-01

    Computer-based testing (CBT) is typically implemented using one of three general test delivery models: (1) multiple fixed testing (MFT); (2) computer-adaptive testing (CAT); or (3) multistage testing (MSTs). This article reviews some of the real cost drivers associated with CBT implementation--focusing on item production costs, the costs…

  2. Cost-Effective Additive Manufacturing in Space: HELIOS Technology Challenge Guide

    NASA Technical Reports Server (NTRS)

    DeVieneni, Alayna; Velez, Carlos Andres; Benjamin, David; Hollenbeck, Jay

    2012-01-01

    Welcome to the HELIOS Technology Challenge Guide. This document is intended to serve as a general road map for participants of the HELIOS Technology Challenge [HTC] Program and the associated inaugural challenge: HTC-01: Cost-Effective Additive Manufacturing in Space. Please note that this guide is not a rule book and is not meant to hinder the development of innovative ideas. Its primary goal is to highlight the objectives of the HTC-01 Challenge and to describe possible solution routes and pitfalls that such technology may encounter in space. Please also note that participants wishing to demonstrate any hardware developed under this program during any future HELIOS Technology Challenge showcase event(s) may be subject to event regulations to be published separately at a later date.

  3. Reduction of computer usage costs in predicting unsteady aerodynamic loadings caused by control surface motion. Addendum to computer program description

    NASA Technical Reports Server (NTRS)

    Rowe, W. S.; Petrarca, J. R.

    1980-01-01

Changes that provide increased accuracy and increased user flexibility in the prediction of unsteady loadings caused by control surface motions are described. Analysis flexibility is increased by reducing the restrictions on the location of the downwash stations relative to the leading edge and the edges of the control surface boundaries. Analysis accuracy is increased in predicting unsteady loading for high Mach number analysis conditions through use of additional chordwise downwash stations. User guidelines are presented to enlarge the analysis capabilities for unusual wing/control-surface configurations. Comparative results indicate that the revised procedures provide accurate predictions of unsteady loadings as well as reductions of 40 to 75 percent in the computer usage cost required by previous versions of this program.

  4. Additive Manufacturing for Cost Efficient Production of Compact Ceramic Heat Exchangers and Recuperators

    SciTech Connect

    Shulman, Holly; Ross, Nicole

    2015-10-30

An additive manufacturing technique known as laminated object manufacturing (LOM) was used to fabricate compact ceramic heat exchanger prototypes. LOM uses precision CO2 laser cutting of ceramic green tapes, which are then precision stacked to build a 3D object with fine internal features. Modeling was used to develop prototype designs and predict the thermal response, stress, and efficiency of the ceramic heat exchangers. Build testing and materials analyses were used to provide feedback for the design selection. During this development process, laminated object manufacturing protocols were established, including laser optimization, strategies for fine feature integrity, lamination fluid control, green handling, and firing profile. Three full-size prototypes were fabricated using two different designs. One prototype was selected for performance testing. During testing, cross-talk leakage prevented the application of a high pressure differential; however, the prototype was successful at withstanding the high temperature operating conditions (1300 °F). In addition, analysis showed that the bulk of the part did not have cracks or leakage issues. This led to the development of a module method for next-generation LOM heat exchangers. A scale-up cost analysis showed that, given a purpose-built LOM system, these ceramic heat exchangers would be affordable for the applications.

  5. Computer-based manufacturing cost analysis for the fabrication of thermoplastic composite structures

    NASA Astrophysics Data System (ADS)

    Foley, Michael; Bernardon, Edward

    1990-01-01

    Advanced composite structures are very expensive to manufacture. Cost estimation techniques are useful as tools for increasing cost effectiveness in part design, in selecting materials, and in the design of automated systems and manufacturing processes. A computer-based cost estimation model has been developed for analyzing the manufacturing costs involved in the fabrication of thermoplastic composite structures. The model, described in detail in this paper, evaluates existing manual and automated techniques for manufacturing a thermoplastic composite skin. Cost analysis results and their relevance to increasing cost effectiveness are discussed.

  6. Energy Drain by Computers Stifles Efforts at Cost Control

    ERIC Educational Resources Information Center

    Keller, Josh

    2009-01-01

    The high price of storing and processing data is hurting colleges and universities across the country. In response, some institutions are embracing greener technologies to keep costs down and help the environment. But compared with other industries, colleges and universities have been slow to understand the problem and to adopt energy-saving…

  7. Low-Budget Computer Programming in Your School (An Alternative to the Cost of Large Computers). Illinois Series on Educational Applications of Computers. No. 14.

    ERIC Educational Resources Information Center

    Dennis, J. Richard; Thomson, David

    This paper is concerned with a low cost alternative for providing computer experience to secondary school students. The brief discussion covers the programmable calculator and its relevance for teaching the concepts and the rudiments of computer programming and for computer problem solving. A list of twenty-five programming activities related to…

  8. 25 CFR 171.555 - What additional costs will I incur if I am granted a Payment Plan?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Payment Plan? 171.555 Section 171.555 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND... Collections § 171.555 What additional costs will I incur if I am granted a Payment Plan? You will incur the following costs: (a) An administrative fee to process your Payment Plan, as required by 31 CFR 901.9....

  9. High Temperature Thermoplastic Additive Manufacturing Using Low-Cost, Open-Source Hardware

    NASA Technical Reports Server (NTRS)

    Gardner, John M.; Stelter, Christopher J.; Yashin, Edward A.; Siochi, Emilie J.

    2016-01-01

    Additive manufacturing (or 3D printing) via Fused Filament Fabrication (FFF), also known as Fused Deposition Modeling (FDM), is a process where material is placed in specific locations layer-by-layer to create a complete part. Printers designed for FFF build parts by extruding a thermoplastic filament from a nozzle in a predetermined path. Originally developed for commercial printers, 3D printing via FFF has become accessible to a much larger community of users since the introduction of Reprap printers. These low-cost, desktop machines are typically used to print prototype parts or novelty items. As the adoption of desktop sized 3D printers broadens, there is increased demand for these machines to produce functional parts that can withstand harsher conditions such as high temperature and mechanical loads. Materials meeting these requirements tend to possess better mechanical properties and higher glass transition temperatures (Tg), thus requiring printers with high temperature printing capability. This report outlines the problems and solutions, and includes a detailed description of the machine design, printing parameters, and processes specific to high temperature thermoplastic 3D printing.

  10. A Low Cost Microcomputer Laboratory for Investigating Computer Architecture.

    ERIC Educational Resources Information Center

    Mitchell, Eugene E., Ed.

    1980-01-01

    Described is a microcomputer laboratory at the United States Military Academy at West Point, New York, which provides easy access to non-volatile memory and a single input/output file system for 16 microcomputer laboratory positions. A microcomputer network that has a centralized data base is implemented using the concepts of computer network…

  11. Cost-Effective Cloud Computing: A Case Study Using the Comparative Genomics Tool, Roundup

    PubMed Central

    Kudtarkar, Parul; DeLuca, Todd F.; Fusaro, Vincent A.; Tonellato, Peter J.; Wall, Dennis P.

    2010-01-01

Background Comparative genomics resources, such as ortholog detection tools and repositories, are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource—Roundup—using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Methods Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon’s Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. Results We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon’s computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing.
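The scheduling idea — predict each job's runtime, then submit the longest jobs first so worker nodes finish at similar times — can be sketched as follows. The runtimes here are hypothetical; the actual Roundup model predicted them from genome size and complexity:

```python
import heapq

def makespan(runtimes, workers):
    """Finish time when jobs are greedily handed to whichever node frees first."""
    free = [0.0] * workers
    heapq.heapify(free)
    for r in runtimes:
        heapq.heappush(free, heapq.heappop(free) + r)
    return max(free)

# Hypothetical per-job runtimes (hours) predicted from genome size/complexity.
estimated = [9.0, 2.0, 7.0, 3.0, 8.0, 1.0, 4.0, 6.0]
arrival_order = makespan(estimated, workers=3)                    # as submitted
lpt_order = makespan(sorted(estimated, reverse=True), workers=3)  # longest first
print(arrival_order, lpt_order)  # 15.0 14.0
```

Sorting by predicted runtime (longest processing time first) shortens the makespan, so fewer paid instance-hours are spent idle waiting for a straggler job.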

  12. Estimating Development Cost for a Tailored Interactive Computer Program to Enhance Colorectal Cancer Screening Compliance

    PubMed Central

    Lairson, David R.; Chang, Yu-Chia; Bettencourt, Judith L.; Vernon, Sally W.; Greisinger, Anthony

    2006-01-01

    The authors used an actual-work estimate method to estimate the cost of developing a tailored interactive computer education program to improve compliance with colorectal cancer screening guidelines in a large multi-specialty group medical practice. Resource use was prospectively collected from time logs, administrative records, and a design and computing subcontract. Sensitivity analysis was performed to examine the uncertainty of the overhead cost rate and other parameters. The cost of developing the system was $328,866. The development cost was $52.79 per patient when amortized over a 7-year period with a cohort of 1,000 persons. About 20% of the cost was incurred in defining the theoretic framework and supporting literature, constructing the variables and survey, and conducting focus groups. About 41% of the cost was for developing the messages, algorithms, and constructing program elements, and the remaining cost was to create and test the computer education program. About 69% of the cost was attributable to personnel expenses. Development cost is rarely estimated but is important for feasibility studies and ex-ante economic evaluations of alternative interventions. The findings from this study may aid decision makers in planning, assessing, budgeting, and pricing development of tailored interactive computer-based interventions. PMID:16799126
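The reported $52.79 per patient is consistent with a standard capital-recovery amortization of $328,866 over 7 years and 1,000 persons. The 3% discount rate below is an assumption on my part that reproduces the published figure, not a rate stated in the abstract:

```python
def annual_payment(principal, rate, years):
    """Equal annual payment that amortizes `principal` over `years` at `rate`."""
    crf = rate / (1 - (1 + rate) ** -years)  # capital recovery factor
    return principal * crf

total_cost = 328_866  # development cost reported by the study, USD
cohort = 1_000        # persons
rate = 0.03           # assumed annual discount rate (not stated in the abstract)
per_patient = annual_payment(total_cost, rate, years=7) / cohort
print(round(per_patient, 2))  # ~52.79, matching the reported figure
```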

  13. Estimating development cost for a tailored interactive computer program to enhance colorectal cancer screening compliance.

    PubMed

    Lairson, David R; Chang, Yu-Chia; Bettencourt, Judith L; Vernon, Sally W; Greisinger, Anthony

    2006-01-01

The authors used an actual-work estimate method to estimate the cost of developing a tailored interactive computer education program to improve compliance with colorectal cancer screening guidelines in a large multi-specialty group medical practice. Resource use was prospectively collected from time logs, administrative records, and a design and computing subcontract. Sensitivity analysis was performed to examine the uncertainty of the overhead cost rate and other parameters. The cost of developing the system was $328,866. The development cost was $52.79 per patient when amortized over a 7-year period with a cohort of 1,000 persons. About 20% of the cost was incurred in defining the theoretic framework and supporting literature, constructing the variables and survey, and conducting focus groups. About 41% of the cost was for developing the messages, algorithms, and constructing program elements, and the remaining cost was to create and test the computer education program. About 69% of the cost was attributable to personnel expenses. Development cost is rarely estimated but is important for feasibility studies and ex-ante economic evaluations of alternative interventions. The findings from this study may aid decision makers in planning, assessing, budgeting, and pricing development of tailored interactive computer-based interventions.

  14. Scilab software as an alternative low-cost computing in solving the linear equations problem

    NASA Astrophysics Data System (ADS)

    Agus, Fahrul; Haviluddin

    2017-02-01

Numerical computation packages are widely used both in teaching and research. These packages include licensed (proprietary) and open-source (non-proprietary) software. One reason to use such a package is the complexity of the mathematical functions involved (e.g., linear problems); moreover, the number of variables in linear and non-linear functions has been increasing. The aim of this paper was to reflect on key aspects related to method, didactics, and creative praxis in the teaching of linear equations in higher education. If implemented, this could contribute to better learning in mathematics (i.e., solving simultaneous linear equations), which is essential for future engineers. The focus of this study was to introduce the numerical computation package Scilab as an alternative low-cost computing environment. In this paper, Scilab was applied to several activities related to mathematical models. In this experiment, four numerical methods, namely Gaussian Elimination, Gauss-Jordan, Inverse Matrix, and Lower-Upper (LU) Decomposition, were implemented. The results of this study showed that routines for these numerical methods could be created and explored using Scilab procedures, and that these routines can serve as teaching material for the course.
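As a plain-Python counterpart to the routines the study built in Scilab, a minimal version of the first of the four listed methods, Gaussian elimination with partial pivoting, might look like:

```python
def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    # Build the augmented matrix so row operations update b alongside A.
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        # Partial pivoting: swap in the row with the largest pivot candidate.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    # Back substitution on the upper-triangular system.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# 2x + y = 5, x + 3y = 10  ->  x = 1, y = 3
print(gauss_solve([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0]))
```

In Scilab itself the same system is solved with the backslash operator (`A \ b`), but writing the elimination out is exactly the kind of teaching routine the study describes.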

  15. Multiple sequence alignment with arbitrary gap costs: computing an optimal solution using polyhedral combinatorics.

    PubMed

    Althaus, Ernst; Caprara, Alberto; Lenhof, Hans-Peter; Reinert, Knut

    2002-01-01

Multiple sequence alignment is one of the dominant problems in computational molecular biology. Numerous scoring functions and methods have been proposed, most of which result in NP-hard problems. In this paper we propose for the first time a general formulation for multiple alignment with arbitrary gap costs based on an integer linear program (ILP). In addition we describe a branch-and-cut algorithm to effectively solve the ILP to optimality. We evaluate the performance of our approach in terms of running time and quality of the alignments using the BAliBASE database of reference alignments. The results show that our implementation ranks amongst the best programs developed so far.
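For intuition about what "arbitrary gap costs" means, here is a pairwise (not multiple) alignment by dynamic programming in the style of Waterman-Smith-Beyer — a simplification of the paper's ILP setting, with hypothetical scoring functions. Any gap cost as a function of gap length k is allowed, at O(nm(n+m)) time:

```python
def align_cost(a, b, sub, gap):
    """Minimum pairwise alignment cost with an arbitrary gap cost gap(k)."""
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(n + 1):
        for j in range(m + 1):
            if i and j:  # substitute/match the last characters
                D[i][j] = min(D[i][j], D[i - 1][j - 1] + sub(a[i - 1], b[j - 1]))
            # Gaps of every length k are scored by the arbitrary function gap(k).
            for k in range(1, i + 1):
                D[i][j] = min(D[i][j], D[i - k][j] + gap(k))
            for k in range(1, j + 1):
                D[i][j] = min(D[i][j], D[i][j - k] + gap(k))
    return D[n][m]

mismatch = lambda x, y: 0.0 if x == y else 1.0  # hypothetical substitution cost
affine_gap = lambda k: 2.0 + 0.5 * k            # any function of k would work here
print(align_cost("ACGT", "AGT", mismatch, affine_gap))  # 2.5 (one length-1 gap)
```

The ILP formulation in the paper generalizes this to more than two sequences, where the DP approach becomes intractable and branch-and-cut takes over.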

  16. Performance, Agility and Cost of Cloud Computing Services for NASA GES DISC Giovanni Application

    NASA Astrophysics Data System (ADS)

    Pham, L.; Chen, A.; Wharton, S.; Winter, E. L.; Lynnes, C.

    2013-12-01

The NASA Goddard Earth Science Data and Information Services Center (GES DISC) is investigating the performance, agility and cost of Cloud computing for GES DISC applications. Giovanni (Geospatial Interactive Online Visualization ANd aNalysis Infrastructure), one of the core applications at the GES DISC for online climate-related Earth science data access, subsetting, analysis, visualization, and downloading, was used to evaluate the feasibility and effort of porting an application to the Amazon Cloud Services platform. The performance and the cost of running Giovanni on the Amazon Cloud were compared to similar parameters for the GES DISC local operational system. A Giovanni Time-Series analysis of aerosol absorption optical depth (388nm) from OMI (Ozone Monitoring Instrument)/Aura was selected for these comparisons. All required data were pre-cached in both the Cloud and local system to avoid data transfer delays. The 3-, 6-, 12-, and 24-month data were used for analysis on both the Cloud and the local system, and the processing times for the analyses were used to evaluate system performance. To investigate application agility, Giovanni was installed and tested on multiple Cloud platforms. The cost of using a Cloud computing platform mainly consists of computing, storage, data requests, and data transfer in/out. The Cloud computing cost is calculated based on an hourly rate, and the storage cost is calculated based on a rate per Gigabyte per month. Incoming data transfer is free; for data transfer out, the cost is based on a rate per Gigabyte. The costs for a local server system consist of buying hardware/software, system maintenance/updating, and operating costs. The results showed that the Cloud platform had a 38% better performance and cost 36% less than the local system. This investigation shows the potential of cloud computing to increase system performance and lower the overall cost of system management.
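The cost components listed above combine into a simple estimator. The rates below are hypothetical placeholders, not actual Amazon pricing:

```python
def cloud_monthly_cost(instance_hours, hourly_rate, storage_gb, gb_month_rate,
                       egress_gb, egress_rate):
    """Monthly bill: compute + storage + data transfer out (ingress is free)."""
    return (instance_hours * hourly_rate   # computing, billed hourly
            + storage_gb * gb_month_rate   # storage, billed per GB-month
            + egress_gb * egress_rate)     # data transfer out, billed per GB

# Hypothetical rates -- actual pricing varies by provider, region, and year.
cloud = cloud_monthly_cost(instance_hours=720, hourly_rate=0.10,
                           storage_gb=500, gb_month_rate=0.03,
                           egress_gb=200, egress_rate=0.09)
print(cloud)  # 72 + 15 + 18 = 105.0
```

The local-system comparison would instead amortize hardware purchase plus maintenance and operating costs over the system's service life.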

  17. Minnesota Computer Aided Library System (MCALS); University of Minnesota Subsystem Cost/Benefits Analysis.

    ERIC Educational Resources Information Center

    Lourey, Eugene D., Comp.

    The Minnesota Computer Aided Library System (MCALS) provides a basis of unification for library service program development in Minnesota for eventual linkage to the national information network. A prototype plan for communications functions is illustrated. A cost/benefits analysis was made to show the cost/effectiveness potential for MCALS. System…

  18. Two Computer Programs for Equipment Cost Estimation and Economic Evaluation of Chemical Processes.

    ERIC Educational Resources Information Center

    Kuri, Carlos J.; Corripio, Armando B.

    1984-01-01

    Describes two computer programs for use in process design courses: an easy-to-use equipment cost estimation program based on latest cost correlations available and an economic evaluation program which calculates two profitability indices. Comparisons between programed and hand-calculated results are included. (JM)

  19. A low computation cost method for seizure prediction.

    PubMed

    Zhang, Yanli; Zhou, Weidong; Yuan, Qi; Wu, Qi

    2014-10-01

    The dynamic changes of electroencephalograph (EEG) signals in the period prior to epileptic seizures play a major role in the seizure prediction. This paper proposes a low computation seizure prediction algorithm that combines a fractal dimension with a machine learning algorithm. The presented seizure prediction algorithm extracts the Higuchi fractal dimension (HFD) of EEG signals as features to classify the patient's preictal or interictal state with Bayesian linear discriminant analysis (BLDA) as a classifier. The outputs of BLDA are smoothed by a Kalman filter for reducing possible sporadic and isolated false alarms and then the final prediction results are produced using a thresholding procedure. The algorithm was evaluated on the intracranial EEG recordings of 21 patients in the Freiburg EEG database. For seizure occurrence period of 30 min and 50 min, our algorithm obtained an average sensitivity of 86.95% and 89.33%, an average false prediction rate of 0.20/h, and an average prediction time of 24.47 min and 39.39 min, respectively. The results confirm that the changes of HFD can serve as a precursor of ictal activities and be used for distinguishing between interictal and preictal epochs. Both HFD and BLDA classifier have a low computational complexity. All of these make the proposed algorithm suitable for real-time seizure prediction.
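The feature-extraction step can be sketched following Higuchi's 1988 construction; `k_max` is a tunable parameter, and the test signal is synthetic:

```python
import math

def higuchi_fd(x, k_max=8):
    """Higuchi fractal dimension of a 1-D signal."""
    N = len(x)
    log_k, log_L = [], []
    for k in range(1, k_max + 1):
        lengths = []
        for m in range(k):  # k subsampled series, offset m, step k
            n_seg = (N - m - 1) // k
            if n_seg < 1:
                continue
            # Curve length of the subsampled series, with Higuchi's normalization.
            dist = sum(abs(x[m + i * k] - x[m + (i - 1) * k])
                       for i in range(1, n_seg + 1))
            lengths.append(dist * (N - 1) / (n_seg * k * k))
        log_k.append(math.log(1.0 / k))
        log_L.append(math.log(sum(lengths) / len(lengths)))
    # HFD is the least-squares slope of log L(k) versus log(1/k).
    n = len(log_k)
    mx, my = sum(log_k) / n, sum(log_L) / n
    return (sum((a - mx) * (b - my) for a, b in zip(log_k, log_L))
            / sum((a - mx) ** 2 for a in log_k))

# A straight line is 1-dimensional; an irregular signal approaches 2.
line = [0.01 * t for t in range(1000)]
print(higuchi_fd(line))  # ~1.0
```

An O(N · k_max) feature like this, fed to a linear classifier such as BLDA, is what keeps the overall prediction pipeline cheap enough for real-time use.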

  20. 78 FR 32224 - Availability of Version 3.1.2 of the Connect America Fund Phase II Cost Model; Additional...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-29

    ...; Additional Discussion Topics in Connect America Cost Model Virtual Workshop AGENCY: Federal Communications... issues in the ongoing virtual workshop. DATES: Comments are due on or before June 18, 2013. If you... comments. Virtual Workshop: In addition to the usual methods for filing electronic comments, the...

  1. Using a small/low cost computer in an information center

    NASA Technical Reports Server (NTRS)

    Wilde, D. U.

    1972-01-01

    Small/low cost computers are available with I/O capacities that make them suitable for SDI and retrospective searching on any of the many commercially available data bases. A small two-tape computer system is assumed, and an analysis of its run-time equations leads to a three-step search procedure. Run times and costs are shown as a function of file size, number of search terms, and input transmission rates. Actual examples verify that it is economically feasible for an information center to consider its own small, dedicated computer system.

  2. Utilizing a Collaborative Cross Number Puzzle Game to Develop the Computing Ability of Addition and Subtraction

    ERIC Educational Resources Information Center

    Chen, Yen-Hua; Looi, Chee-Kit; Lin, Chiu-Pin; Shao, Yin-Juan; Chan, Tak-Wai

    2012-01-01

    While addition and subtraction are key mathematical skills for young children, a typical classroom activity involves repetitive arithmetic calculation exercises. In this study, we explore a collaborative way for students to learn these skills in a technology-enabled way with wireless computers. Two classes, comprising a total of…

  3. 78 FR 12271 - Wireline Competition Bureau Seeks Additional Comment In Connect America Cost Model Virtual Workshop

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-22

    ... Virtual Workshop AGENCY: Federal Communications Commission. ACTION: Proposed rule. SUMMARY: In this... Site: http://fjallfoss.fcc.gov/ecfs2/ . Follow the instructions for submitting comments. Virtual...://www.fcc.gov/blog/wcb-cost-model-virtual-workshop-2012 . People with Disabilities: Contact the FCC...

  4. Government regulation and public opposition create high additional costs for field trials with GM crops in Switzerland.

    PubMed

    Bernauer, Thomas; Tribaldos, Theresa; Luginbühl, Carolin; Winzeler, Michael

    2011-12-01

    Field trials with GM crops are not only plant science experiments. They are also social experiments concerning the implications of government imposed regulatory constraints and public opposition for scientific activity. We assess these implications by estimating additional costs due to government regulation and public opposition in a recent set of field trials in Switzerland. We find that for every Euro spent on research, an additional 78 cents were spent on security, an additional 31 cents on biosafety, and an additional 17 cents on government regulatory supervision. Hence the total additional spending due to government regulation and public opposition was around 1.26 Euros for every Euro spent on the research per se. These estimates are conservative; they do not include additional costs that are hard to monetize (e.g. stakeholder information and dialogue activities, involvement of various government agencies). We conclude that further field experiments with GM crops in Switzerland are unlikely unless protected sites are set up to reduce these additional costs.

  5. Development and Validation of a Fast, Accurate and Cost-Effective Aeroservoelastic Method on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Goodwin, Sabine A.; Raj, P.

    1999-01-01

    Progress to date towards the development and validation of a fast, accurate and cost-effective aeroelastic method for advanced parallel computing platforms such as the IBM SP2 and the SGI Origin 2000 is presented in this paper. The ENSAERO code, developed at the NASA-Ames Research Center has been selected for this effort. The code allows for the computation of aeroelastic responses by simultaneously integrating the Euler or Navier-Stokes equations and the modal structural equations of motion. To assess the computational performance and accuracy of the ENSAERO code, this paper reports the results of the Navier-Stokes simulations of the transonic flow over a flexible aeroelastic wing body configuration. In addition, a forced harmonic oscillation analysis in the frequency domain and an analysis in the time domain are done on a wing undergoing a rigid pitch and plunge motion. Finally, to demonstrate the ENSAERO flutter-analysis capability, aeroelastic Euler and Navier-Stokes computations on an L-1011 wind tunnel model including pylon, nacelle and empennage are underway. All computational solutions are compared with experimental data to assess the level of accuracy of ENSAERO. As the computations described above are performed, a meticulous log of computational performance in terms of wall clock time, execution speed, memory and disk storage is kept. Code scalability is also demonstrated by studying the impact of varying the number of processors on computational performance on the IBM SP2 and the Origin 2000 systems.

  6. Accuracy Maximization Analysis for Sensory-Perceptual Tasks: Computational Improvements, Filter Robustness, and Coding Advantages for Scaled Additive Noise

    PubMed Central

    Burge, Johannes

    2017-01-01

    Accuracy Maximization Analysis (AMA) is a recently developed Bayesian ideal observer method for task-specific dimensionality reduction. Given a training set of proximal stimuli (e.g. retinal images), a response noise model, and a cost function, AMA returns the filters (i.e. receptive fields) that extract the most useful stimulus features for estimating a user-specified latent variable from those stimuli. Here, we first contribute two technical advances that significantly reduce AMA’s compute time: we derive gradients of cost functions for which two popular estimators are appropriate, and we implement a stochastic gradient descent (AMA-SGD) routine for filter learning. Next, we show how the method can be used to simultaneously probe the impact on neural encoding of natural stimulus variability, the prior over the latent variable, noise power, and the choice of cost function. Then, we examine the geometry of AMA’s unique combination of properties that distinguish it from better-known statistical methods. Using binocular disparity estimation as a concrete test case, we develop insights that have general implications for understanding neural encoding and decoding in a broad class of fundamental sensory-perceptual tasks connected to the energy model. Specifically, we find that non-orthogonal (partially redundant) filters with scaled additive noise tend to outperform orthogonal filters with constant additive noise; non-orthogonal filters and scaled additive noise can interact to sculpt noise-induced stimulus encoding uncertainty to match task-irrelevant stimulus variability. Thus, we show that some properties of neural response thought to be biophysical nuisances can confer coding advantages to neural systems. Finally, we speculate that, if repurposed for the problem of neural systems identification, AMA may be able to overcome a fundamental limitation of standard subunit model estimation. As natural stimuli become more widely used in the study of psychophysical and

  7. 25 CFR 170.602 - If a tribe incurs unforeseen construction costs, can it get additional funds?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false If a tribe incurs unforeseen construction costs, can it get additional funds? 170.602 Section 170.602 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER INDIAN RESERVATION ROADS PROGRAM Service Delivery for Indian Reservation...

  8. 25 CFR 171.555 - What additional costs will I incur if I am granted a Payment Plan?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 25 Indians 1 2011-04-01 2011-04-01 false What additional costs will I incur if I am granted a Payment Plan? 171.555 Section 171.555 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER IRRIGATION OPERATION AND MAINTENANCE Financial Matters: Assessments, Billing,...

  9. 25 CFR 171.555 - What additional costs will I incur if I am granted a Payment Plan?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false What additional costs will I incur if I am granted a Payment Plan? 171.555 Section 171.555 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER IRRIGATION OPERATION AND MAINTENANCE Financial Matters: Assessments, Billing,...

  10. Materials Testing and Cost Modeling for Composite Parts Through Additive Manufacturing

    DTIC Science & Technology

    2016-04-30

    Panel 17. Reducing Life-Cycle Costs: Adopting Emerging Manufacturing Technologies...chain. Introduction: Modern manufacturing processes tend to reflect globalization, a concentration on core activities, shorter product life-cycles...From a life-cycle perspective, a number of organizations recognize that environmental benefits and performance improvements can be achieved (Horn & Harrysson, 2012).

  11. Computer program to perform cost and weight analysis of transport aircraft. Volume 2: Technical volume

    NASA Technical Reports Server (NTRS)

    1973-01-01

    An improved method for estimating aircraft weight and cost using a unique and fundamental approach was developed. The results of this study were integrated into a comprehensive digital computer program, which is intended for use at the preliminary design stage of aircraft development. The program provides a means of computing absolute values for weight and cost, and enables the user to perform trade studies with a sensitivity to detail design and overall structural arrangement. Both batch and interactive graphics modes of program operation are available.

  12. A low-cost vector processor boosting compute-intensive image processing operations

    NASA Technical Reports Server (NTRS)

    Adorf, Hans-Martin

    1992-01-01

    Low-cost vector processing (VP) is within reach of everyone seriously engaged in scientific computing. The advent of affordable add-on VP boards for standard workstations, complemented by mathematical/statistical libraries, is beginning to impact compute-intensive tasks such as image processing. A case in point is the restoration of distorted images from the Hubble Space Telescope. A low-cost implementation of the standard Tarasko-Richardson-Lucy restoration algorithm on an Intel i860-based VP board, seamlessly interfaced to a commercial interactive image processing system, is presented. First experience is reported (including some benchmarks for standalone FFTs) and some conclusions are drawn.
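
    The Richardson-Lucy multiplicative update underlying the restoration described above can be sketched with FFT-based circular convolution. This is a generic sketch, not the i860 implementation; it assumes a spatially invariant PSF, normalized to unit sum and centered at the array origin:

```python
import numpy as np

def richardson_lucy(observed, psf, n_iter=25):
    """Richardson-Lucy deconvolution via the multiplicative update
       e <- e * corr(observed / conv(e, psf), psf),
    with (circular) convolution and correlation done in the Fourier domain."""
    estimate = np.full_like(observed, observed.mean())
    otf = np.fft.fft2(psf)                       # optical transfer function
    for _ in range(n_iter):
        blurred = np.real(np.fft.ifft2(np.fft.fft2(estimate) * otf))
        ratio = observed / np.maximum(blurred, 1e-12)   # guard against divide-by-zero
        # correlation with the PSF = convolution with the mirrored PSF
        correction = np.real(np.fft.ifft2(np.fft.fft2(ratio) * np.conj(otf)))
        estimate = estimate * correction
    return estimate
```

    The update preserves non-negativity, which is one reason the algorithm suits photon-count imagery such as telescope frames; each iteration costs a few FFTs, the part a vector processor accelerates well.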

  13. Modeling access, cost, and perceived quality: computer simulation benefits orthodontic clinic staffing decisions.

    PubMed

    Montgomery, J B; LaFrancois, G G; Perry, M J

    2000-02-01

    Given limited financial resources, simulation permits a financial analysis of the optimum staffing levels for orthodontists and dental assistants in an orthodontic clinic. A computer simulation provides the information for managerial review. This study, by building a computer simulation of an orthodontic service, set out to determine the most efficient mix between providers and support staff to maximize access, maximize perceived quality, and minimize expenditures. Six combinations of providers and support staff were compared during an animated, computer-generated what-if analysis. Based on the clinic workload and size, on the cost per patient, and on the cost per quality point, the research team recommended a staffing mix of one orthodontist and three assistants. This study shows that computer simulation is an enormous asset as a decision support tool for management.

  14. The economics of time shared computing: Congestion, user costs and capacity

    NASA Technical Reports Server (NTRS)

    Agnew, C. E.

    1982-01-01

    Time shared systems permit the fixed costs of computing resources to be spread over large numbers of users. However, bottleneck results in the theory of closed queueing networks can be used to show that this economy of scale will be offset by the increased congestion that results as more users are added to the system. If one considers the total costs, including the congestion cost, there is an optimal number of users for a system which equals the saturation value usually used to define system capacity.
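
    The congestion effect described above can be illustrated with exact Mean Value Analysis for a closed queueing network. The service demands and think time below are hypothetical, and this is the standard textbook recursion rather than the author's model:

```python
def mva(demands, think_time, n_users):
    """Exact Mean Value Analysis for a closed product-form queueing network.

    demands: per-visit service demand (sec) at each queueing device.
    think_time: user think time (sec). Requires n_users >= 1.
    Returns (system throughput, total residence time at the devices)."""
    q = [0.0] * len(demands)            # mean queue length seen at each device
    for n in range(1, n_users + 1):
        # arrival theorem: residence time = demand * (1 + queue left behind)
        r = [d * (1 + qi) for d, qi in zip(demands, q)]
        total_r = sum(r)
        x = n / (think_time + total_r)  # system throughput
        q = [x * ri for ri in r]        # Little's law per device
    return x, total_r
```

    With one device of demand 0.1 s and a 1 s think time, throughput approaches the bottleneck bound 1/0.1 = 10 jobs/s as users are added, while per-user residence time grows almost linearly, which is precisely the congestion cost that offsets the economy of scale.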

  15. Low-cost Electromagnetic Heating Technology for Polymer Extrusion-based Additive Manufacturing

    SciTech Connect

    Carter, William G.; Rios, Orlando; Akers, Ronald R.; Morrison, William A.

    2016-01-07

    To improve the flow of materials used in polymer additive manufacturing, ORNL and Ajax Tocco created an induction system for heating fused deposition modeling (FDM) nozzles. The system is capable of reaching a temperature of 230 °C, a typical nozzle temperature for extruding ABS polymers, in 17 seconds. A prototype system was built at ORNL and sent to Ajax Tocco, which analyzed the system and created a finalized power supply. The induction system was mounted to a PrintSpace Altair desktop printer and used to create several test parts similar in quality to those produced with a resistively heated nozzle.

  16. Municipal Rebate Programs for Environmental Retrofits: An Evaluation of Additionality and Cost-Effectiveness

    ERIC Educational Resources Information Center

    Bennear, Lori S.; Lee, Jonathan M.; Taylor, Laura O.

    2013-01-01

    When policies incentivize voluntary activities that also take place in the absence of the incentive, it is critical to identify the additionality of the policy--that is, the degree to which the policy results in actions that would not have occurred otherwise. Rebate programs have become a common conservation policy tool for local municipalities…

  17. A nearly-linear computational-cost scheme for the forward dynamics of an N-body pendulum

    NASA Technical Reports Server (NTRS)

    Chou, Jack C. K.

    1989-01-01

    The dynamic equations of motion of an n-body pendulum with spherical joints are derived as a mixed system of differential and algebraic equations (DAEs). The DAEs are kept in implicit form to save arithmetic and preserve the sparsity of the system, and are solved by a robust implicit integration method. At each solution point, the predicted solution is corrected to its exact solution within a given tolerance using Newton's iterative method. For each iteration, a linear system of the form J delta X = E has to be solved. The computational cost of solving this linear system directly by LU factorization is O(n^3), but it can be reduced significantly by exploiting the structure of J. It is shown that by recognizing the recursive patterns and exploiting the sparsity of the system, the multiplicative and additive computational costs for solving J delta X = E are O(n) and O(n^2), respectively. The formulation and solution method for an n-body pendulum is presented. The computational cost is shown to be nearly linearly proportional to the number of bodies.
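
    How exploiting structure collapses an O(n^3) dense LU solve to O(n) work can be illustrated with the Thomas algorithm for a tridiagonal system. This is an analogy only; the paper's Jacobian J has its own recursive block structure rather than a scalar tridiagonal one:

```python
def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system A x = d in O(n) operations.

    a: sub-diagonal (a[0] unused), b: main diagonal,
    c: super-diagonal (c[-1] unused), d: right-hand side."""
    n = len(b)
    cp = [0.0] * n   # modified super-diagonal
    dp = [0.0] * n   # modified right-hand side
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                      # forward elimination
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = (c[i] / denom) if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

    One forward sweep and one backward sweep give a cost linear in n, mirroring the nearly-linear scaling reported for the n-body pendulum.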

  18. Open-source meteor detection software for low-cost single-board computers

    NASA Astrophysics Data System (ADS)

    Vida, D.; Zubović, D.; Šegon, D.; Gural, P.; Cupec, R.

    2016-01-01

    This work aims to overcome the current price threshold of meteor stations which can sometimes deter meteor enthusiasts from owning one. In recent years small card-sized computers became widely available and are used for numerous applications. To utilize such computers for meteor work, software which can run on them is needed. In this paper we present a detailed description of newly-developed open-source software for fireball and meteor detection optimized for running on low-cost single board computers. Furthermore, an update on the development of automated open-source software which will handle video capture, fireball and meteor detection, astrometry and photometry is given.

  19. Effectiveness of Multimedia Elements in Computer Supported Instruction: Analysis of Personalization Effects, Students' Performances and Costs

    ERIC Educational Resources Information Center

    Zaidel, Mark; Luo, XiaoHui

    2010-01-01

    This study investigates the efficiency of multimedia instruction at the college level by comparing the effectiveness of multimedia elements used in the computer supported learning with the cost of their preparation. Among the various technologies that advance learning, instructors and students generally identify interactive multimedia elements as…

  20. Cost-Effective Computing: Making the Most of Your PC Dollars.

    ERIC Educational Resources Information Center

    Crawford, Walt

    1992-01-01

    Lists 27 suggestions for making cost-effective decisions when buying personal computers. Topics covered include physical comfort; modem speed; color graphics; institutional discounts; direct-order firms; brand names; replacing versus upgrading; expanding hard disk capacity; printers; software; wants versus needs; and RLIN (Research Libraries…

  1. Low-Cost Nanocellulose-Reinforced High-Temperature Polymer Composites for Additive Manufacturing

    SciTech Connect

    Ozcan, Soydan; Tekinalp, Halil L.; Love, Lonnie J.; Kunc, Vlastimil; Nelson, Kim

    2016-07-13

    ORNL worked with American Process Inc. to demonstrate the potential use of bio-based BioPlus® lignin-coated cellulose nanofibrils (L-CNF) as a reinforcing agent in the development of polymer feedstock suitable for additive manufacturing. L-CNF-reinforced polylactic acid (PLA) testing coupons were prepared and up to 69% increase in tensile strength and 133% increase in elastic modulus were demonstrated.

  2. Low-cost space-varying FIR filter architecture for computational imaging systems

    NASA Astrophysics Data System (ADS)

    Feng, Guotong; Shoaib, Mohammed; Schwartz, Edward L.; Dirk Robinson, M.

    2010-01-01

    Recent research demonstrates the advantage of designing electro-optical imaging systems by jointly optimizing the optical and digital subsystems. The optical systems designed using this joint approach intentionally introduce large and often space-varying optical aberrations that produce blurry optical images. Digital sharpening restores reduced contrast due to these intentional optical aberrations. Computational imaging systems designed in this fashion have several advantages including extended depth-of-field, lower system costs, and improved low-light performance. Currently, most consumer imaging systems lack the necessary computational resources to compensate for these optical systems with large aberrations in the digital processor. Hence, the exploitation of the advantages of the jointly designed computational imaging system requires low-complexity algorithms enabling space-varying sharpening. In this paper, we describe a low-cost algorithmic framework and associated hardware enabling the space-varying finite impulse response (FIR) sharpening required to restore largely aberrated optical images. Our framework leverages the space-varying properties of optical images formed using rotationally-symmetric optical lens elements. First, we describe an approach to leverage the rotational symmetry of the point spread function (PSF) about the optical axis allowing computational savings. Second, we employ a specially designed bank of sharpening filters tuned to the specific radial variation common to optical aberrations. We evaluate the computational efficiency and image quality achieved by using this low-cost space-varying FIR filter architecture.

  3. Resources and Costs for Microbial Sequence Analysis Evaluated Using Virtual Machines and Cloud Computing

    PubMed Central

    Angiuoli, Samuel V.; White, James R.; Matalka, Malcolm; White, Owen; Fricke, W. Florian

    2011-01-01

    Background: The widespread popularity of genomic applications is threatened by the “bioinformatics bottleneck” resulting from uncertainty about the cost and infrastructure needed to meet increasing demands for next-generation sequence analysis. Cloud computing services have been discussed as potential new bioinformatics support systems but have not been evaluated thoroughly. Results: We present benchmark costs and runtimes for common microbial genomics applications, including 16S rRNA analysis, microbial whole-genome shotgun (WGS) sequence assembly and annotation, WGS metagenomics and large-scale BLAST. Sequence dataset types and sizes were selected to correspond to outputs typically generated by small- to midsize facilities equipped with 454 and Illumina platforms, except for WGS metagenomics where sampling of Illumina data was used. Automated analysis pipelines, as implemented in the CloVR virtual machine, were used in order to guarantee transparency, reproducibility and portability across different operating systems, including the commercial Amazon Elastic Compute Cloud (EC2), which was used to attach real dollar costs to each analysis type. We found considerable differences in computational requirements, runtimes and costs associated with different microbial genomics applications. While all 16S analyses completed on a single-CPU desktop in under three hours, microbial genome and metagenome analyses utilized multi-CPU support of up to 120 CPUs on Amazon EC2, where each analysis completed in under 24 hours for less than $60. Representative datasets were used to estimate maximum data throughput on different cluster sizes and to compare costs between EC2 and comparable local grid servers. Conclusions: Although bioinformatics requirements for microbial genomics depend on dataset characteristics and the analysis protocols applied, our results suggest that smaller sequencing facilities (up to three Roche/454 or one Illumina GAIIx sequencer) invested in 16S r

  4. Addition of flexible body option to the TOLA computer program, part 1

    NASA Technical Reports Server (NTRS)

    Dick, J. W.; Benda, B. J.

    1975-01-01

    This report describes a flexible body option that was developed and added to the Takeoff and Landing Analysis (TOLA) computer program. The addition of the flexible body option allows TOLA to be used to study essentially any conventional type of airplane in the ground operating environment, and it provides the capability to predict the total motion of selected points on the airplane. The analytical methods incorporated in the program and operating instructions for the option are described. A program listing is included along with several example problems to aid in interpretation of the operating instructions and to illustrate program usage.

  5. Computational study of the rate constants and free energies of intramolecular radical addition to substituted anilines

    PubMed Central

    Seddiqzai, Meriam; Dahmen, Tobias; Sure, Rebecca

    2013-01-01

    Summary The intramolecular radical addition to aniline derivatives was investigated by DFT calculations. The computational methods were benchmarked by comparing the calculated values of the rate constant for the 5-exo cyclization of the hexenyl radical with the experimental values. The dispersion-corrected PW6B95-D3 functional provided very good results with deviations for the free activation barrier compared to the experimental values of only about 0.5 kcal mol−1 and was therefore employed in further calculations. Corrections for intramolecular London dispersion and solvation effects in the quantum chemical treatment are essential to obtain consistent and accurate theoretical data. For the investigated radical addition reaction it turned out that the polarity of the molecules is important and that a combination of electrophilic radicals with preferably nucleophilic arenes results in the highest rate constants. This is opposite to the Minisci reaction where the radical acts as nucleophile and the arene as electrophile. The substitution at the N-atom of the aniline is crucial. Methyl substitution leads to slower addition than phenyl substitution. Carbamates as substituents are suitable only when the radical center is not too electrophilic. No correlations between free reaction barriers and energies (ΔG ‡ and ΔG R) are found. Addition reactions leading to indanes or dihydrobenzofurans are too slow to be useful synthetically. PMID:24062821

  6. Soybean protein as a cost-effective lignin-blocking additive for the saccharification of sugarcane bagasse.

    PubMed

    Florencio, Camila; Badino, Alberto C; Farinas, Cristiane S

    2016-12-01

    Addition of surfactants, polymers, and non-catalytic proteins can improve the enzymatic hydrolysis of lignocellulosic materials by blocking the exposed lignin surfaces, but involves extra expense. Here, soybean protein, one of the cheapest proteins available, was evaluated as an alternative additive for the enzymatic hydrolysis of pretreated sugarcane bagasse. The effect of the enzyme source was investigated using enzymatic cocktails from A. niger and T. reesei cultivated under solid-state, submerged, and sequential fermentation. The use of soybean protein led to approximately 2-fold increases in hydrolysis, relative to the control, for both A. niger and T. reesei enzymatic cocktails from solid-state fermentation. The effect was comparable to that of BSA. Moreover, the use of soybean protein and a 1:1 combination of A. niger and T. reesei enzymatic cocktails resulted in 54% higher glucose release, compared to the control. Soybean protein is a potential cost-effective additive for use in the biomass conversion process.

  7. PAH growth initiated by propargyl addition: mechanism development and computational kinetics.

    PubMed

    Raj, Abhijeet; Al Rashidi, Mariam J; Chung, Suk Ho; Sarathy, S Mani

    2014-04-24

    Polycyclic aromatic hydrocarbon (PAH) growth is known to be the principal pathway to soot formation during fuel combustion; as such, a physical understanding of the PAH growth mechanism is needed to effectively assess, predict, and control soot formation in flames. Although the hydrogen-abstraction C2H2-addition (HACA) mechanism is believed to be the main contributor to PAH growth, it has been shown to under-predict some of the experimental data on PAH and soot concentrations in flames. This article presents a submechanism of PAH growth that is initiated by propargyl (C3H3) addition onto naphthalene (A2) and the naphthyl radical. C3H3 has been chosen since it is known to be a precursor of benzene in combustion and has appreciable concentrations in flames. This mechanism has been developed up to the formation of pyrene (A4), and the temperature-dependent kinetics of each elementary reaction has been determined using density functional theory (DFT) computations at the B3LYP/6-311++G(d,p) level of theory and transition state theory (TST). H-abstraction, H-addition, H-migration, β-scission, and intramolecular addition reactions have been taken into account. The energy barriers of the two main pathways (H-abstraction and H-addition) were found to be relatively small if not negative, whereas the energy barriers of the other pathways were in the range of 6-89 kcal·mol(-1). The rates reported in this study may be extrapolated to larger PAH molecules that have a zigzag site similar to that in naphthalene, and the mechanism presented herein may be used as a complement to the HACA mechanism to improve prediction of PAH and soot formation.
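
    The TST step of such a workflow, converting a computed free-energy barrier into a rate constant via the Eyring equation k(T) = (k_B·T/h)·exp(−ΔG‡/RT), can be sketched as follows. This is the generic formula only, not the authors' exact treatment, which may additionally include tunneling or symmetry corrections:

```python
import math

KB = 1.380649e-23     # Boltzmann constant, J/K
H = 6.62607015e-34    # Planck constant, J*s
R = 8.314462618       # gas constant, J/(mol*K)

def eyring_rate(dG_kcal_per_mol, T=298.15):
    """Unimolecular TST rate constant (s^-1) from a free-energy barrier
    given in kcal/mol, via the Eyring equation."""
    dG = dG_kcal_per_mol * 4184.0     # kcal/mol -> J/mol
    return (KB * T / H) * math.exp(-dG / (R * T))
```

    At a zero barrier the rate equals the attempt frequency k_B·T/h (about 6.2e12 s^-1 at 298 K), and every additional ~1.4 kcal/mol of barrier costs roughly an order of magnitude in rate, which is why barriers near 6 versus 89 kcal/mol span synthetically useful to negligibly slow additions.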

  8. A performance/cost evaluation for a GPU-based drug discovery application on volunteer computing.

    PubMed

    Guerrero, Ginés D; Imbernón, Baldomero; Pérez-Sánchez, Horacio; Sanz, Francisco; García, José M; Cecilia, José M

    2014-01-01

    Bioinformatics is an interdisciplinary research field that develops tools for the analysis of large biological databases, and, thus, the use of high performance computing (HPC) platforms is mandatory for the generation of useful biological knowledge. The latest generation of graphics processing units (GPUs) has democratized the use of HPC as they push desktop computers to cluster-level performance. Many applications within this field have been developed to leverage these powerful and low-cost architectures. However, these applications still need to scale to larger GPU-based systems to enable remarkable advances in the fields of healthcare, drug discovery, genome research, etc. The inclusion of GPUs in HPC systems exacerbates power and temperature issues, increasing the total cost of ownership (TCO). This paper explores the benefits of volunteer computing to scale bioinformatics applications as an alternative to owning large GPU-based local infrastructures. We use as a benchmark a GPU-based drug discovery application called BINDSURF, whose computational requirements go beyond a single desktop machine. Volunteer computing is presented as a cheap and valid HPC system for those bioinformatics applications that need to process huge amounts of data and where the response time is not a critical factor.

  9. A Performance/Cost Evaluation for a GPU-Based Drug Discovery Application on Volunteer Computing

    PubMed Central

    Guerrero, Ginés D.; Imbernón, Baldomero; García, José M.

    2014-01-01

    Bioinformatics is an interdisciplinary research field that develops tools for the analysis of large biological databases, and, thus, the use of high performance computing (HPC) platforms is mandatory for the generation of useful biological knowledge. The latest generation of graphics processing units (GPUs) has democratized the use of HPC as they push desktop computers to cluster-level performance. Many applications within this field have been developed to leverage these powerful and low-cost architectures. However, these applications still need to scale to larger GPU-based systems to enable remarkable advances in the fields of healthcare, drug discovery, genome research, etc. The inclusion of GPUs in HPC systems exacerbates power and temperature issues, increasing the total cost of ownership (TCO). This paper explores the benefits of volunteer computing to scale bioinformatics applications as an alternative to owning large GPU-based local infrastructures. We use as a benchmark a GPU-based drug discovery application called BINDSURF, whose computational requirements go beyond a single desktop machine. Volunteer computing is presented as a cheap and valid HPC system for those bioinformatics applications that need to process huge amounts of data and where the response time is not a critical factor. PMID:25025055

  10. On Training Efficiency and Computational Costs of a Feed Forward Neural Network: A Review

    PubMed Central

    Laudani, Antonino; Lozito, Gabriele Maria; Riganti Fulginei, Francesco; Salvini, Alessandro

    2015-01-01

    The problem of choosing a suitable activation function for the hidden layer of a feed forward neural network has been widely investigated; this paper presents a comprehensive review. Since the nonlinear component of a neural network is the main contributor to the network's mapping capabilities, the different choices that may lead to enhanced performance, in terms of training, generalization, or computational cost, are analyzed, both in general-purpose and in embedded computing environments. Finally, a strategy to convert a network configuration between different activation functions without altering the network mapping capabilities is presented. PMID:26417368
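
    The kind of activation-function conversion the review closes with can be illustrated for a one-hidden-layer network using the identity tanh(z) = 2·sigmoid(2z) − 1. The sketch below is a minimal example under that identity, not the paper's general strategy:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def tanh_to_sigmoid(W, b, V, c):
    """Convert a one-hidden-layer tanh network y = V @ tanh(W @ x + b) + c
    into an equivalent sigmoid network, using tanh(z) = 2*sigmoid(2*z) - 1.

    y = V @ (2*sigmoid(2*(W @ x + b)) - 1) + c
      = (2*V) @ sigmoid((2*W) @ x + 2*b) + (c - V @ 1)
    Returns the new parameters (W2, b2, V2, c2)."""
    return 2 * W, 2 * b, 2 * V, c - V.sum(axis=1)
```

    The mapping of the network is unchanged; only the parameterization moves from one activation to the other, which is exactly the sense of "without altering the network mapping capabilities" above.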

  11. 12 CFR Appendix L to Part 226 - Assumed Loan Periods for Computations of Total Annual Loan Cost Rates

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 3 2013-01-01 2013-01-01 false Assumed Loan Periods for Computations of Total Annual Loan Cost Rates L Appendix L to Part 226 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED..., App. L Appendix L to Part 226—Assumed Loan Periods for Computations of Total Annual Loan Cost Rates...

  12. 12 CFR Appendix L to Part 226 - Assumed Loan Periods for Computations of Total Annual Loan Cost Rates

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 3 2014-01-01 2014-01-01 false Assumed Loan Periods for Computations of Total Annual Loan Cost Rates L Appendix L to Part 226 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED..., App. L Appendix L to Part 226—Assumed Loan Periods for Computations of Total Annual Loan Cost Rates...

  13. 10 CFR Appendix I to Part 504 - Procedures for the Computation of the Real Cost of Capital

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Procedures for the Computation of the Real Cost of Capital I Appendix I to Part 504 Energy DEPARTMENT OF ENERGY (CONTINUED) ALTERNATE FUELS EXISTING POWERPLANTS Pt. 504, App. I Appendix I to Part 504—Procedures for the Computation of the Real Cost of...

  14. A Decision Support System for Cost-Effectiveness Analysis for Control and Security of Computer Systems.

    DTIC Science & Technology

    1985-09-01

    [Scanned report front matter; mostly unrecoverable. Legible fragments identify the record as a September 1985 Master's Thesis on a decision support system for cost-effectiveness analysis for control and security of computer systems.]

  15. Application of a single-board computer as a low-cost pulse generator

    NASA Astrophysics Data System (ADS)

    Fedrizzi, Marcus; Soria, Julio

    2015-09-01

    A BeagleBone Black (BBB) single-board open-source computer was implemented as a low-cost fully programmable pulse generator. The pulse generator makes use of the BBB Programmable Real-Time Unit (PRU) subsystem to achieve a deterministic temporal resolution of 5 ns, an RMS jitter of 290 ps and a timebase stability on the order of 10 ppm. A Python-based software framework has also been developed to simplify the usage of the pulse generator.

  16. POPCYCLE: a computer code for calculating nuclear and fossil plant levelized life-cycle power costs

    SciTech Connect

    Hardie, R.W.

    1982-02-01

    POPCYCLE, a computer code designed to calculate levelized life-cycle power costs for nuclear and fossil electrical generating plants is described. Included are (1) derivations of the equations and a discussion of the methodology used by POPCYCLE, (2) a description of the input required by the code, (3) a listing of the input for a sample case, and (4) the output for a sample case.
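    The levelized life-cycle power cost that codes like POPCYCLE report can be illustrated with the textbook definition: discounted lifetime cost divided by discounted lifetime generation. The sketch below is a minimal Python illustration of that general formula, not a reproduction of POPCYCLE's actual equations or input data; the cost and generation figures are invented.

    ```python
    def levelized_cost(annual_costs, annual_energy, discount_rate):
        """Levelized cost = discounted total cost / discounted total energy.

        annual_costs  : yearly costs ($) over the plant life
        annual_energy : yearly net generation (kWh) over the plant life
        """
        pv_cost = sum(c / (1 + discount_rate) ** t
                      for t, c in enumerate(annual_costs, start=1))
        pv_energy = sum(e / (1 + discount_rate) ** t
                        for t, e in enumerate(annual_energy, start=1))
        return pv_cost / pv_energy

    # Illustrative 30-year plant: with constant cost and generation streams,
    # the discount factors cancel and the result reduces to cost/energy.
    lcoe = levelized_cost([100.0] * 30, [1000.0] * 30, 0.05)
    ```

    Real levelization, as in POPCYCLE, additionally handles escalation, taxes, and fuel-cycle timing; the point here is only the discounting structure.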

  17. Fermilab Central Computing Facility: Energy conservation report and mechanical systems design optimization and cost analysis study

    SciTech Connect

    Krstulovich, S.F.

    1986-11-12

    This report is developed as part of the Fermilab Central Computing Facility Project Title II Design Documentation Update under the provisions of DOE Document 6430.1, Chapter XIII-21, Section 14, paragraph a. As such, it concentrates primarily on HVAC mechanical systems design optimization and cost analysis and should be considered as a supplement to the Title I Design Report dated March 1986, wherein energy related issues are discussed pertaining to building envelope and orientation as well as electrical systems design.

  18. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    NASA Astrophysics Data System (ADS)

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Kamath, C.; Khairallah, S. A.; Rubenchik, A. M.

    2015-12-01

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  19. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    SciTech Connect

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Kamath, C.; Khairallah, S. A.; Rubenchik, A. M.

    2015-12-29

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this study, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  20. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    SciTech Connect

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Khairallah, S. A.; Kamath, C.; Rubenchik, A. M.

    2015-12-15

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  1. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    DOE PAGES

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; ...

    2015-12-29

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this study, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  2. Predicting Cost/Performance Trade-Offs for Whitney: A Commodity Computing Cluster

    NASA Technical Reports Server (NTRS)

    Becker, Jeffrey C.; Nitzberg, Bill; VanderWijngaart, Rob F.; Kutler, Paul (Technical Monitor)

    1997-01-01

    Recent advances in low-end processor and network technology have made it possible to build a "supercomputer" out of commodity components. We develop simple models of the NAS Parallel Benchmarks version 2 (NPB 2) to explore the cost/performance trade-offs involved in building a balanced parallel computer supporting a scientific workload. We develop closed form expressions detailing the number and size of messages sent by each benchmark. Coupling these with measured single processor performance, network latency, and network bandwidth, our models predict benchmark performance to within 30%. A comparison based on total system cost reveals that current commodity technology (200 MHz Pentium Pros with 100baseT Ethernet) is well balanced for the NPBs up to a total system cost of around $1,000,000.

  3. Predicting Cost/Performance Trade-offs For Whitney: A Commodity Computing Cluster

    NASA Technical Reports Server (NTRS)

    Becker, Jeffrey C.; Nitzberg, Bill; VanDerWijngaart, Rob F.; Tweten, Dave (Technical Monitor)

    1998-01-01

    Recent advances in low-end processor and network technology have made it possible to build a "supercomputer" out of commodity components. We develop simple models of the NAS Parallel Benchmarks version 2 (NPB 2) to explore the cost/performance trade-offs involved in building a balanced parallel computer supporting a scientific workload. By measuring single processor benchmark performance, network latency, and network bandwidth, and using closed form expressions detailing the number and size of messages sent by each benchmark, our models predict benchmark performance to within 30%. A comparison based on total system cost reveals that current commodity technology (200 MHz Pentium Pros with 100baseT Ethernet) is well balanced for the NPBs up to a total system cost of around $1,000,000.
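    The closed-form models described above combine measured single-processor time with per-benchmark message counts and volumes. A minimal sketch of that style of model follows; the benchmark figures are purely illustrative, not the paper's measured values.

    ```python
    def predict_runtime(compute_s, n_messages, total_bytes,
                        latency_s, bandwidth_Bps):
        """Predicted runtime = single-processor compute time
        + message count x per-message latency
        + total bytes sent / network bandwidth."""
        comm_s = n_messages * latency_s + total_bytes / bandwidth_Bps
        return compute_s + comm_s

    # Illustrative run: 60 s of compute, 1e5 messages at 100 us latency,
    # 1 GB moved over a ~12.5 MB/s (100baseT-class) network.
    t = predict_runtime(60.0, 1e5, 1e9, 100e-6, 12.5e6)
    ```

    Models of this shape make the balance question concrete: when the communication terms dominate the compute term, adding cheaper processors no longer helps.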

  4. Reducing annotation cost and uncertainty in computer-aided diagnosis through selective iterative classification

    NASA Astrophysics Data System (ADS)

    Riely, Amelia; Sablan, Kyle; Xiaotao, Thomas; Furst, Jacob; Raicu, Daniela

    2015-03-01

    Medical imaging technology has always provided radiologists with the opportunity to view and keep records of anatomy of the patient. With the development of machine learning and intelligent computing, these images can be used to create Computer-Aided Diagnosis (CAD) systems, which can assist radiologists in analyzing image data in various ways to provide better health care to patients. This paper looks at increasing accuracy and reducing cost in creating CAD systems, specifically in predicting the malignancy of lung nodules in the Lung Image Database Consortium (LIDC). Much of the cost in creating an accurate CAD system stems from the need for multiple radiologist diagnoses or annotations of each image, since there is rarely a ground truth diagnosis and even different radiologists' diagnoses of the same nodule often disagree. To resolve this issue, this paper outlines a method of selective iterative classification that predicts lung nodule malignancy by using multiple radiologist diagnoses only for cases that can benefit from them. Our method achieved 81% accuracy at only 46% of the cost of the method that indiscriminately used all annotations, which achieved a lower accuracy of 70%.
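    The selective-annotation idea can be sketched as follows: collect a second radiologist label for each nodule and request further labels only while the collected labels disagree. This is a simplified illustration of the general strategy, with invented labels; the paper's actual selection criterion and classifier are not reproduced here.

    ```python
    def selective_labels(labels, max_labels=4):
        """labels: ordered radiologist labels (0/1) available for one nodule.

        Start with two labels; request more only while the labels collected
        so far disagree, up to max_labels. Returns (majority label, number
        of labels actually used). Ties resolve to 1 (suspicious) here.
        """
        used = list(labels[:2])
        while len(set(used)) > 1 and len(used) < max_labels:
            used.append(labels[len(used)])   # ask one more reader
        majority = 1 if sum(used) * 2 >= len(used) else 0
        return majority, len(used)

    agree = selective_labels([1, 1, 0, 0])     # unanimous pair: stop early
    dispute = selective_labels([0, 1, 1, 1])   # disagreement: keep asking
    ```

    The cost saving comes from the early stop: unanimous cases never consume the extra annotations.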

  5. Reducing metal alloy powder costs for use in powder bed fusion additive manufacturing: Improving the economics for production

    NASA Astrophysics Data System (ADS)

    Medina, Fransisco

    Titanium and its associated alloys have been used in industry for over 50 years and have become more popular in the recent decades. Titanium has been most successful in areas where the high strength to weight ratio provides an advantage over aluminum and steels. Other advantages of titanium include biocompatibility and corrosion resistance. Electron Beam Melting (EBM) is an additive manufacturing (AM) technology that has been successfully applied in the manufacturing of titanium components for the aerospace and medical industry with equivalent or better mechanical properties than parts fabricated via more traditional casting and machining methods. As the demand for titanium powder continues to increase, the price also increases. Titanium spheroidized powder from different vendors ranges in price from $260/kg to $450/kg, and other spheroidized alloys such as niobium can cost as much as $1,200/kg. Alternative titanium powders produced from methods such as the Titanium Hydride-Dehydride (HDH) process and the Armstrong Commercially Pure Titanium (CPTi) process can be fabricated at a fraction of the cost of powders fabricated via gas atomization. The alternative powders can be spheroidized and blended. Current sectors in additive manufacturing such as the medical industry are concerned that there will not be enough spherical powder for production and are seeking other powder options. It is believed the EBM technology can use a blend of spherical and angular powder to build fully dense parts with equal mechanical properties to those produced using traditional powders. Some of the challenges with angular and irregular powders are overcoming their poor flow characteristics and attaining the same or better packing densities as spherical powders. The goal of this research is to demonstrate the feasibility of utilizing alternative and lower cost powders in the EBM process. 
As a result, reducing the cost of the raw material to reduce the overall cost of the product produced with

  6. Effort cost computation in schizophrenia: a commentary on the recent literature.

    PubMed

    Gold, James M; Waltz, James A; Frank, Michael J

    2015-12-01

    The cognitive and affective factors implicated in the motivational impairments seen in many people with schizophrenia remain poorly understood. Many research groups have conducted studies in the past two years examining the role of effort-cost computations, driven by the hypothesis that overestimation of the cost of effort involved in volitional behavior might underlie the reduction in goal-directed behavior seen in some people with schizophrenia. The goal of this review is to assess the available evidence and the interpretative ambiguities that remain to be addressed by further studies. There is a clear preponderance of evidence suggesting that people with schizophrenia demonstrate altered effort allocation by failing to make high-effort response choices to maximize reward. The evidence relating altered effort allocation to the severity of negative symptoms is mixed. It remains for future work to determine the precise mechanisms implicated in altered effort allocation, with two prominent possibilities: that patients 1) overestimate the cost of effort or 2) underestimate the value of potential rewards. Other mechanisms that need to be investigated include the potential contributions of other impairments associated with the illness that increase the cost of effort. Furthermore, it is possible that accurate value representations fail to invigorate behavior. Although questions remain, evidence available to date suggests that the study of cost/benefit decision making may shed new light on the motivational impairments seen in many people with schizophrenia.
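    The cost/benefit account discussed above can be made concrete with a toy decision rule: choose the high-effort option when its reward minus weighted effort cost exceeds the low-effort alternative's. The `cost_scale` knob below is an illustrative stand-in for the hypothesized overestimation of effort cost, not a model fitted in any of the reviewed studies.

    ```python
    def chooses_high_effort(reward_hi, reward_lo, effort_hi, effort_lo,
                            cost_scale=1.0):
        """Return True when the high-effort option has the greater
        subjective value (reward minus scaled effort cost)."""
        v_hi = reward_hi - cost_scale * effort_hi
        v_lo = reward_lo - cost_scale * effort_lo
        return v_hi > v_lo

    # With veridical effort costs the large reward is worth the effort;
    # overweighting effort (cost_scale > 1) flips the choice.
    baseline = chooses_high_effort(4.0, 1.0, 2.0, 0.5)
    overweighted = chooses_high_effort(4.0, 1.0, 2.0, 0.5, cost_scale=2.5)
    ```

    This is the sense in which effort-cost overestimation and reward undervaluation are hard to distinguish behaviorally: scaling costs up and scaling rewards down can produce the same flipped choices.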

  7. Computation of octanol-water partition coefficients by guiding an additive model with knowledge.

    PubMed

    Cheng, Tiejun; Zhao, Yuan; Li, Xun; Lin, Fu; Xu, Yong; Zhang, Xinglong; Li, Yan; Wang, Renxiao; Lai, Luhua

    2007-01-01

    We have developed a new method, i.e., XLOGP3, for logP computation. XLOGP3 predicts the logP value of a query compound by using the known logP value of a reference compound as a starting point. The difference in the logP values of the query compound and the reference compound is then estimated by an additive model. The additive model implemented in XLOGP3 uses a total of 87 atom/group types and two correction factors as descriptors. It is calibrated on a training set of 8199 organic compounds with reliable logP data through a multivariate linear regression analysis. For a given query compound, the compound showing the highest structural similarity in the training set will be selected as the reference compound. Structural similarity is quantified based on topological torsion descriptors. XLOGP3 has been tested along with its predecessor, i.e., XLOGP2, as well as several popular logP methods on two independent test sets: one contains 406 small-molecule drugs approved by the FDA and the other contains 219 oligopeptides. On both test sets, XLOGP3 produces more accurate predictions than most of the other methods with average unsigned errors of 0.24-0.51 units. Compared to conventional additive methods, XLOGP3 does not rely on an extensive classification of fragments and correction factors in order to improve accuracy. It is also able to utilize the ever-increasing experimentally measured logP data more effectively.
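    The reference-based additive scheme can be sketched in a few lines: the query's logP is the reference compound's known logP plus the difference in atom/group-type counts weighted by fitted contributions. The two atom types and their coefficients below are invented for illustration; XLOGP3 itself uses 87 atom/group types, two correction factors, and coefficients calibrated on 8199 compounds.

    ```python
    # Illustrative (not XLOGP3's) per-type contributions to logP.
    CONTRIB = {"C.ar": 0.30, "O.hydroxyl": -0.40}

    def logp_query(logp_ref, counts_ref, counts_query):
        """Estimate the query's logP from a structurally similar reference:
        known reference logP + sum over types of
        contribution x (query count - reference count)."""
        delta = sum(coef * (counts_query.get(t, 0) - counts_ref.get(t, 0))
                    for t, coef in CONTRIB.items())
        return logp_ref + delta

    # Toy example: a phenol-like query differing from a benzene-like
    # reference (experimental logP 2.13) by one hydroxyl group.
    est = logp_query(2.13, {"C.ar": 6}, {"C.ar": 6, "O.hydroxyl": 1})
    ```

    Because only the structural *difference* is scored, errors shared by the query and a well-chosen reference cancel, which is the stated advantage over purely additive methods.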

  8. Evolutionary adaptive eye tracking for low-cost human computer interaction applications

    NASA Astrophysics Data System (ADS)

    Shen, Yan; Shin, Hak Chul; Sung, Won Jun; Khim, Sarang; Kim, Honglak; Rhee, Phill Kyu

    2013-01-01

    We present an evolutionary adaptive eye-tracking framework aiming for low-cost human computer interaction. The main focus is to guarantee eye-tracking performance without using high-cost devices and strongly controlled situations. The performance optimization of eye tracking is formulated into the dynamic control problem of deciding on an eye tracking algorithm structure and associated thresholds/parameters, where the dynamic control space is denoted by genotype and phenotype spaces. The evolutionary algorithm is responsible for exploring the genotype control space, and the reinforcement learning algorithm organizes the evolved genotype into a reactive phenotype. The evolutionary algorithm encodes an eye-tracking scheme as a genetic code based on image variation analysis. Then, the reinforcement learning algorithm defines internal states in a phenotype control space limited by the perceived genetic code and carries out interactive adaptations. The proposed method achieves optimal performance by balancing the difficulty of running the evolutionary algorithm in real time against the huge search space of the reinforcement learning algorithm. Extensive experiments were carried out using webcam image sequences and yielded very encouraging results. The framework can be readily applied to other low-cost vision-based human computer interactions in solving their intrinsic brittleness in unstable operational environments.

  9. Experiments with a low-cost system for computer graphics material model acquisition

    NASA Astrophysics Data System (ADS)

    Rushmeier, Holly; Lockerman, Yitzhak; Cartwright, Luke; Pitera, David

    2015-03-01

    We consider the design of an inexpensive system for acquiring material models for computer graphics rendering applications in animation, games and conceptual design. To be useful in these applications a system must be able to model a rich range of appearances in a computationally tractable form. The range of appearance of interest in computer graphics includes materials that have spatially varying properties, directionality, small-scale geometric structure, and subsurface scattering. To be computationally tractable, material models for graphics must be compact, editable, and efficient to numerically evaluate for ray tracing importance sampling. To construct appropriate models for a range of interesting materials, we take the approach of separating out directly and indirectly scattered light using high spatial frequency patterns introduced by Nayar et al. in 2006. To acquire the data at low cost, we use a set of Raspberry Pi computers and cameras clamped to miniature projectors. We explore techniques to separate out surface and subsurface indirect lighting. This separation would allow the fitting of simple, and so tractable, analytical models to features of the appearance model. The goal of the system is to provide models for physically accurate renderings that are visually equivalent to viewing the original physical materials.
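    The direct/indirect separation attributed above to Nayar et al. (2006) rests on a simple identity: under a high-frequency pattern that lights half of the source elements at a time, a pixel's maximum observation across shifted patterns is direct + global/2 and its minimum is global/2. A minimal sketch of the per-pixel arithmetic, with invented radiance values:

    ```python
    def separate(observations):
        """observations: one pixel's radiance under several shifted
        half-on illumination patterns. With a 50% duty-cycle pattern,
        direct = max - min and global (indirect) = 2 * min."""
        lmax, lmin = max(observations), min(observations)
        return lmax - lmin, 2.0 * lmin

    # Illustrative pixel: brightest when directly lit, but never dark
    # because indirect light arrives under every pattern.
    direct, indirect = separate([0.9, 0.3, 0.35, 0.82])
    ```

    The system described above goes further, splitting the indirect term into surface and subsurface parts, but this max/min step is the core of the acquisition.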

  10. Using additive manufacturing in accuracy evaluation of reconstructions from computed tomography.

    PubMed

    Smith, Erin J; Anstey, Joseph A; Venne, Gabriel; Ellis, Randy E

    2013-05-01

    Bone models derived from patient imaging and fabricated using additive manufacturing technology have many potential uses including surgical planning, training, and research. This study evaluated the accuracy of bone surface reconstruction of two diarthrodial joints, the hip and shoulder, from computed tomography. Image segmentation of the tomographic series was used to develop a three-dimensional virtual model, which was fabricated using fused deposition modelling. Laser scanning was used to compare cadaver bones, printed models, and intermediate segmentations. The overall bone reconstruction process had a reproducibility of 0.3 ± 0.4 mm. Production of the model had an accuracy of 0.1 ± 0.1 mm, while the segmentation had an accuracy of 0.3 ± 0.4 mm, indicating that segmentation accuracy was the key factor in reconstruction. Generally, the shape of the articular surfaces was reproduced accurately, with poorer accuracy near the periphery of the articular surfaces, particularly in regions with periosteum covering and where osteophytes were apparent.

  11. Dealing with electronic waste: modeling the costs and environmental benefits of computer monitor disposal.

    PubMed

    Macauley, Molly; Palmer, Karen; Shih, Jhih-Shyang

    2003-05-01

    The importance of information technology to the world economy has brought about a surge in demand for electronic equipment. With rapid technological change, a growing fraction of the increasing stock of many types of electronics becomes obsolete each year. We model the costs and benefits of policies to manage 'e-waste' by focusing on a large component of the electronic waste stream-computer monitors-and the environmental concerns associated with disposal of the lead embodied in cathode ray tubes (CRTs) used in most monitors. We find that the benefits of avoiding health effects associated with CRT disposal appear far outweighed by the costs for a wide range of policies. For the stock of monitors disposed of in the United States in 1998, we find that policies restricting or banning some popular disposal options would increase disposal costs from about US dollar 1 per monitor to between US dollars 3 and US dollars 20 per monitor. Policies to promote a modest amount of recycling of monitor parts, including lead, can be less expensive. In all cases, however, the costs of the policies exceed the value of the avoided health effects of CRT disposal.

  12. Computing confidence intervals on solution costs for stochastic grid generation expansion problems.

    SciTech Connect

    Woodruff, David L.; Watson, Jean-Paul

    2010-12-01

    A range of core operations and planning problems for the national electrical grid are naturally formulated and solved as stochastic programming problems, which minimize expected costs subject to a range of uncertain outcomes relating to, for example, uncertain demands or generator output. A critical decision issue relating to such stochastic programs is: How many scenarios are required to ensure a specific error bound on the solution cost? Scenarios are the key mechanism used to sample from the uncertainty space, and the number of scenarios drives computational difficulty. We explore this question in the context of a long-term grid generation expansion problem, using a bounding procedure introduced by Mak, Morton, and Wood. We discuss experimental results using problem formulations independently minimizing expected cost and down-side risk. Our results indicate that we can use a surprisingly small number of scenarios to yield tight error bounds in the case of expected cost minimization, which has key practical implications. In contrast, error bounds in the case of risk minimization are significantly larger, suggesting more research is required in this area in order to achieve rigorous solutions for decision makers.
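    The bounding procedure of Mak, Morton, and Wood can be caricatured in a few lines: replicate the sampled problem, record the candidate solution's estimated cost minus each replicate's optimal value, and form a normal-approximation bound on the optimality gap from the mean and standard error. The sketch below shows only that final interval step, with invented per-replicate gap estimates; it is not the paper's implementation.

    ```python
    import math
    import statistics

    def gap_confidence_bound(gaps, z=1.645):
        """Approximate one-sided 95% upper bound on the optimality gap,
        from independent per-replicate gap estimates (candidate cost
        minus replicate optimum)."""
        mean = statistics.mean(gaps)
        se = statistics.stdev(gaps) / math.sqrt(len(gaps))
        return mean + z * se

    # Five illustrative replicates of the sampled expansion problem.
    bound = gap_confidence_bound([2.1, 1.8, 2.5, 2.0, 1.9])
    ```

    More scenarios per replicate shrink both the mean gap and its standard error, which is exactly the trade-off the study quantifies for expected-cost versus risk objectives.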

  13. Treatment of a simulated textile wastewater in a sequencing batch reactor (SBR) with addition of a low-cost adsorbent.

    PubMed

    Santos, Sílvia C R; Boaventura, Rui A R

    2015-06-30

    Removing color from textile wastewaters with a low-cost, reliable technology remains a challenge even today. Simultaneous biological treatment and adsorption is a known alternative to the treatment of wastewaters containing biodegradable and non-biodegradable contaminants. The present work aims at evaluating the treatability of a simulated textile wastewater by simultaneously combining biological treatment and adsorption in an SBR (sequencing batch reactor), but using a low-cost adsorbent instead of a commercial one. The selected adsorbent was a metal hydroxide sludge (WS) from an electroplating industry. Direct Blue 85 dye (DB) was used in the preparation of the synthetic wastewater. Firstly, adsorption kinetics and equilibrium were studied with respect to several factors (temperature, pH, WS dosage, and presence of salts and dyeing auxiliary chemicals in the aqueous media). At 25 °C and pH 4, 7 and 10, maximum DB adsorption capacities in aqueous solution were 600, 339 and 98.7 mg/g, respectively. These values are quite considerable, compared to others reported in the literature, but proved to be significantly reduced by the presence of dyeing auxiliary chemicals in the wastewater. The simulated textile wastewater treatment in the SBR led to BOD5 removals of 53-79%, but color removal was rather limited (10-18%). The performance was significantly enhanced by the addition of WS, with BOD5 removals above 91% and average color removals of 60-69%.

  14. User's guide to SERICPAC: A computer program for calculating electric-utility avoided cost rates

    SciTech Connect

    Wirtshafter, R.; Abrash, M.; Koved, M.; Feldman, S.

    1982-05-01

    SERICPAC is a computer program developed to calculate average avoided cost rates for decentralized power producers and cogenerators that sell electricity to electric utilities. SERICPAC works in tandem with SERICOST, a program to calculate avoided costs, and determines the appropriate rates for buying and selling of electricity between electric utilities and qualifying facilities (QFs) as stipulated under Section 210 of PURPA. SERICPAC contains simulation models for eight technologies, including wind, hydro, biogas, and cogeneration. The simulations are converted into a diversified utility production profile, which can be either gross production or net production; the latter accounts for internal electricity usage by the QF. The program allows adjustments to the production for scheduled and forced outages. The final output of the model is a technology-specific average annual rate. The report contains a description of the technologies and the simulations as well as a complete user's guide to SERICPAC.

  15. Versatile, low-cost, computer-controlled, sample positioning system for vacuum applications

    NASA Technical Reports Server (NTRS)

    Vargas-Aburto, Carlos; Liff, Dale R.

    1991-01-01

    A versatile, low-cost, easy-to-implement, microprocessor-based motorized positioning system (MPS) suitable for accurate sample manipulation in a Secondary Ion Mass Spectrometry (SIMS) system, and for other ultra-high vacuum (UHV) applications, was designed and built at NASA LeRC. The system can be operated manually or under computer control. In the latter case, local as well as remote operation is possible via the IEEE-488 bus. The position of the sample can be controlled in three orthogonal linear coordinates and one angular coordinate.

  16. Development and Use of the Life Cycle Cost in Design Computer Program (LCCID).

    DTIC Science & Technology

    1985-11-01

    [Scanned report text; mostly unrecoverable. Legible fragments describe interactive use of the program on the Harris system, including signing off the computer and menu commands to define, change, or delete annual values in a life cycle cost analysis.]

  17. Expedited Holonomic Quantum Computation via Net Zero-Energy-Cost Control in Decoherence-Free Subspace

    PubMed Central

    Pyshkin, P. V.; Luo, Da-Wei; Jing, Jun; You, J. Q.; Wu, Lian-Ao

    2016-01-01

    Holonomic quantum computation (HQC) may not show its full potential in quantum speedup due to the prerequisite of a long coherent runtime imposed by the adiabatic condition. Here we show that the conventional HQC can be dramatically accelerated by using external control fields, of which the effectiveness is exclusively determined by the integral of the control fields in the time domain. This control scheme can be realized with net zero energy cost and it is fault-tolerant against fluctuation and noise, significantly relaxing the experimental constraints. We demonstrate how to realize the scheme via decoherence-free subspaces. In this way we unify quantum robustness merits of this fault-tolerant control scheme, the conventional HQC and decoherence-free subspace, and propose an expedited holonomic quantum computation protocol. PMID:27886234

  18. Make or Buy: Cost Impacts of Additive Manufacturing, 3D Laser Scanning Technology, and Collaborative Product Lifecycle Management on Ship Maintenance and Modernization

    DTIC Science & Technology

    2015-05-01

    [Scanned report front matter; mostly unrecoverable. Legible fragments repeat the title, give dates covered as 2015, and identify the candidate technologies as 3D laser scanning, collaborative product lifecycle management during operations, and additive manufacturing ("3D printing") of final parts from 3D designs (e.g., from 3D laser scanning).]

  19. Demonstration of Cost-Effective, High-Performance Computing at Performance and Reliability Levels Equivalent to a 1994 Vector Supercomputer

    NASA Technical Reports Server (NTRS)

    Babrauckas, Theresa

    2000-01-01

    The Affordable High Performance Computing (AHPC) project demonstrated that high-performance computing based on a distributed network of computer workstations is a cost-effective alternative to vector supercomputers for running CPU and memory intensive design and analysis tools. The AHPC project created an integrated system called a Network Supercomputer. By connecting computer workstations through a network and utilizing the workstations when they are idle, the resulting distributed-workstation environment has the same performance and reliability levels as the Cray C90 vector Supercomputer at less than 25 percent of the C90 cost. In fact, the cost comparison between a Cray C90 Supercomputer and Sun workstations showed that the number of distributed networked workstations equivalent to a C90 costs approximately 8 percent of the C90.

  20. A novel cost based model for energy consumption in cloud computing.

    PubMed

    Horri, A; Dastghaibyfard, Gh

    2015-01-01

    Cloud data centers consume enormous amounts of electrical energy. To support green cloud computing, providers also need to minimize cloud infrastructure energy consumption while maintaining the required QoS. In this study, an energy consumption model is proposed for the time-shared policy in the virtualization layer of cloud environments. The cost and energy usage of the time-shared policy were modeled in the CloudSim simulator based upon results obtained from a real system, and the proposed model was then evaluated under different scenarios. In the proposed model, cache interference costs were considered; these costs were based upon the size of the data. The proposed model was implemented in the CloudSim simulator, and the related simulation results indicate that the energy consumption may be considerable and that it can vary with different parameters such as the quantum parameter, data size, and the number of VMs on a host. Measured results validate the model and demonstrate that there is a tradeoff between energy consumption and QoS in the cloud environment.
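    A purely illustrative sketch of an energy model with the ingredients named above (idle/peak power, utilization, VM count, data size, and a cache-interference term); the functional form and every coefficient are assumptions for illustration, not the paper's fitted CloudSim model.

    ```python
    def host_energy_j(runtime_s, p_idle_w, p_peak_w, utilization,
                      n_vms, data_mb, cache_penalty_j_per_mb=0.02):
        """Illustrative host energy: linear power-vs-utilization draw over
        the runtime, plus a cache-interference term that grows with the
        number of co-resident time-shared VMs and the working-set size."""
        power_w = p_idle_w + (p_peak_w - p_idle_w) * utilization
        # Each additional co-resident VM forces cache repopulation on
        # context switches, costing energy proportional to data size.
        interference_j = (n_vms - 1) * data_mb * cache_penalty_j_per_mb
        return runtime_s * power_w + interference_j

    # One hour at 50% utilization on a 100 W idle / 250 W peak host,
    # four VMs sharing it with 512 MB working sets each.
    e = host_energy_j(3600, 100, 250, 0.5, n_vms=4, data_mb=512)
    ```

    Even in this toy form the tradeoff the study reports is visible: packing more VMs per host raises the interference term, so consolidating for energy efficiency eventually costs QoS.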

  1. A Comprehensive and Cost-Effective Computer Infrastructure for K-12 Schools

    NASA Technical Reports Server (NTRS)

    Warren, G. P.; Seaton, J. M.

    1996-01-01

    Since 1993, NASA Langley Research Center has been developing and implementing a low-cost Internet connection model, including system architecture, training, and support, to provide Internet access for an entire network of computers. This infrastructure allows local area networks which exceed 50 machines per school to independently access the complete functionality of the Internet by connecting to a central site, using state-of-the-art commercial modem technology, through a single standard telephone line. By locating high-cost resources at this central site and sharing these resources and their costs among the school districts throughout a region, a practical, efficient, and affordable infrastructure for providing scalable Internet connectivity has been developed. As the demand for faster Internet access grows, the model has a simple expansion path that eliminates the need to replace major system components and re-train personnel. Observations of typical Internet usage within an environment, particularly school classrooms, have shown that after an initial period of 'surfing,' the Internet traffic becomes repetitive. By automatically storing requested Internet information on a high-capacity networked disk drive at the local site (network based disk caching), then updating this information only when it changes, well over 80 percent of the Internet traffic that leaves a location can be eliminated by retrieving the information from the local disk cache.
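    The network-based disk caching described above can be sketched as a small cache that serves repeat requests locally and refetches only when the origin copy changes. The class, its names, and the version-tag check below are illustrative stand-ins for whatever change detection the deployed system used.

    ```python
    class LocalDiskCache:
        """Toy local cache: repeat requests are served from local storage,
        so only changed or never-seen content crosses the phone line."""

        def __init__(self, fetch):
            self.fetch = fetch        # fetch(url) -> (version, body)
            self.store = {}           # url -> (version, body)
            self.upstream_hits = 0    # requests that left the site

        def get(self, url, current_version):
            cached = self.store.get(url)
            if cached is not None and cached[0] == current_version:
                return cached[1]      # served locally, no upstream traffic
            self.upstream_hits += 1
            version, body = self.fetch(url)
            self.store[url] = (version, body)
            return body

    def fetch(url):                   # hypothetical upstream fetcher
        return (1, "<html>" + url + "</html>")

    cache = LocalDiskCache(fetch)
    first = cache.get("lessons/physics", current_version=1)
    repeat = cache.get("lessons/physics", current_version=1)
    ```

    With the repetitive classroom traffic the text describes, most `get` calls hit the local store, which is the mechanism behind the claimed 80-percent reduction in outbound traffic.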

  2. 12 CFR Appendix L to Part 226 - Assumed Loan Periods for Computations of Total Annual Loan Cost Rates

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Annual Loan Cost Rates L Appendix L to Part 226 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM TRUTH IN LENDING (REGULATION Z) Pt. 226, App. L Appendix L to Part 226—Assumed Loan Periods for Computations of Total Annual Loan Cost Rates (a)...

  3. 12 CFR Appendix L to Part 226 - Assumed Loan Periods for Computations of Total Annual Loan Cost Rates

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Annual Loan Cost Rates L Appendix L to Part 226 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM TRUTH IN LENDING (REGULATION Z) Pt. 226, App. L Appendix L to Part 226—Assumed Loan Periods for Computations of Total Annual Loan Cost Rates (a)...

  4. 12 CFR Appendix K to Part 226 - Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., App. K Appendix K to Part 226—Total Annual Loan Cost Rate Computations for Reverse Mortgage... property value: Interest rate: Monthly advance: Initial draw: Line of credit: Initial Loan Charges Closing...: $301.80 Initial draw: $1,000 Line of credit: $4,000 Initial Loan Charges Closing costs: $5,000...

  5. Addition of flexible body option to the TOLA computer program. Part 2: User and programmer documentation

    NASA Technical Reports Server (NTRS)

    Dick, J. W.; Benda, B. J.

    1975-01-01

    User- and programmer-oriented documentation for the flexible body option of the Takeoff and Landing Analysis (TOLA) computer program is provided. The user information provides sufficient knowledge of the development and use of the option to enable the engineering user to successfully operate the modified program and understand the results. The programmer's information describes the option structure and logic, enabling a programmer to make major revisions to this part of the TOLA computer program.

  6. DOD Business Systems Modernization: Additional Enhancements Are Needed for Army Business System Schedule and Cost Estimates to Fully Meet Best Practices

    DTIC Science & Technology

    2014-09-01

    DOD Business Systems Modernization: Additional Enhancements Are Needed for Army Business System Schedule and Cost Estimates to Fully Meet Best Practices (report documentation page; dates covered 00-00-2014 to 00-00-2014).

  7. A cost-benefit analysis of blood donor vaccination as an alternative to additional DNA testing for reducing transfusion transmission of hepatitis B virus.

    PubMed

    Fischinger, J M; Stephan, B; Wasserscheid, K; Eichler, H; Gärtner, B C

    2010-11-16

    A survey-based cost-benefit analysis was performed comparing blood screening strategies with vaccination strategies for the reduction of transfusion transmission of HBV. 231 whole blood donors and 126 apheresis donors were eligible and completed a questionnaire detailing their donation habits. The cost-benefit analysis included current mandatory HBV testing (HBsAg + anti-HBc, A1), A1 with additional nucleic acid testing (NAT) for minipool (A2) or individual donation testing (A3), as well as HBV vaccination strategies using time-dependent (B1) or titre-dependent (B2) booster vaccination alone, or B2 in addition to current mandatory testing procedures (B3). Different cost models were applied using a 5% discount rate. Absolute costs for current mandatory testing procedures (A1) over 20 years in Germany were €1009 million. Additional NAT would lead to incremental costs of 43% (A2) or 339% (A3), respectively. Vaccination strategies B1 and B2 showed cost reductions relative to A1 of 30% and 14%, respectively. The number of remaining HBV infections could be reduced from 360 (for A1) to 13 using vaccination, compared with 144 or 105 remaining infections for A2 or A3, respectively. The absolute cost per prevented infection is similar (€2.0 million) for A2 and B3. HBV vaccination offers the near-elimination of transfusion infections while representing a potential cost reduction.

  8. Low-cost monitoring of patients during unsupervised robot/computer assisted motivating stroke rehabilitation.

    PubMed

    Johnson, Michelle J; Shakya, Yuniya; Strachota, Elaine; Ahamed, Sheikh Iqbal

    2011-02-01

    There is a need for effective stroke rehabilitation systems that can be used in undersupervised/unsupervised environments such as the home to assist in improving and/or sustaining functional outcomes. We determined the stability, accuracy and usability of an extremely low-cost mobile robot for use with a robot/computer motivating rehabilitation device, TheraDrive. The robot provided cues to discourage excessive trunk movements and to encourage arm movements. The mobile robot system was positively received by potential users, and it was accurate and stable over time. Feedback from users suggests that finding the optimal frequency and type of encouragement and corrective feedback given by the robot helper will be critical for long-term acceptance.

  9. A simple, low-cost, data logging pendulum built from a computer mouse

    SciTech Connect

    Gintautas, Vadas; Hubler, Alfred

    2009-01-01

    Lessons and homework problems involving a pendulum are often a big part of introductory physics classes and laboratory courses from high school to undergraduate levels. Although laboratory equipment for pendulum experiments is commercially available, it is often expensive and may not be affordable for teachers on fixed budgets, particularly in developing countries. We present a low-cost, easy-to-build rotary sensor pendulum using the existing hardware in a ball-type computer mouse. We demonstrate how this apparatus may be used to measure both the frequency and coefficient of damping of a simple physical pendulum. This easily constructed laboratory equipment makes it possible for all students to have hands-on experience with one of the most important simple physical systems.

  10. Can Computer-Assisted Discovery Learning Foster First Graders' Fluency with the Most Basic Addition Combinations?

    ERIC Educational Resources Information Center

    Baroody, Arthur J.; Eiland, Michael D.; Purpura, David J.; Reid, Erin E.

    2013-01-01

    In a 9-month training experiment, 64 first graders with a risk factor were randomly assigned to computer-assisted structured discovery of the add-1 rule (e.g., the sum of 7 + 1 is the number after "seven" when we count), unstructured discovery learning of this regularity, or an active-control group. Planned contrasts revealed that the…

  11. Low cost, highly effective parallel computing achieved through a Beowulf cluster.

    PubMed

    Bitner, Marc; Skelton, Gordon

    2003-01-01

    A Beowulf cluster is a means of bringing together several computers and using software and network components to make the cluster appear and function as one computer with multiple parallel computing processors. A cluster of computers can provide computing power comparable to that usually found only in very expensive supercomputers or servers.

  12. Low-cost, high-performance and efficiency computational photometer design

    NASA Astrophysics Data System (ADS)

    Siewert, Sam B.; Shihadeh, Jeries; Myers, Randall; Khandhar, Jay; Ivanov, Vitaly

    2014-05-01

    Researchers at the University of Alaska Anchorage and the University of Colorado Boulder have built a low-cost, high-performance, high-efficiency drop-in-place Computational Photometer (CP) to test in field applications ranging from port security and safety monitoring to environmental compliance monitoring and surveying. The CP integrates off-the-shelf visible-spectrum cameras with near- to long-wavelength infrared detectors and high-resolution digital snapshots in a single device. The proof of concept combines three or more detectors into a single multichannel imaging system that can time-correlate read-out, capture, and image-process all of the channels concurrently with high performance and energy efficiency. The dual-channel continuous read-out is combined with a third high-definition digital snapshot capability and has been designed using an FPGA (Field Programmable Gate Array) to capture, decimate, down-convert, re-encode, and transform images from two standard-definition CCD (Charge Coupled Device) cameras at 30 Hz. The continuous stereo vision can be time-correlated to megapixel high-definition snapshots. This proof of concept has been fabricated as a four-layer PCB (Printed Circuit Board) suitable for use in education and research for low-cost, high-efficiency field monitoring applications that need multispectral and three-dimensional imaging capabilities. Initial testing is in progress and includes field testing in ports, potential test flights in unmanned aerial systems, and future planned missions to image harsh environments in the Arctic, including volcanic plumes, ice formation, and Arctic marine life.

  13. Computational Sensing Using Low-Cost and Mobile Plasmonic Readers Designed by Machine Learning.

    PubMed

    Ballard, Zachary S; Shir, Daniel; Bhardwaj, Aashish; Bazargan, Sarah; Sathianathan, Shyama; Ozcan, Aydogan

    2017-02-28

    Plasmonic sensors have been used for a wide range of biological and chemical sensing applications. Emerging nanofabrication techniques have enabled these sensors to be cost-effectively mass manufactured onto various types of substrates. To accompany these advances, major improvements in sensor read-out devices must also be achieved to fully realize the broad impact of plasmonic nanosensors. Here, we propose a machine learning framework which can be used to design low-cost and mobile multispectral plasmonic readers that do not use traditionally employed bulky and expensive stabilized light sources or high-resolution spectrometers. By training a feature selection model over a large set of fabricated plasmonic nanosensors, we select the optimal set of illumination light-emitting diodes needed to create a minimum-error refractive index prediction model, which statistically takes into account the varied spectral responses and fabrication-induced variability of a given sensor design. This computational sensing approach was experimentally validated using a modular mobile plasmonic reader. We tested different plasmonic sensors with hexagonal and square periodicity nanohole arrays and revealed that the optimal illumination bands differ from those that are "intuitively" selected based on the spectral features of the sensor, e.g., transmission peaks or valleys. This framework provides a universal tool for the plasmonics community to design low-cost and mobile multispectral readers, helping the translation of nanosensing technologies to various emerging applications such as wearable sensing, personalized medicine, and point-of-care diagnostics. Beyond plasmonics, other types of sensors that operate based on spectral changes can broadly benefit from this approach, including e.g., aptamer-enabled nanoparticle assays and graphene-based sensors, among others.

  14. Least-squares reverse-time migration with cost-effective computation and memory storage

    NASA Astrophysics Data System (ADS)

    Liu, Xuejian; Liu, Yike; Huang, Xiaogang; Li, Peng

    2016-06-01

    Least-squares reverse-time migration (LSRTM), which involves several iterations of reverse-time migration (RTM) and Born modeling procedures, can provide subsurface images with better balanced amplitudes, higher resolution and fewer artifacts than standard migration. However, the same source wavefield is repetitively computed during the Born modeling and RTM procedures of different iterations. We developed a new LSRTM method with modified excitation-amplitude imaging conditions, where the source wavefield for RTM is forward propagated only once while the maximum amplitude and its excitation time at each grid point are stored. Then, the RTM procedure of different iterations only involves: (1) backward propagation of the residual between Born-modeled and acquired data, and (2) implementation of the modified excitation-amplitude imaging condition by multiplying the maximum amplitude by the back-propagated data residuals only at the grid points that satisfy the imaging time at each time step. For a complex model, 2 or 3 local peak amplitudes and corresponding traveltimes should be confirmed and stored for all grid points so that multiarrival information of the source wavefield can be utilized for imaging. Numerical experiments on a three-layer model and the Marmousi2 model demonstrate that the proposed LSRTM method greatly reduces computation and memory costs.
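    The excitation-amplitude bookkeeping described in this record can be sketched with NumPy arrays. The array layout (time samples by grid points) and the function names are illustrative assumptions, not the authors' code:

```python
import numpy as np

def forward_record_peaks(source_wavefield):
    """One forward pass: keep only the peak |amplitude| and its excitation
    time at each grid point, instead of the full (nt, nx) wavefield."""
    amp = np.abs(source_wavefield)
    t_exc = amp.argmax(axis=0)                      # excitation time per grid point
    cols = np.arange(source_wavefield.shape[1])
    a_max = source_wavefield[t_exc, cols]           # signed peak amplitude
    return a_max, t_exc

def excitation_image(a_max, t_exc, residual_wavefield):
    """Imaging condition: multiply the stored peak amplitude by the
    back-propagated data residual, sampled only at each grid point's
    excitation time."""
    cols = np.arange(len(a_max))
    return a_max * residual_wavefield[t_exc, cols]
```

    The memory saving comes from storing two values per grid point (`a_max`, `t_exc`) rather than the full time history of the source wavefield at every iteration.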

  15. The Effects of Computer-Assisted Instruction on Student Achievement in Addition and Subtraction at First Grade Level.

    ERIC Educational Resources Information Center

    Spivey, Patsy M.

    This study was conducted to determine whether the traditional classroom approach to instruction involving the addition and subtraction of number facts (digits 0-6) is more or less effective than the traditional classroom approach plus a commercially-prepared computer game. A pretest-posttest control group design was used with two groups of first…

  16. 17 CFR Appendix B to Part 4 - Adjustments for Additions and Withdrawals in the Computation of Rate of Return

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 17 Commodity and Securities Exchanges 1 2010-04-01 2010-04-01 false Adjustments for Additions and Withdrawals in the Computation of Rate of Return B Appendix B to Part 4 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION COMMODITY POOL OPERATORS AND COMMODITY TRADING ADVISORS Pt. 4, App....

  17. 17 CFR Appendix B to Part 4 - Adjustments for Additions and Withdrawals in the Computation of Rate of Return

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 17 Commodity and Securities Exchanges 1 2011-04-01 2011-04-01 false Adjustments for Additions and Withdrawals in the Computation of Rate of Return B Appendix B to Part 4 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION COMMODITY POOL OPERATORS AND COMMODITY TRADING ADVISORS Pt. 4, App....

  18. 17 CFR Appendix B to Part 4 - Adjustments for Additions and Withdrawals in the Computation of Rate of Return

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 17 Commodity and Securities Exchanges 1 2013-04-01 2013-04-01 false Adjustments for Additions and Withdrawals in the Computation of Rate of Return B Appendix B to Part 4 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION COMMODITY POOL OPERATORS AND COMMODITY TRADING ADVISORS Pt. 4, App....

  19. 17 CFR Appendix B to Part 4 - Adjustments for Additions and Withdrawals in the Computation of Rate of Return

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 17 Commodity and Securities Exchanges 1 2012-04-01 2012-04-01 false Adjustments for Additions and Withdrawals in the Computation of Rate of Return B Appendix B to Part 4 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION COMMODITY POOL OPERATORS AND COMMODITY TRADING ADVISORS Pt. 4, App....

  20. 17 CFR Appendix B to Part 4 - Adjustments for Additions and Withdrawals in the Computation of Rate of Return

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 17 Commodity and Securities Exchanges 1 2014-04-01 2014-04-01 false Adjustments for Additions and Withdrawals in the Computation of Rate of Return B Appendix B to Part 4 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION COMMODITY POOL OPERATORS AND COMMODITY TRADING ADVISORS Pt. 4, App....

  1. Low-Dose Chest Computed Tomography for Lung Cancer Screening Among Hodgkin Lymphoma Survivors: A Cost-Effectiveness Analysis

    SciTech Connect

    Wattson, Daniel A.; Hunink, M.G. Myriam; DiPiro, Pamela J.; Das, Prajnan; Hodgson, David C.; Mauch, Peter M.; Ng, Andrea K.

    2014-10-01

    Purpose: Hodgkin lymphoma (HL) survivors face an increased risk of treatment-related lung cancer. Screening with low-dose computed tomography (LDCT) may allow detection of early stage, resectable cancers. We developed a Markov decision-analytic and cost-effectiveness model to estimate the merits of annual LDCT screening among HL survivors. Methods and Materials: Population databases and HL-specific literature informed key model parameters, including lung cancer rates and stage distribution, cause-specific survival estimates, and utilities. Relative risks accounted for radiation therapy (RT) technique, smoking status (>10 pack-years or current smokers vs not), age at HL diagnosis, time from HL treatment, and excess radiation from LDCTs. LDCT assumptions, including expected stage-shift, false-positive rates, and likely additional workup were derived from the National Lung Screening Trial and preliminary results from an internal phase 2 protocol that performed annual LDCTs in 53 HL survivors. We assumed a 3% discount rate and a willingness-to-pay (WTP) threshold of $50,000 per quality-adjusted life year (QALY). Results: Annual LDCT screening was cost effective for all smokers. A male smoker treated with mantle RT at age 25 achieved maximum QALYs by initiating screening 12 years post-HL, with a life expectancy benefit of 2.1 months and an incremental cost of $34,841/QALY. Among nonsmokers, annual screening produced a QALY benefit in some cases, but the incremental cost was not below the WTP threshold for any patient subsets. As age at HL diagnosis increased, earlier initiation of screening improved outcomes. Sensitivity analyses revealed that the model was most sensitive to the lung cancer incidence and mortality rates and expected stage-shift from screening. Conclusions: HL survivors are an important high-risk population that may benefit from screening, especially those treated in the past with large radiation fields including mantle or involved-field RT. Screening

  2. Projections of costs, financing, and additional resource requirements for low- and lower middle-income country immunization programs over the decade, 2011-2020.

    PubMed

    Gandhi, Gian; Lydon, Patrick; Cornejo, Santiago; Brenzel, Logan; Wrobel, Sandra; Chang, Hugh

    2013-04-18

    The Decade of Vaccines Global Vaccine Action Plan has outlined a set of ambitious goals to broaden the impact and reach of immunization across the globe. A projections exercise has been undertaken to assess the costs, financing availability, and additional resource requirements to achieve these goals through the delivery of vaccines against 19 diseases across 94 low- and middle-income countries for the period 2011-2020. The exercise draws upon data from existing published and unpublished global forecasts, country immunization plans, and costing studies. A combination of an ingredients-based approach and use of approximations based on past spending has been used to generate vaccine and non-vaccine delivery costs for routine programs, as well as supplementary immunization activities (SIAs). Financing projections focused primarily on support from governments and the GAVI Alliance. Cost and financing projections are presented in constant 2010 US dollars (US$). Cumulative total costs for the decade are projected to be US$57.5 billion, with 85% for routine programs and the remaining 15% for SIAs. Delivery costs account for 54% of total cumulative costs, and vaccine costs make up the remainder. A conservative estimate of total financing for immunization programs is projected to be $34.3 billion over the decade, with country governments financing 65%. These projections imply a cumulative funding gap of $23.2 billion. About 57% of the total resources required to close the funding gap are needed just to maintain existing programs and scale up other currently available vaccines (i.e., before adding in the additional costs of vaccines still in development). Efforts to mobilize additional resources, manage program costs, and establish mutual accountability between countries and development partners will all be necessary to ensure the goals of the Decade of Vaccines are achieved. Establishing or building on existing mechanisms to more comprehensively track resources and

  3. Improving the precision and speed of Euler angles computation from low-cost rotation sensor data.

    PubMed

    Janota, Aleš; Šimák, Vojtech; Nemec, Dušan; Hrbček, Jozef

    2015-03-23

    This article compares three different algorithms used to compute Euler angles from data obtained by an angular rate sensor (e.g., a MEMS gyroscope): algorithms based on a rotational matrix, on transforming angular velocity to time derivatives of the Euler angles, and on a unit quaternion expressing rotation. The algorithms are compared by their computational efficiency and the accuracy of Euler angle estimation. If the attitude of the object is computed only from data obtained by the gyroscope, the quaternion-based algorithm seems most suitable (having similar accuracy to the matrix-based algorithm, but taking approx. 30% fewer clock cycles on an 8-bit microcomputer). Integration of the time derivatives of the Euler angles has a singularity and is therefore not accurate over the full range of the object's attitude. Since the error in every real gyroscope system tends to increase with time due to its offset and thermal drift, we also propose some compensation measures based on additional sensors (a magnetic compass and accelerometer). Vector data from these secondary sensors must be transformed into the inertial frame of reference. While transformation of a vector by the matrix is slightly faster than by the quaternion, a compensated sensor system utilizing the matrix-based algorithm can be approximately 10% faster than a system utilizing quaternions (depending on implementation and hardware).
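    The quaternion-based approach compared in this record can be sketched as follows. The function names, the simple first-order integrator, and the ZYX (yaw-pitch-roll) convention are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def quat_mult(q, r):
    """Hamilton product of two quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2])

def gyro_step(q, omega, dt):
    """First-order update q <- q + 0.5 * q ⊗ (0, ω) * dt, renormalized."""
    dq = 0.5 * quat_mult(q, np.array([0.0, *omega])) * dt
    q = q + dq
    return q / np.linalg.norm(q)

def quat_to_euler(q):
    """Unit quaternion to (roll, pitch, yaw), ZYX convention."""
    w, x, y, z = q
    roll = np.arctan2(2*(w*x + y*z), 1 - 2*(x*x + y*y))
    pitch = np.arcsin(np.clip(2*(w*y - z*x), -1.0, 1.0))
    yaw = np.arctan2(2*(w*z + x*y), 1 - 2*(y*y + z*z))
    return roll, pitch, yaw
```

    Note that `quat_to_euler` still contains the pitch singularity at ±90°; the point of the quaternion representation is that the *integration* step itself is singularity-free, with the Euler conversion done only when angles are needed.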

  4. Improving the Precision and Speed of Euler Angles Computation from Low-Cost Rotation Sensor Data

    PubMed Central

    Janota, Aleš; Šimák, Vojtech; Nemec, Dušan; Hrbček, Jozef

    2015-01-01

    This article compares three different algorithms used to compute Euler angles from data obtained by an angular rate sensor (e.g., a MEMS gyroscope): algorithms based on a rotational matrix, on transforming angular velocity to time derivatives of the Euler angles, and on a unit quaternion expressing rotation. The algorithms are compared by their computational efficiency and the accuracy of Euler angle estimation. If the attitude of the object is computed only from data obtained by the gyroscope, the quaternion-based algorithm seems most suitable (having similar accuracy to the matrix-based algorithm, but taking approx. 30% fewer clock cycles on an 8-bit microcomputer). Integration of the time derivatives of the Euler angles has a singularity and is therefore not accurate over the full range of the object's attitude. Since the error in every real gyroscope system tends to increase with time due to its offset and thermal drift, we also propose some compensation measures based on additional sensors (a magnetic compass and accelerometer). Vector data from these secondary sensors must be transformed into the inertial frame of reference. While transformation of a vector by the matrix is slightly faster than by the quaternion, a compensated sensor system utilizing the matrix-based algorithm can be approximately 10% faster than a system utilizing quaternions (depending on implementation and hardware). PMID:25806874

  5. Subsonic flutter analysis addition to NASTRAN. [for use with CDC 6000 series digital computers

    NASA Technical Reports Server (NTRS)

    Doggett, R. V., Jr.; Harder, R. L.

    1973-01-01

    A subsonic flutter analysis capability has been developed for NASTRAN, and a developmental version of the program has been installed on the CDC 6000 series digital computers at the Langley Research Center. The flutter analysis is of the modal type, uses doublet lattice unsteady aerodynamic forces, and solves the flutter equations by using the k-method. Surface and one-dimensional spline functions are used to transform from the aerodynamic degrees of freedom to the structural degrees of freedom. Some preliminary applications of the method to a beamlike wing, a platelike wing, and a platelike wing with a folded tip are compared with existing experimental and analytical results.

  6. A low-computational-cost inverse heat transfer technique for convective heat transfer measurements in hypersonic flows

    NASA Astrophysics Data System (ADS)

    Avallone, F.; Greco, C. S.; Schrijer, F. F. J.; Cardone, G.

    2015-04-01

    The measurement of the convective wall heat flux in hypersonic flows may be particularly challenging in the presence of high-temperature gradients and when using high-thermal-conductivity materials. In this case, the solution of multidimensional problems is necessary, but it considerably increases the computational cost. In this paper, a low-computational-cost inverse data reduction technique is presented. It uses a recursive least-squares approach in combination with the trust-region-reflective algorithm as optimization procedure. The computational cost is reduced by performing the discrete Fourier transform on the discrete convective heat flux function and by identifying the most relevant coefficients as objects of the optimization algorithm. In the paper, the technique is validated by means of both synthetic data, built in order to reproduce physical conditions, and experimental data, carried out in the Hypersonic Test Facility Delft at Mach 7.5 on two wind tunnel models having different thermal properties.
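    The cost-reduction idea in this record (representing the unknown heat flux by a handful of Fourier coefficients and optimizing only those with a trust-region-reflective solver) can be sketched as below. This is a heavily simplified illustration: the heat-conduction forward model is omitted, so the residual compares the flux function directly against synthetic "measurements", and all names are assumptions:

```python
import numpy as np
from scipy.optimize import least_squares

def flux_from_coeffs(c, t):
    """Truncated Fourier series q(t); c = [a0, a1, b1, a2, b2, ...]."""
    period = t[-1] - t[0]
    q = np.full_like(t, c[0])
    for k in range(1, (len(c) - 1) // 2 + 1):
        w = 2.0 * np.pi * k / period
        q += c[2*k - 1] * np.cos(w * t) + c[2*k] * np.sin(w * t)
    return q

def invert_flux(t, measured, n_harmonics=3):
    """Recover the few dominant Fourier coefficients by nonlinear least
    squares using the trust-region-reflective algorithm ('trf')."""
    def residual(c):
        return flux_from_coeffs(c, t) - measured
    c0 = np.zeros(1 + 2 * n_harmonics)      # small parameter vector
    return least_squares(residual, c0, method='trf').x
```

    Optimizing a short coefficient vector instead of the full discretized flux history is what keeps the inverse problem cheap; in the actual technique the residual would pass through the conduction model before comparison with wall-temperature data.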

  7. Postprocessing of Voxel-Based Topologies for Additive Manufacturing Using the Computational Geometry Algorithms Library (CGAL)

    DTIC Science & Technology

    2015-06-01

    Additive manufacturing builds a structure up in layers. Typically, additive manufacturing devices (e.g., 3-dimensional [3-D] printers) use the stereolithography (STL) file format. The workflow begins with a standard, voxel-based topology optimization scheme and ends with an STL file ready for use in a 3-D printer or other additive manufacturing device. Cited resource: CGAL 4.6 - 3D alpha shapes. 2015 [accessed 2015 May 18]. http://doc.cgal.org/latest/Alpha_shapes_3/index.html#Chapter_3D_Alpha_Shapes

  8. Addition of higher order plate and shell elements into NASTRAN computer program

    NASA Technical Reports Server (NTRS)

    Narayanaswami, R.; Goglia, G. L.

    1976-01-01

    Two higher order plate elements, the linear strain triangular membrane element and the quintic bending element, along with a shallow shell element, suitable for inclusion into the NASTRAN (NASA Structural Analysis) program are described. Additions to the NASTRAN Theoretical Manual, Users' Manual, Programmers' Manual and the NASTRAN Demonstration Problem Manual, for inclusion of these elements into the NASTRAN program are also presented.

  9. Grading Multiple Choice Exams with Low-Cost and Portable Computer-Vision Techniques

    NASA Astrophysics Data System (ADS)

    Fisteus, Jesus Arias; Pardo, Abelardo; García, Norberto Fernández

    2013-08-01

    Although technology for automatic grading of multiple choice exams has existed for several decades, it is not yet as widely available or affordable as it should be. The main reasons preventing this adoption are the cost and the complexity of the setup procedures. In this paper, Eyegrade, a system for automatic grading of multiple choice exams is presented. While most current solutions are based on expensive scanners, Eyegrade offers a truly low-cost solution requiring only a regular off-the-shelf webcam. Additionally, Eyegrade performs both mark recognition as well as optical character recognition of handwritten student identification numbers, which avoids the use of bubbles in the answer sheet. When compared with similar webcam-based systems, the user interface in Eyegrade has been designed to provide a more efficient and error-free data collection procedure. The tool has been validated with a set of experiments that show the ease of use (both setup and operation), the reduction in grading time, and an increase in the reliability of the results when compared with conventional, more expensive systems.

  10. Addition of visual noise boosts evoked potential-based brain-computer interface

    PubMed Central

    Xie, Jun; Xu, Guanghua; Wang, Jing; Zhang, Sicong; Zhang, Feng; Li, Yeping; Han, Chengcheng; Li, Lili

    2014-01-01

    Although noise has a proven beneficial role in brain functions, there have not been any attempts to apply the stochastic resonance effect in neural engineering applications, especially in research on brain-computer interfaces (BCIs). In our study, a steady-state motion visual evoked potential (SSMVEP)-based BCI with periodic visual stimulation plus moderate spatiotemporal noise achieved better offline and online performance due to enhancement of the periodic components in brain responses, accompanied by suppression of high harmonics. Offline results exhibited a bell-shaped, resonance-like dependence, and 7–36% online performance improvements were achieved when identical visual noise was adopted for different stimulation frequencies. Using neural encoding modeling, these phenomena can be explained as noise-induced input-output synchronization in human sensory systems, which commonly possess a low-pass property. Our work demonstrates that noise can boost BCIs in addressing human needs. PMID:24828128

  11. Efficient method for computing the maximum-likelihood quantum state from measurements with additive Gaussian noise.

    PubMed

    Smolin, John A; Gambetta, Jay M; Smith, Graeme

    2012-02-17

    We provide an efficient method for computing the maximum-likelihood mixed quantum state (with density matrix ρ) given a set of measurement outcomes in a complete orthonormal operator basis subject to Gaussian noise. Our method works by first changing basis yielding a candidate density matrix μ which may have nonphysical (negative) eigenvalues, and then finding the nearest physical state under the 2-norm. Our algorithm takes at worst O(d^4) for the basis change plus O(d^3) for finding ρ, where d is the dimension of the quantum state. In the special case where the measurement basis is strings of Pauli operators, the basis change takes only O(d^3) as well. The workhorse of the algorithm is a new linear-time method for finding the closest probability distribution (in Euclidean distance) to a set of real numbers summing to one.
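    The projection step described in the abstract (finding the nearest physical state under the 2-norm by truncating negative eigenvalues and redistributing their weight) can be sketched as follows. This is an illustrative reading of the abstract, assuming a trace-one Hermitian input, not the authors' exact code:

```python
import numpy as np

def nearest_physical_state(mu):
    """Project a trace-one Hermitian matrix mu (possibly with negative
    eigenvalues) onto the closest density matrix under the 2-norm."""
    lam, V = np.linalg.eigh(mu)          # eigenvalues in ascending order
    lam = lam[::-1]                      # work in descending order
    V = V[:, ::-1]
    d = len(lam)
    acc = 0.0                            # accumulated negative "mass"
    out = np.zeros(d)
    for i in range(d - 1, -1, -1):       # from the smallest eigenvalue up
        if lam[i] + acc / (i + 1) < 0:
            acc += lam[i]                # zero it out, carry the deficit
        else:
            # spread the deficit evenly over the remaining eigenvalues
            out[:i + 1] = lam[:i + 1] + acc / (i + 1)
            break
    return (V * out) @ V.conj().T        # rebuild the density matrix
```

    After the O(d^3) eigendecomposition, the eigenvalue loop itself is linear in d, matching the "linear-time closest probability distribution" step the abstract highlights.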

  12. Large Advanced Space Systems (LASS) computer-aided design program additions

    NASA Technical Reports Server (NTRS)

    Farrell, C. E.

    1982-01-01

    The LSS preliminary and conceptual design requires extensive iterative analysis because of the effects of structural, thermal, and control intercoupling. A computer-aided design program that permits integrating and interfacing the required large space system (LSS) analyses is discussed. The primary objective of this program is the implementation of modeling techniques and analysis algorithms that permit interactive design and tradeoff studies of LSS concepts. Eight software modules were added to the program. The existing rigid-body controls module was modified to include solar pressure effects. The new model generator modules and appendage synthesizer module are integrated (interfaced) to permit interactive definition and generation of LSS concepts. The mass properties module permits interactive specification of discrete masses and their locations. The other modules permit interactive analysis of orbital transfer requirements, antenna primary beam, and attitude control requirements.

  13. Phase Transition in Computing Cost of Overconstrained NP-Complete 3-SAT Problems

    NASA Astrophysics Data System (ADS)

    Woodson, Adam; O'Donnell, Thomas; Maniloff, Peter

    2002-03-01

    Many intractable, NP-Complete problems such as Traveling Salesman (TSP) and 3-Satisfiability (3-Sat), which arise in hundreds of computer science, industrial and commercial applications, are now known to exhibit phase transitions in computational cost. While these problems appear to not have any structure which would make them amenable to attack with quantum computing, their critical behavior may allow physical insights derived from statistical mechanics and critical theory to shed light on these computationally ``hardest'' of problems. While computational theory indicates that ``the intractability of the NP-Complete class resides solely in the exponential growth of the possible solutions'' with the number of variables, n, the present work instead investigates the complex patterns of ``overlap'' amongst 3-SAT clauses (their combined effects) when n-tuples of these act in succession to reduce the space of valid solutions. An exhaustive-search algorithm was used to eliminate `bad' states from amongst the `good' states residing within the spaces of all 2^n possible solutions of randomly generated 3-Sat problems. No backtracking nor optimization heuristics were employed, nor was problem structure exploited (i.e., typical cases were generated), and the (k=3)-Sat propositional logic problems generated were in standard conjunctive normal form (CNF). Each problem had an effectively infinite number of clauses, m (i.e., with r = m/n >= 10), to ensure every problem would not be satisfiable (i.e., that each would fail), and duplicate clauses were not permitted. This process was repeated for each of several low values of n (i.e., 4 <= n <= 20). The entire history of solution-states elimination as successive clauses were applied was archived until, in each instance, sufficient clauses were applied to kill all possible solutions. An asymmetric, sigmoid-shaped phase transition is observed in Fg = F_g(m'/n), the fraction of the original 2^n ``good'' solutions remaining valid as a
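The elimination procedure described above (random CNF clauses applied in succession, no duplicates, tracking the fraction F_g of surviving assignments) can be sketched for small n. This is a hedged reconstruction from the abstract alone, not the authors' code; names and parameters are illustrative:

```python
import itertools
import random

def random_3sat_fraction(n, m, seed=0):
    """Apply up to m random distinct 3-SAT clauses over n variables and
    record the fraction of the 2^n assignments still satisfying all of
    them after each clause. Stops early if no assignment survives."""
    rng = random.Random(seed)
    alive = set(itertools.product([False, True], repeat=n))
    fractions, seen = [], set()
    while len(fractions) < m and alive:
        vars_ = tuple(rng.sample(range(n), 3))
        signs = tuple(rng.choice([False, True]) for _ in range(3))
        if (vars_, signs) in seen:          # no duplicate clauses
            continue
        seen.add((vars_, signs))
        # keep only assignments satisfying at least one literal
        alive = {a for a in alive
                 if any(a[v] == s for v, s in zip(vars_, signs))}
        fractions.append(len(alive) / 2 ** n)
    return fractions
```

Plotting the returned fractions against m'/n for several n is the kind of data behind the sigmoid-shaped transition the abstract reports.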

  14. Enantioselective conjugate addition of nitro compounds to α,β-unsaturated ketones: an experimental and computational study.

    PubMed

    Manzano, Rubén; Andrés, José M; Álvarez, Rosana; Muruzábal, María D; de Lera, Ángel R; Pedrosa, Rafael

    2011-05-16

    A series of chiral thioureas derived from easily available diamines, prepared from α-amino acids, have been tested as catalysts in the enantioselective Michael additions of nitroalkanes to α,β-unsaturated ketones. The best results are obtained with the bifunctional catalyst prepared from L-valine. This thiourea promotes the reaction with high enantioselectivities and chemical yields for aryl/vinyl ketones, but the enantiomeric ratio for alkyl/vinyl derivatives is very modest. The addition of substituted nitromethanes led to the corresponding adducts with excellent enantioselectivity but very poor diastereoselectivity. Evidence for the isomerization of the addition products has been obtained from the reaction of chalcone with [D3]nitromethane, which shows that the final addition products epimerize under the reaction conditions. The epimerization explains the low diastereoselectivity observed in the formation of adducts with two adjacent tertiary stereocenters. Density functional studies of the transition structures corresponding to two alternative activation modes of the nitroalkanes and α,β-unsaturated ketones by the bifunctional organocatalyst have been carried out at the B3LYP/3-21G* level. The computations are consistent with a reaction model involving the Michael addition of the thiourea-activated nitronate to the ketone activated by the protonated amine of the organocatalyst. The enantioselectivities predicted by the computations are consistent with the experimental values obtained for aryl- and alkyl-substituted α,β-unsaturated ketones.

  15. Definition and computation of intermolecular contact in liquids using additively weighted Voronoi tessellation.

    PubMed

    Isele-Holder, Rolf E; Rabideau, Brooks D; Ismail, Ahmed E

    2012-05-10

    We present a definition of intermolecular surface contact by applying weighted Voronoi tessellations to configurations of various organic liquids and water obtained from molecular dynamics simulations. This definition of surface contact is used to link the COSMO-RS model and molecular dynamics simulations. We demonstrate that additively weighted tessellation is the superior tessellation type to define intermolecular surface contact. Furthermore, we fit a set of weights for the elements C, H, O, N, F, and S for this tessellation type to obtain optimal agreement between the models. We use these radii to successfully predict contact statistics for compounds that were excluded from the fit and mixtures. The observed agreement between contact statistics from COSMO-RS and molecular dynamics simulations confirms the capability of the presented method to describe intermolecular contact. Furthermore, we observe that increasing polarity of the surfaces of the examined molecules leads to weaker agreement in the contact statistics. This is especially pronounced for pure water.
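The additively weighted distance underlying this tessellation assigns a point p to the site i minimizing |p − c_i| − w_i, so elements with larger fitted weights claim larger cells. A small NumPy sketch of that assignment rule (the function name is hypothetical, and this labels sample points rather than constructing the full tessellation):

```python
import numpy as np

def aw_voronoi_labels(points, centers, weights):
    """Label each sample point with the index of its additively
    weighted Voronoi cell: argmin_i |p - c_i| - w_i."""
    # pairwise Euclidean distances via broadcasting: (npoints, nsites)
    d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
    return np.argmin(d - weights[None, :], axis=1)
```

With atomic centers as sites and the fitted per-element weights, counting adjacent labels across molecular surfaces gives the kind of contact statistics the paper compares against COSMO-RS.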

  16. 12 CFR Appendix L to Part 1026 - Assumed Loan Periods for Computations of Total Annual Loan Cost Rates

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 8 2012-01-01 2012-01-01 false Assumed Loan Periods for Computations of Total Annual Loan Cost Rates L Appendix L to Part 1026 Banks and Banking BUREAU OF CONSUMER FINANCIAL PROTECTION TRUTH IN LENDING (REGULATION Z) Pt. 1026, App. L Appendix L to Part 1026—Assumed Loan Periods...

  17. 12 CFR Appendix L to Part 1026 - Assumed Loan Periods for Computations of Total Annual Loan Cost Rates

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 8 2013-01-01 2013-01-01 false Assumed Loan Periods for Computations of Total Annual Loan Cost Rates L Appendix L to Part 1026 Banks and Banking BUREAU OF CONSUMER FINANCIAL PROTECTION TRUTH IN LENDING (REGULATION Z) Pt. 1026, App. L Appendix L to Part 1026—Assumed Loan Periods...

  18. A Low-Cost Computer-Controlled Arduino-Based Educational Laboratory System for Teaching the Fundamentals of Photovoltaic Cells

    ERIC Educational Resources Information Center

    Zachariadou, K.; Yiasemides, K.; Trougkakos, N.

    2012-01-01

    We present a low-cost, fully computer-controlled, Arduino-based, educational laboratory (SolarInsight) to be used in undergraduate university courses concerned with electrical engineering and physics. The major goal of the system is to provide students with the necessary instrumentation, software tools and methodology in order to learn fundamental…

  19. Effectiveness and Cost Benefits of Computer-Based Decision Aids for Equipment Maintenance

    DTIC Science & Technology

    2003-02-01

    Computer-based assessment of problem solving. Computers in Human Behavior, 15, 269–282. Booher, H.R. (1978). Job performance aids: Research and...Perspectives on computer-based performance assessment of problem solving. Computers in Human Behavior, 15, 255–268. Orasanu, J., and Connolly, T. (1993). The

  20. Afghanistan Drawdown Preparations: DOD Decision Makers Need Additional Analyses to Determine Costs and Benefits of Returning Excess Equipment

    DTIC Science & Technology

    2012-12-19

    Major end items are equipment that is important to operational readiness such as aircraft; boats; motorized wheeled, tracked, and towed vehicles...Process (cont.) Loader, Scoop Type (July 2012 Playbook, p. 292) There are 155 Marine Corps Scoop Type Loaders in Afghanistan, all of which are... loaders are determined to be excess when the disposition instructions are issued, the transportation cost for the return of these loaders could range

  1. EPA evaluation of the SYNERGY-1 fuel additive under Section 511 of the Motor Vehicle Information and Cost Savings Act. Technical report

    SciTech Connect

    Syria, S.L.

    1981-06-01

    This document announces the conclusions of the EPA evaluation of the 'SYNERGY-1' device under provisions of Section 511 of the Motor Vehicle Information and Cost Savings Act. This additive is intended to improve fuel economy and exhaust emission levels of two and four cycle gasoline fueled engines.

  2. Low cost computer subsystem for the Solar Electric Propulsion Stage (SEPS)

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The Solar Electric Propulsion Stage (SEPS) subsystem which consists of the computer, custom input/output (I/O) unit, and tape recorder for mass storage of telemetry data was studied. Computer software and interface requirements were developed along with computer and I/O unit design parameters. Redundancy implementation was emphasized. Reliability analysis was performed for the complete command computer subsystem. A SEPS fault tolerant memory breadboard was constructed and its operation demonstrated.

  3. Commonsense System Pricing; Or, How Much Will that $1,200 Computer Really Cost?

    ERIC Educational Resources Information Center

    Crawford, Walt

    1984-01-01

    Three methods employed to price and sell computer equipment are discussed: computer pricing, hardware pricing, system pricing (system includes complete computer and support hardware system and relatively complete software package). Advantages of system pricing are detailed, the author's system is described, and 10 systems currently available are…

  4. Selecting an Architecture for a Safety-Critical Distributed Computer System with Power, Weight and Cost Considerations

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2014-01-01

    This report presents an example of the application of multi-criteria decision analysis to the selection of an architecture for a safety-critical distributed computer system. The design problem includes constraints on minimum system availability and integrity, and the decision is based on the optimal balance of power, weight and cost. The analysis process includes the generation of alternative architectures, evaluation of individual decision criteria, and the selection of an alternative based on overall value. In the example presented here, iterative application of the quantitative evaluation process made it possible to deliberately generate an alternative architecture that is superior to all others regardless of the relative importance of cost.
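The "overall value" balancing power, weight and cost can, in the simplest multi-criteria setup, be modeled as a weighted sum of normalized criterion scores. A sketch under that assumption (the report's actual value function may differ; all names here are illustrative):

```python
def weighted_value(scores, weights):
    """Overall value of one alternative: weighted sum of normalized
    criterion scores (e.g., power, weight, cost), weights summing to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(s * w for s, w in zip(scores, weights))

def best_alternative(alternatives, weights):
    """Pick the (name, scores) pair with the highest overall value."""
    return max(alternatives, key=lambda kv: weighted_value(kv[1], weights))
```

Sweeping the weight vector, as the report's sensitivity analysis does, reveals whether one architecture dominates regardless of the relative importance assigned to cost.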

  5. Theoretical thermodynamics for large molecules: walking the thin line between accuracy and computational cost.

    PubMed

    Schwabe, Tobias; Grimme, Stefan

    2008-04-01

    The thermodynamic properties of molecules are of fundamental interest in physics, chemistry, and biology. This Account deals with the developments that we have made in about the last five years to find quantum chemical electronic structure methods that have the prospect of being applicable to larger molecules. The typical target accuracy is about 0.5-1 kcal/mol for chemical reactions and 0.1 kcal/mol for conformational energies. These goals can be achieved when a few physically motivated corrections to first-principles methods are introduced to standard quantum chemical techniques. These do not lead to a significantly increased computational expense, and thus our methods have the computer hardware requirements of the corresponding standard treatments. Together with the use of density-fitting (RI) integral approximations, routine computations on systems with about 100 non-hydrogen atoms (2000-4000 basis functions) can be performed on modern PCs. Our improvements regarding accuracy are basically due to the use of modified second-order perturbation theory to account for many-particle (electron correlation) effects. Such nonlocal correlations are responsible for important parts of the interaction in and between atoms and molecules. A common example is the long-range dispersion interaction that leads to van der Waals complexes, but as shown here also the conventional thermodynamics of large molecules is significantly influenced by intramolecular dispersion effects. We first present the basic theoretical ideas behind our approaches, which are the spin-component-scaled Møller-Plesset perturbation theory (SCS-MP2) and double-hybrid density functionals (DHDF). Furthermore, the effect of the independently developed empirical dispersion correction (DFT-D) is discussed. Together with the use of large atomic orbital basis sets (of at least triple- or quadruple-zeta quality), the accuracy of the new methods is even competitive with computationally very expensive coupled
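The SCS-MP2 approach mentioned above rescales the two spin components of the MP2 correlation energy separately; with Grimme's standard factors (6/5 for opposite-spin pairs, 1/3 for same-spin pairs) the combination reduces to one line. A minimal sketch, taking the component energies as given inputs:

```python
def scs_mp2_correlation(e_os, e_ss, c_os=6.0 / 5.0, c_ss=1.0 / 3.0):
    """Spin-component-scaled MP2 correlation energy:
    E_corr = c_os * E_os + c_ss * E_ss, with Grimme's standard
    scaling factors as defaults (energies in hartree)."""
    return c_os * e_os + c_ss * e_ss
```

The computational cost is unchanged relative to plain MP2, since the two spin components are already available from a standard calculation; only their weighting differs.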

  6. Low-cost computing and network communication for a point-of-care device to perform a 3-part leukocyte differential

    NASA Astrophysics Data System (ADS)

    Powless, Amy J.; Feekin, Lauren E.; Hutcheson, Joshua A.; Alapat, Daisy V.; Muldoon, Timothy J.

    2016-03-01

    Point-of-care approaches for 3-part leukocyte differentials (granulocyte, monocyte, and lymphocyte), traditionally performed using a hematology analyzer within a panel of tests called a complete blood count (CBC), are essential not only to reduce cost but to provide faster results in low resource areas. Recent developments in lab-on-a-chip devices have shown promise in reducing the size and reagents used, relating to a decrease in overall cost. Furthermore, smartphone diagnostic approaches have shown much promise in the area of point-of-care diagnostics, but the relatively high per-unit cost may limit their utility in some settings. We present here a method to reduce computing cost of a simple epi-fluorescence imaging system using a Raspberry Pi (single-board computer, <$40) to perform a 3-part leukocyte differential comparable to results from a hematology analyzer. This system uses a USB color camera in conjunction with a leukocyte-selective vital dye (acridine orange) in order to determine a leukocyte count and differential from a low volume (<20 microliters) of whole blood obtained via fingerstick. Additionally, the system utilizes a "cloud-based" approach to send image data from the Raspberry Pi to a main server and return results back to the user, exporting the bulk of the computational requirements. Six images were acquired per minute with up to 200 cells per field of view. Preliminary results showed that the differential count varied significantly in monocytes with a 1 minute time difference, indicating the importance of time-gating to produce an accurate/consistent differential.

  7. Low-Cost Magnetic Stirrer from Recycled Computer Parts with Optional Hot Plate

    ERIC Educational Resources Information Center

    Guidote, Armando M., Jr.; Pacot, Giselle Mae M.; Cabacungan, Paul M.

    2015-01-01

    Magnetic stirrers and hot plates are key components of science laboratories. However, these are not readily available in many developing countries due to their high cost. This article describes the design of a low-cost magnetic stirrer with hot plate from recycled materials. Some of the materials used are neodymium magnets and CPU fans from…

  8. Cost-effectiveness of SQ® HDM SLIT-tablet in addition to pharmacotherapy for the treatment of house dust mite allergic rhinitis in Germany

    PubMed Central

    Green, William; Kleine-Tebbe, Jörg; Klimek, Ludger; Hahn-Pedersen, Julie; Nørgaard Andreasen, Jakob; Taylor, Matthew

    2017-01-01

    Background Allergic rhinitis is a global health problem that burdens society due to associated health care costs and its impact on health. Standardized quality (SQ®) house dust mite (HDM) sublingual immunotherapy (SLIT)-tablet is a sublingually administered allergy immunotherapy tablet for patients with persistent moderate to severe HDM allergic rhinitis despite use of allergy pharmacotherapy. Objective To assess the cost-effectiveness of SQ HDM SLIT-tablet in Germany for patients suffering from HDM allergic rhinitis. Methods A pharmacoeconomic analysis, based on data collected in a double-blinded, phase III randomized placebo-controlled trial (n=992), was undertaken to compare SQ HDM SLIT-tablet in addition to allergy pharmacotherapy to placebo plus allergy pharmacotherapy. Quality-adjusted life year (QALY) scores and health care resource use data recorded in the trial were applied to each treatment group and extrapolated over a nine-year time horizon. A series of scenarios were used to investigate the impact of changes on long-term patient health for both treatment groups, which was measured by annual changes in QALY scores. Deterministic and probabilistic sensitivity analyses were also performed. Results In the base case analysis, compared with allergy pharmacotherapy, SQ HDM SLIT-tablet led to a QALY gain of 0.31 at an incremental cost of €2,276 over the nine-year time horizon, equating to an incremental cost-effectiveness ratio of €7,519. The treatment was cost-effective for all scenarios analyzed; however, results were sensitive to changes in individual parameter values during the deterministic sensitivity analysis. Conclusion SQ HDM SLIT-tablet in addition to pharmacotherapy is cost-effective compared with allergy pharmacotherapy plus placebo for the treatment of persistent moderate to severe HDM allergic rhinitis that is not well controlled by allergy pharmacotherapy. PMID:28243132
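The headline figure in this abstract is an incremental cost-effectiveness ratio: the extra cost divided by the extra QALYs of the new treatment over the comparator. A one-line sketch (note that the rounded figures quoted, €2,276 and 0.31 QALYs, give roughly €7,342 per QALY, so the reported €7,519 presumably comes from unrounded trial values):

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: incremental cost per
    QALY gained by the new treatment over the comparator."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)
```

A treatment is typically called cost-effective when its ICER falls below the willingness-to-pay threshold used in the relevant health system.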

  9. Additive manufacturing of liquid/gas diffusion layers for low-cost and high-efficiency hydrogen production

    SciTech Connect

    Mo, Jingke; Zhang, Feng -Yuan; Dehoff, Ryan R.; Peter, William H.; Toops, Todd J.; Green, Jr., Johney Boyd

    2016-01-14

    The electron beam melting (EBM) additive manufacturing technology was used to fabricate, for the first time, titanium liquid/gas diffusion media with high corrosion resistance and well-controllable multifunctional parameters, including two-phase transport and excellent electric/thermal conductivities. Their applications in proton exchange membrane electrolyzer cells have been explored in-situ in a cell and characterized ex-situ with SEM and XRD. Compared with the conventional woven liquid/gas diffusion layers (LGDLs), much better performance with EBM fabricated LGDLs is obtained due to their significant reduction of ohmic loss. The EBM technology exhibited several distinguished advantages in fabricating gas diffusion layers: well-controllable pore morphology and structure, rapid prototyping, fast manufacturing, high customizability, and economy. In addition, by taking advantage of additive manufacturing, it is possible to fabricate complicated three-dimensional designs of virtually any shape from a digital model into one single solid object faster, cheaper and easier, especially for titanium. More importantly, this development will provide LGDLs with control of pore size, pore shape, pore distribution, and therefore porosity and permeability, which will be very valuable to develop modeling and to validate simulations of electrolyzers with optimal and repeatable performance. Further, it will lead to a manufacturing solution to greatly simplify the PEMEC/fuel cell components and to couple the LGDLs with other parts, since they can be easily integrated together with this advanced manufacturing process.

  10. Additive manufacturing of liquid/gas diffusion layers for low-cost and high-efficiency hydrogen production

    DOE PAGES

    Mo, Jingke; Zhang, Feng -Yuan; Dehoff, Ryan R.; ...

    2016-01-14

    The electron beam melting (EBM) additive manufacturing technology was used to fabricate, for the first time, titanium liquid/gas diffusion media with high corrosion resistance and well-controllable multifunctional parameters, including two-phase transport and excellent electric/thermal conductivities. Their applications in proton exchange membrane electrolyzer cells have been explored in-situ in a cell and characterized ex-situ with SEM and XRD. Compared with the conventional woven liquid/gas diffusion layers (LGDLs), much better performance with EBM fabricated LGDLs is obtained due to their significant reduction of ohmic loss. The EBM technology exhibited several distinguished advantages in fabricating gas diffusion layers: well-controllable pore morphology and structure, rapid prototyping, fast manufacturing, high customizability, and economy. In addition, by taking advantage of additive manufacturing, it is possible to fabricate complicated three-dimensional designs of virtually any shape from a digital model into one single solid object faster, cheaper and easier, especially for titanium. More importantly, this development will provide LGDLs with control of pore size, pore shape, pore distribution, and therefore porosity and permeability, which will be very valuable to develop modeling and to validate simulations of electrolyzers with optimal and repeatable performance. Further, it will lead to a manufacturing solution to greatly simplify the PEMEC/fuel cell components and to couple the LGDLs with other parts, since they can be easily integrated together with this advanced manufacturing process.

  11. The application of cloud computing to scientific workflows: a study of cost and performance.

    PubMed

    Berriman, G Bruce; Deelman, Ewa; Juve, Gideon; Rynge, Mats; Vöckler, Jens-S

    2013-01-28

    The current model of transferring data from data centres to desktops for analysis will soon be rendered impractical by the accelerating growth in the volume of science datasets. Processing will instead often take place on high-performance servers co-located with data. Evaluations of how new technologies such as cloud computing would support such a new distributed computing model are urgently needed. Cloud computing is a new way of purchasing computing and storage resources on demand through virtualization technologies. We report here the results of investigations of the applicability of commercial cloud computing to scientific computing, with an emphasis on astronomy, including investigations of what types of applications can be run cheaply and efficiently on the cloud, and an example of an application well suited to the cloud: processing a large dataset to create a new science product.

  12. Can low-cost VOR and Omega receivers suffice for RNAV - A new computer-based navigation technique

    NASA Technical Reports Server (NTRS)

    Hollaar, L. A.

    1978-01-01

    It is shown that although RNAV is particularly valuable for the personal transportation segment of general aviation, it has not gained complete acceptance. This is due, in part, to its high cost and the special handling required by air traffic control. VOR/DME RNAV calculations are ideally suited for analog computers, and the use of microprocessor technology has been suggested for reducing RNAV costs. Three navigation systems, VOR, Omega, and DR, are compared for common navigational difficulties, such as station geometry, siting errors, ground disturbances, and terminal area coverage. The Kalman filtering technique is described with reference to the disadvantages when using a system including standard microprocessors. An integrated navigation system, using input data from various low-cost sensor systems, is presented and current simulation studies are noted.
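The Kalman filtering mentioned here as the sensor-blending technique can be illustrated by its simplest scalar form: fusing repeated noisy position fixes (e.g., from a low-cost VOR or Omega receiver) under a constant-position model. A sketch with illustrative noise parameters, not the paper's actual filter design:

```python
def kalman_1d(measurements, x0, p0, q, r):
    """Minimal scalar Kalman filter. x0/p0: initial state estimate and
    variance; q: process-noise variance; r: measurement-noise variance.
    Returns the filtered estimate after each measurement."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p += q                  # predict: process noise inflates variance
        k = p / (p + r)         # Kalman gain: trust in the new fix
        x += k * (z - x)        # update with the measurement residual
        p *= (1.0 - k)          # posterior variance shrinks
        estimates.append(x)
    return estimates
```

The integrated navigation system in the paper extends this idea to a multi-sensor state vector, which is where the computational burden on 1970s microprocessors became the limiting factor.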

  13. Cost analysis of non-invasive fractional flow reserve derived from coronary computed tomographic angiography in Japan.

    PubMed

    Kimura, Takeshi; Shiomi, Hiroki; Kuribayashi, Sachio; Isshiki, Takaaki; Kanazawa, Susumu; Ito, Hiroshi; Ikeda, Shunya; Forrest, Ben; Zarins, Christopher K; Hlatky, Mark A; Norgaard, Bjarne L

    2015-01-01

    Percutaneous coronary intervention (PCI) based on fractional flow reserve (FFRcath) measurement during invasive coronary angiography (CAG) results in improved patient outcome and reduced healthcare costs. FFR can now be computed non-invasively from standard coronary CT angiography (cCTA) scans (FFRCT). The purpose of this study is to determine the potential impact of non-invasive FFRCT on costs and clinical outcomes of patients with suspected coronary artery disease in Japan. Clinical data from 254 patients in the HeartFlowNXT trial, costs of goods and services in Japan, and clinical outcome data from the literature were used to estimate the costs and outcomes of 4 clinical pathways: (1) CAG-visual guided PCI, (2) CAG-FFRcath guided PCI, (3) cCTA followed by CAG-visual guided PCI, (4) cCTA-FFRCT guided PCI. The CAG-visual strategy demonstrated the highest projected cost ($10,360) and highest projected 1-year death/myocardial infarction rate (2.4 %). An assumed price for FFRCT of US $2,000 produced equivalent clinical outcomes (death/MI rate: 1.9 %) and healthcare costs ($7,222) for the cCTA-FFRCT strategy and the CAG-FFRcath guided PCI strategy. Use of the cCTA-FFRCT strategy to select patients for PCI would result in 32 % lower costs and 19 % fewer cardiac events at 1 year compared to the most commonly used CAG-visual strategy. Use of cCTA-FFRCT to select patients for CAG and PCI may reduce costs and improve clinical outcome in patients with suspected coronary artery disease in Japan.

  14. Model implementation for dynamic computation of system cost for advanced life support

    NASA Technical Reports Server (NTRS)

    Levri, J. A.; Vaccari, D. A.

    2004-01-01

    Life support system designs for long-duration space missions have a multitude of requirements drivers, such as mission objectives, political considerations, cost, crew wellness, inherent mission attributes, as well as many other influences. Evaluation of requirements satisfaction can be difficult, particularly at an early stage of mission design. Because launch cost is a critical factor and relatively easy to quantify, it is a point of focus in early mission design. The method used to determine launch cost influences the accuracy of the estimate. This paper discusses the appropriateness of dynamic mission simulation in estimating the launch cost of a life support system. This paper also provides an abbreviated example of a dynamic simulation life support model and possible ways in which such a model might be utilized for design improvement.

  15. Thoracoabdominal Computed Tomography in Trauma Patients: A Cost-Consequences Analysis

    PubMed Central

    van Vugt, Raoul; Kool, Digna R.; Brink, Monique; Dekker, Helena M.; Deunk, Jaap; Edwards, Michael J.

    2014-01-01

    Background: CT is increasingly used during the initial evaluation of blunt trauma patients. In this era of increasing cost-awareness, the pros and cons of CT have to be assessed. Objectives: This study was performed to evaluate cost-consequences of different diagnostic algorithms that use thoracoabdominal CT in primary evaluation of adult patients with high-energy blunt trauma. Materials and Methods: We compared three different algorithms in which CT was applied as an immediate diagnostic tool (rush CT), a diagnostic tool after limited conventional work-up (routine CT), and a selective tool (selective CT). Probabilities of detecting and missing clinically relevant injuries were retrospectively derived. We collected data on radiation exposure and performed a micro-cost analysis on a reference case-based approach. Results: Both rush and routine CT detected all thoracoabdominal injuries in 99.1% of the patients during primary evaluation (n = 1040). Selective CT missed one or more diagnoses in 11% of the patients, in which a change of treatment was necessary in 4.8%. The rush CT algorithm cost € 2676 (US$ 3660) per patient with a mean radiation dose of 26.40 mSv per patient. Routine CT cost € 2815 (US$ 3850) and resulted in the same radiation exposure. Selective CT resulted in a lower radiation dose (23.23 mSv) and cost € 2771 (US$ 3790). Conclusions: Rush CT seems to result in the least costs and is comparable in terms of radiation dose exposure and diagnostic certainty with routine CT after a limited conventional work-up. However, selective CT results in less radiation dose exposure but a slightly higher cost and less certainty. PMID:25337521

  16. 38 CFR 17.274 - Cost sharing.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... the beneficiary cost share. (b) In addition to the beneficiary cost share, an annual (calendar year... illness or injury, a calendar year cost limit or “catastrophic cap” has been placed on the beneficiary... cap computation. After a family has paid the maximum cost-share and deductible amounts for a...

  17. 38 CFR 17.274 - Cost sharing.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... the beneficiary cost share. (b) In addition to the beneficiary cost share, an annual (calendar year... illness or injury, a calendar year cost limit or “catastrophic cap” has been placed on the beneficiary... cap computation. After a family has paid the maximum cost-share and deductible amounts for a...

  18. Neural Correlates of Task Cost for Stance Control with an Additional Motor Task: Phase-Locked Electroencephalogram Responses

    PubMed Central

    Hwang, Ing-Shiou; Huang, Cheng-Ya

    2016-01-01

    With appropriate reallocation of central resources, the ability to maintain an erect posture is not necessarily degraded by a concurrent motor task. This study investigated the neural control of a particular postural-suprapostural procedure involving brain mechanisms to solve crosstalk between posture and motor subtasks. Participants completed a single posture task and a dual-task while concurrently conducting force-matching and maintaining a tilted stabilometer stance at a target angle. Stabilometer movements and event-related potentials (ERPs) were recorded. The added force-matching task increased the irregularity of postural response rather than the size of postural response prior to force-matching. In addition, the added force-matching task during stabilometer stance led to marked topographic ERP modulation, with greater P2 positivity in the frontal and sensorimotor-parietal areas of the N1-P2 transitional phase and in the sensorimotor-parietal area of the late P2 phase. The time-frequency distribution of the ERP primary principal component revealed that the dual-task condition manifested more pronounced delta (1–4 Hz) and beta (13–35 Hz) synchronizations but suppressed theta activity (4–8 Hz) before force-matching. The dual-task condition also manifested coherent fronto-parietal delta activity in the P2 period. In addition to a decrease in postural regularity, this study reveals spatio-temporal and temporal-spectral reorganizations of ERPs in the fronto-sensorimotor-parietal network due to the added suprapostural motor task. For a particular set of postural-suprapostural tasks, the behavior and neural data suggest a facilitatory role of autonomous postural response and central resource expansion with increasing interregional interactions for task-shift and planning the motor-suprapostural task. PMID:27010634

  19. Reduction of computer usage costs in predicting unsteady aerodynamic loadings caused by control surface motions: Computer program description

    NASA Technical Reports Server (NTRS)

    Petrarca, J. R.; Harrison, B. A.; Redman, M. C.; Rowe, W. S.

    1979-01-01

    A digital computer program was developed to calculate unsteady loadings caused by motions of lifting surfaces with leading edge and trailing edge controls based on the subsonic kernel function approach. The pressure singularities at hinge line and side edges were extracted analytically as a preliminary step to solving the integral equation of collocation. The program calculates generalized aerodynamic forces for user supplied deflection modes. Optional intermediate output includes pressure at an array of points, and sectional generalized forces. From one to six controls on the half span can be accommodated.

  20. Community-Based Health Education Programs Designed to Improve Clinical Measures Are Unlikely to Reduce Short-Term Costs or Utilization Without Additional Features Targeting These Outcomes.

    PubMed

    Burton, Joe; Eggleston, Barry; Brenner, Jeffrey; Truchil, Aaron; Zulkiewicz, Brittany A; Lewis, Megan A

    2016-06-07

    Stakeholders often expect programs for persons with chronic conditions to "bend the cost curve." This study assessed whether a diabetes self-management education (DSME) program offered as part of a multicomponent initiative could affect emergency department (ED) visits, hospital stays, and the associated costs for an underserved population in addition to the clinical indicators that DSME programs attempt to improve. The program was implemented in Camden, New Jersey, by the Camden Coalition of Healthcare Providers to address disparities in diabetes care. Data used are from medical records and from patient-level information about hospital services from Camden's hospitals. Using multivariate regression models to control for individual characteristics, changes in utilization over time and changes relative to 2 comparison groups were assessed. No reductions in ED visits, inpatient stays, or costs for participants were found over time or relative to the comparison groups. High utilization rates and costs for diabetes are associated with longer term disease progression and its sequelae; thus, DSME or peer support may not affect these in the near term. Some clinical indicators improved among participants, and these might lead to fewer costly adverse health events in the future. DSME deployed at the community level, without explicit segmentation and targeting of high health care utilizers or without components designed to affect costs and utilization, should not be expected to reduce short-term medical needs for participating individuals or care-seeking behaviors such that utilization is reduced. Stakeholders must include financial outcomes in a program's design if those outcomes are to improve. (Population Health Management 20XX;XX:XXX-XXX).

  1. Turbulence computations with 3-D small-scale additive turbulent decomposition and data-fitting using chaotic map combinations

    SciTech Connect

    Mukerji, Sudip

    1997-01-01

    Although the equations governing turbulent fluid flow, the Navier-Stokes (N.-S.) equations, have been known for well over a century, and there is a clear technological necessity for obtaining solutions to these equations, turbulence remains one of the principal unsolved problems in physics today. It is still not possible to make accurate quantitative predictions about turbulent flows without relying heavily on empirical data. In principle, it is possible to obtain turbulent solutions from a direct numerical simulation (DNS) of the N.-S. equations. The author first provides a brief introduction to the dynamics of turbulent flows. The N.-S. equations, which govern fluid flow, are described thereafter. He then gives a brief overview of DNS calculations and where they stand at present. He next introduces the two most popular approaches for turbulent computations currently in use, namely Reynolds averaging of the N.-S. equations (RANS) and large-eddy simulation (LES). Approximations, often ad hoc ones, are present in these methods because heuristic models are used for turbulence quantities (the Reynolds stresses) that are otherwise unknown. He then introduces a new computational method called additive turbulent decomposition (ATD), the small-scale version of which is the topic of this research. The rest of the thesis is organized as follows. In Chapter 2 he describes the ATD procedure in greater detail: how the dependent variables are split and the decomposition into large- and small-scale sets of equations. In Chapter 3 the spectral projection of the small-scale momentum equations is derived in detail. In Chapter 4 results of the computations with the small-scale ATD equations are presented. In Chapter 5 he describes the data-fitting procedure which can be used to directly specify the parameters of a chaotic-map turbulence model.

  2. 42 CFR 417.588 - Computation of adjusted average per capita cost (AAPCC).

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... computing the AAPCC. (c) Adjustment factors—(1) Geographic. CMS makes an adjustment to reflect the relative...) Age, sex, and disability status. CMS makes adjustments to reflect the age and sex distribution and...

  3. 42 CFR 417.588 - Computation of adjusted average per capita cost (AAPCC).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... computing the AAPCC. (c) Adjustment factors—(1) Geographic. CMS makes an adjustment to reflect the relative...) Age, sex, and disability status. CMS makes adjustments to reflect the age and sex distribution and...

  4. Computer simulation of energy use, greenhouse gas emissions, and costs for alternative methods of processing fluid milk.

    PubMed

    Tomasula, P M; Datta, N; Yee, W C F; McAloon, A J; Nutter, D W; Sampedro, F; Bonnaillie, L M

    2014-07-01

    Computer simulation is a useful tool for benchmarking electrical and fuel energy consumption and water use in a fluid milk plant. In this study, a computer simulation model of the fluid milk process based on high temperature, short time (HTST) pasteurization was extended to include models for processes for shelf-stable milk and extended shelf-life milk that may help prevent the loss or waste of milk that leads to increases in the greenhouse gas (GHG) emissions for fluid milk. The models were for UHT processing, crossflow microfiltration (MF) without HTST pasteurization, crossflow MF followed by HTST pasteurization (MF/HTST), crossflow MF/HTST with partial homogenization, and pulsed electric field (PEF) processing, and were incorporated into the existing model for the fluid milk process. Simulation trials were conducted assuming a production rate for the plants of 113.6 million liters of milk per year to produce only whole milk (3.25%) and 40% cream. Results showed that GHG emissions in the form of process-related CO₂ emissions, defined as CO₂ equivalents (e)/kg of raw milk processed (RMP), and specific energy consumptions (SEC) for electricity and natural gas use for the HTST process alone were 37.6 g of CO₂e/kg of RMP, 0.14 MJ/kg of RMP, and 0.13 MJ/kg of RMP, respectively. Emissions of CO₂e and SEC for electricity and natural gas use were highest for the PEF process, with values of 99.1 g of CO₂e/kg of RMP, 0.44 MJ/kg of RMP, and 0.10 MJ/kg of RMP, respectively, and lowest for the UHT process at 31.4 g of CO₂e/kg of RMP, 0.10 MJ/kg of RMP, and 0.17 MJ/kg of RMP. Estimated unit production costs associated with the various processes were lowest for the HTST process and MF/HTST with partial homogenization at $0.507/L and highest for the UHT process at $0.60/L. The increase in shelf life associated with the UHT and MF processes may eliminate some of the supply chain product and consumer losses and waste of milk and compensate for the small increases in GHG emissions.
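For scale, the reported intensities can be converted into annual plant-level emissions with simple arithmetic; the sketch below assumes a milk density of about 1.03 kg/L to convert the stated 113.6 million liters per year into kilograms, a figure not given in the abstract.

```python
# Reported process-related emission intensities, g CO2e per kg raw milk processed (RMP)
INTENSITY_G_PER_KG = {"HTST": 37.6, "PEF": 99.1, "UHT": 31.4}

LITERS_PER_YEAR = 113.6e6        # plant production rate stated in the abstract
MILK_DENSITY_KG_PER_L = 1.03     # assumption: typical whole-milk density

def annual_tonnes_co2e(process):
    """Annual process-related emissions in metric tonnes of CO2e."""
    kg_milk = LITERS_PER_YEAR * MILK_DENSITY_KG_PER_L
    return INTENSITY_G_PER_KG[process] * kg_milk / 1e6  # grams -> tonnes

for p in sorted(INTENSITY_G_PER_KG, key=INTENSITY_G_PER_KG.get):
    print(f"{p}: {annual_tonnes_co2e(p):,.0f} t CO2e/yr")
```

Under this density assumption, the roughly threefold spread in intensity between UHT and PEF carries through directly to the annual totals.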

  5. The Department of Defense and the Power of Cloud Computing: Weighing Acceptable Cost Versus Acceptable Risk

    DTIC Science & Technology

    2016-04-01

    DISA is leading the way for the development of a private DOD cloud computing environment in conjunction with the Army. Operational in 2008, DISA...significant opportunities and security challenges when implementing a cloud computing environment. The transformation of DOD information technology...is this shared pool of resources, especially shared resources in a commercial environment, that also creates numerous risks not usually seen in

  6. COEFUV: A Computer Implementation of a Generalized Unmanned Vehicle Cost Model.

    DTIC Science & Technology

    1978-10-01

    ...that ingress area attrition is twice egress area attrition, from which P_s1 and P_s2 follow; P_t1 and P_t2 are obtained similarly. ...no vehicles initially ready, n = n_max) in steps of 0.05 n_max. The cost determination is treated in the next section. The value of n_max can...log-linear cumulative average curve, which gives the average cost of a vehicle in a buy of x vehicles as C = a·x^b, where x = total vehicles produced
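The log-linear cumulative-average learning curve quoted in this snippet (average unit cost of a buy of x vehicles equal to a·x^b) can be sketched as follows; the coefficient values are illustrative, not taken from the report.

```python
import math

def cumulative_average_cost(x, a, b):
    """Log-linear learning curve: average unit cost over a buy of x vehicles."""
    return a * x ** b

def total_cost(x, a, b):
    """Total cost of producing x vehicles under the cumulative-average model."""
    return x * cumulative_average_cost(x, a, b)

# Illustrative values: normalized first-unit cost a = 1.0 and an 80% learning
# curve, i.e. b = log2(0.80), so the average cost falls 20% per doubling.
b = math.log2(0.80)
```

With these values, `cumulative_average_cost(2, 1.0, b)` is 0.80 and `cumulative_average_cost(4, 1.0, b)` is 0.64, the expected per-doubling decline.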

  7. Non-additive benefit or cost? Disentangling the indirect effects that occur when plants bearing extrafloral nectaries and honeydew-producing insects share exotic ant mutualists

    PubMed Central

    Savage, Amy M.; Rudgers, Jennifer A.

    2013-01-01

    Background and Aims In complex communities, organisms often form mutualisms with multiple different partners simultaneously. Non-additive effects may emerge among species linked by these positive interactions. Ants commonly participate in mutualisms with both honeydew-producing insects (HPI) and their extrafloral nectary (EFN)-bearing host plants. Consequently, HPI and EFN-bearing plants may experience non-additive benefits or costs when these groups co-occur. The outcomes of these interactions are likely to be influenced by variation in preferences among ants for honeydew vs. nectar. In this study, a test was made for non-additive effects on HPI and EFN-bearing plants resulting from sharing exotic ant guards. Preferences of the dominant exotic ant species for nectar vs. honeydew resources were also examined. Methods Ant access, HPI and nectar availability were manipulated on the EFN-bearing shrub, Morinda citrifolia, and ant and HPI abundances, herbivory and plant growth were assessed. Ant-tending behaviours toward HPI across an experimental gradient of nectar availability were also tracked in order to investigate mechanisms underlying ant responses. Key Results The dominant ant species, Anoplolepis gracilipes, differed from less invasive ants in response to multiple mutualists, with reductions in plot-wide abundances when nectar was reduced, but no response to HPI reduction. Conversely, at sites where A. gracilipes was absent or rare, abundances of less invasive ants increased when nectar was reduced, but declined when HPI were reduced. Non-additive benefits were found at sites dominated by A. gracilipes, but only for M. citrifolia plants. Responses of HPI at these sites supported predictions of the non-additive cost model. Interestingly, the opposite non-additive patterns emerged at sites dominated by other ants. Conclusions It was demonstrated that strong non-additive benefits and costs can both occur when a plant and herbivore share mutualist partners. 

  8. Low-Cost Computer-Controlled Current Stimulator for the Student Laboratory

    ERIC Educational Resources Information Center

    Guclu, Burak

    2007-01-01

    Electrical stimulation of nerve and muscle tissues is frequently used for teaching core concepts in physiology. It is usually expensive to provide every student group in the laboratory with an individual stimulator. This article presents the design and application of a low-cost [about $100 (U.S.)] isolated stimulator that can be controlled by two…

  9. Robotic lower limb prosthesis design through simultaneous computer optimizations of human and prosthesis costs

    NASA Astrophysics Data System (ADS)

    Handford, Matthew L.; Srinivasan, Manoj

    2016-02-01

    Robotic lower limb prostheses can improve the quality of life for amputees. Development of such devices, currently dominated by long prototyping periods, could be sped up by predictive simulations. In contrast to some amputee simulations which track experimentally determined non-amputee walking kinematics, here, we explicitly model the human-prosthesis interaction to produce a prediction of the user’s walking kinematics. We obtain simulations of an amputee using an ankle-foot prosthesis by simultaneously optimizing human movements and prosthesis actuation, minimizing a weighted sum of human metabolic and prosthesis costs. The resulting Pareto optimal solutions predict that increasing prosthesis energy cost, decreasing prosthesis mass, and allowing asymmetric gaits all decrease human metabolic rate for a given speed and alter human kinematics. The metabolic rates increase monotonically with speed. Remarkably, by performing an analogous optimization for a non-amputee human, we predict that an amputee walking with an appropriately optimized robotic prosthesis can have a lower metabolic cost – even lower than assuming that the non-amputee’s ankle torques are cost-free.
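The weighted-sum scalarization described here (minimizing a weighted sum of human metabolic and prosthesis costs to trace Pareto optimal solutions) can be illustrated with a toy one-dimensional design problem; the two cost functions below are stand-ins, not the paper's musculoskeletal or prosthesis models.

```python
def human_cost(u):
    """Stand-in for the human metabolic cost of a design variable u."""
    return (u - 1.0) ** 2 + 1.0

def prosthesis_cost(u):
    """Stand-in for the prosthesis (e.g. actuation energy) cost."""
    return 0.5 * u ** 2

def pareto_front(weights, grid):
    """For each weight w, minimize w*human + (1-w)*prosthesis by grid search
    and record the resulting (human, prosthesis) cost pair."""
    front = []
    for w in weights:
        u_best = min(grid, key=lambda u: w * human_cost(u)
                                         + (1 - w) * prosthesis_cost(u))
        front.append((human_cost(u_best), prosthesis_cost(u_best)))
    return front

grid = [i / 1000 for i in range(-2000, 3001)]      # candidate designs
weights = [i / 10 for i in range(1, 10)]            # sweep of trade-off weights
front = pareto_front(weights, grid)
# As w grows (more emphasis on human cost), the minimizer trades higher
# prosthesis cost for lower human cost along the Pareto front.
```

Sweeping the weight is the standard way to trace a convex Pareto front with a scalar optimizer; the predictive simulations in the paper do the same in a far higher-dimensional design space.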

  10. Robotic lower limb prosthesis design through simultaneous computer optimizations of human and prosthesis costs.

    PubMed

    Handford, Matthew L; Srinivasan, Manoj

    2016-02-09

    Robotic lower limb prostheses can improve the quality of life for amputees. Development of such devices, currently dominated by long prototyping periods, could be sped up by predictive simulations. In contrast to some amputee simulations which track experimentally determined non-amputee walking kinematics, here, we explicitly model the human-prosthesis interaction to produce a prediction of the user's walking kinematics. We obtain simulations of an amputee using an ankle-foot prosthesis by simultaneously optimizing human movements and prosthesis actuation, minimizing a weighted sum of human metabolic and prosthesis costs. The resulting Pareto optimal solutions predict that increasing prosthesis energy cost, decreasing prosthesis mass, and allowing asymmetric gaits all decrease human metabolic rate for a given speed and alter human kinematics. The metabolic rates increase monotonically with speed. Remarkably, by performing an analogous optimization for a non-amputee human, we predict that an amputee walking with an appropriately optimized robotic prosthesis can have a lower metabolic cost--even lower than assuming that the non-amputee's ankle torques are cost-free.

  11. Cost-Effectiveness Specification for Computer-Based Training Systems. Volume 1. Development

    DTIC Science & Technology

    1977-09-01

    Included in this element are such places as auditoria, study halls, demonstration rooms, etc., where large numbers of students can be trained. Offices...etc.) which may accrue over several years will be permitted to surface and balance the large initial capital investment costs of implementing a

  12. Robotic lower limb prosthesis design through simultaneous computer optimizations of human and prosthesis costs

    PubMed Central

    Handford, Matthew L.; Srinivasan, Manoj

    2016-01-01

    Robotic lower limb prostheses can improve the quality of life for amputees. Development of such devices, currently dominated by long prototyping periods, could be sped up by predictive simulations. In contrast to some amputee simulations which track experimentally determined non-amputee walking kinematics, here, we explicitly model the human-prosthesis interaction to produce a prediction of the user’s walking kinematics. We obtain simulations of an amputee using an ankle-foot prosthesis by simultaneously optimizing human movements and prosthesis actuation, minimizing a weighted sum of human metabolic and prosthesis costs. The resulting Pareto optimal solutions predict that increasing prosthesis energy cost, decreasing prosthesis mass, and allowing asymmetric gaits all decrease human metabolic rate for a given speed and alter human kinematics. The metabolic rates increase monotonically with speed. Remarkably, by performing an analogous optimization for a non-amputee human, we predict that an amputee walking with an appropriately optimized robotic prosthesis can have a lower metabolic cost – even lower than assuming that the non-amputee’s ankle torques are cost-free. PMID:26857747

  13. Grading Multiple Choice Exams with Low-Cost and Portable Computer-Vision Techniques

    ERIC Educational Resources Information Center

    Fisteus, Jesus Arias; Pardo, Abelardo; García, Norberto Fernández

    2013-01-01

    Although technology for automatic grading of multiple choice exams has existed for several decades, it is not yet as widely available or affordable as it should be. The main reasons preventing this adoption are the cost and the complexity of the setup procedures. In this paper, "Eyegrade," a system for automatic grading of multiple…

  14. Linking process, structure, property, and performance for metal-based additive manufacturing: computational approaches with experimental support

    NASA Astrophysics Data System (ADS)

    Smith, Jacob; Xiong, Wei; Yan, Wentao; Lin, Stephen; Cheng, Puikei; Kafka, Orion L.; Wagner, Gregory J.; Cao, Jian; Liu, Wing Kam

    2016-04-01

    Additive manufacturing (AM) methods for rapid prototyping of 3D materials (3D printing) have become increasingly popular, with particular recent emphasis on methods for metallic materials. These processes typically involve an accumulation of cyclic phase changes. The widespread interest in these methods is largely stimulated by their unique ability to create components of considerable complexity. However, modeling such processes is exceedingly difficult due to the highly localized and drastic material evolution that often occurs over the manufacture time of each component. Final product characterization and validation are currently driven primarily by experimental means as a result of the lack of robust modeling procedures. In the present work, the authors discuss the primary hurdles that have hindered effective modeling of AM methods for metallic materials, and offer informed speculation on promising research directions for overcoming them. The primary focus of this work encompasses the specific areas of high-performance computing, multiscale modeling, materials characterization, process modeling, experimentation, and validation for final product performance of additively manufactured metallic components.

  15. Incidental findings on computed tomography scans for acute appendicitis: prevalence, costs, and outcome.

    PubMed

    Ozao-Choy, Junko; Kim, Unsup; Vieux, Ulrich; Menes, Tehillah S

    2011-11-01

    CT scan is increasingly being used to diagnose appendicitis due to its specificity and literature suggesting its cost-effectiveness. CT scans are associated with incidental findings. We sought to investigate the rates of incidental findings identified on CT scans, the follow-up of these findings, and the added cost associated with this follow-up. A retrospective review of patients who underwent appendectomies for acute appendicitis between 2003 and 2005 was completed at Elmhurst Hospital Center (Elmhurst, NY). Incidental findings were grouped into low and high significance, based on workup or follow-up needed. The diagnostic workup and cost of each incidental finding was ascertained. For patients who did not receive a workup due to lack of follow-up, an estimate of the minimum workup was calculated. Of 1142 patients with acute appendicitis, 876 (77%) had a CT scan. This rate increased over time (from 66% in 2003 to 85% in 2005, P < 0.01) and with age (70% in patients under 20 and 98% in patients over 50, P < 0.001). Incidental findings were common and increased with age (23% in the youngest group vs 78% in patients older than 50, P < 0.001). The cost associated with workup of these incidental findings increased with age as well. The increased use of CT scans is associated with a high rate of incidental findings. These findings are usually of low clinical significance but may require further workup and follow-up. Physicians need to be aware of the high rate of incidental findings, the need for further workup, and the associated costs.

  16. Computer analysis of effects of altering jet fuel properties on refinery costs and yields

    NASA Technical Reports Server (NTRS)

    Breton, T.; Dunbar, D.

    1984-01-01

    This study was undertaken to evaluate the adequacy of future U.S. jet fuel supplies, the potential for large increases in the cost of jet fuel, and to what extent a relaxation in jet fuel properties would remedy these potential problems. The results of the study indicate that refiners should be able to meet jet fuel output requirements in all regions of the country within the current Jet A specifications during the 1990-2010 period. The results also indicate that it will be more difficult to meet Jet A specifications on the West Coast, because the feedstock quality is worse and the required jet fuel yield (jet fuel/crude refined) is higher than in the East. The results show that jet fuel production costs could be reduced by relaxing fuel properties. Potential cost savings in the East (PADDs I-IV) through property relaxation were found to be about 1.3 cents/liter (5 cents/gallon) in January 1, 1981 dollars between 1990 and 2010. However, the savings from property relaxation were all obtained within the range of current Jet A specifications, so there is no financial incentive to relax Jet A fuel specifications in the East. In the West (PADD V) the potential cost savings from lowering fuel quality were considerably greater than in the East. Cost savings from 2.7 to 3.7 cents/liter (10-14 cents/gallon) were found. In contrast to the East, on the West Coast a significant part of the savings was obtained through relaxation of the current Jet A fuel specifications.

  17. Evaluation of low‐cost computer monitors for the detection of cervical spine injuries in the emergency room: an observer confidence‐based study

    PubMed Central

    Brem, M H; Böhner, C; Brenning, A; Gelse, K; Radkow, T; Blanke, M; Schlechtweg, P M; Neumann, G; Wu, I Y; Bautz, W; Hennig, F F; Richter, H

    2006-01-01

    Background To compare the diagnostic value of low-cost computer monitors and a Picture Archiving and Communication System (PACS) workstation for the evaluation of cervical spine fractures in the emergency room. Methods Two groups of readers blinded to the diagnoses (2 radiologists and 3 orthopaedic surgeons) independently assessed digital radiographs of the cervical spine (anterior-posterior, oblique and trans-oral dens views). The radiographs of 57 patients who arrived consecutively at the emergency room in 2004 with clinical suspicion of a cervical spine injury were evaluated. The diagnostic values of these radiographs were scored on a 3-point scale (1 = diagnosis not possible/bad image quality, 2 = diagnosis uncertain, 3 = clear diagnosis of fracture or no fracture) on a PACS workstation and on two different liquid crystal display (LCD) personal computer monitors. The images were randomised to avoid memory effects. We used logistic mixed-effects models to determine the possible effects of monitor type on the evaluation of x-ray images. To determine the overall effects of monitor type, this variable was used as a fixed effect, and the image number and reader group (radiologist or orthopaedic surgeon) were used as random effects on display quality. Group-specific effects were examined, with the reader group and additional fixed effects as terms. A significance level of 0.05 was established for assessing the contribution of each fixed effect to the model. Results Overall, the diagnostic score did not differ significantly between standard personal computer monitors and the PACS workstation (both p values were 0.78). Conclusion Low-cost LCD personal computer monitors may be useful in establishing a diagnosis of cervical spine fractures in the emergency room. PMID:17057136

  18. Enhancing simulation of efficiency with analytical tools. [combining computer simulation and analytical techniques for cost reduction

    NASA Technical Reports Server (NTRS)

    Seltzer, S. M.

    1974-01-01

    Some means of combining computer simulation and analytical techniques are indicated in order to mutually enhance their efficiency as design tools and to motivate those involved in engineering design to consider using such combinations. While the idea is not new, heavy reliance on computers often seems to overshadow the potential utility of analytical tools. Although the example used is drawn from the area of dynamics and control, the principles espoused are applicable to other fields. In the example, the parameter-plane stability analysis technique is described briefly and extended beyond that reported in the literature to increase its utility (through a simple set of recursive formulas) and its applicability (through portrayal of the effect of varying the sampling period of the computer). The numerical values that were rapidly selected by analysis were found to be correct for the hybrid computer simulation for which they were needed. This obviated the need for cut-and-try methods to choose the numerical values, thereby saving both time and computer utilization.

  19. Computer program to assess impact of fatigue and fracture criteria on weight and cost of transport aircraft

    NASA Technical Reports Server (NTRS)

    Tanner, C. J.; Kruse, G. S.; Oman, B. H.

    1975-01-01

    A preliminary design analysis tool for rapidly performing trade-off studies involving fatigue, fracture, static strength, weight, and cost is presented. Analysis subprograms were developed for fatigue life, crack growth life, and residual strength; and linked to a structural synthesis module which in turn was integrated into a computer program. The part definition module of a cost and weight analysis program was expanded to be compatible with the upgraded structural synthesis capability. The resultant vehicle design and evaluation program is named VDEP-2. It is an accurate and useful tool for estimating purposes at the preliminary design stage of airframe development. A sample case along with an explanation of program applications and input preparation is presented.

  20. High-Speed, Low-Cost Workstation for Computation-Intensive Statistics. Phase 1

    DTIC Science & Technology

    1990-06-20

    High-performance and low-cost...subroutines are coded and inserted into high-level statistical algorithms. The use of a high-level, portable language for development is advised. The...intensive high-level statistical algorithms were coded for use on the DSP during this effort. We do not intend to give complete descriptions of the

  1. Low cost SCR lamp driver indicates contents of digital computer registers

    NASA Technical Reports Server (NTRS)

    Cliff, R. A.

    1967-01-01

    Silicon Controlled Rectifier /SCR/ lamp driver is adapted for use in integrated circuit digital computers where it indicates the contents of the various registers. The threshold voltage at which visual indication begins is very sharply defined and can be adjusted to suit particular system requirements.

  2. Can computer mice be used as low-cost devices for the acquisition of planar human movement velocity signals?

    PubMed

    O'Reilly, Christian; Plamondon, Réjean

    2011-03-01

    The main goal of this work is to determine whether a computer mouse can be used as a low-cost device for the acquisition of two-dimensional human movement velocity signals in the context of psychophysical studies and biomedical applications. A comprehensive overview of the related literature is presented, and the problem of characterizing mouse movement acquisition is analyzed and discussed. Then, the quality of velocity signals acquired with this kind of device is measured on horizontal oscillatory movements by comparing the mouse data to the signals acquired simultaneously by a video motion tracking system and a digitizing tablet. A synthesis of the information gathered in this work indicates that the computer mouse can be used for the reliable acquisition of biosignals in the context of human movement studies, particularly for many applications dealing with the velocity of the end effector of the upper limb. This paper concludes by discussing the possibilities and limitations of such use.
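As a rough sketch of the kind of signal such an acquisition pipeline produces, planar velocity can be estimated from timestamped cursor samples by central finite differences; the sampling rate and the sinusoidal test movement below are synthetic stand-ins for real mouse data.

```python
import math

def planar_velocity(t, x, y):
    """Central-difference estimate of (vx, vy) at interior samples.

    t, x, y are equal-length sequences of timestamps (s) and cursor
    positions (e.g. in mm after scaling raw mouse counts).
    """
    vx, vy = [], []
    for i in range(1, len(t) - 1):
        dt = t[i + 1] - t[i - 1]
        vx.append((x[i + 1] - x[i - 1]) / dt)
        vy.append((y[i + 1] - y[i - 1]) / dt)
    return vx, vy

# Synthetic horizontal oscillation sampled at 125 Hz (a common mouse report rate).
dt = 1 / 125
t = [k * dt for k in range(250)]
x = [10 * math.sin(2 * math.pi * 1.0 * ti) for ti in t]  # 1 Hz, 10 mm amplitude
y = [0.0] * len(t)
vx, vy = planar_velocity(t, x, y)
# Peak |vx| should approach 2*pi*10 ≈ 62.8 mm/s for a 1 Hz, 10 mm sine.
```

A real pipeline would additionally low-pass filter the position samples, since differentiation amplifies quantization noise from the mouse's integer counts.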

  3. Development and implementation of a low cost micro computer system for LANDSAT analysis and geographic data base applications

    NASA Technical Reports Server (NTRS)

    Faust, N.; Jordon, L.

    1981-01-01

    Since the implementation of the GRID and IMGRID computer programs for multivariate spatial analysis in the early 1970s, geographic data analysis has moved from large computers to minicomputers and now to microcomputers, with a radical reduction in the costs associated with planning analyses. Once NIMGRID (new IMGRID), a raster-oriented geographic information system, was implemented on the microcomputer, programs designed to process LANDSAT data for use as one element in a geographic data base were employed, and programs for training-field selection, supervised and unsupervised classification, and image enhancement were added. Enhancements to the color graphics capabilities of the microsystem allow display of three channels of LANDSAT data in color-infrared format. The basic microcomputer hardware needed to run NIMGRID and most LANDSAT analyses is listed, as well as the software available for LANDSAT processing.

  4. Computer Generated Imagery (CGI) Current Technology and Cost Measures Feasibility Study.

    DTIC Science & Technology

    1980-09-26

    VITAL Visual System Block Diagram; Singer-Link Image Processing Hardware; Block Diagram of...SCENE I and II systems which are described in Table E-1. G.E.'s current real-time systems process three cycles concurrently. These computation cycles... processing; bus interface; active face, block, cluster, and region assignment; and vector processing. The operation of each sequencer is basically

  5. [VALIDATION OF A COMPUTER PROGRAM FOR DETECTION OF MALNUTRITION HOSPITAL AND ANALYSIS OF HOSPITAL COSTS].

    PubMed

    Fernández Valdivia, Antonia; Rodríguez Rodríguez, José María; Valero Aguilera, Beatriz; Lobo Támer, Gabriela; Pérez de la Cruz, Antonio Jesús; García Larios, José Vicente

    2015-07-01

    Introduction: One method of diagnosing malnutrition is serum albumin, owing to the simplicity and low cost of its determination. Objectives: The main objective was to validate and implement a computer program, based on serum albumin determination, that enables early detection and treatment of patients who are malnourished or at risk of malnutrition; a secondary objective was the evaluation of costs by diagnosis-related group (DRG). Methods: The study design was a dynamic, prospective cohort that included hospital discharges from November 2012 through March 2014. The study population comprised patients older than 14 years admitted to the various departments of a medical-surgical hospital of the Complejo Hospitalario Universitario de Granada whose serum albumin was below 3.5 g/dL, for a total of 307 patients. Results: Of the 307 patients, 141 presented malnutrition (program sensitivity: 45.9%). Of the patients, 54.7% were men and 45.3% women. The mean age was 65.68 years. The median length of stay was 16 days. In all, 13.4% of the patients died. The mean DRG cost was €5,958.30, and this mean cost after detection of malnutrition was €11,376.48. Conclusions: The algorithm implemented by the computer program identifies almost half of malnourished hospitalized patients. Recording the diagnosis of malnutrition is essential.
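A minimal sketch of the screening rule the program is built around (flagging admissions with serum albumin below 3.5 g/dL); the record layout and the handling of missing values here are illustrative assumptions, not details from the paper.

```python
ALBUMIN_CUTOFF_G_DL = 3.5  # threshold used by the screening program

def flag_risk(admissions):
    """Return the admissions whose serum albumin falls below the cutoff.

    Each admission is assumed to be a dict with an 'albumin_g_dl' key;
    missing measurements (None) are not flagged.
    """
    return [a for a in admissions
            if a["albumin_g_dl"] is not None
            and a["albumin_g_dl"] < ALBUMIN_CUTOFF_G_DL]

patients = [
    {"id": 1, "albumin_g_dl": 2.9},
    {"id": 2, "albumin_g_dl": 4.1},
    {"id": 3, "albumin_g_dl": None},  # not measured: not flagged
]
flagged = flag_risk(patients)
# flagged contains only patient 1
```

The reported 45.9% sensitivity reflects that such a single-threshold rule catches roughly half of truly malnourished inpatients, which is why the program is positioned as an early-warning filter rather than a diagnostic tool.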

  6. Energy sources for laparoscopic colectomy: a prospective randomized comparison of conventional electrosurgery, bipolar computer-controlled electrosurgery and ultrasonic dissection. Operative outcome and costs analysis.

    PubMed

    Targarona, Eduardo Ma; Balague, Carmen; Marin, Juan; Neto, Rene Berindoague; Martinez, Carmen; Garriga, Jordi; Trias, Manuel

    2005-12-01

    The development of operative laparoscopic surgery is linked to advances in ancillary surgical instrumentation. Ultrasonic energy devices avoid the use of electricity and provide effective control of small- to medium-sized vessels. Bipolar computer-controlled electrosurgical technology eliminates the disadvantages of electrical energy, and a mechanical blade adds a cutting action. This instrument can provide effective hemostasis of large vessels up to 7 mm. Such devices significantly increase the cost of laparoscopic procedures, however, and the amount of evidence-based information on this topic is surprisingly scarce. This study compared the effectiveness of three different energy sources on the laparoscopic performance of a left colectomy. The trial included 38 nonselected patients with a disease of the colon requiring an elective segmental left-sided colon resection. Patients were preoperatively randomized into three groups. Group I had electrosurgery; vascular dissection was performed entirely with an electrosurgery generator, and vessels were controlled with clips. Group II underwent computer-controlled bipolar electrosurgery; vascular and mesocolon section was completed by using the 10-mm Ligasure device alone. In group III, 5-mm ultrasonic shears (Harmonic Scalpel) were used for bowel dissection, vascular pedicle dissection, and mesocolon transection. The mesenteric vessel pedicle was controlled with an endostapler. Demographics (age, sex, body mass index, comorbidity, previous surgery and diagnoses requiring surgery) were recorded, as were surgical details (operative time, conversion, blood loss), additional disposable instruments (number of trocars, EndoGIA charges, and clip appliers), and clinical outcome. Intraoperative economic costs were also evaluated. End points of the trial were operative time and intraoperative blood loss, and an intention-to-treat principle was followed. The three groups were well matched for demographic and pathologic features

  7. Matched filtering of gravitational waves from inspiraling compact binaries: Computational cost and template placement

    NASA Astrophysics Data System (ADS)

    Owen, Benjamin J.; Sathyaprakash, B. S.

    1999-07-01

    We estimate the number of templates, computational power, and storage required for a one-step matched filtering search for gravitational waves from inspiraling compact binaries. Our estimates for the one-step search strategy should serve as benchmarks for the evaluation of more sophisticated strategies such as hierarchical searches. We use a discrete family of two-parameter wave form templates based on the second post-Newtonian approximation for binaries composed of nonspinning compact bodies in circular orbits. We present estimates for all of the large- and mid-scale interferometers now under construction: LIGO (three configurations), VIRGO, GEO600, and TAMA. To search for binaries with components more massive than m_min = 0.2 M_solar while losing no more than 10% of events due to coarseness of template spacing, the initial LIGO interferometers will require about 1.0×10^11 flops (floating point operations per second) for data analysis to keep up with data acquisition. This is several times higher than estimated in previous work by Owen, in part because of the improved family of templates and in part because we use more realistic (higher) sampling rates. Enhanced LIGO, GEO600, and TAMA will require computational power similar to initial LIGO. Advanced LIGO will require 7.8×10^11 flops, and VIRGO will require 4.8×10^12 flops to take full advantage of its broad target noise spectrum. If the templates are stored rather than generated as needed, storage requirements range from 1.5×10^11 real numbers for TAMA to 6.2×10^14 for VIRGO. The computational power required scales roughly as m_min^(-8/3) and the storage as m_min^(-13/3). Since these scalings are perturbed by the curvature of the parameter space at second post-Newtonian order, we also provide estimates for a search with m_min = 1 M_solar. Finally, we sketch and discuss an algorithm for placing the templates in the parameter space.
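
    The quoted scaling law can be applied directly. A minimal sketch, taking the abstract's initial-LIGO figure (about 1.0×10^11 flops at a minimum component mass of 0.2 solar masses) and rescaling it to a 1-solar-mass search; the curvature correction the authors treat separately is ignored here.

```python
# Rescale a matched-filter cost estimate with the stated power law:
# required flops scale roughly as m_min^(-8/3).

def rescale(cost_at_ref, m_ref, m_new, exponent):
    """Scale a reference cost to a new minimum component mass."""
    return cost_at_ref * (m_new / m_ref) ** exponent

flops_ref = 1.0e11                                   # initial LIGO, m_min = 0.2 M_solar
flops_1msun = rescale(flops_ref, 0.2, 1.0, -8.0 / 3.0)
print(f"{flops_1msun:.2e}")                          # roughly 1.4e9 flops, a ~70x reduction
```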

  8. The Effect of Emphasizing Mathematical Structure in the Acquisition of Whole Number Computation Skills (Addition and Subtraction) By Seven- and Eight-Year Olds: A Clinical Investigation.

    ERIC Educational Resources Information Center

    Uprichard, A. Edward; Collura, Carolyn

    This investigation sought to determine the effect of emphasizing mathematical structure in the acquisition of computational skills by seven- and eight-year-olds. The meaningful development-of-structure approach emphasized closure, commutativity, associativity, and the identity element of addition; the inverse relationship between addition and…

  9. Potentially Low Cost Solution to Extend Use of Early Generation Computed Tomography

    PubMed Central

    Tonna, Joseph E.; Balanoff, Amy M.; Lewin, Matthew R.; Saandari, Namjilmaa; Wintermark, Max

    2010-01-01

    In preparing a case report on Brown-Séquard syndrome for publication, we made the incidental finding that the inexpensive, commercially available three-dimensional (3D) rendering software we were using could produce high quality 3D spinal cord reconstructions from any series of two-dimensional (2D) computed tomography (CT) images. This finding raises the possibility that spinal cord imaging capabilities can be expanded where bundled 2D multi-planar reformats and 3D reconstruction software for CT are not available and in situations where magnetic resonance imaging (MRI) is either not available or appropriate (e.g. metallic implants). Given the worldwide burden of trauma and considering the limited availability of MRI and advanced generation CT scanners, we propose an alternative, potentially useful approach to imaging the spinal cord that might be useful in areas where technical capabilities and support are limited. PMID:21293767

  10. Avoiding the Enumeration of Infeasible Elementary Flux Modes by Including Transcriptional Regulatory Rules in the Enumeration Process Saves Computational Costs.

    PubMed

    Jungreuthmayer, Christian; Ruckerbauer, David E; Gerstl, Matthias P; Hanscho, Michael; Zanghellini, Jürgen

    2015-01-01

    Despite the significant progress made in recent years, the computation of the complete set of elementary flux modes of large or even genome-scale metabolic networks is still impossible. We introduce a novel approach to speed up the calculation of elementary flux modes by including transcriptional regulatory information in the analysis of metabolic networks. Taking gene regulation into account dramatically reduces the solution space and allows the presented algorithm to continually eliminate biologically infeasible modes at an early stage of the computation procedure. Thereby, computational costs, such as runtime, memory usage, and disk space, are drastically reduced. Moreover, we show that the application of transcriptional rules identifies non-trivial system-wide effects on metabolism. Using the presented algorithm pushes the size of metabolic networks that can be studied by elementary flux modes to new and much higher limits without loss of predictive quality. This makes unbiased, system-wide predictions in large-scale metabolic networks possible without resorting to any optimization principle.
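
    The pruning idea can be sketched in miniature. This illustrates the general principle, not the authors' algorithm; the rule format (a reaction that carries no flux when its controlling gene is unexpressed) and all names are simplifying assumptions.

```python
# Sketch: transcriptional rules let infeasible candidate flux modes be
# discarded during enumeration rather than after it, shrinking the
# intermediate solution space. Rule format is hypothetical.

def violates_rules(mode, rules, gene_state):
    """mode: dict reaction -> flux; rules: dict reaction -> controlling gene."""
    return any(
        abs(mode.get(rxn, 0.0)) > 1e-9 and not gene_state[gene]
        for rxn, gene in rules.items()
    )

def prune(candidate_modes, rules, gene_state):
    """Keep only candidate modes consistent with the regulatory state."""
    return [m for m in candidate_modes if not violates_rules(m, rules, gene_state)]

rules = {"R2": "geneA"}               # R2 requires geneA to be expressed
gene_state = {"geneA": False}         # geneA is off under this condition
candidates = [{"R1": 1.0}, {"R1": 1.0, "R2": 2.0}]
print(prune(candidates, rules, gene_state))  # → [{'R1': 1.0}]
```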

  11. Avoiding the Enumeration of Infeasible Elementary Flux Modes by Including Transcriptional Regulatory Rules in the Enumeration Process Saves Computational Costs

    PubMed Central

    Jungreuthmayer, Christian; Ruckerbauer, David E.; Gerstl, Matthias P.; Hanscho, Michael; Zanghellini, Jürgen

    2015-01-01

    Despite the significant progress made in recent years, the computation of the complete set of elementary flux modes of large or even genome-scale metabolic networks is still impossible. We introduce a novel approach to speed up the calculation of elementary flux modes by including transcriptional regulatory information in the analysis of metabolic networks. Taking gene regulation into account dramatically reduces the solution space and allows the presented algorithm to continually eliminate biologically infeasible modes at an early stage of the computation procedure. Thereby, computational costs, such as runtime, memory usage, and disk space, are drastically reduced. Moreover, we show that the application of transcriptional rules identifies non-trivial system-wide effects on metabolism. Using the presented algorithm pushes the size of metabolic networks that can be studied by elementary flux modes to new and much higher limits without loss of predictive quality. This makes unbiased, system-wide predictions in large-scale metabolic networks possible without resorting to any optimization principle. PMID:26091045

  12. Leveraging Cloud Computing to Improve Storage Durability, Availability, and Cost for MER Maestro

    NASA Technical Reports Server (NTRS)

    Chang, George W.; Powell, Mark W.; Callas, John L.; Torres, Recaredo J.; Shams, Khawaja S.

    2012-01-01

    The Maestro for MER (Mars Exploration Rover) software is the premier operations and activity planning software for the Mars rovers, and it is required to deliver all of the processed image products to scientists on demand. These data span multiple storage arrays sized at 2 TB, and a backup scheme ensures data are not lost. In a catastrophe, these data would currently be recovered at 20 GB/hour, taking several days for a full restoration. A seamless solution provides access to highly durable, highly available, scalable, and cost-effective storage capabilities. This approach also employs a novel technique that enables storage of the majority of the data on the cloud and some data locally. This feature is used to store the most recent data locally in order to guarantee utmost reliability in case of an outage or disconnection from the Internet. This also obviates any changes to the software that generates the most recent data set, as it still has the same interface to the file system as it did before the updates.

  13. Cost-effective and business-beneficial computer validation for bioanalytical laboratories.

    PubMed

    McDowall, Rd

    2011-07-01

    Computerized system validation is often viewed as a burden and a waste of time to meet regulatory requirements. This article presents a different approach by looking at validation in a bioanalytical laboratory from the business benefits that computer validation can bring. Ask yourself the question, have you ever bought a computerized system that did not meet your initial expectations? This article will look at understanding the process to be automated, the paper to be eliminated and the records to be signed to meet the requirements of the GLP or GCP and Part 11 regulations. This paper will only consider commercial nonconfigurable and configurable software such as plate readers and LC-MS/MS data systems rather than LIMS or custom applications. Two streamlined life cycle models are presented. The first one consists of a single document for validation of nonconfigurable software. The second is for configurable software and is a five-stage model that avoids the need to write functional and design specifications. Both models are aimed at managing the risk each type of software poses whilst reducing the amount of documented evidence required for validation.

  14. Computer vision on color-band resistor and its cost-effective diffuse light source design

    NASA Astrophysics Data System (ADS)

    Chen, Yung-Sheng; Wang, Jeng-Yau

    2016-11-01

    Color-band resistors possess a specular surface and are worthy of study in the area of color image processing and color material recognition. The specular reflection and halo effects appearing in the acquired resistor image make color band extraction and recognition difficult. A computer vision system is proposed to detect the resistor orientation, segment the resistor's main body, extract and identify the color bands, as well as recognize the color code sequence and read the resistor value. The effectiveness of reducing the specular reflection and halo effects is confirmed using several cheap covers, e.g., a paper bowl, cup, or box lined inside with white paper, combined with a ring-type LED controlled automatically according to the detected resistor orientation. The calibration of the microscope used to acquire the resistor image is described and a proper environmental light intensity is suggested. Experiments on 200 4-band and 200 5-band resistors, covering the 12 colors used on color-band resistors, show a correct resistor-reading rate above 90%. The performance, reported as the number of failures in horizontal alignment, color band extraction, color identification, and color code sequence flip-over checking, confirms the feasibility of the presented approach.
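
    The final decoding stage of such a pipeline follows the standard EIA color code. A minimal sketch for 4-band resistors; the function name and band representation are illustrative, and the 12 colors mentioned in the abstract correspond to the ten digit colors plus gold and silver.

```python
# Map a recognized 4-band color sequence (digit, digit, multiplier,
# tolerance) to a resistance value in ohms and a tolerance in percent,
# using the standard EIA color code.

DIGITS = {"black": 0, "brown": 1, "red": 2, "orange": 3, "yellow": 4,
          "green": 5, "blue": 6, "violet": 7, "grey": 8, "white": 9}
MULTIPLIER = {**{c: 10 ** v for c, v in DIGITS.items()},
              "gold": 0.1, "silver": 0.01}
TOLERANCE = {"brown": 1, "red": 2, "gold": 5, "silver": 10}  # percent

def decode_4band(bands):
    d1, d2, mult, tol = bands
    ohms = (10 * DIGITS[d1] + DIGITS[d2]) * MULTIPLIER[mult]
    return ohms, TOLERANCE[tol]

print(decode_4band(["yellow", "violet", "red", "gold"]))  # → (4700, 5)
```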

  15. Cost-Benefit Analysis for ECIA Chapter 1 and State DPPF Programs Comparing Groups Receiving Regular Program Instruction and Groups Receiving Computer Assisted Instruction/Computer Management System (CAI/CMS). 1986-87.

    ERIC Educational Resources Information Center

    Chamberlain, Ed

    A cost benefit study was conducted to determine the effectiveness of a computer assisted instruction/computer management system (CAI/CMS) as an alternative to conventional methods of teaching reading within Chapter 1 and DPPF funded programs of the Columbus (Ohio) Public Schools. The Chapter 1 funded Compensatory Language Experiences and Reading…

  16. Towards a Low-Cost Real-Time Photogrammetric Landslide Monitoring System Utilising Mobile and Cloud Computing Technology

    NASA Astrophysics Data System (ADS)

    Chidburee, P.; Mills, J. P.; Miller, P. E.; Fieber, K. D.

    2016-06-01

    Close-range photogrammetric techniques offer a potentially low-cost approach in terms of implementation and operation for initial assessment and monitoring of landslide processes over small areas. In particular, the Structure-from-Motion (SfM) pipeline is now extensively used to help overcome many constraints of traditional digital photogrammetry, offering increased user-friendliness to nonexperts, as well as lower costs. However, a landslide monitoring approach based on the SfM technique also presents some potential drawbacks due to the difficulty in managing and processing a large volume of data in real-time. This research addresses the aforementioned issues by attempting to combine a mobile device with cloud computing technology to develop a photogrammetric measurement solution as part of a monitoring system for landslide hazard analysis. The research presented here focusses on (i) the development of an Android mobile application; (ii) the implementation of SfM-based open-source software in the Amazon cloud computing web service, and (iii) performance assessment through a simulated environment using data collected at a recognized landslide test site in North Yorkshire, UK. Whilst the landslide monitoring mobile application is under development, this paper describes experiments carried out to ensure effective performance of the system in the future. Investigations presented here describe the initial assessment of a cloud-implemented approach, which is developed around the well-known VisualSFM algorithm. Results are compared to point clouds obtained from alternative SfM 3D reconstruction approaches considering a commercial software solution (Agisoft PhotoScan) and a web-based system (Autodesk 123D Catch). Investigations demonstrate that the cloud-based photogrammetric measurement system is capable of providing results of centimeter-level accuracy, evidencing its potential to provide an effective approach for quantifying and analyzing landslide hazard at a local scale.

  17. Computer Vision Tools for Low-Cost and Noninvasive Measurement of Autism-Related Behaviors in Infants

    PubMed Central

    Vallin Spina, Thiago; Papanikolopoulos, Nikolaos; Egger, Helen

    2014-01-01

    The early detection of developmental disorders is key to child outcome, allowing interventions to be initiated which promote development and improve prognosis. Research on autism spectrum disorder (ASD) suggests that behavioral signs can be observed late in the first year of life. Many of these studies involve extensive frame-by-frame video observation and analysis of a child's natural behavior. Although nonintrusive, these methods are extremely time-intensive and require a high level of observer training; thus, they are burdensome for clinical and large population research purposes. This work is a first milestone in a long-term project on non-invasive early observation of children in order to aid in risk detection and research of neurodevelopmental disorders. We focus on providing low-cost computer vision tools to measure and identify ASD behavioral signs based on components of the Autism Observation Scale for Infants (AOSI). In particular, we develop algorithms to measure responses to general ASD risk assessment tasks and activities outlined by the AOSI which assess visual attention by tracking facial features. We show results, including comparisons with expert and nonexpert clinicians, which demonstrate that the proposed computer vision tools can capture critical behavioral observations and potentially augment the clinician's behavioral observations obtained from real in-clinic assessments. PMID:25045536

  18. A low-cost computer-controlled Arduino-based educational laboratory system for teaching the fundamentals of photovoltaic cells

    NASA Astrophysics Data System (ADS)

    Zachariadou, K.; Yiasemides, K.; Trougkakos, N.

    2012-11-01

    We present a low-cost, fully computer-controlled, Arduino-based, educational laboratory (SolarInsight) to be used in undergraduate university courses concerned with electrical engineering and physics. The major goal of the system is to provide students with the necessary instrumentation, software tools and methodology in order to learn fundamental concepts of semiconductor physics by exploring the process of an experimental physics inquiry. The system runs under the Windows operating system and is composed of a data acquisition/control board, a power supply and processing boards, sensing elements, a graphical user interface and data analysis software. The data acquisition/control board is based on the Arduino open source electronics prototyping platform. The graphical user interface and communication with the Arduino are developed in C# and C++ programming languages respectively, by using IDE Microsoft Visual Studio 2010 Professional, which is freely available to students. Finally, the data analysis is performed by using the open source, object-oriented framework ROOT. Currently the system supports five teaching activities, each one corresponding to an independent tab in the user interface. SolarInsight has been partially developed in the context of a diploma thesis conducted within the Technological Educational Institute of Piraeus under the co-supervision of the Physics and Electronic Computer Systems departments’ academic staff.

  19. Effect of zinc addition and vacuum annealing time on the properties of spin-coated low-cost transparent conducting 1 at% Ga-ZnO thin films.

    PubMed

    Srivastava, Amit Kumar; Kumar, Jitendra

    2013-12-01

    Pure and 1 at% gallium (Ga)-doped zinc oxide (ZnO) thin films have been prepared with a low-cost spin coating technique on quartz substrates and annealed at 500 °C in vacuum ∼10^-3 mbar to create anion vacancies and generate charge carriers for photovoltaic application. Also, 0.5-1.5 at% extra zinc species were added in the precursor sol to investigate changes in film growth, morphology, optical absorption, electrical properties and photoluminescence. It is shown that 1 at% Ga-ZnO thin films with 0.5 at% extra zinc content after vacuum annealing for 60 min correspond to wurtzite-type hexagonal structure with (0001) preferred orientation, electrical resistivity of ∼9 × 10^-3 Ω cm and optical transparency of ∼65-90% in the visible range. Evidence has been advanced for the presence of defect levels within the bandgap such as the zinc vacancy (V_Zn), zinc interstitial (Zn_i), oxygen vacancy (V_O) and oxygen interstitial (O_i). Further, variation in ZnO optical bandgap occurring with Ga doping and insertion of additional zinc species has been explained by invoking two competing phenomena, namely bandgap widening and renormalization, usually observed in semiconductors with increasing carrier concentration.

  20. A Low-Cost Environmental Monitoring System: How to Prevent Systematic Errors in the Design Phase through the Combined Use of Additive Manufacturing and Thermographic Techniques.

    PubMed

    Salamone, Francesco; Danza, Ludovico; Meroni, Italo; Pollastro, Maria Cristina

    2017-04-11

    nEMoS (nano Environmental Monitoring System) is a 3D-printed device built following the Do-It-Yourself (DIY) approach. It can be connected to the web and it can be used to assess indoor environmental quality (IEQ). It is built using some low-cost sensors connected to an Arduino microcontroller board. The device is assembled in a small-sized case and both thermohygrometric sensors used to measure the air temperature and relative humidity, and the globe thermometer used to measure the radiant temperature, can be subject to thermal effects due to overheating of some nearby components. A thermographic analysis was made to rule out this possibility. The paper shows how the pervasive technique of additive manufacturing can be combined with the more traditional thermographic techniques to redesign the case and to verify the accuracy of the optimized system in order to prevent instrumental systematic errors in terms of the difference between experimental and actual values of the above-mentioned environmental parameters.

  1. Effect of zinc addition and vacuum annealing time on the properties of spin-coated low-cost transparent conducting 1 at% Ga–ZnO thin films

    PubMed Central

    Srivastava, Amit Kumar; Kumar, Jitendra

    2013-01-01

    Pure and 1 at% gallium (Ga)-doped zinc oxide (ZnO) thin films have been prepared with a low-cost spin coating technique on quartz substrates and annealed at 500 °C in vacuum ∼10^-3 mbar to create anion vacancies and generate charge carriers for photovoltaic application. Also, 0.5–1.5 at% extra zinc species were added in the precursor sol to investigate changes in film growth, morphology, optical absorption, electrical properties and photoluminescence. It is shown that 1 at% Ga–ZnO thin films with 0.5 at% extra zinc content after vacuum annealing for 60 min correspond to wurtzite-type hexagonal structure with (0001) preferred orientation, electrical resistivity of ∼9 × 10^-3 Ω cm and optical transparency of ∼65–90% in the visible range. Evidence has been advanced for the presence of defect levels within the bandgap such as the zinc vacancy (V_Zn), zinc interstitial (Zn_i), oxygen vacancy (V_O) and oxygen interstitial (O_i). Further, variation in ZnO optical bandgap occurring with Ga doping and insertion of additional zinc species has been explained by invoking two competing phenomena, namely bandgap widening and renormalization, usually observed in semiconductors with increasing carrier concentration. PMID:27877622

  2. Effect of zinc addition and vacuum annealing time on the properties of spin-coated low-cost transparent conducting 1 at% Ga-ZnO thin films

    NASA Astrophysics Data System (ADS)

    Srivastava, Amit Kumar; Kumar, Jitendra

    2013-12-01

    Pure and 1 at% gallium (Ga)-doped zinc oxide (ZnO) thin films have been prepared with a low-cost spin coating technique on quartz substrates and annealed at 500 °C in vacuum ∼10^-3 mbar to create anion vacancies and generate charge carriers for photovoltaic application. Also, 0.5-1.5 at% extra zinc species were added in the precursor sol to investigate changes in film growth, morphology, optical absorption, electrical properties and photoluminescence. It is shown that 1 at% Ga-ZnO thin films with 0.5 at% extra zinc content after vacuum annealing for 60 min correspond to wurtzite-type hexagonal structure with (0001) preferred orientation, electrical resistivity of ∼9 × 10^-3 Ω cm and optical transparency of ∼65-90% in the visible range. Evidence has been advanced for the presence of defect levels within the bandgap such as the zinc vacancy (V_Zn), zinc interstitial (Zn_i), oxygen vacancy (V_O) and oxygen interstitial (O_i). Further, variation in ZnO optical bandgap occurring with Ga doping and insertion of additional zinc species has been explained by invoking two competing phenomena, namely bandgap widening and renormalization, usually observed in semiconductors with increasing carrier concentration.

  3. Advanced space power requirements and techniques. Task 1: Mission projections and requirements. Volume 3: Appendices. [cost estimates and computer programs

    NASA Technical Reports Server (NTRS)

    Wolfe, M. G.

    1978-01-01

    Contents: (1) general study guidelines and assumptions; (2) launch vehicle performance and cost assumptions; (3) satellite programs 1959 to 1979; (4) initiative mission and design characteristics; (5) satellite listing; (6) spacecraft design model; (7) spacecraft cost model; (8) mission cost model; and (9) nominal and optimistic budget program cost summaries.

  4. Development of ANFIS models for air quality forecasting and input optimization for reducing the computational cost and time

    NASA Astrophysics Data System (ADS)

    Prasad, Kanchan; Gorai, Amit Kumar; Goyal, Pramila

    2016-03-01

    This study aims to develop adaptive neuro-fuzzy inference system (ANFIS) models for forecasting daily air pollution concentrations of five air pollutants [sulphur dioxide (SO2), nitrogen dioxide (NO2), carbon monoxide (CO), ozone (O3) and particulate matter (PM10)] in the atmosphere of a megacity (Howrah). Air pollution in the city is rising in parallel with the economy, and thus observing, forecasting and controlling air pollution becomes increasingly important because of its health impact. ANFIS serves as a basis for constructing a set of fuzzy IF-THEN rules, with appropriate membership functions to generate the stipulated input-output pairs. The ANFIS model predictor considers the values of meteorological factors (pressure, temperature, relative humidity, dew point, visibility, wind speed, and precipitation) and the previous day's pollutant concentration in different combinations as the inputs to predict the 1-day-advance and same-day air pollution concentrations. The concentration values of the five air pollutants and seven meteorological parameters of Howrah city during the period 2009 to 2011 were used for development of the ANFIS models. Collinearity tests were conducted to eliminate redundant input variables. A forward selection (FS) method was used for selecting the different subsets of input variables. Application of collinearity tests and FS techniques reduces the number of input variables and subsets, which helps in reducing the computational cost and time. The performances of the models were evaluated on the basis of four statistical indices (coefficient of determination, normalized mean square error, index of agreement, and fractional bias).
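
    The two pruning steps mentioned, a collinearity screen and forward selection, can be sketched generically. Ranking candidates by absolute Pearson correlation with the target is a stand-in criterion, not the study's actual FS procedure, and the variable names and cutoff are illustrative.

```python
# Greedy input selection: rank candidate inputs by |correlation| with the
# target, then accept each in turn unless it is nearly collinear with an
# already-selected input. The 0.95 collinearity cutoff is an assumption.

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def forward_select(inputs, target, collinearity_cut=0.95):
    ranked = sorted(inputs, key=lambda k: -abs(pearson(inputs[k], target)))
    chosen = []
    for name in ranked:
        if all(abs(pearson(inputs[name], inputs[c])) < collinearity_cut
               for c in chosen):
            chosen.append(name)
    return chosen

inputs = {
    "temperature": [10, 12, 14, 16, 18],
    "temp_dup":    [10, 12, 14, 16, 18],   # redundant copy, screened out
    "wind_speed":  [5, 3, 6, 2, 4],
}
target = [20, 24, 28, 32, 36]              # tracks temperature
print(forward_select(inputs, target))      # → ['temperature', 'wind_speed']
```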

  5. An Experimental and Computational Approach to Defining Structure/Reactivity Relationships for Intramolecular Addition Reactions to Bicyclic Epoxonium Ions

    PubMed Central

    Wan, Shuangyi; Gunaydin, Hakan; Houk, K. N.; Floreancig, Paul E.

    2008-01-01

    In this manuscript we report that oxidative cleavage reactions can be used to form oxocarbenium ions that react with pendent epoxides to form bicyclic epoxonium ions as an entry to the formation of cyclic oligoether compounds. Bicyclic epoxonium ion structure was shown to have a dramatic impact on the ratio of exo- to endo-cyclization reactions, with bicyclo[4.1.0] intermediates showing a strong preference for endo-closures and bicyclo[3.1.0] intermediates showing a preference for exo-closures. Computational studies on the structures and energetics of the transition states using the B3LYP/6-31G(d) method provide substantial insight into the origins of this selectivity. PMID:17547399

  6. Strapdown cost trend study and forecast

    NASA Technical Reports Server (NTRS)

    Eberlein, A. J.; Savage, P. G.

    1975-01-01

    The potential cost advantages offered by advanced strapdown inertial technology in future commercial short-haul aircraft are summarized. The initial procurement cost and six-year cost of ownership, which includes spares and direct maintenance costs, were calculated for kinematic and inertial navigation systems such that traditional and strapdown mechanization costs could be compared. Cost results for the inertial navigation systems showed that the initial costs and cost of ownership for traditional triple-redundant gimbaled inertial navigators are three times those of the equivalent skewed redundant strapdown inertial navigator. The net cost advantage for the strapdown kinematic system is directly attributable to the reduction in sensor count for strapdown. The strapdown kinematic system has the added advantage of providing a fail-operational inertial navigation capability at no additional cost due to the use of inertial-grade sensors and attitude reference computers.

  7. 12 CFR Appendix K to Part 226 - Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... The total annual loan cost rates in this table are based on three different assumed rates: 0%, 4% and 8%. [Table excerpt; the loan-term lengths elided in the source are left blank:]

    Assumed rate | 2-year loan term | [ ]-year loan term | [ ]-year loan term
    0%           | 39.00%           | 9.86%              | 3.87%
    4%           | 39.00%           | 11.03%             | 10.14%
    8%           | 39.00%           | 11.03%             | 10.20%

    The cost of any...

  8. JPL Energy Consumption Program (ECP) documentation: A computer model simulating heating, cooling and energy loads in buildings. [low cost solar array efficiency

    NASA Technical Reports Server (NTRS)

    Lansing, F. L.; Chai, V. W.; Lascu, D.; Urbenajo, R.; Wong, P.

    1978-01-01

    The engineering manual provides complete companion documentation on the structure of the main program and subroutines, the preparation of input data, the interpretation of output results, access and use of the program, and a detailed description of all the analytic and logical expressions and flow charts used in the computations and program structure. A numerical example is provided and solved completely to show the sequence of computations followed. The program is carefully structured to reduce both the user's time and costs without sacrificing accuracy. The user can expect a CPU-time cost of approximately $5.00 per building zone, excluding printing costs. The accuracy, measured by the deviation of simulated consumption from watt-hour meter readings, was found by many simulation tests not to exceed a ±10 percent margin.

  9. Computer Model of Biopolymer Crystal Growth and Aggregation by Addition of Macromolecular Units — a Comparative Study

    NASA Astrophysics Data System (ADS)

    Siódmiak, J.; Gadomski, A.

    We discuss the results of a computer simulation of the biopolymer crystal growth and aggregation based on the 2D lattice Monte Carlo technique and the HP approximation of the biopolymers. As a modeled molecule (growth unit) we comparatively consider the previously studied non-mutant lysozyme protein, Protein Data Bank (PDB) ID: 193L, which forms, under a certain set of thermodynamic-kinetic conditions, the tetragonal crystals, and an amyloidogenic variant of the lysozyme, PDB ID: 1LYY, which is known as fibril-yielding and prone-to-aggregation agent. In our model, the site-dependent attachment, detachment and migration processes are involved. The probability of growth unit motion, attachment and detachment to/from the crystal surface are assumed to be proportional to the orientational factor representing the anisotropy of the molecule. Working within a two-dimensional representation of the truly three-dimensional process, we also argue that the crystal grows in a spiral way, whereby one or more screw dislocations on the crystal surface give rise to a terrace. We interpret the obtained results in terms of known models of crystal growth and aggregation such as B-C-F (Burton-Cabrera-Frank) dislocation driven growth and M-S (Mullins-Sekerka) instability concept, with stochastic aspects supplementing the latter. We discuss the conditions under which crystals vs non-crystalline protein aggregates appear, and how the process depends upon difference in chemical structure of the protein molecule seen as the main building block of the elementary crystal cell.
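
    The attachment step can be caricatured on a small 2D lattice. This toy sketch keeps only orientation-weighted attachment; the study's model additionally includes detachment, migration and HP-type molecular structure, and all parameters here are illustrative.

```python
# Toy 2D lattice aggregation: at each step a random empty site adjacent to
# the aggregate is chosen, and attachment succeeds with a probability given
# by a random "orientational factor" standing in for molecular anisotropy.

import random

SIZE = 21

def neighbors(site):
    x, y = site
    return [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]

def grow(steps, seed=1):
    """Grow an aggregate from a central seed by orientation-weighted attachment."""
    random.seed(seed)
    cluster = {(SIZE // 2, SIZE // 2)}                 # seed growth unit
    for _ in range(steps):
        # empty in-bounds lattice sites adjacent to the current aggregate
        frontier = sorted({
            n for s in cluster for n in neighbors(s)
            if n not in cluster and 0 <= n[0] < SIZE and 0 <= n[1] < SIZE
        })
        site = random.choice(frontier)
        if random.random() < random.uniform(0.2, 1.0):  # orientational factor
            cluster.add(site)
    return cluster

cluster = grow(200)
print(len(cluster))
```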

  10. Effects of protonation and C5 methylation on the electrophilic addition reaction of cytosine: a computational study.

    PubMed

    Jin, Lingxia; Wang, Wenliang; Hu, Daodao; Min, Suotian

    2013-01-10

    The mechanism underlying the effects of protonation and C5 methylation on the electrophilic addition reaction of Cyt has been explored by means of CBS-QB3 and CBS-QB3/PCM methods. In the gas phase, three paths were mainly discussed: two protonated paths (N3- and O2-protonated paths B and C) and one neutral path (path A). The calculated results indicate that the reaction of the HSO(3)(-) group with neutral Cyt is unlikely because of its high activation free energy, whereas the O2-protonated path (path C) is the most likely to occur. In the aqueous phase, path B becomes the most feasible mechanism: its activation free energy decreases relative to the corresponding gas-phase path, whereas those of paths A and C increase. Two results stand out. First, the HSO(3)(-) group interacts directly with the C5═C6 bond rather than the N3═C4 bond. Second, C5 methylation weakens the addition reaction: relative to Cyt, the methylated forms show lower values of the global electrophilicity index (i.e., weaker electrophilic power) and lower NPA charges on the C5 site of the intermediates, in agreement with the experimental observation that 5-MeCyt reacts approximately 2 orders of magnitude more slowly than Cyt in the presence of bisulfite. Apart from the cis and trans isomers, a rare third isomer, in which both the CH(3) and SO(3) groups occupy axial positions, has been found for the first time in the reactions of neutral and protonated 5-MeCyt with the HSO(3)(-) group. Furthermore, this third isomer can form easily from the cis isomer.

  11. ANL/RBC: A computer code for the analysis of Rankine bottoming cycles, including system cost evaluation and off-design performance

    NASA Technical Reports Server (NTRS)

    Mclennan, G. A.

    1986-01-01

    This report describes, and is a User's Manual for, a computer code (ANL/RBC) which calculates cycle performance for Rankine bottoming cycles extracting heat from a specified source gas stream. The code calculates cycle power and efficiency and the sizes for the heat exchangers, using tabular input of the properties of the cycle working fluid. An option is provided to calculate the costs of system components from user defined input cost functions. These cost functions may be defined in equation form or by numerical tabular data. A variety of functional forms have been included for these functions and they may be combined to create very general cost functions. An optional calculation mode can be used to determine the off-design performance of a system when operated away from the design-point, using the heat exchanger areas calculated for the design-point.
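
    The abstract's combinable cost functions (defined in equation form or from tabular data, then combined into more general functions) suggest a simple compositional design. The sketch below is a hypothetical illustration, not ANL/RBC's actual implementation; the power-law form and the function names are assumptions:

```python
import bisect

def equation_cost(a, b, exponent):
    # Cost function in equation form: C(x) = a + b * x**exponent,
    # a common power-law shape for component cost scaling (assumed form).
    return lambda x: a + b * x ** exponent

def tabular_cost(xs, ys):
    # Cost function from numerical tabular data: piecewise-linear
    # interpolation through (x, cost) points, clamped at the table ends.
    def cost(x):
        i = bisect.bisect_left(xs, x)
        if i == 0:
            return ys[0]
        if i == len(xs):
            return ys[-1]
        x0, x1, y0, y1 = xs[i - 1], xs[i], ys[i - 1], ys[i]
        return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    return cost

def combine(*fns):
    # A general cost function built by summing simpler ones.
    return lambda x: sum(f(x) for f in fns)
```

    For example, a heat-exchanger cost might combine an equation-form base cost with a tabulated surcharge: `combine(equation_cost(100.0, 2.0, 1.0), tabular_cost([0.0, 10.0], [0.0, 100.0]))`.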

  12. Price and cost estimation

    NASA Technical Reports Server (NTRS)

    Stewart, R. D.

    1979-01-01

    The Price and Cost Estimating Program (PACE II) was developed to prepare man-hour and material cost estimates. This versatile and flexible tool significantly reduces computation time and errors, as well as the typing and reproduction time involved in preparing cost estimates.

  13. The cumulative cost of additional wakefulness: dose-response effects on neurobehavioral functions and sleep physiology from chronic sleep restriction and total sleep deprivation

    NASA Technical Reports Server (NTRS)

    Van Dongen, Hans P A.; Maislin, Greg; Mullington, Janet M.; Dinges, David F.

    2003-01-01

    … were near-linearly related to the cumulative duration of wakefulness in excess of 15.84 h (s.e. 0.73 h). CONCLUSIONS: Since chronic restriction of sleep to 6 h or less per night produced cognitive performance deficits equivalent to up to 2 nights of total sleep deprivation, it appears that even relatively moderate sleep restriction can seriously impair waking neurobehavioral functions in healthy adults. Sleepiness ratings suggest that subjects were largely unaware of these increasing cognitive deficits, which may explain why the impact of chronic sleep restriction on waking cognitive functions is often assumed to be benign. Physiological sleep responses to chronic restriction did not mirror waking neurobehavioral responses, but cumulative wakefulness in excess of 15.84 h predicted performance lapses across all four experimental conditions. This suggests that sleep debt is perhaps best understood as resulting in additional wakefulness that has a neurobiological "cost" which accumulates over time.
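
    The threshold-linear relation in the conclusions can be illustrated with a small calculation. The function below assumes a constant daily wake time and merely illustrates the reported 15.84 h critical point; it is not the paper's statistical model:

```python
def excess_wakefulness(hours_awake_per_day, days, threshold=15.84):
    # Cumulative wakefulness (hours) in excess of the reported 15.84 h/day
    # critical point; the paper relates performance lapses near-linearly
    # to this accumulated excess.
    return sum(max(0.0, h - threshold) for h in [hours_awake_per_day] * days)
```

    For instance, sleeping 6 h per night (18 h awake) accumulates about 2.16 h of excess wakefulness per day, so two weeks yields roughly 30 h of accumulated "sleep debt" under this illustration.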

  14. Would school closure for the 2009 H1N1 influenza epidemic have been worth the cost?: a computational simulation of Pennsylvania

    PubMed Central

    2011-01-01

    Background During the 2009 H1N1 influenza epidemic, policy makers debated over whether, when, and how long to close schools. While closing schools could have reduced influenza transmission thereby preventing cases, deaths, and health care costs, it may also have incurred substantial costs from increased childcare needs and lost productivity by teachers and other school employees. Methods A combination of agent-based and Monte Carlo economic simulation modeling was used to determine the cost-benefit of closing schools (vs. not closing schools) for different durations (range: 1 to 8 weeks) and symptomatic case incidence triggers (range: 1 to 30) for the state of Pennsylvania during the 2009 H1N1 epidemic. Different scenarios varied the basic reproductive rate (R0) from 1.2, 1.6, to 2.0 and used case-hospitalization and case-fatality rates from the 2009 epidemic. Additional analyses determined the cost per influenza case averted of implementing school closure. Results For all scenarios explored, closing schools resulted in substantially higher net costs than not closing schools. For R0 = 1.2, 1.6, and 2.0 epidemics, closing schools for 8 weeks would have resulted in median net costs of $21.0 billion (95% Range: $8.0 - $45.3 billion). The median cost per influenza case averted would have been $14,185 ($5,423 - $30,565) for R0 = 1.2, $25,253 ($9,501 - $53,461) for R0 = 1.6, and $23,483 ($8,870 - $50,926) for R0 = 2.0. Conclusions Our study suggests that closing schools during the 2009 H1N1 epidemic could have resulted in substantial costs to society as the potential costs of lost productivity and childcare could have far outweighed the cost savings in preventing influenza cases. PMID:21599920
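
    The cost-per-case-averted metric reported in the results is an incremental-cost ratio. A minimal sketch follows; the numbers in the usage note are illustrative placeholders, not the paper's simulation outputs:

```python
def cost_per_case_averted(net_cost_closure, net_cost_no_closure,
                          cases_closure, cases_no_closure):
    # Incremental net cost of the school-closure policy divided by the
    # number of influenza cases it averts (both relative to no closure).
    incremental_cost = net_cost_closure - net_cost_no_closure
    cases_averted = cases_no_closure - cases_closure
    return incremental_cost / cases_averted
```

    For example, with hypothetical inputs of $21 billion vs. $1 billion in net costs and 1 million vs. 2 million cases, the ratio is $20,000 per case averted, the same order of magnitude as the paper's reported medians.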

  15. Development of cost-effective media to increase the economic potential for larger-scale bioproduction of natural food additives by Lactobacillus rhamnosus , Debaryomyces hansenii , and Aspergillus niger.

    PubMed

    Salgado, José Manuel; Rodríguez, Noelia; Cortés, Sandra; Domínguez, José Manuel

    2009-11-11

    Yeast extract (YE) is the most common nitrogen source in a variety of bioprocesses despite its high cost. The use of YE in culture media is therefore one of the major technical hurdles to be overcome in developing low-cost fermentation routes, making the search for cheaper alternative nitrogen sources particularly desirable. The aim of the current study was to develop cost-effective media based on corn steep liquor (CSL) and locally available vinasses in order to increase the economic potential for larger-scale bioproduction. Three microorganisms were evaluated: Lactobacillus rhamnosus , Debaryomyces hansenii , and Aspergillus niger . The amino acid profile and protein concentration were relevant for xylitol and citric acid production by D. hansenii and A. niger , respectively. Metals also played an important role in citric acid production; meanwhile, D. hansenii showed a strong dependence on the initial amount of Mg(2+). Under the best conditions, 28.8 g lactic acid/L (Q(LA) = 0.800 g/L.h, Y(LA/S) = 0.95 g/g), 35.3 g xylitol/L (Q(xylitol) = 0.380 g/L.h, Y(xylitol/S) = 0.69 g/g), and 13.9 g citric acid/L (Q(CA) = 0.146 g/L.h, Y(CA/S) = 0.63 g/g) were obtained. The economic efficiency (E(p/euro)) parameter identified vinasses as a lower-cost and more effective nutrient source in comparison to CSL.

  16. Reactivation steps by 2-PAM of tabun-inhibited human acetylcholinesterase: reducing the computational cost in hybrid QM/MM methods.

    PubMed

    da Silva Gonçalves, Arlan; França, Tanos Celmar Costa; Caetano, Melissa Soares; Ramalho, Teodorico Castro

    2014-01-01

    The present work describes a simple integrated Quantum Mechanics/Molecular Mechanics method developed to study the reactivation steps by pralidoxime (2-PAM) of acetylcholinesterase (AChE) inhibited by the neurotoxic agent tabun. The method was tested on an AChE model and proved able to corroborate most of the results obtained previously through a more complex and time-consuming methodology, showing it to be suitable for this kind of mechanistic study at a lower computational cost.

  17. Potlining Additives

    SciTech Connect

    Rudolf Keller

    2004-08-10

    In this project, a concept to improve the performance of aluminum production cells by introducing potlining additives was examined and tested. Boron oxide was added to cathode blocks, and titanium was dissolved in the metal pool; this resulted in the formation of titanium diboride and caused the molten aluminum to wet the carbonaceous cathode surface. Such wetting reportedly leads to operational improvements and extended cell life. In addition, boron oxide suppresses cyanide formation. This final report presents and discusses the results of this project. Substantial economic benefits for the practical implementation of the technology are projected, especially for modern cells with graphitized blocks. For example, with an energy savings of about 5% and an increase in pot life from 1500 to 2500 days, a cost savings of $0.023 per pound of aluminum produced is projected for a 200 kA pot.

  18. Comparison between low-cost marker-less and high-end marker-based motion capture systems for the computer-aided assessment of working ergonomics.

    PubMed

    Patrizi, Alfredo; Pennestrì, Ettore; Valentini, Pier Paolo

    2016-01-01

    The paper deals with the comparison between a high-end marker-based acquisition system and a low-cost marker-less methodology for the assessment of human posture during working tasks. The low-cost methodology is based on the use of a single Microsoft Kinect V1 device. The high-end acquisition system is the BTS SMART, which requires reflective markers to be placed on the subject's body. Three practical working activities involving object lifting and displacement have been investigated. The operational risk has been evaluated according to the lifting equation proposed by the American National Institute for Occupational Safety and Health. The results of the study show that the risk multipliers computed from the two acquisition methodologies are very close for all the analysed activities. In agreement with this outcome, the marker-less methodology based on the Microsoft Kinect V1 device seems very promising for promoting the dissemination of computer-aided assessment of ergonomics while maintaining good accuracy and affordable costs. PRACTITIONER’S SUMMARY: The study is motivated by the increasing interest for on-site working ergonomics assessment. We compared a low-cost marker-less methodology with a high-end marker-based system. We tested them on three different working tasks, assessing the working risk of lifting loads. The two methodologies showed comparable precision in all the investigations.

  19. Low-cost computer classification of land cover in the Portland area, Oregon, by signature extension techniques

    USGS Publications Warehouse

    Gaydos, Leonard

    1978-01-01

    The cost of classifying 5,607 square kilometers (2,165 sq. mi.) in the Portland area was less than 8 cents per square kilometer ($0.0788, or $0.2041 per square mile). Besides saving in costs, this and other signature extension techniques may be useful in completing land use and land cover mapping in other large areas where multispectral and multitemporal Landsat data are available in digital form but other source materials are generally lacking.
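
    The reported unit cost can be cross-checked against the study area with simple arithmetic:

```python
# Cross-check of the reported classification cost for the Portland study.
area_km2 = 5607          # area classified, in square kilometers
cost_per_km2 = 0.0788    # reported unit cost, in dollars per square kilometer
total_cost = area_km2 * cost_per_km2
# roughly $442 for the whole area at the reported unit rate
```

    The square-mile figures are consistent: 2,165 sq. mi. at $0.2041 per square mile gives approximately the same total.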

  20. Computer simulation to predict energy use, greenhouse gas emissions and costs for production of fluid milk using alternative processing methods

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Computer simulation is a useful tool for benchmarking the electrical and fuel energy consumption and water use in a fluid milk plant. In this study, a computer simulation model of the fluid milk process based on high temperature short time (HTST) pasteurization was extended to include models for pr...

  1. Noise Threshold and Resource Cost of Fault-Tolerant Quantum Computing with Majorana Fermions in Hybrid Systems.

    PubMed

    Li, Ying

    2016-09-16

    Fault-tolerant quantum computing in systems composed of both Majorana fermions and topologically unprotected quantum systems, e.g., superconducting circuits or quantum dots, is studied in this Letter. Errors caused by topologically unprotected quantum systems need to be corrected with error-correction schemes, for instance, the surface code. We find that the error-correction performance of such a hybrid topological quantum computer is not superior to a normal quantum computer unless the topological charge of Majorana fermions is insusceptible to noise. If errors changing the topological charge are rare, the fault-tolerance threshold is much higher than the threshold of a normal quantum computer and a surface-code logical qubit could be encoded in only tens of topological qubits instead of about 1,000 normal qubits.

  2. Noise Threshold and Resource Cost of Fault-Tolerant Quantum Computing with Majorana Fermions in Hybrid Systems

    NASA Astrophysics Data System (ADS)

    Li, Ying

    2016-09-01

    Fault-tolerant quantum computing in systems composed of both Majorana fermions and topologically unprotected quantum systems, e.g., superconducting circuits or quantum dots, is studied in this Letter. Errors caused by topologically unprotected quantum systems need to be corrected with error-correction schemes, for instance, the surface code. We find that the error-correction performance of such a hybrid topological quantum computer is not superior to a normal quantum computer unless the topological charge of Majorana fermions is insusceptible to noise. If errors changing the topological charge are rare, the fault-tolerance threshold is much higher than the threshold of a normal quantum computer and a surface-code logical qubit could be encoded in only tens of topological qubits instead of about 1,000 normal qubits.

  3. 48 CFR 27.406-2 - Additional data requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... the prescribed form, for reproduction, and for delivery. In order to minimize storage costs for the... (d)). (c) Absent an established program for dissemination of computer software, agencies should not order additional computer software under the clause at 52.227-16, for the sole purpose of...

  4. Cost Reduction through the Use of Additive Manufacturing (3D Printing) and Collaborative Product Lifecycle Management Technologies to Enhance the Navy’s Maintenance Programs

    DTIC Science & Technology

    2013-08-30

    Excerpt of a maintenance-level table (Intermediate Level: Military and Civilian, Medium/Medium; Depot Level: Civilian, High/High). D. ADDITIVE MANUFACTURING: AM, more commonly known as 3D... A table of AM processes and materials follows: eutectic metals, edible materials (Granular); direct metal laser sintering (DMLS): most metal alloys; electron beam melting (EBM): titanium alloys; selective laser melting (SLM): titanium alloys, cobalt chrome alloys, stainless steel, aluminum; selective heat sintering (SHS): thermoplastic powder.

  5. 12 CFR Appendix K to Part 226 - Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM TRUTH IN LENDING (REGULATION Z) Pt. 226, App. K... property value: Interest rate: Monthly advance: Initial draw: Line of credit: Initial Loan Charges Closing...: $301.80 Initial draw: $1,000 Line of credit: $4,000 Initial Loan Charges Closing costs: $5,000...

  6. 12 CFR Appendix K to Part 1026 - Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... PROTECTION TRUTH IN LENDING (REGULATION Z) Pt. 1026, App. K Appendix K to Part 1026—Total Annual Loan Cost... Terms Age of youngest borrower: Appraised property value: Interest rate: Monthly advance: Initial draw... Appraised property value: $100,000 Interest rate: 9% Monthly advance: $301.80 Initial draw: $1,000 Line...

  7. 12 CFR Appendix K to Part 226 - Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM TRUTH IN LENDING (REGULATION Z) Pt. 226, App. K... property value: Interest rate: Monthly advance: Initial draw: Line of credit: Initial Loan Charges Closing...: $301.80 Initial draw: $1,000 Line of credit: $4,000 Initial Loan Charges Closing costs: $5,000...

  8. Cost Reduction Through the Use of Additive Manufacturing (3d Printing) and Collaborative Product Life Cycle Management Technologies to Enhance the Navy’s Maintenance Programs

    DTIC Science & Technology

    2013-09-01

    Excerpt of a maintenance-level table (Intermediate Level: Military and Civilian, Medium/Medium; Depot Level: Civilian, High/High). D. ADDITIVE MANUFACTURING: AM, more commonly known as 3D printing, is a... A table of AM processes and materials follows: thermoplastics (e.g., PLA, ABS), HDPE, eutectic metals, edible materials (Granular); direct metal laser sintering (DMLS): most metal alloys; electron beam melting (EBM): titanium alloys; selective laser melting (SLM): titanium alloys, cobalt chrome alloys, stainless steel, aluminum; selective heat sintering (SHS): thermoplastic powder.

  9. Dataset of calcified plaque condition in the stenotic coronary artery lesion obtained using multidetector computed tomography to indicate the addition of rotational atherectomy during percutaneous coronary intervention.

    PubMed

    Akutsu, Yasushi; Hamazaki, Yuji; Sekimoto, Teruo; Kaneko, Kyouichi; Kodama, Yusuke; Li, Hui-Ling; Suyama, Jumpei; Gokan, Takehiko; Sakai, Koshiro; Kosaki, Ryota; Yokota, Hiroyuki; Tsujita, Hiroaki; Tsukamoto, Shigeto; Sakurai, Masayuki; Sambe, Takehiko; Oguchi, Katsuji; Uchida, Naoki; Kobayashi, Shinichi; Aoki, Atsushi; Kobayashi, Youichi

    2016-06-01

    Our data show the regional coronary artery calcium scores (lesion CAC) on multidetector computed tomography (MDCT) and the cross-sectional imaging on MDCT angiography (CTA) in the target lesions of patients with stable angina pectoris who were scheduled for percutaneous coronary intervention (PCI). CAC and CTA data were measured using a 128-slice scanner (Somatom Definition AS+; Siemens Medical Solutions, Forchheim, Germany) before PCI. CAC was measured in a non-contrast-enhanced scan, quantified using the Calcium Score module of SYNAPSE VINCENT software (Fujifilm Co., Tokyo, Japan), and expressed in Agatston units. CTA then continued with contrast-enhanced ECG gating to measure the severity of the calcified plaque condition. We show that both CAC and CTA data can be used as a benchmark when considering the addition of rotational atherectomy during PCI for severely calcified plaque lesions.

  10. Dataset of calcified plaque condition in the stenotic coronary artery lesion obtained using multidetector computed tomography to indicate the addition of rotational atherectomy during percutaneous coronary intervention

    PubMed Central

    Akutsu, Yasushi; Hamazaki, Yuji; Sekimoto, Teruo; Kaneko, Kyouichi; Kodama, Yusuke; Li, Hui-Ling; Suyama, Jumpei; Gokan, Takehiko; Sakai, Koshiro; Kosaki, Ryota; Yokota, Hiroyuki; Tsujita, Hiroaki; Tsukamoto, Shigeto; Sakurai, Masayuki; Sambe, Takehiko; Oguchi, Katsuji; Uchida, Naoki; Kobayashi, Shinichi; Aoki, Atsushi; Kobayashi, Youichi

    2016-01-01

    Our data show the regional coronary artery calcium scores (lesion CAC) on multidetector computed tomography (MDCT) and the cross-sectional imaging on MDCT angiography (CTA) in the target lesions of patients with stable angina pectoris who were scheduled for percutaneous coronary intervention (PCI). CAC and CTA data were measured using a 128-slice scanner (Somatom Definition AS+; Siemens Medical Solutions, Forchheim, Germany) before PCI. CAC was measured in a non-contrast-enhanced scan, quantified using the Calcium Score module of SYNAPSE VINCENT software (Fujifilm Co., Tokyo, Japan), and expressed in Agatston units. CTA then continued with contrast-enhanced ECG gating to measure the severity of the calcified plaque condition. We show that both CAC and CTA data can be used as a benchmark when considering the addition of rotational atherectomy during PCI for severely calcified plaque lesions. PMID:26977441

  11. Synthesis of Bridged Heterocycles via Sequential 1,4- and 1,2-Addition Reactions to α,β-Unsaturated N-Acyliminium Ions: Mechanistic and Computational Studies.

    PubMed

    Yazici, Arife; Wille, Uta; Pyne, Stephen G

    2016-02-19

    Novel tricyclic bridged heterocyclic systems can be readily prepared from sequential 1,4- and 1,2-addition reactions of allyl and 3-substituted allylsilanes to indolizidine and quinolizidine α,β-unsaturated N-acyliminium ions. These reactions involve a novel N-assisted, transannular 1,5-hydride shift. Such a mechanism was supported by examining the reaction of a dideuterated indolizidine, α,β-unsaturated N-acyliminium ion precursor, which provided specifically dideuterated tricyclic bridged heterocyclic products, and from computational studies. In contrast, the corresponding pyrrolo[1,2-a]azepine system did not provide the corresponding tricyclic bridged heterocyclic product and gave only a bis-allyl adduct, while more substituted versions gave novel furo[3,2-d]pyrrolo[1,2-a]azepine products. Such heterocyclic systems would be expected to be useful scaffolds for the preparation of libraries of novel compounds for new drug discovery programs.

  12. Performance and Cost-Effectiveness of Computed Tomography Lung Cancer Screening Scenarios in a Population-Based Setting: A Microsimulation Modeling Analysis in Ontario, Canada

    PubMed Central

    ten Haaf, Kevin; Tammemägi, Martin C.; Bondy, Susan J.; van der Aalst, Carlijn M.; Gu, Sumei; de Koning, Harry J.

    2017-01-01

    Background The National Lung Screening Trial (NLST) results indicate that computed tomography (CT) lung cancer screening for current and former smokers with three annual screens can be cost-effective in a trial setting. However, the cost-effectiveness in a population-based setting with >3 screening rounds is uncertain. Therefore, the objective of this study was to estimate the cost-effectiveness of lung cancer screening in a population-based setting in Ontario, Canada, and evaluate the effects of screening eligibility criteria. Methods and Findings This study used microsimulation modeling informed by various data sources, including the Ontario Health Insurance Plan (OHIP), Ontario Cancer Registry, smoking behavior surveys, and the NLST. Persons, born between 1940 and 1969, were examined from a third-party health care payer perspective across a lifetime horizon. Starting in 2015, 576 CT screening scenarios were examined, varying by age to start and end screening, smoking eligibility criteria, and screening interval. Among the examined outcome measures were lung cancer deaths averted, life-years gained, percentage ever screened, costs (in 2015 Canadian dollars), and overdiagnosis. The results of the base-case analysis indicated that annual screening was more cost-effective than biennial screening. Scenarios with eligibility criteria that required as few as 20 pack-years were dominated by scenarios that required higher numbers of accumulated pack-years. In general, scenarios that applied stringent smoking eligibility criteria (i.e., requiring higher levels of accumulated smoking exposure) were more cost-effective than scenarios with less stringent smoking eligibility criteria, with modest differences in life-years gained. Annual screening between ages 55–75 for persons who smoked ≥40 pack-years and who currently smoke or quit ≤10 y ago yielded an incremental cost-effectiveness ratio of $41,136 Canadian dollars ($33,825 in May 1, 2015, United States dollars) per

  13. User manual for AQUASTOR: a computer model for cost analysis of aquifer thermal energy storage coupled with district heating or cooling systems. Volume I. Main text

    SciTech Connect

    Huber, H.D.; Brown, D.R.; Reilly, R.W.

    1982-04-01

    A computer model called AQUASTOR was developed for calculating the cost of district heating (cooling) using thermal energy supplied by an aquifer thermal energy storage (ATES) system. The AQUASTOR model can simulate ATES district heating systems using stored hot water or ATES district cooling systems using stored chilled water. AQUASTOR simulates the complete ATES district heating (cooling) system, which consists of two principal parts: the ATES supply system and the district heating (cooling) distribution system. The supply system submodel calculates the life-cycle cost of thermal energy supplied to the distribution system by simulating the technical design and cash flows for the exploration, development, and operation of the ATES supply system. The distribution system submodel calculates the life-cycle cost of heat (chill) delivered by the distribution system to the end-users by simulating the technical design and cash flows for the construction and operation of the distribution system. The model combines the technical characteristics of the supply system and the technical characteristics of the distribution system with financial and tax conditions for the entities operating the two systems into one techno-economic model. This provides the flexibility to individually or collectively evaluate the impact of different economic and technical parameters, assumptions, and uncertainties on the cost of providing district heating (cooling) with an ATES system. This volume contains the main text, including introduction, program description, input data instruction, a description of the output, and Appendix H, which contains the indices for supply input parameters, distribution input parameters, and AQUASTOR subroutines.
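
    A life-cycle cost calculation of the kind AQUASTOR's submodels perform can be sketched as a levelized unit cost. This simplified present-value form is an assumption for illustration only, not the model's actual equations (which also cover exploration, taxes, and financing):

```python
def levelized_cost(capital, annual_om, annual_energy, years, discount_rate):
    # Simplified levelized unit cost of delivered thermal energy:
    # present value of all costs divided by present value of the
    # energy delivered over the system lifetime (assumed form).
    pv = lambda amount, t: amount / (1 + discount_rate) ** t
    pv_costs = capital + sum(pv(annual_om, t) for t in range(1, years + 1))
    pv_energy = sum(pv(annual_energy, t) for t in range(1, years + 1))
    return pv_costs / pv_energy
```

    With a zero discount rate this reduces to total cost divided by total energy; a positive rate weights near-term costs and deliveries more heavily, which is why financing assumptions shift the computed cost of district heat.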

  14. 18F-Fluorodeoxyglucose Positron Emission Tomography/Computed Tomography Accuracy in the Staging of Non-Small Cell Lung Cancer: Review and Cost-Effectiveness

    PubMed Central

    Gómez León, Nieves; Escalona, Sofía; Bandrés, Beatriz; Belda, Cristobal; Callejo, Daniel; Blasco, Juan Antonio

    2014-01-01

    The aim of this clinical study was to compare the accuracy and cost-effectiveness of PET/CT in the staging of non-small cell lung cancer (NSCLC). Material and Methods. This cross-sectional, prospective study included 103 patients with histologically confirmed NSCLC. All patients were examined using PET/CT with intravenous contrast medium. Those with disease stage ≤IIB underwent surgery (n = 40). Disease stage was confirmed based on histology results, which were compared with those of PET/CT and positron emission tomography (PET) and computed tomography (CT) separately. The 63 patients classified with ≥IIIA disease stage by PET/CT did not undergo surgery. The cost-effectiveness of PET/CT for disease classification was examined using a decision tree analysis. Results. Compared with histology, the accuracy of PET/CT for disease staging has a positive predictive value of 80%, a negative predictive value of 95%, a sensitivity of 94%, and a specificity of 82%. For PET alone, these values are 53%, 66%, 60%, and 50%, whereas for CT alone they are 68%, 86%, 76%, and 72%, respectively. The incremental cost-effectiveness of PET/CT over CT alone was €17,412 per quality-adjusted life-year (QALY). Conclusion. In our clinical study, PET/CT using intravenous contrast medium was an accurate and cost-effective method for staging of patients with NSCLC. PMID:25431665
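
    The predictive values reported above follow from standard confusion-matrix definitions; a minimal sketch (the counts in the usage note are hypothetical, not the study's):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    # Standard definitions used when comparing a staging test against
    # the histology reference standard.
    return {
        "sensitivity": tp / (tp + fn),   # true positives among diseased
        "specificity": tn / (tn + fp),   # true negatives among healthy
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }
```

    For example, a balanced hypothetical table with 45 true positives, 5 false positives, 5 false negatives, and 45 true negatives yields 90% for all four metrics; note that PPV and NPV, unlike sensitivity and specificity, depend on disease prevalence in the sample.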

  15. OPTIM: Computer program to generate a vertical profile which minimizes aircraft fuel burn or direct operating cost. User's guide

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The program generates a profile of altitude, airspeed, and flight path angle as a function of range between a given origin and destination for particular transport aircraft models provided by NASA. Inputs to the program include the vertical wind profile, the aircraft takeoff weight, the costs of time and fuel, and certain constraint parameters and control flags. The profile can be near-optimum in the sense of minimizing: (1) fuel, (2) time, or (3) a combination of fuel and time (direct operating cost, DOC). As an option, the user can also specify the length of time the flight is to span. The theory behind the technical details of this program is also presented.
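
    The fuel/time trade-off in option (3) reduces to a weighted sum; a minimal sketch with assumed units and parameter names (not OPTIM's actual interface):

```python
def direct_operating_cost(fuel_burned_kg, time_hours,
                          fuel_cost_per_kg, time_cost_per_hour):
    # Direct operating cost as the weighted sum of fuel burned and
    # flight time; the profile optimizer trades one against the other
    # depending on the relative cost weights.
    return fuel_cost_per_kg * fuel_burned_kg + time_cost_per_hour * time_hours
```

    Setting the time cost to zero recovers the minimum-fuel objective, and setting the fuel cost to zero recovers the minimum-time objective, which is how the three options in the abstract relate.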

  16. Accuracy of a Low-Cost Novel Computer-Vision Dynamic Movement Assessment: Potential Limitations and Future Directions

    NASA Astrophysics Data System (ADS)

    McGroarty, M.; Giblin, S.; Meldrum, D.; Wetterling, F.

    2016-04-01

    The aim of the study was to perform a preliminary validation of a low cost markerless motion capture system (CAPTURE) against an industry gold standard (Vicon). Measurements of knee valgus and flexion during the performance of a countermovement jump (CMJ) between CAPTURE and Vicon were compared. After correction algorithms were applied to the raw CAPTURE data acceptable levels of accuracy and precision were achieved. The knee flexion angle measured for three trials using Capture deviated by -3.8° ± 3° (left) and 1.7° ± 2.8° (right) compared to Vicon. The findings suggest that low-cost markerless motion capture has potential to provide an objective method for assessing lower limb jump and landing mechanics in an applied sports setting. Furthermore, the outcome of the study warrants the need for future research to examine more fully the potential implications of the use of low-cost markerless motion capture in the evaluation of dynamic movement for injury prevention.

  17. Setting up a Low-Cost Lab Management System for a Multi-Purpose Computing Laboratory Using Virtualisation Technology

    ERIC Educational Resources Information Center

    Mok, Heng Ngee; Lee, Yeow Leong; Tan, Wee Kiat

    2012-01-01

    This paper describes how a generic computer laboratory equipped with 52 workstations is set up for teaching IT-related courses and other general purpose usage. The authors have successfully constructed a lab management system based on decentralised, client-side software virtualisation technology using Linux and free software tools from VMware that…

  18. Development, Implementation, and Cost-Assessment of an Integrated Computer-Assisted Instruction Course on Drug Interactions.

    ERIC Educational Resources Information Center

    Narducci, Warren A.

    1985-01-01

    A study of the feasibility of using integrated computer-assisted instruction in a drug interaction course revealed that, despite the high initial time and financial investment, the potential educational benefits and high student acceptance of the instruction support its application in other curriculum areas. (MSE)

  19. Implications of Using Computer-Based Training with the AN/SQQ-89(v) Sonar System: Operating and Support Costs

    DTIC Science & Technology

    2012-06-01

    Excerpt from the report's list of abbreviations: Defense Science Board; ECR, Electronic Classroom; ERNT, Executive Review of Navy Training; ETS, Engineering and Technical Services; EXCEL, Excellence... Delivery Systems for Web-Based Technology: In A schools, CBT is conducted in an electronic classroom (ECR) environment. The ECR consists of several... ECRs. The average age of the computers was approximately 6 years (Naval Inspector General, 2009, p. 5). The IG group found that most ECRs...

  20. Money for Research, Not for Energy Bills: Finding Energy and Cost Savings in High Performance Computer Facility Designs

    SciTech Connect

    Drewmark Communications; Sartor, Dale; Wilson, Mark

    2010-07-01

    High-performance computing facilities in the United States consume an enormous amount of electricity, cutting into research budgets and challenging public- and private-sector efforts to reduce energy consumption and meet environmental goals. However, these facilities can greatly reduce their energy demand through energy-efficient design of the facility itself. Using a case study of a facility under design, this article discusses strategies and technologies that can be used to help achieve energy reductions.

  1. User manual for AQUASTOR: a computer model for cost analysis of aquifer thermal-energy storage coupled with district-heating or cooling systems. Volume II. Appendices

    SciTech Connect

    Huber, H.D.; Brown, D.R.; Reilly, R.W.

    1982-04-01

    A computer model called AQUASTOR was developed for calculating the cost of district heating (cooling) using thermal energy supplied by an aquifer thermal energy storage (ATES) system. The AQUASTOR model can simulate ATES district heating systems using stored hot water or ATES district cooling systems using stored chilled water. AQUASTOR simulates the complete ATES district heating (cooling) system, which consists of two principal parts: the ATES supply system and the district heating (cooling) distribution system. The supply system submodel calculates the life-cycle cost of thermal energy supplied to the distribution system by simulating the technical design and cash flows for the exploration, development, and operation of the ATES supply system. The distribution system submodel calculates the life-cycle cost of heat (chill) delivered by the distribution system to the end-users by simulating the technical design and cash flows for the construction and operation of the distribution system. The model combines the technical characteristics of the supply system and the technical characteristics of the distribution system with financial and tax conditions for the entities operating the two systems into one techno-economic model. This provides the flexibility to individually or collectively evaluate the impact of different economic and technical parameters, assumptions, and uncertainties on the cost of providing district heating (cooling) with an ATES system. This volume contains all the appendices, including supply and distribution system cost equations and models, descriptions of predefined residential districts, key equations for the cooling degree-hour methodology, a listing of the sample case output, and appendix H, which contains the indices for supply input parameters, distribution input parameters, and AQUASTOR subroutines.
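The life-cycle cost logic described above can be illustrated with a toy levelized-cost calculation. All numbers and the function below are hypothetical, a sketch of the general idea rather than AQUASTOR's actual cost equations:

```python
def levelized_cost(capital, annual_om, annual_energy_gj, years, rate):
    """Levelized cost of delivered energy ($/GJ) over the project life."""
    discount = [(1.0 + rate) ** -t for t in range(1, years + 1)]
    pv_costs = capital + annual_om * sum(discount)     # present value of costs
    pv_energy = annual_energy_gj * sum(discount)       # discounted energy delivered
    return pv_costs / pv_energy

# Invented example: $1M capital, $50k/yr O&M, 20,000 GJ/yr over 20 years at 5%
lcoe = levelized_cost(1e6, 5e4, 2e4, 20, 0.05)   # ≈ $6.5 per GJ
```

In AQUASTOR's terms, the supply and distribution submodels each produce such a life-cycle cost, which the techno-economic model then combines under the two operators' financial and tax conditions.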

  2. GME: at what cost?

    PubMed

    Young, David W

    2003-11-01

    Current computing methods impede determining the real cost of graduate medical education. However, a more accurate estimate could be obtained if policy makers would allow for the application of basic cost-accounting principles, including consideration of department-level costs, unbundling of joint costs, and other factors.

  3. Troubleshooting Costs

    NASA Astrophysics Data System (ADS)

    Kornacki, Jeffrey L.

    Seventy-six million cases of foodborne disease occur each year in the United States alone. Medical and lost productivity costs of the most common pathogens are estimated to be $5.6-9.4 billion. Product recalls, whether from foodborne illness or spoilage, result in added costs to manufacturers in a variety of ways. These may include expenses associated with lawsuits from real or allegedly stricken individuals and lawsuits from shorted customers. Other costs include those associated with efforts involved in finding the source of the contamination and eliminating it, and include time when lines are shut down and therefore non-productive, additional non-routine testing, consultant fees, time and personnel required to overhaul the entire food safety system, lost market share to competitors, and the cost associated with redesign of the factory and redesign or acquisition of more hygienic equipment. The cost of an effective quality assurance plan is well worth incurring to prevent the situations described.

  4. Construction and field test of a programmable and self-cleaning auto-sampler controlled by a low-cost one-board computer

    NASA Astrophysics Data System (ADS)

    Stadler, Philipp; Farnleitner, Andreas H.; Zessner, Matthias

    2016-04-01

    This presentation describes in-depth how a low-cost one-board computer was used for substantial improvement of established measuring systems through the construction and implementation of a purposeful complementary device for on-site sample pretreatment. A fully automated on-site device was developed and field-tested that enables water sampling with simultaneous filtration as well as an effective cleaning procedure for the device's components. The described auto-sampler is controlled by a low-cost one-board computer and designed for sample pretreatment, with minimal sample alteration, to meet the requirements of on-site measurement devices that cannot handle coarse suspended solids within the measurement procedure or cycle. The automated sample pretreatment was tested for over one year for rapid and on-site enzymatic activity (beta-D-glucuronidase, GLUC) determination in sediment-laden stream water. The formerly used proprietary sampling set-up was assumed to lead to a significant damping of the measurement signal due to its susceptibility to clogging and to debris and biofilm accumulation. Results show that the installation of the developed apparatus considerably enhanced the error-free running time of connected measurement devices and increased the measurement accuracy to a previously unmatched quality.

  5. Performance, throughput, and cost of in-home training for the Army Reserve: Using asynchronous computer conferencing as an alternative to resident training

    SciTech Connect

    Hahn, H.A. ); Ashworth, R.L. Jr.; Phelps, R.H. ); Byers, J.C. )

    1990-01-01

    Asynchronous computer conferencing (ACC) was investigated as an alternative to resident training for the Army Reserve Component (RC). Specifically, the goals were to (1) evaluate the performance and throughput of ACC as compared with traditional Resident School instruction and (2) determine the cost-effectiveness of developing and implementing ACC. Fourteen RC students took a module of the Army Engineer Officer Advanced Course (EOAC) via ACC. Course topics included Army doctrine, technical engineering subjects, leadership, and presentation skills. Resident content was adapted for presentation via ACC. The programs of instruction for ACC and the equivalent resident course were identical; only the media used for presentation were changed. Performance on tests, homework, and practical exercises; self-assessments of learning; throughput; and cost data were the measures of interest. Comparison data were collected on RC students taking the course in residence. Results indicated that there were no performance differences between the two groups. Students taking the course via ACC perceived greater learning benefit than did students taking the course in residence. Resident throughput was superior to ACC throughput, both in terms of numbers of students completing and time to complete the course. In spite of this fact, however, ACC was more cost-effective than resident training.

  6. A computational study of the addition of ReO3L (L = Cl(-), CH3, OCH3 and Cp) to ethenone.

    PubMed

    Aniagyei, Albert; Tia, Richard; Adei, Evans

    2016-01-01

    The periselectivity and chemoselectivity of the addition of transition metal oxides of the type ReO3L (L = Cl(-), CH3, OCH3 and Cp) to ethenone have been explored at the M06 and B3LYP/LACVP* levels of theory. The activation barriers and reaction energies for the stepwise and concerted addition pathways involving multiple spin states have been computed. In the reaction of ReO3L (L = Cl(-), OCH3, CH3 and Cp) with ethenone, the concerted [2 + 2] addition of the metal oxide across the C=C and C=O double bonds to form either metalla-2-oxetane-3-one or metalla-2,4-dioxolane is kinetically the most favored, over the formation of metalla-2,5-dioxolane-3-one from the direct [3 + 2] addition pathway. The trends in activation energies for the formation of metalla-2-oxetane-3-one and metalla-2,4-dioxolane are Cp < Cl(-) < OCH3 < CH3 and Cp < OCH3 < CH3 < Cl(-), respectively; the corresponding trends in reaction energies are Cp < OCH3 < Cl(-) < CH3 and Cp < CH3 < OCH3 < Cl(-). The concerted [3 + 2] addition of the metal oxide across the C=C double bond of the ethenone to form metalla-2,5-dioxolane-3-one is thermodynamically the most favored for the ligand L = Cp. The direct [2 + 2] addition pathways leading to the formation of metalla-2-oxetane-3-one and metalla-2,4-dioxolane are thermodynamically the most favored for the ligands L = OCH3 and Cl(-). The differences between the calculated [2 + 2] activation barriers for the addition of the metal oxide LReO3 across the C=C and C=O functionalities of ethenone are small, except for the cases of L = Cl(-) and OCH3. The rearrangements of metalla-2-oxetane-3-one to metalla-2,5-dioxolane-3-one, even though feasible, are unfavorable due to the high activation energies of their rate-determining steps. For the rearrangement of metalla-2-oxetane-3-one to metalla-2,5-dioxolane-3-one, the trend in activation barriers is found to follow the order OCH3 < Cl(-) < CH3 < Cp. The trends in the activation energies for

  7. Determination of Zinc-Based Additives in Lubricating Oils by Flow-Injection Analysis with Flame-AAS Detection Exploiting Injection with a Computer-Controlled Syringe.

    PubMed

    Pignalosa, Gustavo; Knochen, Moisés; Cabrera, Noel

    2005-01-01

    A flow-injection system is proposed for the determination of metal-based additives in lubricating oils. The system, operating under computer control, uses a motorised syringe for measuring and injecting the oil sample (200 μL) in a kerosene stream, where it is dispersed by means of a packed mixing reactor and carried to an atomic absorption spectrometer which is used as detector. Zinc was used as model analyte. Two different systems were evaluated, one for low concentrations (range 0-10 ppm) and the second capable of providing higher dilution rates for high concentrations (range 0.02%-0.2% w/w). The sampling frequency was about 30 samples/h. Calibration curves fitted a second-degree regression model (r(2) = 0.996). Commercial samples with high and low zinc levels were analysed by the proposed method and the results were compared with those obtained with the standard ASTM method. The t test for mean values showed no significant differences at the 95% confidence level. Precision (RSD%) was better than 5% (2% typical) for the high concentrations system. The carryover between successive injections was found to be negligible.
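A second-degree calibration model like the one reported above can be sketched with NumPy. The standards and signals below are synthetic, and the inversion helper is illustrative rather than part of the reported method:

```python
import numpy as np

# Synthetic calibration standards (ppm) with a slightly curved response
conc = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
signal = 0.05 * conc - 0.001 * conc ** 2

coeffs = np.polyfit(conc, signal, 2)      # fit a second-degree model
fit = np.polyval(coeffs, conc)
r2 = 1 - np.sum((signal - fit) ** 2) / np.sum((signal - signal.mean()) ** 2)

def conc_from_signal(coeffs, s, lo=0.0, hi=10.0):
    """Invert the quadratic calibration within the calibrated range."""
    a, b, c = coeffs
    roots = np.roots([a, b, c - s])
    return next(r.real for r in roots
                if abs(r.imag) < 1e-9 and lo - 1e-9 <= r.real <= hi + 1e-9)
```

Restricting the inversion to the calibrated range matters because a quadratic has two roots; only one lies inside the 0-10 ppm working interval.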

  8. Determination of Zinc-Based Additives in Lubricating Oils by Flow-Injection Analysis with Flame-AAS Detection Exploiting Injection with a Computer-Controlled Syringe

    PubMed Central

    Pignalosa, Gustavo; Cabrera, Noel

    2005-01-01

    A flow-injection system is proposed for the determination of metal-based additives in lubricating oils. The system, operating under computer control, uses a motorised syringe for measuring and injecting the oil sample (200 μL) in a kerosene stream, where it is dispersed by means of a packed mixing reactor and carried to an atomic absorption spectrometer which is used as detector. Zinc was used as model analyte. Two different systems were evaluated, one for low concentrations (range 0–10 ppm) and the second capable of providing higher dilution rates for high concentrations (range 0.02%–0.2% w/w). The sampling frequency was about 30 samples/h. Calibration curves fitted a second-degree regression model (r 2 = 0.996). Commercial samples with high and low zinc levels were analysed by the proposed method and the results were compared with those obtained with the standard ASTM method. The t test for mean values showed no significant differences at the 95% confidence level. Precision (RSD%) was better than 5% (2% typical) for the high concentrations system. The carryover between successive injections was found to be negligible. PMID:18924720

  9. Reducing Communication in Algebraic Multigrid Using Additive Variants

    SciTech Connect

    Vassilevski, Panayot S.; Yang, Ulrike Meier

    2014-02-12

    Algebraic multigrid (AMG) has proven to be an effective scalable solver on many high performance computers. However, its increasing communication complexity on coarser levels has been shown to seriously impact its performance on computers with high communication cost. Additive AMG variants provide increased parallelism and fewer messages per cycle, but they generally exhibit slower convergence. Here we present various new additive variants with convergence rates that are significantly improved compared to the classical additive algebraic multigrid method, and we investigate their potential for decreased communication and improved communication-computation overlap, features that are essential for good performance on future exascale architectures.

  10. What Constitutes a "Good" Sensitivity Analysis? Elements and Tools for a Robust Sensitivity Analysis with Reduced Computational Cost

    NASA Astrophysics Data System (ADS)

    Razavi, Saman; Gupta, Hoshin; Haghnegahdar, Amin

    2016-04-01

    Global sensitivity analysis (GSA) is a systems theoretic approach to characterizing the overall (average) sensitivity of one or more model responses across the factor space, by attributing the variability of those responses to different controlling (but uncertain) factors (e.g., model parameters, forcings, and boundary and initial conditions). GSA can be very helpful to improve the credibility and utility of Earth and Environmental System Models (EESMs), as these models are continually growing in complexity and dimensionality with continuous advances in understanding and computing power. However, conventional approaches to GSA suffer from (1) an ambiguous characterization of sensitivity, and (2) poor computational efficiency, particularly as the problem dimension grows. Here, we identify several important sensitivity-related characteristics of response surfaces that must be considered when investigating and interpreting the "global sensitivity" of a model response (e.g., a metric of model performance) to its parameters/factors. Accordingly, we present a new and general sensitivity and uncertainty analysis framework, Variogram Analysis of Response Surfaces (VARS), based on an analogy to "variogram analysis", that characterizes a comprehensive spectrum of information on sensitivity. We prove, theoretically, that Morris (derivative-based) and Sobol (variance-based) methods and their extensions are special cases of VARS, and that their SA indices are contained within the VARS framework. We also present a practical strategy for the application of VARS to real-world problems, called STAR-VARS, including a new sampling strategy, called "star-based sampling". Our results across several case studies show the STAR-VARS approach to provide reliable and stable assessments of "global" sensitivity, while being at least 1-2 orders of magnitude more efficient than the benchmark Morris and Sobol approaches.
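The variogram view of sensitivity can be illustrated with a toy directional variogram estimator. This is a sketch of the general idea only, not the STAR-VARS algorithm; the model and sampling scheme are invented:

```python
import numpy as np

def model(x):
    """Toy model response over a 2-D factor space."""
    return np.sin(3.0 * x[:, 0]) + 0.1 * x[:, 1]

def directional_variogram(f, dim, h_values, n=2000, d=2, seed=0):
    """Estimate gamma(h) = 0.5 * E[(f(x + h*e_dim) - f(x))^2] by sampling."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(0.0, 1.0, size=(n, d))
    gammas = []
    for h in h_values:
        xh = x.copy()
        xh[:, dim] += h          # perturb only the chosen factor
        gammas.append(0.5 * np.mean((f(xh) - f(x)) ** 2))
    return np.array(gammas)

h = [0.05, 0.1, 0.2]
g0 = directional_variogram(model, 0, h)   # sensitive factor: large gamma(h)
g1 = directional_variogram(model, 1, h)   # weak factor: tiny gamma(h)
```

The shape of gamma(h) across perturbation scales h is what distinguishes this view from a single derivative-based (small-h) or variance-based (large-h) index.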

  11. Application of a dual-resolution voxellation scheme to small ROI reconstruction in iterative CBCT for the reduction of computational cost

    NASA Astrophysics Data System (ADS)

    Lee, Minsik; Cho, Hyosung; Je, Uikyu; Hong, Daeki; Park, Yeonok; Park, Cheulkyu; Cho, Heemoon; Choi, Sungil; Koo, Yangseo

    2014-11-01

    In iterative methods for cone-beam computed tomography (CBCT) reconstruction, the use of a huge system matrix is the primary computational bottleneck and is still an obstacle to the more widespread use of these methods in practice. In this paper, to put iterative methods to practical applications, we propose a pragmatic idea, the so-called dual-resolution voxellation scheme, for a small region-of-interest (ROI) reconstruction in CBCT, in which voxels outside the ROI are binned at a coarser resolution such as 2×2×2, 4×4×4, 8×8×8, 16×16×16, etc., while the voxel size within the ROI remains unchanged. In some situations of medical diagnosis, physicians are interested only in a small ROI containing a target diagnosis from the examined structure. We implemented an efficient compressed-sensing (CS)-based reconstruction algorithm with the proposed voxellation scheme incorporated and performed both simulation and experimental works to investigate the imaging characteristics. Our results indicate that the proposed voxellation scheme seems to be effective in reducing the computational cost considerably for a small ROI reconstruction in iterative CBCT, with the image quality inside the ROI not being noticeably impaired.
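The unknown-count saving from the binning scheme described above can be sketched with NumPy. The volume sizes and the 2×2×2 binning helper below are illustrative, not the paper's actual geometry:

```python
import numpy as np

def bin_2x2x2(vol):
    """Average 2x2x2 blocks of a volume whose dimensions are even."""
    a, b, c = (s // 2 for s in vol.shape)
    return vol.reshape(a, 2, b, 2, c, 2).mean(axis=(1, 3, 5))

vol = np.arange(4 ** 3, dtype=float).reshape(4, 4, 4)
coarse = bin_2x2x2(vol)                 # (2, 2, 2) coarse exterior grid

# Unknown count when only a 16^3 ROI keeps full resolution in a 128^3 volume
full_voxels = 128 ** 3
roi_voxels = 16 ** 3
dual_voxels = roi_voxels + (full_voxels - roi_voxels) // 8   # exterior binned 2x2x2
reduction = full_voxels / dual_voxels   # ≈ 7.9x fewer unknowns
```

Coarser exterior bins (4×4×4, 8×8×8, ...) shrink the system matrix further, at the price of a cruder model of the structures outside the ROI.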

  12. Calibration of forcefields for molecular simulation: sequential design of computer experiments for building cost-efficient kriging metamodels.

    PubMed

    Cailliez, Fabien; Bourasseau, Arnaud; Pernot, Pascal

    2014-01-15

    We present a global strategy for molecular simulation forcefield optimization, using recent advances in Efficient Global Optimization algorithms. During the course of the optimization process, probabilistic kriging metamodels are used that predict molecular simulation results for a given set of forcefield parameter values. This enables a thorough investigation of parameter space, and a global search for the minimum of a score function by properly integrating relevant uncertainty sources. Additional information about the forcefield parameters is obtained that is inaccessible with standard optimization strategies. In particular, uncertainty on the optimal forcefield parameters can be estimated, and transferred to simulation predictions. This global optimization strategy is benchmarked on the TIP4P water model.
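A kriging metamodel of the kind used above can be sketched in a few lines of NumPy. This toy zero-mean Gaussian-process interpolator stands in for the paper's actual metamodel; the kernel, length scale, and test function are all assumptions:

```python
import numpy as np

def rbf(a, b, ls):
    """Squared-exponential kernel between row-vector sets a and b."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

class Kriging:
    """Minimal zero-mean Gaussian-process (kriging) interpolator."""
    def __init__(self, X, y, ls=0.15, noise=1e-8):
        self.X, self.ls = X, ls
        K = rbf(X, X, ls) + noise * np.eye(len(X))   # jitter for stability
        self.alpha = np.linalg.solve(K, y)

    def predict(self, Xs):
        return rbf(Xs, self.X, self.ls) @ self.alpha

X = np.linspace(0.0, 1.0, 8)[:, None]      # training designs in [0, 1]
y = np.sin(4.0 * X[:, 0])                  # stand-in "simulation" results
gp = Kriging(X, y)
pred = gp.predict(np.array([[0.5]]))[0]    # close to sin(2.0) ≈ 0.909
```

In an Efficient Global Optimization loop, the metamodel's predictive variance (omitted here for brevity) is what guides where the next expensive simulation is run.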

  13. The Cambridge Face Tracker: Accurate, Low Cost Measurement of Head Posture Using Computer Vision and Face Recognition Software

    PubMed Central

    Thomas, Peter B. M.; Baltrušaitis, Tadas; Robinson, Peter; Vivian, Anthony J.

    2016-01-01

    Purpose We validate a video-based method of head posture measurement. Methods The Cambridge Face Tracker uses neural networks (constrained local neural fields) to recognize facial features in video. The relative position of these facial features is used to calculate head posture. First, we assess the accuracy of this approach against videos in three research databases where each frame is tagged with a precisely measured head posture. Second, we compare our method to a commercially available mechanical device, the Cervical Range of Motion device: four subjects each adopted 43 distinct head postures that were measured using both methods. Results The Cambridge Face Tracker achieved confident facial recognition in 92% of the approximately 38,000 frames of video from the three databases. The respective mean error in absolute head posture was 3.34°, 3.86°, and 2.81°, with a median error of 1.97°, 2.16°, and 1.96°. The accuracy decreased with more extreme head posture. Comparing The Cambridge Face Tracker to the Cervical Range of Motion Device gave correlation coefficients of 0.99 (P < 0.0001), 0.96 (P < 0.0001), and 0.99 (P < 0.0001) for yaw, pitch, and roll, respectively. Conclusions The Cambridge Face Tracker performs well under real-world conditions and within the range of normally-encountered head posture. It allows useful quantification of head posture in real time or from precaptured video. Its performance is similar to that of a clinically validated mechanical device. It has significant advantages over other approaches in that subjects do not need to wear any apparatus, and it requires only low cost, easy-to-setup consumer electronics. Translational Relevance Noncontact assessment of head posture allows more complete clinical assessment of patients, and could benefit surgical planning in future. PMID:27730008
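Head posture measured from tracked facial features is typically expressed as yaw, pitch, and roll extracted from a rotation matrix. A minimal sketch using the ZYX Euler convention, which is an assumption here and not necessarily the tracker's own convention:

```python
import numpy as np

def ypr_from_rotation(R):
    """Yaw, pitch, roll (degrees) from a rotation matrix, ZYX convention."""
    yaw = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
    pitch = np.degrees(np.arcsin(-np.clip(R[2, 0], -1.0, 1.0)))
    roll = np.degrees(np.arctan2(R[2, 1], R[2, 2]))
    return yaw, pitch, roll

def rot_z(deg):
    """Rotation about the vertical axis (pure yaw)."""
    t = np.radians(deg)
    return np.array([[np.cos(t), -np.sin(t), 0.0],
                     [np.sin(t), np.cos(t), 0.0],
                     [0.0, 0.0, 1.0]])

yaw, pitch, roll = ypr_from_rotation(rot_z(25.0))   # ≈ (25, 0, 0)
```

The decreasing accuracy at extreme postures reported above is consistent with this parameterization: near ±90° pitch the Euler angles become ill-conditioned and the facial features themselves self-occlude.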

  14. Application of off-line image processing for optimization in chest computed radiography using a low cost system.

    PubMed

    Muhogora, Wilbroad E; Msaki, Peter; Padovani, Renato

    2015-03-08

    The objective of this study was to improve the visibility of anatomical details by applying off-line image post-processing in chest computed radiography (CR). Four spatial domain-based external image processing techniques were developed by using MATLAB software version 7.0.0.19920 (R14) and image processing tools. The developed techniques were applied to sample images, and their visual appearance was confirmed by two consultant radiologists to be clinically adequate. The techniques were then applied to 200 chest clinical images and randomized with another 100 images previously processed online. These 300 images were presented to three experienced radiologists for image quality assessment using standard quality criteria. The mean and ranges of the average scores for the three radiologists were characterized for each of the developed techniques and imaging systems. The Mann-Whitney U-test was used to test the difference in detail visibility between the images processed using each of the developed techniques and the corresponding images processed using default algorithms. The results show that the visibility of anatomical features improved significantly (0.005 ≤ p ≤ 0.02) with combinations of intensity values adjustment and/or spatial linear filtering techniques for images acquired using 60 ≤ kVp ≤ 70. However, there was no improvement for images acquired using 102 ≤ kVp ≤ 107 (0.127 ≤ p ≤ 0.48). In conclusion, the use of external image processing for optimization can be effective in chest CR, but it should be implemented in consultation with the radiologists.
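The two operation families referred to above, intensity-value adjustment and spatial linear filtering, can be sketched as follows. This is illustrative NumPy, not the study's MATLAB implementations, and the window limits are invented:

```python
import numpy as np

def window_level(img, lo, hi):
    """Linear intensity-range adjustment: map [lo, hi] -> [0, 1] and clip."""
    return np.clip((img.astype(float) - lo) / (hi - lo), 0.0, 1.0)

def mean_filter3(img):
    """3x3 spatial linear (box) filter with edge replication."""
    p = np.pad(img, 1, mode='edge')
    return sum(p[i:i + img.shape[0], j:j + img.shape[1]]
               for i in range(3) for j in range(3)) / 9.0

img = np.array([[0, 100, 200], [50, 150, 250], [0, 0, 0]], float)
adj = window_level(img, 50, 200)   # stretch the diagnostic intensity range
smooth = mean_filter3(adj)         # suppress high-frequency noise
```

In practice such a smoothing kernel would be combined with (or replaced by) an edge-enhancing linear filter, since pure averaging trades noise for sharpness.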

  15. Program Demand Cost Model for Alaskan Schools. Eighth Edition.

    ERIC Educational Resources Information Center

    Morgan, Michael; Mearig, Tim; Coffee, Nathan

    The State of Alaska Department of Education has created a handbook for establishing budgets for the following three types of construction projects: new schools or additions; renovations; and combined new work and renovations. The handbook supports a demand cost model computer program that includes detailed renovation cost data, itemized by…

  16. Weight and cost estimating relationships for heavy lift airships

    NASA Technical Reports Server (NTRS)

    Gray, D. W.

    1979-01-01

    Weight and cost estimating relationships, including additional parameters that influence the cost and performance of heavy-lift airships (HLA), are discussed. Inputs to a closed loop computer program, consisting of useful load, forward speed, lift module positive or negative thrust, and rotors and propellers, are examined. Detail is given to the HLA cost and weight program (HLACW), which computes component weights, vehicle size, buoyancy lift, rotor and propeller thrust, and engine horsepower. This program solves the problem of interrelating the different aerostat, rotor, engine, and propeller sizes. Six sets of 'default parameters' are left for the operator to change during each computer run, enabling slight data manipulation without altering the program.

  17. Role of information systems in controlling costs: the electronic medical record (EMR) and the high-performance computing and communications (HPCC) efforts

    NASA Astrophysics Data System (ADS)

    Kun, Luis G.

    1994-12-01

    On October 18, 1991, the IEEE-USA produced an entity statement which endorsed the vital importance of the High Performance Computer and Communications Act of 1991 (HPCC) and called for the rapid implementation of all its elements. Efforts are now underway to develop a Computer Based Patient Record (CBPR), the National Information Infrastructure (NII) as part of the HPCC, and the so-called `Patient Card'. Multiple legislative initiatives which address these and related information technology issues are pending in Congress. Clearly, a national information system will greatly affect the way health care delivery is provided to the United States public. Timely and reliable information represents a critical element in any initiative to reform the health care system as well as to protect and improve the health of every person. Appropriately used, information technologies offer a vital means of improving the quality of patient care, increasing access to universal care and lowering overall costs within a national health care program. Health care reform legislation should reflect increased budgetary support and a legal mandate for the creation of a national health care information system by: (1) constructing a National Information Infrastructure; (2) building a Computer Based Patient Record System; (3) bringing the collective resources of our National Laboratories to bear in developing and implementing the NII and CBPR, as well as a security system with which to safeguard the privacy rights of patients and the physician-patient privilege; and (4) utilizing Government (e.g. DOD, DOE) capabilities (technology and human resources) to maximize resource utilization, create new jobs and accelerate technology transfer to address health care issues.

  18. 24 CFR 908.108 - Cost.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... computer hardware or software, or both, the cost of contracting for those services, or the cost of... operating budget. At the HA's option, the cost of the computer software may include service contracts...

  19. Isothiourea-catalysed enantioselective pyrrolizine synthesis: synthetic and computational studies

    PubMed Central

    Stark, Daniel G.; Williamson, Patrick; Gayner, Emma R.; Musolino, Stefania F.; Kerr, Ryan W. F.; Taylor, James E.; Slawin, Alexandra M. Z.; O'Riordan, Timothy J. C.

    2016-01-01

    The catalytic enantioselective synthesis of a range of cis-pyrrolizine carboxylate derivatives with outstanding stereocontrol (14 examples, >95 : 5 dr, >98 : 2 er) through an isothiourea-catalyzed intramolecular Michael addition-lactonisation and ring-opening approach from the corresponding enone acid is reported. An optimised and straightforward three-step synthetic route to the enone acid starting materials from readily available pyrrole-2-carboxaldehydes is delineated, with benzotetramisole (5 mol%) proving the optimal catalyst for the enantioselective process. Ring-opening of the pyrrolizine dihydropyranone products with either MeOH or a range of amines leads to the desired products in excellent yield and enantioselectivity. Computation has been used to probe the factors leading to high stereocontrol, with the formation of the observed cis-stereoisomer predicted to be kinetically and thermodynamically favoured. PMID:27489030

  20. Cost Considerations in Cloud Computing

    DTIC Science & Technology

    2014-01-01

    development of a range of new distributed file systems and databases that have better scalability properties than traditional SQL databases. Hadoop ...data. Many systems exist that extend or supplement Hadoop, such as Apache Accumulo, which provides a highly granular mechanism for managing security... Accumulo database, when implemented on Hadoop, has a data ingestion rate significantly higher than that provided by Oracle. However, it should be

  1. Breaking Barriers in Polymer Additive Manufacturing

    SciTech Connect

    Love, Lonnie J; Duty, Chad E; Post, Brian K; Lind, Randall F; Lloyd, Peter D; Kunc, Vlastimil; Peter, William H; Blue, Craig A

    2015-01-01

    Additive Manufacturing (AM) enables the creation of complex structures directly from a computer-aided design (CAD). There are limitations that prevent the technology from realizing its full potential. AM has been criticized for being slow and expensive with limited build size. Oak Ridge National Laboratory (ORNL) has developed a large scale AM system that improves upon each of these areas by more than an order of magnitude. The Big Area Additive Manufacturing (BAAM) system directly converts low cost pellets into a large, three-dimensional part at a rate exceeding 25 kg/h. By breaking these traditional barriers, it is possible for polymer AM to penetrate new manufacturing markets.

  2. Food additives

    PubMed Central

    Spencer, Michael

    1974-01-01

    Food additives are discussed from the food technology point of view. The reasons for their use are summarized: (1) to protect food from chemical and microbiological attack; (2) to even out seasonal supplies; (3) to improve their eating quality; (4) to improve their nutritional value. The various types of food additives are considered, e.g. colours, flavours, emulsifiers, bread and flour additives, preservatives, and nutritional additives. The paper concludes with consideration of those circumstances in which the use of additives is (a) justified and (b) unjustified. PMID:4467857

  3. Real-space finite-difference calculation method of generalized Bloch wave functions and complex band structures with reduced computational cost.

    PubMed

    Tsukamoto, Shigeru; Hirose, Kikuji; Blügel, Stefan

    2014-07-01

    Generalized Bloch wave functions of bulk structures, which are composed of not only propagating waves but also decaying and growing evanescent waves, are known to be essential for defining the open boundary conditions in the calculations of the electronic surface states and scattering wave functions of surface and junction structures. Electronic complex band structures derived from the generalized Bloch wave functions are also essential for studying bound states of the surface and junction structures, which do not appear in conventional band structures. We present a novel calculation method to obtain the generalized Bloch wave functions of periodic bulk structures by solving a generalized eigenvalue problem, whose dimension is drastically reduced in comparison with the conventional generalized eigenvalue problem derived by Fujimoto and Hirose [Phys. Rev. B 67, 195315 (2003)]. The generalized eigenvalue problem derived in this work is mathematically equivalent to the conventional one, and thus we reduce the computational cost of solving the eigenvalue problem considerably, without any approximation or loss of strictness in the formulation. To demonstrate the performance of the present method, we carry out practical calculations of electronic complex band structures and electron transport properties of Al and Cu nanoscale systems. Moreover, employing atom-structured electrodes and jellium-approximated ones for both the Al and Si monatomic chains, we investigate how much the electron transport properties are unphysically affected by the jellium parts.
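To make the term concrete: a generalized eigenvalue problem A v = λ B v can be reduced to a standard one whenever B is invertible. A toy numerical sketch, unrelated to the paper's specific reduced formulation (the matrices here are random placeholders):

```python
import numpy as np

# Hypothetical small generalized eigenproblem  A v = lambda * B v
rng = np.random.default_rng(1)
n = 6
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n)) + n * np.eye(n)   # shift keeps B invertible

# Reduce to a standard eigenproblem for B^{-1} A (valid when B is invertible)
lam, V = np.linalg.eig(np.linalg.solve(B, A))

# Residual ||A v - lambda B v|| should vanish for every eigenpair
res = np.linalg.norm(A @ V - (B @ V) * lam, axis=0)
```

The cost of such a dense solve scales cubically with the problem dimension, which is why shrinking the dimension of the eigenproblem, as the paper does, translates directly into reduced computational cost.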

  4. The Hidden Costs of Owning a Microcomputer.

    ERIC Educational Resources Information Center

    McDole, Thomas L.

    Before purchasing computer hardware, individuals must consider the costs associated with the setup and operation of a microcomputer system. Included among the initial costs of purchasing a computer are the costs of the computer, one or more disk drives, a monitor, and a printer as well as the costs of such optional peripheral devices as a plotter…

  5. Marginal Costs and Formula-Based Funding.

    ERIC Educational Resources Information Center

    O'Connor, Ellen

    Marginal cost is the cost of producing an additional unit. In higher education, one marginal cost would be cost of educating an additional student. Formula-based budget determination for public higher education is usually based on average cost per student. This study estimates marginal cost and compares it with average cost. There are several…
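The marginal-versus-average distinction the study draws can be shown with a small, purely hypothetical calculation (the cost function, enrollment, and dollar figures below are invented for illustration and are not from the study):

```python
def average_cost(total: float, units: int) -> float:
    """Average cost: total cost spread evenly over all units."""
    return total / units

def marginal_cost(cost_fn, units: int) -> float:
    """Marginal cost: the extra cost of producing one more unit."""
    return cost_fn(units + 1) - cost_fn(units)

# Hypothetical institution: large fixed costs plus a per-student cost
# that sits well below the average -- the pattern the study examines.
def total_cost(students: int) -> float:
    fixed = 10_000_000.0               # buildings, administration
    return fixed + 4_000.0 * students  # instruction, services

avg = average_cost(total_cost(5_000), 5_000)  # 6000.0 per student
mc = marginal_cost(total_cost, 5_000)         # 4000.0 per student
print(f"average: ${avg:,.0f}, marginal: ${mc:,.0f}")
```

Because fixed costs are spread over the whole enrollment, the average cost overstates what one additional student costs, which is why formula funding based on average cost differs from a marginal-cost estimate.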

  6. Computational studies on the interactions among redox couples, additives and TiO2: implications for dye-sensitized solar cells.

    PubMed

    Asaduzzaman, Abu Md; Schreckenbach, Georg

    2010-11-21

    One of the major and unique components of dye-sensitized solar cells (DSSC) is the iodide/triiodide redox couple. Periodic density-functional calculations have been carried out to study the interactions among three different components of the DSSC, i.e. the redox shuttle, the TiO(2) semiconductor surface, and nitrogen-containing additives, with a focus on the implications for the performance of the DSSC. Iodide and bromide with alkali metal cations as counter-ions are strongly adsorbed on the TiO(2) surface. Small additive molecules also interact strongly with TiO(2). Both interactions induce a negative shift of the Fermi energy of TiO(2). The negative shift of the Fermi energy is related to the performance of the cell: it increases the open-circuit voltage of the cell and retards the injection dynamics (decreasing the short-circuit current). Additive molecules, however, interact relatively weakly with iodide and triiodide.

  7. Characterization of pulmonary nodules on computer tomography (CT) scans: the effect of additive white noise on features selection and classification performance

    NASA Astrophysics Data System (ADS)

    Osicka, Teresa; Freedman, Matthew T.; Ahmed, Farid

    2007-03-01

    The goal of this project is to use computer analysis to classify small lung nodules, identified on CT, into likely benign and likely malignant categories. We compared discrete wavelet transform (DWT) based features and a modification of classical features used and reported by others. To determine the best combination of features for classification, several intensities of white noise were added to the original images to determine the effect of such noise on classification accuracy. Two different approaches were used to determine the effect of noise: in the first method, the best features for classification of nodules on the original image were retained as noise was added. In the second approach, we recalculated the results to reselect the best classification features for each particular level of added noise. The CT images are from the National Lung Screening Trial (NLST) of the National Cancer Institute (NCI). For this study, nodules were extracted in window frames of three sizes. Malignant nodules were cytologically or histologically diagnosed, while benign nodules had two-year follow-up. A linear discriminant analysis with Fisher criterion (FLDA) approach was used for feature selection and classification, and a decision matrix for matched samples was used to compare classification accuracy. The initial-features mode revealed sensitivity to both the amount of noise and the size of the window frame. The recalculated-features mode proved more robust to noise, with no change in classification accuracy. This indicates that the best features for computer classification of lung nodules will differ with noise and, therefore, with exposure.

  8. Comprehensive cardiac assessment with multislice computed tomography: evaluation of left ventricular function and perfusion in addition to coronary anatomy in patients with previous myocardial infarction

    PubMed Central

    Henneman, M M; Schuijf, J D; Jukema, J W; Lamb, H J; de Roos, A; Dibbets, P; Stokkel, M P; van der Wall, E E; Bax, J J

    2006-01-01

    Objective To evaluate a comprehensive multislice computed tomography (MSCT) protocol in patients with previous infarction, including assessment of coronary artery stenoses, left ventricular (LV) function and perfusion. Patients and methods 16‐slice MSCT was performed in 21 patients with previous infarction; from the MSCT data, coronary artery stenoses, (regional and global) LV function and perfusion were assessed. Invasive coronary angiography and gated single‐photon emission computed tomography (SPECT) served as the reference standards for coronary artery stenoses and LV function/perfusion, respectively. Results 236 of 241 (98%) coronary artery segments were interpretable on MSCT. The sensitivity and specificity for detection of stenoses were 91% and 97%. Pearson's correlation showed excellent agreement for assessment of LV ejection fraction between MSCT and SPECT (49 (13)% v 53 (12)%, respectively, r  =  0.85). Agreement for assessment of regional wall motion was excellent (92%, κ  =  0.77). In 68 of 73 (93%) segments, MSCT correctly identified a perfusion defect as compared with SPECT, whereas the absence of perfusion defects was correctly detected in 277 of 284 (98%) segments. Conclusions MSCT permits accurate, non‐invasive assessment of coronary artery stenoses, LV function and perfusion in patients with previous infarction. All parameters can be assessed from a single dataset. PMID:16740917
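The perfusion figures quoted in this abstract reduce to the standard sensitivity/specificity arithmetic; a minimal sketch (Python is used here purely for illustration, and the function names are my own):

```python
def sensitivity(tp: int, fn: int) -> float:
    """Fraction of truly abnormal segments the test flags: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """Fraction of truly normal segments the test clears: TN / (TN + FP)."""
    return tn / (tn + fp)

# Perfusion counts quoted in the abstract: defects found in 68 of 73
# segments; absence of defects confirmed in 277 of 284 segments.
print(f"sensitivity: {sensitivity(tp=68, fn=5):.0%}")   # prints sensitivity: 93%
print(f"specificity: {specificity(tn=277, fp=7):.0%}")  # prints specificity: 98%
```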

  9. Food additives

    MedlinePlus

    ... or natural. Natural food additives include: Herbs or spices to add flavor to foods Vinegar for pickling ... Certain colors improve the appearance of foods. Many spices, as well as natural and man-made flavors, ...

  10. Systems engineering and integration: Cost estimation and benefits analysis

    NASA Technical Reports Server (NTRS)

    Dean, ED; Fridge, Ernie; Hamaker, Joe

    1990-01-01

    Space Transportation Avionics hardware and software cost has traditionally been estimated in Phase A and B using cost techniques which predict cost as a function of various cost predictive variables such as weight, lines of code, functions to be performed, quantities of test hardware, quantities of flight hardware, design and development heritage, complexity, etc. The output of such analyses has been life cycle costs, economic benefits, and related data. The major objectives of Cost Estimation and Benefits analysis are twofold: (1) to play a role in the evaluation of potential new space transportation avionics technologies, and (2) to benefit from emerging technological innovations. Both aspects of cost estimation and technology are discussed here. The role of cost analysis in the evaluation of potential technologies should be one of offering additional quantitative and qualitative information to aid decision-making. The cost analysis process needs to be fully integrated into the design process in such a way that cost trades, optimizations, and sensitivities are understood. Current hardware cost models tend to primarily use weights, functional specifications, quantities, design heritage, and complexity as metrics to predict cost. Software models mostly use functionality, volume of code, heritage, and complexity as cost descriptive variables. Basic research needs to be initiated to develop metrics more responsive to the trades which are required for future launch vehicle avionics systems. These would include cost estimating capabilities that are sensitive to technological innovations such as improved materials and fabrication processes, computer aided design and manufacturing, self checkout, and many others. In addition to basic cost estimating improvements, the process must be sensitive to the fact that no cost estimate can be quoted without also quoting a confidence associated with the estimate. In order to achieve this, better cost risk evaluation techniques are

  11. Computer modelling integrated with micro-CT and material testing provides additional insight to evaluate bone treatments: Application to a beta-glycan derived whey protein mice model.

    PubMed

    Sreenivasan, D; Tu, P T; Dickinson, M; Watson, M; Blais, A; Das, R; Cornish, J; Fernandez, J

    2016-01-01

    The primary aim of this study was to evaluate the influence of a whey protein diet on computationally predicted mechanical strength of murine bones in both trabecular and cortical regions of the femur. There was no significant influence on mechanical strength in cortical bone observed with increasing whey protein treatment, consistent with cortical tissue mineral density (TMD) and bone volume changes observed. Trabecular bone showed a significant decline in strength with increasing whey protein treatment when nanoindentation derived Young's moduli were used in the model. When microindentation, micro-CT phantom density or normalised Young's moduli were included in the model a non-significant decline in strength was exhibited. These results for trabecular bone were consistent with both trabecular bone mineral density (BMD) and micro-CT indices obtained independently. The secondary aim of this study was to characterise the influence of different sources of Young's moduli on computational prediction. This study aimed to quantify the predicted mechanical strength in 3D from these sources and evaluate if trends and conclusions remained consistent. For cortical bone, predicted mechanical strength behaviour was consistent across all sources of Young's moduli. There was no difference in treatment trend observed when Young's moduli were normalised. In contrast, trabecular strength due to whey protein treatment significantly reduced when material properties from nanoindentation were introduced. Other material property sources were not significant but emphasised the strength trend over normalised material properties. This shows strength at the trabecular level was attributed to both changes in bone architecture and material properties.

  12. GPU computing for systems biology.

    PubMed

    Dematté, Lorenzo; Prandi, Davide

    2010-05-01

    The development of detailed, coherent models of complex biological systems is recognized as a key requirement for integrating the increasing amount of experimental data. In addition, in silico simulation of biochemical models provides an easy way to test different experimental conditions, helping in the discovery of the dynamics that regulate biological systems. However, the computational power required by these simulations often exceeds that available on common desktop computers, and thus expensive high performance computing solutions are required. An emerging alternative is general-purpose scientific computing on graphics processing units (GPGPU), which offers the power of a small computer cluster at a cost of approximately $400. Computing with a GPU requires the development of specific algorithms, since the programming paradigm differs substantially from traditional CPU-based computing. In this paper, we review some recent efforts in exploiting the processing power of GPUs for the simulation of biological systems.

  13. Managing Information On Costs

    NASA Technical Reports Server (NTRS)

    Taulbee, Zoe A.

    1990-01-01

    Cost Management Model, CMM, software tool for planning, tracking, and reporting costs and information related to costs. Capable of estimating costs, comparing estimated to actual costs, performing "what-if" analyses on estimates of costs, and providing mechanism to maintain data on costs in format oriented to management. Number of supportive cost methods built in: escalation rates, production-learning curves, activity/event schedules, unit production schedules, set of spread distributions, tables of rates and factors defined by user, and full arithmetic capability. Import/export capability possible with 20/20 Spreadsheet available on Data General equipment. Program requires AOS/VS operating system available on Data General MV series computers. Written mainly in FORTRAN 77 but uses SGU (Screen Generation Utility).
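Among the built-in cost methods the summary lists, production-learning curves have a standard closed form (the Wright model, in which every doubling of cumulative quantity multiplies unit cost by a fixed rate). A hypothetical sketch of that method, not taken from CMM itself; the 85% rate and dollar figures are illustrative:

```python
import math

def unit_cost(first_unit_cost: float, n: int, learning_rate: float) -> float:
    """Wright learning curve: each doubling of cumulative quantity
    multiplies unit cost by `learning_rate` (e.g. 0.85 for an 85% curve)."""
    b = math.log2(learning_rate)
    return first_unit_cost * n ** b

def cumulative_cost(first_unit_cost: float, quantity: int,
                    learning_rate: float) -> float:
    """Total cost of producing `quantity` units under the curve."""
    return sum(unit_cost(first_unit_cost, n, learning_rate)
               for n in range(1, quantity + 1))

# With an 85% curve: unit 2 costs 85% of unit 1, unit 4 costs 85% of unit 2.
c1 = unit_cost(1000.0, 1, 0.85)  # 1000.0
c2 = unit_cost(1000.0, 2, 0.85)  # 850.0
c4 = unit_cost(1000.0, 4, 0.85)  # 722.5
print(c1, c2, c4)
```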

  14. Cost Index Flying

    DTIC Science & Technology

    2011-06-01

    continually alter applicable cost indexes. Computed KC-10 Cost Index Equation: using the dollar figures given above, our CI equation reads CI = CT / C... COST INDEX FLYING, Graduate Research Paper, John M. Mirtich, Major, USAF, AFIT/IMO/ENS/11-11, Department of the Air Force, Air University, Air Force Institute of Technology, Wright-Patterson Air Force Base, Ohio. Approved for public release: distribution unlimited.

  15. Do We Really Need Additional Contrast-Enhanced Abdominal Computed Tomography for Differential Diagnosis in Triage of Middle-Aged Subjects With Suspected Biliary Pain

    PubMed Central

    Hwang, In Kyeom; Lee, Yoon Suk; Kim, Jaihwan; Lee, Yoon Jin; Park, Ji Hoon; Hwang, Jin-Hyeok

    2015-01-01

    Abstract Enhanced computed tomography (CT) is widely used for evaluating acute biliary pain in the emergency department (ED). However, concern about radiation exposure from CT has also increased. We investigated the usefulness of pre-contrast CT for differential diagnosis in middle-aged subjects with suspected biliary pain. A total of 183 subjects, who visited the ED for suspected biliary pain from January 2011 to December 2012, were included. Retrospectively, pre-contrast phase and multiphase CT findings were reviewed, and the detection rate of findings suggesting disease requiring significant treatment by noncontrast CT (NCCT) was compared with that of multiphase CT. Approximately 70% of the subjects had a significant condition, including 1 case of gallbladder cancer and 126 (68.8%) cases requiring intervention (122 biliary stone-related diseases, 3 liver abscesses, and 1 liver hemangioma). The rate of overlooking malignancy without contrast enhancement was calculated to be 0% to 1.5%. Biliary stones and liver space-occupying lesions were found equally on NCCT and multiphase CT. The calculated probable rates of overlooking acute cholecystitis and biliary obstruction were at most 6.8% and 4.2%, respectively. The only incidental significant finding unrelated to the pain was 1 case of adrenal incidentaloma, which was also observed on NCCT. NCCT might be sufficient to detect life-threatening or significant disease requiring early treatment in young adults with biliary pain. PMID:25700321

  16. Decreased length of stay after addition of healthcare provider in emergency department triage: a comparison between computer-simulated and real-world interventions

    PubMed Central

    Al-Roubaie, Abdul Rahim; Goldlust, Eric Jonathan

    2013-01-01

    Objective (1) To determine the effects of adding a provider in triage on average length of stay (LOS) and the proportion of patients with LOS >6 h. (2) To assess the accuracy of computer simulation in predicting the magnitude of such effects on these metrics. Methods A group-level quasi-experimental trial comparing the St. Louis Veterans Affairs Medical Center emergency department (1) before intervention, (2) after institution of provider in triage, and discrete event simulation (DES) models of similar (3) ‘before’ and (4) ‘after’ conditions. The outcome measures were daily mean LOS and the percentage of patients with LOS >6 h. Results The DES-modelled intervention predicted a decrease in the proportion of patients with LOS >6 h from 19.0% to 13.1%, and a drop in the daily mean LOS from 249 to 200 min (p<0.0001). Following the (actual) intervention, the proportion of patients with LOS >6 h decreased from 19.9% to 14.3% (p<0.0001), with the daily mean LOS decreasing from 247 to 210 min (p<0.0001). Conclusion Physician and mid-level provider coverage at triage significantly reduced emergency department LOS in this setting. DES accurately predicted the magnitude of this effect. These results suggest further work in the generalisability of triage providers and in the utility of DES for predicting quantitative effects of process changes. PMID:22398851

  17. Phosphazene additives

    DOEpatents

    Harrup, Mason K; Rollins, Harry W

    2013-11-26

    An additive comprising a phosphazene compound that has at least two reactive functional groups and at least one capping functional group bonded to phosphorus atoms of the phosphazene compound. One of the at least two reactive functional groups is configured to react with cellulose and the other of the at least two reactive functional groups is configured to react with a resin, such as an amine resin of a polycarboxylic acid resin. The at least one capping functional group is selected from the group consisting of a short chain ether group, an alkoxy group, or an aryloxy group. Also disclosed are an additive-resin admixture, a method of treating a wood product, and a wood product.

  18. Regulatory use of computational toxicology tools and databases at the United States Food and Drug Administration's Office of Food Additive Safety.

    PubMed

    Arvidson, Kirk B; Chanderbhan, Ronald; Muldoon-Jacobs, Kristi; Mayer, Julie; Ogungbesan, Adejoke

    2010-07-01

    Over 10 years ago, the Office of Food Additive Safety (OFAS) in the FDA's Center for Food Safety and Applied Nutrition implemented the formal use of structure-activity relationship analysis and quantitative structure-activity relationship (QSAR) analysis in the premarket review of food-contact substances. More recently, OFAS has implemented the use of multiple QSAR software packages and has begun investigating the use of metabolism data and metabolism predictive models in our QSAR evaluations of food-contact substances. In this article, we provide an overview of the programs used in OFAS as well as a perspective on how to apply multiple QSAR tools in the review process of a new food-contact substance.

  19. Computer simulation for the growing probability of additional offspring with an advantageous reversal allele in the decoupled continuous-time mutation-selection model

    NASA Astrophysics Data System (ADS)

    Gill, Wonpyong

    2016-01-01

    This study calculated the growing probability of additional offspring with the advantageous reversal allele in an asymmetric sharply-peaked landscape using the decoupled continuous-time mutation-selection model. The growing probability was calculated for various population sizes, N, sequence lengths, L, selective advantages, s, fitness parameters, k, and measuring parameters, C. The saturated growing probability in the stochastic region was approximately the effective selective advantage, s*, when C≫1/Ns* and s*≪1. The present study suggests that the growing probability in the stochastic region in the decoupled continuous-time mutation-selection model can be described using the theoretical formula for the growing probability in the Moran two-allele model. The selective advantage ratio, which represents the ratio of the effective selective advantage to the selective advantage, does not depend on the population size, selective advantage, measuring parameter or fitness parameter; instead, the selective advantage ratio decreases with increasing sequence length.

  20. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING: APPLICATION OF COMPUTATIONAL BIOPHYSICAL TRANSPORT, COMPUTATIONAL CHEMISTRY, AND COMPUTATIONAL BIOLOGY

    EPA Science Inventory

    Computational toxicology (CompTox) leverages the significant gains in computing power and computational techniques (e.g., numerical approaches, structure-activity relationships, bioinformatics) realized over the last few years, thereby reducing costs and increasing efficiency i...

  1. Automated Water Analyser Computer Supported System (AWACSS) Part II: Intelligent, remote-controlled, cost-effective, on-line, water-monitoring measurement system.

    PubMed

    Tschmelak, Jens; Proll, Guenther; Riedt, Johannes; Kaiser, Joachim; Kraemmer, Peter; Bárzaga, Luis; Wilkinson, James S; Hua, Ping; Hole, J Patrick; Nudd, Richard; Jackson, Michael; Abuknesha, Ram; Barceló, Damià; Rodriguez-Mozaz, Sara; de Alda, Maria J López; Sacher, Frank; Stien, Jan; Slobodník, Jaroslav; Oswald, Peter; Kozmenko, Helena; Korenková, Eva; Tóthová, Lívia; Krascsenits, Zoltan; Gauglitz, Guenter

    2005-02-15

    A novel analytical system AWACSS (Automated Water Analyser Computer Supported System) based on immunochemical technology has been evaluated that can measure several organic pollutants at the low nanogram per litre level in a single few-minutes analysis without any prior sample pre-concentration or pre-treatment steps. Having in mind actual needs of water-sector managers related to the implementation of the Drinking Water Directive (DWD) [98/83/EC, 1998. Council Directive (98/83/EC) of 3 November 1998 relating to the quality of water intended for human consumption. Off. J. Eur. Commun. L330, 32-54] and Water Framework Directive (WFD) [2000/60/EC, 2000. Directive 2000/60/EC of the European Parliament and of the Council of 23 October 2000 establishing a framework for Community action in the field of water policy. Off. J. Eur. Commun. L327, 1-72], drinking, ground, surface, and waste waters were the major media used for the evaluation of the system performance. The first part of this article gave the reader an overview of the aims and scope of the AWACSS project as well as details about the basic technology, immunoassays, software, and networking developed and utilised within the research project. The second part reports on the system performance, first real sample measurements, and an international collaborative trial (inter-laboratory tests) to compare the biosensor with conventional analytical methods. The system's capability for analysing a wide range of environmental organic micro-pollutants, such as modern pesticides, endocrine disrupting compounds, and pharmaceuticals in surface, ground, drinking, and waste water is shown. In addition, a protocol using reconstitution of extracts of solid samples, developed and applied for analysis of river sediments and food samples, is presented. Finally, the overall performance of the AWACSS system in comparison to the conventional analytical techniques, which included liquid and gas chromatographic systems with diode-array UV and mass

  2. The UCLA MEDLARS computer system.

    PubMed

    Garvis, F J

    1966-01-01

    Under a subcontract with UCLA, the Planning Research Corporation has changed the MEDLARS system to make it possible to use the IBM 7094/7040 direct-couple computer instead of the Honeywell 800 for demand searches. The major tasks were the rewriting of the programs in COBOL and the copying of the stored information onto the narrower tapes that IBM computers require. (In the future NLM will copy the tapes for IBM computer users.) The differences in the software required by the two computers are noted. Major and costly revisions would be needed to adapt the large MEDLARS system to the smaller IBM 1401 and 1410 computers. In general, MEDLARS is transferable to other computers of the IBM 7000 class, the new IBM 360, and those of like size, such as the CDC 1604 or UNIVAC 1108, although additional changes are necessary. Potential future improvements are suggested.

  3. Results of an Experimental Program to Provide Low Cost Computer Searches of the NASA Information File to University Graduate Students in the Southeast. Final Report.

    ERIC Educational Resources Information Center

    Smetana, Frederick O.; Phillips, Dennis M.

    In an effort to increase dissemination of scientific and technological information, a program was undertaken whereby graduate students in science and engineering could request a computer-produced bibliography and/or abstracts of documents identified by the computer. The principal resource was the National Aeronautics and Space Administration…

  4. Reducing Communication in Algebraic Multigrid Using Additive Variants

    DOE PAGES

    Vassilevski, Panayot S.; Yang, Ulrike Meier

    2014-02-12

    Algebraic multigrid (AMG) has proven to be an effective scalable solver on many high performance computers. However, its increasing communication complexity on coarser levels has been shown to seriously impact its performance on computers with high communication cost. Additive AMG variants provide increased parallelism and fewer messages per cycle, but generally exhibit slower convergence. Here we present several new additive variants with convergence rates that are significantly improved compared to the classical additive algebraic multigrid method, and investigate their potential for decreased communication and improved communication-computation overlap, features that are essential for good performance on future exascale architectures.

  5. User manual for GEOCITY: a computer model for cost analysis of geothermal district-heating-and-cooling systems. Volume I. Main text

    SciTech Connect

    Huber, H.D.; Fassbender, L.L.; Bloomster, C.H.

    1982-09-01

    The purpose of this model is to calculate the costs of residential space heating, space cooling, and sanitary water heating or process heating (cooling) using geothermal energy from a hydrothermal reservoir. The model can calculate geothermal heating and cooling costs for residential developments, a multi-district city, or a point demand such as an industrial factory or commercial building. GEOCITY simulates the complete geothermal heating and cooling system, which consists of two principal parts: the reservoir and fluid transmission system and the distribution system. The reservoir and fluid transmission submodel calculates the life-cycle cost of thermal energy supplied to the distribution system by simulating the technical design and cash flows for the exploration, development, and operation of the reservoir and fluid transmission system. The distribution system submodel calculates the life-cycle cost of heat (chill) delivered by the distribution system to the end-users by simulating the technical design and cash flows for the construction and operation of the distribution system. Geothermal space heating is assumed to be provided by circulating hot water through radiators, convectors, fan-coil units, or other in-house heating systems. Geothermal process heating is provided by directly using the hot water or by circulating it through a process heat exchanger. Geothermal space or process cooling is simulated by circulating hot water through lithium bromide/water absorption chillers located at each building. Retrofit costs for both heating and cooling applications can be input by the user. The life-cycle cost of thermal energy from the reservoir and fluid transmission system to the distribution system and the life-cycle cost of heat (chill) to the end-users are calculated using discounted cash flow analysis.
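The discounted cash flow analysis that GEOCITY uses to produce life-cycle costs can be sketched in miniature. The project figures below are invented for illustration and are not GEOCITY inputs or outputs; the function names are my own:

```python
def npv(cash_flows, rate):
    """Net present value of a list of annual cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def levelized_cost(annual_costs, annual_output, rate):
    """Life-cycle unit cost: discounted costs divided by discounted output,
    so near-term dollars and deliveries weigh more than distant ones."""
    return npv(annual_costs, rate) / npv(annual_output, rate)

# Hypothetical district-heating project: capital outlay in year 0,
# then constant O&M costs and heat deliveries for three years.
costs = [1_000_000.0, 50_000.0, 50_000.0, 50_000.0]  # dollars
output = [0.0, 100_000.0, 100_000.0, 100_000.0]      # MWh delivered
print(f"levelized cost: ${levelized_cost(costs, output, 0.08):.2f}/MWh")
```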

  6. The Computer Fraud and Abuse Act of 1986. Hearing before the Committee on the Judiciary, United States Senate, Ninety-Ninth Congress, Second Session on S.2281, a Bill To Amend Title 18, United States Code, To Provide Additional Penalties for Fraud and Related Activities in Connection with Access Devices and Computers, and for Other Purposes.

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. Senate Committee on the Judiciary.

    The proposed legislation--S. 2281--would amend federal laws to provide additional penalties for fraud and related activities in connection with access devices and computers. The complete text of the bill and proceedings of the hearing are included in this report. Statements and materials submitted by the following committee members and witnesses…

  7. Cost-Effectiveness of Screening for Unhealthy Alcohol Use with %Carbohydrate Deficient Transferrin: Results From a Literature-Based Decision Analytic Computer Model

    PubMed Central

    Kapoor, Alok; Kraemer, Kevin L.; Smith, Kenneth J.; Roberts, Mark S.; Saitz, Richard

    2009-01-01

    Background The %carbohydrate deficient transferrin (%CDT) test offers objective evidence of unhealthy alcohol use but its cost-effectiveness in primary care conditions is unknown. Methods Using a decision tree and Markov model, we performed a literature-based cost-effectiveness analysis of 4 strategies for detecting unhealthy alcohol use in adult primary care patients: (i) Questionnaire Only, using a validated 3-item alcohol questionnaire; (ii) %CDT Only; (iii) Questionnaire followed by %CDT (Questionnaire-%CDT) if the questionnaire is negative; and (iv) No Screening. For those patients screening positive, clinicians performed more detailed assessment to characterize unhealthy use and determine therapy. We estimated costs using Medicare reimbursement and the Medical Expenditure Panel Survey. We determined sensitivity, specificity, prevalence of disease, and mortality from the medical literature. In the base case, we calculated the incremental cost-effectiveness ratio (ICER) in 2006 dollars per quality-adjusted life year ($/QALY) for a 50-year-old cohort. Results In the base case, the ICER for the Questionnaire-%CDT strategy was $15,500/QALY compared with the Questionnaire Only strategy. Other strategies were dominated. When the prevalence of unhealthy alcohol use exceeded 15% and screening age was <60 years, the Questionnaire-%CDT strategy costs less than $50,000/QALY compared to the Questionnaire Only strategy. Conclusions Adding %CDT to questionnaire-based screening for unhealthy alcohol use was cost-effective in our literature-based decision analytic model set in typical primary care conditions. Screening with %CDT should be considered for adults up to the age of 60 when the prevalence of unhealthy alcohol use is 15% or more and screening questionnaires are negative. PMID:19426168
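The base-case ICER reported above follows from a one-line definition: the cost difference between strategies divided by the QALY difference. A minimal sketch; the per-patient component figures below are invented so that the ratio reproduces the reported $15,500/QALY, and are not taken from the study:

```python
def icer(cost_new: float, cost_old: float,
         qaly_new: float, qaly_old: float) -> float:
    """Incremental cost-effectiveness ratio: extra dollars spent per
    extra quality-adjusted life year (QALY) gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical figures: the combined strategy spends $310 more per
# patient and gains 0.02 QALYs versus the questionnaire alone.
ratio = icer(cost_new=1_310.0, cost_old=1_000.0,
             qaly_new=15.52, qaly_old=15.50)
print(f"ICER: ${ratio:,.0f}/QALY")  # prints ICER: $15,500/QALY
```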

  8. Computer-Based Approach to the Navy’s Academic Remedial Training, Project PREST (Performance-Related Enabling Skills Training): A Cost-Effectiveness Evaluation.

    DTIC Science & Technology

    1981-05-01

    multiplying deviation scores by the test-form reliability (Kuder-Richardson 20), and then converting back to raw scores... Education and Training (N-5) contracted for the development and test of a computer-based approach, hereafter referred to as the Performance-related Enabling... NPROC SR 81-18, May 1981: Computer-Based Approach to the Navy's Academic

  9. FeO2/MgO(1 0 0) supported cluster: Computational pursual for a low-cost and low-temperature CO nanocatalyst

    NASA Astrophysics Data System (ADS)

    Zamora, A. Y.; Reveles, J. U.; Mejia-Olvera, R.; Baruah, T.; Zope, R. R.

    2014-09-01

    CO oxidation of only 0.23 eV. Regarding the CO catalytic activity of iron oxide species at low temperatures, it has been experimentally observed that thin oxide films on metals may indeed exhibit greatly enhanced catalytic activity compared to the underlying metal substrate under the same conditions [24]. In addition to the above studies, low-temperature CO oxidation over Ag-supported ultrathin MgO films was recently reported [17]. In this case, the activation barrier (0.7 eV) was lower than the corresponding barrier of CO oxidation on Pt(1 1 1) of 0.9 eV. The gas-phase reaction (½O2 + CO → CO2) was calculated to present an overall exothermicity of 3.2 eV. Importantly, this study showed the possibility of generating a catalyst in which the CO adsorption energy of only 0.4 eV is low enough to prevent CO poisoning, thereby enabling a low-temperature CO oxidation route and addressing the cold-start problem [25]. Despite the above-mentioned studies, the development of active and stable catalysts, without noble metals, for low-temperature CO oxidation under an ambient atmosphere remains a significant challenge. Earlier reports, as mentioned above, indicate that Fe2O3 is the most active iron oxide surface toward CO oxidation at high temperatures (∼300 °C) [8]. Furthermore, a number of theoretical and experimental cluster studies have also shown that selected iron oxide compositions and charge states are the most reactive toward CO oxidation, i.e. FeO2, Fe2O3, FeO2-, Fe2O3-, FeO+, FeO2+, Fe2O+, Fe2O2+, and Fe2O3+ [26,27]. The aim of the proposed work is to carry out a detailed investigation that will provide information about the electronic, geometrical, and catalytic properties of the iron oxide FeO2 cluster adsorbed on defect-free MgO(1 0 0) surfaces in the quest for a low-cost and low-temperature CO nanocatalyst. Note that thin oxide films have been found more active at low temperature [24] compared to iron oxide surfaces [14]. Our objective is to show

  10. Benefits and Costs of Ultraviolet Fluorescent Lighting

    PubMed Central

    Lestina, Diane C.; Miller, Ted R.; Knoblauch, Richard; Nitzburg, Marcia

    1999-01-01

    Objective To demonstrate the improvements in detection and recognition distances using fluorescent roadway delineation and auxiliary ultraviolet (UVA) headlights, and to determine the reduction in crashes needed to recover the increased costs of the UVA/fluorescent technology. Methods Field study comparisons with and without UVA headlights. Crash types potentially reduced by UVA/fluorescent technology were estimated using annual crash and injury incidence data from the General Estimates System (1995–1996) and the 1996 Fatality Analysis Reporting System. Crash costs were computed based on body region and threat-to-life injury severity. Results Detection and recognition distances improved significantly for pedestrian scenarios, with gains ranging from 34% to 117%. A 19% reduction in nighttime motor vehicle crashes involving pedestrians or pedal-cycles would pay for the additional UVA headlight costs. Alternatively, a 5.5% reduction in all relevant nighttime crashes would pay for the additional costs of UVA headlights and fluorescent highway paint combined. Conclusions If the increased detection and recognition distances resulting from using UVA/fluorescent technology as shown in this field study reduce relevant crashes by even small percentages, the benefit-cost ratios will still be greater than 2; thus, the UVA/fluorescent technology is very cost-effective and a definite priority for crash reductions.
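
    The break-even logic in the Results can be sketched as a back-of-the-envelope calculation; the numbers below are illustrative only, not the study's actual cost figures.

```python
def break_even_reduction(added_cost, baseline_crash_cost):
    """Fraction of baseline crash costs that must be avoided to break even."""
    return added_cost / baseline_crash_cost

def benefit_cost_ratio(avoided_crash_cost, added_cost):
    return avoided_crash_cost / added_cost

# Illustrative numbers, not the study's figures.
added = 190.0      # extra technology cost, hypothetical units
baseline = 1000.0  # expected crash cost over the same period, hypothetical
fraction = break_even_reduction(added, baseline)  # 0.19 -> a 19% reduction breaks even
bcr = benefit_cost_ratio(2.0 * added, added)      # ratio of 2 if twice the cost is avoided
```

    Any crash reduction above the break-even fraction pushes the benefit-cost ratio above 1, which is the sense in which even small percentage reductions make the technology pay off.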

  11. Cost Validation Using PRICE H

    NASA Technical Reports Server (NTRS)

    Jack, John; Kwan, Eric; Wood, Milana

    2011-01-01

    PRICE H was introduced into the JPL cost estimation tool set circa 2003. It became more widely available at JPL when IPAO funded the NASA-wide site license for all NASA centers. PRICE H was mainly used as one of the cost tools to validate proposal grassroots cost estimates. Program offices at JPL view PRICE H as an additional crosscheck to Team X (JPL Concurrent Engineering Design Center) estimates. PRICE H became widely accepted circa 2007 at JPL when the program offices moved away from grassroots cost estimation for Step 1 proposals. PRICE H is now one of the key cost tools used for cost validation, cost trades, and independent cost estimates.

  12. Coping with distributed computing

    SciTech Connect

    Cormell, L.

    1992-09-01

    The rapid increase in the availability of high performance, cost-effective RISC/UNIX workstations has been both a blessing and a curse. The blessing of having extremely powerful computing engines available on the desk top is well-known to many users. The user has tremendous freedom, flexibility, and control of his environment. That freedom can, however, become the curse of distributed computing. The user must become a system manager to some extent, he must worry about backups, maintenance, upgrades, etc. Traditionally these activities have been the responsibility of a central computing group. The central computing group, however, may find that it can no longer provide all of the traditional services. With the plethora of workstations now found on so many desktops throughout the entire campus or lab, the central computing group may be swamped by support requests. This talk will address several of these computer support and management issues by providing some examples of the approaches taken at various HEP institutions. In addition, a brief review of commercial directions or products for distributed computing and management will be given.

  13. An Analysis of the RCA Price-S Cost Estimation Model as it Relates to Current Air Force Computer Software Acquisition and Management.

    DTIC Science & Technology

    1979-12-01

    of skill level, experience, productivity, efficiency, overhead, and labor rates for individual organizations on software development costs. This...

  14. Computation Directorate 2008 Annual Report

    SciTech Connect

    Crawford, D L

    2009-03-25

    Whether a computer is simulating the aging and performance of a nuclear weapon, the folding of a protein, or the probability of rainfall over a particular mountain range, the necessary calculations can be enormous. Our computers help researchers answer these and other complex problems, and each new generation of system hardware and software widens the realm of possibilities. Building on Livermore's historical excellence and leadership in high-performance computing, Computation added more than 331 trillion floating-point operations per second (teraFLOPS) of power to LLNL's computer room floors in 2008. In addition, Livermore's next big supercomputer, Sequoia, advanced ever closer to its 2011-2012 delivery date, as architecture plans and the procurement contract were finalized. Hyperion, an advanced technology cluster test bed that teams Livermore with 10 industry leaders, made a big splash when it was announced during Michael Dell's keynote speech at the 2008 Supercomputing Conference. The Wall Street Journal touted Hyperion as a 'bright spot amid turmoil' in the computer industry. Computation continues to measure and improve the costs of operating LLNL's high-performance computing systems by moving hardware support in-house, by measuring causes of outages to apply resources asymmetrically, and by automating most of the account and access authorization and management processes. These improvements enable more dollars to go toward fielding the best supercomputers for science, while operating them at less cost and greater responsiveness to the customers.

  15. 24 CFR 908.108 - Cost.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Cost. 908.108 Section 908.108..., RENTAL VOUCHER, AND MODERATE REHABILITATION PROGRAMS § 908.108 Cost. (a) General. The costs of the... computer hardware or software, or both, the cost of contracting for those services, or the cost...

  16. 24 CFR 908.108 - Cost.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 4 2011-04-01 2011-04-01 false Cost. 908.108 Section 908.108..., RENTAL VOUCHER, AND MODERATE REHABILITATION PROGRAMS § 908.108 Cost. (a) General. The costs of the... computer hardware or software, or both, the cost of contracting for those services, or the cost...

  17. Outline of cost-benefit analysis and a case study

    NASA Technical Reports Server (NTRS)

    Kellizy, A.

    1978-01-01

    The methodology of cost-benefit analysis is reviewed and a case study involving solar cell technology is presented. Emphasis is placed on simplifying the technique in order to permit a technical person not trained in economics to undertake a cost-benefit study comparing alternative approaches to a given problem. The role of economic analysis in management decision making is discussed. In simplifying the methodology it was necessary to restrict the scope and applicability of this report. Additional considerations and constraints are outlined. Examples are worked out to demonstrate the principles. A computer program which performs the computational aspects appears in the appendix.
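
    A minimal worked example of the kind of computation such a cost-benefit program performs, using a simple net-present-value comparison of two hypothetical alternatives (the discount rate and cash flows are invented for illustration):

```python
def npv(cashflows, rate):
    """Net present value of yearly cash flows, year 0 first."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical alternatives: up-front cost, then five years of benefits.
option_a = [-1000.0] + [300.0] * 5
option_b = [-600.0] + [180.0] * 5
rate = 0.08

# Choose the alternative with the larger net present value.
best = max(("A", npv(option_a, rate)), ("B", npv(option_b, rate)),
           key=lambda pair: pair[1])
```

    The comparison shows the core idea of the methodology: discount each alternative's costs and benefits to a common point in time, then rank by net present value.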

  18. Polylactides in additive biomanufacturing.

    PubMed

    Poh, Patrina S P; Chhaya, Mohit P; Wunner, Felix M; De-Juan-Pardo, Elena M; Schilling, Arndt F; Schantz, Jan-Thorsten; van Griensven, Martijn; Hutmacher, Dietmar W

    2016-12-15

    New advanced manufacturing technologies under the alias of additive biomanufacturing allow the design and fabrication of a range of products, from pre-operative models, cutting guides and medical devices to scaffolds. The process of printing cells, extracellular matrix (ECM) and biomaterials (bioinks, powders, etc.) in 3 dimensions to generate in vitro and/or in vivo tissue analogue structures has been termed bioprinting. To advance further in additive biomanufacturing, there are many aspects that can be learned from the wider additive manufacturing (AM) industry, which has progressed tremendously since its introduction into the manufacturing sector. First, this review gives an overview of additive manufacturing and of both industry and academia efforts in addressing specific challenges in AM technologies to drive toward an AM-enabled industrial revolution. Considerations of poly(lactides) as a biomaterial in additive biomanufacturing are then discussed. Challenges in the wider additive biomanufacturing field are discussed in terms of (a) biomaterials; (b) computer-aided design, engineering and manufacturing; (c) AM and additive biomanufacturing printer hardware; and (d) system integration. Finally, the outlook for additive biomanufacturing is discussed.

  19. Cost Realism Handbook for Assuring More Realistic Contractor Cost Proposals

    DTIC Science & Technology

    1985-05-01

    realism. Solicitation: Specify cost realism in addition to Government estimated cost as cost evaluation sub-criteria in the solicitation and specify the...Analogy techniques involve extrapolations from actual costs for similar systems. Other techniques may include the use of industry-wide factors. Parametric...of sources (e.g., actual costs for similar systems and industry-wide factors) may be used to supplement the estimates based on specific contractor

  20. Cost goals

    NASA Technical Reports Server (NTRS)

    Hoag, J.

    1981-01-01

    Cost goal activities for the point focusing parabolic dish program are reported. Cost goals involve three tasks: (1) determination of the value of the dish systems to potential users; (2) the cost targets of the dish system are set out; (3) the value side and cost side are integrated to provide information concerning the potential size of the market for parabolic dishes. The latter two activities are emphasized.

  1. Tracking Costs

    ERIC Educational Resources Information Center

    Erickson, Paul W.

    2010-01-01

    Even though there's been a slight reprieve in energy costs, the reality is that the cost of non-renewable energy is increasing, and state education budgets are shrinking. One way to keep energy and operations costs from overshadowing education budgets is to develop a 10-year energy audit plan to eliminate waste. First, facility managers should…

  2. ENERGY COSTS OF IAQ CONTROL THROUGH INCREASED VENTILATION IN A SMALL OFFICE IN A WARM, HUMID CLIMATE: PARAMETRIC ANALYSIS USING THE DOE-2 COMPUTER MODEL

    EPA Science Inventory

    The report gives results of a series of computer runs using the DOE-2.1E building energy model, simulating a small office in a hot, humid climate (Miami). These simulations assessed the energy and relative humidity (RH) penalties when the outdoor air (OA) ventilation rate is inc...

  3. Advanced Crew Personal Support Computer (CPSC) task

    NASA Technical Reports Server (NTRS)

    Muratore, Debra

    1991-01-01

    The topics are presented in view graph form and include: background; objectives of task; benefits to the Space Station Freedom (SSF) Program; technical approach; baseline integration; and growth and evolution options. The objective is to: (1) introduce new computer technology into the SSF Program; (2) augment core computer capabilities to meet additional mission requirements; (3) minimize risk in upgrading technology; and (4) provide a low cost way to enhance crew and ground operations support.

  4. Designer's unified cost model

    NASA Technical Reports Server (NTRS)

    Freeman, William T.; Ilcewicz, L. B.; Swanson, G. D.; Gutowski, T.

    1992-01-01

    A conceptual and preliminary designers' cost prediction model has been initiated. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer-aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. The approach, goals, plans, and progress are presented for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).

  5. Designers' unified cost model

    NASA Technical Reports Server (NTRS)

    Freeman, W.; Ilcewicz, L.; Swanson, G.; Gutowski, T.

    1992-01-01

    The Structures Technology Program Office (STPO) at NASA LaRC has initiated development of a conceptual and preliminary designers' cost prediction model. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. This paper presents the team members, approach, goals, plans, and progress to date for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).

  6. ''When Cost Measures Contradict''

    SciTech Connect

    Montgomery, W. D.; Smith, A. E.; Biggar, S. L.; Bernstein, P. M.

    2003-05-09

    When regulators put forward new economic or regulatory policies, there is a need to compare the costs and benefits of these new policies to existing policies and other alternatives to determine which policy is most cost-effective. For command and control policies, it is quite difficult to compute costs, but for more market-based policies, economists have had a great deal of success employing general equilibrium models to assess a policy's costs. Not all cost measures, however, arrive at the same ranking. Furthermore, cost measures can produce contradictory results for a specific policy. These problems make it difficult for a policy-maker to determine the best policy. For a cost measure to be of value, one would like to be confident of two things. First, one wants to be sure whether the policy is a winner or a loser. Second, one wants to be confident that a measure produces the correct policy ranking. That is, one wants to have confidence in a cost measure's ability to correctly rank policies from most beneficial to most harmful. This paper empirically analyzes these two properties of different cost measures as they pertain to assessing the costs of carbon abatement policies, especially the Kyoto Protocol, under alternative assumptions about implementation.
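
    The ranking problem the paper describes can be illustrated with a toy example (the measures and numbers below are invented, not taken from the paper's general equilibrium model): two defensible cost measures can order the same policies differently.

```python
# Toy illustration: costs of two hypothetical policies under two measures.
policies = {
    "policy_1": {"gdp_loss": 1.2, "welfare_loss": 0.8},
    "policy_2": {"gdp_loss": 0.9, "welfare_loss": 1.1},
}

def ranking(measure):
    """Policies ordered from cheapest to most costly under one measure."""
    return sorted(policies, key=lambda p: policies[p][measure])

print(ranking("gdp_loss"))      # policy_2 looks cheaper by GDP loss
print(ranking("welfare_loss"))  # policy_1 looks cheaper by welfare loss
```

    When the two orderings disagree, as here, a policy-maker relying on a single measure may pick a policy that another equally standard measure would reject, which is exactly the difficulty the paper examines.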

  7. The Personnel Office and Computer Services: Tomorrow.

    ERIC Educational Resources Information Center

    Nicely, H. Phillip, Jr.

    1980-01-01

    It is suggested that the director of personnel should be making maximum use of available computer services. Four concerns of personnel directors are cited: number of government reports required, privacy and security, cost of space for personnel records and files, and the additional decision-making tools required for collective bargaining…

  8. An investigation on the effect of second-order additional thickness distributions to the upper surface of an NACA 64 sub 1-212 airfoil. [using flow equations and a CDC 7600 digital computer

    NASA Technical Reports Server (NTRS)

    Hague, D. S.; Merz, A. W.

    1975-01-01

    An investigation was conducted on a CDC 7600 digital computer to determine the effects of additional thickness distributions to the upper surface of an NACA 64 sub 1 - 212 airfoil. Additional thickness distributions employed were in the form of two second-order polynomial arcs which have a specified thickness at a given chordwise location. The forward arc disappears at the airfoil leading edge, the aft arc disappears at the airfoil trailing edge. At the juncture of the two arcs, x = x, continuity of slope is maintained. The effect of varying the maximum additional thickness and its chordwise location on airfoil lift coefficient, pitching moment, and pressure distribution was investigated. Results were obtained at a Mach number of 0.2 with an angle-of-attack of 6 degrees on the basic NACA 64 sub 1 - 212 airfoil, and all calculations employ the full potential flow equations for two dimensional flow. The relaxation method of Jameson was employed for solution of the potential flow equations.

  9. Theoretical effect of modifications to the upper surface of two NACA airfoils using smooth polynomial additional thickness distributions which emphasize leading edge profile and which vary quadratically at the trailing edge. [using flow equations and a CDC 7600 computer

    NASA Technical Reports Server (NTRS)

    Merz, A. W.; Hague, D. S.

    1975-01-01

    An investigation was conducted on a CDC 7600 digital computer to determine the effects of additional thickness distributions to the upper surface of the NACA 64-206 and 64 sub 1 - 212 airfoils. The additional thickness distribution had the form of a continuous mathematical function which disappears at both the leading edge and the trailing edge. The function behaves as a polynomial of order epsilon sub 1 at the leading edge, and a polynomial of order epsilon sub 2 at the trailing edge. Epsilon sub 2 is a constant and epsilon sub 1 is varied over a range of practical interest. The magnitude of the additional thickness, y, is a second input parameter, and the effect of varying epsilon sub 1 and y on the aerodynamic performance of the airfoil was investigated. Results were obtained at a Mach number of 0.2 with an angle-of-attack of 6 degrees on the basic airfoils, and all calculations employ the full potential flow equations for two dimensional flow. The relaxation method of Jameson was employed for solution of the potential flow equations.

  10. An investigation on the effect of second-order additional thickness distributions to the upper surface of an NACA 64-206 airfoil. [using flow equations and a CDC 7600 digital computer

    NASA Technical Reports Server (NTRS)

    Merz, A. W.; Hague, D. S.

    1975-01-01

    An investigation was conducted on a CDC 7600 digital computer to determine the effects of additional thickness distributions to the upper surface of an NACA 64-206 airfoil. Additional thickness distributions employed were in the form of two second-order polynomial arcs which have a specified thickness at a given chordwise location. The forward arc disappears at the airfoil leading edge, the aft arc disappears at the airfoil trailing edge. At the juncture of the two arcs, x = x, continuity of slope is maintained. The effect of varying the maximum additional thickness and its chordwise location on airfoil lift coefficient, pitching moment, and pressure distribution was investigated. Results were obtained at a Mach number of 0.2 with an angle-of-attack of 6 degrees on the basic NACA 64-206 airfoil, and all calculations employ the full potential flow equations for two dimensional flow. The relaxation method of Jameson was employed for solution of the potential flow equations.

  11. Cost optimization in anaesthesia.

    PubMed

    Bauer, M; Bach, A; Martin, E; Böttiger, B W

    2001-04-01

    As a result of the progress made in medicine and technology and the increase in morbidity associated with demographic development, the need for medical care, and thus its costs, have increased as well. The financial resources available for medical care, however, are still limited, and hence the available funds must be distributed more efficiently. Cost-optimisation measures can help make better use of the profitability reserves in hospitals. The authors show how costs can be optimised in the anaesthesiology department of a clinic. A pharmacoeconomic evaluation of the new inhalation anaesthetics shows, by example, how the cost structures in anaesthesia can be made more transparent and how potential savings can be implemented. To reduce material and personnel costs, a more rational means of internal process management is presented. According to cost-effectiveness analysis, medications are divided not into the categories inexpensive and expensive but rather cost-effective or non-cost-effective. By selecting a cost-effective drug it is possible to reduce costs at a hospital. For example, sevoflurane at a fresh gas flow below 3 l/min has been shown to be a cost-effective inhalation anaesthetic which, in economic terms, is also superior to intravenous anaesthesia with propofol. In addition to these measures for reducing material costs, other examples are given of how personnel costs can be reduced by optimising work procedures: e.g. effective operating theatre co-ordination, short switchover times through overlapping anaesthesia induction, and the use of multifunctional personnel. The gain in productivity resulting from these measures can positively affect profits, and by optimising the organisation of procedures to shorten the time required to carry out a procedure, costs can be reduced.
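
    The flow-dependence of inhalation anaesthetic cost can be sketched with the common rule of thumb that liquid agent consumption (ml/h) is roughly 3 × delivered vol% × fresh gas flow (l/min); the concentration and price below are hypothetical.

```python
def liquid_agent_ml_per_hr(vol_percent, fgf_l_per_min):
    """Rule-of-thumb liquid volatile-agent use: ~3 x vol% x flow (l/min)."""
    return 3.0 * vol_percent * fgf_l_per_min

def cost_per_hour(vol_percent, fgf_l_per_min, price_per_ml):
    return liquid_agent_ml_per_hr(vol_percent, fgf_l_per_min) * price_per_ml

price = 0.5  # currency units per ml of liquid agent -- hypothetical
high_flow = cost_per_hour(2.0, 6.0, price)  # 2 vol% at 6 l/min
low_flow = cost_per_hour(2.0, 3.0, price)   # same concentration at 3 l/min
# Halving the fresh gas flow halves the hourly agent cost.
```

    This is why the abstract singles out low-flow technique (below 3 l/min) as the condition under which sevoflurane becomes cost-effective.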

  12. Additive Manufacturing Infrared Inspection

    NASA Technical Reports Server (NTRS)

    Gaddy, Darrell

    2014-01-01

    Additive manufacturing is a rapid prototyping technology that allows parts to be built in a series of thin layers from plastic, ceramics, and metallics. Metallic additive manufacturing is an emerging form of rapid prototyping that allows complex structures to be built using various metallic powders. Significant time and cost savings have also been observed using the metallic additive manufacturing compared with traditional techniques. Development of the metallic additive manufacturing technology has advanced significantly over the last decade, although many of the techniques to inspect parts made from these processes have not advanced significantly or have limitations. Several external geometry inspection techniques exist such as Coordinate Measurement Machines (CMM), Laser Scanners, Structured Light Scanning Systems, or even traditional calipers and gages. All of the aforementioned techniques are limited to external geometry and contours or must use a contact probe to inspect limited internal dimensions. This presentation will document the development of a process for real-time dimensional inspection technique and digital quality record of the additive manufacturing process using Infrared camera imaging and processing techniques.

  13. Cutting Transportation Costs.

    ERIC Educational Resources Information Center

    Lewis, Barbara

    1982-01-01

    Beginning on the front cover, this article tells how school districts are reducing their transportation costs. Particularly effective measures include the use of computers for bus maintenance and scheduling, school board ownership of buses, and the conversion of gasoline-powered buses to alternative fuels. (Author/MLF)

  14. On Convergence of Development Costs and Cost Models for Complex Spaceflight Instrument Electronics

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Patel, Umeshkumar D.; Kasa, Robert L.; Hestnes, Phyllis; Brown, Tammy; Vootukuru, Madhavi

    2008-01-01

    Development costs of a few recent spaceflight instrument electrical and electronics subsystems have diverged from respective heritage cost model predictions. The cost models used are Grass Roots, Price-H and Parametric Model. These cost models originated in the military and industry around 1970 and were successfully adopted and patched by NASA on a mission-by-mission basis for years. However, the complexity of new instruments recently changed rapidly by orders of magnitude. This is most obvious in the complexity of a representative spaceflight instrument electronics' data system. It is now required to perform intermediate processing of digitized data apart from conventional processing of science phenomenon signals from multiple detectors. This involves on-board instrument formatting of computational operands from raw data (for example, images), multi-million operations per second on large volumes of data in reconfigurable hardware (in addition to processing on a general-purpose embedded or standalone instrument flight computer), as well as making decisions for on-board system adaptation and resource reconfiguration. The instrument data system is now tasked to perform more functions, such as forming packets and instrument-level data compression of more than one data stream, which are traditionally performed by the spacecraft command and data handling system. It is furthermore required that the electronics box for new complex instruments be developed for single-digit-watt power consumption and small size, be light-weight, and deliver super-computing capabilities. The conflict between the actual development cost of newer complex instruments and its electronics components' heritage cost model predictions seems irreconcilable. This conflict, and an approach to its resolution, are addressed in this paper by determining complexity parameters and a complexity index, and by their use in an enhanced cost model.

  15. Specialized computer architectures for computational aerodynamics

    NASA Technical Reports Server (NTRS)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relatively high cost of performing these computations on commercially available general-purpose computers, a cost that is high with respect to dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  16. Testing the Cost Yardstick in Cost-Quality Studies.

    ERIC Educational Resources Information Center

    Finch, James N.

    1967-01-01

    To discover how costs affect quality, 16 different methods of computing educational costs are developed and correlated with a cluster of "quality related" factors (QRC). Data for the correlation were obtained from 1,055 city school districts in 48 states. The QRC is composed of staffing adequacy variables, measures of teacher quality, and…

  17. Assessing the Costs of Adequacy in California Public Schools: A Cost Function Approach

    ERIC Educational Resources Information Center

    Imazeki, Jennifer

    2008-01-01

    In this study, a cost function is used to estimate the costs for California districts to meet the achievement goals set out for them by the state. I calculate estimates of base costs (i.e., per pupil costs in a district with relatively low levels of student need) and marginal costs (i.e., the additional costs associated with specific student…
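
    The base-plus-marginal-cost structure described above can be sketched as follows; the dollar figures and need weights are invented for illustration and are not estimates from the study.

```python
def district_cost_per_pupil(base_cost, marginal_costs, shares):
    """Base cost plus need-weighted marginal costs per pupil."""
    extra = sum(marginal_costs[g] * shares.get(g, 0.0) for g in marginal_costs)
    return base_cost + extra

# Illustrative figures only, not estimates from the study.
marginal = {"poverty": 2000.0, "english_learner": 1500.0}
cost = district_cost_per_pupil(6000.0, marginal,
                               {"poverty": 0.5, "english_learner": 0.25})
```

    The point of the decomposition is that two districts with the same base cost can face very different total per-pupil costs once their student-need shares are weighted in.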

  18. The Economics of Computers.

    ERIC Educational Resources Information Center

    Sharpe, William F.

    A microeconomic theory is applied in this book to computer services and costs and for the benefit of those who are decision-makers in the selection, financing, and use of computers. Subtopics of the theory discussed include value and demand; revenue and profits; time and risk; and costs, inputs, and outputs. Application of the theory is explained…

  19. Program Tracks Cost Of Travel

    NASA Technical Reports Server (NTRS)

    Mauldin, Lemuel E., III

    1993-01-01

    Travel Forecaster is menu-driven, easy-to-use computer program that plans, forecasts cost, and tracks actual vs. planned cost of business-related travel of division or branch of organization and compiles information into data base to aid travel planner. Ability of program to handle multiple trip entries makes it valuable time-saving device.
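
    A minimal sketch of the planned-versus-actual tracking such a program performs (the trip entries below are hypothetical, not from the program's database):

```python
# Hypothetical trip entries with planned and actual costs.
trips = [
    {"dest": "Site A", "planned": 1200.0, "actual": 1350.0},
    {"dest": "Site B", "planned": 900.0, "actual": 850.0},
]

def totals(trips):
    """Sum planned and actual costs and report the variance."""
    planned = sum(t["planned"] for t in trips)
    actual = sum(t["actual"] for t in trips)
    return planned, actual, actual - planned

planned, actual, variance = totals(trips)
```

    Handling many such entries at once and rolling them up by division or branch is the time-saving multiple-trip capability the abstract highlights.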

  20. Cryogenic cooling for computers - Obstacles and opportunities

    NASA Astrophysics Data System (ADS)

    Pei, Hsien-Sheng; Heng, Stephen

    The computer environmental challenges of the 1990s and the system impacts of using a cryocooler for computer cooling are discussed. Attention is given to the advantages of employing a cryocooler for computer cooling from the system standpoint, in addition to the traditional consideration of enhancing the performance of the semiconductor chips. It is argued that a closed-loop cooling system (CLCS) can provide, on average, more than 10 dB of noise reduction over its open-loop counterpart. A CLCS can totally eliminate the potential recirculation problems inherent in many open-loop systems. A CLCS can also eliminate the problems with draft and thermal discomfort experienced by users working near computers that employ some type of open-loop cooling. It can reduce total operating costs, provide better air quality for computers, and alleviate the condensation problem.

  1. Letter regarding 'Comparison between low-cost marker-less and high-end marker-based motion capture systems for the computer-aided assessment of working ergonomics' by Patrizi et al. and research reproducibility.

    PubMed

    2017-04-01

    The reporting of research in a manner that allows reproduction in subsequent investigations is important for scientific progress. Several details of the recent study by Patrizi et al., 'Comparison between low-cost marker-less and high-end marker-based motion capture systems for the computer-aided assessment of working ergonomics', are absent from the published manuscript and make reproduction of findings impossible. As new and complex technologies with great promise for ergonomics develop, new but surmountable challenges for reporting investigations using these technologies in a reproducible manner arise. Practitioner Summary: As with traditional methods, scientific reporting of new and complex ergonomics technologies should be performed in a manner that allows reproduction in subsequent investigations and supports scientific advancement.

  2. COMPARISON OF CLASSIFICATION STRATEGIES BY COMPUTER SIMULATION METHODS.

    DTIC Science & Technology

    NAVAL TRAINING, COMPUTER PROGRAMMING), (*NAVAL PERSONNEL, CLASSIFICATION), SELECTION, SIMULATION, CORRELATION TECHNIQUES , PROBABILITY, COSTS, OPTIMIZATION, PERSONNEL MANAGEMENT, DECISION THEORY, COMPUTERS

  3. Identification of cost effective energy conservation measures

    NASA Technical Reports Server (NTRS)

    Bierenbaum, H. S.; Boggs, W. H.

    1978-01-01

    In addition to a successful program of readily implemented conservation actions for reducing building energy consumption at Kennedy Space Center, recent detailed analyses have identified further substantial savings for buildings representative of technical facilities designed when energy costs were low. The techniques employed for determining these energy savings consisted of facility configuration analysis, power and lighting measurements, detailed computer simulations, and simulation verifications. Use of these methods resulted in identification of projected energy savings as large as $330,000 a year (approximately a two-year break-even period) in a single building. Application of these techniques to other commercial buildings is discussed.
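
    The break-even arithmetic implied above can be made explicit with a simple undiscounted payback calculation; the $660,000 investment figure is inferred from the stated savings and break-even period, not given in the abstract.

```python
def simple_payback_years(investment, annual_savings):
    """Break-even period ignoring discounting."""
    return investment / annual_savings

# $330,000/year savings with a ~2-year break-even implies an
# investment near $660,000 (inferred, not stated in the abstract).
years = simple_payback_years(660_000.0, 330_000.0)
```

    A two-year simple payback is short by building-retrofit standards, which is what makes the identified measures attractive.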

  4. Multi-tasking computer control of video related equipment

    NASA Technical Reports Server (NTRS)

    Molina, Rod; Gilbert, Bob

    1989-01-01

The flexibility, cost-effectiveness and widespread availability of personal computers now make it possible to completely integrate the previously separate elements of video post-production into a single device. Specifically, a personal computer, such as the Commodore-Amiga, can perform multiple and simultaneous tasks from an individual unit. Relatively low cost, minimal space requirements and user-friendliness provide the most favorable environment for the many phases of video post-production. Computers are well known for their basic abilities to process numbers, text and graphics and to reliably perform repetitive and tedious functions efficiently. These capabilities can now apply as either additions or alternatives to existing video post-production methods. A present example of computer-based video post-production technology is the RGB CVC (Computer and Video Creations) WorkSystem. A wide variety of integrated functions are made possible with an Amiga computer at the heart of the system.

  5. Teardrop bladder: additional considerations

    SciTech Connect

    Wechsler, R.J.; Brennan, R.E.

    1982-07-01

Nine cases of teardrop bladder (TDB) seen at excretory urography are presented. In some of these patients, the iliopsoas muscles were at the upper limit of normal in size, and additional evaluation of the perivesical structures with computed tomography (CT) was necessary. CT demonstrated only hypertrophied muscles with or without perivesical fat. The psoas muscles and pelvic width were measured in 8 patients and compared with the measurements of a control group of males without TDB. Patients with TDB had large iliopsoas muscles and narrow pelves compared with the control group. The psoas muscle width/pelvic width ratio was significantly greater (p < 0.0005) in patients with TDB than in the control group, with values of 1.04 ± 0.05 and 0.82 ± 0.09, respectively. It is concluded that TDB is not an uncommon normal variant in black males. Both iliopsoas muscle hypertrophy and a narrow pelvis are factors that predispose a patient to TDB.

  6. SUPERSONIC TRANSPORT DEVELOPMENT AND PRODUCTION. VOLUME I. COST ANALYSIS PROGRAM.

    DTIC Science & Technology

SUPERSONIC AIRCRAFT, *COSTS), (*AIRCRAFT INDUSTRY, INDUSTRIAL PRODUCTION), MANAGEMENT ENGINEERING, AIRFRAMES, ECONOMICS, COMPUTER PROGRAMS, STATISTICAL ANALYSIS, MONEY, AIRCRAFT ENGINES, FEASIBILITY STUDIES

  7. Reliability and cost analysis methods

    NASA Technical Reports Server (NTRS)

    Suich, Ronald C.

    1991-01-01

In the design phase of a system, how does a design engineer or manager choose between a subsystem with .990 reliability and a more costly subsystem with .995 reliability? When is the increased cost justified? High reliability is not necessarily an end in itself but may be desirable in order to reduce the expected cost due to subsystem failure. However, this may not be the wisest use of funds, since the expected cost due to subsystem failure is not the only cost involved; the subsystem itself may be very costly. Neither the cost of the subsystem nor the expected cost due to subsystem failure should be considered separately; rather, the total of the two, i.e., the cost of the subsystem plus the expected cost due to subsystem failure, should be minimized. This final report discusses the Combined Analysis of Reliability, Redundancy, and Cost (CARRAC) methods, which were developed under Grant Number NAG 3-1100 from the NASA Lewis Research Center. CARRAC methods and a CARRAC computer program employ five models which can be used to cover a wide range of problems. The models contain an option which can include repair of failed modules.
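The trade-off described above can be sketched numerically. The dollar figures below are hypothetical (the report gives no cost values), and `total_expected_cost` is an illustrative helper, not part of the CARRAC program:

```python
def total_expected_cost(subsystem_cost, reliability, failure_cost):
    """Cost of the subsystem plus the expected cost due to its failure."""
    return subsystem_cost + (1.0 - reliability) * failure_cost

# Hypothetical comparison: a cheaper .990-reliability subsystem versus a
# more costly .995-reliability one, with a $5M cost if the subsystem fails.
cheap = total_expected_cost(subsystem_cost=100_000, reliability=0.990,
                            failure_cost=5_000_000)   # 100k + 0.010 * 5M
costly = total_expected_cost(subsystem_cost=140_000, reliability=0.995,
                             failure_cost=5_000_000)  # 140k + 0.005 * 5M

# With these numbers the cheaper subsystem wins on total expected cost;
# a larger failure cost would tip the decision the other way.
print(round(cheap), round(costly))
```

The point of the abstract is exactly this: the decision depends on the sum, not on reliability or subsystem price alone.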

  8. The second Randomised Evaluation of the Effectiveness, cost-effectiveness and Acceptability of Computerised Therapy (REEACT-2) trial: does the provision of telephone support enhance the effectiveness of computer-delivered cognitive behaviour therapy? A randomised controlled trial.

    PubMed Central

    Brabyn, Sally; Araya, Ricardo; Barkham, Michael; Bower, Peter; Cooper, Cindy; Duarte, Ana; Kessler, David; Knowles, Sarah; Lovell, Karina; Littlewood, Elizabeth; Mattock, Richard; Palmer, Stephen; Pervin, Jodi; Richards, David; Tallon, Debbie; White, David; Walker, Simon; Worthy, Gillian; Gilbody, Simon

    2016-01-01

    BACKGROUND Computerised cognitive behaviour therapy (cCBT) is an efficient form of therapy potentially improving access to psychological care. Indirect evidence suggests that the uptake and effectiveness of cCBT can be increased if facilitated by telephone, but this is not routinely offered in the NHS. OBJECTIVES To compare the clinical effectiveness and cost-effectiveness of telephone-facilitated free-to-use cCBT [e.g. MoodGYM (National Institute for Mental Health Research, Australian National University, Canberra, ACT, Australia)] with minimally supported cCBT. DESIGN This study was a multisite, pragmatic, open, two-arm, parallel-group randomised controlled trial with a concurrent economic evaluation. SETTING Participants were recruited from GP practices in Bristol, Manchester, Sheffield, Hull and the north-east of England. PARTICIPANTS Potential participants were eligible to participate in the trial if they were adults with depression scoring ≥ 10 on the Patient Health Questionnaire-9 (PHQ-9). INTERVENTIONS Participants were randomised using a computer-generated random number sequence to receive minimally supported cCBT or telephone-facilitated cCBT. Participants continued with usual general practitioner care. MAIN OUTCOME MEASURES The primary outcome was self-reported symptoms of depression, as assessed by the PHQ-9 at 4 months post randomisation. SECONDARY OUTCOMES Secondary outcomes were depression at 12 months and anxiety, somatoform complaints, health utility (as assessed by the European Quality of Life-5 Dimensions questionnaire) and resource use at 4 and 12 months. RESULTS Clinical effectiveness: 182 participants were randomised to minimally supported cCBT and 187 participants to telephone-facilitated cCBT. There was a difference in the severity of depression at 4 and 12 months, with lower levels in the telephone-facilitated group. The odds of no longer being depressed (defined as a PHQ-9 score of < 10) at 4 months were twice as high in the

  9. A lightweight distributed framework for computational offloading in mobile cloud computing.

    PubMed

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

The latest developments in mobile computing technology have enabled intensive applications on modern smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds to mitigate resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time-consuming and resource-intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading with the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and the turnaround time of the application is decreased by 83.5% compared with the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC.

  10. A Lightweight Distributed Framework for Computational Offloading in Mobile Cloud Computing

    PubMed Central

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

The latest developments in mobile computing technology have enabled intensive applications on modern smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds to mitigate resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time-consuming and resource-intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading with the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and the turnaround time of the application is decreased by 83.5% compared with the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC. PMID:25127245

  11. Cloud Computing for radiologists.

    PubMed

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources, such as computer software and hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, clients, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on the maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets radiology free from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues, which need to be addressed to ensure the success of Cloud computing in the future.

  12. The Challenge of Computers.

    ERIC Educational Resources Information Center

    Leger, Guy

    Computers may change teachers' lifestyles, teaching styles, and perhaps even their personal values. A brief survey of the history of computers demonstrates the incredible pace at which computer technology is moving ahead. The cost and size of microchips will continue to decline dramatically over the next 20 years, while the capability and variety…

  13. Computer Vision Systems

    NASA Astrophysics Data System (ADS)

    Gunasekaran, Sundaram

Food quality is of paramount consideration for all consumers, and its importance is perhaps second only to food safety. By some definitions, food safety is also incorporated into the broad categorization of food quality. Hence, the need for careful and accurate evaluation of food quality is at the forefront of research and development in both academia and industry. Among the many available methods for food quality evaluation, computer vision has proven to be the most powerful, especially for nondestructively extracting and quantifying many features that have direct relevance to food quality assessment and control. Furthermore, computer vision systems serve to rapidly evaluate the most readily observable food quality attributes: external characteristics such as color, shape, size, and surface texture. In addition, it is now possible, using advanced computer vision technologies, to “see” inside a food product and/or package to examine important quality attributes ordinarily unavailable to human evaluators. With rapid advances in electronic hardware and other associated imaging technologies, the cost-effectiveness and speed of computer vision systems have greatly improved, and many practical systems are already in place in the food industry.

  14. Cost Control

    ERIC Educational Resources Information Center

    Foreman, Phillip

    2009-01-01

    Education administrators involved in construction initiatives unanimously agree that when it comes to change orders, less is more. Change orders have a negative rippling effect of driving up building costs and producing expensive project delays that often interfere with school operations and schedules. Some change orders are initiated by schools…

  15. Distributed computing at the SSCL

    SciTech Connect

    Cormell, L.; White, R.

    1993-05-01

The rapid increase in the availability of high-performance, cost-effective RISC/UNIX workstations has been both a blessing and a curse. The blessing of having extremely powerful computing engines available on the desktop is well known to many users. The user has tremendous freedom, flexibility, and control of his environment. That freedom can, however, become the curse of distributed computing. The user must become a system manager to some extent; he must worry about backups, maintenance, upgrades, etc. Traditionally these activities have been the responsibility of a central computing group. The central computing group, however, may find that it can no longer provide all of the traditional services. With the plethora of workstations now found on so many desktops throughout the entire campus or lab, the central computing group may be swamped by support requests. This talk will address several of these computer support and management issues by discussing the approach taken at the Superconducting Super Collider Laboratory. In addition, a brief review of the future directions of commercial products for distributed computing and management will be given.

  16. Low Cost Hydrogen Production Platform

    SciTech Connect

    Timothy M. Aaron, Jerome T. Jankowiak

    2009-10-16

    A technology and design evaluation was carried out for the development of a turnkey hydrogen production system in the range of 2.4 - 12 kg/h of hydrogen. The design is based on existing SMR technology and existing chemical processes and technologies to meet the design objectives. Consequently, the system design consists of a steam methane reformer, PSA system for hydrogen purification, natural gas compression, steam generation and all components and heat exchangers required for the production of hydrogen. The focus of the program is on packaging, system integration and an overall step change in the cost of capital required for the production of hydrogen at small scale. To assist in this effort, subcontractors were brought in to evaluate the design concepts and to assist in meeting the overall goals of the program. Praxair supplied the overall system and process design and the subcontractors were used to evaluate the components and system from a manufacturing and overall design optimization viewpoint. Design for manufacturing and assembly (DFMA) techniques, computer models and laboratory/full-scale testing of components were utilized to optimize the design during all phases of the design development. Early in the program evaluation, a review of existing Praxair hydrogen facilities showed that over 50% of the installed cost of a SMR based hydrogen plant is associated with the high temperature components (reformer, shift, steam generation, and various high temperature heat exchange). The main effort of the initial phase of the program was to develop an integrated high temperature component for these related functions. Initially, six independent concepts were developed and the processes were modeled to determine overall feasibility. The six concepts were eventually narrowed down to the highest potential concept. A US patent was awarded in February 2009 for the Praxair integrated high temperature component design. A risk analysis of the high temperature component was

  17. Additive Manufacturing for Superalloys - Producibility and Cost Validation (Preprint)

    DTIC Science & Technology

    2011-03-01

cell located at Advanced Manufacturing Research Centre (AMRC) Sheffield. The rationale was to implement development in production standard conditions...inspections were completed, a rough dimensional inspection was performed with a surface plate and manual gages to verify that the level of distortion

  18. SEASAT economic assessment. Volume 10: The SATIL 2 program (a program for the evaluation of the costs of an operational SEASAT system as a function of operational requirements and reliability. [computer programs for economic analysis and systems analysis of SEASAT satellite systems

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The SATIL 2 computer program was developed to assist with the programmatic evaluation of alternative approaches to establishing and maintaining a specified mix of operational sensors on spacecraft in an operational SEASAT system. The program computes the probability distributions of events (i.e., number of launch attempts, number of spacecraft purchased, etc.), annual recurring cost, and present value of recurring cost. This is accomplished for the specific task of placing a desired mix of sensors in orbit in an optimal fashion in order to satisfy a specified sensor demand function. Flow charts are shown, and printouts of the programs are given.

  19. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    PubMed

    Yazar, Seyhan; Gooden, George E C; Mackey, David A; Hewitt, Alex W

    2014-01-01

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.
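As a rough illustration of how percentage comparisons like those in the abstract ("257.3% more expensive") are computed, the helper and dollar figures below are hypothetical, not the study's actual run costs or scripts:

```python
def percent_more(a, b):
    """How much more (in %) quantity a is relative to baseline b."""
    return 100.0 * (a - b) / b

# Hypothetical costs (USD) for running the same assembly pipeline on
# two providers; the ratio is what the abstract reports, not the raw costs.
cost_provider_1 = 35.7
cost_provider_2 = 10.0
print(f"Provider 1 was {percent_more(cost_provider_1, cost_provider_2):.1f}% "
      "more expensive than provider 2")
```

The same function applies to wall-clock times; a confidence interval, as reported in the abstract, would additionally require the per-run variability.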

  20. Mobile Computing for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Alena, Richard; Swietek, Gregory E. (Technical Monitor)

    1994-01-01

    The use of commercial computer technology in specific aerospace mission applications can reduce the cost and project cycle time required for the development of special-purpose computer systems. Additionally, the pace of technological innovation in the commercial market has made new computer capabilities available for demonstrations and flight tests. Three areas of research and development being explored by the Portable Computer Technology Project at NASA Ames Research Center are the application of commercial client/server network computing solutions to crew support and payload operations, the analysis of requirements for portable computing devices, and testing of wireless data communication links as extensions to the wired network. This paper will present computer architectural solutions to portable workstation design including the use of standard interfaces, advanced flat-panel displays and network configurations incorporating both wired and wireless transmission media. It will describe the design tradeoffs used in selecting high-performance processors and memories, interfaces for communication and peripheral control, and high resolution displays. The packaging issues for safe and reliable operation aboard spacecraft and aircraft are presented. The current status of wireless data links for portable computers is discussed from a system design perspective. An end-to-end data flow model for payload science operations from the experiment flight rack to the principal investigator is analyzed using capabilities provided by the new generation of computer products. A future flight experiment on-board the Russian MIR space station will be described in detail including system configuration and function, the characteristics of the spacecraft operating environment, the flight qualification measures needed for safety review, and the specifications of the computing devices to be used in the experiment. The software architecture chosen shall be presented. An analysis of the

  1. 34 CFR 304.21 - Allowable costs.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 2 2010-07-01 2010-07-01 false Allowable costs. 304.21 Section 304.21 Education... Grantee § 304.21 Allowable costs. In addition to the allowable costs established in the Education... allowable expenditures by projects funded under the program: (a) Cost of attendance, as defined in Title...

  2. 34 CFR 304.21 - Allowable costs.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 34 Education 2 2011-07-01 2010-07-01 true Allowable costs. 304.21 Section 304.21 Education... Grantee § 304.21 Allowable costs. In addition to the allowable costs established in the Education... allowable expenditures by projects funded under the program: (a) Cost of attendance, as defined in Title...

  3. Component Cost Analysis of Large Scale Systems

    NASA Technical Reports Server (NTRS)

    Skelton, R. E.; Yousuff, A.

    1982-01-01

The idea of cost decomposition is summarized to aid in the determination of the relative cost (or 'price') of each component of a linear dynamic system using quadratic performance criteria. In addition to the insights into system behavior that are afforded by such a component cost analysis (CCA), these CCA ideas naturally lead to a theory for cost-equivalent realizations.

  4. The Prevalence of Phosphorus Containing Food Additives in Top Selling Foods in Grocery Stores

    PubMed Central

    León, Janeen B.; Sullivan, Catherine M.; Sehgal, Ashwini R.

    2013-01-01

Objective To determine the prevalence of phosphorus-containing food additives in best-selling processed grocery products and to compare the phosphorus content of a subset of top-selling foods with and without phosphorus additives. Design The labels of 2394 best-selling branded grocery products in northeast Ohio were reviewed for phosphorus additives. The top 5 best-selling products containing phosphorus additives from each food category were matched with similar products without phosphorus additives and analyzed for phosphorus content. Four days of sample meals consisting of foods with and without phosphorus additives were created, and daily phosphorus and pricing differentials were computed. Setting Northeast Ohio Main outcome measures Presence of phosphorus-containing food additives, phosphorus content Results 44% of the best-selling grocery items contained phosphorus additives. The additives were particularly common in the prepared frozen foods (72%), dry food mixes (70%), packaged meat (65%), bread & baked goods (57%), soup (54%), and yogurt (51%) categories. Phosphorus additive-containing foods averaged 67 mg phosphorus/100 g more than matched non-additive-containing foods (p=.03). Sample meals comprised mostly of phosphorus additive-containing foods had 736 mg more phosphorus per day compared with meals consisting of only additive-free foods. Phosphorus additive-free meals cost an average of $2.00 more per day. Conclusion Phosphorus additives are common in best-selling processed groceries and contribute significantly to their phosphorus content. Moreover, phosphorus additive foods are less costly than phosphorus additive-free foods. As a result, persons with chronic kidney disease may purchase these popular low-cost groceries and unknowingly increase their intake of highly bioavailable phosphorus. PMID:23402914

  5. Advanced Fuel Cycle Cost Basis

    SciTech Connect

    D. E. Shropshire; K. A. Williams; W. B. Boore; J. D. Smith; B. W. Dixon; M. Dunzik-Gougar; R. D. Adams; D. Gombert; E. Schneider

    2009-12-01

    This report, commissioned by the U.S. Department of Energy (DOE), provides a comprehensive set of cost data supporting a cost analysis for the relative economic comparison of options for use in the Advanced Fuel Cycle Initiative (AFCI) Program. The report describes the AFCI cost basis development process, reference information on AFCI cost modules, a procedure for estimating fuel cycle costs, economic evaluation guidelines, and a discussion on the integration of cost data into economic computer models. This report contains reference cost data for 25 cost modules—23 fuel cycle cost modules and 2 reactor modules. The cost modules were developed in the areas of natural uranium mining and milling, conversion, enrichment, depleted uranium disposition, fuel fabrication, interim spent fuel storage, reprocessing, waste conditioning, spent nuclear fuel (SNF) packaging, long-term monitored retrievable storage, near surface disposal of low-level waste (LLW), geologic repository and other disposal concepts, and transportation processes for nuclear fuel, LLW, SNF, transuranic, and high-level waste.

  6. Advanced Fuel Cycle Cost Basis

    SciTech Connect

    D. E. Shropshire; K. A. Williams; W. B. Boore; J. D. Smith; B. W. Dixon; M. Dunzik-Gougar; R. D. Adams; D. Gombert

    2007-04-01

    This report, commissioned by the U.S. Department of Energy (DOE), provides a comprehensive set of cost data supporting a cost analysis for the relative economic comparison of options for use in the Advanced Fuel Cycle Initiative (AFCI) Program. The report describes the AFCI cost basis development process, reference information on AFCI cost modules, a procedure for estimating fuel cycle costs, economic evaluation guidelines, and a discussion on the integration of cost data into economic computer models. This report contains reference cost data for 26 cost modules—24 fuel cycle cost modules and 2 reactor modules. The cost modules were developed in the areas of natural uranium mining and milling, conversion, enrichment, depleted uranium disposition, fuel fabrication, interim spent fuel storage, reprocessing, waste conditioning, spent nuclear fuel (SNF) packaging, long-term monitored retrievable storage, near surface disposal of low-level waste (LLW), geologic repository and other disposal concepts, and transportation processes for nuclear fuel, LLW, SNF, and high-level waste.

  7. Advanced Fuel Cycle Cost Basis

    SciTech Connect

    D. E. Shropshire; K. A. Williams; W. B. Boore; J. D. Smith; B. W. Dixon; M. Dunzik-Gougar; R. D. Adams; D. Gombert; E. Schneider

    2008-03-01

    This report, commissioned by the U.S. Department of Energy (DOE), provides a comprehensive set of cost data supporting a cost analysis for the relative economic comparison of options for use in the Advanced Fuel Cycle Initiative (AFCI) Program. The report describes the AFCI cost basis development process, reference information on AFCI cost modules, a procedure for estimating fuel cycle costs, economic evaluation guidelines, and a discussion on the integration of cost data into economic computer models. This report contains reference cost data for 25 cost modules—23 fuel cycle cost modules and 2 reactor modules. The cost modules were developed in the areas of natural uranium mining and milling, conversion, enrichment, depleted uranium disposition, fuel fabrication, interim spent fuel storage, reprocessing, waste conditioning, spent nuclear fuel (SNF) packaging, long-term monitored retrievable storage, near surface disposal of low-level waste (LLW), geologic repository and other disposal concepts, and transportation processes for nuclear fuel, LLW, SNF, transuranic, and high-level waste.

  8. Using Technology to Control Costs

    ERIC Educational Resources Information Center

    Ho, Simon; Schoenberg, Doug; Richards, Dan; Morath, Michael

    2009-01-01

    In this article, the authors examines the use of technology to control costs in the child care industry. One of these technology solutions is Software-as-a-Service (SaaS). SaaS solutions can help child care providers save money in many aspects of center management. In addition to cost savings, SaaS solutions are also particularly appealing to…

  9. Consumer Security Perceptions and the Perceived Influence on Adopting Cloud Computing: A Quantitative Study Using the Technology Acceptance Model

    ERIC Educational Resources Information Center

    Paquet, Katherine G.

    2013-01-01

    Cloud computing may provide cost benefits for organizations by eliminating the overhead costs of software, hardware, and maintenance (e.g., license renewals, upgrading software, servers and their physical storage space, administration along with funding a large IT department). In addition to the promised savings, the organization may require…

  10. The updated algorithm of the Energy Consumption Program (ECP): A computer model simulating heating and cooling energy loads in buildings

    NASA Technical Reports Server (NTRS)

    Lansing, F. L.; Strain, D. M.; Chai, V. W.; Higgins, S.

    1979-01-01

The Energy Consumption Program (ECP) was developed to simulate building heating and cooling loads and to compute thermal and electric energy consumption and cost. This article reports on the new algorithms and modifications added in an effort to widen the areas of application. The program structure was rewritten accordingly to refine and advance the building model and to further reduce processing time and cost. The program is noted for its very low cost and ease of use compared with other available codes. The accuracy of computations is not sacrificed, however, since the results are expected to lie within ±10% of actual energy meter readings.
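The ±10% accuracy claim amounts to a simple acceptance check against metered consumption. The function and readings below are illustrative, not part of ECP:

```python
def within_tolerance(simulated_kwh, metered_kwh, tol=0.10):
    """True if the simulated figure is within tol (fractional) of the meter reading."""
    return abs(simulated_kwh - metered_kwh) <= tol * metered_kwh

# Hypothetical annual figures: one simulation inside the +/-10% band, one outside.
print(within_tolerance(simulated_kwh=9_400, metered_kwh=10_000))  # True
print(within_tolerance(simulated_kwh=8_800, metered_kwh=10_000))  # False
```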

  11. Probabilistic Fatigue: Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2002-01-01

    Fatigue is a primary consideration in the design of aerospace structures for long term durability and reliability. There are several types of fatigue that must be considered in the design. These include low cycle, high cycle, combined for different cyclic loading conditions - for example, mechanical, thermal, erosion, etc. The traditional approach to evaluate fatigue has been to conduct many tests in the various service-environment conditions that the component will be subjected to in a specific design. This approach is reasonable and robust for that specific design. However, it is time consuming, costly and needs to be repeated for designs in different operating conditions in general. Recent research has demonstrated that fatigue of structural components/structures can be evaluated by computational simulation based on a novel paradigm. Main features in this novel paradigm are progressive telescoping scale mechanics, progressive scale substructuring and progressive structural fracture, encompassed with probabilistic simulation. These generic features of this approach are to probabilistically telescope scale local material point damage all the way up to the structural component and to probabilistically scale decompose structural loads and boundary conditions all the way down to material point. Additional features include a multifactor interaction model that probabilistically describes material properties evolution, any changes due to various cyclic load and other mutually interacting effects. The objective of the proposed paper is to describe this novel paradigm of computational simulation and present typical fatigue results for structural components. Additionally, advantages, versatility and inclusiveness of computational simulation versus testing are discussed. Guidelines for complementing simulated results with strategic testing are outlined. Typical results are shown for computational simulation of fatigue in metallic composite structures to demonstrate the

  12. Distributed computing environments for future space control systems

    NASA Technical Reports Server (NTRS)

    Viallefont, Pierre

    1993-01-01

    The aim of this paper is to present the results of a CNES research project on distributed computing systems. The purpose of this research was to study the impact of the use of new computer technologies in the design and development of future space applications. The first part of this study was a state-of-the-art review of distributed computing systems. One of the interesting ideas arising from this review is the concept of a 'virtual computer' allowing the distributed hardware architecture to be hidden from a software application. The 'virtual computer' can improve system performance by adapting the best architecture (addition of computers) to the software application without having to modify its source code. This concept can also decrease the cost and obsolescence of the hardware architecture. In order to verify the feasibility of the 'virtual computer' concept, a prototype representative of a distributed space application is being developed independently of the hardware architecture.

  13. An iron–oxygen intermediate formed during the catalytic cycle of cysteine dioxygenase (Electronic supplementary information (ESI) available: experimental and computational details. See DOI: 10.1039/c6cc03904a)

    PubMed Central

    Tchesnokov, E. P.; Faponle, A. S.; Davies, C. G.; Quesne, M. G.; Turner, R.; Fellner, M.; Souness, R. J.; Wilbanks, S. M.

    2016-01-01

    Cysteine dioxygenase is a key enzyme in the breakdown of cysteine, but its mechanism remains controversial. A combination of spectroscopic and computational studies provides the first evidence of a short-lived intermediate in the catalytic cycle. The intermediate decays within 20 ms and has absorption maxima at 500 and 640 nm. PMID:27297454

  14. Fitting the computer to the job

    SciTech Connect

    Taylor, T.D.

    1988-04-01

    In the last two years, novel computer systems have been commercialized which substitute tens, hundreds or even thousands of cheap and simple processors for the one, two, or four massive and expensive processors of state-of-the-art supercomputers. Designers of such systems claim the achievement of speeds rivaling or exceeding those of mainframe computers at a fraction of the cost, through parallel processing. Additional performance improvements are foreseen in the use of specialized parallel processors whose hardware is optimized for a small set of algorithms; these systems are ideally suited for CFD, radar image and signal processing, thermophysical analysis, and similar fields of basic scientific research.

  15. Arbitrage risk induced by transaction costs

    NASA Astrophysics Data System (ADS)

    Piotrowski, Edward W.; Sładkowski, Jan

    2004-01-01

    We discuss the time evolution of quotation of stocks and commodities and show that they form an Ising chain. We show that transaction costs induce arbitrage risk that is usually neglected. The full analysis of the portfolio theory is computationally complex but the latest development in quantum computation theory suggests that such a task can be performed on quantum computers.

  16. Costs and Cost-Effectiveness of Plasmodium vivax Control

    PubMed Central

    White, Michael T.; Yeung, Shunmay; Patouillard, Edith; Cibulskis, Richard

    2016-01-01

    The continued success of efforts to reduce the global malaria burden will require sustained funding for interventions specifically targeting Plasmodium vivax. The optimal use of limited financial resources necessitates cost and cost-effectiveness analyses of strategies for diagnosing and treating P. vivax and vector control tools. Herein, we review the existing published evidence on the costs and cost-effectiveness of interventions for controlling P. vivax, identifying nine studies focused on diagnosis and treatment and seven studies focused on vector control. Although many of the results from the much more extensive P. falciparum literature can be applied to P. vivax, it is not always possible to extrapolate results from P. falciparum–specific cost-effectiveness analyses. Notably, there is a need for additional studies to evaluate the potential cost-effectiveness of radical cure with primaquine for the prevention of P. vivax relapses, in combination with glucose-6-phosphate dehydrogenase testing. PMID:28025283

  17. Asymptotic cost in document conversion

    NASA Astrophysics Data System (ADS)

    Blostein, Dorothea; Nagy, George

    2012-01-01

    In spite of a hundredfold decrease in the cost of relevant technologies, the role of document image processing systems is gradually declining due to the transition to an on-line world. Nevertheless, in some high-volume applications, document image processing software still saves millions of dollars by accelerating workflow, and similarly large savings could be realized by more effective automation of the multitude of low-volume personal document conversions. While potential cost savings, based on estimates of costs and values, are a driving force for new developments, quantifying such savings is difficult. The most important trend is that the cost of computing resources for DIA is becoming insignificant compared to the associated labor costs. An econometric treatment of document processing complements traditional performance evaluation, which focuses on assessing the correctness of the results produced by document conversion software. Researchers should look beyond the error rate for advancing both production and personal document conversion.

  18. 48 CFR 1246.101-70 - Additional definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CONTRACT MANAGEMENT QUALITY ASSURANCE General 1246.101-70 Additional definitions. At no additional cost to the Government means at no increase in price for firm-fixed-price contracts, at no increase in target... estimated cost or fee for cost-reimbursement contracts. Defect means any condition or characteristic in...

  19. Computing technology in the 1980's. [computers

    NASA Technical Reports Server (NTRS)

    Stone, H. S.

    1978-01-01

    Advances in computing technology have been led by consistently improving semiconductor technology. The semiconductor industry has turned out ever faster, smaller, and less expensive devices since transistorized computers were first introduced 20 years ago. For the next decade, there appear to be new advances possible, with the rate of introduction of improved devices at least equal to the historic trends. The implication of these projections is that computers will enter new markets and will truly be pervasive in business, home, and factory as their cost diminishes and their computational power expands to new levels. The computer industry as we know it today will be greatly altered in the next decade, primarily because the raw computer system will give way to computer-based turn-key information and control systems.

  20. Additive manufacturing of hybrid circuits

    SciTech Connect

    Bell, Nelson S.; Sarobol, Pylin; Cook, Adam; Clem, Paul G.; Keicher, David M.; Hirschfeld, Deidre; Hall, Aaron Christopher

    2016-03-26

    There is a rising interest in developing functional electronics using additively manufactured components. Considerations in materials selection and pathways to forming hybrid circuits and devices must demonstrate useful electronic function; must enable integration; and must complement the complex shape, low cost, high volume, and high functionality of structural but generally electronically passive additively manufactured components. This article reviews several emerging technologies being used in industry and research/development to provide integration advantages of fabricating multilayer hybrid circuits or devices. First, we review a maskless, noncontact, direct write (DW) technology that excels in the deposition of metallic colloid inks for electrical interconnects. Second, we review a complementary technology, aerosol deposition (AD), which excels in the deposition of metallic and ceramic powder as consolidated, thick conformal coatings and is additionally patternable through masking. As a result, we show examples of hybrid circuits/devices integrated beyond 2-D planes, using combinations of DW or AD processes and conventional, established processes.

  1. Computer viruses

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1988-01-01

    The worm, Trojan horse, bacterium, and virus are destructive programs that attack information stored in a computer's memory. Virus programs, which propagate by incorporating copies of themselves into other programs, are a growing menace in the late-1980s world of unprotected, networked workstations and personal computers. Limited immunity is offered by memory protection hardware, digitally authenticated object programs, and antibody programs that kill specific viruses. Additional immunity can be gained from the practice of digital hygiene, primarily the refusal to use software from untrusted sources. Full immunity requires attention in a social dimension, the accountability of programmers.

  2. Computer systems

    NASA Technical Reports Server (NTRS)

    Olsen, Lola

    1992-01-01

    In addition to the discussions, Ocean Climate Data Workshop hosts gave participants an opportunity to hear about, see, and test for themselves some of the latest computer tools now available for those studying climate change and the oceans. Six speakers described computer systems and their functions. The introductory talks were followed by demonstrations to small groups of participants and some opportunities for participants to get hands-on experience. After this familiarization period, attendees were invited to return during the course of the Workshop and have one-on-one discussions and further hands-on experience with these systems. Brief summaries or abstracts of introductory presentations are addressed.

  3. Computational chemistry

    NASA Technical Reports Server (NTRS)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry has applications in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  4. A Home Computer Primer.

    ERIC Educational Resources Information Center

    Stone, Antonia

    1982-01-01

    Provides general information on currently available microcomputers, computer programs (software), hardware requirements, software sources, costs, computer games, and programing. Includes a list of popular microcomputers, providing price category, model, list price, software (cassette, tape, disk), monitor specifications, amount of random access…

  5. Using technology to support investigations in the electronic age: tracking hackers to large scale international computer fraud

    NASA Astrophysics Data System (ADS)

    McFall, Steve

    1994-03-01

    With the increase in business automation and the widespread availability and low cost of computer systems, law enforcement agencies have seen a corresponding increase in criminal acts involving computers. The examination of computer evidence is a new field of forensic science with numerous opportunities for research and development. Research is needed to develop new software utilities to examine computer storage media, expert systems capable of finding criminal activity in large amounts of data, and to find methods of recovering data from chemically and physically damaged computer storage media. In addition, defeating encryption and password protection of computer files is also a topic requiring more research and development.

  6. Software for Tracking Costs of Mars Projects

    NASA Technical Reports Server (NTRS)

    Wong, Alvin; Warfield, Keith

    2003-01-01

    The Mars Cost Tracking Model is a computer program that administers a system set up for tracking the costs of future NASA projects that pertain to Mars. Previously, no such tracking system existed, and documentation was written in a variety of formats and scattered in various places. It was difficult to justify costs or even track the history of costs of a spacecraft mission to Mars. The present software enables users to maintain all cost-model definitions, documentation, and justifications of cost estimates in one computer system that is accessible via the Internet. The software provides sign-off safeguards to ensure the reliability of information entered into the system. This system may eventually be used to track the costs of projects other than only those that pertain to Mars.

  7. Parametric Cost Deployment

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.

    1995-01-01

    Parametric cost analysis is a mathematical approach to estimating cost. Parametric cost analysis uses non-cost parameters, such as quality characteristics, to estimate the cost to bring forth, sustain, and retire a product. This paper reviews parametric cost analysis and shows how it can be used within the cost deployment process.
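A common form of parametric cost analysis is a power-law cost-estimating relationship (CER) fit to historical data and evaluated for a new design. The sketch below uses entirely hypothetical data points and a mass driver chosen for illustration; it is one possible CER form, not the method of this paper.

```python
# Hypothetical parametric cost-estimating relationship (CER):
# cost = a * mass**b, fit by least squares in log-log space.
import math

def fit_power_law_cer(masses, costs):
    """Fit cost = a * mass**b via linear regression on (log mass, log cost)."""
    n = len(masses)
    lx = [math.log(m) for m in masses]
    ly = [math.log(c) for c in costs]
    mx = sum(lx) / n
    my = sum(ly) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(lx, ly))
         / sum((x - mx) ** 2 for x in lx))
    a = math.exp(my - b * mx)
    return a, b

# Illustrative historical (mass in kg, cost in $M) pairs -- made up.
masses = [100, 200, 400, 800]
costs = [10.0, 17.0, 29.0, 50.0]
a, b = fit_power_law_cer(masses, costs)
estimate = a * 300 ** b  # predicted cost of a new 300 kg unit
```

An exponent b below 1 encodes economies of scale: doubling the driver less than doubles the cost, which is typical of fitted CERs.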

  8. Computer-controlled environmental test systems - Criteria for selection, installation, and maintenance.

    NASA Technical Reports Server (NTRS)

    Chapman, C. P.

    1972-01-01

    Applications for presently marketed, new computer-controlled environmental test systems are suggested. It is shown that capital costs of these systems follow an exponential cost function curve that levels out as additional applications are implemented. Some test laboratory organization changes are recommended in terms of new personnel requirements, and facility modifications are considered in support of a computer-controlled test system. Software for computer-controlled test systems is discussed, and control loop speed constraints are defined for real-time control functions. Suitable input and output devices and memory storage device tradeoffs are also considered.

  9. Cleavage of ether, ester, and tosylate C(sp3)-O bonds by an iridium complex, initiated by oxidative addition of C-H bonds. Experimental and computational studies.

    PubMed

    Kundu, Sabuj; Choi, Jongwook; Wang, David Y; Choliy, Yuriy; Emge, Thomas J; Krogh-Jespersen, Karsten; Goldman, Alan S

    2013-04-03

    A pincer-ligated iridium complex, (PCP)Ir (PCP = κ(3)-C6H3-2,6-[CH2P(t-Bu)2]2), is found to undergo oxidative addition of C(sp(3))-O bonds of methyl esters (CH3-O2CR'), methyl tosylate (CH3-OTs), and certain electron-poor methyl aryl ethers (CH3-OAr). DFT calculations and mechanistic studies indicate that the reactions proceed via oxidative addition of C-H bonds followed by oxygenate migration, rather than by direct C-O addition. Thus, methyl aryl ethers react via addition of the methoxy C-H bond, followed by α-aryloxide migration to give cis-(PCP)Ir(H)(CH2)(OAr), followed by iridium-to-methylidene hydride migration to give (PCP)Ir(CH3)(OAr). Methyl acetate undergoes C-H bond addition at the carbomethoxy group to give (PCP)Ir(H)[κ(2)-CH2OC(O)Me] which then affords (PCP-CH2)Ir(H)(κ(2)-O2CMe) (6-Me) in which the methoxy C-O bond has been cleaved, and the methylene derived from the methoxy group has migrated into the PCP Cipso-Ir bond. Thermolysis of 6-Me ultimately gives (PCP)Ir(CH3)(κ(2)-O2CR), the net product of methoxy group C-O oxidative addition. Reaction of (PCP)Ir with species of the type ROAr, RO2CMe or ROTs, where R possesses β-C-H bonds (e.g., R = ethyl or isopropyl), results in formation of (PCP)Ir(H)(OAr), (PCP)Ir(H)(O2CMe), or (PCP)Ir(H)(OTs), respectively, along with the corresponding olefin or (PCP)Ir(olefin) complex. Like the C-O bond oxidative additions, these reactions also proceed via initial activation of a C-H bond; in this case, C-H addition at the β-position is followed by β-migration of the aryloxide, carboxylate, or tosylate group. Calculations indicate that the β-migration of the carboxylate group proceeds via an unusual six-membered cyclic transition state in which the alkoxy C-O bond is cleaved with no direct participation by the iridium center.

  10. Realistic costs of carbon capture

    SciTech Connect

    Al Juaied, Mohammed (Belfer Center for Science and International Affairs); Whitmore, Adam

    2009-07-01

    There is a growing interest in carbon capture and storage (CCS) as a means of reducing carbon dioxide (CO2) emissions. However there are substantial uncertainties about the costs of CCS. Costs for pre-combustion capture with compression (i.e. excluding costs of transport and storage and any revenue from EOR associated with storage) are examined in this discussion paper for First-of-a-Kind (FOAK) plant and for more mature technologies, or Nth-of-a-Kind plant (NOAK). For FOAK plant using solid fuels the levelised cost of electricity on a 2008 basis is approximately 10 cents/kWh higher with capture than for conventional plants (with a range of 8-12 cents/kWh). Costs of abatement are found typically to be approximately US$150/tCO2 avoided (with a range of US$120-180/tCO2 avoided). For NOAK plants the additional cost of electricity with capture is approximately 2-5 cents/kWh, with costs of the range of US$35-70/tCO2 avoided. Costs of abatement with carbon capture for other fuels and technologies are also estimated for NOAK plants. The costs of abatement are calculated with reference to conventional SCPC plant for both emissions and costs of electricity. Estimates for both FOAK and NOAK are mainly based on cost data from 2008, which was at the end of a period of sustained escalation in the costs of power generation plant and other large capital projects. There are now indications of costs falling from these levels. This may reduce the costs of abatement and costs presented here may be 'peak of the market' estimates. If general cost levels return, for example, to those prevailing in 2005 to 2006 (by which time significant cost escalation had already occurred from previous levels), then costs of capture and compression for FOAK plants are expected to be US$110/tCO2 avoided (with a range of US$90-135/tCO2 avoided). For NOAK plants costs are expected to be US$25-50/tCO2. Based on these considerations a likely representative range of costs of abatement from CCS excluding
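The abatement-cost arithmetic above is a division of the extra electricity cost by the emissions avoided. The sketch below reproduces the quoted FOAK and NOAK orders of magnitude; the avoided-emissions intensity is an assumed value chosen for illustration, not a figure from the paper.

```python
# Cost of abatement ($/tCO2 avoided) =
#     extra electricity cost ($/MWh) / CO2 avoided (tCO2/MWh).
# avoided_tco2_per_mwh below is a hypothetical assumption.

def abatement_cost(extra_cost_per_kwh, avoided_tco2_per_mwh):
    extra_cost_per_mwh = extra_cost_per_kwh * 1000  # $/kWh -> $/MWh
    return extra_cost_per_mwh / avoided_tco2_per_mwh

foak = abatement_cost(0.10, 0.667)   # ~US$150/tCO2, consistent with FOAK
noak = abatement_cost(0.035, 0.667)  # ~US$52/tCO2, within the NOAK range
```

This back-of-envelope form makes clear why the abatement cost scales directly with the capture cost premium per kWh.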

  11. Additive Manufacturing: Making Imagination the Major Limitation

    NASA Astrophysics Data System (ADS)

    Zhai, Yuwei; Lados, Diana A.; LaGoy, Jane L.

    2014-05-01

    Additive manufacturing (AM) refers to an advanced technology used for the fabrication of three-dimensional near-net-shaped functional components directly from computer models, using unit materials. The fundamentals and working principle of AM offer several advantages, including near-net-shape capabilities, superior design and geometrical flexibility, innovative multi-material fabrication, reduced tooling and fixturing, shorter cycle time for design and manufacturing, instant local production at a global scale, and material, energy, and cost efficiency. Well suiting the requests of modern manufacturing climate, AM is viewed as the new industrial revolution, making its way into a continuously increasing number of industries, such as aerospace, defense, automotive, medical, architecture, art, jewelry, and food. This overview was created to relate the historical evolution of the AM technology to its state-of-the-art developments and emerging applications. Generic thoughts on the microstructural characteristics, properties, and performance of AM-fabricated materials will also be discussed, primarily related to metallic materials. This write-up will introduce the general reader to specifics of the AM field vis-à-vis advantages and common techniques, materials and properties, current applications, and future opportunities.

  12. Birth and first-year costs for mothers and infants attributable to maternal smoking.

    PubMed

    Miller, D P; Villa, K F; Hogue, S L; Sivapathasundaram, D

    2001-02-01

    Maternal smoking during pregnancy has been linked to high costs. This study estimates the magnitude of excess costs attributable to smoking during pregnancy for mothers and infants. The model estimates smoking-attributable costs for 11 infant and maternal conditions. From a claims database of 7784 mothers and 7901 infants who had deliveries during 1996, we estimated total cost over the infants' first year of life for each mother and infant and identified each complication of interest, based on ICD-9 codes. The average cost for smokers and non-smokers could not be computed directly because smoking status is not available in claims data. Therefore, the population attributable risk percentage (PAR%) due to smoking for each complication was identified from the literature. Multiple linear regression was used to provide estimates of the incremental cost associated with each smoking-related complication. The total cost attributable to smoking was computed as a function of the incremental cost of each complication and the PAR% for each complication. The conditions associated with the largest incremental costs compared to patients without those conditions were abruptio placenta ($23,697) and respiratory distress syndrome ($21,944). Because they were more common, the conditions with the largest smoking-attributable cost were low birth weight ($914) and lower respiratory infection ($428). The sum of the additional costs attributable to smoking for all conditions yielded a total in the first year after birth ranging from $1142 to $1358 per smoking pregnant woman. It was concluded that maternal smoking during pregnancy results in an economic burden to payers and society. These estimates may be useful in formal cost-effectiveness evaluations of individual smoking cessation strategies.

  13. Metal Additive Manufacturing: A Review

    NASA Astrophysics Data System (ADS)

    Frazier, William E.

    2014-06-01

    This paper reviews the state-of-the-art of an important, rapidly emerging manufacturing technology that is alternatively called additive manufacturing (AM), direct digital manufacturing, free form fabrication, or 3D printing. A broad contextual overview of metallic AM is provided. AM has the potential to revolutionize the global parts manufacturing and logistics landscape. It enables distributed manufacturing and the production of parts on demand while offering the potential to reduce cost, energy consumption, and carbon footprint. This paper explores the materials science, processes, and business considerations associated with achieving these performance gains. It is concluded that a paradigm shift is required in order to fully exploit the potential of AM.

  14. 27 CFR 70.302 - Fees and costs for witnesses.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ..., where a third party's records are stored at an independent storage facility that charges the third party... searching for records or information and the cost of retrieving information stored by computer. Salaries of... by computer in the format in which it is normally produced, actual costs, based on computer time...

  15. Solar energy systems cost

    SciTech Connect

    Lavender, J.A.

    1980-01-01

    Five major areas of work currently being pursued in the United States in solar energy which will have a significant impact on the world's energy situation in the future are addressed. The five significant areas discussed include a technical description of several solar technologies, current and projected cost of the selected solar systems, and cost methodologies which are under development. In addition, sensitivity considerations which are unique to solar energy systems and end user applications are included. A total of six solar technologies - biomass, photovoltaics, wind, ocean thermal energy conversion (OTEC), solar thermal, and industrial process heat (IPH) - have been included in a brief technical description to present the variety of systems and their technical status. System schematics have been included of systems which have been constructed, are currently in the detail design and test stage of development, or are of a conceptual nature.

  16. Scaling predictive modeling in drug development with cloud computing.

    PubMed

    Moghadam, Behrooz Torabi; Alvarsson, Jonathan; Holm, Marcus; Eklund, Martin; Carlsson, Lars; Spjuth, Ola

    2015-01-26

    Growing data sets and the increasing time needed to analyze them are hampering predictive modeling in drug discovery. Model building can be carried out on high-performance computer clusters, but these can be expensive to purchase and maintain. We have evaluated ligand-based modeling on cloud computing resources where computations are parallelized and run on the Amazon Elastic Cloud. We trained models on open data sets of varying sizes for the end points logP and Ames mutagenicity and compared the results with model building parallelized on a traditional high-performance computing cluster. We show that while high-performance computing results in faster model building, the use of cloud computing resources is feasible for large data sets and scales well within cloud instances. An additional advantage of cloud computing is that the costs of predictive models can be easily quantified, and a choice can be made between speed and economy. The easy access to computational resources with no up-front investments makes cloud computing an attractive alternative for scientists, especially for those without access to a supercomputer, and our study shows that it enables cost-efficient modeling of large data sets on demand within reasonable time.
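The speed-versus-economy choice mentioned above reduces to comparing runtime times hourly rate times instance count across configurations. The prices and runtimes below are hypothetical, not actual Amazon rates:

```python
# Cloud job cost = runtime (hours) * hourly rate ($/instance-hour)
#                  * number of instances.
# All rates and runtimes here are made-up illustrative values.

def job_cost(runtime_hours, hourly_rate, n_instances):
    return runtime_hours * hourly_rate * n_instances

# Fast option: many large instances, short wall-clock time.
fast = job_cost(runtime_hours=2, hourly_rate=3.20, n_instances=16)
# Economical option: fewer small instances, longer wall-clock time.
cheap = job_cost(runtime_hours=24, hourly_rate=0.40, n_instances=4)
```

Because cloud billing makes both numbers explicit up front, the modeler can pick a point on the speed/cost curve per job rather than amortizing a fixed cluster.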

  17. Computational mechanics

    SciTech Connect

    Goudreau, G.L.

    1993-03-01

    The Computational Mechanics thrust area sponsors research into the underlying solid, structural and fluid mechanics and heat transfer necessary for the development of state-of-the-art general purpose computational software. The scale of computational capability spans office workstations, departmental computer servers, and Cray-class supercomputers. The DYNA, NIKE, and TOPAZ codes have achieved world fame through our broad collaborators program, in addition to their strong support of on-going Lawrence Livermore National Laboratory (LLNL) programs. Several technology transfer initiatives have been based on these established codes, teaming LLNL analysts and researchers with counterparts in industry, extending code capability to specific industrial interests of casting, metalforming, and automobile crash dynamics. The next-generation solid/structural mechanics code, ParaDyn, is targeted toward massively parallel computers, which will extend performance from gigaflop to teraflop power. Our work for FY-92 is described in the following eight articles: (1) Solution Strategies: New Approaches for Strongly Nonlinear Quasistatic Problems Using DYNA3D; (2) Enhanced Enforcement of Mechanical Contact: The Method of Augmented Lagrangians; (3) ParaDyn: New Generation Solid/Structural Mechanics Codes for Massively Parallel Processors; (4) Composite Damage Modeling; (5) HYDRA: A Parallel/Vector Flow Solver for Three-Dimensional, Transient, Incompressible Viscous Flow; (6) Development and Testing of the TRIM3D Radiation Heat Transfer Code; (7) A Methodology for Calculating the Seismic Response of Critical Structures; and (8) Reinforced Concrete Damage Modeling.

  18. Costs and cost containment in nursing homes.

    PubMed Central

    Smith, H L; Fottler, M D

    1981-01-01

    The study examines the impact of structural and process variables on the cost of nursing home care and the utilization of various cost containment methods in 43 California nursing homes. Several predictors were statistically significant in their relation to cost per patient day. A diverse range of cost containment techniques was discovered along with strong predictors of the utilization of these techniques by nursing home administrators. The trade-off between quality of care and cost of care is discussed. PMID:7228713

  19. Additive Similarity Trees

    ERIC Educational Resources Information Center

    Sattath, Shmuel; Tversky, Amos

    1977-01-01

    Tree representations of similarity data are investigated. Hierarchical clustering is critically examined, and a more general procedure, called the additive tree, is presented. The additive tree representation is then compared to multidimensional scaling. (Author/JKS)

  20. Assessing the Cost Efficiency of Italian Universities

    ERIC Educational Resources Information Center

    Agasisti, Tommaso; Salerno, Carlo

    2007-01-01

    This study uses Data Envelopment Analysis to evaluate the cost efficiency of 52 Italian public universities. In addition to being one of the first such cost studies of the Italian system, it explicitly takes into account the internal cost structure of institutions' education programs; a task not prevalent in past Data Envelopment Analysis studies…

  1. Marginal Costing Techniques for Higher Education.

    ERIC Educational Resources Information Center

    Allen, Richard; Brinkman, Paul

    The techniques for calculating marginal costs in higher education are examined in detail. Marginal cost, as defined in economics, is the change in total cost associated with producing one additional unit of output. In higher education, the most frequently selected unit of output is a full-time-equivalent student or, alternatively, a student…
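
    The definition above can be sketched in a few lines of code. Everything here is a hypothetical illustration, not data from the paper: the total-cost function assumes a fixed cost plus per-student costs with mild economies of scale, so the marginal cost falls as enrollment grows.

```python
def total_cost(students: int) -> float:
    """Hypothetical total cost: $2M fixed plus per-student costs with
    mild economies of scale (each additional student costs slightly less)."""
    fixed = 2_000_000
    variable = sum(8_000 * (0.999 ** n) for n in range(students))
    return fixed + variable

def marginal_cost(students: int) -> float:
    """Change in total cost from enrolling one additional student."""
    return total_cost(students + 1) - total_cost(students)

print(round(marginal_cost(1_000)))  # marginal cost at 1,000 students
print(round(marginal_cost(5_000)))  # lower, reflecting economies of scale
```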

  2. Synthesis of malhamensilipin A exploiting iterative epoxidation/chlorination: experimental and computational analysis of epoxide-derived chloronium ions† †Electronic supplementary information (ESI) available. CCDC 1470484 and 1470483. For ESI and crystallographic data in CIF or other electronic format see DOI: 10.1039/c6sc03012b

    PubMed Central

    Saska, J.; Lewis, W.

    2016-01-01

    We report a 12-step catalytic enantioselective formal synthesis of malhamensilipin A (3) and diastereoisomeric analogues from (E)-2-undecenal. The convergent synthesis relied upon iterative epoxidation and phosphorus(v)-mediated deoxydichlorination reactions as well as a titanium-mediated epoxide-opening to construct the C11–C16 stereohexad. The latter transformation occurred with very high levels of stereoretention regardless of the C13 configuration of the parent epoxide, implicating anchimeric assistance of either the γ- or δ-chlorine atoms, and the formation of chloretanium or chlorolanium ions, respectively. A computational analysis of the chloronium ion intermediates provided support for the involvement of chlorolanium ions, whereas the potential chloretanium ions were found to be less likely intermediates on the basis of their greater carbocationic character.

  3. Impact of Classroom Computer Use on Computer Anxiety.

    ERIC Educational Resources Information Center

    Lambert, Matthew E.; And Others

    Increasing use of computer programs for undergraduate psychology education has raised concern over the impact of computer anxiety on educational performance. Additionally, some researchers have indicated that classroom computer use can exacerbate pre-existing computer anxiety. To evaluate the relationship between in-class computer use and computer…

  4. Unraveling Higher Education's Costs.

    ERIC Educational Resources Information Center

    Gordon, Gus; Charles, Maria

    1998-01-01

    The activity-based costing (ABC) method of analyzing institutional costs in higher education involves four procedures: determining the various discrete activities of the organization; calculating the cost of each; determining the cost drivers; tracing cost to the cost objective or consumer of each activity. Few American institutions have used the…
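
    The four ABC procedures named above can be sketched directly. The activities, figures, and the example program below are all hypothetical, chosen only to make each step concrete.

```python
# Activity-based costing in four steps: (1) identify activities,
# (2) cost each activity, (3) determine cost drivers and their rates,
# (4) trace costs to the cost objective (here, a degree program).

activities = {
    # activity: (total annual cost, cost driver, total driver volume)
    "advising":  (300_000, "advising hours", 10_000),
    "lecturing": (900_000, "credit hours",   45_000),
    "grading":   (150_000, "exams graded",   30_000),
}

def driver_rate(activity: str) -> float:
    """Step 3: cost per unit of the activity's cost driver."""
    cost, _, volume = activities[activity]
    return cost / volume

def traced_cost(consumption: dict) -> float:
    """Step 4: trace activity costs to one cost objective, based on how
    much of each driver it consumes."""
    return sum(driver_rate(a) * units for a, units in consumption.items())

# A hypothetical program consuming a slice of each activity:
program = {"advising": 500, "lecturing": 3_000, "grading": 2_000}
print(traced_cost(program))
```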

  5. 25 CFR 700.81 - Monthly housing cost.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 25 Indians 2 2011-04-01 2011-04-01 false Monthly housing cost. 700.81 Section 700.81 Indians THE... Policies and Instructions Definitions § 700.81 Monthly housing cost. (a) General. The term monthly housing...) Computation of monthly housing cost for replacement dwelling. A person's monthly housing cost for...

  6. 25 CFR 700.81 - Monthly housing cost.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 2 2010-04-01 2010-04-01 false Monthly housing cost. 700.81 Section 700.81 Indians THE... Policies and Instructions Definitions § 700.81 Monthly housing cost. (a) General. The term monthly housing...) Computation of monthly housing cost for replacement dwelling. A person's monthly housing cost for...

  7. 25 CFR 700.81 - Monthly housing cost.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 25 Indians 2 2012-04-01 2012-04-01 false Monthly housing cost. 700.81 Section 700.81 Indians THE... Policies and Instructions Definitions § 700.81 Monthly housing cost. (a) General. The term monthly housing...) Computation of monthly housing cost for replacement dwelling. A person's monthly housing cost for...

  8. Artist meets computer

    NASA Astrophysics Data System (ADS)

    Faggin, Marzia

    1997-04-01

    I would like to share my experience of using the computer for creating art. I am a graphic designer originally trained without any exposure to the computer. I graduated in July of 1994 from a four-year curriculum of graphic design at the Istituto Europeo di Design in Milan, Italy. Italy is famous for its excellent design capability. Art and beauty influence the life of nearly every Italian. Everywhere you look on the streets there is art, from grandiose architecture to the displays in shop windows. A keen esthetic sense and a search for and appreciation of quality permeate all aspects of Italian life, manifesting in the way people cut their hair, the style of the clothes, and how furniture and everyday objects are designed. Italian taste is fine-tuned to the appreciation of refined textiles, and quality materials are often enhanced by simple design. The Italian culture has a long history of excellent artisanship, and good craftsmanship is highly appreciated. Gadgets have never been popular in Italian society. Gadgets are considered useless objects which add nothing to a person's life, and since they cost money they are actually viewed as a waste. The same is true for food: except in the big cities filled with tourists, fast food chains have never survived. Genuine and simple food is what people truly desire. A typical Italian sandwich, for example, is minimalist; the essential ingredients are left alone without additional sauces, because if something is delicious by itself why would anyone want to disguise its taste?

  9. Cost aggregation and occlusion handling with WLS in stereo matching.

    PubMed

    Min, Dongbo; Sohn, Kwanghoon

    2008-08-01

    This paper presents a novel method for cost aggregation and occlusion handling for stereo matching. In order to estimate optimal cost, given a per-pixel difference image as observed data, we define an energy function and solve the minimization problem by solving the iterative equation with the numerical method. We improve performance and increase the convergence rate by using several acceleration techniques such as the Gauss-Seidel method, the multiscale approach, and adaptive interpolation. The proposed method is computationally efficient since it does not use color segmentation or any global optimization techniques. For occlusion handling, which has not been performed effectively by any conventional cost aggregation approaches, we combine the occlusion problem with the proposed minimization scheme. Asymmetric information is used so that few additional computational loads are necessary. Experimental results show that performance is comparable to that of many state-of-the-art methods. The proposed method is in fact the most successful among all cost aggregation methods based on standard stereo test beds.
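
    The energy-minimization idea above can be reduced to a minimal 1-D sketch: treat the per-pixel matching cost as observed data d, define the energy E(x) = Σᵢ (xᵢ − dᵢ)² + λ Σᵢ (xᵢ − xᵢ₊₁)², and minimize it with Gauss-Seidel iterations. This is an illustrative reduction only; it omits the paper's edge weights, occlusion handling, multiscale approach, and adaptive interpolation.

```python
def wls_smooth(d, lam=2.0, iters=200):
    """Gauss-Seidel minimization of a 1-D WLS-style energy: data term
    (x_i - d_i)^2 plus smoothness term lam * (x_i - x_{i+1})^2."""
    x = list(d)  # initialize with the raw per-pixel cost
    n = len(d)
    for _ in range(iters):
        for i in range(n):
            left = x[i - 1] if i > 0 else None
            right = x[i + 1] if i < n - 1 else None
            neighbors = [v for v in (left, right) if v is not None]
            # Gauss-Seidel update: solve dE/dx_i = 0 with neighbors fixed,
            # reusing already-updated values (which speeds convergence).
            x[i] = (d[i] + lam * sum(neighbors)) / (1 + lam * len(neighbors))
    return x

noisy = [1.0, 1.2, 0.9, 5.0, 1.1, 1.0, 0.8]  # spike = unreliable cost
smooth = wls_smooth(noisy)
print([round(v, 2) for v in smooth])  # the spike is pulled toward its neighbors
```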

  10. Cincinnati Big Area Additive Manufacturing (BAAM)

    SciTech Connect

    Duty, Chad E.; Love, Lonnie J.

    2015-03-04

    Oak Ridge National Laboratory (ORNL) worked with Cincinnati Incorporated (CI) to demonstrate Big Area Additive Manufacturing which increases the speed of the additive manufacturing (AM) process by over 1000X, increases the size of parts by over 10X and shows a cost reduction of over 100X. ORNL worked with CI to transition the Big Area Additive Manufacturing (BAAM) technology from a proof-of-principle (TRL 2-3) demonstration to a prototype product stage (TRL 7-8).

  11. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Pratt, J. R.; St. Clair, T. L.; Burks, H. D.; Stoakley, D. M.

    1987-01-01

    A method has been found for enhancing the melt flow of thermoplastic polyimides during processing. A high molecular weight 422 copoly(amic acid) or copolyimide was fused with approximately 0.05 to 5 pct by weight of a low molecular weight amic acid or imide additive, and this melt was studied by capillary rheometry. Excellent flow and improved composite properties on graphite resulted from the addition of a PMDA-aniline additive to LARC-TPI. Solution viscosity studies imply that amic acid additives temporarily lower molecular weight and, hence, enlarge the processing window. Thus, compositions containing the additive have a lower melt viscosity for a longer time than those unmodified.

  12. [Food additives and healthiness].

    PubMed

    Heinonen, Marina

    2014-01-01

    Additives are used for improving food structure or preventing its spoilage, for example. Many substances used as additives are also naturally present in food. The safety of additives is evaluated according to commonly agreed principles. If high concentrations of an additive cause adverse health effects for humans, a limit of acceptable daily intake (ADI) is set for it. An additive is a risk only when ADI is exceeded. The healthiness of food is measured on the basis of nutrient density and scientifically proven effects.

  13. Costs and cost-minimisation analysis.

    PubMed

    Robinson, R

    1993-09-18

    Whatever kind of economic evaluation you plan to undertake, the costs must be assessed. In health care these are first of all divided into costs borne by the NHS (like drugs), by patients and their families (like travel), and by the rest of society (like health education). Next the costs have to be valued in monetary terms; direct costs, like wages, pose little problem, but indirect costs (like time spent in hospital) have to have values imputed to them. And that is not all: costs must be further subdivided into average, marginal, and joint costs, which help decisions on how much of a service should be provided. Capital costs (investments in plant, buildings, and machinery) are also important, as are discounting and inflation. In this second article in the series Ray Robinson defines the types of costs, their measurement, and how they should be valued in monetary terms.
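
    The discounting step described above is mechanical enough to sketch: costs incurred in future years are worth less today, so each year's cost is discounted back at an annual rate before being summed. The programme figures and 5% rate below are hypothetical.

```python
def present_value(cost: float, year: int, rate: float = 0.05) -> float:
    """Discount a cost incurred `year` years from now at the given annual rate."""
    return cost / (1 + rate) ** year

def discounted_total(annual_costs, rate=0.05):
    """Total programme cost in today's money (year 0 is undiscounted)."""
    return sum(present_value(c, y, rate) for y, c in enumerate(annual_costs))

# A hypothetical programme costing 10,000 per year for 5 years costs less
# than 50,000 in present-value terms:
print(round(discounted_total([10_000] * 5), 2))
```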

  14. Computers + Student Activities Handbook.

    ERIC Educational Resources Information Center

    Masie, Elliott; Stein, Michele

    Designed to provide schools with the tools to start utilizing computers for student activity programs without additional expenditures, this handbook provides beginning computer users with suggestions and ideas for using computers in such activities as drama clubs, yearbooks, newspapers, activity calendars, accounting programs, room utilization,…

  15. Situational Awareness from a Low-Cost Camera System

    NASA Technical Reports Server (NTRS)

    Freudinger, Lawrence C.; Ward, David; Lesage, John

    2010-01-01

    A method gathers scene information from a low-cost camera system. Existing surveillance systems using sufficient cameras for continuous coverage of a large field necessarily generate enormous amounts of raw data. Digitizing and channeling that data to a central computer and processing it in real time is difficult when using low-cost, commercially available components. A newly developed system is located on a combined power and data wire to form a string-of-lights camera system. Each camera is accessible through this network interface using standard TCP/IP networking protocols. The cameras more closely resemble cell-phone cameras than traditional security camera systems. Processing capabilities are built directly onto the camera backplane, which helps maintain a low cost. The low power requirements of each camera allow the creation of a single imaging system comprising over 100 cameras. Each camera has built-in processing capabilities to detect events and cooperatively share this information with neighboring cameras. The location of the event is reported to the host computer in Cartesian coordinates computed from data correlation across multiple cameras. In this way, events in the field of view can present low-bandwidth information to the host rather than high-bandwidth bitmap data constantly being generated by the cameras. This approach offers greater flexibility than conventional systems, without compromising performance, by using many small, low-cost cameras with overlapping fields of view. This means significantly increased viewing without ignoring surveillance areas, which can occur when pan, tilt, and zoom cameras look away. Additionally, due to the sharing of a single cable for power and data, the installation costs are lower. The technology is targeted toward 3D scene extraction and automatic target tracking for military and commercial applications. Security systems and environmental/vehicular monitoring systems are also potential applications.

  16. Introduction to Cost Analysis in IR: Challenges and Opportunities.

    PubMed

    Roudsari, Bahman; McWilliams, Justin; Bresnahan, Brian; Padia, Siddharth A

    2016-04-01

    Demonstration of value has become increasingly important in the current health care system. This review summarizes four of the most commonly used cost analysis methods relevant to IR that could be adopted to demonstrate the value of IR interventions: the cost minimization study, cost-effectiveness assessment, cost-utility analysis, and cost-benefit analysis. In addition, the issues of true cost versus hospital charges, modeling in cost studies, and sensitivity analysis are discussed.
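
    One of the four methods named above, cost-effectiveness assessment, is commonly summarized by the incremental cost-effectiveness ratio (ICER): the extra cost per extra unit of health effect of a new intervention versus a comparator. The sketch below is a generic illustration with hypothetical numbers, not figures from the review.

```python
def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost per incremental unit of effect (e.g. per QALY)."""
    delta_cost = cost_new - cost_old
    delta_effect = effect_new - effect_old
    if delta_effect == 0:
        # Equal effects: compare costs directly (a cost minimization study).
        raise ValueError("equal effects: use cost minimization instead")
    return delta_cost / delta_effect

# Hypothetical IR intervention vs. comparator: $3,000 more for 0.5 extra QALYs.
print(icer(cost_new=12_000, effect_new=4.5, cost_old=9_000, effect_old=4.0))
```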

  17. Screening and life-cycle cost models for new pulverized-coal heating plants: An integrated computer-based module for the central heating plant economic evaluation program (CHPECON). Final report

    SciTech Connect

    Sheng, R.; Kinast, J.A.; Biederman, R.; Blazek, C.F.; Lin, M.C.

    1995-07-01

    Public Law 99-190 requires the Department of Defense (DOD) to increase the use of coal for steam generation, but DOD also has an obligation to use the most economical fuel. In support of the coal conversion effort, the U.S. Army Construction Engineering Research Laboratories (USACERL) has been tasked to develop a series of screening and life-cycle cost models to determine when and where specific coal-combustion technologies can economically be implemented in Army central heating plants. This report documents a pulverized coal-fired boiler analysis model, part of the USACERL-developed Central Heating Plant Economics model (CHPECON). The model is divided into two parts. A preliminary screening model contains options for evaluating new heating plants and cogeneration facilities fueled with pulverized coal, as well as the previous options. A cost model uses the entries provided by the screening model to provide a conceptual facility design, capital (installed) costs of the facility, operation and maintenance costs over the life of the facility, and life-cycle costs. Using these numbers the model produces a summary value for the total life-cycle cost of the plant, and a levelized cost of service.

  18. Equipment Cost Estimator

    SciTech Connect

    2016-08-24

    The ECE application forecasts annual costs of preventive and corrective maintenance for budgeting purposes. Features within the application enable users to change the specifications of the model to customize the forecast to best fit their needs and to support “what if” analysis. Based on the user's selections, the ECE model forecasts annual maintenance costs. Preventive maintenance costs include the cost of labor to perform preventive maintenance activities at the specified frequency and labor rate. Corrective maintenance costs include the cost of labor and the cost of replacement parts. The application presents forecasted maintenance costs for the next five years in two tables: costs by year and costs by site.
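
    The forecast structure described above can be sketched as follows: preventive cost is labor hours times rate times annual frequency, and corrective cost is labor plus parts scaled by expected annual failures. The equipment record below is hypothetical, not data from the ECE application.

```python
def preventive_cost(hours, rate, per_year):
    """Labor cost of scheduled preventive maintenance for one year."""
    return hours * rate * per_year

def corrective_cost(hours, rate, parts, failures_per_year):
    """Expected annual cost of unscheduled repairs: labor plus parts."""
    return (hours * rate + parts) * failures_per_year

def five_year_forecast(item):
    """Annual maintenance cost for the next five years (flat profile)."""
    annual = (preventive_cost(item["pm_hours"], item["rate"], item["pm_per_year"])
              + corrective_cost(item["cm_hours"], item["rate"],
                                item["parts"], item["failures_per_year"]))
    return [annual] * 5

pump = {"pm_hours": 2, "rate": 60, "pm_per_year": 4,
        "cm_hours": 8, "parts": 350, "failures_per_year": 0.5}
print(five_year_forecast(pump))
```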

  19. Television broadcast from space systems: Technology, costs

    NASA Technical Reports Server (NTRS)

    Cuccia, C. L.

    1981-01-01

    Broadcast satellite systems are described. The technologies which are unique to both high power broadcast satellites and small TV receive-only earth terminals are also described. A cost assessment of both space and earth segments is included and appendices present both a computer model for satellite cost and the pertinent reported experience with the Japanese BSE.

  20. Integrating Cost Models with Systems Engineering Tools

    DTIC Science & Technology

    1994-07-20

    Cost estimation approaches include expert judgment, bottom-up (industrial engineering), and top-down or parametric estimation. Table IV presents a Production Program Training example using the bottom-up approach. Life-cycle costs are generally computed by analogy, expert opinion, or top-down parametric estimation.

  1. Cost of energy from utility-scale PV systems

    SciTech Connect

    Stolte, W.J.; Whisnant, R.A.; McGowin, C.R.

    1994-12-31

    The cost of energy produced by three different photovoltaic (PV) power plants was estimated based on PV cell and module technology expected to be available by 1995. Plant designs were created for two high concentration PV plants (500 suns), both based on advanced back-contact silicon cell technology, and a thin-film, flat plate plant using copper indium diselenide (CIS) cell technology. The concentrator plants included a central receiver plant using stretched-membrane heliostats and a Fresnel-lens module plant, both utilizing two-axis tracking. Basic plant design factors were selected to minimize 30-year levelized energy costs. Total capital requirements to construct the three plants were estimated through detailed cost estimates. Costs of the cell and module components of the plants were determined by modeling their manufacturing processes when producing modules at an annual rate of both 25 MW/year and 100 MW/year. Energy outputs were determined by computer modeling with hourly insolation and temperature profiles for the two sites. Power system simulation studies were carried out to estimate the impact of the PV plants on system power production cost using synthetic, but realistic, utility system definitions. Both high and low growth rate utility system expansion plans were considered, and capacity and energy credits were calculated. Additionally, credits were calculated for environmental externalities. Benefit/cost ratios for each plant and site were determined. The results of the study provide projections in 1990 dollars of the cost of electric energy from utility-scale PV plants assuming a mature technology that may be available by about 1995. The cost of energy produced by the CIS flat plate plant was projected to be as low as 10.8 cents/kWh. The concentrator plant results were only slightly higher at 12.3 cents/kWh for the Fresnel lens plant and 13.1 cents/kWh for the central receiver plant. 18 refs., 11 figs., 7 tabs.
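
    The 30-year levelized energy cost metric used in the study divides discounted lifetime costs by discounted lifetime energy output. The sketch below shows the computation with hypothetical plant figures; they are not the study's estimates and do not reproduce its cents/kWh results.

```python
def levelized_cost(capital, annual_om, annual_kwh, years=30, rate=0.08):
    """Levelized cost of energy ($/kWh): discounted lifetime costs divided
    by discounted lifetime energy output."""
    disc = [(1 + rate) ** -y for y in range(1, years + 1)]
    costs = capital + sum(annual_om * d for d in disc)   # capital paid up front
    energy = sum(annual_kwh * d for d in disc)           # output also discounted
    return costs / energy

# Hypothetical 10 MW plant: $30M capital, $300k/yr O&M, 20 GWh/yr output.
lcoe = levelized_cost(capital=30e6, annual_om=300e3, annual_kwh=20e6)
print(round(lcoe, 3))
```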

  2. Optimal Management Design of a Pump and Treat System at an Industrial Complex using a Parallel Computing Method

    NASA Astrophysics Data System (ADS)

    Park, Y.

    2013-12-01

    Pump and treat systems for groundwater remediation usually require enormous remediation costs, which include remediation time and resources. To reduce remediation costs, it is important to obtain an optimal management design of a pump and treat system. The optimization of a management design requires immense computing time and resources. A parallel computing method, developed in computer science to reduce computing time and resources, was applied to obtain an optimal management design of a pump and treat system. As an example, the optimization using a parallel computing method was performed for a pump and treat system for groundwater remediation at an industrial complex site in Korea. Trichloroethylene (TCE) and other solvents are known to be the main contaminants of groundwater at the site. A genetic algorithm was selected as the optimization technique. For groundwater flow and contaminant transport simulations, MODFLOW, MT3D and RT3D were selected. To test cost effectiveness, various cases were conceived and optimized. The cost effectiveness of remediation was determined by the total cost, which includes the installation costs of pumping wells and the operational costs of the pump and treat system in addition to the cleaning costs of remediation systems. This work is supported by the Korea Ministry of Environment as "The GAIA Project (173-092-010)".
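
    A toy version of the optimization loop described above: a genetic algorithm searching over pumping rates for a fixed set of candidate wells, minimizing total cost (installation plus operation) with a penalty when cleanup fails. The cost model and "cleanup" condition below are hypothetical stand-ins for the MODFLOW/MT3D/RT3D simulations used in the study.

```python
import random

random.seed(0)
N_WELLS, POP, GENS = 3, 30, 40

def total_cost(rates):
    """Hypothetical objective standing in for the flow/transport simulation."""
    install = sum(50_000 for r in rates if r > 0)       # wells actually built
    operate = sum(2_000 * r for r in rates)             # cost grows with pumping
    cleanup_met = sum(rates) >= 30                      # proxy for "plume captured"
    return install + operate + (0 if cleanup_met else 1e9)  # penalize failure

def mutate(rates):
    """Perturb a design; pumping rates cannot go negative."""
    return [max(0.0, r + random.gauss(0, 2)) for r in rates]

# Evolve a population of candidate pumping-rate designs.
pop = [[random.uniform(0, 20) for _ in range(N_WELLS)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=total_cost)
    survivors = pop[:POP // 2]                          # keep the cheapest half
    pop = survivors + [mutate(random.choice(survivors)) for _ in survivors]

best = min(pop, key=total_cost)
print(round(total_cost(best)), [round(r, 1) for r in best])
```

In the real study each fitness evaluation is a full groundwater simulation, which is why the population can be evaluated in parallel to cut wall-clock time.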

  3. Computational methods in drug discovery

    PubMed Central

    Leelananda, Sumudu P

    2016-01-01

    The process for drug discovery and development is challenging, time consuming and expensive. Computer-aided drug discovery (CADD) tools can act as a virtual shortcut, assisting in the expedition of this long process and potentially reducing the cost of research and development. Today CADD has become an effective and indispensable tool in therapeutic development. The human genome project has made available a substantial amount of sequence data that can be used in various drug discovery projects. Additionally, increasing knowledge of biological structures, as well as increasing computer power have made it possible to use computational methods effectively in various phases of the drug discovery and development pipeline. The importance of in silico tools is greater than ever before and has advanced pharmaceutical research. Here we present an overview of computational methods used in different facets of drug discovery and highlight some of the recent successes. In this review, both structure-based and ligand-based drug discovery methods are discussed. Advances in virtual high-throughput screening, protein structure prediction methods, protein–ligand docking, pharmacophore modeling and QSAR techniques are reviewed. PMID:28144341

  4. Computational methods in drug discovery.

    PubMed

    Leelananda, Sumudu P; Lindert, Steffen

    2016-01-01

    The process for drug discovery and development is challenging, time consuming and expensive. Computer-aided drug discovery (CADD) tools can act as a virtual shortcut, assisting in the expedition of this long process and potentially reducing the cost of research and development. Today CADD has become an effective and indispensable tool in therapeutic development. The human genome project has made available a substantial amount of sequence data that can be used in various drug discovery projects. Additionally, increasing knowledge of biological structures, as well as increasing computer power have made it possible to use computational methods effectively in various phases of the drug discovery and development pipeline. The importance of in silico tools is greater than ever before and has advanced pharmaceutical research. Here we present an overview of computational methods used in different facets of drug discovery and highlight some of the recent successes. In this review, both structure-based and ligand-based drug discovery methods are discussed. Advances in virtual high-throughput screening, protein structure prediction methods, protein-ligand docking, pharmacophore modeling and QSAR techniques are reviewed.

  5. Automatic Computer Mapping of Terrain

    NASA Technical Reports Server (NTRS)

    Smedes, H. W.

    1971-01-01

    Computer processing of 17 wavelength bands of visible, reflective infrared, and thermal infrared scanner spectrometer data, and of three wavelength bands derived from color aerial film, has resulted in successful automatic computer mapping of eight or more terrain classes in a Yellowstone National Park test site. The tests involved: (1) supervised and non-supervised computer programs; (2) special preprocessing of the scanner data to reduce computer processing time and cost, and improve the accuracy; and (3) studies of the effectiveness of the proposed Earth Resources Technology Satellite (ERTS) data channels in the automatic mapping of the same terrain, based on simulations using the same set of scanner data. The following terrain classes have been mapped with greater than 80 percent accuracy in a 12-square-mile area with 1,800 feet of relief: (1) bedrock exposures, (2) vegetated rock rubble, (3) talus, (4) glacial kame meadow, (5) glacial till meadow, (6) forest, (7) bog, and (8) water. In addition, shadows of clouds and cliffs are depicted, but were greatly reduced by using preprocessing techniques.

  6. Computers and Computer Resources.

    ERIC Educational Resources Information Center

    Bitter, Gary

    1980-01-01

    This resource directory provides brief evaluative descriptions of six popular home computers and lists selected sources of educational software, computer books, and magazines. For a related article on microcomputers in the schools, see p53-58 of this journal issue. (SJL)

  7. Cloud Computing: An Overview

    NASA Astrophysics Data System (ADS)

    Qian, Ling; Luo, Zhiguo; Du, Yujian; Guo, Leitao

    In order to support the maximum number of users and elastic services with minimum resources, Internet service providers invented cloud computing. Within a few years, cloud computing has become one of the hottest technologies. From the publication of core papers by Google since 2003, to the commercialization of Amazon EC2 in 2006, and to the service offering of AT&T Synaptic Hosting, cloud computing has evolved from internal IT systems to public services, from cost-saving tools to revenue generators, and from ISPs to telecoms. This paper introduces the concept, history, pros and cons of cloud computing as well as the value chain and standardization efforts.

  8. Hydropower Baseline Cost Modeling

    SciTech Connect

    O'Connor, Patrick W.; Zhang, Qin Fen; DeNeale, Scott T.; Chalise, Dol Raj; Centurion, Emma E.

    2015-01-01

    Recent resource assessments conducted by the United States Department of Energy have identified significant opportunities for expanding hydropower generation through the addition of power to non-powered dams and on undeveloped stream-reaches. Additional interest exists in the powering of existing water resource infrastructure such as conduits and canals, upgrading and expanding existing hydropower facilities, and the construction of new pumped storage hydropower. Understanding the potential future role of these hydropower resources in the nation's energy system requires an assessment of the environmental and techno-economic issues associated with expanding hydropower generation. To facilitate these assessments, this report seeks to fill the current gaps in publicly available hydropower cost-estimating tools that can support the national-scale evaluation of hydropower resources.

  9. Laboratory cost control and financial management software.

    PubMed

    Mayer, M

    1998-02-09

    Economic constraints within the health care system advocate the introduction of tighter cost control in clinical laboratories. Detailed cost information forms the basis for cost control and financial management. Based on the cost information, proper decisions regarding priorities, procedure choices, personnel policies and investments can be made. This presentation outlines some principles of cost analysis, describes common limitations of cost analysis, and exemplifies the use of software to achieve optimized cost control. One commercially available cost analysis software package, LabCost, is described in some detail. In addition to providing cost information, LabCost also serves as a general management tool for resource handling, accounting, inventory management and billing. The application of LabCost in the selection process for a new high-throughput analyzer for a large clinical chemistry service is taken as an example of decisions that can be assisted by cost evaluation. It is concluded that laboratory management that wisely utilizes cost analysis to support the decision-making process will undoubtedly have a clear advantage over those laboratories that fail to employ cost considerations to guide their actions.

  10. Light multinary computing

    NASA Astrophysics Data System (ADS)

    Arago, Jaime

    2012-11-01

    Next-generation optical communication and optical computing imply an evolution from binary to multinary computing. Light multinary computing encodes data using pulses of light components in higher orders than binary and processes them using truth tables larger than Boolean ones. This results in less encoded data that can be processed at faster speeds. We use a general-purpose optical transistor as the building block to develop the main computing units for counting, distributing, storing, and logically operating the arithmetic addition of two bytes of base-10 data. Currently available optical switching technologies can be used to physically implement light multinary computing to achieve ultra-high-speed communication and computing.
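
    The base-10 addition unit described above can be sketched in software: instead of Boolean gates, each digit operation is a lookup in a 10×10 truth table producing a sum digit and a carry. This is an illustrative electronic analogue of the idea, not the paper's optical implementation.

```python
# Base-10 "truth tables": larger than Boolean, indexed by two digits.
SUM   = [[(a + b) % 10 for b in range(10)] for a in range(10)]
CARRY = [[(a + b) // 10 for b in range(10)] for a in range(10)]

def multinary_add(x_digits, y_digits):
    """Add two equal-length base-10 digit sequences, least significant first."""
    out, carry = [], 0
    for a, b in zip(x_digits, y_digits):
        s, c = SUM[a][b], CARRY[a][b]
        s, extra = (s + carry) % 10, (s + carry) // 10
        out.append(s)
        carry = c + extra  # carries never exceed 1 in base-10 addition
    out.append(carry)
    return out

# 479 + 842 = 1321, with digits listed least significant first:
print(multinary_add([9, 7, 4], [2, 4, 8]))
```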

  11. Additive Manufactured Product Integrity

    NASA Technical Reports Server (NTRS)

    Waller, Jess; Wells, Doug; James, Steve; Nichols, Charles

    2017-01-01

    NASA is providing key leadership in an international effort linking NASA and non-NASA resources to speed adoption of additive manufacturing (AM) to meet NASA's mission goals. Participants include industry, NASA's space partners, other government agencies, standards organizations and academia. Nondestructive Evaluation (NDE) is identified as a universal need for all aspects of additive manufacturing.

  12. Additional Security Considerations for Grid Management

    NASA Technical Reports Server (NTRS)

    Eidson, Thomas M.

    2003-01-01

    The use of Grid computing environments is growing in popularity. A Grid computing environment is primarily a wide area network that encompasses multiple local area networks, where some of the local area networks are managed by different organizations. A Grid computing environment also includes common interfaces for distributed computing software so that the heterogeneous set of machines that make up the Grid can be used more easily. The other key feature of a Grid is that the distributed computing software includes appropriate security technology. The focus of most Grid software is on the security involved with application execution, file transfers, and other remote computing procedures. However, there are other important security issues related to the management of a Grid and the users who use that Grid. This note discusses these additional security issues and makes several suggestions as how they can be managed.

  13. Compact and low-cost THz QTDS system.

    PubMed

    Probst, Thorsten; Rehn, Arno; Koch, Martin

    2015-08-24

    We present a terahertz quasi time domain spectroscopy (QTDS) system setup which is improved regarding cost and compactness. The diode laser is mounted directly onto the optical delay line, making the optical setup more compact. The system is operated using a Raspberry Pi and an additional sound card. This combination replaces the desktop/laptop computer, the lock-in-amplifier, the stage controller and the signal generator. We examined not only a commercially available stepper motor driven delay line, but also the repurposed internal mechanics from a DVD drive. We characterize the performance of the new system concept.

  14. 24 CFR 208.112 - Cost.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... formatted data, including either the purchase and maintenance of computer hardware or software, or both, the... increases. (b) At the owner's option, the cost of the computer software may include service contracts to... provide maintenance or training or both, the software must be updated to incorporate changes or...

  15. Predicting hospital accounting costs

    PubMed Central

    Newhouse, Joseph P.; Cretin, Shan; Witsberger, Christina J.

    1989-01-01

    Two alternative methods to Medicare Cost Reports that provide information about hospital costs more promptly, but less accurately, are investigated. Both employ utilization data from current-year bills. The first attaches costs to utilization data using cost-to-charge ratios from the previous year's cost report; the second uses charges from the current year's bills. The first method is the more accurate of the two, but even using it, only 40 percent of hospitals had predicted costs within plus or minus 5 percent of actual costs. The feasibility and cost of obtaining cost reports from a small, fast-track sample of hospitals should be investigated. PMID:10313352
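    The two prediction methods described above each reduce to a one-line calculation; the following sketch contrasts them (the dollar figures are illustrative, not from the study):

    ```python
    def predict_cost_ccr(current_charges, prev_year_cost, prev_year_charges):
        """Method 1: apply last year's cost-to-charge ratio to current charges."""
        ccr = prev_year_cost / prev_year_charges
        return current_charges * ccr

    def predict_cost_charges(current_charges):
        """Method 2: use current-year billed charges directly as the cost proxy."""
        return current_charges

    # Hypothetical example: a hospital bills $12.0M this year; last year it
    # reported $9.0M in costs against $10.0M in charges (CCR = 0.9).
    est1 = predict_cost_ccr(12_000_000, 9_000_000, 10_000_000)
    est2 = predict_cost_charges(12_000_000)
    print(est1, est2)  # 10800000.0 12000000
    ```

    Method 2 systematically overstates costs whenever charges exceed costs, which is one reason the cost-to-charge method was found to be the more accurate of the two.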

  16. The addition of computer simulated noise to investigate radiation dose and image quality in images with spatial correlation of statistical noise: an example application to X-ray CT of the brain.

    PubMed

    Britten, A J; Crotty, M; Kiremidjian, H; Grundy, A; Adam, E J

    2004-04-01

    This study validates a method to add spatially correlated statistical noise to an image, applied to transaxial X-ray CT images of the head to simulate exposure reduction by up to 50%. 23 patients undergoing routine head CT had three additional slices acquired for validation purposes, two at the same clinical 420 mAs exposure and one at 300 mAs. Images at the level of the cerebrospinal fluid filled ventricles gave readings of noise from a single image, with subtraction of image pairs to obtain noise readings from non-uniform tissue regions. The spatial correlation of the noise was determined and added to the acquired 420 mAs image to simulate images at 340 mAs, 300 mAs, 260 mAs and 210 mAs. Two radiologists assessed the images, finding little difference between the 300 mAs simulated and acquired images. The presence of periventricular low density lesions (PVLD) was used as an example of the effect of simulated dose reduction on diagnostic accuracy, and visualization of the internal capsule was used as a measure of image quality. Diagnostic accuracy for the diagnosis of PVLD did not fall significantly even down to 210 mAs, though visualization of the internal capsule was poorer at lower exposure. Further work is needed to investigate means of measuring statistical noise without the need for uniform tissue areas, or image pairs. This technique has been shown to allow sufficiently accurate simulation of dose reduction and image quality degradation, even when the statistical noise is spatially correlated.
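    The exposure-reduction simulation above rests on the fact that quantum noise variance in CT scales inversely with mAs, so simulating a lower exposure means adding an independent noise component with standard deviation sigma_ref * sqrt(mAs_ref/mAs_sim - 1). A minimal sketch of this idea, with an assumed separable smoothing kernel standing in for the measured spatial noise correlation:

    ```python
    import numpy as np

    def simulate_reduced_exposure(img, sigma_ref, mas_ref, mas_sim, kernel=None, seed=None):
        """Add spatially correlated Gaussian noise to `img` (acquired at mas_ref
        with measured noise std sigma_ref) so its total noise matches mas_sim.
        Assumes noise variance scales as 1/mAs, so the added component needs
        std = sigma_ref * sqrt(mas_ref/mas_sim - 1)."""
        rng = np.random.default_rng(seed)
        if kernel is None:
            kernel = np.array([0.25, 0.5, 0.25])  # assumed correlation kernel
        white = rng.standard_normal(img.shape)
        # impose spatial correlation with a separable 1-D filter, rows then columns
        corr = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, white)
        corr = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, corr)
        corr /= corr.std()  # renormalize the correlated field to unit std
        sigma_add = sigma_ref * np.sqrt(mas_ref / mas_sim - 1.0)
        return img + sigma_add * corr

    # Simulate a 210 mAs image from a 420 mAs acquisition with noise std 4 HU:
    img420 = np.zeros((64, 64))
    img210 = simulate_reduced_exposure(img420, sigma_ref=4.0, mas_ref=420, mas_sim=210, seed=0)
    ```

    In the study the correlation was measured from the acquired images rather than assumed; the kernel here only illustrates why white noise alone would not reproduce the texture of real CT noise.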

  17. A new application for food customization with additive manufacturing technologies

    NASA Astrophysics Data System (ADS)

    Serenó, L.; Vallicrosa, G.; Delgado, J.; Ciurana, J.

    2012-04-01

    Additive Manufacturing (AM) technologies have emerged as a freeform approach capable of producing almost any complete three dimensional (3D) object from computer-aided design (CAD) data by successively adding material layer by layer. Despite the broad range of possibilities, commercial AM technologies remain complex and expensive, making them suitable only for niche applications. The development of the Fab@Home system as an open AM technology opened up a new range of possibilities for processing different materials, such as edible products. The main objective of this work is to analyze and optimize the manufacturing capacity of this system when producing 3D edible objects. A new heated syringe deposition tool was developed and several process parameters were optimized to adapt this technology to consumers' needs. The results revealed in this study show the potential of this system to produce customized edible objects without requiring specialized operator knowledge, thereby saving manufacturing costs compared with traditional technologies.

  18. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Fletcher, James C. (Inventor); Pratt, J. Richard (Inventor); St.clair, Terry L. (Inventor); Stoakley, Diane M. (Inventor); Burks, Harold D. (Inventor)

    1992-01-01

    A process for preparing polyimides having enhanced melt flow properties is described. The process consists of heating a mixture of a high molecular weight poly-(amic acid) or polyimide with a low molecular weight amic acid or imide additive in the range of 0.05 to 15 percent by weight of additive. The polyimide powders so obtained show improved processability, as evidenced by lower melt viscosity by capillary rheometry. Likewise, films prepared from mixtures of polymers with additives show improved processability with earlier onset of stretching by TMA.

  19. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Pratt, J. Richard (Inventor); St.clair, Terry L. (Inventor); Stoakley, Diane M. (Inventor); Burks, Harold D. (Inventor)

    1993-01-01

    A process for preparing polyimides having enhanced melt flow properties is described. The process consists of heating a mixture of a high molecular weight poly-(amic acid) or polyimide with a low molecular weight amic acid or imide additive in the range of 0.05 to 15 percent by weight of the additive. The polyimide powders so obtained show improved processability, as evidenced by lower melt viscosity by capillary rheometry. Likewise, films prepared from mixtures of polymers with additives show improved processability with earlier onset of stretching by TMA.

  20. Additive manufacturing of hybrid circuits

    DOE PAGES

    Bell, Nelson S.; Sarobol, Pylin; Cook, Adam; ...

    2016-03-26

    There is a rising interest in developing functional electronics using additively manufactured components. Considerations in materials selection and pathways to forming hybrid circuits and devices must demonstrate useful electronic function; must enable integration; and must complement the complex shape, low cost, high volume, and high functionality of structural but generally electronically passive additively manufactured components. This article reviews several emerging technologies being used in industry and research/development to provide integration advantages of fabricating multilayer hybrid circuits or devices. First, we review a maskless, noncontact, direct write (DW) technology that excels in the deposition of metallic colloid inks for electrical interconnects. Second, we review a complementary technology, aerosol deposition (AD), which excels in the deposition of metallic and ceramic powder as consolidated, thick conformal coatings and is additionally patternable through masking. Finally, we show examples of hybrid circuits/devices integrated beyond 2-D planes, using combinations of DW or AD processes and conventional, established processes.

  1. PACE 2: Pricing and Cost Estimating Handbook

    NASA Technical Reports Server (NTRS)

    Stewart, R. D.; Shepherd, T.

    1977-01-01

    An automatic data processing system to be used for the preparation of industrial engineering type manhour and material cost estimates has been established. This computer system has evolved into a highly versatile and flexible tool that significantly reduces computation time, eliminates computational errors, and reduces typing and reproduction time for estimators and pricers, since all mathematical and clerical functions are automatic once basic inputs are derived.

  2. Computational study of the transition state for H₂ addition to Vaska-type complexes (trans-Ir(L)₂(CO)X). Substituent effects on the energy barrier and the origin of the small H₂/D₂ kinetic isotope effect

    SciTech Connect

    Abu-Hasanayn, F.; Goldman, A.S.; Krogh-Jespersen, K. )

    1993-06-03

    Ab initio molecular orbital methods have been used to study transition state properties for the concerted addition reaction of H₂ to Vaska-type complexes, trans-Ir(L)₂(CO)X, 1 (L = PH₃ and X = F, Cl, Br, I, CN, or H; L = NH₃ and X = Cl). Stationary points on the reaction path retaining the trans-L₂ arrangement were located at the Hartree-Fock level using relativistic effective core potentials and valence basis sets of double-ζ quality. The identities of the stationary points were confirmed by normal mode analysis. Activation energy barriers were calculated with electron correlation effects included via Møller-Plesset perturbation theory carried fully through fourth order, MP4(SDTQ). The more reactive complexes feature structurally earlier transition states and larger reaction exothermicities, in accord with the Hammond postulate. The experimentally observed increase in reactivity of Ir(PPh₃)₂(CO)X complexes toward H₂ addition upon going from X = F to X = I is reproduced well by the calculations and is interpreted to be a consequence of diminished halide-to-Ir π-donation by the heavier halogens. Computed activation barriers (L = PH₃) range from 6.1 kcal/mol (X = H) to 21.4 kcal/mol (X = F). Replacing PH₃ by NH₃ when X = Cl increases the barrier from 14.1 to 19.9 kcal/mol. Using conventional transition state theory, the kinetic isotope effects for H₂/D₂ addition are computed to lie between 1.1 and 1.7, with larger values corresponding to earlier transition states. Judging from the computational data presented here, tunneling appears to be unimportant for H₂ addition to these iridium complexes. 51 refs., 4 tabs.
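    The connection between the computed barriers and relative reactivity follows from conventional transition state theory via the Eyring equation, k = (k_B T/h) exp(-ΔG‡/RT). A sketch that treats the reported barriers as effective activation free energies (a simplification; the full TST treatment also involves vibrational partition functions):

    ```python
    import math

    KB = 1.380649e-23    # Boltzmann constant, J/K
    H  = 6.62607015e-34  # Planck constant, J*s
    R  = 1.987204e-3     # gas constant, kcal/(mol*K)

    def eyring_rate(barrier_kcal, T=298.15):
        """Eyring rate constant, treating the computed barrier as an
        effective free energy of activation (a simplifying assumption)."""
        return (KB * T / H) * math.exp(-barrier_kcal / (R * T))

    k_X_H = eyring_rate(6.1)   # X = H: most reactive, lowest barrier
    k_X_F = eyring_rate(21.4)  # X = F: least reactive, highest barrier
    print(k_X_H / k_X_F)       # roughly an 11-order-of-magnitude rate difference
    ```

    The 15.3 kcal/mol spread in barriers thus translates into an enormous spread in predicted room-temperature rates, consistent with the trend in reactivity across the halide series noted above.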

  3. Food Additives and Hyperkinesis

    ERIC Educational Resources Information Center

    Wender, Ester H.

    1977-01-01

    The hypothesis that food additives are causally associated with hyperkinesis and learning disabilities in children is reviewed, and available data are summarized. Available from: American Medical Association 535 North Dearborn Street Chicago, Illinois 60610. (JG)

  4. Smog control fuel additives

    SciTech Connect

    Lundby, W.

    1993-06-29

    A method is described for controlling, reducing, or eliminating ozone and related smog resulting from photochemical reactions between ozone and automotive or industrial gases. The method comprises the addition of iodine or compounds of iodine to hydrocarbon-base fuels prior to or during combustion, in an amount of about 1 part iodine per 240 to 10,000,000 parts fuel by weight, to be accomplished by: (a) the addition of these inhibitors during or after the refining or manufacturing process of liquid fuels; (b) the production of these inhibitors for addition into fuel tanks, such as automotive or industrial tanks; or (c) the addition of these inhibitors into combustion chambers of equipment utilizing solid fuels for the purpose of reducing ozone.

  5. Feasibility study of an Integrated Program for Aerospace-vehicle Design (IPAD) system. Volume 6: Implementation schedule, development costs, operational costs, benefit assessment, impact on company organization, spin-off assessment, phase 1, tasks 3 to 8

    NASA Technical Reports Server (NTRS)

    Garrocq, C. A.; Hurley, M. J.; Dublin, M.

    1973-01-01

    A baseline implementation plan, including alternative implementation approaches for critical software elements and variants to the plan, was developed. The basic philosophy was aimed at: (1) a progressive release of capability for three major computing systems, (2) an end product that was a working tool, (3) giving participation to industry, government agencies, and universities, and (4) emphasizing the development of critical elements of the IPAD framework software. The results of these tasks indicate an IPAD first release capability 45 months after go-ahead, a five year total implementation schedule, and a total developmental cost of 2027 man-months and 1074 computer hours. Several areas of operational cost increases were identified mainly due to the impact of additional equipment needed and additional computer overhead. The benefits of an IPAD system were related mainly to potential savings in engineering man-hours, reduction of design-cycle calendar time, and indirect upgrading of product quality and performance.

  6. Speed test results and hardware/software study of computational speed problem, appendix D

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The HP9845C is a desktop computer which was tested and evaluated for processing speed. A study was made to determine the availability and approximate cost of computers and/or hardware accessories necessary to meet the 20 ms sample period speed requirement. Additional requirements were that the control algorithm could be programmed in a high-level language and that the machine have sufficient storage to store the data from a complete experiment.

  7. 42 CFR 412.29 - Excluded rehabilitation units: Additional requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false Excluded rehabilitation units: Additional... Costs and Inpatient Capital-Related Costs § 412.29 Excluded rehabilitation units: Additional... paid under the prospective payment system specified in § 412.1(a)(3), a rehabilitation unit must...

  8. Blended Teaching and Learning of Computer Programming Skills in Engineering Curricula

    ERIC Educational Resources Information Center

    El-Zein, Abbas; Langrish, Tim; Balaam, Nigel

    2009-01-01

    Many engineering schools include computer programming as part of a first-year course taught to large engineering classes. This approach is effective in rationalizing resources and improving the cost-effectiveness of course delivery. In addition, it can lead to wholesale improvements in teaching and learning. However, class sizes and the variety of…

  9. Health Monitoring System Technology Assessments: Cost Benefits Analysis

    NASA Technical Reports Server (NTRS)

    Kent, Renee M.; Murphy, Dennis A.

    2000-01-01

    The subject of sensor-based structural health monitoring is very diverse and encompasses a wide range of activities, including initiatives and innovations involving the development of advanced sensor, signal processing, data analysis, and actuation and control technologies. In addition, it embraces the consideration of the availability of low-cost, high-quality contributing technologies, computational utilities, and hardware and software resources that enable the operational realization of robust health monitoring technologies. This report presents a detailed analysis of the cost benefits and other logistics and operational considerations associated with the implementation and utilization of sensor-based technologies for use in aerospace structural health monitoring. The scope of this volume is to assess the economic impact, from an end-user perspective, of implementing health monitoring technologies on three structures. It specifically focuses on evaluating the impact on maintaining and supporting these structures with and without health monitoring capability.

  10. High Performance Computing CFRD -- Final Technical Report

    SciTech Connect

    Hope Forsmann; Kurt Hamman

    2003-01-01

    At the Idaho National Engineering and Environmental Laboratory (INEEL), it is crucial to know the capabilities of a software package’s SMP (shared memory processor) version or cluster (distributed memory) version. Of utmost importance is knowledge of a software package’s cost and implementation challenges. Additionally, it is important to determine the hardware performance of a computing workstation. The level of performance of software is inextricably tied to the computer hardware upon which it is run. Bechtel can do more for its clients in the same amount of time and/or solve more complex problems if computer workstations and associated software are optimized. As a Bechtel Management and Operations Facility, INEEL engineers and scientists find solutions to problems important to Bechtel. Both INEEL engineers and managers must be informed and educated in high performance computing (HPC) techniques and issues to better accomplish their research.

  11. Group Sparse Additive Models

    PubMed Central

    Yin, Junming; Chen, Xi; Xing, Eric P.

    2016-01-01

    We consider the problem of sparse variable selection in nonparametric additive models, with the prior knowledge of the structure among the covariates to encourage those variables within a group to be selected jointly. Previous works either study the group sparsity in the parametric setting (e.g., group lasso), or address the problem in the nonparametric setting without exploiting the structural information (e.g., sparse additive models). In this paper, we present a new method, called group sparse additive models (GroupSpAM), which can handle group sparsity in additive models. We generalize the ℓ1/ℓ2 norm to Hilbert spaces as the sparsity-inducing penalty in GroupSpAM. Moreover, we derive a novel thresholding condition for identifying the functional sparsity at the group level, and propose an efficient block coordinate descent algorithm for constructing the estimate. We demonstrate by simulation that GroupSpAM substantially outperforms the competing methods in terms of support recovery and prediction accuracy in additive models, and also conduct a comparative experiment on a real breast cancer dataset.
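    The effect of the ℓ1/ℓ2 penalty can be illustrated with its proximal operator, groupwise soft-thresholding, which is the standard building block of block coordinate descent for group-sparse models. This finite-dimensional numpy sketch is ours, not the paper's Hilbert-space formulation:

    ```python
    import numpy as np

    def group_soft_threshold(v, lam):
        """Proximal operator of the l1/l2 (group lasso) penalty: shrinks the
        whole group `v` toward zero, and zeroes it entirely when ||v||_2 <= lam.
        This is what lets variables within a group be selected jointly."""
        norm = np.linalg.norm(v)
        if norm <= lam:
            return np.zeros_like(v)
        return (1.0 - lam / norm) * v

    # A weak group is removed entirely; a strong group is shrunk but kept.
    weak   = group_soft_threshold(np.array([0.1, -0.2]), lam=0.5)  # -> [0, 0]
    strong = group_soft_threshold(np.array([3.0, 4.0]),  lam=0.5)  # norm 5, scaled by 0.9
    ```

    In GroupSpAM the analogous thresholding condition is derived for whole groups of smooth functions rather than coefficient vectors, but the all-or-nothing behavior at the group level is the same.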

  12. College of Architecture Addition, Burchard Hall.

    ERIC Educational Resources Information Center

    Design Cost Data, 2001

    2001-01-01

    Describes the architectural design, costs, general description, and square footage data for the College of Architecture Addition, Burchard Hall in Blacksburg, Virginia. A floor plan and photos are included along with a list of manufacturers and suppliers used for the project. (GR)

  13. 30 CFR 256.53 - Additional bonds.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 2 2011-07-01 2011-07-01 false Additional bonds. 256.53 Section 256.53 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, REGULATION, AND ENFORCEMENT, DEPARTMENT OF THE INTERIOR... the Government and the estimated costs of lease abandonment and cleanup are less than the...

  14. Phenylethynyl Containing Reactive Additives

    NASA Technical Reports Server (NTRS)

    Connell, John W. (Inventor); Smith, Joseph G., Jr. (Inventor); Hergenrother, Paul M. (Inventor)

    2002-01-01

    Phenylethynyl containing reactive additives were prepared from aromatic diamines containing phenylethynyl groups and various ratios of phthalic anhydride and 4-phenylethynylphthalic anhydride in glacial acetic acid to form the imide in one step, or in N-methyl-2-pyrrolidinone to form the amide acid intermediate. The reactive additives were mixed in various amounts (10% to 90%) with oligomers containing either terminal or pendent phenylethynyl groups (or both) to reduce the melt viscosity and thereby enhance processability. Upon thermal cure, the additives react and become chemically incorporated into the matrix and effect an increase in crosslink density relative to that of the host resin. This resultant increase in crosslink density has advantageous consequences on the cured resin properties such as higher glass transition temperature and higher modulus as compared to that of the host resin.

  15. Fused Lasso Additive Model

    PubMed Central

    Petersen, Ashley; Witten, Daniela; Simon, Noah

    2016-01-01

    We consider the problem of predicting an outcome variable using p covariates that are measured on n independent observations, in a setting in which additive, flexible, and interpretable fits are desired. We propose the fused lasso additive model (FLAM), in which each additive function is estimated to be piecewise constant with a small number of adaptively-chosen knots. FLAM is the solution to a convex optimization problem, for which a simple algorithm with guaranteed convergence to a global optimum is provided. FLAM is shown to be consistent in high dimensions, and an unbiased estimator of its degrees of freedom is proposed. We evaluate the performance of FLAM in a simulation study and on two data sets. Supplemental materials are available online, and the R package flam is available on CRAN. PMID:28239246

  16. Phenylethynyl Containing Reactive Additives

    NASA Technical Reports Server (NTRS)

    Connell, John W. (Inventor); Smith, Joseph G., Jr. (Inventor); Hergenrother, Paul M. (Inventor)

    2002-01-01

    Phenylethynyl containing reactive additives were prepared from aromatic diamines containing phenylethynyl groups and various ratios of phthalic anhydride and 4-phenylethynylphthalic anhydride in glacial acetic acid to form the imide in one step, or in N-methyl-2-pyrrolidinone to form the amide acid intermediate. The reactive additives were mixed in various amounts (10% to 90%) with oligomers containing either terminal or pendent phenylethynyl groups (or both) to reduce the melt viscosity and thereby enhance processability. Upon thermal cure, the additives react and become chemically incorporated into the matrix and effect an increase in crosslink density relative to that of the host resin. This resultant increase in crosslink density has advantageous consequences on the cured resin properties such as higher glass transition temperature and higher modulus as compared to that of the host resin.

  17. The computer-based lecture.

    PubMed

    Wofford, M M; Spickard, A W; Wofford, J L

    2001-07-01

    Advancing computer technology, cost-containment pressures, and the desire to make innovative improvements in medical education argue for moving learning resources to the computer. A reasonable target for such a strategy is the traditional clinical lecture. The purpose of the lecture, the advantages and disadvantages of "live" versus computer-based lectures, and the technical options in computerizing the lecture deserve attention in developing a cost-effective, complementary learning strategy that preserves the teacher-learner relationship. Based on a literature review of the traditional clinical lecture, we build on the strengths of the lecture format and discuss strategies for converting the lecture to a computer-based learning presentation.

  18. Additives in plastics.

    PubMed Central

    Deanin, R D

    1975-01-01

    The polymers used in plastics are generally harmless. However, they are rarely used in pure form. In almost all commercial plastics, they are "compounded" with monomeric ingredients to improve their processing and end-use performance. In order of total volume used, these monomeric additives may be classified as follows: reinforcing fibers, fillers, and coupling agents; plasticizers; colorants; stabilizers (halogen stabilizers, antioxidants, ultraviolet absorbers, and biological preservatives); processing aids (lubricants, others, and flow controls); flame retardants, peroxides; and antistats. Some information is already available, and much more is needed, on potential toxicity and safe handling of these additives during processing and manufacture of plastics products. PMID:1175566

  19. Additives in plastics.

    PubMed

    Deanin, R D

    1975-06-01

    The polymers used in plastics are generally harmless. However, they are rarely used in pure form. In almost all commercial plastics, they are "compounded" with monomeric ingredients to improve their processing and end-use performance. In order of total volume used, these monomeric additives may be classified as follows: reinforcing fibers, fillers, and coupling agents; plasticizers; colorants; stabilizers (halogen stabilizers, antioxidants, ultraviolet absorbers, and biological preservatives); processing aids (lubricants, others, and flow controls); flame retardants, peroxides; and antistats. Some information is already available, and much more is needed, on potential toxicity and safe handling of these additives during processing and manufacture of plastics products.

  20. Neutron Characterization for Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Watkins, Thomas; Bilheux, Hassina; An, Ke; Payzant, Andrew; DeHoff, Ryan; Duty, Chad; Peter, William; Blue, Craig; Brice, Craig A.

    2013-01-01

    Oak Ridge National Laboratory (ORNL) is leveraging decades of experience in neutron characterization of advanced materials together with resources such as the Spallation Neutron Source (SNS) and the High Flux Isotope Reactor (HFIR) shown in Fig. 1 to solve challenging problems in additive manufacturing (AM). Additive manufacturing, or three-dimensional (3-D) printing, is a rapidly maturing technology wherein components are built by selectively adding feedstock material at locations specified by a computer model. The majority of these technologies use thermally driven phase change mechanisms to convert the feedstock into functioning material. As the molten material cools and solidifies, the component is subjected to significant thermal gradients, generating significant internal stresses throughout the part (Fig. 2). As layers are added, inherent residual stresses cause warping and distortions that lead to geometrical differences between the final part and the original computer generated design. This effect also limits geometries that can be fabricated using AM, such as thin-walled, high-aspect- ratio, and overhanging structures. Distortion may be minimized by intelligent toolpath planning or strategic placement of support structures, but these approaches are not well understood and often "Edisonian" in nature. Residual stresses can also impact component performance during operation. For example, in a thermally cycled environment such as a high-pressure turbine engine, residual stresses can cause components to distort unpredictably. Different thermal treatments on as-fabricated AM components have been used to minimize residual stress, but components still retain a nonhomogeneous stress state and/or demonstrate a relaxation-derived geometric distortion. Industry, federal laboratory, and university collaboration is needed to address these challenges and enable the U.S. to compete in the global market. Work is currently being conducted on AM technologies at the ORNL