Sample records for computational cost increases

  1. 20 CFR 226.13 - Cost-of-living increase in employee vested dual benefit.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... RAILROAD RETIREMENT ACT COMPUTING EMPLOYEE, SPOUSE, AND DIVORCED SPOUSE ANNUITIES Computing an Employee... increase is based on the cost-of-living increases in social security benefits during the period from...

  2. Cut Costs with Thin Client Computing.

    ERIC Educational Resources Information Center

    Hartley, Patrick H.

    2001-01-01

    Discusses how school districts can considerably increase the number of administrative computers in their districts without a corresponding increase in costs by using the "Thin Client" component of the Total Cost of Ownership (TCO) model. TCO and Thin Client are described, including their software and hardware components. An example of a…

  3. 20 CFR 404.278 - Additional cost-of-living increase.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Section 404.278 Employees' Benefits SOCIAL SECURITY ADMINISTRATION FEDERAL OLD-AGE, SURVIVORS AND DISABILITY INSURANCE (1950- ) Computing Primary Insurance Amounts Cost-Of-Living Increases § 404.278...) Measuring period for the additional increase—(1) Beginning. To compute the additional increase, we begin...

  4. Guidelines for the Design of Computers and Information Processing Systems to Increase Their Access by Persons with Disabilities. Version 2.0.

    ERIC Educational Resources Information Center

    Vanderheiden, Gregg C.; Lee, Charles C.

    Many low-cost and no-cost modifications to computers would greatly increase the number of disabled individuals who could use standard computers without requiring custom modifications, and would increase the ability to attach special input and output systems. The purpose of the Guidelines is to provide an awareness of these access problems and a…

  5. 20 CFR 404.270 - Cost-of-living increases.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Cost-of-living increases. 404.270 Section 404.270 Employees' Benefits SOCIAL SECURITY ADMINISTRATION FEDERAL OLD-AGE, SURVIVORS AND DISABILITY INSURANCE (1950- ) Computing Primary Insurance Amounts Cost-Of-Living Increases § 404.270 Cost-of-living...

  6. Cloud computing for comparative genomics

    PubMed Central

    2010-01-01

    Background Large comparative genomics studies and tools are becoming increasingly compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Compute Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. Results We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high-capacity compute nodes using the Amazon Web Services Elastic MapReduce and included a wide mix of large and small genomes. The total computation took just under 70 hours and cost a total of $6,302 USD. Conclusions The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provide a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems. PMID:20482786
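    As a rough illustration of how the totals reported in this record decompose into per-unit costs, the sketch below recomputes cost per node-hour and per RSD process from the quoted figures; the assumption that all 100 nodes were busy for the full 70 hours is ours, not the authors'.

```python
# Back-of-envelope check of the figures reported in the record above
# (illustrative only; assumes all 100 nodes ran for the full ~70 hours,
# which the abstract does not state explicitly).

TOTAL_COST_USD = 6302      # reported total EC2 cost
NODES = 100                # high-capacity compute nodes
WALL_HOURS = 70            # "just under 70 hours"
PROCESSES = 300_000        # RSD-cloud processes

node_hours = NODES * WALL_HOURS
print(f"Effective cost per node-hour: ${TOTAL_COST_USD / node_hours:.2f}")
print(f"Effective cost per RSD process: ${TOTAL_COST_USD / PROCESSES:.3f}")
```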

  7. Cloud computing for comparative genomics.

    PubMed

    Wall, Dennis P; Kudtarkar, Parul; Fusaro, Vincent A; Pivovarov, Rimma; Patil, Prasad; Tonellato, Peter J

    2010-05-18

    Large comparative genomics studies and tools are becoming increasingly compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Compute Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high-capacity compute nodes using the Amazon Web Services Elastic MapReduce and included a wide mix of large and small genomes. The total computation took just under 70 hours and cost a total of $6,302 USD. The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provide a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.

  8. 20 CFR 404.275 - How is an automatic cost-of-living increase calculated?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... calculated? 404.275 Section 404.275 Employees' Benefits SOCIAL SECURITY ADMINISTRATION FEDERAL OLD-AGE, SURVIVORS AND DISABILITY INSURANCE (1950- ) Computing Primary Insurance Amounts Cost-Of-Living Increases... compute the average of the CPI for the quarters that begin and end the measuring period by adding the...

  9. Spacelab experiment computer study. Volume 1: Executive summary (presentation)

    NASA Technical Reports Server (NTRS)

    Lewis, J. L.; Hodges, B. C.; Christy, J. O.

    1976-01-01

    A quantitative cost for various Spacelab flight hardware configurations is provided along with varied software development options. A cost analysis of Spacelab computer hardware and software is presented. The cost study is discussed based on utilization of a central experiment computer with optional auxiliary equipment. Ground rules and assumptions used in deriving the costing methods for all options in the Spacelab experiment study are presented. The ground rules and assumptions are analysed, and the options, along with their cost considerations, are discussed. It is concluded that Spacelab program cost for software development and maintenance is independent of experimental hardware and software options, that the distributed standard computer concept simplifies software integration without a significant increase in cost, and that decisions on flight computer hardware configurations should not be made until payload selection for a given mission and a detailed analysis of the mission requirements are completed.

  10. Cost and resource utilization associated with use of computed tomography to evaluate chest pain in the emergency department: the Rule Out Myocardial Infarction using Computer Assisted Tomography (ROMICAT) study.

    PubMed

    Hulten, Edward; Goehler, Alexander; Bittencourt, Marcio Sommer; Bamberg, Fabian; Schlett, Christopher L; Truong, Quynh A; Nichols, John; Nasir, Khurram; Rogers, Ian S; Gazelle, Scott G; Nagurney, John T; Hoffmann, Udo; Blankstein, Ron

    2013-09-01

    Coronary computed tomographic angiography (cCTA) allows rapid, noninvasive exclusion of obstructive coronary artery disease (CAD). However, concern exists whether implementation of cCTA in the assessment of patients presenting to the emergency department with acute chest pain will lead to increased downstream testing and costs compared with alternative strategies. Our aim was to compare observed actual costs of usual care (UC) with projected costs of a strategy including early cCTA in the evaluation of patients with acute chest pain in the Rule Out Myocardial Infarction Using Computer Assisted Tomography I (ROMICAT I) study. We compared cost and hospital length of stay of UC observed among 368 patients enrolled in the ROMICAT I study with projected costs of management based on cCTA. Costs of UC were determined by an electronic cost accounting system. Notably, UC was not influenced by cCTA results because patients and caregivers were blinded to the cCTA results. Costs after early implementation of cCTA were estimated assuming changes in management based on cCTA findings of the presence and severity of CAD. Sensitivity analysis was used to test the influence of key variables on both outcomes and costs. We determined that in comparison with UC, cCTA-guided triage, whereby patients with no CAD are discharged, could reduce total hospital costs by 23% (P<0.001). However, when the prevalence of obstructive CAD increases, index hospitalization cost increases such that when the prevalence of ≥ 50% stenosis is >28% to 33%, the use of cCTA becomes more costly than UC. cCTA may be a cost-saving tool in acute chest pain populations that have a prevalence of potentially obstructive CAD <30%. However, increased cost would be anticipated in populations with higher prevalence of disease.

  11. Cloud computing for comparative genomics with windows azure platform.

    PubMed

    Kim, Insik; Jung, Jae-Yoon; Deluca, Todd F; Nelson, Tristan H; Wall, Dennis P

    2012-01-01

    Cloud computing services have emerged as a cost-effective alternative for cluster systems as the number of genomes and required computation power to analyze them increased in recent years. Here we introduce the Microsoft Azure platform with detailed execution steps and a cost comparison with Amazon Web Services.

  12. Cloud Computing for Comparative Genomics with Windows Azure Platform

    PubMed Central

    Kim, Insik; Jung, Jae-Yoon; DeLuca, Todd F.; Nelson, Tristan H.; Wall, Dennis P.

    2012-01-01

    Cloud computing services have emerged as a cost-effective alternative for cluster systems as the number of genomes and required computation power to analyze them increased in recent years. Here we introduce the Microsoft Azure platform with detailed execution steps and a cost comparison with Amazon Web Services. PMID:23032609

  13. Identifying and locating surface defects in wood: Part of an automated lumber processing system

    Treesearch

    Richard W. Conners; Charles W. McMillin; Kingyao Lin; Ramon E. Vasquez-Espinosa

    1983-01-01

    Continued increases in the cost of materials and labor make it imperative for furniture manufacturers to control costs by improved yield and increased productivity. This paper describes an Automated Lumber Processing System (ALPS) that employs computer tomography, optical scanning technology, the calculation of an optimum cutting strategy, and a computer-driven laser...

  14. Performance Comparison of Mainframe, Workstations, Clusters, and Desktop Computers

    NASA Technical Reports Server (NTRS)

    Farley, Douglas L.

    2005-01-01

    A performance evaluation of a variety of computers frequently found in a scientific or engineering research environment was conducted using synthetic and application program benchmarks. From a performance perspective, emerging commodity processors have superior performance relative to legacy mainframe computers. In many cases, the PC clusters exhibited comparable performance with traditional mainframe hardware when 8-12 processors were used. The main advantage of the PC clusters was related to their cost. Regardless of whether the clusters were built from new computers or created from retired ones, their performance-to-cost ratio was superior to that of the legacy mainframe computers. Finally, the typical annual maintenance cost of legacy mainframe computers is several times the cost of new equipment such as multiprocessor PC workstations. The savings from eliminating the annual maintenance fee on legacy hardware can result in a yearly increase in total computational capability for an organization.

  15. Computational Methods for Stability and Control (COMSAC): The Time Has Come

    NASA Technical Reports Server (NTRS)

    Hall, Robert M.; Biedron, Robert T.; Ball, Douglas N.; Bogue, David R.; Chung, James; Green, Bradford E.; Grismer, Matthew J.; Brooks, Gregory P.; Chambers, Joseph R.

    2005-01-01

    Powerful computational fluid dynamics (CFD) tools have emerged that appear to offer significant benefits as an adjunct to the experimental methods used by the stability and control community to predict aerodynamic parameters. The decreasing cost and increasing availability of computing hours are making these applications increasingly viable. This paper summarizes the efforts of four organizations to utilize high-end CFD tools to address the challenges of the stability and control arena. General motivation and the backdrop for these efforts are summarized, as well as examples of current applications.

  16. 20 CFR 404.273 - When are automatic cost-of-living increases effective?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false When are automatic cost-of-living increases effective? 404.273 Section 404.273 Employees' Benefits SOCIAL SECURITY ADMINISTRATION FEDERAL OLD-AGE, SURVIVORS AND DISABILITY INSURANCE (1950- ) Computing Primary Insurance Amounts Cost-Of-Living Increases...

  17. 20 CFR 404.271 - When automatic cost-of-living increases apply.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false When automatic cost-of-living increases apply. 404.271 Section 404.271 Employees' Benefits SOCIAL SECURITY ADMINISTRATION FEDERAL OLD-AGE, SURVIVORS AND DISABILITY INSURANCE (1950- ) Computing Primary Insurance Amounts Cost-Of-Living Increases § 404...

  18. Evaluating Thin Client Computers for Use by the Polish Army

    DTIC Science & Technology

    2006-06-01

    Figure 15. Annual Electricity Cost and Savings for 5 to 100 Users (source: Thin Client Computing)... 50 percent in hard costs in the first year of thin client network deployment. However, the greatest savings come from the reduction in soft costs... resources from both the classrooms and home. The thin client solution increased the reliability of the IT infrastructure and resulted in cost savings

  19. Virtualization and cloud computing in dentistry.

    PubMed

    Chow, Frank; Muftu, Ali; Shorter, Richard

    2014-01-01

    The use of virtualization and cloud computing has changed the way we use computers. Virtualization is a method of placing software called a hypervisor on the hardware of a computer or a host operating system. It allows a guest operating system to run on top of the physical computer with a virtual machine (i.e., virtual computer). Virtualization allows multiple virtual computers to run on top of one physical computer and to share its hardware resources, such as printers, scanners, and modems. This increases the efficient use of the computer by decreasing costs (e.g., hardware, electricity, administration, and management) since only one physical computer needs to be running. This virtualization platform is the basis for cloud computing. It has expanded into areas of server and storage virtualization. One of the commonly used dental storage systems is cloud storage. Patient information is encrypted as required by the Health Insurance Portability and Accountability Act (HIPAA) and stored on off-site private cloud services for a monthly service fee. As computer costs continue to increase, so too will the need for more storage and processing power. Virtual and cloud computing will be a method for dentists to minimize costs and maximize computer efficiency in the near future. This article will provide some useful information on current uses of cloud computing.

  20. Cost Optimization Model for Business Applications in Virtualized Grid Environments

    NASA Astrophysics Data System (ADS)

    Strebel, Jörg

    The advent of Grid computing gives enterprises an ever increasing choice of computing options, yet research has so far hardly addressed the problem of mixing the different computing options in a cost-minimal fashion. The following paper presents a comprehensive cost model and a mixed integer optimization model which can be used to minimize the IT expenditures of an enterprise and help in decision-making when to outsource certain business software applications. A sample scenario is analyzed and promising cost savings are demonstrated. Possible applications of the model to future research questions are outlined.
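    The record above describes a mixed integer optimization over computing options; a minimal toy version of that kind of sourcing decision is sketched below. The applications, costs, and capacity limit are invented for illustration and do not come from the paper.

```python
# Toy version of the sourcing decision described above: each application runs
# either in-house or on the grid, and we pick the cost-minimal mix subject to
# a fixed in-house capacity. Brute-force enumeration stands in for a real MIP
# solver; all numbers are placeholders.
from itertools import product

apps = {             # app: (in-house cost, grid cost, capacity units needed)
    "crm":       (120, 150, 4),
    "billing":   ( 90, 110, 3),
    "reporting": ( 60,  40, 2),
    "archive":   ( 30,  15, 1),
}
IN_HOUSE_CAPACITY = 6

best = None
for choice in product((0, 1), repeat=len(apps)):       # 1 = outsource to grid
    used = sum(req for (_, _, req), c in zip(apps.values(), choice) if c == 0)
    if used > IN_HOUSE_CAPACITY:                        # infeasible in-house mix
        continue
    cost = sum(grid if c else local
               for (local, grid, _), c in zip(apps.values(), choice))
    if best is None or cost < best[0]:
        best = (cost, dict(zip(apps, choice)))

print("minimal cost:", best[0])
print("outsource to grid:", [a for a, c in best[1].items() if c])
```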

  1. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING: APPLICATION OF COMPUTATIONAL BIOPHYSICAL TRANSPORT, COMPUTATIONAL CHEMISTRY, AND COMPUTATIONAL BIOLOGY

    EPA Science Inventory

    Computational toxicology (CompTox) leverages the significant gains in computing power and computational techniques (e.g., numerical approaches, structure-activity relationships, bioinformatics) realized over the last few years, thereby reducing costs and increasing efficiency i...

  2. Trends in computer hardware and software.

    PubMed

    Frankenfeld, F M

    1993-04-01

    Previously identified and current trends in the development of computer systems and in the use of computers for health care applications are reviewed. Trends identified in a 1982 article were increasing miniaturization and archival ability, increasing software costs, increasing software independence, user empowerment through new software technologies, shorter computer-system life cycles, and more rapid development and support of pharmaceutical services. Most of these trends continue today. Current trends in hardware and software include the increasing use of reduced instruction-set computing, migration to the UNIX operating system, the development of large software libraries, microprocessor-based smart terminals that allow remote validation of data, speech synthesis and recognition, application generators, fourth-generation languages, computer-aided software engineering, object-oriented technologies, and artificial intelligence. Current trends specific to pharmacy and hospitals are the withdrawal of vendors of hospital information systems from the pharmacy market, improved linkage of information systems within hospitals, and increased regulation by government. The computer industry and its products continue to undergo dynamic change. Software development continues to lag behind hardware, and its high cost is offsetting the savings provided by hardware.

  3. Development of a small-scale computer cluster

    NASA Astrophysics Data System (ADS)

    Wilhelm, Jay; Smith, Justin T.; Smith, James E.

    2008-04-01

    An increase in demand for computing power in academia has created a need for high performance machines. The computing power of a single processor has been steadily increasing, but lags behind the demand for fast simulations. Since a single processor has hard limits to its performance, a cluster of computers can multiply the performance of a single computer with the proper software. Cluster computing has therefore become a much sought-after technology. Typical desktop computers could be used for cluster computing, but are not intended for constant full-speed operation and take up more space than rack mount servers. Specialty computers that are designed to be used in clusters meet high availability and space requirements, but can be costly. A market segment exists where custom built desktop computers can be arranged in a rack mount situation, gaining the space savings of traditional rack mount computers while remaining cost effective. To explore these possibilities, an experiment was performed to develop a computing cluster using desktop components for the purpose of decreasing the computation time of advanced simulations. This study indicates that a small-scale cluster can be built from off-the-shelf components which multiplies the performance of a single desktop machine, while minimizing occupied space and still remaining cost effective.

  4. A cost-utility analysis of the use of preoperative computed tomographic angiography in abdomen-based perforator flap breast reconstruction.

    PubMed

    Offodile, Anaeze C; Chatterjee, Abhishek; Vallejo, Sergio; Fisher, Carla S; Tchou, Julia C; Guo, Lifei

    2015-04-01

    Computed tomographic angiography is a diagnostic tool increasingly used for preoperative vascular mapping in abdomen-based perforator flap breast reconstruction. This study compared the use of computed tomographic angiography and the conventional practice of Doppler ultrasonography only in postmastectomy reconstruction using a cost-utility model. Following a comprehensive literature review, a decision analytic model was created using the three most clinically relevant health outcomes in free autologous breast reconstruction with computed tomographic angiography versus Doppler ultrasonography only. Cost and utility estimates for each health outcome were used to derive the quality-adjusted life-years and incremental cost-utility ratio. One-way sensitivity analysis was performed to scrutinize the robustness of the authors' results. Six studies and 782 patients were identified. Cost-utility analysis revealed a baseline cost savings of $3179, a gain in quality-adjusted life-years of 0.25. This yielded an incremental cost-utility ratio of -$12,716, implying a dominant choice favoring preoperative computed tomographic angiography. Sensitivity analysis revealed that computed tomographic angiography was costlier when the operative time difference between the two techniques was less than 21.3 minutes. However, the clinical advantage of computed tomographic angiography over Doppler ultrasonography only showed that computed tomographic angiography would still remain the cost-effective option even if it offered no additional operating time advantage. The authors' results show that computed tomographic angiography is a cost-effective technology for identifying lower abdominal perforators for autologous breast reconstruction. Although the perfect study would be a randomized controlled trial of the two approaches with true cost accrual, the authors' results represent the best available evidence.
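    For readers unfamiliar with the ratio quoted above, the incremental cost-utility ratio follows directly from the two reported deltas; the snippet below just makes the arithmetic explicit.

```python
# The incremental cost-utility ratio (ICUR) quoted in the record above
# follows from the reported deltas: change in cost divided by change in QALYs.
delta_cost = -3179     # baseline cost savings of $3,179 (negative = cheaper)
delta_qaly = 0.25      # gain in quality-adjusted life-years

icur = delta_cost / delta_qaly
print(f"ICUR = ${icur:,.0f} per QALY")   # -> ICUR = $-12,716 per QALY
```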

  5. 24 CFR 208.112 - Cost.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... increases. (b) At the owner's option, the cost of the computer software may include service contracts to... requirements. (c) The source of funds for the purchase of hardware or software, or contracting for services for... formatted data, including either the purchase and maintenance of computer hardware or software, or both, the...

  6. Cost-effective cloud computing: a case study using the comparative genomics tool, roundup.

    PubMed

    Kudtarkar, Parul; Deluca, Todd F; Fusaro, Vincent A; Tonellato, Peter J; Wall, Dennis P

    2010-12-22

    Comparative genomics resources, such as ortholog detection tools and repositories are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource-Roundup-using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon's Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon's computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure.
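    A minimal sketch of the scheduling idea described in this record appears below: estimate each comparison's runtime, then dispatch the longest jobs first so that paid instance hours are not wasted. The runtime model (proportional to the product of genome sizes) and all numbers are assumptions for illustration, not the authors' actual model.

```python
# Illustrative longest-processing-time-first (LPT) dispatch: ordering jobs by
# estimated runtime lets nodes finish at roughly the same time, so fewer
# billed node-hours are idle. Genome sizes and the runtime model are invented.
import heapq

def lpt_makespan(job_hours, n_nodes):
    """Assign jobs longest-first to the least-loaded node; return node loads."""
    loads = [0.0] * n_nodes
    heap = [(0.0, i) for i in range(n_nodes)]
    heapq.heapify(heap)
    for hours in sorted(job_hours, reverse=True):
        load, i = heapq.heappop(heap)
        loads[i] = load + hours
        heapq.heappush(heap, (loads[i], i))
    return loads

genome_sizes = [1.2, 3.5, 0.8, 5.0, 2.1]              # arbitrary relative sizes
jobs = [a * b * 0.01 for i, a in enumerate(genome_sizes)
                     for b in genome_sizes[i + 1:]]    # hours per pair (toy)
loads = lpt_makespan(jobs, n_nodes=3)
print("per-node hours:", [round(x, 2) for x in loads])
print("billed node-hours (makespan * nodes):", round(max(loads) * 3, 2))
```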

  7. A direct-to-drive neural data acquisition system.

    PubMed

    Kinney, Justin P; Bernstein, Jacob G; Meyer, Andrew J; Barber, Jessica B; Bolivar, Marti; Newbold, Bryan; Scholvin, Jorg; Moore-Kochlacs, Caroline; Wentz, Christian T; Kopell, Nancy J; Boyden, Edward S

    2015-01-01

    Driven by the increasing channel count of neural probes, there is much effort being directed to creating increasingly scalable electrophysiology data acquisition (DAQ) systems. However, all such systems still rely on personal computers for data storage, and thus are limited by the bandwidth and cost of the computers, especially as the scale of recording increases. Here we present a novel architecture in which a digital processor receives data from an analog-to-digital converter, and writes that data directly to hard drives, without the need for a personal computer to serve as an intermediary in the DAQ process. This minimalist architecture may support exceptionally high data throughput, without incurring costs to support unnecessary hardware and overhead associated with personal computers, thus facilitating scaling of electrophysiological recording in the future.

  8. A direct-to-drive neural data acquisition system

    PubMed Central

    Kinney, Justin P.; Bernstein, Jacob G.; Meyer, Andrew J.; Barber, Jessica B.; Bolivar, Marti; Newbold, Bryan; Scholvin, Jorg; Moore-Kochlacs, Caroline; Wentz, Christian T.; Kopell, Nancy J.; Boyden, Edward S.

    2015-01-01

    Driven by the increasing channel count of neural probes, there is much effort being directed to creating increasingly scalable electrophysiology data acquisition (DAQ) systems. However, all such systems still rely on personal computers for data storage, and thus are limited by the bandwidth and cost of the computers, especially as the scale of recording increases. Here we present a novel architecture in which a digital processor receives data from an analog-to-digital converter, and writes that data directly to hard drives, without the need for a personal computer to serve as an intermediary in the DAQ process. This minimalist architecture may support exceptionally high data throughput, without incurring costs to support unnecessary hardware and overhead associated with personal computers, thus facilitating scaling of electrophysiological recording in the future. PMID:26388740

  9. Radiation Tolerant, FPGA-Based SmallSat Computer System

    NASA Technical Reports Server (NTRS)

    LaMeres, Brock J.; Crum, Gary A.; Martinez, Andres; Petro, Andrew

    2015-01-01

    The Radiation Tolerant, FPGA-based SmallSat Computer System (RadSat) computing platform exploits a commercial off-the-shelf (COTS) Field Programmable Gate Array (FPGA) with real-time partial reconfiguration to provide increased performance, power efficiency and radiation tolerance at a fraction of the cost of existing radiation hardened computing solutions. This technology is ideal for small spacecraft that require state-of-the-art on-board processing in harsh radiation environments but where using radiation hardened processors is cost prohibitive.

  10. GPSS computer simulation of aircraft passenger emergency evacuations.

    DOT National Transportation Integrated Search

    1978-06-01

    The costs of civil air transport emergency evacuation demonstrations using human subjects have risen as seating capacities of these aircraft have increased. Repeated tests further increase the costs and also the risks of injuries to participants. A m...

  11. Parallel spatial direct numerical simulations on the Intel iPSC/860 hypercube

    NASA Technical Reports Server (NTRS)

    Joslin, Ronald D.; Zubair, Mohammad

    1993-01-01

    The implementation and performance of a parallel spatial direct numerical simulation (PSDNS) approach on the Intel iPSC/860 hypercube is documented. The direct numerical simulation approach is used to compute spatially evolving disturbances associated with the laminar-to-turbulent transition in boundary-layer flows. The feasibility of using the PSDNS on the hypercube to perform transition studies is examined. The results indicate that the direct numerical simulation approach can effectively be parallelized on a distributed-memory parallel machine. By increasing the number of processors, nearly ideal linear speedups are achieved with nonoptimized routines; slower-than-linear speedups are achieved with optimized (machine-dependent library) routines. This slower-than-linear speedup results because the Fast Fourier Transform (FFT) routine dominates the computational cost and exhibits less than ideal speedups. However, with the machine-dependent routines the total computational cost decreases by a factor of 4 to 5 compared with standard FORTRAN routines. The computational cost increases linearly with spanwise, wall-normal, and streamwise grid refinements. The hypercube with 32 processors was estimated to require approximately twice the amount of Cray supercomputer single-processor time to complete a comparable simulation; however, it is estimated that a subgrid-scale model, which reduces the required number of grid points and becomes a large-eddy simulation (PSLES), would reduce the computational cost and memory requirements by a factor of 10 over the PSDNS. This PSLES implementation would enable transition simulations on the hypercube at a reasonable computational cost.
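    The abstract's cost statements (work linear in grid points, roughly linear speedup for the non-optimized routines) can be turned into a back-of-envelope estimator; the sketch below does so with an invented reference timing, not a measured value from the paper.

```python
# Rough cost model implied by the record above: work grows linearly with the
# number of grid points, and speedup is roughly linear in processor count.
# The reference timing and efficiency are invented placeholders.
def est_hours(nx, ny, nz, procs,
              ref_points=64**3, ref_hours_1proc=100.0, efficiency=0.9):
    points = nx * ny * nz
    serial_hours = ref_hours_1proc * points / ref_points
    return serial_hours / (procs * efficiency)

base = est_hours(64, 64, 64, procs=32)
fine = est_hours(128, 64, 64, procs=32)       # refine the streamwise grid 2x
print(f"base grid:  {base:.1f} h")
print(f"refined 2x: {fine:.1f} h  (~2x, cost scales linearly with points)")
print(f"with a PSLES-style 10x reduction: {fine / 10:.2f} h")
```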

  12. Cloud Computing: Should It Be Integrated into the Curriculum?

    ERIC Educational Resources Information Center

    Changchit, Chuleeporn

    2015-01-01

    Cloud computing has become increasingly popular among users and businesses around the world, and education is no exception. Cloud computing can bring an increased number of benefits to an educational setting, not only for its cost effectiveness, but also for the thirst for technology that college students have today, which allows learning and…

  13. The economics of time shared computing: Congestion, user costs and capacity

    NASA Technical Reports Server (NTRS)

    Agnew, C. E.

    1982-01-01

    Time shared systems permit the fixed costs of computing resources to be spread over large numbers of users. However, bottleneck results in the theory of closed queueing networks can be used to show that this economy of scale will be offset by the increased congestion that results as more users are added to the system. If one considers the total costs, including the congestion cost, there is an optimal number of users for a system which equals the saturation value usually used to define system capacity.
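    A minimal congestion model in the spirit of this record is sketched below: the machine's fixed cost is shared by N users while each user's delay cost grows as the system nears saturation, so the per-user total cost has an interior minimum. The M/M/1 delay term and all parameters are illustrative assumptions, not the paper's closed-queueing-network analysis.

```python
# Toy trade-off between economy of scale and congestion: fixed cost is spread
# over N users, but response time blows up as total load approaches capacity.
FIXED_COST = 1000.0      # cost of the machine per hour
SERVICE_RATE = 50.0      # jobs/hour the machine can complete
DEMAND_PER_USER = 1.0    # jobs/hour submitted by each user
DELAY_COST = 5.0         # cost a user assigns to one hour of waiting

def per_user_cost(n):
    load = n * DEMAND_PER_USER
    if load >= SERVICE_RATE:                  # past saturation: unbounded delay
        return float("inf")
    delay = 1.0 / (SERVICE_RATE - load)       # M/M/1 response time (hours/job)
    return FIXED_COST / n + DELAY_COST * DEMAND_PER_USER * delay

best_n = min(range(1, 60), key=per_user_cost)
print("cost-minimizing number of users:", best_n)
```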

  14. Application of ubiquitous computing in personal health monitoring systems.

    PubMed

    Kunze, C; Grossmann, U; Stork, W; Müller-Glaser, K D

    2002-01-01

    One way to significantly reduce the costs of public health systems is to make greater use of information technology. The Laboratory for Information Processing Technology (ITIV) at the University of Karlsruhe is developing a personal health monitoring system, which should improve health care and at the same time reduce costs by combining micro-technological smart sensors with personalized, mobile computing systems. In this paper we present how ubiquitous computing theory can be applied in the health-care domain.

  15. Cost-Effective Cloud Computing: A Case Study Using the Comparative Genomics Tool, Roundup

    PubMed Central

    Kudtarkar, Parul; DeLuca, Todd F.; Fusaro, Vincent A.; Tonellato, Peter J.; Wall, Dennis P.

    2010-01-01

    Background Comparative genomics resources, such as ortholog detection tools and repositories are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource—Roundup—using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Methods Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon’s Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. Results We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon’s computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure. PMID:21258651

  16. The thermodynamic efficiency of computations made in cells across the range of life

    NASA Astrophysics Data System (ADS)

    Kempes, Christopher P.; Wolpert, David; Cohen, Zachary; Pérez-Mercader, Juan

    2017-11-01

    Biological organisms must perform computation as they grow, reproduce and evolve. Moreover, ever since Landauer's bound was proposed, it has been known that all computation has some thermodynamic cost, and that the same computation can be achieved with greater or smaller thermodynamic cost depending on how it is implemented. Accordingly, an important issue concerning the evolution of life is assessing the thermodynamic efficiency of the computations performed by organisms. This issue is interesting both from the perspective of how close life has come to maximally efficient computation (presumably under the pressure of natural selection), and from the practical perspective of what efficiencies we might hope that engineered biological computers might achieve, especially in comparison with current computational systems. Here we show that the computational efficiency of translation, defined as free energy expended per amino acid operation, outperforms the best supercomputers by several orders of magnitude, and is only about an order of magnitude worse than the Landauer bound. However, this efficiency depends strongly on the size and architecture of the cell in question. In particular, we show that the useful efficiency of an amino acid operation, defined as the bulk energy per amino acid polymerization, decreases for increasing bacterial size and converges to the polymerization cost of the ribosome. This cost of the largest bacteria does not change in cells as we progress through the major evolutionary shifts to both single- and multicellular eukaryotes. However, the rates of total computation per unit mass are non-monotonic in bacteria with increasing cell size, and also change across different biological architectures, including the shift from unicellular to multicellular eukaryotes. This article is part of the themed issue 'Reconceptualizing the origins of life'.
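    The Landauer bound referenced in this record is k_B T ln 2 of free energy per irreversible bit operation; the sketch below evaluates it near physiological temperature and shows how an order-of-magnitude comparison would be made against an assumed per-operation energy (the comparison figure is a placeholder, not the paper's measurement).

```python
# Landauer limit: k_B * T * ln 2 per irreversible bit operation.
# The per-amino-acid energy used for comparison is a made-up placeholder.
import math

K_B = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                   # roughly physiological temperature, K

landauer_J = K_B * T * math.log(2)
assumed_aa_op_J = 3e-20     # hypothetical energy per amino-acid operation,
                            # ~10x the bound, echoing the order-of-magnitude
                            # claim in the abstract above

print(f"Landauer bound at {T:.0f} K: {landauer_J:.2e} J per bit")
print(f"assumed operation / Landauer bound: {assumed_aa_op_J / landauer_J:.0f}x")
```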

  17. The impact of technological change on census taking.

    PubMed

    Brackstone, G J

    1984-01-01

    The increasing costs of traditional census collection methods have forced census administrators to look at the possibility of using administrative record systems in order to obtain population data. This article looks at the recent technological developments which have taken place in the last decade, and how they may affect data collection for the 1990 census. Because it is important to allow sufficient developmental and testing time for potential automated methods and technologies, it is not too soon to look at the trends resulting from technological advances and their implications for census data collection. These trends are: 1) the declining ratio of computing costs to manpower costs; 2) the increasing ratio of power and capacity of computers to their physical size; 3) declining data storage costs; 4) the increasing public acceptance of computers; 5) the increasing workforce familiarity with computers; and 6) the growing interactive computing capacity. Traditional use of computers for government data gathering operations was primarily for the processing stage. Now the possibility of applying these trends to census material may influence all aspects of the process, from questionnaire design and production to data analysis. Examples include the production of high quality maps for geographic frameworks, optical readers for data entry, the ability to provide users with a final data base as well as printed output, and quicker dissemination of data results. Although these options exist, just like the use of administrative records for statistical purposes, they must be carefully analysed in the context of the purposes for which they were created. Administrative records have limitations of their own: definition, coverage, and quality problems could bias statistical data derived from them. Perhaps they should be used as potential complementary sources of data, and not as replacements for census data. Influencing the evolution of these administrative records will help increase their chances of being used for future census information.

  18. An economic model to evaluate cost-effectiveness of computer assisted knee replacement surgery in Norway.

    PubMed

    Gøthesen, Øystein; Slover, James; Havelin, Leif; Askildsen, Jan Erik; Malchau, Henrik; Furnes, Ove

    2013-07-06

    The use of Computer Assisted Surgery (CAS) for knee replacements is intended to improve the alignment of knee prostheses in order to reduce the number of revision operations. Is the cost effectiveness of computer assisted surgery influenced by patient volume and age? By employing a Markov model, we analysed the cost effectiveness of computer assisted surgery versus conventional arthroplasty with respect to implant survival and operation volume in two theoretical Norwegian age cohorts. We obtained mortality and hospital cost data over a 20-year period from Norwegian registers. We presumed that the cost of an intervention would need to be below NOK 500,000 per QALY (Quality Adjusted Life Year) gained, to be considered cost effective. The added cost of computer assisted surgery, provided this has no impact on implant survival, is NOK 1037 and NOK 1414 respectively for 60 and 75-year-olds per quality-adjusted life year at a volume of 25 prostheses per year, and NOK 128 and NOK 175 respectively at a volume of 250 prostheses per year. Sensitivity analyses showed that the 10-year implant survival in cohort 1 needs to rise from 89.8% to 90.6% at 25 prostheses per year, and from 89.8 to 89.9% at 250 prostheses per year for computer assisted surgery to be considered cost effective. In cohort 2, the required improvement is a rise from 95.1% to 95.4% at 25 prostheses per year, and from 95.10% to 95.14% at 250 prostheses per year. The cost of using computer navigation for total knee replacements may be acceptable for 60-year-old as well as 75-year-old patients if the technique increases the implant survival rate just marginally, and the department has a high operation volume. A low volume department might not achieve cost-effectiveness unless computer navigation has a more significant impact on implant survival, thus may defer the investments until such data are available.

  19. Computer Based Education.

    ERIC Educational Resources Information Center

    Fauley, Franz E.

    1980-01-01

    A case study of what one company did to increase the productivity of its sales force and generate cost savings by using computer-assisted instruction to teach salespeople at regional offices. (Editor)

  20. Real-time interactive 3D computer stereography for recreational applications

    NASA Astrophysics Data System (ADS)

    Miyazawa, Atsushi; Ishii, Motonaga; Okuzawa, Kazunori; Sakamoto, Ryuuichi

    2008-02-01

    With the increasing calculation costs of 3D computer stereography, low-cost, high-speed implementation requires effective distribution of computing resources. In this paper, we attempt to re-classify 3D display technologies on the basis of humans' 3D perception, in order to determine what level of presence or reality is required in recreational video game systems. We then discuss the design and implementation of stereography systems in two categories of the new classification.

  1. Securing the Data Storage and Processing in Cloud Computing Environment

    ERIC Educational Resources Information Center

    Owens, Rodney

    2013-01-01

    Organizations increasingly utilize cloud computing architectures to reduce costs and energy consumption both in the data warehouse and on mobile devices by better utilizing the computing resources available. However, the security and privacy issues with publicly available cloud computing infrastructures have not been studied to a sufficient depth…

  2. Assessment of regional management strategies for controlling seawater intrusion

    USGS Publications Warehouse

    Reichard, E.G.; Johnson, T.A.

    2005-01-01

    Simulation-optimization methods, applied with adequate sensitivity tests, can provide useful quantitative guidance for controlling seawater intrusion. This is demonstrated in an application to the West Coast Basin of coastal Los Angeles that considers two management options for improving hydraulic control of seawater intrusion: increased injection into barrier wells and in lieu delivery of surface water to replace current pumpage. For the base-case optimization analysis, assuming constant groundwater demand, in lieu delivery was determined to be most cost effective. Reduced-cost information from the optimization provided guidance for prioritizing locations for in lieu delivery. Model sensitivity to a suite of hydrologic, economic, and policy factors was tested. Raising the imposed average water-level constraint at the hydraulic-control locations resulted in nonlinear increases in cost. Systematic varying of the relative costs of injection and in lieu water yielded a trade-off curve between relative costs and injection/in lieu amounts. Changing the assumed future scenario to one of increasing pumpage in the adjacent Central Basin caused a small increase in the computed costs of seawater intrusion control. Changing the assumed boundary condition representing interaction with an adjacent basin did not affect the optimization results. Reducing the assumed hydraulic conductivity of the main productive aquifer resulted in a large increase in the model-computed cost. Journal of Water Resources Planning and Management © ASCE.

  3. Possible Computer Vision Systems and Automated or Computer-Aided Edging and Trimming

    Treesearch

    Philip A. Araman

    1990-01-01

    This paper discusses research which is underway to help our industry reduce costs, increase product volume and value recovery, and market more accurately graded and described products. The research is part of a team effort to help the hardwood sawmill industry automate with computer vision systems, and computer-aided or computer controlled processing. This paper...

  4. Computer imaging and workflow systems in the business office.

    PubMed

    Adams, W T; Veale, F H; Helmick, P M

    1999-05-01

    Computer imaging and workflow technology automates many business processes that currently are performed using paper processes. Documents are scanned into the imaging system and placed in electronic patient account folders. Authorized users throughout the organization, including preadmission, verification, admission, billing, cash posting, customer service, and financial counseling staff, have online access to the information they need when they need it. Such streamlining of business functions can increase collections and customer satisfaction while reducing labor, supply, and storage costs. Because the costs of a comprehensive computer imaging and workflow system can be considerable, healthcare organizations should consider implementing parts of such systems that can be cost-justified or include implementation as part of a larger strategic technology initiative.

  5. Developing Flexible Networked Lighting Control Systems

    Science.gov Websites

    ... Bluetooth, ZigBee and others are increasingly used for building control purposes. Low-cost computation: bundling digital intelligence at the sensors and lights adds virtually no incremental cost. Coupled with cost... Research Goals and Objectives: This project, "Developing Flexible, Networked Lighting Control

  6. Can broader diffusion of value-based insurance design increase benefits from US health care without increasing costs? Evidence from a computer simulation model.

    PubMed

    Braithwaite, R Scott; Omokaro, Cynthia; Justice, Amy C; Nucifora, Kimberly; Roberts, Mark S

    2010-02-16

    Evidence suggests that cost sharing (i.e.,copayments and deductibles) decreases health expenditures but also reduces essential care. Value-based insurance design (VBID) has been proposed to encourage essential care while controlling health expenditures. Our objective was to estimate the impact of broader diffusion of VBID on US health care benefits and costs. We used a published computer simulation of costs and life expectancy gains from US health care to estimate the impact of broader diffusion of VBID. Two scenarios were analyzed: (1) applying VBID solely to pharmacy benefits and (2) applying VBID to both pharmacy benefits and other health care services (e.g., devices). We assumed that cost sharing would be eliminated for high-value services (<$100,000 per life-year), would remain unchanged for intermediate- or unknown-value services ($100,000-$300,000 per life-year or unknown), and would be increased for low-value services (>$300,000 per life-year). All costs are provided in 2003 US dollars. Our simulation estimated that approximately 60% of health expenditures in the US are spent on low-value services, 20% are spent on intermediate-value services, and 20% are spent on high-value services. Correspondingly, the vast majority (80%) of health expenditures would have cost sharing that is impacted by VBID. With prevailing patterns of cost sharing, health care conferred 4.70 life-years at a per-capita annual expenditure of US$5,688. Broader diffusion of VBID to pharmaceuticals increased the benefit conferred by health care by 0.03 to 0.05 additional life-years, without increasing costs and without increasing out-of-pocket payments. Broader diffusion of VBID to other health care services could increase the benefit conferred by health care by 0.24 to 0.44 additional life-years, also without increasing costs and without increasing overall out-of-pocket payments. Among those without health insurance, using cost saving from VBID to subsidize insurance coverage would increase the benefit conferred by health care by 1.21 life-years, a 31% increase. Broader diffusion of VBID may amplify benefits from US health care without increasing health expenditures.
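    The VBID rule summarized above maps a service's cost per life-year to a cost-sharing change; a minimal sketch of that mapping follows, with invented example services and copays.

```python
# Sketch of the value-based insurance design rule described above: cost
# sharing is removed for high-value services (<$100,000 per life-year), left
# unchanged for intermediate or unknown value, and increased for low-value
# services (>$300,000 per life-year). Example services and copays are invented.
def adjust_cost_sharing(cost_per_life_year, current_copay):
    if cost_per_life_year is None:                 # value unknown
        return current_copay
    if cost_per_life_year < 100_000:               # high value
        return 0.0
    if cost_per_life_year <= 300_000:              # intermediate value
        return current_copay
    return current_copay * 2                       # low value: raise sharing

services = {                      # service: ($ per life-year, current copay)
    "statin (high-risk patient)":   (40_000, 25.0),
    "imaging of uncertain benefit": (None,   50.0),
    "marginal device":              (450_000, 100.0),
}
for name, (value, copay) in services.items():
    print(f"{name}: copay {copay:.0f} -> {adjust_cost_sharing(value, copay):.0f}")
```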

  7. Performance, Agility and Cost of Cloud Computing Services for NASA GES DISC Giovanni Application

    NASA Astrophysics Data System (ADS)

    Pham, L.; Chen, A.; Wharton, S.; Winter, E. L.; Lynnes, C.

    2013-12-01

    The NASA Goddard Earth Science Data and Information Services Center (GES DISC) is investigating the performance, agility and cost of Cloud computing for GES DISC applications. Giovanni (Geospatial Interactive Online Visualization ANd aNalysis Infrastructure), one of the core applications at the GES DISC for online climate-related Earth science data access, subsetting, analysis, visualization, and downloading, was used to evaluate the feasibility and effort of porting an application to the Amazon Cloud Services platform. The performance and the cost of running Giovanni on the Amazon Cloud were compared to similar parameters for the GES DISC local operational system. A Giovanni Time-Series analysis of aerosol absorption optical depth (388 nm) from OMI (Ozone Monitoring Instrument)/Aura was selected for these comparisons. All required data were pre-cached in both the Cloud and local system to avoid data transfer delays. The 3-, 6-, 12-, and 24-month data were used for analysis on both the Cloud and the local system, and the processing times for the analysis were used to evaluate system performance. To investigate application agility, Giovanni was installed and tested on multiple Cloud platforms. The cost of using a Cloud computing platform mainly consists of computing, storage, data requests, and data transfer in/out. The Cloud computing cost is calculated based on the hourly rate, and the storage cost is calculated based on the rate of Gigabytes per month. Incoming data transfer is free; the cost of outgoing data transfer is based on a per-Gigabyte rate. The costs for a local server system consist of buying hardware/software, system maintenance/updating, and operating costs. The results showed that the Cloud platform had 38% better performance and cost 36% less than the local system. This investigation shows the potential of cloud computing to increase system performance and lower the overall cost of system management.
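    The cost components listed in this record (hourly compute, per-GB-month storage, and per-GB egress versus amortized local hardware plus operations) can be compared with a simple estimator; the sketch below uses placeholder rates and workloads rather than the study's actual figures.

```python
# Toy cloud-versus-local monthly cost comparison in the spirit of the record
# above. Every rate and workload value here is a placeholder, not data from
# the study.
def cloud_monthly_cost(compute_hours, storage_gb, egress_gb,
                       hourly=0.50, gb_month=0.10, egress_per_gb=0.09):
    return (compute_hours * hourly
            + storage_gb * gb_month
            + egress_gb * egress_per_gb)

def local_monthly_cost(hardware=12_000, lifetime_months=36,
                       ops_per_month=400.0):
    # amortized hardware purchase plus maintenance/operations
    return hardware / lifetime_months + ops_per_month

cloud = cloud_monthly_cost(compute_hours=600, storage_gb=2000, egress_gb=300)
local = local_monthly_cost()
print(f"cloud: ${cloud:,.0f}/month")
print(f"local: ${local:,.0f}/month")
print(f"cloud is {100 * (local - cloud) / local:.0f}% cheaper in this toy case")
```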

  8. Multimedia Instructional Tools' Impact on Student Motivation and Learning Strategies in Computer Applications Courses

    ERIC Educational Resources Information Center

    Chapman, Debra; Wang, Shuyan

    2015-01-01

    Multimedia instructional tools (MMIT) have been identified as a way to effectively and economically present instructional material. MMITs are commonly used in introductory computer applications courses, as MMITs should be effective in increasing student knowledge and positively impacting motivation and learning strategies, without increasing costs. This…

  9. Using Bloom's and Webb's Taxonomies to Integrate Emerging Cybersecurity Topics into a Computing Curriculum

    ERIC Educational Resources Information Center

    Harris, Mark A.; Patten, Karen P.

    2015-01-01

    Recent high profile hackings have cost companies millions of dollars resulting in an increasing priority to protect government and business data. Universities are under increased pressure to produce graduates with better security knowledge and skills, particularly emerging cybersecurity skills. Although accredited undergraduate computing programs…

  10. Computing, Information, and Communications Technology (CICT) Program Overview

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.

    2003-01-01

    The Computing, Information and Communications Technology (CICT) Program's goal is to enable NASA's Scientific Research, Space Exploration, and Aerospace Technology Missions with greater mission assurance, for less cost, with increased science return through the development and use of advanced computing, information and communication technologies

  11. The changing nature of spacecraft operations: From the Vikings of the 1970's to the great observatories of the 1990's and beyond

    NASA Technical Reports Server (NTRS)

    Ledbetter, Kenneth W.

    1992-01-01

    Four trends in spacecraft flight operations are discussed which will reduce overall program costs. These trends are the use of high-speed, highly reliable data communications systems for distributing operations functions to more convenient and cost-effective sites; the improved capability for remote operation of sensors; a continued rapid increase in memory and processing speed of flight qualified computer chips; and increasingly capable ground-based hardware and software systems, notably those augmented by artificial intelligence functions. Changes reflected by these trends are reviewed starting from the NASA Viking missions of the early 70s, when mission control was conducted at one location using expensive and cumbersome mainframe computers and communications equipment. In the 1980s, powerful desktop computers and modems enabled the Magellan project team to operate the spacecraft remotely. In the 1990s, the Hubble Space Telescope project uses multiple color screens and automated sequencing software on small computers. Given a projection of current capabilities, future control centers will be even more cost-effective.

  12. [Introduction of a bar coding pharmacy stock replenishment system in a prehospital emergency medical unit: economical impact].

    PubMed

    Dupuis, S; Fecci, J-L; Noyer, P; Lecarpentier, E; Chollet-Xémard, C; Margenet, A; Marty, J; Combes, X

    2009-01-01

    To assess the economic impact of introducing a bar coding pharmacy stock replenishment system in a prehospital emergency medical unit. Observational before-and-after study. A computer system using specific software and bar-code technology was introduced in the prehospital emergency medical unit (Smur). Overall activity and costs related to pharmacy were recorded annually during two periods: the 2 years before the computer system was introduced and the 4 years following its installation. The overall clinical activity increased by 10% between the two periods, whereas pharmacy-related costs decreased continuously after the pharmacy management computer system was put into use. Pharmacy stock management was easier after introduction of the new stock replenishment system. The mean pharmacy-related cost of managing one patient was 13 Euros before and 9 Euros after the introduction of the system. The overall cost savings during the studied period were calculated to reach 134,000 Euros. The introduction of a specific pharmacy management computer system allowed substantial cost savings in a prehospital emergency medical unit.

  13. The Cost Effectiveness of 22 Approaches for Raising Student Achievement

    ERIC Educational Resources Information Center

    Yeh, Stuart S.

    2010-01-01

    Review of cost-effectiveness studies suggests that rapid assessment is more cost effective with regard to student achievement than comprehensive school reform (CSR), cross-age tutoring, computer-assisted instruction, a longer school day, increases in teacher education, teacher experience or teacher salaries, summer school, more rigorous math…

  14. Anytime Prediction: Efficient Ensemble Methods for Any Computational Budget

    DTIC Science & Technology

    2014-01-21

    ... difficult problem and is the focus of this work. 1.1 Motivation. The number of machine learning applications which involve real-time and latency-sensitive pre... significantly increasing latency, and the computational costs associated with hosting a service are often critical to its viability. For such... balancing training costs, concerns such as scalability and tractability are often more important, as opposed to factors such as latency, which are more

  15. A Computer Interview for Multivariate Monitoring of Psychiatric Outcome.

    ERIC Educational Resources Information Center

    Stevenson, John F.; And Others

    Application of computer technology to psychiatric outcome measurement offers the promise of coping with increasing demands for extensive patient interviews repeated longitudinally. Described is the development of a cost-effective multi-dimensional tracking device to monitor psychiatric functioning, building on a previous local computer interview…

  16. Navigating the Challenges of the Cloud

    ERIC Educational Resources Information Center

    Ovadia, Steven

    2010-01-01

    Cloud computing is increasingly popular in education. Cloud computing is "the delivery of computer services from vast warehouses of shared machines that enables companies and individuals to cut costs by handing over the running of their email, customer databases or accounting software to someone else, and then accessing it over the internet."…

  17. Computer Electromagnetics and Supercomputer Architecture

    NASA Technical Reports Server (NTRS)

    Cwik, Tom

    1993-01-01

    The dramatic increase in performance over the last decade for microprocessor computations is compared with that for supercomputer computations. This performance, the projected performance, and a number of other issues such as cost and the inherent physical limitations in current supercomputer technology have naturally led to parallel supercomputers and ensembles of interconnected microprocessors.

  18. Microcomputer-Based Organizational Survey Assessment: Applications to Training.

    DTIC Science & Technology

    1987-08-01

    organizations, a growing need for efficient, flexible and cost-effective training programs becomes paramount. To cope with these increased training demands, many organizations have turned to Computer...organizational settings the need for better training will continue to increase (Wexley & Latham, 1981). Recent surveys of the literature document

  19. Cost effectiveness of a computer-delivered intervention to improve HIV medication adherence

    PubMed Central

    2013-01-01

    Background High levels of adherence to medications for HIV infection are essential for optimal clinical outcomes and to reduce viral transmission, but many patients do not achieve required levels. Clinician-delivered interventions can improve patients’ adherence, but usually require substantial effort by trained individuals and may not be widely available. Computer-delivered interventions can address this problem by reducing required staff time for delivery and by making the interventions widely available via the Internet. We previously developed a computer-delivered intervention designed to improve patients’ level of health literacy as a strategy to improve their HIV medication adherence. The intervention was shown to increase patients’ adherence, but it was not clear that the benefits resulting from the increase in adherence could justify the costs of developing and deploying the intervention. The purpose of this study was to evaluate the relation of development and deployment costs to the effectiveness of the intervention. Methods Costs of intervention development were drawn from accounting reports for the grant under which its development was supported, adjusted for costs primarily resulting from the project’s research purpose. Effectiveness of the intervention was drawn from results of the parent study. The relation of the intervention’s effects to changes in health status, expressed as utilities, was also evaluated in order to assess the net cost of the intervention in terms of quality adjusted life years (QALYs). Sensitivity analyses evaluated ranges of possible intervention effectiveness and durations of its effects, and costs were evaluated over several deployment scenarios. Results The intervention’s cost effectiveness depends largely on the number of persons using it and the duration of its effectiveness. Even with modest effects for a small number of patients, the intervention was associated with net cost savings in some scenarios, and for durations of three months and longer it was usually associated with a favorable cost per QALY. For intermediate and larger assumed effects and longer durations of intervention effectiveness, the intervention was associated with net cost savings. Conclusions Computer-delivered adherence interventions may be a cost-effective strategy to improve adherence in persons treated for HIV. Trial registration Clinicaltrials.gov identifier NCT01304186. PMID:23446180

  20. Cost effectiveness of a computer-delivered intervention to improve HIV medication adherence.

    PubMed

    Ownby, Raymond L; Waldrop-Valverde, Drenna; Jacobs, Robin J; Acevedo, Amarilis; Caballero, Joshua

    2013-02-28

    High levels of adherence to medications for HIV infection are essential for optimal clinical outcomes and to reduce viral transmission, but many patients do not achieve required levels. Clinician-delivered interventions can improve patients' adherence, but usually require substantial effort by trained individuals and may not be widely available. Computer-delivered interventions can address this problem by reducing required staff time for delivery and by making the interventions widely available via the Internet. We previously developed a computer-delivered intervention designed to improve patients' level of health literacy as a strategy to improve their HIV medication adherence. The intervention was shown to increase patients' adherence, but it was not clear that the benefits resulting from the increase in adherence could justify the costs of developing and deploying the intervention. The purpose of this study was to evaluate the relation of development and deployment costs to the effectiveness of the intervention. Costs of intervention development were drawn from accounting reports for the grant under which its development was supported, adjusted for costs primarily resulting from the project's research purpose. Effectiveness of the intervention was drawn from results of the parent study. The relation of the intervention's effects to changes in health status, expressed as utilities, was also evaluated in order to assess the net cost of the intervention in terms of quality adjusted life years (QALYs). Sensitivity analyses evaluated ranges of possible intervention effectiveness and durations of its effects, and costs were evaluated over several deployment scenarios. The intervention's cost effectiveness depends largely on the number of persons using it and the duration of its effectiveness. Even with modest effects for a small number of patients, the intervention was associated with net cost savings in some scenarios, and for durations of three months and longer it was usually associated with a favorable cost per QALY. For intermediate and larger assumed effects and longer durations of intervention effectiveness, the intervention was associated with net cost savings. Computer-delivered adherence interventions may be a cost-effective strategy to improve adherence in persons treated for HIV. Clinicaltrials.gov identifier NCT01304186.

  1. The Effect of Computers on School Air-Conditioning.

    ERIC Educational Resources Information Center

    Fickes, Michael

    2000-01-01

    Discusses the issue of increased air-conditioning demand when schools equip their classrooms with computers that require enhanced and costlier air-conditioning systems. Air-conditioning costs are analyzed in two elementary schools and a middle school. (GR)

  2. Space Transportation and the Computer Industry: Learning from the Past

    NASA Technical Reports Server (NTRS)

    Merriam, M. L.; Rasky, D.

    2002-01-01

    Since the space shuttle began flying in 1981, NASA has made a number of attempts to advance the state of the art in space transportation. In spite of billions of dollars invested, and several concerted attempts, no replacement for the shuttle is expected before 2010. Furthermore, the cost of access to space has dropped very slowly over the last two decades. On the other hand, the same two decades have seen dramatic progress in the computer industry. Computational speeds have increased by about a factor of 1000, and available memory, disk space, and network bandwidth have seen similar increases. At the same time, the cost of computing has dropped by about a factor of 10000. Is the space transportation problem simply harder? Or is there something to be learned from the computer industry? In looking for the answers, this paper reviews the early history of NASA's experience with supercomputers and NASA's visionary course change in supercomputer procurement strategy.

  3. Using a Cray Y-MP as an array processor for a RISC Workstation

    NASA Technical Reports Server (NTRS)

    Lamaster, Hugh; Rogallo, Sarah J.

    1992-01-01

    As microprocessors increase in power, the economics of centralized computing has changed dramatically. At the beginning of the 1980's, mainframes and supercomputers were often considered to be cost-effective machines for scalar computing. Today, microprocessor-based RISC (reduced-instruction-set computer) systems have displaced many uses of mainframes and supercomputers. Supercomputers are still cost competitive when processing jobs that require both large memory size and high memory bandwidth. One such application is array processing. Certain numerical operations are appropriate to use in a Remote Procedure Call (RPC)-based environment. Matrix multiplication is an example of an operation that can have a sufficient number of arithmetic operations to amortize the cost of an RPC call. An experiment is described which demonstrates that matrix multiplication can be executed remotely on a large system to speed the execution over that experienced on a workstation.

  4. The role of dedicated data computing centers in the age of cloud computing

    NASA Astrophysics Data System (ADS)

    Caramarcu, Costin; Hollowell, Christopher; Strecker-Kellogg, William; Wong, Antonio; Zaytsev, Alexandr

    2017-10-01

    Brookhaven National Laboratory (BNL) anticipates significant growth in scientific programs with large computing and data storage needs in the near future and has recently reorganized support for scientific computing to meet these needs. A key component is the enhanced role of the RHIC-ATLAS Computing Facility (RACF) in support of high-throughput and high-performance computing (HTC and HPC) at BNL. This presentation discusses the evolving role of the RACF at BNL, in light of its growing portfolio of responsibilities and its increasing integration with cloud (academic and for-profit) computing activities. We also discuss BNL’s plan to build a new computing center to support the new responsibilities of the RACF and present a summary of the cost benefit analysis done, including the types of computing activities that benefit most from a local data center vs. cloud computing. This analysis is partly based on an updated cost comparison of Amazon EC2 computing services and the RACF, which was originally conducted in 2012.

  5. The Changing Hardwood Export Market and Research to Keep the U.S. Competitive

    Treesearch

    Philip A. Araman

    1988-01-01

    Primary hardwood processors face many interrelated market, product, processing, and resource problems generated by the increasing export market. In processing, yields and quality must be increased and costs must be reduced to stay competitive. Computer-aided and computer-controlled automated processing is also needed. The industry needs to keep its products competitive...

  6. Can Broader Diffusion of Value-Based Insurance Design Increase Benefits from US Health Care without Increasing Costs? Evidence from a Computer Simulation Model

    PubMed Central

    Scott Braithwaite, R.; Omokaro, Cynthia; Justice, Amy C.; Nucifora, Kimberly; Roberts, Mark S.

    2010-01-01

    Background Evidence suggests that cost sharing (i.e., copayments and deductibles) decreases health expenditures but also reduces essential care. Value-based insurance design (VBID) has been proposed to encourage essential care while controlling health expenditures. Our objective was to estimate the impact of broader diffusion of VBID on US health care benefits and costs. Methods and Findings We used a published computer simulation of costs and life expectancy gains from US health care to estimate the impact of broader diffusion of VBID. Two scenarios were analyzed: (1) applying VBID solely to pharmacy benefits and (2) applying VBID to both pharmacy benefits and other health care services (e.g., devices). We assumed that cost sharing would be eliminated for high-value services (<$100,000 per life-year), would remain unchanged for intermediate- or unknown-value services ($100,000–$300,000 per life-year or unknown), and would be increased for low-value services (>$300,000 per life-year). All costs are provided in 2003 US dollars. Our simulation estimated that approximately 60% of health expenditures in the US are spent on low-value services, 20% are spent on intermediate-value services, and 20% are spent on high-value services. Correspondingly, the vast majority (80%) of health expenditures would have cost sharing that is impacted by VBID. With prevailing patterns of cost sharing, health care conferred 4.70 life-years at a per-capita annual expenditure of US$5,688. Broader diffusion of VBID to pharmaceuticals increased the benefit conferred by health care by 0.03 to 0.05 additional life-years, without increasing costs and without increasing out-of-pocket payments. Broader diffusion of VBID to other health care services could increase the benefit conferred by health care by 0.24 to 0.44 additional life-years, also without increasing costs and without increasing overall out-of-pocket payments. Among those without health insurance, using cost savings from VBID to subsidize insurance coverage would increase the benefit conferred by health care by 1.21 life-years, a 31% increase. Conclusion Broader diffusion of VBID may amplify benefits from US health care without increasing health expenditures. Please see later in the article for the Editors' Summary PMID:20169114

  7. The Advance of Computing from the Ground to the Cloud

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2009-01-01

    A trend toward the abstraction of computing platforms that has been developing in the broader IT arena over the last few years is just beginning to make inroads into the library technology scene. Cloud computing offers for libraries many interesting possibilities that may help reduce technology costs and increase capacity, reliability, and…

  8. Developing a Research Agenda for Ubiquitous Computing in Schools

    ERIC Educational Resources Information Center

    Zucker, Andrew

    2004-01-01

    Increasing numbers of states, districts, and schools provide every student with a computing device; for example, the middle schools in Maine maintain wireless Internet access and the students receive laptops. Research can provide policymakers with better evidence of the benefits and costs of 1:1 computing and establish which factors make 1:1…

  9. Paper Circuits: A Tangible, Low Threshold, Low Cost Entry to Computational Thinking

    ERIC Educational Resources Information Center

    Lee, Victor R.; Recker, Mimi

    2018-01-01

    In this paper, we propose that paper circuitry provides a productive space for exploring aspects of computational thinking, an increasingly critical 21st century skill for all students. We argue that the creation and operation of paper circuits involve learning about computational concepts such as rule-based constraints, operations, and defined…

  10. The HEPCloud Facility: elastic computing for High Energy Physics - The NOvA Use Case

    NASA Astrophysics Data System (ADS)

    Fuess, S.; Garzoglio, G.; Holzman, B.; Kennedy, R.; Norman, A.; Timm, S.; Tiradani, A.

    2017-10-01

    The need for computing in the HEP community follows cycles of peaks and valleys mainly driven by conference dates, accelerator shutdown, holiday schedules, and other factors. Because of this, the classical method of provisioning these resources at providing facilities has drawbacks such as potential overprovisioning. As the appetite for computing increases, however, so does the need to maximize cost efficiency by developing a model for dynamically provisioning resources only when needed. To address this issue, the HEPCloud project was launched by the Fermilab Scientific Computing Division in June 2015. Its goal is to develop a facility that provides a common interface to a variety of resources, including local clusters, grids, high performance computers, and community and commercial Clouds. Initially targeted experiments include CMS and NOvA, as well as other Fermilab stakeholders. In its first phase, the project has demonstrated the use of the “elastic” provisioning model offered by commercial clouds, such as Amazon Web Services. In this model, resources are rented and provisioned automatically over the Internet upon request. In January 2016, the project demonstrated the ability to increase the total amount of global CMS resources by 58,000 cores from 150,000 cores - a 38 percent increase - in preparation for the Rencontres de Moriond. In March 2016, the NOvA experiment also demonstrated resource burst capabilities with an additional 7,300 cores, achieving a scale almost four times as large as the local allocated resources and utilizing the local AWS S3 storage to optimize data handling operations and costs. NOvA was using the same familiar services used for local computations, such as data handling and job submission, in preparation for the Neutrino 2016 conference. In both cases, the cost was contained by the use of the Amazon Spot Instance Market and the Decision Engine, a HEPCloud component that aims at minimizing cost and job interruption. This paper describes the Fermilab HEPCloud Facility and the challenges overcome for the CMS and NOvA communities.

  11. A Quantitative Features Analysis of Recommended No- and Low-Cost Preschool E-Books

    ERIC Educational Resources Information Center

    Parette, Howard P.; Blum, Craig; Luthin, Katie

    2015-01-01

    In recent years, recommended e-books have drawn increasing attention from early childhood education professionals. This study applied a quantitative descriptive features analysis of cost (n = 70) and no-cost (n = 60) e-books recommended by the Texas Computer Education Association. While t tests revealed no statistically significant differences…

  12. A Brief Description of the Kokkos implementation of the SNAP potential in ExaMiniMD.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, Aidan P.; Trott, Christian Robert

    2017-11-01

    Within the EXAALT project, the SNAP [1] approach is being used to develop high accuracy potentials for use in large-scale long-time molecular dynamics simulations of materials behavior. In particular, we have developed a new SNAP potential that is suitable for describing the interplay between helium atoms and vacancies in high-temperature tungsten[2]. This model is now being used to study plasma-surface interactions in nuclear fusion reactors for energy production. The high accuracy of SNAP potentials comes at the price of increased computational cost per atom and increased computational complexity. The increased cost is mitigated by improvements in strong scaling that can be achieved using advanced algorithms [3].

  13. Indirect Costs of Health Research--How They are Computed, What Actions are Needed. Report by the Comptroller General of the United States.

    ERIC Educational Resources Information Center

    Comptroller General of the U.S., Washington, DC.

    A review by the General Accounting Office of various aspects of indirect costs associated with federal health research grants is presented. After an introduction detailing the scope of the review and defining indirect costs and federal participation, the report focuses on the causes of the rapid increase of indirect costs. Among findings was that…

  14. Using technology to support investigations in the electronic age: tracking hackers to large scale international computer fraud

    NASA Astrophysics Data System (ADS)

    McFall, Steve

    1994-03-01

    With the increase in business automation and the widespread availability and low cost of computer systems, law enforcement agencies have seen a corresponding increase in criminal acts involving computers. The examination of computer evidence is a new field of forensic science with numerous opportunities for research and development. Research is needed to develop new software utilities to examine computer storage media, expert systems capable of finding criminal activity in large amounts of data, and to find methods of recovering data from chemically and physically damaged computer storage media. In addition, defeating encryption and password protection of computer files is also a topic requiring more research and development.

  15. Cardiology office computer use: primer, pointers, pitfalls.

    PubMed

    Shepard, R B; Blum, R I

    1986-10-01

    An office computer is a utility, like an automobile, with benefits and costs that are both direct and hidden and potential for disaster. For the cardiologist or cardiovascular surgeon, the increasing power and decreasing costs of computer hardware and the availability of software make use of an office computer system an increasingly attractive possibility. Management of office business functions is common; handling and scientific analysis of practice medical information are less common. The cardiologist can also access national medical information systems for literature searches and for interactive further education. Selection and testing of programs and the entire computer system before purchase of computer hardware will reduce the chances of disappointment or serious problems. Personnel pretraining and planning for office information flow and medical information security are necessary. Some cardiologists design their own office systems, buy hardware and software as needed, write programs for themselves and carry out the implementation themselves. For most cardiologists, the better course will be to take advantage of the professional experience of expert advisors. This article provides a starting point from which the practicing cardiologist can approach considering, specifying or implementing an office computer system for business functions and for scientific analysis of practice results.

  16. Comparability of Computer Delivered versus Traditional Paper and Pencil Testing

    ERIC Educational Resources Information Center

    Strader, Douglas A.

    2012-01-01

    There are many advantages supporting the use of computers as an alternate mode of delivery for high stakes testing: cost savings, increased test security, flexibility in test administrations, innovations in items, and reduced scoring time. The purpose of this study was to determine if the use of computers as the mode of delivery had any…

  17. Cost-Effectiveness of Four Educational Interventions.

    ERIC Educational Resources Information Center

    Levin, Henry M.; And Others

    This study employs meta-analysis and cost-effectiveness instruments to evaluate and compare cross-age tutoring, computer assistance, class size reductions, and instructional time increases for their utility in improving elementary school reading and math scores. Using intervention effect studies as replication models, researchers first estimate…

  18. Cost-of-illness studies based on massive data: a prevalence-based, top-down regression approach.

    PubMed

    Stollenwerk, Björn; Welchowski, Thomas; Vogl, Matthias; Stock, Stephanie

    2016-04-01

    Despite the increasing availability of routine data, no analysis method has yet been presented for cost-of-illness (COI) studies based on massive data. We aim, first, to present such a method and, second, to assess the relevance of the associated gain in numerical efficiency. We propose a prevalence-based, top-down regression approach consisting of five steps: aggregating the data; fitting a generalized additive model (GAM); predicting costs via the fitted GAM; comparing predicted costs between prevalent and non-prevalent subjects; and quantifying the stochastic uncertainty via error propagation. To demonstrate the method, it was applied in the context of chronic lung disease to aggregated German sickness funds data (from 1999), covering over 7.3 million insured persons. To assess the gain in numerical efficiency, the computational time of the innovative approach has been compared with corresponding GAMs applied to simulated individual-level data. Furthermore, the probability of model failure was modeled via logistic regression. Applying the innovative method was reasonably fast (19 min). In contrast, for patient-level data, computational time increased disproportionately with sample size. Furthermore, using patient-level data was accompanied by a substantial risk of model failure (about 80% for 6 million subjects). The gain in computational efficiency of the innovative COI method seems to be of practical relevance. Furthermore, it may yield more precise cost estimates.
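
    The five steps listed above can be sketched numerically. The fragment below is an illustrative stand-in only: it uses synthetic data, a plain weighted linear fit in place of a full GAM, and a crude covariance-based error propagation, so every variable and number is an assumption rather than the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1: aggregate synthetic individual-level data into (age, prevalent) strata.
n = 50_000
ages = rng.integers(40, 90, size=n)
prevalent = rng.random(n) < 0.10
costs = 800 + 25 * (ages - 40) + 900 * prevalent + rng.normal(0, 300, n)

strata = {}
for a, p, c in zip(ages, prevalent, costs):
    cnt, tot = strata.get((a, p), (0, 0.0))
    strata[(a, p)] = (cnt + 1, tot + c)

X = np.array([[1.0, a, float(p)] for (a, p) in strata])         # intercept, age, prevalence
y = np.array([tot / cnt for cnt, tot in strata.values()])       # mean cost per stratum
w = np.array([cnt for cnt, _ in strata.values()], dtype=float)  # stratum sizes as weights

# Step 2: fit a weighted regression on the aggregated data; in the method described
# above a GAM with smooth terms would replace this plain linear fit.
beta, *_ = np.linalg.lstsq(np.sqrt(w)[:, None] * X, np.sqrt(w) * y, rcond=None)

# Steps 3-4: predict costs with prevalence switched on/off and compare
# (with this simple design the difference reduces to the prevalence coefficient).
X1, X0 = X.copy(), X.copy()
X1[:, 2], X0[:, 2] = 1.0, 0.0
excess = np.average(X1 @ beta - X0 @ beta, weights=w)

# Step 5: crude error propagation via the coefficient covariance matrix.
resid = y - X @ beta
sigma2 = np.sum(w * resid**2) / (w.sum() - X.shape[1])
cov_beta = sigma2 * np.linalg.inv(X.T @ (w[:, None] * X))
print(f"excess cost per prevalent subject: {excess:.0f} +/- {np.sqrt(cov_beta[2, 2]):.0f}")
```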

  19. Addressing the computational cost of large EIT solutions.

    PubMed

    Boyle, Alistair; Borsic, Andrea; Adler, Andy

    2012-05-01

    Electrical impedance tomography (EIT) is a soft field tomography modality based on the application of electric current to a body and measurement of voltages through electrodes at the boundary. The interior conductivity is reconstructed on a discrete representation of the domain using a finite-element method (FEM) mesh and a parametrization of that domain. The reconstruction requires a sequence of numerically intensive calculations. There is strong interest in reducing the cost of these calculations. An improvement in the compute time for current problems would encourage further exploration of computationally challenging problems such as the incorporation of time series data, wide-spread adoption of three-dimensional simulations and correlation of other modalities such as CT and ultrasound. Multicore processors offer an opportunity to reduce EIT computation times but may require some restructuring of the underlying algorithms to maximize the use of available resources. This work profiles two EIT software packages (EIDORS and NDRM) to experimentally determine where the computational costs arise in EIT as problems scale. Sparse matrix solvers, a key component for the FEM forward problem and sensitivity estimates in the inverse problem, are shown to take a considerable portion of the total compute time in these packages. A sparse matrix solver performance measurement tool, Meagre-Crowd, is developed to interface with a variety of solvers and compare their performance over a range of two- and three-dimensional problems of increasing node density. Results show that distributed sparse matrix solvers that operate on multiple cores are advantageous up to a limit that increases as the node density increases. We recommend a selection procedure to find a solver and hardware arrangement matched to the problem and provide guidance and tools to perform that selection.
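
    The scaling issue described above is easy to reproduce in miniature: the sketch below times a direct and an iterative sparse solve of a 2-D Laplacian, a stand-in for an EIT finite-element system, as node density grows. The grid sizes, the Laplacian surrogate, and the solver choices are illustrative assumptions; this is not the EIDORS/NDRM code or the Meagre-Crowd tool.

```python
import time
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve, cg

def laplacian_2d(n):
    """Sparse 5-point Laplacian on an n x n grid (n**2 unknowns), standing in
    for an EIT finite-element system matrix."""
    m = n * n
    main = 4.0 * np.ones(m)
    side = -1.0 * np.ones(m - 1)
    side[np.arange(1, m) % n == 0] = 0.0          # no coupling across grid rows
    updown = -1.0 * np.ones(m - n)
    return sp.diags([main, side, side, updown, updown], [0, 1, -1, n, -n], format="csc")

for n in (50, 100, 200):                           # increasing node density
    A = laplacian_2d(n)
    b = np.ones(A.shape[0])

    t0 = time.perf_counter()
    x_direct = spsolve(A, b)                       # direct sparse factorization
    t_direct = time.perf_counter() - t0

    t0 = time.perf_counter()
    x_iter, info = cg(A, b)                        # iterative conjugate-gradient solve
    t_iter = time.perf_counter() - t0

    rel_res = np.linalg.norm(A @ x_iter - b) / np.linalg.norm(b)
    print(f"unknowns={n * n:>6}  direct={t_direct:.3f}s  cg={t_iter:.3f}s  "
          f"cg residual={rel_res:.1e} (info={info})")
```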

  20. Solution for a bipartite Euclidean traveling-salesman problem in one dimension

    NASA Astrophysics Data System (ADS)

    Caracciolo, Sergio; Di Gioacchino, Andrea; Gherardi, Marco; Malatesta, Enrico M.

    2018-05-01

    The traveling-salesman problem is one of the most studied combinatorial optimization problems, because of the simplicity in its statement and the difficulty in its solution. We characterize the optimal cycle for every convex and increasing cost function when the points are thrown independently and with an identical probability distribution in a compact interval. We compute the average optimal cost for every number of points when the distance function is the square of the Euclidean distance. We also show that the average optimal cost is not a self-averaging quantity by explicitly computing the variance of its distribution in the thermodynamic limit. Moreover, we prove that the cost of the optimal cycle is not smaller than twice the cost of the optimal assignment of the same set of points. Interestingly, this bound is saturated in the thermodynamic limit.
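
    The cycle-versus-assignment bound stated above can be checked by brute force on small random instances. The sketch below is only an illustrative numerical verification for the squared-distance cost with a handful of points; it is not the authors' analytical derivation, and the instance sizes are arbitrary.

```python
import itertools
import random

def cost(a, b):
    return (a - b) ** 2            # convex, increasing function of the distance

def optimal_cycle(reds, blues):
    """Exhaustive minimum over alternating red/blue Hamiltonian cycles."""
    n = len(reds)
    best = float("inf")
    # Fix reds[0] as the starting point to remove rotational symmetry.
    for red_order in itertools.permutations(range(1, n)):
        r = [0, *red_order]
        for blue_order in itertools.permutations(range(n)):
            tour = 0.0
            for k in range(n):
                tour += cost(reds[r[k]], blues[blue_order[k]])
                tour += cost(blues[blue_order[k]], reds[r[(k + 1) % n]])
            best = min(best, tour)
    return best

def optimal_assignment(reds, blues):
    """Exhaustive minimum-cost perfect matching between the two colours."""
    return min(sum(cost(r, b) for r, b in zip(reds, perm))
               for perm in itertools.permutations(blues))

random.seed(1)
n = 4
for trial in range(5):
    reds = [random.random() for _ in range(n)]
    blues = [random.random() for _ in range(n)]
    c_cycle = optimal_cycle(reds, blues)
    c_match = optimal_assignment(reds, blues)
    print(f"cycle={c_cycle:.4f}  2*assignment={2 * c_match:.4f}  "
          f"bound holds: {c_cycle >= 2 * c_match - 1e-12}")
```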

  1. Solution for a bipartite Euclidean traveling-salesman problem in one dimension.

    PubMed

    Caracciolo, Sergio; Di Gioacchino, Andrea; Gherardi, Marco; Malatesta, Enrico M

    2018-05-01

    The traveling-salesman problem is one of the most studied combinatorial optimization problems, because of the simplicity in its statement and the difficulty in its solution. We characterize the optimal cycle for every convex and increasing cost function when the points are thrown independently and with an identical probability distribution in a compact interval. We compute the average optimal cost for every number of points when the distance function is the square of the Euclidean distance. We also show that the average optimal cost is not a self-averaging quantity by explicitly computing the variance of its distribution in the thermodynamic limit. Moreover, we prove that the cost of the optimal cycle is not smaller than twice the cost of the optimal assignment of the same set of points. Interestingly, this bound is saturated in the thermodynamic limit.

  2. Cost-effectiveness of breast cancer screening policies using simulation.

    PubMed

    Gocgun, Y; Banjevic, D; Taghipour, S; Montgomery, N; Harvey, B J; Jardine, A K S; Miller, A B

    2015-08-01

    In this paper, we study breast cancer screening policies using computer simulation. We developed a multi-state Markov model for breast cancer progression, considering both the screening and treatment stages of breast cancer. The parameters of our model were estimated through data from the Canadian National Breast Cancer Screening Study as well as data in the relevant literature. Using computer simulation, we evaluated various screening policies to study the impact of mammography screening for age-based subpopulations in Canada. We also performed sensitivity analysis to examine the impact of certain parameters on the number of deaths and total costs. The analysis comparing screening policies reveals that a policy in which women belonging to the 40-49 age group are not screened, whereas those belonging to the 50-59 and 60-69 age groups are screened once every 5 years, outperforms others with respect to cost per life saved. Our analysis also indicates that increasing the screening frequencies for the 50-59 and 60-69 age groups decreases mortality, and that the average number of deaths generally decreases with an increase in screening frequency. We found that screening annually for all age groups is associated with the highest costs per life saved. Our analysis thus reveals that cost per life saved increases with an increase in screening frequency. Copyright © 2015 Elsevier Ltd. All rights reserved.
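
    A drastically simplified cohort simulation conveys the kind of tradeoff reported above: more frequent screening averts more deaths but at a rising incremental cost per life saved. Every state, transition probability, cost, and sensitivity value in the sketch below is a hypothetical placeholder, not an estimate from the Canadian study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy 4-state cohort model: 0 healthy, 1 preclinical, 2 clinical, 3 dead.
# All numbers below are hypothetical placeholders.
P_ONSET, P_PROGRESS, P_DEATH = 0.004, 0.15, 0.05
SENSITIVITY, COST_SCREEN, COST_EARLY, COST_LATE = 0.85, 50.0, 15_000.0, 40_000.0

def simulate(screen_interval, n=200_000, years=30):
    state = np.zeros(n, dtype=int)
    cost, deaths = 0.0, 0
    for year in range(years):
        onset = (state == 0) & (rng.random(n) < P_ONSET)          # disease onset
        state[onset] = 1
        if screen_interval and year % screen_interval == 0:       # screening round
            eligible = state <= 1
            cost += COST_SCREEN * eligible.sum()
            found = (state == 1) & (rng.random(n) < SENSITIVITY)  # detected early
            cost += COST_EARLY * found.sum()
            state[found] = 0                                      # treated, "cured"
        progress = (state == 1) & (rng.random(n) < P_PROGRESS)    # becomes clinical
        state[progress] = 2
        cost += COST_LATE * progress.sum()
        dying = (state == 2) & (rng.random(n) < P_DEATH)          # clinical mortality
        deaths += int(dying.sum())
        state[dying] = 3
    return deaths, cost

d0, c0 = simulate(screen_interval=0)               # no-screening baseline
for interval in (5, 2, 1):
    d, c = simulate(screen_interval=interval)
    saved = d0 - d
    print(f"screen every {interval}y: deaths {d}, lives saved {saved}, "
          f"incremental cost per life saved {(c - c0) / max(saved, 1):,.0f}")
```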

  3. Computer image generation: Reconfigurability as a strategy in high fidelity space applications

    NASA Technical Reports Server (NTRS)

    Bartholomew, Michael J.

    1989-01-01

    The demand for realistic, high fidelity, computer image generation systems to support space simulation is well established. However, as the number and diversity of space applications increase, the complexity and cost of computer image generation systems also increase. One strategy used to harmonize cost with varied requirements is establishment of a reconfigurable image generation system that can be adapted rapidly and easily to meet new and changing requirements. The reconfigurability strategy through the life cycle of system conception, specification, design, implementation, operation, and support for high fidelity computer image generation systems is discussed. The discussion is limited to those issues directly associated with reconfigurability and adaptability of a specialized scene generation system in a multi-faceted space applications environment. Examples and insights gained through the recent development and installation of the Improved Multi-function Scene Generation System at Johnson Space Center, Systems Engineering Simulator are reviewed and compared with current simulator industry practices. The results are clear; the strategy of reconfigurability applied to space simulation requirements provides a viable path to supporting diverse applications with an adaptable computer image generation system.

  4. Physics and Robotic Sensing -- the good, the bad, and approaches to making it work

    NASA Astrophysics Data System (ADS)

    Huff, Brian

    2011-03-01

    All of the technological advances that have benefited consumer electronics have direct application to robotics. Technological advances have resulted in the dramatic reduction in size, cost, and weight of computing systems, while simultaneously doubling computational speed every eighteen months. The same manufacturing advancements that have enabled this rapid increase in computational power are now being leveraged to produce small, powerful and cost-effective sensing technologies applicable for use in mobile robotics applications. Despite the increase in computing and sensing resources available to today's robotic systems developers, there are sensing problems typically found in unstructured environments that continue to frustrate the widespread use of robotics and unmanned systems. This talk presents how physics has contributed to the creation of the technologies that are making modern robotics possible. The talk discusses theoretical approaches to robotic sensing that appear to suffer when they are deployed in the real world. Finally the author presents methods being used to make robotic sensing more robust.

  5. Multi-objective reverse logistics model for integrated computer waste management.

    PubMed

    Ahluwalia, Poonam Khanijo; Nema, Arvind K

    2006-12-01

    This study aimed to address the issues involved in the planning and design of a computer waste management system in an integrated manner. A decision-support tool is presented for selecting an optimum configuration of computer waste management facilities (segregation, storage, treatment/processing, reuse/recycle and disposal) and allocation of waste to these facilities. The model is based on an integer linear programming method with the objectives of minimizing environmental risk as well as cost. The issue of uncertainty in the estimated waste quantities from multiple sources is addressed using the Monte Carlo simulation technique. An illustrated example of computer waste management in Delhi, India is presented to demonstrate the usefulness of the proposed model and to study tradeoffs between cost and risk. The results of the example problem show that it is possible to reduce the environmental risk significantly by a marginal increase in the available cost. The proposed model can serve as a powerful tool to address the environmental problems associated with exponentially growing quantities of computer waste which are presently being managed using rudimentary methods of reuse, recovery and disposal by various small-scale vendors.
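
    The cost-risk tradeoff explored by the model can be illustrated with a toy weighted-sum scan over candidate facility subsets. The facility data, the weighted-sum scalarization, and the brute-force search below are illustrative assumptions; the study itself formulates the problem as an integer linear program with Monte Carlo treatment of uncertain waste quantities.

```python
from itertools import product

# Hypothetical candidate facilities: (name, fixed cost, environmental risk score,
# annual processing capacity in tonnes). Purely illustrative numbers.
FACILITIES = [
    ("recycler_A", 120, 8.0, 400),
    ("recycler_B", 200, 3.0, 500),
    ("landfill_C",  60, 20.0, 900),
    ("treatment_D", 150, 5.0, 300),
]
DEMAND = 800          # tonnes of computer waste to be handled per year

def pareto_scan(weights):
    """For each cost/risk weighting, pick the best feasible facility subset."""
    for w_cost, w_risk in weights:
        best, best_val = None, float("inf")
        for mask in product((0, 1), repeat=len(FACILITIES)):
            chosen = [f for f, m in zip(FACILITIES, mask) if m]
            if sum(f[3] for f in chosen) < DEMAND:
                continue                              # not enough capacity
            cost = sum(f[1] for f in chosen)
            risk = sum(f[2] for f in chosen)
            val = w_cost * cost + w_risk * risk
            if val < best_val:
                best, best_val = (chosen, cost, risk), val
        names = [f[0] for f in best[0]]
        print(f"w_cost={w_cost:.1f} w_risk={w_risk:.1f} -> {names}, "
              f"cost={best[1]}, risk={best[2]}")

# Sweep the weighting to trace out the cost-risk tradeoff.
pareto_scan([(1.0, 0.0), (1.0, 5.0), (1.0, 20.0), (0.0, 1.0)])
```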

  6. Virtual aluminum castings: An industrial application of ICME

    NASA Astrophysics Data System (ADS)

    Allison, John; Li, Mei; Wolverton, C.; Su, Xuming

    2006-11-01

    The automotive product design and manufacturing community is continually besieged by Herculean engineering, timing, and cost challenges. Nowhere is this more evident than in the development of designs and manufacturing processes for cast aluminum engine blocks and cylinder heads. Increasing engine performance requirements coupled with stringent weight and packaging constraints are pushing aluminum alloys to the limits of their capabilities. To provide high-quality blocks and heads at the lowest possible cost, manufacturing process engineers are required to find increasingly innovative ways to cast and heat treat components. Additionally, to remain competitive, products and manufacturing methods must be developed and implemented in record time. To bridge the gaps between program needs and engineering reality, the use of robust computational models in up-front analysis will take on an increasingly important role. This article describes just such a computational approach, the Virtual Aluminum Castings methodology, which was developed and implemented at Ford Motor Company and demonstrates the feasibility and benefits of integrated computational materials engineering.

  7. Research in the design of high-performance reconfigurable systems

    NASA Technical Reports Server (NTRS)

    Mcewan, S. D.; Spry, A. J.

    1985-01-01

    Computer aided design and computer aided manufacturing have the potential for greatly reducing the cost and lead time in the development of VLSI components. This potential paves the way for the design and fabrication of a wide variety of economically feasible high level functional units. It was observed that current computer systems have only a limited capacity to absorb new VLSI component types other than memory, microprocessors, and a relatively small number of other parts. The first purpose is to explore a system design which is capable of effectively incorporating a considerable number of VLSI part types and will both increase the speed of computation and reduce the attendant programming effort. A second purpose is to explore design techniques for VLSI parts which when incorporated by such a system will result in speeds and costs which are optimal. The proposed work may lay the groundwork for future efforts in the extensive simulation and measurements of the system's cost effectiveness and lead to prototype development.

  8. Thermodynamics of quasideterministic digital computers

    NASA Astrophysics Data System (ADS)

    Chu, Dominique

    2018-02-01

    A central result of stochastic thermodynamics is that irreversible state transitions of Markovian systems entail a cost in terms of an infinite entropy production. A corollary of this is that strictly deterministic computation is not possible. Using a thermodynamically consistent model, we show that quasideterministic computation can be achieved at finite, and indeed modest, cost with accuracies that are indistinguishable from deterministic behavior for all practical purposes. Concretely, we consider the entropy production of stochastic (Markovian) systems that behave like AND and NOT gates. Combinations of these gates can implement any logical function. We require that these gates return the correct result with a probability that is very close to 1, and additionally, that they do so within finite time. The central component of the model is a machine that can read and write binary tapes. We find that the error probability of the computation of these gates falls with the power of the system size, whereas the cost only increases linearly with the system size.

  9. Satellite economics in the 1980's

    NASA Astrophysics Data System (ADS)

    Morgan, W. L.

    1980-01-01

    Satellite traffic, competition, and decreasing costs are discussed, as are capabilities in telecommunication (including entertainment) and computation. Also considered are future teleconferencing and telecommuting to offset the cost of transportation, the establishment of a manufacturer-to-user link for increased home minicomputer capability, and an increase of digital over analog traffic. It is suggested that transcontinental bulk traffic, high-speed data, and multipoint private networks will eventually be handled by satellites which are cost-insensitive to distance, readily match dynamically varying multipoint networks, and have uniformly wide bandwidths available to both major cities and isolated towns.

  10. Integrating Computational Chemistry into the Physical Chemistry Curriculum

    ERIC Educational Resources Information Center

    Johnson, Lewis E.; Engel, Thomas

    2011-01-01

    Relatively few undergraduate physical chemistry programs integrate molecular modeling into their quantum mechanics curriculum owing to concerns about limited access to computational facilities, the cost of software, and concerns about increasing the course material. However, modeling exercises can be integrated into an undergraduate course at a…

  11. Peering into the Future of Advertising.

    ERIC Educational Resources Information Center

    Hsia, H. J.

    All areas in mass communications (i.e., newspapers, magazines, television, radio, films, photos, and books) will be transformed because of the increasing sophistication of computer users, the decreasing costs for interactive computer systems, and the global adoption of integrated services digital networks (ISDN). ISDN refer to the digitization of…

  12. The value and cost of complexity in predictive modelling: role of tissue anisotropic conductivity and fibre tracts in neuromodulation

    NASA Astrophysics Data System (ADS)

    Salman Shahid, Syed; Bikson, Marom; Salman, Humaira; Wen, Peng; Ahfock, Tony

    2014-06-01

    Objectives. Computational methods are increasingly used to optimize transcranial direct current stimulation (tDCS) dose strategies and yet complexities of existing approaches limit their clinical access. Since predictive modelling indicates the relevance of subject/pathology based data and hence the need for subject specific modelling, the incremental clinical value of increasingly complex modelling methods must be balanced against the computational and clinical time and costs. For example, the incorporation of multiple tissue layers and measured diffusion tensor (DTI) based conductivity estimates increase model precision but at the cost of clinical and computational resources. Costs related to such complexities aggregate when considering individual optimization and the myriad of potential montages. Here, rather than considering if additional details change current-flow prediction, we consider when added complexities influence clinical decisions. Approach. Towards developing quantitative and qualitative metrics of value/cost associated with computational model complexity, we considered field distributions generated by two 4 × 1 high-definition montages (m1 = 4 × 1 HD montage with anode at C3 and m2 = 4 × 1 HD montage with anode at C1) and a single conventional (m3 = C3-Fp2) tDCS electrode montage. We evaluated statistical methods, including residual error (RE) and relative difference measure (RDM), to consider the clinical impact and utility of increased complexities, namely the influence of skull, muscle and brain anisotropic conductivities in a volume conductor model. Main results. Anisotropy modulated current-flow in a montage and region dependent manner. However, significant statistical changes, produced within montage by anisotropy, did not change qualitative peak and topographic comparisons across montages. Thus for the examples analysed, clinical decision on which dose to select would not be altered by the omission of anisotropic brain conductivity. Significance. Results illustrate the need to rationally balance the role of model complexity, such as anisotropy in detailed current flow analysis versus value in clinical dose design. However, when extending our analysis to include axonal polarization, the results provide presumably clinically meaningful information. Hence the importance of model complexity may be more relevant with cellular level predictions of neuromodulation.
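
    The two agreement metrics named above, residual error (RE) and relative difference measure (RDM), have commonly used forms in the forward-modelling literature; the paper's exact conventions may differ, and the random vectors below merely stand in for the simulated field distributions of two montages.

```python
import numpy as np

def residual_error(u, v):
    """RE = ||u - v|| / ||v||  (one common convention)."""
    return np.linalg.norm(u - v) / np.linalg.norm(v)

def rdm(u, v):
    """Relative difference measure: change in topography, independent of magnitude."""
    return np.linalg.norm(u / np.linalg.norm(u) - v / np.linalg.norm(v))

rng = np.random.default_rng(0)
e_isotropic = rng.normal(size=10_000)                                # stand-in field, isotropic model
e_anisotropic = 1.15 * e_isotropic + 0.05 * rng.normal(size=10_000)  # perturbed field

print(f"RE  = {residual_error(e_anisotropic, e_isotropic):.3f}")
print(f"RDM = {rdm(e_anisotropic, e_isotropic):.3f}")
```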

  13. Operating Dedicated Data Centers - Is It Cost-Effective?

    NASA Astrophysics Data System (ADS)

    Ernst, M.; Hogue, R.; Hollowell, C.; Strecker-Kellog, W.; Wong, A.; Zaytsev, A.

    2014-06-01

    The advent of cloud computing centres such as Amazon's EC2 and Google's Computing Engine has elicited comparisons with dedicated computing clusters. Discussions on appropriate usage of cloud resources (both academic and commercial) and costs have ensued. This presentation discusses a detailed analysis of the costs of operating and maintaining the RACF (RHIC and ATLAS Computing Facility) compute cluster at Brookhaven National Lab and compares them with the cost of cloud computing resources under various usage scenarios. An extrapolation of likely future cost effectiveness of dedicated computing resources is also presented.

  14. The minimal work cost of information processing

    NASA Astrophysics Data System (ADS)

    Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato

    2015-07-01

    Irreversible information processing cannot be carried out without some inevitable thermodynamical work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices impedes the development of their performance. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditional to the output of the computation. Our formula takes precisely into account the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics.

  15. Accelerating epistasis analysis in human genetics with consumer graphics hardware.

    PubMed

    Sinnott-Armstrong, Nicholas A; Greene, Casey S; Cancare, Fabio; Moore, Jason H

    2009-07-24

    Human geneticists are now capable of measuring more than one million DNA sequence variations from across the human genome. The new challenge is to develop computationally feasible methods capable of analyzing these data for associations with common human disease, particularly in the context of epistasis. Epistasis describes the situation where multiple genes interact in a complex non-linear manner to determine an individual's disease risk and is thought to be ubiquitous for common diseases. Multifactor Dimensionality Reduction (MDR) is an algorithm capable of detecting epistasis. An exhaustive analysis with MDR is often computationally expensive, particularly for high order interactions. This challenge has previously been met with parallel computation and expensive hardware. The option we examine here exploits commodity hardware designed for computer graphics. In modern computers Graphics Processing Units (GPUs) have more memory bandwidth and computational capability than Central Processing Units (CPUs) and are well suited to this problem. Advances in the video game industry have led to an economy of scale creating a situation where these powerful components are readily available at very low cost. Here we implement and evaluate the performance of the MDR algorithm on GPUs. Of primary interest are the time required for an epistasis analysis and the price to performance ratio of available solutions. We found that using MDR on GPUs consistently increased performance per machine over both a feature rich Java software package and a C++ cluster implementation. The performance of a GPU workstation running a GPU implementation reduces computation time by a factor of 160 compared to an 8-core workstation running the Java implementation on CPUs. This GPU workstation performs similarly to 150 cores running an optimized C++ implementation on a Beowulf cluster. Furthermore this GPU system provides extremely cost effective performance while leaving the CPU available for other tasks. The GPU workstation containing three GPUs costs $2000 while obtaining similar performance on a Beowulf cluster requires 150 CPU cores which, including the added infrastructure and support cost of the cluster system, cost approximately $82,500. Graphics hardware based computing provides a cost effective means to perform genetic analysis of epistasis using MDR on large datasets without the infrastructure of a computing cluster.
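
    The combinatorial cost that motivates the GPU port can be seen in a stripped-down CPU sketch of an exhaustive two-locus scan. The scoring rule below is a simplified single-split version of the MDR classification step (pooling joint-genotype cells into high/low risk by case:control ratio and scoring balanced accuracy), without cross-validation or permutation testing; the data are synthetic and the sizes arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n_snps, n_subjects = 200, 2_000
geno = rng.integers(0, 3, size=(n_snps, n_subjects)).astype(np.int8)   # 0/1/2 genotypes
case = rng.integers(0, 2, size=n_subjects).astype(bool)                # case/control labels

def pairwise_scan(geno, case):
    """Exhaustive scan of all SNP pairs; returns the best-scoring pair."""
    n_snps = geno.shape[0]
    n_cases, n_ctrls = case.sum(), (~case).sum()
    best_score, best_pair = -1.0, None
    for i in range(n_snps):
        for j in range(i + 1, n_snps):
            cell = geno[i] * 3 + geno[j]                    # joint genotype index 0..8
            case_counts = np.bincount(cell[case], minlength=9)
            ctrl_counts = np.bincount(cell[~case], minlength=9)
            high_risk = case_counts / n_cases > ctrl_counts / n_ctrls
            tp = case_counts[high_risk].sum()               # cases labelled high risk
            tn = ctrl_counts[~high_risk].sum()              # controls labelled low risk
            score = 0.5 * (tp / n_cases + tn / n_ctrls)     # balanced accuracy
            if score > best_score:
                best_score, best_pair = score, (i, j)
    return best_pair, best_score

pair, score = pairwise_scan(geno, case)
print(f"{n_snps * (n_snps - 1) // 2} pairs scanned, best pair {pair}, score {score:.3f}")
```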

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, C.

    Almost every computer architect dreams of achieving high system performance with low implementation costs. A multigauge machine can reconfigure its data-path width, provide parallelism, achieve better resource utilization, and sometimes can trade computational precision for increased speed. A simple experimental method is used here to capture the main characteristics of multigauging. The measurements indicate evidence of near-optimal speedups. Adapting these ideas in designing parallel processors incurs low costs and provides flexibility. Several operational aspects of designing a multigauge machine are discussed as well. Thus, this research reports the technical, economical, and operational feasibility studies of multigauging.

  17. Electronic Advocacy and Social Welfare Policy Education

    ERIC Educational Resources Information Center

    Moon, Sung Seek; DeWeaver, Kevin L.

    2005-01-01

    The rapid increase in the number of low-cost computers, the proliferation of user-friendly software, and the development of electronic networks have created the "informatics era." The Internet is a rapidly growing communication resource that is becoming mainstream in the American society. Computer-based electronic political advocacy by social…

  18. Face-to-Face Collaborative Learning Supported by Mobile Phones

    ERIC Educational Resources Information Center

    Echeverria, Alejandro; Nussbaum, Miguel; Calderon, Juan Felipe; Bravo, Claudio; Infante, Cristian; Vasquez, Andrea

    2011-01-01

    The use of handheld computers in educational contexts has increased considerably in recent years and their value as a teaching tool has been confirmed by many positive experiences, particularly within collaborative learning systems (Mobile Computer Supported Collaborative Learning [MCSCL]). The cost of the devices has hindered widespread use in…

  19. Computational cost for detecting inspiralling binaries using a network of laser interferometric detectors

    NASA Astrophysics Data System (ADS)

    Pai, Archana; Bose, Sukanta; Dhurandhar, Sanjeev

    2002-04-01

    We extend a coherent network data-analysis strategy developed earlier for detecting Newtonian waveforms to the case of post-Newtonian (PN) waveforms. Since the PN waveform depends on the individual masses of the inspiralling binary, the parameter-space dimension increases by one from that of the Newtonian case. We obtain the number of templates and estimate the computational costs for PN waveforms: for a lower mass limit of 1 M_solar, for LIGO-I noise and with 3% maximum mismatch, the online computational speed requirement for a single detector is a few Gflops; for a two-detector network it is hundreds of Gflops and for a three-detector network it is tens of Tflops. Apart from idealistic networks, we obtain results for realistic networks comprising LIGO and VIRGO. Finally, we compare costs incurred in a coincidence detection strategy with those incurred in the coherent strategy detailed above.

  20. The HEPCloud Facility: elastic computing for High Energy Physics – The NOvA Use Case

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fuess, S.; Garzoglio, G.; Holzman, B.

    The need for computing in the HEP community follows cycles of peaks and valleys mainly driven by conference dates, accelerator shutdown, holiday schedules, and other factors. Because of this, the classical method of provisioning these resources at providing facilities has drawbacks such as potential overprovisioning. As the appetite for computing increases, however, so does the need to maximize cost efficiency by developing a model for dynamically provisioning resources only when needed. To address this issue, the HEPCloud project was launched by the Fermilab Scientific Computing Division in June 2015. Its goal is to develop a facility that provides a common interface to a variety of resources, including local clusters, grids, high performance computers, and community and commercial Clouds. Initially targeted experiments include CMS and NOvA, as well as other Fermilab stakeholders. In its first phase, the project has demonstrated the use of the “elastic” provisioning model offered by commercial clouds, such as Amazon Web Services. In this model, resources are rented and provisioned automatically over the Internet upon request. In January 2016, the project demonstrated the ability to increase the total amount of global CMS resources by 58,000 cores from 150,000 cores - a 38 percent increase - in preparation for the Rencontres de Moriond. In March 2016, the NOvA experiment also demonstrated resource burst capabilities with an additional 7,300 cores, achieving a scale almost four times as large as the local allocated resources and utilizing the local AWS S3 storage to optimize data handling operations and costs. NOvA was using the same familiar services used for local computations, such as data handling and job submission, in preparation for the Neutrino 2016 conference. In both cases, the cost was contained by the use of the Amazon Spot Instance Market and the Decision Engine, a HEPCloud component that aims at minimizing cost and job interruption. This paper describes the Fermilab HEPCloud Facility and the challenges overcome for the CMS and NOvA communities.

  1. Batching System for Superior Service

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Veridian's Portable Batch System (PBS) was the recipient of the 1997 NASA Space Act Award for outstanding software. A batch system is a set of processes for managing queues and jobs. Without a batch system, it is difficult to manage the workload of a computer system. By bundling the enterprise's computing resources, the PBS technology offers users a single coherent interface, resulting in efficient management of the batch services. Users choose which information to package into "containers" for system-wide use. PBS also provides detailed system usage data, a procedure not easily executed without this software. PBS operates on networked, multi-platform UNIX environments. Veridian's new version, PBS Pro™, has additional features and enhancements, including support for additional operating systems. Veridian distributes the original version of PBS as Open Source software via the PBS website. Customers can register and download the software at no cost. PBS Pro is also available via the web and offers additional features such as increased stability, reliability, and fault tolerance. A company using PBS can expect a significant increase in the effective management of its computing resources. Tangible benefits include increased utilization of costly resources and enhanced understanding of computational requirements and user needs.

  2. Results of an Experimental Program to Provide Low Cost Computer Searches of the NASA Information File to University Graduate Students in the Southeast. Final Report.

    ERIC Educational Resources Information Center

    Smetana, Frederick O.; Phillips, Dennis M.

    In an effort to increase dissemination of scientific and technological information, a program was undertaken whereby graduate students in science and engineering could request a computer-produced bibliography and/or abstracts of documents identified by the computer. The principal resource was the National Aeronautics and Space Administration…

  3. The Influence of Large-Scale Computing on Aircraft Structural Design.

    DTIC Science & Technology

    1986-04-01

    the customer in the most cost-effective manner. Computer facility organizations became computer resource power brokers. A good data processing...capabilities generated on other processors can be easily used. This approach is easily implementable and provides a good strategy for using existing...assistance to member nations for the purpose of increasing their scientific and technical potential; - Recommending effective ways for the member nations to

  4. Current state and future direction of computer systems at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Rogers, James L. (Editor); Tucker, Jerry H. (Editor)

    1992-01-01

    Computer systems have advanced at a rate unmatched by any other area of technology. As performance has dramatically increased, there has been an equally dramatic reduction in cost. This constant cost performance improvement has precipitated the pervasiveness of computer systems into virtually all areas of technology. This improvement is due primarily to advances in microelectronics. Most people are now convinced that the new generation of supercomputers will be built using a large number (possibly thousands) of high performance microprocessors. Although the spectacular improvements in computer systems have come about because of these hardware advances, there has also been a steady improvement in software techniques. In an effort to understand how these hardware and software advances will affect research at NASA LaRC, the Computer Systems Technical Committee drafted this white paper to examine the current state and possible future directions of computer systems at the Center. This paper discusses selected important areas of computer systems including real-time systems, embedded systems, high performance computing, distributed computing networks, data acquisition systems, artificial intelligence, and visualization.

  5. Dealing with electronic waste: modeling the costs and environmental benefits of computer monitor disposal.

    PubMed

    Macauley, Molly; Palmer, Karen; Shih, Jhih-Shyang

    2003-05-01

    The importance of information technology to the world economy has brought about a surge in demand for electronic equipment. With rapid technological change, a growing fraction of the increasing stock of many types of electronics becomes obsolete each year. We model the costs and benefits of policies to manage 'e-waste' by focusing on a large component of the electronic waste stream - computer monitors - and the environmental concerns associated with disposal of the lead embodied in cathode ray tubes (CRTs) used in most monitors. We find that the benefits of avoiding health effects associated with CRT disposal appear far outweighed by the costs for a wide range of policies. For the stock of monitors disposed of in the United States in 1998, we find that policies restricting or banning some popular disposal options would increase disposal costs from about US $1 per monitor to between US $3 and US $20 per monitor. Policies to promote a modest amount of recycling of monitor parts, including lead, can be less expensive. In all cases, however, the costs of the policies exceed the value of the avoided health effects of CRT disposal.

  6. Increased decision thresholds enhance information gathering performance in juvenile Obsessive-Compulsive Disorder (OCD)

    PubMed Central

    Iannaccone, Reto; Brem, Silvia; Walitza, Susanne

    2017-01-01

    Patients with obsessive-compulsive disorder (OCD) can be described as cautious and hesitant, manifesting an excessive indecisiveness that hinders efficient decision making. However, excess caution in decision making may also lead to better performance in specific situations where the cost of extended deliberation is small. We compared 16 juvenile OCD patients with 16 matched healthy controls whilst they performed a sequential information gathering task under different external cost conditions. We found that patients with OCD outperformed healthy controls, winning significantly more points. The groups also differed in the number of draws required prior to committing to a decision, but not in decision accuracy. A novel Bayesian computational model revealed that subjective sampling costs arose as a non-linear function of sampling, closely resembling an escalating urgency signal. Group difference in performance was best explained by a later emergence of these subjective costs in the OCD group, also evident in an increased decision threshold. Our findings present a novel computational model and suggest that enhanced information gathering in OCD can be accounted for by a higher decision threshold arising out of an altered perception of costs that, in some specific contexts, may be advantageous. PMID:28403139
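
    The following toy model is not the study's Bayesian model; it is a minimal sketch, under invented parameters, of how a higher decision threshold combined with subjective sampling costs that rise late and non-linearly leads an agent to draw more samples before committing.

```python
# A toy sequential information-gathering agent, not the paper's model: it keeps
# sampling until its certainty crosses a decision threshold or an escalating
# subjective cost makes another draw not worth it. Parameters are invented.
import random

def draws_before_decision(p_majority=0.6, threshold=0.95,
                          cost_scale=0.001, cost_power=2.0, seed=1):
    random.seed(seed)
    heads = tails = 0
    while True:
        n = heads + tails
        # posterior for a symmetric two-hypothesis "which colour is in the
        # majority" problem, given the draws so far
        like_h = p_majority ** heads * (1 - p_majority) ** tails
        like_t = p_majority ** tails * (1 - p_majority) ** heads
        certainty = max(like_h, like_t) / (like_h + like_t)
        subjective_cost = cost_scale * n ** cost_power   # non-linear, urgency-like
        if certainty >= threshold or subjective_cost >= (1 - certainty):
            return n
        if random.random() < p_majority:
            heads += 1
        else:
            tails += 1

# A higher threshold (or a later-rising cost, i.e. smaller cost_scale) yields
# more draws before committing, as reported for the OCD group.
print(draws_before_decision(threshold=0.90), draws_before_decision(threshold=0.99))
```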

  7. Assessing the use of computers in industrial occupational health departments.

    PubMed

    Owen, J P

    1995-04-01

    Computers are widely used in business and industry and the benefits of computerizing occupational health (OH) departments have been advocated by several authors. The requirements for successful computerization of an OH department are reviewed. Having identified the theoretical benefits, the real picture in industry is assessed by surveying 52 firms with over 1000 employees in a large urban area. Only 15 (29%) of the companies reported having any OH service, of which six used computers in the OH department, reflecting the business priorities of most of the companies. The types of software systems used and their main use are examined, along with perceived benefits or disadvantages. With the decreasing costs of computers and increasingly 'user-friendly' software, there is a real cost benefit to be gained from using computers in OH departments, although the concept may have to be 'sold' to management.

  8. Optimize Resources and Help Reduce Cost of Ownership with Dell[TM] Systems Management

    ERIC Educational Resources Information Center

    Technology & Learning, 2008

    2008-01-01

    Maintaining secure, convenient administration of the PC system environment can be a significant drain on resources. Deskside visits can greatly increase the cost of supporting a large number of computers. Even simple tasks, such as tracking inventory or updating software, quickly become expensive when they require physically visiting every…

  9. Competition in Defense Acquisitions

    DTIC Science & Technology

    2008-05-14

    NASA employees maintained desktop assets with no way to track costs, no standardization, and no tracking of service quality. NASA's Outsourcing Desktop Initiative (ODIN) transferred the responsibility for providing and managing these assets to the private sector. ODIN goals: cut desktop computing costs, increase service quality, achieve interoperability and standardization, focus...

  10. The cost-effectiveness of the RSI QuickScan intervention programme for computer workers: Results of an economic evaluation alongside a randomised controlled trial.

    PubMed

    Speklé, Erwin M; Heinrich, Judith; Hoozemans, Marco J M; Blatter, Birgitte M; van der Beek, Allard J; van Dieën, Jaap H; van Tulder, Maurits W

    2010-11-11

    The costs of arm, shoulder and neck symptoms are high. In order to decrease these costs, employers implement interventions aimed at reducing these symptoms. One frequently used intervention is the RSI QuickScan intervention programme. It establishes a risk profile of the target population and subsequently advises interventions following a decision tree based on that risk profile. The purpose of this study was to perform an economic evaluation, from both the societal and companies' perspective, of the RSI QuickScan intervention programme for computer workers. In this study, effectiveness was defined at three levels: exposure to risk factors, prevalence of arm, shoulder and neck symptoms, and days of sick leave. The economic evaluation was conducted alongside a randomised controlled trial (RCT). Participating computer workers from 7 companies (N = 638) were assigned to either the intervention group (N = 320) or the usual care group (N = 318) by means of cluster randomisation (N = 50). The intervention consisted of a tailor-made programme, based on a previously established risk profile. At baseline, 6 and 12 month follow-up, the participants completed the RSI QuickScan questionnaire. Analyses to estimate the effect of the intervention were done according to the intention-to-treat principle. To compare costs between groups, confidence intervals for cost differences were computed by bias-corrected and accelerated bootstrapping. The mean intervention costs, paid by the employer, were 59 euro per participant in the intervention and 28 euro in the usual care group. Mean total health care and non-health care costs per participant were 108 euro in both groups. As to cost-effectiveness, improvements in received information on healthy computer use, as well as in work posture and movement, were observed but at higher costs. With regard to the other risk factors, symptoms and sick leave, only small and non-significant effects were found. In this study, the RSI QuickScan intervention programme did not prove to be cost-effective from both the societal and the companies' perspective and, therefore, this study does not provide a financial reason for implementing this intervention. However, with a relatively small investment, the programme did increase the number of workers who received information on healthy computer use and improved their work posture and movement. Trial registration: NTR1117.
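
    As a hedged sketch of the cost-comparison step, the snippet below computes a bootstrap confidence interval for the difference in mean costs between two groups; the study used bias-corrected and accelerated (BCa) intervals, whereas this simplified version uses plain percentile bootstrapping on invented cost data.

```python
# Percentile bootstrap CI for a mean cost difference; the cost arrays are
# synthetic placeholders, not the trial data, and BCa correction is omitted.
import numpy as np

rng = np.random.default_rng(0)
costs_intervention = rng.gamma(shape=2.0, scale=55.0, size=320)  # hypothetical euros
costs_usual_care   = rng.gamma(shape=2.0, scale=54.0, size=318)

def bootstrap_ci_mean_diff(a, b, n_resamples=5000, alpha=0.05):
    diffs = np.empty(n_resamples)
    for i in range(n_resamples):
        diffs[i] = (rng.choice(a, size=a.size, replace=True).mean()
                    - rng.choice(b, size=b.size, replace=True).mean())
    return np.quantile(diffs, [alpha / 2, 1 - alpha / 2])

lo, hi = bootstrap_ci_mean_diff(costs_intervention, costs_usual_care)
print(f"95% CI for the mean cost difference: ({lo:.1f}, {hi:.1f}) euro")
```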

  11. Limitations of polynomial chaos expansions in the Bayesian solution of inverse problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Fei; Department of Mathematics, University of California, Berkeley; Morzfeld, Matthias, E-mail: mmo@math.lbl.gov

    2015-02-01

    Polynomial chaos expansions are used to reduce the computational cost in the Bayesian solutions of inverse problems by creating a surrogate posterior that can be evaluated inexpensively. We show, by analysis and example, that when the data contain significant information beyond what is assumed in the prior, the surrogate posterior can be very different from the posterior, and the resulting estimates become inaccurate. One can improve the accuracy by adaptively increasing the order of the polynomial chaos, but the cost may increase too fast for this to be cost effective compared to Monte Carlo sampling without a surrogate posterior.

  12. 20 CFR 404.276 - Publication of notice of increase.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Publication of notice of increase. 404.276 Section 404.276 Employees' Benefits SOCIAL SECURITY ADMINISTRATION FEDERAL OLD-AGE, SURVIVORS AND DISABILITY INSURANCE (1950- ) Computing Primary Insurance Amounts Cost-Of-Living Increases § 404.276...

  13. Expanding HPC and Research Computing--The Sustainable Way

    ERIC Educational Resources Information Center

    Grush, Mary

    2009-01-01

    Increased demands for research and high-performance computing (HPC)--along with growing expectations for cost and environmental savings--are putting new strains on the campus data center. More and more, CIOs like the University of Notre Dame's (Indiana) Gordon Wishon are seeking creative ways to build more sustainable models for data center and…

  14. Operating Policies and Procedures of Computer Data-Base Systems.

    ERIC Educational Resources Information Center

    Anderson, David O.

    Speaking on the operating policies and procedures of computer data bases containing information on students, the author divides his remarks into three parts: content decisions, data base security, and user access. He offers nine recommended practices that should increase the data base's usefulness to the user community: (1) the cost of developing…

  15. Critical Success Factors for E-Learning and Institutional Change--Some Organisational Perspectives on Campus-Wide E-Learning

    ERIC Educational Resources Information Center

    White, Su

    2007-01-01

    Computer technology has been harnessed for education in UK universities ever since the first computers for research were installed at 10 selected sites in 1957. Subsequently, real costs have fallen dramatically. Processing power has increased; network and communications infrastructure has proliferated, and information has become unimaginably…

  16. Feasibility study of an Integrated Program for Aerospace-vehicle Design (IPAD) system. Volume 6: Implementation schedule, development costs, operational costs, benefit assessment, impact on company organization, spin-off assessment, phase 1, tasks 3 to 8

    NASA Technical Reports Server (NTRS)

    Garrocq, C. A.; Hurley, M. J.; Dublin, M.

    1973-01-01

    A baseline implementation plan, including alternative implementation approaches for critical software elements and variants to the plan, was developed. The basic philosophy was aimed at: (1) a progressive release of capability for three major computing systems, (2) an end product that was a working tool, (3) giving participation to industry, government agencies, and universities, and (4) emphasizing the development of critical elements of the IPAD framework software. The results of these tasks indicate an IPAD first release capability 45 months after go-ahead, a five year total implementation schedule, and a total developmental cost of 2027 man-months and 1074 computer hours. Several areas of operational cost increases were identified mainly due to the impact of additional equipment needed and additional computer overhead. The benefits of an IPAD system were related mainly to potential savings in engineering man-hours, reduction of design-cycle calendar time, and indirect upgrading of product quality and performance.

  17. Discovery & Interaction in Astro 101 Laboratory Experiments

    NASA Astrophysics Data System (ADS)

    Maloney, Frank Patrick; Maurone, Philip; DeWarf, Laurence E.

    2016-01-01

    The availability of low-cost, high-performance computing hardware and software has transformed the manner by which astronomical concepts can be re-discovered and explored in a laboratory that accompanies an astronomy course for arts students. We report on a strategy, begun in 1992, for allowing each student to understand fundamental scientific principles by interactively confronting astronomical and physical phenomena, through direct observation and by computer simulation. These experiments have evolved as: (a) the quality and speed of the hardware has greatly increased; (b) the corresponding hardware costs have decreased; (c) the students have become computer and Internet literate; and (d) the importance of computationally and scientifically literate arts graduates in the workplace has increased. We present the current suite of laboratory experiments, and describe the nature, procedures, and goals in this two-semester laboratory for liberal arts majors at the Astro 101 university level.

  18. Efficient convolutional sparse coding

    DOEpatents

    Wohlberg, Brendt

    2017-06-20

    Computationally efficient algorithms may be applied for fast dictionary learning solving the convolutional sparse coding problem in the Fourier domain. More specifically, efficient convolutional sparse coding may be derived within an alternating direction method of multipliers (ADMM) framework that utilizes fast Fourier transforms (FFT) to solve the main linear system in the frequency domain. Such algorithms may enable a significant reduction in computational cost over conventional approaches by implementing a linear solver for the most critical and computationally expensive component of the conventional iterative algorithm. The theoretical computational cost of the algorithm may be reduced from O(M^3 N) to O(MN log N), where N is the dimensionality of the data and M is the number of elements in the dictionary. This significant improvement in efficiency may greatly increase the range of problems that can practically be addressed via convolutional sparse representations.
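
    The snippet below is not the patented algorithm; it only illustrates, with numpy, why working in the Fourier domain pays off: circular convolution becomes elementwise multiplication after an FFT, so the per-filter cost drops from O(N^2) to O(N log N).

```python
# Circular convolution computed directly and via the FFT give the same result;
# the FFT route is what makes the frequency-domain linear solve cheap.
import numpy as np

rng = np.random.default_rng(0)
N = 256                      # signal length
x = rng.standard_normal(N)   # signal
d = rng.standard_normal(N)   # one dictionary filter, zero-padded to length N

# Direct circular convolution: O(N^2) for one filter
direct = np.array([sum(d[k] * x[(n - k) % N] for k in range(N)) for n in range(N)])

# FFT route: O(N log N) per filter, so O(M N log N) for an M-element dictionary
via_fft = np.fft.ifft(np.fft.fft(d) * np.fft.fft(x)).real

print(np.allclose(direct, via_fft))   # True
```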

  19. High-Rate Digital Receiver Board

    NASA Technical Reports Server (NTRS)

    Ghuman, Parminder; Bialas, Thomas; Brambora, Clifford; Fisher, David

    2004-01-01

    A high-rate digital receiver (HRDR) implemented as a peripheral component interface (PCI) board has been developed as a prototype of compact, general-purpose, inexpensive, potentially mass-producible data-acquisition interfaces between telemetry systems and personal computers. The installation of this board in a personal computer together with an analog preprocessor enables the computer to function as a versatile, high-rate telemetry-data-acquisition and demodulator system. The prototype HRDR PCI board can handle data at rates as high as 600 megabits per second, in a variety of telemetry formats, transmitted by diverse phase-modulation schemes that include binary phase-shift keying and various forms of quadrature phase-shift keying. Costing less than $25,000 (as of year 2003), the prototype HRDR PCI board supplants multiple racks of older equipment that, when new, cost over $500,000. Just as the development of standard network-interface chips has contributed to the proliferation of networked computers, it is anticipated that the development of standard chips based on the HRDR could contribute to reductions in size and cost and increases in performance of telemetry systems.

  20. GPSS/360 computer models to simulate aircraft passenger emergency evacuations.

    DOT National Transportation Integrated Search

    1972-09-01

    Live tests of emergency evacuation of transport aircraft are becoming increasingly expensive as the planes grow to a size seating hundreds of passengers. Repeated tests, to cope with random variations, increase these costs, as well as risks of injuri...

  1. Asynchronous communication in spectral-element and discontinuous Galerkin methods for atmospheric dynamics - a case study using the High-Order Methods Modeling Environment (HOMME-homme_dg_branch)

    NASA Astrophysics Data System (ADS)

    Jamroz, Benjamin F.; Klöfkorn, Robert

    2016-08-01

    The scalability of computational applications on current and next-generation supercomputers is increasingly limited by the cost of inter-process communication. We implement non-blocking asynchronous communication in the High-Order Methods Modeling Environment for the time integration of the hydrostatic fluid equations using both the spectral-element and discontinuous Galerkin methods. This allows the overlap of computation with communication, effectively hiding some of the costs of communication. A novel detail of our approach is that it allows some data movement to be performed during the asynchronous communication even in the absence of other computations. This method produces significant performance and scalability gains in large-scale simulations.
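
    The overlap pattern described above can be sketched with mpi4py (the actual HOMME implementation is Fortran/MPI): post non-blocking halo exchanges, compute on the interior while messages are in flight, then wait before finishing the boundary work. Array sizes and tags below are arbitrary.

```python
# Run with e.g. `mpirun -n 4 python overlap_sketch.py`. Each rank exchanges a
# small halo with its neighbours while summing its interior data.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
left, right = (rank - 1) % size, (rank + 1) % size

halo_out = np.full(16, float(rank))
halo_in_left = np.empty(16)
halo_in_right = np.empty(16)
interior = np.arange(1000.0)

requests = [comm.Isend(halo_out, dest=left,  tag=0),
            comm.Isend(halo_out, dest=right, tag=1),
            comm.Irecv(halo_in_left,  source=left,  tag=1),
            comm.Irecv(halo_in_right, source=right, tag=0)]

interior_sum = interior.sum()          # computation overlapped with communication
MPI.Request.Waitall(requests)          # halos must arrive before boundary work
boundary_sum = halo_in_left.sum() + halo_in_right.sum()
print(rank, interior_sum + boundary_sum)
```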

  2. Optimal estimation and scheduling in aquifer management using the rapid feedback control method

    NASA Astrophysics Data System (ADS)

    Ghorbanidehno, Hojat; Kokkinaki, Amalia; Kitanidis, Peter K.; Darve, Eric

    2017-12-01

    Management of water resources systems often involves a large number of parameters, as in the case of large, spatially heterogeneous aquifers, and a large number of "noisy" observations, as in the case of pressure observation in wells. Optimizing the operation of such systems requires both searching among many possible solutions and utilizing new information as it becomes available. However, the computational cost of this task increases rapidly with the size of the problem to the extent that textbook optimization methods are practically impossible to apply. In this paper, we present a new computationally efficient technique as a practical alternative for optimally operating large-scale dynamical systems. The proposed method, which we term Rapid Feedback Controller (RFC), provides a practical approach for combined monitoring, parameter estimation, uncertainty quantification, and optimal control for linear and nonlinear systems with a quadratic cost function. For illustration, we consider the case of a weakly nonlinear uncertain dynamical system with a quadratic objective function, specifically a two-dimensional heterogeneous aquifer management problem. To validate our method, we compare our results with the linear quadratic Gaussian (LQG) method, which is the basic approach for feedback control. We show that the computational cost of the RFC scales only linearly with the number of unknowns, a great improvement compared to the basic LQG control with a computational cost that scales quadratically. We demonstrate that the RFC method can obtain the optimal control values at a greatly reduced computational cost compared to the conventional LQG algorithm with small and controllable losses in the accuracy of the state and parameter estimation.

  3. Scaling predictive modeling in drug development with cloud computing.

    PubMed

    Moghadam, Behrooz Torabi; Alvarsson, Jonathan; Holm, Marcus; Eklund, Martin; Carlsson, Lars; Spjuth, Ola

    2015-01-26

    Growing data sets with increased time for analysis is hampering predictive modeling in drug discovery. Model building can be carried out on high-performance computer clusters, but these can be expensive to purchase and maintain. We have evaluated ligand-based modeling on cloud computing resources where computations are parallelized and run on the Amazon Elastic Cloud. We trained models on open data sets of varying sizes for the end points logP and Ames mutagenicity and compare with model building parallelized on a traditional high-performance computing cluster. We show that while high-performance computing results in faster model building, the use of cloud computing resources is feasible for large data sets and scales well within cloud instances. An additional advantage of cloud computing is that the costs of predictive models can be easily quantified, and a choice can be made between speed and economy. The easy access to computational resources with no up-front investments makes cloud computing an attractive alternative for scientists, especially for those without access to a supercomputer, and our study shows that it enables cost-efficient modeling of large data sets on demand within reasonable time.
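
    A minimal local stand-in for the pattern in this study: because the model builds are independent, they can be fanned out over processes in the same way the authors fanned them out over cloud instances. The training function and data below are placeholders, not the ligand-based models used in the paper.

```python
# Embarrassingly parallel "model building" with the standard library; each
# chunk is trained independently, mirroring per-instance cloud jobs.
from concurrent.futures import ProcessPoolExecutor
import statistics

def train_model(dataset_chunk):
    """Stand-in for one model-building job; returns a toy 'model' (the mean)."""
    return statistics.fmean(dataset_chunk)

if __name__ == "__main__":
    datasets = [[i + j for i in range(10_000)] for j in range(8)]
    with ProcessPoolExecutor(max_workers=4) as pool:
        models = list(pool.map(train_model, datasets))
    print(models)
```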

  4. Implementation of computer-based patient records in primary care: the societal health economic effects.

    PubMed Central

    Arias-Vimárlund, V.; Ljunggren, M.; Timpka, T.

    1996-01-01

    OBJECTIVE: Exploration of the societal health economic effects occurring during the first year after implementation of Computerised Patient Records (CPRs) at Primary Health Care (PHC) centres. DESIGN: Comparative case studies of practice processes and their consequences one year after CPR implementation, using the constant comparison method. Application of transaction-cost analyses at a societal level on the results. SETTING: Two urban PHC centres under a managed care contract in Östergötland county, Sweden. MAIN OUTCOME MEASURES: Central implementation issues. First-year societal direct normal costs, direct unexpected costs, and indirect costs. Societal benefits. RESULTS: The total societal effect of the CPR implementation was a cost of nearly 250,000 SEK (USD 37,000) per GP team. About 20% of the effect consisted of direct unexpected costs, accrued from the reduction of practitioners' leisure time. The main issues in the implementation process were medical informatics knowledge and computer skills, adaptation of the human-computer interaction design to practice routines, and information access through the CPR. CONCLUSIONS: The societal costs exceed the benefits during the first year after CPR implementation at the observed PHC centres. Early investments in requirements engineering and staff training may increase the efficiency. Exploitation of the CPR for disease prevention and clinical quality improvement is necessary to defend the investment in societal terms. The exact calculation of societal costs requires further analysis of the affected groups' willingness to pay. PMID:8947717

  5. The impact of supercomputers on experimentation: A view from a national laboratory

    NASA Technical Reports Server (NTRS)

    Peterson, V. L.; Arnold, J. O.

    1985-01-01

    The relative roles of large scale scientific computers and physical experiments in several science and engineering disciplines are discussed. Increasing dependence on computers is shown to be motivated both by the rapid growth in computer speed and memory, which permits accurate numerical simulation of complex physical phenomena, and by the rapid reduction in the cost of performing a calculation, which makes computation an increasingly attractive complement to experimentation. Computer speed and memory requirements are presented for selected areas of such disciplines as fluid dynamics, aerodynamics, aerothermodynamics, chemistry, atmospheric sciences, astronomy, and astrophysics, together with some examples of the complementary nature of computation and experiment. Finally, the impact of the emerging role of computers in the technical disciplines is discussed in terms of both the requirements for experimentation and the attainment of previously inaccessible information on physical processes.

  6. Petascale self-consistent electromagnetic computations using scalable and accurate algorithms for complex structures

    NASA Astrophysics Data System (ADS)

    Cary, John R.; Abell, D.; Amundson, J.; Bruhwiler, D. L.; Busby, R.; Carlsson, J. A.; Dimitrov, D. A.; Kashdan, E.; Messmer, P.; Nieter, C.; Smithe, D. N.; Spentzouris, P.; Stoltz, P.; Trines, R. M.; Wang, H.; Werner, G. R.

    2006-09-01

    As the size and cost of particle accelerators escalate, high-performance computing plays an increasingly important role; optimization through accurate, detailed computer modeling increases performance and reduces costs. But consequently, computer simulations face enormous challenges. Early approximation methods, such as expansions in distance from the design orbit, were unable to supply detailed accurate results, such as in the computation of wake fields in complex cavities. Since the advent of message-passing supercomputers with thousands of processors, earlier approximations are no longer necessary, and it is now possible to compute wake fields, the effects of dampers, and self-consistent dynamics in cavities accurately. In this environment, the focus has shifted towards the development and implementation of algorithms that scale to large numbers of processors. So-called charge-conserving algorithms evolve the electromagnetic fields without the need for any global solves (which are difficult to scale up to many processors). Using cut-cell (or embedded) boundaries, these algorithms can simulate the fields in complex accelerator cavities with curved walls. New implicit algorithms, which are stable for any time-step, conserve charge as well, allowing faster simulation of structures with details small compared to the characteristic wavelength. These algorithmic and computational advances have been implemented in the VORPAL7 Framework, a flexible, object-oriented, massively parallel computational application that allows run-time assembly of algorithms and objects, thus composing an application on the fly.

  7. 12 CFR 602.12 - Fees.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... of the employee doing the work. (2) For computer searches for records, the direct costs of computer... $15.00. Fee Amounts Table: Manual Search and Review, pro-rated salary costs; Computer Search, direct costs; Photocopy, $0.15 a page; Other Reproduction Costs, direct costs; Elective...

  8. [Health technology assessment report: Computer-assisted Pap test for cervical cancer screening].

    PubMed

    Della Palma, Paolo; Moresco, Luca; Giorgi Rossi, Paolo

    2012-01-01

    HEALTH PROBLEM: Cervical cancer is a disease which is highly preventable by means of Pap test screening for the precancerous lesions, which can be easily treated. Furthermore, in the near future, control of the disease will be enhanced by the vaccination which prevents the infection of those human papillomavirus types that cause the vast majority of cervical cancers. The effectiveness of screening in drastically reducing cervical cancer incidence has been clearly demonstrated. The epidemiology of cervical cancer in industrialised countries is now determined mostly by the Pap test coverage of the female population and by the ability of health systems to assure appropriate follow up after an abnormal Pap test. Today there are two fully automated systems for computer-assisted Pap test: the BD FocalPoint and the Hologic Imager. Recently, the Hologic Integrated Imager, a semi-automated system, was launched. The two fully automated systems are composed of a central scanner, where the machine examines the cytologic slide, and of one or more review stations, where the cytologists analyze the slides previously centrally scanned. The software used by the two systems identifies the fields of interest so that the cytologists can look only at those points, automatically pointed out by the review station. Furthermore, the FocalPoint system classifies the slides according to their level of risk of containing signs of relevant lesions. Those in the upper classes--about one fifth of the slides--are labelled as « further review », while those in the lower level of risk, i.e. slides that have such a low level of risk that they can be considered as negative with no human review, are labelled as « no further review ». The aim of computer-assisted Pap test is to reduce the time of slide examination and to increase productivity. Furthermore, the number of errors due to lack of attention may decrease. Both systems can be applied to liquid-based cytology, while only the BD FocalPoint can be used on conventional smears. Cytology screening has some critical points: there is a shortage of cytologists/cytotechnicians; the quality strongly depends on the experience and ability of the cytologist; there is a subjective component in the cytological diagnosis; in highly screened populations, the prevalence of lesions is very low and the activity of cytologists is very monotonous. On the other hand, a progressive shift to molecular screening using HPV-DNA test as primary screening test is very likely in the near future; cytology will be used as a triage test, dramatically reducing the number of slides to process and increasing the prevalence of lesions in those Pap tests. In this Report we assume that the diagnostic accuracy of computer-assisted Pap test is equal to the accuracy of manual Pap test and, consequently, that screening using computer-assisted Pap test has the same efficacy in reducing cervical cancer incidence and mortality. Under this assumption, the effectiveness/benefit/utility is the same for the two screening modes, i.e. the economic analysis will be a cost minimization study. Furthermore, the screening process is identical for the two modalities in all the phases except for slide interpretation. The cost minimization analysis will be limited to the only phase differing between the two modes, i.e. the study will be a differential cost analysis between a labour-intensive strategy (traditional Pap test) and a technology-intensive strategy (the computer-assisted Pap test).
Briefly, the objectives of this HTA Report are: to determine the break-even point of computer-assisted Pap test systems, i.e. the volume of slides processed per year at which putting in place a computer-assisted Pap test system becomes economically convenient; to quantify the cost per Pap test in different scenarios according to screening centre activity volume, productivity of cytologist, type of cytology (conventional smear or liquid-based, fully automated or semi-automated computer-assisted); to analyse the computer-assisted Pap test in the Italian context, through a survey of the centres using the technology, collecting data useful for the sensitivity analysis of the economic evaluation; to evaluate the acceptability of the technology in the screening services; to evaluate the organizational and financial impact of the computer-assisted Pap test in different scenarios; to illustrate the ideal organization to implement computer-assisted Pap test in terms of volume of activity, productivity, and human and technological resources. To produce this Report, the following process was adopted: application to the Ministry of Health for a grant « Analysis of the impact of professional involvement in evidence generation for the HTA process »; within this project, the sub-project « Cost effectiveness evaluation of the computer-assisted Pap test in the Italian screening programmes » was financed; constitution of the Working Group, which included the project coordinator, the principal investigator, and the health economist; identification of the centres using the computer-assisted Pap test and which had published scientific reports on the subject; identification of the Consulting Committee (stakeholders), which included screening programme managers, pathologists, economists, health policy-makers, citizen organizations, and manufacturers. Once the evaluation was concluded, a plenary meeting with the Working Group and Consulting Committee was held. The Working Group drafted the final version of this Report, which took into account the comments received. The fully automated computer-assisted Pap test has an important financial and organizational impact on screening programmes. The assessment of this health technology reached the following conclusions: according to the survey results, after some distrust, cytologists accepted the use of the machine and appreciated the reduction in interpretation time and the reliability in identifying the fields of interest; from an economic point of view, the automated computer-assisted Pap test can be convenient only with conventional smears if the screening centre has a volume of more than 49,000 slides/year and the cytologist productivity increases about threefold. It must be highlighted that it is not sufficient to adopt the automated Pap test to reach such an increase in productivity; the laboratory must be organised or re-organised to optimise the use of the review stations and the staff time. In the case of liquid-based cytology, the adoption of automated computer-assisted Pap test can only increase the costs. In fact, liquid-based cytology increases the cost of consumable materials but reduces the interpretation time, even in manual screening. Consequently, the reduction of human costs is smaller in the case of computer-assisted screening.
Liquid-based cytology has other implications and advantages not linked to the use of computer-assisted Pap test that should be taken into account and are beyond the scope of this Report; given that the computer-assisted Pap test reduces human costs, it may be more advantageous where the cost of cytologists is higher; given the relatively small volume of activity of screening centres in Italy, computer-assisted Pap test may be reasonable for a network using only one central scanner and several remote review stations; the use of automated computer-assisted Pap test only for quality control in a single centre is not economically sustainable. In this case as well, several centres, for example at the regional level, may form a consortium to reach a reasonable number of slides to achieve the break-even point. Regarding the use of a machine rather than human intelligence to interpret the slides, some ethical issues were initially raised, but both the scientific community and healthcare professionals have accepted this technology. The identification of fields of interest by the machine is highly reproducible, reducing subjectivity in the diagnostic process. The Hologic system always includes a check by the human eye, while the FocalPoint system identifies about one fifth of the slides as No Further Review. Several studies, some of which were conducted in Italy, confirmed the reliability of this classification. There is still some resistance to accepting the practice of No Further Review. A check of previous slides and clinical data can be useful to make the cytologist and the clinician more confident. The computer-assisted automated Pap test may be introduced only if there is a need to increase the volume of slides screened to cover the screening target population and sufficient human resources are not available. Switching a programme using conventional slides to automatic scanning can lead to a reduction in costs only if the volume exceeds 49,000 slides per year and cytologist productivity is optimised to more than 20,000 slides per year. At a productivity of 15,000 slides or fewer, the automated computer-assisted Pap test cannot be economically convenient. Switching from manual screening with conventional slides to automatic scanning with liquid-based cytology cannot generate any economic saving, but the system could increase output with a given number of staff. The transition from manual to computer-assisted automated screening of liquid-based cytology will not generate savings, and the increase in productivity will be lower than that of the switch from manual/conventional to automated/conventional. The use of biologists or pathologists as cytologists is more costly than the use of cytoscreeners. Given that the automated computer-assisted Pap test reduces human resource costs, its adoption in a model using only biologists and pathologists for screening is more economically advantageous. (ABSTRACT TRUNCATED)

  9. Open source data logger for low-cost environmental monitoring

    PubMed Central

    2014-01-01

    The increasing transformation of biodiversity into a data-intensive science has seen numerous independent systems linked and aggregated into the current landscape of biodiversity informatics. This paper outlines how we can move forward with this programme, incorporating real-time environmental monitoring into our methodology using low-power and low-cost computing platforms. PMID:24855446

  10. The Application and Evaluation of PLATO IV in AF Technical Training.

    ERIC Educational Resources Information Center

    Mockovak, William P.; And Others

    The Air Force has been plagued with the rising cost of technical training and has increasingly turned to computer-assisted instruction (CAI) for better cost effectiveness. Toward this aim a trial of PLATO IV, a CAI system utilizing a graphic display and centered at the University of Illinois, was initiated at the Chanute and Sheppard training…

  11. Computer-assisted learning in critical care: from ENIAC to HAL.

    PubMed

    Tegtmeyer, K; Ibsen, L; Goldstein, B

    2001-08-01

    Computers are commonly used to serve many functions in today's modern intensive care unit. One of the most intriguing and perhaps most challenging applications of computers has been to attempt to improve medical education. With the introduction of the first computer, medical educators began looking for ways to incorporate their use into the modern curriculum. Prior limitations of cost and complexity of computers have consistently decreased since their introduction, making it increasingly feasible to incorporate computers into medical education. Simultaneously, the capabilities and capacities of computers have increased. Combining the computer with other modern digital technology has allowed the development of more intricate and realistic educational tools. The purpose of this article is to briefly describe the history and use of computers in medical education with special reference to critical care medicine. In addition, we will examine the role of computers in teaching and learning and discuss the types of interaction between the computer user and the computer.

  12. Resources and costs for microbial sequence analysis evaluated using virtual machines and cloud computing.

    PubMed

    Angiuoli, Samuel V; White, James R; Matalka, Malcolm; White, Owen; Fricke, W Florian

    2011-01-01

    The widespread popularity of genomic applications is threatened by the "bioinformatics bottleneck" resulting from uncertainty about the cost and infrastructure needed to meet increasing demands for next-generation sequence analysis. Cloud computing services have been discussed as potential new bioinformatics support systems but have not been evaluated thoroughly. We present benchmark costs and runtimes for common microbial genomics applications, including 16S rRNA analysis, microbial whole-genome shotgun (WGS) sequence assembly and annotation, WGS metagenomics and large-scale BLAST. Sequence dataset types and sizes were selected to correspond to outputs typically generated by small- to midsize facilities equipped with 454 and Illumina platforms, except for WGS metagenomics where sampling of Illumina data was used. Automated analysis pipelines, as implemented in the CloVR virtual machine, were used in order to guarantee transparency, reproducibility and portability across different operating systems, including the commercial Amazon Elastic Compute Cloud (EC2), which was used to attach real dollar costs to each analysis type. We found considerable differences in computational requirements, runtimes and costs associated with different microbial genomics applications. While all 16S analyses completed on a single-CPU desktop in under three hours, microbial genome and metagenome analyses utilized multi-CPU support of up to 120 CPUs on Amazon EC2, where each analysis completed in under 24 hours for less than $60. Representative datasets were used to estimate maximum data throughput on different cluster sizes and to compare costs between EC2 and comparable local grid servers. Although bioinformatics requirements for microbial genomics depend on dataset characteristics and the analysis protocols applied, our results suggest that smaller sequencing facilities (up to three Roche/454 or one Illumina GAIIx sequencer) invested in 16S rRNA amplicon sequencing, microbial single-genome and metagenomics WGS projects can achieve cost-efficient bioinformatics support using CloVR in combination with Amazon EC2 as an alternative to local computing centers.

  13. Resources and Costs for Microbial Sequence Analysis Evaluated Using Virtual Machines and Cloud Computing

    PubMed Central

    Angiuoli, Samuel V.; White, James R.; Matalka, Malcolm; White, Owen; Fricke, W. Florian

    2011-01-01

    Background The widespread popularity of genomic applications is threatened by the “bioinformatics bottleneck” resulting from uncertainty about the cost and infrastructure needed to meet increasing demands for next-generation sequence analysis. Cloud computing services have been discussed as potential new bioinformatics support systems but have not been evaluated thoroughly. Results We present benchmark costs and runtimes for common microbial genomics applications, including 16S rRNA analysis, microbial whole-genome shotgun (WGS) sequence assembly and annotation, WGS metagenomics and large-scale BLAST. Sequence dataset types and sizes were selected to correspond to outputs typically generated by small- to midsize facilities equipped with 454 and Illumina platforms, except for WGS metagenomics where sampling of Illumina data was used. Automated analysis pipelines, as implemented in the CloVR virtual machine, were used in order to guarantee transparency, reproducibility and portability across different operating systems, including the commercial Amazon Elastic Compute Cloud (EC2), which was used to attach real dollar costs to each analysis type. We found considerable differences in computational requirements, runtimes and costs associated with different microbial genomics applications. While all 16S analyses completed on a single-CPU desktop in under three hours, microbial genome and metagenome analyses utilized multi-CPU support of up to 120 CPUs on Amazon EC2, where each analysis completed in under 24 hours for less than $60. Representative datasets were used to estimate maximum data throughput on different cluster sizes and to compare costs between EC2 and comparable local grid servers. Conclusions Although bioinformatics requirements for microbial genomics depend on dataset characteristics and the analysis protocols applied, our results suggest that smaller sequencing facilities (up to three Roche/454 or one Illumina GAIIx sequencer) invested in 16S rRNA amplicon sequencing, microbial single-genome and metagenomics WGS projects can achieve cost-efficient bioinformatics support using CloVR in combination with Amazon EC2 as an alternative to local computing centers. PMID:22028928

  14. Improving the cost-effectiveness of a healthcare system for depressive disorders by implementing telemedicine: a health economic modeling study.

    PubMed

    Lokkerbol, Joran; Adema, Dirk; Cuijpers, Pim; Reynolds, Charles F; Schulz, Richard; Weehuizen, Rifka; Smit, Filip

    2014-03-01

    Depressive disorders are significant causes of disease burden and are associated with substantial economic costs. It is therefore important to design a healthcare system that can effectively manage depression at sustainable costs. This article computes the benefit-to-cost ratio of the current Dutch healthcare system for depression, and investigates whether offering more online preventive interventions improves the cost-effectiveness overall. A health economic (Markov) model was used to synthesize clinical and economic evidence and to compute population-level costs and effects of interventions. The model compared a base case scenario without preventive telemedicine and alternative scenarios with preventive telemedicine. The central outcome was the benefit-to-cost ratio, also known as return-on-investment (ROI). In terms of ROI, a healthcare system with preventive telemedicine for depressive disorders offers better value for money than a healthcare system without Internet-based prevention. Overall, the ROI increases from €1.45 ($1.72) in the base case scenario to €1.76 ($2.09) in the alternative scenario in which preventive telemedicine is offered. In a scenario in which the costs of offering preventive telemedicine are balanced by reducing the expenditures for curative interventions, ROI increases to €1.77 ($2.10), while keeping the healthcare budget constant. For a healthcare system for depressive disorders to remain economically sustainable, its cost-benefit ratio needs to be improved. Offering preventive telemedicine at a large scale is likely to introduce such an improvement. Copyright © 2014 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.

  15. Digital Waveguide Architectures for Virtual Musical Instruments

    NASA Astrophysics Data System (ADS)

    Smith, Julius O.

    Digital sound synthesis has become a standard staple of modern music studios, videogames, personal computers, and hand-held devices. As processing power has increased over the years, sound synthesis implementations have evolved from dedicated chip sets, to single-chip solutions, and ultimately to software implementations within processors used primarily for other tasks (such as for graphics or general purpose computing). With the cost of implementation dropping closer and closer to zero, there is increasing room for higher quality algorithms.
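
    As a minimal, hedged illustration of the digital waveguide idea (a pitched delay line with a low-pass filter in its feedback loop), here is a Karplus-Strong plucked-string sketch; the sample rate, pitch, and decay values are arbitrary and are not taken from the chapter.

```python
# A delay line whose length sets the pitch, with a two-point averaging
# (low-pass) filter in the feedback loop: the classic Karplus-Strong string.
import numpy as np

def pluck(f0=220.0, fs=44_100, seconds=1.0, decay=0.996):
    n_delay = int(fs / f0)                       # delay-line length sets pitch
    rng = np.random.default_rng(0)
    line = rng.uniform(-1.0, 1.0, n_delay)       # initial "pluck" = noise burst
    out = np.empty(int(fs * seconds))
    for i in range(out.size):
        out[i] = line[0]
        # two-point average = gentle low-pass; decay controls string damping
        new_sample = decay * 0.5 * (line[0] + line[1])
        line = np.roll(line, -1)
        line[-1] = new_sample
    return out

samples = pluck()   # write to a WAV file or play back with any audio library
```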

  16. Technologies for space station autonomy

    NASA Technical Reports Server (NTRS)

    Staehle, R. L.

    1984-01-01

    This report presents an informal survey of experts in the field of spacecraft automation, with recommendations for which technologies should be given the greatest development attention for implementation on the initial 1990's NASA Space Station. The recommendations implemented an autonomy philosophy that was developed by the Concept Development Group's Autonomy Working Group during 1983. They were based on assessments of the technologies' likely maturity by 1987, and of their impact on recurring costs, non-recurring costs, and productivity. The three technology areas recommended for programmatic emphasis were: (1) artificial intelligence expert (knowledge based) systems and processors; (2) fault tolerant computing; and (3) high order (procedure oriented) computer languages. This report also describes other elements required for Station autonomy, including technologies for later implementation, system evolvability, and management attitudes and goals. The cost impact of various technologies is treated qualitatively, and some cases in which both the recurring and nonrecurring costs might be reduced while the crew productivity is increased, are also considered. Strong programmatic emphasis on life cycle cost and productivity is recommended.

  17. Many-body calculations with deuteron based single-particle bases and their associated natural orbits

    NASA Astrophysics Data System (ADS)

    Puddu, G.

    2018-06-01

    We use the recently introduced single-particle states obtained from localized deuteron wave-functions as a basis for nuclear many-body calculations. We show that energies can be substantially lowered if the natural orbits (NOs) obtained from this basis are used. We use this modified basis for ^10B, ^16O and ^24Mg employing the bare NNLOopt nucleon–nucleon interaction. The lowering of the energies increases with the mass. Although in principle NOs require a full scale preliminary many-body calculation, we found that an approximate preliminary many-body calculation, with a marginal increase in the computational cost, is sufficient. The use of natural orbits based on a harmonic oscillator basis leads to a much smaller lowering of the energies for a comparable computational cost.

  18. Darwinian Spacecraft: Soft Computing Strategies Breeding Better, Faster Cheaper

    NASA Technical Reports Server (NTRS)

    Noever, David A.; Baskaran, Subbiah

    1999-01-01

    Computers can create infinite lists of combinations to try to solve a particular problem, a process called "soft-computing." This process uses statistical comparables, neural networks, genetic algorithms, fuzzy variables in uncertain environments, and flexible machine learning to create a system which will allow spacecraft to increase robustness, and metric evaluation. These concepts will allow for the development of a spacecraft which will allow missions to be performed at lower costs.

  19. Computerized preparation of a scientific poster.

    PubMed

    Lugo, M; Speaker, M G; Cohen, E J

    1989-01-01

    We prepared an attractive and effective poster using a Macintosh computer and Laserwriter and Imagewriter II printers. The advantages of preparing the poster in this fashion were increased control of the final product, decreased cost, and a sense of artistic satisfaction. Although we employed only the above mentioned computer, the desktop publishing techniques described can be used with other systems.

  20. Computer Detection of Low Contrast Targets.

    DTIC Science & Technology

    1982-06-18

    computed from the Hessian and the gradient and is given by the formula κ(x) = -Hf(∇f(x), ∇f(x)) / |∇f(x)|^3. Because of the amount of noise present in these...∫ (n^2 + 1 + 2n cos t)^(1/2) and this integral is a maximum for n=1 and decreases as n increases, exactly what a good measure of curvature should do

  1. Accelerating Dust Storm Simulation by Balancing Task Allocation in Parallel Computing Environment

    NASA Astrophysics Data System (ADS)

    Gui, Z.; Yang, C.; XIA, J.; Huang, Q.; YU, M.

    2013-12-01

    Dust storms have serious negative impacts on the environment, human health, and assets. The continuing global climate change has increased the frequency and intensity of dust storms in the past decades. To better understand and predict the distribution, intensity and structure of dust storms, a series of dust storm models have been developed, such as the Dust Regional Atmospheric Model (DREAM), the NMM meteorological module (NMM-dust) and the Chinese Unified Atmospheric Chemistry Environment for Dust (CUACE/Dust). The developments and applications of these models have contributed significantly to both scientific research and our daily life. However, dust storm simulation is a data- and computing-intensive process. Normally, a simulation for a single dust storm event may take several hours or days to run. This seriously impacts the timeliness of prediction and potential applications. To speed up the process, high performance computing is widely adopted. By partitioning a large study area into small subdomains according to their geographic location and executing them on different computing nodes in a parallel fashion, the computing performance can be significantly improved. Since spatiotemporal correlations exist in the geophysical process of dust storm simulation, each subdomain allocated to a node needs to communicate with other geographically adjacent subdomains to exchange data. Inappropriate allocations may introduce imbalanced task loads and unnecessary communications among computing nodes. Therefore, the task allocation method is the key factor that determines the feasibility of the parallelization. The allocation algorithm needs to carefully balance the computing cost and communication cost for each computing node to minimize total execution time and reduce overall communication cost for the entire system. This presentation introduces two algorithms for such allocation and compares them with an evenly distributed allocation method. Specifically, 1) in order to get optimized solutions, a quadratic programming based modeling method is proposed. This algorithm performs well with a small number of computing tasks. However, its efficiency decreases significantly as the subdomain number and computing node number increase. 2) To compensate for this performance decrease on large-scale tasks, a K-Means clustering based algorithm is introduced. Instead of seeking fully optimized solutions, this method obtains relatively good feasible solutions within acceptable time. However, it may introduce imbalanced communication among nodes or node-isolated subdomains. This research shows that both algorithms have their own strengths and weaknesses for task allocation. A combination of the two algorithms is under study to obtain a better performance. Keywords: Scheduling; Parallel Computing; Load Balance; Optimization; Cost Model
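
    A hedged sketch of the clustering-based allocation idea follows: subdomain centroids are grouped by geographic location so that neighbours tend to land on the same computing node, keeping most halo exchanges node-local. The grid size, node count, and use of scikit-learn's KMeans are illustrative choices, not the authors' code.

```python
# Group subdomain centroids into as many clusters as there are computing nodes;
# geographically adjacent subdomains then tend to share a node, which reduces
# inter-node communication. A real scheduler would also rebalance by compute cost.
import numpy as np
from sklearn.cluster import KMeans

# centroids (lon, lat) of a hypothetical 12 x 8 grid of subdomains
lon, lat = np.meshgrid(np.arange(12), np.arange(8))
subdomains = np.column_stack([lon.ravel(), lat.ravel()]).astype(float)

n_nodes = 6
labels = KMeans(n_clusters=n_nodes, n_init=10, random_state=0).fit_predict(subdomains)

print(np.bincount(labels, minlength=n_nodes))   # tasks assigned to each node
```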

  2. The cost of large numbers of hypothesis tests on power, effect size and sample size.

    PubMed

    Lazzeroni, L C; Ray, A

    2012-01-01

    Advances in high-throughput biology and computer science are driving an exponential increase in the number of hypothesis tests in genomics and other scientific disciplines. Studies using current genotyping platforms frequently include a million or more tests. In addition to the monetary cost, this increase imposes a statistical cost owing to the multiple testing corrections needed to avoid large numbers of false-positive results. To safeguard against the resulting loss of power, some have suggested sample sizes on the order of tens of thousands that can be impractical for many diseases or may lower the quality of phenotypic measurements. This study examines the relationship between the number of tests on the one hand and power, detectable effect size or required sample size on the other. We show that once the number of tests is large, power can be maintained at a constant level, with comparatively small increases in the effect size or sample size. For example at the 0.05 significance level, a 13% increase in sample size is needed to maintain 80% power for ten million tests compared with one million tests, whereas a 70% increase in sample size is needed for 10 tests compared with a single test. Relative costs are less when measured by increases in the detectable effect size. We provide an interactive Excel calculator to compute power, effect size or sample size when comparing study designs or genome platforms involving different numbers of hypothesis tests. The results are reassuring in an era of extreme multiple testing.
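
    The 13% and 70% figures quoted above can be reproduced with the standard normal-approximation sample-size formula for a two-sided z-test at a Bonferroni-corrected significance level alpha/m and 80% power; the short check below is an illustration, not the paper's calculator.

```python
# Required sample size scales with (z_{1-alpha/(2m)} + z_{power})^2 at a fixed
# effect size, so ratios of that quantity give the relative sample-size cost.
from scipy.stats import norm

def relative_n(m, alpha=0.05, power=0.80):
    """Sample size (up to a constant) needed for m tests at fixed effect size."""
    return (norm.ppf(1 - alpha / (2 * m)) + norm.ppf(power)) ** 2

print(relative_n(10) / relative_n(1))        # ~1.70 -> about 70% more samples
print(relative_n(1e7) / relative_n(1e6))     # ~1.13 -> about 13% more samples
```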

  3. Applications of Technology to CAS Data-Base Production.

    ERIC Educational Resources Information Center

    Weisgerber, David W.

    1984-01-01

    Reviews the economic importance of applying computer technology to Chemical Abstracts Service database production from 1973 to 1983. Database building, technological applications for editorial processing (online editing, Author Index Manufacturing System), and benefits (increased staff productivity, reduced rate of increase of cost of services,…

  4. The Hidden Costs of Owning a Microcomputer.

    ERIC Educational Resources Information Center

    McDole, Thomas L.

    Before purchasing computer hardware, individuals must consider the costs associated with the setup and operation of a microcomputer system. Included among the initial costs of purchasing a computer are the costs of the computer, one or more disk drives, a monitor, and a printer as well as the costs of such optional peripheral devices as a plotter…

  5. A geochemical module for "AMDTreat" to compute caustic quantity, effluent quantity, and sludge volume

    USGS Publications Warehouse

    Cravotta, Charles A.; Parkhurst, David L.; Means, Brent P; McKenzie, Bob; Morris, Harry; Arthur, Bill

    2010-01-01

    Treatment with caustic chemicals typically is used to increase pH and decrease concentrations of dissolved aluminum, iron, and/or manganese in large-volume, metal-laden discharges from active coal mines. Generally, aluminum and iron can be removed effectively at near-neutral pH (6 to 8), whereas active manganese removal requires treatment to alkaline pH (~10). The treatment cost depends on the specific chemical used (NaOH, CaO, Ca(OH)2, Na2CO3, or NH3) and increases with the quantities of chemical added and sludge produced. The pH and metals concentrations do not change linearly with the amount of chemical added. Consequently, the amount of caustic chemical needed to achieve a target pH and the corresponding effluent composition and sludge volume cannot be accurately determined without empirical titration data or the application of geochemical models to simulate the titration of the discharge water with caustic chemical(s). The AMDTreat computer program (http://amd.osmre.gov/ ) is widely used to compute costs for treatment of coal-mine drainage. Although AMDTreat can use results of empirical titration with industrial grade caustic chemicals to compute chemical costs for treatment of net-acidic or net-alkaline mine drainage, such data are rarely available. To improve the capability of AMDTreat to estimate (1) the quantity and cost of caustic chemicals to attain a target pH, (2) the concentrations of dissolved metals in treated effluent, and (3) the volume of sludge produced by the treatment, a titration simulation is being developed using the geochemical program PHREEQC (wwwbrr.cr.usgs.gov/projects/GWC_coupled/phreeqc/) that will be coupled as a module to AMDTreat. The simulated titration results can be compared with or used in place of empirical titration data to estimate chemical quantities and costs. This paper describes the development, evaluation, and potential utilization of the PHREEQC titration module for AMDTreat.

  6. Cost-of-living indexes and demographic change.

    PubMed

    Diamond, C A

    1990-06-01

    The Consumer Price Index (CPI), although not without problems, is the most often used mechanism for adjusting contracts for cost-of-living changes in the US. The US Bureau of Labor Statistics lists several problems associated with using the CPI as a cost-of-living index when the proportion of two-worker families is increasing, population is shifting, and work week hours are changing. This study shows how to compute cost-of-living indexes which are inexpensive to update, use less restrictive assumptions about consumer preferences, do not require statistical estimation, and handle the problem of increasing numbers of families where both the husband and wife work. The study also examines how widely the CPI in fact varies from alternative true cost-of-living indexes, although in the end this de facto cost-of-living measure holds up quite well. In times of severe price inflation people change their preferences by substitution, necessitating a flexible cost-of-living index that accounts for this fundamental economic behavior.

  7. Asynchronous communication in spectral-element and discontinuous Galerkin methods for atmospheric dynamics – a case study using the High-Order Methods Modeling Environment (HOMME-homme_dg_branch)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jamroz, Benjamin F.; Klofkorn, Robert

    The scalability of computational applications on current and next-generation supercomputers is increasingly limited by the cost of inter-process communication. We implement non-blocking asynchronous communication in the High-Order Methods Modeling Environment for the time integration of the hydrostatic fluid equations using both the spectral-element and discontinuous Galerkin methods. This allows the overlap of computation with communication, effectively hiding some of the costs of communication. A novel detail of our approach is that it allows some data movement to be performed during the asynchronous communication even in the absence of other computations. This method produces significant performance and scalability gains in large-scale simulations.

  8. Asynchronous communication in spectral-element and discontinuous Galerkin methods for atmospheric dynamics – a case study using the High-Order Methods Modeling Environment (HOMME-homme_dg_branch)

    DOE PAGES

    Jamroz, Benjamin F.; Klofkorn, Robert

    2016-08-26

    The scalability of computational applications on current and next-generation supercomputers is increasingly limited by the cost of inter-process communication. We implement non-blocking asynchronous communication in the High-Order Methods Modeling Environment for the time integration of the hydrostatic fluid equations using both the spectral-element and discontinuous Galerkin methods. This allows the overlap of computation with communication, effectively hiding some of the costs of communication. A novel detail of our approach is that it allows some data movement to be performed during the asynchronous communication even in the absence of other computations. This method produces significant performance and scalability gains in large-scale simulations.

  9. A new impedance accounting for short- and long-range effects in mixed substructured formulations of nonlinear problems

    NASA Astrophysics Data System (ADS)

    Negrello, Camille; Gosselet, Pierre; Rey, Christian

    2018-05-01

    An efficient method for solving large nonlinear problems combines Newton solvers and Domain Decomposition Methods (DDM). In the DDM framework, the boundary conditions can be chosen to be primal, dual or mixed. The mixed approach has the advantage of being amenable to the search for an optimal interface parameter (often called impedance), which can increase the convergence rate. The optimal value of this parameter is often too expensive to compute exactly in practice: an approximate version has to be sought, along with a compromise between efficiency and computational cost. In the context of parallel algorithms for solving nonlinear structural mechanics problems, we propose a new heuristic for the impedance which combines short- and long-range effects at a low computational cost.

  10. Bicriteria Network Optimization Problem using Priority-based Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Gen, Mitsuo; Lin, Lin; Cheng, Runwei

    Network optimization is an increasingly important and fundamental issue in fields such as engineering, computer science, operations research, transportation, telecommunication, decision support systems, manufacturing, and airline scheduling. In many applications, however, there are several criteria associated with traversing each edge of a network. For example, cost and flow measures are both important in networks. As a result, there has been recent interest in solving the Bicriteria Network Optimization Problem, which is known to be NP-hard. The efficient set of paths may be very large, possibly exponential in size, so the computational effort required to solve the problem can increase exponentially with the problem size in the worst case. In this paper, we propose a genetic algorithm (GA) approach using a priority-based chromosome for solving the bicriteria network optimization problem, including the maximum flow (MXF) model and the minimum cost flow (MCF) model. The objective is to find the set of Pareto-optimal solutions that give the largest possible flow with minimum cost. This paper also incorporates the Adaptive Weight Approach (AWA), which utilizes information from the current population to readjust weights and obtain search pressure toward a positive ideal point. Computer simulations on several difficult-to-solve network design problems show the effectiveness of the proposed method.
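
    To make the priority-based encoding concrete, the sketch below decodes one chromosome into a path by repeatedly stepping to the unvisited neighbor with the highest priority. The graph, priorities, and function names are hypothetical illustrations; the paper's GA operators, adaptive weights, and Pareto bookkeeping are not reproduced.

```python
# Minimal sketch of priority-based chromosome decoding for path growth
# (illustrative only). A chromosome assigns a priority to every node; a path
# is grown from the source by always stepping to the unvisited neighbor with
# the highest priority.
def decode_path(priorities, adjacency, source, sink):
    path, current, visited = [source], source, {source}
    while current != sink:
        candidates = [v for v in adjacency[current] if v not in visited]
        if not candidates:           # dead end: decoding fails for this chromosome
            return None
        current = max(candidates, key=lambda v: priorities[v])
        visited.add(current)
        path.append(current)
    return path

# Example on a small directed graph (node 0 = source, node 4 = sink).
adjacency = {0: [1, 2], 1: [3], 2: [3, 4], 3: [4], 4: []}
chromosome = [0, 5, 9, 2, 7]         # one priority per node
print(decode_path(chromosome, adjacency, 0, 4))   # -> [0, 2, 4]
```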

  11. Threshold-based queuing system for performance analysis of cloud computing system with dynamic scaling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shorgin, Sergey Ya.; Pechinkin, Alexander V.; Samouylov, Konstantin E.

    Cloud computing is a promising technology to manage and improve utilization of computing center resources to deliver various computing and IT services. For the purpose of energy saving there is no need to operate many servers under light loads, and they are switched off. On the other hand, some servers should be switched on in heavy load cases to prevent very long delays. Thus, waiting times and system operating cost can be maintained at an acceptable level by dynamically adding or removing servers. One more fact that should be taken into account is the significant server setup cost and activation time. For better energy efficiency, a cloud computing system should not react to instantaneous increases or decreases of load. That is the main motivation for using queuing systems with hysteresis for cloud computing system modelling. In the paper, we provide a model of a cloud computing system in terms of a multiple-server, threshold-based, infinite-capacity queuing system with hysteresis and noninstantaneous server activation. For the proposed model, we develop a method for computing steady-state probabilities that allows estimation of a number of performance measures.
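
    The hysteresis idea can be illustrated with a toy time-stepped simulation, shown below, in which a server is activated only when the queue exceeds a high threshold and deactivated only when it falls below a low threshold. This is a simplified sketch with assumed Bernoulli arrivals and service, not the paper's analytic queuing model or its steady-state computation.

```python
# Toy sketch of hysteresis-based server scaling (not the paper's analytic
# queueing model): a server is switched on only when the queue exceeds a high
# threshold and switched off only when it falls below a low threshold, so
# brief load spikes do not trigger costly activations.
import random

def simulate(steps=10_000, arrival_p=0.6, service_p=0.25,
             high=8, low=2, max_servers=4, seed=1):
    random.seed(seed)
    queue, servers, activations = 0, 1, 0
    for _ in range(steps):
        queue += random.random() < arrival_p                  # Bernoulli arrival
        served = sum(random.random() < service_p for _ in range(servers))
        queue = max(0, queue - served)
        if queue > high and servers < max_servers:            # scale up
            servers += 1
            activations += 1
        elif queue < low and servers > 1:                     # scale down
            servers -= 1
    return activations

print("server activations:", simulate())
```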

  12. Productivity associated with visual status of computer users.

    PubMed

    Daum, Kent M; Clore, Katherine A; Simms, Suzanne S; Vesely, Jon W; Wilczek, Dawn D; Spittle, Brian M; Good, Greg W

    2004-01-01

    The aim of this project is to examine the potential connection between the astigmatic refractive corrections of subjects using computers and their productivity and comfort. We hypothesize that improving the visual status of subjects using computers results in greater productivity, as well as improved visual comfort. Inclusion criteria required subjects 19 to 30 years of age with complete vision examinations before being enrolled. Using a double-masked, placebo-controlled, randomized design, subjects completed three experimental tasks calculated to assess the effects of refractive error on productivity (time to completion and the number of errors) at a computer. The tasks resembled those commonly undertaken by computer users and involved visual search tasks of: (1) counties and populations; (2) nonsense word search; and (3) a modified text-editing task. Estimates of productivity for time to completion varied from a minimum of 2.5% upwards to 28.7% with 2 D cylinder miscorrection. Assuming a conservative estimate of an overall 2.5% increase in productivity with appropriate astigmatic refractive correction, our data suggest a favorable cost-benefit ratio of at least 2.3 for the visual correction of an employee (total cost 268 dollars) with a salary of 25,000 dollars per year. We conclude that astigmatic refractive error affected both productivity and visual comfort under the conditions of this experiment. These data also suggest a favorable cost-benefit ratio for employers who provide computer-specific eyewear to their employees.
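
    The reported cost-benefit ratio follows from simple arithmetic, sketched below under the assumption that the benefit equals the productivity gain applied directly to the annual salary.

```python
# Back-of-envelope check of the reported cost-benefit ratio, assuming the
# benefit is the productivity gain applied directly to annual salary.
salary = 25_000.0          # dollars per year
productivity_gain = 0.025  # conservative 2.5% estimate from the study
eyewear_cost = 268.0       # total cost of the visual correction

benefit = salary * productivity_gain        # ~$625 per year
ratio = benefit / eyewear_cost              # ~2.3
print(f"annual benefit ${benefit:.0f}, cost-benefit ratio {ratio:.1f}")
```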

  13. Computation of Sensitivity Derivatives of Navier-Stokes Equations using Complex Variables

    NASA Technical Reports Server (NTRS)

    Vatsa, Veer N.

    2004-01-01

    Accurate computation of sensitivity derivatives is becoming an important item in Computational Fluid Dynamics (CFD) because of recent emphasis on using nonlinear CFD methods in aerodynamic design, optimization, stability and control related problems. Several techniques are available to compute gradients or sensitivity derivatives of desired flow quantities or cost functions with respect to selected independent (design) variables. Perhaps the most common and oldest method is to use straightforward finite differences for the evaluation of sensitivity derivatives. Although very simple, this method is prone to errors associated with the choice of step sizes and can be cumbersome for geometric variables. The cost per design variable for computing sensitivity derivatives with central differencing is at least equal to the cost of three full analyses, but is usually much larger in practice due to difficulty in choosing step sizes. Another approach gaining popularity is the use of Automatic Differentiation software (such as ADIFOR) to process the source code, which in turn can be used to evaluate the sensitivity derivatives of preselected functions with respect to chosen design variables. In principle, this approach is also very straightforward and quite promising. The main drawback is the large memory requirement because memory use increases linearly with the number of design variables. ADIFOR software can also be cumbersome for large CFD codes and has not yet reached a full maturity level for production codes, especially in parallel computing environments.
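
    The complex-variable technique named in the title is usually implemented as the complex-step derivative, f'(x) ~ Im f(x + ih)/h, which avoids the subtractive cancellation that makes finite-difference step sizes hard to choose. The sketch below is a generic illustration on a scalar test function, not the Navier-Stokes solver discussed in the report.

```python
# Minimal complex-step derivative sketch (not the Navier-Stokes solver itself):
# f'(x) ~ Im(f(x + i*h)) / h, which avoids the subtractive cancellation that
# makes finite-difference step-size selection so delicate.
import cmath

def complex_step(f, x, h=1e-30):
    return (f(x + 1j * h)).imag / h

def central_difference(f, x, h):
    return (f(x + h) - f(x - h)) / (2 * h)

# Classic scalar test function; any smooth analytic function works the same way.
f = lambda x: cmath.exp(x) / cmath.sqrt(cmath.sin(x) ** 3 + cmath.cos(x) ** 3)
x0 = 1.5
print("complex step      :", complex_step(f, x0))
print("central difference:", central_difference(f, x0, 1e-5).real)
```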

  14. Autonomic Computing for Spacecraft Ground Systems

    NASA Technical Reports Server (NTRS)

    Li, Zhenping; Savkli, Cetin; Jones, Lori

    2007-01-01

    Autonomic computing for spacecraft ground systems increases system reliability and reduces the cost of spacecraft operations and software maintenance. In this paper, we present an autonomic computing solution for spacecraft ground systems at NASA Goddard Space Flight Center (GSFC), which consists of an open standard for a message oriented architecture referred to as the GMSEC architecture (Goddard Mission Services Evolution Center), and an autonomic computing tool, the Criteria Action Table (CAT). This solution has been used in many upgraded ground systems for NASA's missions, and provides a framework for developing solutions with higher autonomic maturity.

  15. Building A Community Focused Data and Modeling Collaborative platform with Hardware Virtualization Technology

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Wang, W.; Melton, F. S.; Votava, P.; Milesi, C.; Hashimoto, H.; Nemani, R. R.; Hiatt, S. H.

    2009-12-01

    As the length and diversity of the global earth observation data records grow, modeling and analyses of biospheric conditions increasingly require multiple terabytes of data from a diversity of models and sensors. With network bandwidth beginning to flatten, transmission of these data from centralized data archives presents an increasing challenge, and costs associated with local storage and management of data and compute resources are often significant for individual research and application development efforts. Sharing community-valued intermediary data sets, results and codes from individual efforts with others that are not in direct funded collaboration can also be a challenge with respect to time, cost and expertise. We propose a modeling, data and knowledge center that houses NASA satellite data, climate data and ancillary data, where a focused community may come together to share modeling and analysis codes, scientific results, knowledge and expertise on a centralized platform, named the Ecosystem Modeling Center (EMC). With the recent development of new technologies for secure hardware virtualization, an opportunity exists to create specific modeling, analysis and compute environments that are customizable, archivable and transferable. Allowing users to instantiate such environments on large compute infrastructures that are directly connected to large data archives may significantly reduce the costs and time associated with scientific efforts by relieving users of redundantly retrieving and integrating data sets and building modeling analysis codes. The EMC platform also makes it possible for users to receive indirect assistance from experts through prefabricated compute environments, potentially reducing study “ramp up” times.

  16. The MyHealthService approach for chronic disease management based on free open source software and low cost components.

    PubMed

    Vognild, Lars K; Burkow, Tatjana M; Luque, Luis Fernandez

    2009-01-01

    In this paper we present an approach to building personal health services, supporting follow-up, physical exercise, health education, and psychosocial support for the chronically ill, based on free open source software and low-cost computers, mobile devices, and consumer health and fitness devices. We argue that this will lower the cost of the systems, which is important given the increasing number of people with chronic diseases and limited healthcare budgets.

  17. A parallel-processing approach to computing for the geographic sciences; applications and systems enhancements

    USGS Publications Warehouse

    Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Liu, Shu-Guang; Nichols, Erin; Haga, Jim; Maddox, Brian; Bilderback, Chris; Feller, Mark; Homer, George

    2001-01-01

    The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost, personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting information science research into parallel computing systems and applications.

  18. More efficient optimization of long-term water supply portfolios

    NASA Astrophysics Data System (ADS)

    Kirsch, Brian R.; Characklis, Gregory W.; Dillard, Karen E. M.; Kelley, C. T.

    2009-03-01

    The use of temporary transfers, such as options and leases, has grown as utilities attempt to meet increases in demand while reducing dependence on the expansion of costly infrastructure capacity (e.g., reservoirs). Earlier work has been done to construct optimal portfolios comprising firm capacity and transfers, using decision rules that determine the timing and volume of transfers. However, such work has only focused on the short-term (e.g., 1-year scenarios), which limits the utility of these planning efforts. Developing multiyear portfolios can lead to the exploration of a wider range of alternatives but also increases the computational burden. This work utilizes a coupled hydrologic-economic model to simulate the long-term performance of a city's water supply portfolio. This stochastic model is linked with an optimization search algorithm that is designed to handle the high-frequency, low-amplitude noise inherent in many simulations, particularly those involving expected values. This noise is detrimental to the accuracy and precision of the optimized solution and has traditionally been controlled by investing greater computational effort in the simulation. However, the increased computational effort can be substantial. This work describes the integration of a variance reduction technique (control variate method) within the simulation/optimization as a means of more efficiently identifying minimum cost portfolios. Random variation in model output (i.e., noise) is moderated using knowledge of random variations in stochastic input variables (e.g., reservoir inflows, demand), thereby reducing the computing time by 50% or more. Using these efficiency gains, water supply portfolios are evaluated over a 10-year period in order to assess their ability to reduce costs and adapt to demand growth, while still meeting reliability goals. As a part of the evaluation, several multiyear option contract structures are explored and compared.
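
    The control variate idea can be sketched on a toy cost model: if a noisy simulation output is correlated with a stochastic input whose mean is known (e.g., reservoir inflow), subtracting the scaled input deviation preserves the mean but shrinks the variance. Everything in the example below (the linear cost model, the inflow statistics) is an assumption for illustration, not the coupled hydrologic-economic simulator.

```python
# Illustrative control-variate sketch with a toy cost model. The idea: if a
# noisy output Y is correlated with an input X whose mean is known, the
# estimator Y - b*(X - E[X]) has the same mean but lower variance.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
inflow = rng.normal(100.0, 20.0, n)              # stochastic input, known mean 100
noise = rng.normal(0.0, 5.0, n)
cost = 500.0 - 3.0 * inflow + noise              # toy portfolio cost responds to inflow

b = np.cov(cost, inflow)[0, 1] / np.var(inflow)  # near-optimal control-variate coefficient
cv_estimate = cost - b * (inflow - 100.0)

print("plain MC std error  :", cost.std(ddof=1) / np.sqrt(n))
print("control variate std :", cv_estimate.std(ddof=1) / np.sqrt(n))
```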

  19. Autonomic Closure for Turbulent Flows Using Approximate Bayesian Computation

    NASA Astrophysics Data System (ADS)

    Doronina, Olga; Christopher, Jason; Hamlington, Peter; Dahm, Werner

    2017-11-01

    Autonomic closure is a new technique for achieving fully adaptive and physically accurate closure of coarse-grained turbulent flow governing equations, such as those solved in large eddy simulations (LES). Although autonomic closure has been shown in recent a priori tests to more accurately represent unclosed terms than do dynamic versions of traditional LES models, the computational cost of the approach makes it challenging to implement for simulations of practical turbulent flows at realistically high Reynolds numbers. The optimization step used in the approach introduces large matrices that must be inverted and is highly memory intensive. In order to reduce memory requirements, here we propose to use approximate Bayesian computation (ABC) in place of the optimization step, thereby yielding a computationally-efficient implementation of autonomic closure that trades memory-intensive for processor-intensive computations. The latter challenge can be overcome as co-processors such as general purpose graphical processing units become increasingly available on current generation petascale and exascale supercomputers. In this work, we outline the formulation of ABC-enabled autonomic closure and present initial results demonstrating the accuracy and computational cost of the approach.
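
    A minimal rejection-sampling version of ABC, sketched below on a one-dimensional toy problem, shows the memory-light idea: candidate parameters are kept only if their simulated summary statistic lands within a tolerance of the observed one, with no large matrix inversion. The prior, summary statistic, and tolerance are assumptions for illustration, not the autonomic-closure formulation.

```python
# Minimal ABC rejection sketch (a toy 1-D problem, not the autonomic-closure
# formulation): accept parameter samples whose simulated summary statistic
# falls within a tolerance of the observed one, avoiding the large matrix
# inversions of an explicit optimization step.
import numpy as np

rng = np.random.default_rng(42)
theta_true = 1.7
observed = rng.normal(theta_true, 0.5, 200).mean()    # observed summary statistic

def simulate_summary(theta):
    return rng.normal(theta, 0.5, 200).mean()

tolerance, accepted = 0.05, []
for theta in rng.uniform(0.0, 4.0, 20_000):            # samples from a uniform prior
    if abs(simulate_summary(theta) - observed) < tolerance:
        accepted.append(theta)

print("posterior mean ~", np.mean(accepted), "from", len(accepted), "accepted samples")
```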

  20. Data Management Standards in Computer-aided Acquisition and Logistic Support (CALS)

    NASA Technical Reports Server (NTRS)

    Jefferson, David K.

    1990-01-01

    Viewgraphs and discussion on data management standards in computer-aided acquisition and logistic support (CALS) are presented. CALS is intended to reduce cost, increase quality, and improve timeliness of weapon system acquisition and support by greatly improving the flow of technical information. The phase 2 standards, industrial environment, are discussed. The information resource dictionary system (IRDS) is described.

  1. A Web-Based Computer-Aided Learning Module for an Anatomy Course Using Open Source Image Mapping Software

    ERIC Educational Resources Information Center

    Carleton, Renee E.

    2012-01-01

    Computer-aided learning (CAL) is used increasingly to teach anatomy in post-secondary programs. Studies show that augmentation of traditional cadaver dissection and model examination by CAL can be associated with positive student learning outcomes. In order to reduce costs associated with the purchase of skeletons and models and to encourage study…

  2. An innovative on-board processor for lightsats

    NASA Technical Reports Server (NTRS)

    Henshaw, R. M.; Ballard, B. W.; Hayes, J. R.; Lohr, D. A.

    1990-01-01

    The Applied Physics Laboratory (APL) has developed a flightworthy custom microprocessor that increases capability and reduces development costs of lightsat science instruments. This device, called the FRISC (FORTH Reduced Instruction Set Computer), directly executes the high-level language called FORTH, which is ideally suited to the multitasking control and data processing environment of a spaceborne instrument processor. The FRISC will be flown as the onboard processor in the Magnetic Field Experiment on the Freja satellite. APL has achieved a significant increase in onboard processing capability with no increase in cost when compared to the magnetometer instrument on Freja's predecessor, the Viking satellite.

  3. Use of optimization to predict the effect of selected parameters on commuter aircraft performance

    NASA Technical Reports Server (NTRS)

    Wells, V. L.; Shevell, R. S.

    1982-01-01

    An optimizing computer program determined the turboprop aircraft with lowest direct operating cost for various sets of cruise speed and field length constraints. External variables included wing area, wing aspect ratio and engine sea level static horsepower; tail sizes, climb speed and cruise altitude were varied within the function evaluation program. Direct operating cost was minimized for a 150 n.mi typical mission. Generally, DOC increased with increasing speed and decreasing field length but not by a large amount. Ride roughness, however, increased considerably as speed became higher and field length became shorter.

  4. Static Memory Deduplication for Performance Optimization in Cloud Computing.

    PubMed

    Jia, Gangyong; Han, Guangjie; Wang, Hao; Yang, Xuan

    2017-04-27

    In a cloud computing environment, the number of virtual machines (VMs) on a single physical server and the number of applications running on each VM are continuously growing. This has led to an enormous increase in the demand of memory capacity and subsequent increase in the energy consumption in the cloud. Lack of enough memory has become a major bottleneck for scalability and performance of virtualization interfaces in cloud computing. To address this problem, memory deduplication techniques which reduce memory demand through page sharing are being adopted. However, such techniques suffer from overheads in terms of number of online comparisons required for the memory deduplication. In this paper, we propose a static memory deduplication (SMD) technique which can reduce memory capacity requirement and provide performance optimization in cloud computing. The main innovation of SMD is that the process of page detection is performed offline, thus potentially reducing the performance cost, especially in terms of response time. In SMD, page comparisons are restricted to the code segment, which has the highest shared content. Our experimental results show that SMD efficiently reduces memory capacity requirement and improves performance. We demonstrate that, compared to other approaches, the cost in terms of the response time is negligible.
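
    Conceptually, offline page deduplication can be sketched as hashing code-segment pages and byte-verifying hash-equal candidates, as below. This is an illustrative model only; a real SMD implementation operates on physical page frames inside the hypervisor rather than on Python byte strings.

```python
# Conceptual sketch of offline page deduplication restricted to code pages
# (illustrative only). Identical pages are found by hashing, so only
# hash-equal pages need a full byte-wise comparison.
import hashlib
from collections import defaultdict

PAGE_SIZE = 4096

def dedup_candidates(code_pages):
    """code_pages: list of 4 KiB byte strings from VM code segments.
    Returns groups of page indices whose contents are identical."""
    buckets = defaultdict(list)
    for idx, page in enumerate(code_pages):
        buckets[hashlib.sha1(page).digest()].append(idx)
    # confirm with a byte-wise check to guard against hash collisions
    return [idxs for idxs in buckets.values()
            if len(idxs) > 1 and all(code_pages[i] == code_pages[idxs[0]] for i in idxs)]

pages = [b"\x90" * PAGE_SIZE, b"\x90" * PAGE_SIZE, b"\xcc" * PAGE_SIZE]
print(dedup_candidates(pages))   # -> [[0, 1]]
```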

  5. Static Memory Deduplication for Performance Optimization in Cloud Computing

    PubMed Central

    Jia, Gangyong; Han, Guangjie; Wang, Hao; Yang, Xuan

    2017-01-01

    In a cloud computing environment, the number of virtual machines (VMs) on a single physical server and the number of applications running on each VM are continuously growing. This has led to an enormous increase in the demand of memory capacity and subsequent increase in the energy consumption in the cloud. Lack of enough memory has become a major bottleneck for scalability and performance of virtualization interfaces in cloud computing. To address this problem, memory deduplication techniques which reduce memory demand through page sharing are being adopted. However, such techniques suffer from overheads in terms of number of online comparisons required for the memory deduplication. In this paper, we propose a static memory deduplication (SMD) technique which can reduce memory capacity requirement and provide performance optimization in cloud computing. The main innovation of SMD is that the process of page detection is performed offline, thus potentially reducing the performance cost, especially in terms of response time. In SMD, page comparisons are restricted to the code segment, which has the highest shared content. Our experimental results show that SMD efficiently reduces memory capacity requirement and improves performance. We demonstrate that, compared to other approaches, the cost in terms of the response time is negligible. PMID:28448434

  6. Lockheed Martin Idaho Technologies Company information management technology architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, M.J.; Lau, P.K.S.

    1996-05-01

    The Information Management Technology Architecture (TA) is being driven by the business objectives of reducing costs and improving effectiveness. The strategy is to reduce the cost of computing through standardization. The Lockheed Martin Idaho Technologies Company (LMITCO) TA is a set of standards and products for use at the Idaho National Engineering Laboratory (INEL). The TA will provide direction for information management resource acquisitions, development of information systems, formulation of plans, and resolution of issues involving LMITCO computing resources. Exceptions to the preferred products may be granted by the Information Management Executive Council (IMEC). Certain implementation and deployment strategies are inherent in the design and structure of the LMITCO TA. These include: migration from centralized toward distributed computing; deployment of the networks, servers, and other information technology infrastructure components necessary for a more integrated information technology support environment; increased emphasis on standards to make it easier to link systems and to share information; and improved use of the company's investment in desktop computing resources. The intent is for the LMITCO TA to be a living document, constantly reviewed to take advantage of industry directions to reduce costs while balancing technological diversity with business flexibility.

  7. Extension of the ADjoint Approach to a Laminar Navier-Stokes Solver

    NASA Astrophysics Data System (ADS)

    Paige, Cody

    The use of adjoint methods is common in computational fluid dynamics to reduce the cost of the sensitivity analysis in an optimization cycle. The forward mode ADjoint is a combination of an adjoint sensitivity analysis method with a forward mode automatic differentiation (AD) and is a modification of the reverse mode ADjoint method proposed by Mader et al.[1]. A colouring acceleration technique is presented to reduce the computational cost increase associated with forward mode AD. The forward mode AD facilitates the implementation of the laminar Navier-Stokes (NS) equations. The forward mode ADjoint method is applied to a three-dimensional computational fluid dynamics solver. The resulting Euler and viscous ADjoint sensitivities are compared to the reverse mode Euler ADjoint derivatives and a complex-step method to demonstrate the reduced computational cost and accuracy. Both comparisons demonstrate the benefits of the colouring method and the practicality of using a forward mode AD. [1] Mader, C.A., Martins, J.R.R.A., Alonso, J.J., and van der Weide, E. (2008) ADjoint: An approach for the rapid development of discrete adjoint solvers. AIAA Journal, 46(4):863-873. doi:10.2514/1.29123.
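
    For readers unfamiliar with forward-mode AD, the dual-number sketch below shows how derivatives propagate alongside values in a single forward sweep. It is purely illustrative and contains none of the thesis's colouring technique or Navier-Stokes terms.

```python
# Tiny forward-mode AD sketch using dual numbers (illustrative of the forward
# mode underlying the ADjoint approach; not the thesis code).
class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot          # value and derivative seed
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)
    __rmul__ = __mul__

def f(x, y):
    return 3 * x * x * y + y      # d/dx = 6*x*y, d/dy = 3*x^2 + 1

x, y = Dual(2.0, 1.0), Dual(5.0, 0.0)    # seed x to obtain df/dx
print(f(x, y).dot)                        # -> 60.0
```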

  8. Reducing Communication in Algebraic Multigrid Using Additive Variants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vassilevski, Panayot S.; Yang, Ulrike Meier

    Algebraic multigrid (AMG) has proven to be an effective scalable solver on many high performance computers. However, its increasing communication complexity on coarser levels has been shown to seriously impact its performance on computers with high communication cost. Additive AMG variants provide increased parallelism and decreased numbers of messages per cycle, but generally exhibit slower convergence. Here we present various new additive variants with convergence rates that are significantly improved compared to the classical additive algebraic multigrid method and investigate their potential for decreased communication and improved communication-computation overlap, features that are essential for good performance on future exascale architectures.

  9. Reducing Communication in Algebraic Multigrid Using Additive Variants

    DOE PAGES

    Vassilevski, Panayot S.; Yang, Ulrike Meier

    2014-02-12

    Algebraic multigrid (AMG) has proven to be an effective scalable solver on many high performance computers. However, its increasing communication complexity on coarser levels has been shown to seriously impact its performance on computers with high communication cost. Additive AMG variants provide increased parallelism and decreased numbers of messages per cycle, but generally exhibit slower convergence. Here we present various new additive variants with convergence rates that are significantly improved compared to the classical additive algebraic multigrid method and investigate their potential for decreased communication and improved communication-computation overlap, features that are essential for good performance on future exascale architectures.

  10. Double reading.

    PubMed

    Kopans, D B

    2000-07-01

    Clearly, the cost of double reading varies with the approach used. The Massachusetts General Hospital method can only lead to an increase in recalls and the costs that these engender (anxiety for the women recalled, trauma from any biopsies obtained, and the actual monetary costs of additional imaging and interventions). It is of interest that one potential cost, the concern that women recalled may be reluctant to participate again in screening, does not seem to be the case. Women who are recalled appear to be more likely to participate in future screening. Double interpretation where there must be a consensus between the interpreting radiologists, and if this cannot be reached a third arbiter, is the most labor intensive, but can reduce the number of recalls in a double reading system. Computer systems have been developed to act as a second reader. The films must be digitized and then fed through the reader, but studies suggest that the computer can identify cancers that may be overlooked by a human reader. The challenge is to do this without too many false-positive calls. If the radiologist finds the false-positives are too numerous and distracting, then the system is not used. As digital mammographic systems proliferate, and computer algorithms become more sophisticated, the second human reader will likely be replaced by a computer-aided detection system and double reading will become the norm.

  11. Cost Considerations in Nonlinear Finite-Element Computing

    NASA Technical Reports Server (NTRS)

    Utku, S.; Melosh, R. J.; Islam, M.; Salama, M.

    1985-01-01

    Conference paper discusses computational requirements for finite-element analysis using a quasi-linear approach to nonlinear problems. Paper evaluates computational efficiency of different computer architectural types in terms of relative cost and computing time.

  12. Multicore Programming Challenges

    NASA Astrophysics Data System (ADS)

    Perrone, Michael

    The computer industry is facing fundamental challenges that are driving a major change in the design of computer processors. Due to restrictions imposed by quantum physics, one historical path to higher computer processor performance - by increased clock frequency - has come to an end. Increasing clock frequency now leads to power consumption costs that are too high to justify. As a result, we have seen in recent years that the processor frequencies have peaked and are receding from their high point. At the same time, competitive market conditions are giving business advantage to those companies that can field new streaming applications, handle larger data sets, and update their models to market conditions faster. The desire for newer, faster and larger is driving continued demand for higher computer performance.

  13. Assessing Tax Form Distribution Costs: A Proposed Method for Computing the Dollar Value of Tax Form Distribution in a Public Library.

    ERIC Educational Resources Information Center

    Casey, James B.

    1998-01-01

    Explains how a public library can compute the actual cost of distributing tax forms to the public by listing all direct and indirect costs and demonstrating the formulae and necessary computations. Supplies directions for calculating costs involved for all levels of staff as well as associated public relations efforts, space, and utility costs.…

  14. CUDAICA: GPU Optimization of Infomax-ICA EEG Analysis

    PubMed Central

    Raimondo, Federico; Kamienkowski, Juan E.; Sigman, Mariano; Fernandez Slezak, Diego

    2012-01-01

    In recent years, Independent Component Analysis (ICA) has become a standard method for identifying relevant dimensions of data in neuroscience. ICA is a very reliable way to analyze data, but it is computationally very costly, making its use for online analysis, as in brain-computer interfaces, almost completely prohibitive. We show a roughly 25-fold speedup of ICA at almost no cost (a fast video card). EEG data, which consist of many independent signals repeated across multiple channels, are very well suited to processing on the vector processors included in graphics units. We profiled the implementation of this algorithm and detected two main types of operations responsible for the processing bottleneck, taking almost 80% of computing time: vector-matrix and matrix-matrix multiplications. Simply replacing calls to basic linear algebra functions with the standard CUBLAS routines provided by GPU manufacturers does not increase performance, owing to CUDA kernel launch overhead. Instead, we developed a GPU-based solution that, compared with the original BLAS and CUBLAS versions, obtains a 25x increase in performance for the ICA calculation. PMID:22811699

  15. Software Solution Saves Dollars

    ERIC Educational Resources Information Center

    Trotter, Andrew

    2004-01-01

    This article discusses computer software that can give classrooms and computer labs the capabilities of costly PC's at a small fraction of the cost. A growing number of cost-conscious school districts are finding budget relief in low-cost computer software known as "open source" that can do everything from manage school Web sites to equip…

  16. Design and implementation of a reliable and cost-effective cloud computing infrastructure: the INFN Napoli experience

    NASA Astrophysics Data System (ADS)

    Capone, V.; Esposito, R.; Pardi, S.; Taurino, F.; Tortone, G.

    2012-12-01

    Over the last few years we have seen an increasing number of services and applications needed to manage and maintain cloud computing facilities. This is particularly true for computing in high energy physics, which often requires complex configurations and distributed infrastructures. In this scenario a cost effective rationalization and consolidation strategy is the key to success in terms of scalability and reliability. In this work we describe an IaaS (Infrastructure as a Service) cloud computing system, with high availability and redundancy features, which is currently in production at INFN-Naples and ATLAS Tier-2 data centre. The main goal we intended to achieve was a simplified method to manage our computing resources and deliver reliable user services, reusing existing hardware without incurring heavy costs. A combined usage of virtualization and clustering technologies allowed us to consolidate our services on a small number of physical machines, reducing electric power costs. As a result of our efforts we developed a complete solution for data and computing centres that can be easily replicated using commodity hardware. Our architecture consists of 2 main subsystems: a clustered storage solution, built on top of disk servers running GlusterFS file system, and a virtual machines execution environment. GlusterFS is a network file system able to perform parallel writes on multiple disk servers, providing this way live replication of data. High availability is also achieved via a network configuration using redundant switches and multiple paths between hypervisor hosts and disk servers. We also developed a set of management scripts to easily perform basic system administration tasks such as automatic deployment of new virtual machines, adaptive scheduling of virtual machines on hypervisor hosts, live migration and automated restart in case of hypervisor failures.

  17. 20 CFR 404.274 - What are the measuring periods we use to calculate cost-of-living increases?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... ADMINISTRATION FEDERAL OLD-AGE, SURVIVORS AND DISABILITY INSURANCE (1950- ) Computing Primary Insurance Amounts... increase is effective; or (ii) The third calendar quarter of any year in which the last automatic increase became effective. (2) When the period ends. The measuring period ends with the third calendar quarter of...

  18. Electricity from fossil fuels without CO2 emissions: assessing the costs of carbon dioxide capture and sequestration in U.S. electricity markets.

    PubMed

    Johnson, T L; Keith, D W

    2001-10-01

    The decoupling of fossil-fueled electricity production from atmospheric CO2 emissions via CO2 capture and sequestration (CCS) is increasingly regarded as an important means of mitigating climate change at a reasonable cost. Engineering analyses of CO2 mitigation typically compare the cost of electricity for a base generation technology to that for a similar plant with CO2 capture and then compute the carbon emissions mitigated per unit of cost. It can be hard to interpret mitigation cost estimates from this plant-level approach when a consistent base technology cannot be identified. In addition, neither engineering analyses nor general equilibrium models can capture the economics of plant dispatch. A realistic assessment of the costs of carbon sequestration as an emissions abatement strategy in the electric sector therefore requires a systems-level analysis. We discuss various frameworks for computing mitigation costs and introduce a simplified model of electric sector planning. Results from a "bottom-up" engineering-economic analysis for a representative U.S. North American Electric Reliability Council (NERC) region illustrate how the penetration of CCS technologies and the dispatch of generating units vary with the price of carbon emissions and thereby determine the relationship between mitigation cost and emissions reduction.
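
    The plant-level mitigation cost described here is typically computed as the increment in the cost of electricity divided by the emissions avoided, as in the sketch below; the numbers are illustrative assumptions, not results from the paper.

```python
# Plant-level mitigation cost as described in the abstract: the increase in
# cost of electricity divided by the emissions avoided. The numbers below are
# purely illustrative assumptions, not results from the paper.
def mitigation_cost(coe_base, coe_capture, em_base, em_capture):
    """coe_*: cost of electricity ($/MWh); em_*: emissions (tCO2/MWh).
    Returns dollars per tonne of CO2 avoided."""
    return (coe_capture - coe_base) / (em_base - em_capture)

print(mitigation_cost(coe_base=50.0, coe_capture=75.0,
                      em_base=0.80, em_capture=0.10), "$/tCO2 avoided")
```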

  19. Electricity from Fossil Fuels without CO2 Emissions: Assessing the Costs of Carbon Dioxide Capture and Sequestration in U.S. Electricity Markets.

    PubMed

    Johnson, Timothy L; Keith, David W

    2001-10-01

    The decoupling of fossil-fueled electricity production from atmospheric CO2 emissions via CO2 capture and sequestration (CCS) is increasingly regarded as an important means of mitigating climate change at a reasonable cost. Engineering analyses of CO2 mitigation typically compare the cost of electricity for a base generation technology to that for a similar plant with CO2 capture and then compute the carbon emissions mitigated per unit of cost. It can be hard to interpret mitigation cost estimates from this plant-level approach when a consistent base technology cannot be identified. In addition, neither engineering analyses nor general equilibrium models can capture the economics of plant dispatch. A realistic assessment of the costs of carbon sequestration as an emissions abatement strategy in the electric sector therefore requires a systems-level analysis. We discuss various frameworks for computing mitigation costs and introduce a simplified model of electric sector planning. Results from a "bottom-up" engineering-economic analysis for a representative U.S. North American Electric Reliability Council (NERC) region illustrate how the penetration of CCS technologies and the dispatch of generating units vary with the price of carbon emissions and thereby determine the relationship between mitigation cost and emissions reduction.

  20. Symposium on the Interface: Computing Science and Statistics (20th). Theme: Computationally Intensive Methods in Statistics Held in Reston, Virginia on April 20-23, 1988

    DTIC Science & Technology

    1988-08-20

    William A. Link, Patuxent Wildlife Research Center. "Increasing Reliability of Multiversion Fault-Tolerant Software Design by Modularization," Junryo Miyashita, Department of Computer Science, California State University at San Bernardino. Such designs shall be referred to as "multiversion fault-tolerant software design"; one problem of developing multiple versions of a program is the high cost

  1. Computer-assisted Behavioral Therapy and Contingency Management for Cannabis Use Disorder

    PubMed Central

    Budney, Alan J.; Stanger, Catherine; Tilford, J. Mick; Scherer, Emily; Brown, Pamela C.; Li, Zhongze; Li, Zhigang; Walker, Denise

    2015-01-01

    Computer-assisted behavioral treatments hold promise for enhancing access to and reducing costs of treatments for substance use disorders. This study assessed the efficacy of a computer-assisted version of an efficacious, multicomponent treatment for cannabis use disorders (CUD), i.e., motivational enhancement therapy, cognitive-behavioral therapy, and abstinence-based contingency-management (MET/CBT/CM). An initial cost comparison was also performed. Seventy-five adult participants, 59% African Americans, seeking treatment for CUD received either, MET only (BRIEF), therapist-delivered MET/CBT/CM (THERAPIST), or computer-delivered MET/CBT/CM (COMPUTER). During treatment, the THERAPIST and COMPUTER conditions engendered longer durations of continuous cannabis abstinence than BRIEF (p < .05), but did not differ from each other. Abstinence rates and reduction in days of use over time were maintained in COMPUTER at least as well as in THERAPIST. COMPUTER averaged approximately $130 (p < .05) less per case than THERAPIST in therapist costs, which offset most of the costs of CM. Results add to promising findings that illustrate potential for computer-assisted delivery methods to enhance access to evidence-based care, reduce costs, and possibly improve outcomes. The observed maintenance effects and the cost findings require replication in larger clinical trials. PMID:25938629

  2. Can disease management target patients most likely to generate high costs? The impact of comorbidity.

    PubMed

    Charlson, Mary; Charlson, Robert E; Briggs, William; Hollenberg, James

    2007-04-01

    Disease management programs are increasingly used to manage costs of patients with chronic disease. We sought to examine the clinical characteristics and measure the health care expenditures of patients most likely to be targeted by disease management programs. Retrospective analysis of prospectively obtained data. A general medicine practice with both faculty and residents at an urban academic medical center. Five thousand eight hundred sixty-one patients enrolled in the practice for at least 1 year. Annual cost of diseases targeted by disease management. Patients' clinical and demographic information were collected from a computer system used to manage patients. Data included diagnostic information, medications, and resource usage over 1 year. We looked at 10 common diseases targeted by disease management programs. Unadjusted annual median costs for chronic diseases ranged between $1,100 and $1,500. Congestive heart failure ($1,500), stroke ($1,500), diabetes ($1,500), and cancer ($1,400) were the most expensive. As comorbidity increased, annual adjusted costs increased exponentially. Those with comorbidity scores of 2 or more accounted for 26% of the population but 50% of the overall costs. Costs for individual chronic conditions vary within a relatively narrow range. However, the costs for patients with multiple coexisting medical conditions increase rapidly. Reducing health care costs will require focusing on patients with multiple comorbid diseases, not just single diseases. The overwhelming impact of comorbidity on costs raises significant concerns about the potential ability of disease management programs to limit the costs of care.

  3. AMDTreat 5.0+ with PHREEQC titration module to compute caustic chemical quantity, effluent quality, and sludge volume

    USGS Publications Warehouse

    Cravotta, Charles A.; Means, Brent P; Arthur, Willam; McKenzie, Robert M; Parkhurst, David L.

    2015-01-01

    Alkaline chemicals are commonly added to discharges from coal mines to increase pH and decrease concentrations of acidity and dissolved aluminum, iron, manganese, and associated metals. The annual cost of chemical treatment depends on the type and quantities of chemicals added and sludge produced. The AMDTreat computer program, initially developed in 2003, is widely used to compute such costs on the basis of the user-specified flow rate and water quality data for the untreated AMD. Although AMDTreat can use results of empirical titration of net-acidic or net-alkaline effluent with caustic chemicals to accurately estimate costs for treatment, such empirical data are rarely available. A titration simulation module using the geochemical program PHREEQC has been incorporated with AMDTreat 5.0+ to improve the capability of AMDTreat to estimate: (1) the quantity and cost of caustic chemicals to attain a target pH, (2) the chemical composition of the treated effluent, and (3) the volume of sludge produced by the treatment. The simulated titration results for selected caustic chemicals (NaOH, CaO, Ca(OH)2, Na2CO3, or NH3) without aeration or with pre-aeration can be compared with or used in place of empirical titration data to estimate chemical quantities, treated effluent composition, sludge volume (precipitated metals plus unreacted chemical), and associated treatment costs. This paper describes the development, evaluation, and potential utilization of the PHREEQC titration module with the new AMDTreat 5.0+ computer program available at http://www.amd.osmre.gov/.

  4. Towards a Cloud Computing Environment: Near Real-time Cloud Product Processing and Distribution for Next Generation Satellites

    NASA Astrophysics Data System (ADS)

    Nguyen, L.; Chee, T.; Minnis, P.; Palikonda, R.; Smith, W. L., Jr.; Spangenberg, D.

    2016-12-01

    The NASA LaRC Satellite ClOud and Radiative Property retrieval System (SatCORPS) processes and derives near real-time (NRT) global cloud products from operational geostationary satellite imager datasets. These products are being used in NRT to improve forecast models and aircraft icing warnings, and to support aircraft field campaigns. Next generation satellites, such as the Japanese Himawari-8 and the upcoming NOAA GOES-R, present challenges for NRT data processing and product dissemination due to the increase in temporal and spatial resolution. The volume of data is expected to increase approximately 10-fold. This increase in data volume will require additional IT resources to keep up with the processing demands to satisfy NRT requirements. In addition, these resources are not readily available due to cost and other technical limitations. To anticipate and meet these computing resource requirements, we have employed a hybrid cloud computing environment to augment the generation of SatCORPS products. This paper will describe the workflow to ingest, process, and distribute SatCORPS products and the technologies used. Lessons learned from working on both AWS Clouds and GovCloud will be discussed: benefits, similarities, and differences that could impact the decision to use cloud computing and storage. A detailed cost analysis will be presented. In addition, future cloud utilization, parallelization, and architecture layout will be discussed for GOES-R.

  5. Automotive displays and controls : existing technology and future trends

    DOT National Transportation Integrated Search

    1987-11-01

    This report presents overview information on high-technology displays and controls that are having a substantial effect on the driving environment. Advances in electronics and computers, in addition to cost advantages, increase the technologies...

  6. Use of computers in dysmorphology.

    PubMed Central

    Diliberti, J H

    1988-01-01

    As a consequence of the increasing power and decreasing cost of digital computers, dysmorphologists have begun to explore a wide variety of computerised applications in clinical genetics. Of considerable interest are developments in the areas of syndrome databases, expert systems, literature searches, image processing, and pattern recognition. Each of these areas is reviewed from the perspective of the underlying computer principles, existing applications, and the potential for future developments. Particular emphasis is placed on the analysis of the tasks performed by the dysmorphologist and the design of appropriate tools to facilitate these tasks. In this context the computer and associated software are considered paradigmatically as tools for the dysmorphologist and should be designed accordingly. Continuing improvements in the ability of computers to manipulate vast amounts of data rapidly makes the development of increasingly powerful tools for the dysmorphologist highly probable. PMID:3050092

  7. Municipal solid waste transportation optimisation with vehicle routing approach: case study of Pontianak City, West Kalimantan

    NASA Astrophysics Data System (ADS)

    Kamal, M. A.; Youlla, D.

    2018-03-01

    Municipal solid waste (MSW) transportation in Pontianak City has become an issue that needs to be tackled by the relevant agencies. The MSW transportation service in Pontianak City currently requires substantial resources, especially vehicles. Increasing the number of fleets has not been able to raise service levels, while garbage volume grows every year along with population growth. In this research, a vehicle routing optimization approach was used to find optimal, cost-efficient vehicle routes for transporting garbage from several Temporary Garbage Dumps (TGD) to the Final Garbage Dump (FGD). One of the problems of MSW transportation is that a TGD may exceed the vehicle capacity and therefore must be visited more than once. The optimal computation results suggest that the municipal authorities use only 3 of the 5 vehicles provided, with a total minimum cost of IDR 778,870. The computation time required to find the optimal routes and minimal cost is considerable; it is influenced by the number of constraints and the number of integer decision variables.
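
    A greedy nearest-neighbor heuristic, sketched below, conveys the capacitated-routing structure of the problem: each vehicle leaves the depot, repeatedly visits the closest dump whose remaining demand still fits, and returns when full. The coordinates, demands, and capacity are made up for illustration, and this heuristic is not the exact optimization used in the study.

```python
# Toy capacitated-routing sketch (a greedy nearest-neighbor heuristic, not the
# exact optimization used in the study). Coordinates, demands, and capacity
# are made-up illustrations.
import math

depot = (0.0, 0.0)
tgd = {"A": ((2, 1), 4), "B": ((5, 2), 6), "C": ((1, 4), 3), "D": ((6, 5), 5)}
capacity = 10

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

remaining = {k: d for k, (_, d) in tgd.items()}
routes = []
while any(remaining.values()):
    load, pos, route = 0, depot, []
    while True:
        candidates = [k for k, d in remaining.items() if 0 < d <= capacity - load]
        if not candidates:
            break                                    # vehicle full or nothing left
        nxt = min(candidates, key=lambda k: dist(pos, tgd[k][0]))
        route.append(nxt)
        load += remaining[nxt]
        remaining[nxt] = 0
        pos = tgd[nxt][0]
    routes.append(route)

print(routes)   # -> [['A', 'B'], ['C', 'D']] with these toy numbers
```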

  8. Aid to planning the marketing of mining area boundaries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giles, R.H. Jr.

    Reducing trespass, legal costs, and timber and wildlife poaching and increasing control, safety, and security are key reasons why mine land boundaries need to be marked. Accidents may be reduced, especially when associated with blast area boundaries, and in some cases increased income may be gained from hunting and recreational fees on well-marked areas. A BASIC computer program for an IBM-PC has been developed that requires minimum inputs to estimate boundary marking costs. This paper describes the rationale for the program and shows representative outputs. 3 references, 3 tables.

  9. Automated chest-radiography as a triage for Xpert testing in resource-constrained settings: a prospective study of diagnostic accuracy and costs

    NASA Astrophysics Data System (ADS)

    Philipsen, R. H. H. M.; Sánchez, C. I.; Maduskar, P.; Melendez, J.; Peters-Bax, L.; Peter, J. G.; Dawson, R.; Theron, G.; Dheda, K.; van Ginneken, B.

    2015-07-01

    Molecular tests hold great potential for tuberculosis (TB) diagnosis, but are costly, time consuming, and HIV-infected patients are often sputum scarce. Therefore, alternative approaches are needed. We evaluated automated digital chest radiography (ACR) as a rapid and cheap pre-screen test prior to Xpert MTB/RIF (Xpert). 388 suspected TB subjects underwent chest radiography, Xpert and sputum culture testing. Radiographs were analysed by computer software (CAD4TB) and specialist readers, and abnormality scores were allocated. A triage algorithm was simulated in which subjects with a score above a threshold underwent Xpert. We computed sensitivity, specificity, cost per screened subject (CSS), cost per notified TB case (CNTBC) and throughput for different diagnostic thresholds. 18.3% of subjects had culture positive TB. For Xpert alone, sensitivity was 78.9%, specificity 98.1%, CSS $13.09 and CNTBC $90.70. In a pre-screening setting where 40% of subjects would undergo Xpert, CSS decreased to $6.72 and CNTBC to $54.34, with eight TB cases missed and throughput increased from 45 to 113 patients/day. Specialists, on average, read 57% of radiographs as abnormal, reducing CSS ($8.95) and CNTBC ($64.84). ACR pre-screening could substantially reduce costs, and increase daily throughput with few TB cases missed. These data inform public health policy in resource-constrained settings.

  10. Pre-Hardware Optimization of Spacecraft Image Processing Algorithms and Hardware Implementation

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Petrick, David J.; Flatley, Thomas P.; Hestnes, Phyllis; Jentoft-Nilsen, Marit; Day, John H. (Technical Monitor)

    2002-01-01

    Spacecraft telemetry rates and telemetry product complexity have steadily increased over the last decade, presenting a problem for real-time processing by ground facilities. This paper proposes a solution to a related problem for the Geostationary Operational Environmental Spacecraft (GOES-8) image data processing and color picture generation application. Although large supercomputer facilities are the obvious heritage solution, they are very costly, making it imperative to seek a feasible alternative engineering solution at a fraction of the cost. The proposed solution is based on a Personal Computer (PC) platform and a synergy of optimized software algorithms and reconfigurable computing (RC) hardware technologies, such as Field Programmable Gate Arrays (FPGA) and Digital Signal Processors (DSP). It has been shown that this approach can provide superior, inexpensive performance for a chosen application on the ground station or on board a spacecraft.

  11. ENERGY COSTS OF IAQ CONTROL THROUGH INCREASED VENTILATION IN A SMALL OFFICE IN A WARM, HUMID CLIMATE: PARAMETRIC ANALYSIS USING THE DOE-2 COMPUTER MODEL

    EPA Science Inventory

    The report gives results of a series of computer runs using the DOE-2.1E building energy model, simulating a small office in a hot, humid climate (Miami). These simulations assessed the energy and relative humidity (RH) penalties when the outdoor air (OA) ventilation rate is inc...

  12. 24 CFR 908.108 - Cost.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... computer hardware or software, or both, the cost of contracting for those services, or the cost of... operating budget. At the HA's option, the cost of the computer software may include service contracts to...

  13. Cost aware cache replacement policy in shared last-level cache for hybrid memory based fog computing

    NASA Astrophysics Data System (ADS)

    Jia, Gangyong; Han, Guangjie; Wang, Hao; Wang, Feng

    2018-04-01

    Fog computing requires a large main memory capacity to decrease latency and increase the Quality of Service (QoS). However, dynamic random access memory (DRAM), the commonly used random access memory, cannot be included in a fog computing system due to its high power consumption. In recent years, non-volatile memories (NVM) such as Phase-Change Memory (PCM) and Spin-Transfer Torque RAM (STT-RAM), with their low power consumption, have emerged to replace DRAM. Moreover, the currently proposed hybrid main memory, consisting of both DRAM and NVM, has shown promising advantages in terms of scalability and power consumption. However, the drawbacks of NVM, such as long read/write latency, give rise to asymmetric cache miss costs in the hybrid main memory. Current last-level cache (LLC) policies assume a unified miss cost, which results in poor LLC performance and adds to the cost of using NVM. In order to minimize the cache miss cost in the hybrid main memory, we propose a cost aware cache replacement policy (CACRP) that reduces the number of cache misses from NVM and improves the cache performance for a hybrid memory system. Experimental results show that our CACRP improves LLC performance by up to 43.6% (15.5% on average) compared to LRU.
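
    As a toy illustration of the idea (not the authors' CACRP algorithm), the sketch below biases victim selection by an assumed per-backing-memory miss cost, so that blocks whose re-fetch from NVM would be expensive are kept in the last-level cache longer than equally recent DRAM-backed blocks.

      MISS_COST = {"DRAM": 1.0, "NVM": 4.0}   # assumed relative re-fetch costs

      class CostAwareCache:
          def __init__(self, capacity):
              self.capacity = capacity
              self.blocks = {}     # address -> (backing memory, last-use time)
              self.clock = 0

          def access(self, addr, backing):
              self.clock += 1
              hit = addr in self.blocks
              if not hit and len(self.blocks) >= self.capacity:
                  # Evict the block that would be cheapest to re-fetch on a future
                  # miss; among equal-cost blocks, evict the least recently used.
                  victim = min(self.blocks, key=lambda a: (MISS_COST[self.blocks[a][0]],
                                                           self.blocks[a][1]))
                  del self.blocks[victim]
              self.blocks[addr] = (backing, self.clock)
              return hit

      cache = CostAwareCache(capacity=2)
      for addr, mem in [(1, "NVM"), (2, "DRAM"), (3, "DRAM"), (1, "NVM")]:
          print(addr, cache.access(addr, mem))   # the NVM-backed block survives eviction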

  14. Comparison between low-cost marker-less and high-end marker-based motion capture systems for the computer-aided assessment of working ergonomics.

    PubMed

    Patrizi, Alfredo; Pennestrì, Ettore; Valentini, Pier Paolo

    2016-01-01

    The paper deals with the comparison between a high-end marker-based acquisition system and a low-cost marker-less methodology for the assessment of the human posture during working tasks. The low-cost methodology is based on the use of a single Microsoft Kinect V1 device. The high-end acquisition system is the BTS SMART that requires the use of reflective markers to be placed on the subject's body. Three practical working activities involving object lifting and displacement have been investigated. The operational risk has been evaluated according to the lifting equation proposed by the American National Institute for Occupational Safety and Health. The results of the study show that the risk multipliers computed from the two acquisition methodologies are very close for all the analysed activities. In agreement with this outcome, the marker-less methodology based on the Microsoft Kinect V1 device seems very promising for promoting the dissemination of computer-aided assessment of ergonomics while maintaining good accuracy and affordable costs. PRACTITIONER’S SUMMARY: The study is motivated by the increasing interest in on-site working ergonomics assessment. We compared a low-cost marker-less methodology with a high-end marker-based system. We tested them on three different working tasks, assessing the working risk of lifting loads. The two methodologies showed comparable precision in all the investigations.
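
    For reference, the risk multipliers mentioned above come from the revised NIOSH lifting equation, which multiplies a 23 kg load constant by factors derived from the measured posture. A minimal sketch is given below; the frequency and coupling multipliers are table lookups and are passed in as given values, and the example numbers are illustrative, not the study's measurements.

      def recommended_weight_limit(H, V, D, A, FM=1.0, CM=1.0):
          # Revised NIOSH lifting equation. H: horizontal distance (cm), V: vertical
          # height of the hands (cm), D: vertical travel distance (cm), A: asymmetry
          # angle (degrees). FM (frequency) and CM (coupling) come from tables.
          LC = 23.0                              # load constant, kg
          HM = min(1.0, 25.0 / H)                # horizontal multiplier
          VM = 1.0 - 0.003 * abs(V - 75.0)       # vertical multiplier
          DM = min(1.0, 0.82 + 4.5 / D)          # distance multiplier
          AM = 1.0 - 0.0032 * A                  # asymmetry multiplier
          return LC * HM * VM * DM * AM * FM * CM

      def lifting_index(load_kg, **kwargs):
          # LI > 1 indicates an increased risk of lifting-related low-back injury.
          return load_kg / recommended_weight_limit(**kwargs)

      print(round(lifting_index(10.0, H=40, V=50, D=60, A=30), 2))   # e.g. ~0.93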

  15. Implementing electronic identification for performance recording in sheep: II. Cost-benefit analysis in meat and dairy farms.

    PubMed

    Ait-Saidi, A; Caja, G; Salama, A A K; Milán, M J

    2014-12-01

    Costs and secondary benefits of implementing electronic identification (e-ID) for performance recording (i.e., lambing, body weight, inventory, and milk yield) in dairy and meat ewes were assessed by using the results from a previous study in which manual (M), semiautomatic (SA), and automatic (AU) data collection systems were compared. Ewes were identified with visual ear tags and electronic rumen boluses. The M system used visual identification, on-paper data recording, and manual data uploading to a computer. The SA system used e-ID with a handheld reader in which performances were typed and automatically uploaded to a computer. The use of a personal digital assistant (PDA) for recording and automatic data uploading, which transformed the M system into an SA system, was also considered. The AU system was only used for BW recording and consisted of e-ID, automatic data recording in an electronic scale, and uploading to a computer. The cost-benefit study was applied to 2 reference sheep farms of 700 meat ewes, under extensive or intensive production systems, and of 400 dairy ewes, practicing once- or twice-a-day machine milkings. Sensitivity analyses under voluntary and mandatory e-ID scenarios were also included. Benefits of using e-ID for SA or AU performance recording mainly depended on sheep farm purpose, number of test days per year, handheld reader and PDA prices, and flock size. Implementing e-ID for SA and AU performance recording saved approximately 50% of the time required by the M system, and increased the reliability of the data collected. Use of e-ID increased the cost of performance recording in a voluntary e-ID scenario, with savings covering only part of the investment made (15 to 70%). For the mandatory e-ID scenario, in which the cost of e-ID devices was not included, savings paid 100% of the extra costs needed for using e-ID in all farm types and conditions. In both scenarios, the reader price was the most important extra cost (40 to 90%) for implementing e-ID in sheep farms. Calculated savings from using the PDA covered more than 100% of the implementation costs in all types of sheep farms, indicating that this device was cost-effective for sheep performance recording. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  16. Reducing the latency of the Fractal Iterative Method to half an iteration

    NASA Astrophysics Data System (ADS)

    Béchet, Clémentine; Tallon, Michel

    2013-12-01

    The fractal iterative method for atmospheric tomography (FRiM-3D) has been introduced to solve the wavefront reconstruction at the dimensions of an ELT with a low computational cost. Previous studies reported that only 3 iterations of the algorithm are required to provide the best adaptive optics (AO) performance. Nevertheless, any iterative method in adaptive optics suffers from the intrinsic latency induced by the fact that one iteration can start only once the previous one is completed; iterations hardly match the low-latency requirement of the AO real-time computer. We present here a new approach to avoid iterations in the computation of the commands with FRiM-3D, thus allowing a low-latency AO response even at the scale of the European ELT (E-ELT). The method highlights the importance of the "warm-start" strategy in adaptive optics. To our knowledge, this particular way to use the "warm-start" has not been reported before. Furthermore, by removing the requirement of iterating to compute the commands, the reconstruction with FRiM-3D can be simplified and reduced to at most half the computational cost of a classical iteration. Thanks to simulations of both single-conjugate and multi-conjugate AO for the E-ELT, with FRiM-3D on the ESO Octopus simulator, we demonstrate the benefit of this approach. We finally enhance the robustness of this new implementation with respect to increasing measurement noise, wind speed and even modeling errors.

  17. Artificial neural networks using complex numbers and phase encoded weights.

    PubMed

    Michel, Howard E; Awwal, Abdul Ahad S

    2010-04-01

    The model of a simple perceptron using phase-encoded inputs and complex-valued weights is proposed. The aggregation function, activation function, and learning rule for the proposed neuron are derived and applied to Boolean logic functions and simple computer vision tasks. The complex-valued neuron (CVN) is shown to be superior to traditional perceptrons. An improvement of 135% over the theoretical maximum of 104 linearly separable problems (of three variables) solvable by conventional perceptrons is achieved without additional logic, neuron stages, or higher order terms such as those required in polynomial logic gates. The application of CVN in distortion invariant character recognition and image segmentation is demonstrated. Implementation details are discussed, and the CVN is shown to be very attractive for optical implementation since optical computations are naturally complex. The cost of the CVN is less in all cases than the traditional neuron when implemented optically. Therefore, all the benefits of the CVN can be obtained without additional cost. However, on those implementations dependent on standard serial computers, CVN will be more cost effective only in those applications where its increased power can offset the requirement for additional neurons.
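
    A minimal sketch of the flavor of such a neuron is shown below: binary inputs are phase-encoded on the unit circle and aggregated with complex weights, and a single unit then separates XOR, a classic example of a problem a conventional real-valued perceptron cannot solve. The activation rule and weights here are simplifying assumptions for illustration, not the model derived in the paper.

      import numpy as np

      def phase_encode(bits):
          # Map binary inputs {0, 1} onto the unit circle: 0 -> e^{i*0}, 1 -> e^{i*pi}.
          return np.exp(1j * np.pi * np.asarray(bits, dtype=float))

      def cvn(bits, weights, threshold=1.0):
          # Aggregate with complex-valued weights; fire when the weighted phasors
          # nearly cancel (a magnitude-based activation, chosen here for simplicity).
          z = np.dot(weights, phase_encode(bits))
          return int(abs(z) < threshold)

      w = np.array([1.0 + 0j, 1.0 + 0j])
      for b1 in (0, 1):
          for b2 in (0, 1):
              print(b1, b2, cvn([b1, b2], w))   # reproduces XOR: 0 1 1 0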

  18. An evaluation of superminicomputers for thermal analysis

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Vidal, J. B.; Jones, G. K.

    1962-01-01

    The feasibility and cost effectiveness of solving thermal analysis problems on superminicomputers is demonstrated. Conventional thermal analysis and the changing computer environment, computer hardware and software used, six thermal analysis test problems, performance of superminicomputers (CPU time, accuracy, turnaround, and cost) and comparison with large computers are considered. Although the CPU times for superminicomputers were 15 to 30 times greater than the fastest mainframe computer, the minimum cost to obtain the solutions on superminicomputers was from 11 percent to 59 percent of the cost of mainframe solutions. The turnaround (elapsed) time is highly dependent on the computer load, but for large problems, superminicomputers produced results in less elapsed time than a typically loaded mainframe computer.

  19. Colour computer-generated holography for point clouds utilizing the Phong illumination model.

    PubMed

    Symeonidou, Athanasia; Blinder, David; Schelkens, Peter

    2018-04-16

    A technique integrating the bidirectional reflectance distribution function (BRDF) is proposed to generate realistic high-quality colour computer-generated holograms (CGHs). We build on prior work, namely a fast computer-generated holography method for point clouds that handles occlusions. We extend the method by integrating the Phong illumination model so that the properties of the objects' surfaces are taken into account to achieve natural light phenomena such as reflections and shadows. Our experiments show that rendering holograms with the proposed algorithm provides realistic looking objects without any noteworthy increase to the computational cost.
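
    For context, the Phong illumination model referenced above combines ambient, diffuse and specular terms per surface point; a small sketch of how the per-point shaded colour might be evaluated before that point's contribution is added to the hologram is given below. It is only the standard shading formula with assumed coefficients, not the paper's CGH pipeline.

      import numpy as np

      def normalize(v):
          return v / np.linalg.norm(v)

      def phong(point, normal, light_pos, view_pos, colour,
                ka=0.1, kd=0.7, ks=0.2, shininess=32.0):
          # Classic Phong shading: ambient + diffuse + specular contributions.
          N = normalize(normal)
          L = normalize(light_pos - point)            # direction to the light
          V = normalize(view_pos - point)             # direction to the viewer
          R = 2.0 * np.dot(N, L) * N - L              # mirror reflection of L about N
          diffuse  = kd * max(np.dot(N, L), 0.0)
          specular = ks * max(np.dot(R, V), 0.0) ** shininess
          return np.clip((ka + diffuse) * colour + specular, 0.0, 1.0)

      # e.g. a reddish surface point with an overhead, slightly offset light source
      print(phong(np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]),
                  np.array([1.0, 1.0, 2.0]), np.array([0.0, 0.0, 3.0]),
                  np.array([1.0, 0.2, 0.2])))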

  20. COSTMODL: An automated software development cost estimation tool

    NASA Technical Reports Server (NTRS)

    Roush, George B.

    1991-01-01

    The cost of developing computer software continues to consume an increasing portion of many organizations' total budgets, both in the public and private sector. As this trend develops, the capability to produce reliable estimates of the effort and schedule required to develop a candidate software product takes on increasing importance. The COSTMODL program was developed to provide an in-house capability to perform development cost estimates for NASA software projects. COSTMODL is an automated software development cost estimation tool which incorporates five cost estimation algorithms including the latest models for the Ada language and incrementally developed products. The principal characteristic which sets COSTMODL apart from other software cost estimation programs is its capacity to be completely customized to a particular environment. The estimation equations can be recalibrated to reflect the programmer productivity characteristics demonstrated by the user's organization, and the set of significant factors which affect software development costs can be customized to reflect any unique properties of the user's development environment. Careful use of a capability such as COSTMODL can significantly reduce the risk of cost overruns and failed projects.
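
    COSTMODL's own equations are not reproduced in the abstract, but the general shape of the parametric models such tools recalibrate is the COCOMO family, sketched below with the published semi-detached-mode coefficients; the coefficients and cost-driver ratings are illustrative defaults that a user organization would replace with its own calibration.

      def effort_person_months(ksloc, a=3.0, b=1.12, effort_multipliers=()):
          # Effort = a * KSLOC^b * EAF, where the effort adjustment factor (EAF) is
          # the product of cost-driver ratings (e.g. 1.15 for high complexity).
          eaf = 1.0
          for m in effort_multipliers:
              eaf *= m
          return a * ksloc ** b * eaf

      def schedule_months(effort_pm, c=2.5, d=0.35):
          return c * effort_pm ** d

      e = effort_person_months(32, effort_multipliers=(1.15, 0.91))
      print(round(e, 1), round(schedule_months(e), 1))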

  1. Handgrip strength measurement as a predictor of hospitalization costs.

    PubMed

    Guerra, R S; Amaral, T F; Sousa, A S; Pichel, F; Restivo, M T; Ferreira, S; Fonseca, I

    2015-02-01

    Undernutrition status at hospital admission is related to increased hospital costs. Handgrip strength (HGS) is an indicator of undernutrition, but the ability of HGS to predict hospitalization costs has yet to be studied. To explore whether HGS measurement at hospital admission can predict patients' hospitalization costs, a prospective study was conducted in a university hospital. Inpatients' (n=637) HGS and undernutrition status by Patient-Generated Subjective Global Assessment were ascertained. Multivariable linear regression analysis, computing HGS quartiles by sex (reference: fourth quartile, highest), was conducted in order to identify the independent predictors of hospitalization costs. Costs were evaluated through percentage deviation from the mean cost, after adjustment for patients' characteristics, disease severity and undernutrition status. Being in the first or second HGS quartiles at hospital admission increased patients' hospitalization costs, respectively, by 17.5% (95% confidence interval: 2.7-32.3) and 21.4% (7.5-35.3), which translated into an increase from €375 (58-692) to €458 (161-756). After the additional adjustment for undernutrition status, being in the first or second HGS quartiles had, respectively, an economic impact of 16.6% (1.9-31.2) and 20.0% (6.2-33.8), corresponding to an increase in hospitalization expenditure from €356 (41-668) to €428 (133-724). Low HGS at hospital admission is associated with increased hospitalization costs of between 16.6 and 20.0% after controlling for possible confounders, including undernutrition status. HGS is an inexpensive, noninvasive and easy-to-use method that has clinical potential to predict hospitalization costs.

  2. Fuzzy logic, neural networks, and soft computing

    NASA Technical Reports Server (NTRS)

    Zadeh, Lotfi A.

    1994-01-01

    The past few years have witnessed a rapid growth of interest in a cluster of modes of modeling and computation which may be described collectively as soft computing. The distinguishing characteristic of soft computing is that its primary aims are to achieve tractability, robustness, low cost, and high MIQ (machine intelligence quotient) through an exploitation of the tolerance for imprecision and uncertainty. Thus, in soft computing what is usually sought is an approximate solution to a precisely formulated problem or, more typically, an approximate solution to an imprecisely formulated problem. A simple case in point is the problem of parking a car. Generally, humans can park a car rather easily because the final position of the car is not specified exactly. If it were specified to within, say, a few millimeters and a fraction of a degree, it would take hours or days of maneuvering and precise measurements of distance and angular position to solve the problem. What this simple example points to is the fact that, in general, high precision carries a high cost. The challenge, then, is to exploit the tolerance for imprecision by devising methods of computation which lead to an acceptable solution at low cost. By its nature, soft computing is much closer to human reasoning than the traditional modes of computation. At this juncture, the major components of soft computing are fuzzy logic (FL), neural network theory (NN), and probabilistic reasoning techniques (PR), including genetic algorithms, chaos theory, and part of learning theory. Increasingly, these techniques are used in combination to achieve significant improvement in performance and adaptability. Among the important application areas for soft computing are control systems, expert systems, data compression techniques, image processing, and decision support systems. It may be argued that it is soft computing, rather than the traditional hard computing, that should be viewed as the foundation for artificial intelligence. In the years ahead, this may well become a widely held position.

  3. Security and Cloud Outsourcing Framework for Economic Dispatch

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarker, Mushfiqur R.; Wang, Jianhui; Li, Zuyi

    The computational complexity and problem sizes of power grid applications have increased significantly with the advent of renewable resources and smart grid technologies. The current paradigm for solving these issues consists of in-house high performance computing infrastructures, which have the drawbacks of high capital expenditure, maintenance, and limited scalability. Cloud computing is an ideal alternative due to its powerful computational capacity, rapid scalability, and high cost-effectiveness. A major challenge, however, remains in that the highly confidential grid data is susceptible to potential cyberattacks when outsourced to the cloud. In this work, a security and cloud outsourcing framework is developed for the Economic Dispatch (ED) linear programming application. The security framework transforms the ED linear program into a confidentiality-preserving linear program that masks both the data and the problem structure, thus enabling secure outsourcing to the cloud. Results show that for large grid test cases the performance gain and costs outperform the in-house infrastructure.

  4. Security and Cloud Outsourcing Framework for Economic Dispatch

    DOE PAGES

    Sarker, Mushfiqur R.; Wang, Jianhui; Li, Zuyi; ...

    2017-04-24

    The computational complexity and problem sizes of power grid applications have increased significantly with the advent of renewable resources and smart grid technologies. The current paradigm for solving these issues consists of in-house high performance computing infrastructures, which have the drawbacks of high capital expenditure, maintenance, and limited scalability. Cloud computing is an ideal alternative due to its powerful computational capacity, rapid scalability, and high cost-effectiveness. A major challenge, however, remains in that the highly confidential grid data is susceptible to potential cyberattacks when outsourced to the cloud. In this work, a security and cloud outsourcing framework is developed for the Economic Dispatch (ED) linear programming application. The security framework transforms the ED linear program into a confidentiality-preserving linear program that masks both the data and the problem structure, thus enabling secure outsourcing to the cloud. Results show that for large grid test cases the performance gain and costs outperform the in-house infrastructure.
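
    The general idea of a confidentiality-preserving transformation can be illustrated with a toy linear program: the data owner applies a secret invertible change of variables and positive row scaling, sends only the masked problem to the cloud, and maps the returned solution back. This sketch is only illustrative of the masking principle under those assumptions; it is not the transformation or security analysis developed in the paper, and the tiny LP is not an actual economic dispatch model.

      import numpy as np
      from scipy.optimize import linprog

      rng = np.random.default_rng(0)

      # Original (confidential) LP, standing in for an ED problem: min c.x  s.t.  A x <= b
      c = np.array([3.0, 2.0])
      A = np.array([[1.0, 1.0], [-1.0, 0.0], [0.0, -1.0], [-2.0, -1.0]])
      b = np.array([10.0, 0.0, 0.0, -6.0])

      # Secret invertible substitution x = Q y, plus scaling of each constraint by a
      # random positive factor (which preserves the inequality directions).
      Q = np.array([[1.4, 0.3], [0.2, 1.1]])
      M = np.diag(rng.uniform(0.5, 2.0, len(b)))
      c_m, A_m, b_m = Q.T @ c, M @ A @ Q, M @ b

      # The "cloud" solves the masked problem without seeing c, A, or b.
      masked = linprog(c_m, A_ub=A_m, b_ub=b_m, bounds=[(None, None)] * 2)
      x_recovered = Q @ masked.x

      direct = linprog(c, A_ub=A, b_ub=b, bounds=[(None, None)] * 2)
      print(np.round(x_recovered, 4), np.round(direct.x, 4))   # both ~[3, 0]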

  5. Near DC eddy current measurement of aluminum multilayers using MR sensors and commodity low-cost computer technology

    NASA Astrophysics Data System (ADS)

    Perry, Alexander R.

    2002-06-01

    Low Frequency Eddy Current (EC) probes are capable of measurement from 5 MHz down to DC through the use of Magnetoresistive (MR) sensors. Choosing components with appropriate electrical specifications allows them to be matched to the power and impedance characteristics of standard computer connectors. This permits direct attachment of the probe to inexpensive computers, thereby eliminating external power supplies, amplifiers and modulators that have heretofore precluded very low system purchase prices. Such price reduction is key to increased market penetration in General Aviation maintenance and consequent reduction in recurring costs. This paper examines our computer software CANDETECT, which implements this approach and permits effective probe operation. Results are presented to show the intrinsic sensitivity of the software and demonstrate its practical performance when seeking cracks in the underside of a thick aluminum multilayer structure. The majority of the General Aviation light aircraft fleet uses rivets and screws to attach sheet aluminum skin to the airframe, resulting in similar multilayer lap joints.

  6. 78 FR 12381 - Self-Regulatory Organizations; New York Stock Exchange LLC; Notice of Filing of Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-22

    ... computer technology that permitted more efficient, and increasingly paperless, distribution of proxy... requires ongoing technology support, services and maintenance, and is a significant part of the total cost...

  7. Pseudo-orthogonalization of memory patterns for associative memory.

    PubMed

    Oku, Makito; Makino, Takaki; Aihara, Kazuyuki

    2013-11-01

    A new method for improving the storage capacity of associative memory models on a neural network is proposed. The storage capacity of the network increases in proportion to the network size in the case of random patterns, but, in general, the capacity suffers from correlation among memory patterns. Numerous solutions to this problem have been proposed so far, but their high computational cost limits their scalability. In this paper, we propose a novel and simple solution that is locally computable without any iteration. Our method involves XNOR masking of the original memory patterns with random patterns, and the masked patterns and masks are concatenated. The resulting decorrelated patterns allow higher storage capacity at the cost of the pattern length. Furthermore, the increase in the pattern length can be reduced through blockwise masking, which results in a small amount of capacity loss. Movie replay and image recognition are presented as examples to demonstrate the scalability of the proposed method.
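
    The masking step described above is easy to sketch for bipolar patterns, where XNOR reduces to elementwise multiplication; the toy example below (with made-up sizes and correlation level) just shows how masking and concatenating the masks decorrelates an otherwise strongly correlated memory set.

      import numpy as np

      rng = np.random.default_rng(1)
      N, P = 200, 20                     # pattern length and number of memories (illustrative)

      # Build highly correlated +/-1 memories: each agrees with a common template 90% of the time.
      base = rng.choice([-1, 1], size=N)
      memories = np.array([np.where(rng.random(N) < 0.9, base, -base) for _ in range(P)])

      # XNOR masking: for bipolar +/-1 coding, XNOR is just elementwise multiplication.
      masks  = rng.choice([-1, 1], size=(P, N))
      masked = np.hstack([memories * masks, masks])   # masked pattern concatenated with its mask

      def mean_abs_overlap(patterns):
          C = patterns @ patterns.T / patterns.shape[1]
          return np.abs(C[~np.eye(len(C), dtype=bool)]).mean()

      # Correlation among stored patterns drops sharply, at the cost of doubling their length.
      print(round(mean_abs_overlap(memories), 3), round(mean_abs_overlap(masked), 3))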

  8. Pupillary dynamics reveal computational cost in sentence planning.

    PubMed

    Sevilla, Yamila; Maldonado, Mora; Shalóm, Diego E

    2014-01-01

    This study investigated the computational cost associated with grammatical planning in sentence production. We measured people's pupillary responses as they produced spoken descriptions of depicted events. We manipulated the syntactic structure of the target by training subjects to use different types of sentences following a colour cue. The results showed higher increase in pupil size for the production of passive and object dislocated sentences than for active canonical subject-verb-object sentences, indicating that more cognitive effort is associated with more complex noncanonical thematic order. We also manipulated the time at which the cue that triggered structure-building processes was presented. Differential increase in pupil diameter for more complex sentences was shown to rise earlier as the colour cue was presented earlier, suggesting that the observed pupillary changes are due to differential demands in relatively independent structure-building processes during grammatical planning. Task-evoked pupillary responses provide a reliable measure to study the cognitive processes involved in sentence production.

  9. Numerical propulsion system simulation

    NASA Technical Reports Server (NTRS)

    Lytle, John K.; Remaklus, David A.; Nichols, Lester D.

    1990-01-01

    The cost of implementing new technology in aerospace propulsion systems is becoming prohibitively expensive. One of the major contributors to the high cost is the need to perform many large scale system tests. Extensive testing is used to capture the complex interactions among the multiple disciplines and the multiple components inherent in complex systems. The objective of the Numerical Propulsion System Simulation (NPSS) is to provide insight into these complex interactions through computational simulations. This will allow for comprehensive evaluation of new concepts early in the design phase before a commitment to hardware is made. It will also allow for rapid assessment of field-related problems, particularly in cases where operational problems were encountered during conditions that would be difficult to simulate experimentally. The tremendous progress taking place in computational engineering and the rapid increase in computing power expected through parallel processing make this concept feasible within the near future. However it is critical that the framework for such simulations be put in place now to serve as a focal point for the continued developments in computational engineering and computing hardware and software. The NPSS concept which is described will provide that framework.

  10. OPPORTUNITY COSTS OF REWARD DELAYS AND THE DISCOUNTING OF HYPOTHETICAL MONEY AND CIGARETTES

    PubMed Central

    Johnson, Patrick S.; Herrmann, Evan S.; Johnson, Matthew W.

    2015-01-01

    Humans are reported to discount delayed rewards at lower rates than nonhumans. However, nonhumans are studied in tasks that restrict reinforcement during delays, whereas humans are typically studied in tasks that do not restrict reinforcement during delays. In nonhuman tasks, the opportunity cost of restricted reinforcement during delays may increase delay discounting rates. The present within-subjects study used online crowdsourcing (Amazon Mechanical Turk, or MTurk) to assess the discounting of hypothetical delayed money (and cigarettes in smokers) under four hypothetical framing conditions differing in the availability of reinforcement during delays. At one extreme, participants were free to leave their computer without returning, and engage in any behavior during reward delays (modeling typical human tasks). At the opposite extreme, participants were required to stay at their computer and engage in little other behavior during reward delays (modeling typical nonhuman tasks). Discounting rates increased as an orderly function of opportunity cost. Results also indicated predominantly hyperbolic discounting, the “magnitude effect,” steeper discounting of cigarettes than money, and positive correlations between discounting rates of these commodities. This is the first study to test the effects of opportunity costs on discounting, and suggests that procedural differences may partially account for observed species differences in discounting. PMID:25388973

  11. Opportunity costs of reward delays and the discounting of hypothetical money and cigarettes.

    PubMed

    Johnson, Patrick S; Herrmann, Evan S; Johnson, Matthew W

    2015-01-01

    Humans are reported to discount delayed rewards at lower rates than nonhumans. However, nonhumans are studied in tasks that restrict reinforcement during delays, whereas humans are typically studied in tasks that do not restrict reinforcement during delays. In nonhuman tasks, the opportunity cost of restricted reinforcement during delays may increase delay discounting rates. The present within-subjects study used online crowdsourcing (Amazon Mechanical Turk, or MTurk) to assess the discounting of hypothetical delayed money (and cigarettes in smokers) under four hypothetical framing conditions differing in the availability of reinforcement during delays. At one extreme, participants were free to leave their computer without returning, and engage in any behavior during reward delays (modeling typical human tasks). At the opposite extreme, participants were required to stay at their computer and engage in little other behavior during reward delays (modeling typical nonhuman tasks). Discounting rates increased as an orderly function of opportunity cost. Results also indicated predominantly hyperbolic discounting, the "magnitude effect," steeper discounting of cigarettes than money, and positive correlations between discounting rates of these commodities. This is the first study to test the effects of opportunity costs on discounting, and suggests that procedural differences may partially account for observed species differences in discounting. © Society for the Experimental Analysis of Behavior.
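
    The hyperbolic model referred to above is V = A / (1 + kD); a short sketch of estimating the discount rate k from indifference points is given below. The delays and indifference values are invented for illustration and are not the study's data.

      import numpy as np
      from scipy.optimize import curve_fit

      def hyperbolic(delay, k, amount=100.0):
          # Mazur's hyperbolic discounting model: V = A / (1 + k * D)
          return amount / (1.0 + k * delay)

      delays       = np.array([1.0, 7.0, 30.0, 90.0, 365.0])     # days
      indifference = np.array([95.0, 80.0, 55.0, 35.0, 15.0])    # assumed present values of $100

      (k_hat,), _ = curve_fit(hyperbolic, delays, indifference, p0=[0.01])
      print(round(k_hat, 4))   # larger k = steeper discounting, e.g. under higher opportunity cost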

  12. Software Support for Transiently Powered Computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Der Woude, Joel Matthew

    With the continued reduction in size and cost of computing, power becomes an increasingly heavy burden on system designers for embedded applications. While energy harvesting techniques are an increasingly desirable solution for many deeply embedded applications where size and lifetime are a priority, previous work has shown that energy harvesting provides insufficient power for long-running computation. We present Ratchet, which to the authors' knowledge is the first automatic, software-only checkpointing system for energy harvesting platforms. We show that Ratchet provides a means to extend computation across power cycles, consistent with those experienced by energy harvesting devices. We demonstrate the correctness of our system under frequent failures and show that it has an average overhead of 58.9% across a suite of benchmarks representative of embedded applications.

  13. Middleware enabling computational self-reflection: exploring the need for and some costs of selfreflecting networks with application to homeland defense

    NASA Astrophysics Data System (ADS)

    Kramer, Michael J.; Bellman, Kirstie L.; Landauer, Christopher

    2002-07-01

    This paper reviews and examines the definitions of Self-Reflection and Active Middleware. It then illustrates a conceptual framework for understanding and enumerating the costs of Self-Reflection and Active Middleware at increasing levels of application, reviews some applications of Self-Reflection and Active Middleware to simulations, and finally considers the application, and the additional kinds of costs, of applying Self-Reflection and Active Middleware to sharing information among the organizations expected to participate in Homeland Defense.

  14. I-deas TMG to NX Space Systems Thermal Model Conversion and Computational Performance Comparison

    NASA Technical Reports Server (NTRS)

    Somawardhana, Ruwan

    2011-01-01

    CAD/CAE packages change on a continuous basis as the power of the tools increases to meet demands. End-users must adapt to new products as they come to market and replace legacy packages. CAE modeling has continued to evolve and is constantly becoming more detailed and complex, though this comes at the cost of increased computing requirements; parallel processing coupled with appropriate hardware can minimize computation time. Users of Maya Thermal Model Generator (TMG) are faced with transitioning from NX I-deas to NX Space Systems Thermal (SST). It is important to understand what differences there are when changing software packages; we are looking for consistency in results.

  15. Information Systems; Modern Health Care and Medical Information.

    ERIC Educational Resources Information Center

    Brandejs, J. F., And Others

    1975-01-01

    To effectively handle changes in health policy and health information, new designs and applications of automation are explored. Increased use of computer-based information systems in health care could serve as a means of control over the costs of developing more comprehensive health service, with applications increasing not only the automation of…

  16. Operative outcome and hospital cost.

    PubMed

    Ferraris, V A; Ferraris, S P; Singh, A

    1998-03-01

    Because of concern about increasing health care costs, we undertook a study to find patient risk factors associated with increased hospital costs and to evaluate the relationship between increased cost and in-hospital mortality and serious morbidity. More than 100 patient variables were screened in 1221 patients undergoing cardiac procedures. Simultaneously, patient hospital costs were computed from the cost-to-charge ratio. Univariate and multivariate statistics were used to explore the relationship between hospital cost and patient outcomes, including operative death, in-hospital morbidity, and length of stay. The greatest costs were for 31 patients who did not survive operation ($74,466, 95% confidence interval $27,102 to $198,025), greater than the costs for 120 patients who had serious, nonfatal morbidity ($60,335, 95% confidence interval $28,381 to $130,897, p = 0.02) and those for 1070 patients who survived operation without complication ($31,459, 95% confidence interval $21,944 to $49,849, p = 0.001). Breakdown of the components of hospital costs in fatalities and in cases with nonfatal complications revealed that the greatest contributions were in anesthesia and operating room costs. Significant (by stepwise linear regression analysis) independent risks for increased hospital cost were as follows (in order of decreasing importance): (1) preoperative congestive heart failure, (2) serum creatinine level greater than 2.5 mg/dl, (3) New York state predicted mortality risk, (4) type of operation (coronary artery bypass grafting, valve, valve plus coronary artery bypass grafting, or other), (5) preoperative hematocrit, (6) need for reoperative procedure, (7) operative priority, and (8) sex. These risks were different than those for in-hospital death or increased length of stay. Hospital cost correlated with length of stay (r = 0.63, p < 0.001), but there were many outliers at the high end of the hospital cost spectrum. We conclude that operative death is the most costly outcome; length of stay is an unreliable indicator of hospital cost, especially at the high end of the cost spectrum; risks of increased hospital cost are different than those for perioperative mortality or increased length of stay; and ventricular dysfunction in elderly patients undergoing urgent operations for other than coronary disease is associated with increased cost. Certain patient factors, such as preoperative anemia and congestive heart failure, are amenable to preoperative intervention to reduce costs, and a high-risk patient profile can serve as a target for cost-reduction strategies.

  17. Manual of phosphoric acid fuel cell power plant cost model and computer program

    NASA Technical Reports Server (NTRS)

    Lu, C. Y.; Alkasab, K. A.

    1984-01-01

    Cost analysis of a phosphoric acid fuel cell power plant includes two parts: a method for estimation of system capital costs, and an economic analysis which determines the levelized annual cost of operating the system used in the capital cost estimation. A FORTRAN computer program has been developed for this cost analysis.

  18. Development and evaluation of a computer program to grade student performance on peripheral blood smears

    NASA Astrophysics Data System (ADS)

    Lehman, Donald Clifford

    Today's medical laboratories are dealing with cost containment health care policies and unfilled laboratory positions. Because there may be fewer experienced clinical laboratory scientists, students graduating from clinical laboratory science (CLS) programs are expected by their employers to perform accurately in entry-level positions with minimal training. Information in the CLS field is increasing at a dramatic rate, and instructors are expected to teach more content in the same amount of time with the same resources. With this increase in teaching obligations, instructors could use a tool to facilitate grading. The research question was, "Can computer-assisted assessment evaluate students in an accurate and time efficient way?" A computer program was developed to assess CLS students' ability to evaluate peripheral blood smears. Automated grading permits students to get results quicker and allows the laboratory instructor to devote less time to grading. This computer program could improve instruction by providing more time to students and instructors for other activities. To be valuable, the program should provide the same quality of grading as the instructor. These benefits must outweigh potential problems such as the time necessary to develop and maintain the program, monitoring of student progress by the instructor, and the financial cost of the computer software and hardware. In this study, surveys of students and an interview with the laboratory instructor were performed to provide a formative evaluation of the computer program. In addition, the grading accuracy of the computer program was examined. These results will be used to improve the program for use in future courses.

  19. Cost effective campaigning in social networks

    NASA Astrophysics Data System (ADS)

    Kotnis, Bhushan; Kuri, Joy

    2016-05-01

    Campaigners are increasingly using online social networking platforms for promoting products, ideas and information. A popular method of promoting a product or even an idea is incentivizing individuals to evangelize the idea vigorously by providing them with referral rewards in the form of discounts, cash backs, or social recognition. Due to budget constraints on scarce resources such as money and manpower, it may not be possible to provide incentives for the entire population, and hence incentives need to be allocated judiciously to appropriate individuals for ensuring the highest possible outreach size. We aim to do the same by formulating and solving an optimization problem using percolation theory. In particular, we compute the set of individuals that are provided incentives for minimizing the expected cost while ensuring a given outreach size. We also solve the problem of computing the set of individuals to be incentivized for maximizing the outreach size for a given cost budget. The optimization problem turns out to be nontrivial; it involves quantities that need to be computed by numerically solving a fixed point equation. Our primary contribution is that, for a fairly general cost structure, we show that the optimization problems can be solved by solving a simple linear program. We believe that our approach of using percolation theory to formulate an optimization problem is the first of its kind.

  20. Direct costs and cost-effectiveness of dual-source computed tomography and invasive coronary angiography in patients with an intermediate pretest likelihood for coronary artery disease.

    PubMed

    Dorenkamp, Marc; Bonaventura, Klaus; Sohns, Christian; Becker, Christoph R; Leber, Alexander W

    2012-03-01

    The study aims to determine the direct costs and comparative cost-effectiveness of latest-generation dual-source computed tomography (DSCT) and invasive coronary angiography for diagnosing coronary artery disease (CAD) in patients suspected of having this disease. The study was based on a previously elaborated cohort with an intermediate pretest likelihood for CAD and on complementary clinical data. Cost calculations were based on a detailed analysis of direct costs, and generally accepted accounting principles were applied. Based on Bayes' theorem, a mathematical model was used to compare the cost-effectiveness of both diagnostic approaches. Total costs included direct costs, induced costs and costs of complications. Effectiveness was defined as the ability of a diagnostic test to accurately identify a patient with CAD. Direct costs amounted to €98.60 for DSCT and to €317.75 for invasive coronary angiography. Analysis of model calculations indicated that cost-effectiveness grew hyperbolically with increasing prevalence of CAD. Given the prevalence of CAD in the study cohort (24%), DSCT was found to be more cost-effective than invasive coronary angiography (€970 vs €1354 for one patient correctly diagnosed as having CAD). At a disease prevalence of 49%, DSCT and invasive angiography were equally effective with costs of €633. Above a threshold value of disease prevalence of 55%, proceeding directly to invasive coronary angiography was more cost-effective than DSCT. With proper patient selection and consideration of disease prevalence, DSCT coronary angiography is cost-effective for diagnosing CAD in patients with an intermediate pretest likelihood for it. However, the range of eligible patients may be smaller than previously reported.

  1. Cost-Benefit Arbitration Between Multiple Reinforcement-Learning Systems.

    PubMed

    Kool, Wouter; Gershman, Samuel J; Cushman, Fiery A

    2017-09-01

    Human behavior is sometimes determined by habit and other times by goal-directed planning. Modern reinforcement-learning theories formalize this distinction as a competition between a computationally cheap but inaccurate model-free system that gives rise to habits and a computationally expensive but accurate model-based system that implements planning. It is unclear, however, how people choose to allocate control between these systems. Here, we propose that arbitration occurs by comparing each system's task-specific costs and benefits. To investigate this proposal, we conducted two experiments showing that people increase model-based control when it achieves greater accuracy than model-free control, and especially when the rewards of accurate performance are amplified. In contrast, they are insensitive to reward amplification when model-based and model-free control yield equivalent accuracy. This suggests that humans adaptively balance habitual and planned action through on-line cost-benefit analysis.

  2. Increasing patient engagement in rehabilitation exercises using computer-based citizen science.

    PubMed

    Laut, Jeffrey; Cappa, Francesco; Nov, Oded; Porfiri, Maurizio

    2015-01-01

    Patient motivation is an important factor to consider when developing rehabilitation programs. Here, we explore the effectiveness of active participation in web-based citizen science activities as a means of increasing participant engagement in rehabilitation exercises, through the use of a low-cost haptic joystick interfaced with a laptop computer. Using the joystick, patients navigate a virtual environment representing the site of a citizen science project situated in a polluted canal. Participants are tasked with following a path on a laptop screen representing the canal. The experiment consists of two conditions: in one condition, a citizen science component where participants classify images from the canal is included; and in the other, the citizen science component is absent. Both conditions are tested on a group of young patients undergoing rehabilitation treatments and a group of healthy subjects. A survey administered at the end of both tasks reveals that participants prefer performing the scientific task, and are more likely to choose to repeat it, even at the cost of increasing the time of their rehabilitation exercise. Furthermore, performance indices based on data collected from the joystick indicate significant differences in the trajectories created by patients and healthy subjects, suggesting that the low-cost device can be used in a rehabilitation setting for gauging patient recovery.

  3. When Machines Think: Radiology's Next Frontier.

    PubMed

    Dreyer, Keith J; Geis, J Raymond

    2017-12-01

    Artificial intelligence (AI), machine learning, and deep learning are terms now seen frequently, all of which refer to computer algorithms that change as they are exposed to more data. Many of these algorithms are surprisingly good at recognizing objects in images. The combination of large amounts of machine-consumable digital data, increased and cheaper computing power, and increasingly sophisticated statistical models combine to enable machines to find patterns in data in ways that are not only cost-effective but also potentially beyond humans' abilities. Building an AI algorithm can be surprisingly easy. Understanding the associated data structures and statistics, on the other hand, is often difficult and obscure. Converting the algorithm into a sophisticated product that works consistently in broad, general clinical use is complex and incompletely understood. To show how these AI products reduce costs and improve outcomes will require clinical translation and industrial-grade integration into routine workflow. Radiology has the chance to leverage AI to become a center of intelligently aggregated, quantitative, diagnostic information. Centaur radiologists, formed as a synergy of human plus computer, will provide interpretations using data extracted from images by humans and image-analysis computer algorithms, as well as the electronic health record, genomics, and other disparate sources. These interpretations will form the foundation of precision health care, or care customized to an individual patient. © RSNA, 2017.

  4. Low Computational Signal Acquisition for GNSS Receivers Using a Resampling Strategy and Variable Circular Correlation Time

    PubMed Central

    Zhang, Yeqing; Wang, Meiling; Li, Yafeng

    2018-01-01

    For the objective of essentially decreasing computational complexity and time consumption of signal acquisition, this paper explores a resampling strategy and variable circular correlation time strategy specific to broadband multi-frequency GNSS receivers. In broadband GNSS receivers, the resampling strategy is established to work on conventional acquisition algorithms by resampling the main lobe of received broadband signals with a much lower frequency. Variable circular correlation time is designed to adapt to different signal strength conditions and thereby increase the operation flexibility of GNSS signal acquisition. The acquisition threshold is defined as the ratio of the highest and second highest correlation results in the search space of carrier frequency and code phase. Moreover, computational complexity of signal acquisition is formulated by amounts of multiplication and summation operations in the acquisition process. Comparative experiments and performance analysis are conducted on four sets of real GPS L2C signals with different sampling frequencies. The results indicate that the resampling strategy can effectively decrease computation and time cost by nearly 90–94% with just slight loss of acquisition sensitivity. With circular correlation time varying from 10 ms to 20 ms, the time cost of signal acquisition has increased by about 2.7–5.6% per millisecond, with most satellites acquired successfully. PMID:29495301

  5. Low Computational Signal Acquisition for GNSS Receivers Using a Resampling Strategy and Variable Circular Correlation Time.

    PubMed

    Zhang, Yeqing; Wang, Meiling; Li, Yafeng

    2018-02-24

    For the objective of essentially decreasing computational complexity and time consumption of signal acquisition, this paper explores a resampling strategy and variable circular correlation time strategy specific to broadband multi-frequency GNSS receivers. In broadband GNSS receivers, the resampling strategy is established to work on conventional acquisition algorithms by resampling the main lobe of received broadband signals with a much lower frequency. Variable circular correlation time is designed to adapt to different signal strength conditions and thereby increase the operation flexibility of GNSS signal acquisition. The acquisition threshold is defined as the ratio of the highest and second highest correlation results in the search space of carrier frequency and code phase. Moreover, computational complexity of signal acquisition is formulated by amounts of multiplication and summation operations in the acquisition process. Comparative experiments and performance analysis are conducted on four sets of real GPS L2C signals with different sampling frequencies. The results indicate that the resampling strategy can effectively decrease computation and time cost by nearly 90-94% with just slight loss of acquisition sensitivity. With circular correlation time varying from 10 ms to 20 ms, the time cost of signal acquisition has increased by about 2.7-5.6% per millisecond, with most satellites acquired successfully.
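
    The detection statistic described above (highest over second-highest correlation peak across code phases) can be sketched with a plain FFT-based circular correlation; the toy signal below is a shifted ±1 code with additive noise rather than a real L2C capture, and the Doppler search and resampling step are omitted.

      import numpy as np

      def circular_correlation(signal, local_code):
          # One circular correlation over all code phases via the FFT (O(N log N)).
          return np.abs(np.fft.ifft(np.fft.fft(signal) * np.conj(np.fft.fft(local_code))))

      def acquisition_metric(signal, local_code, exclusion=2):
          corr = circular_correlation(signal, local_code)
          peak = int(np.argmax(corr))
          mask = np.ones_like(corr, dtype=bool)
          mask[max(peak - exclusion, 0):peak + exclusion + 1] = False   # drop the main lobe
          ratio = corr[peak] / corr[mask].max()   # highest / second-highest peak
          return peak, ratio                      # acquire when the ratio exceeds a threshold

      # Toy example: a noisy, circularly shifted +/-1 ranging code.
      rng = np.random.default_rng(2)
      code = rng.choice([-1.0, 1.0], 2046)
      received = np.roll(code, 417) + 0.5 * rng.standard_normal(2046)
      print(acquisition_metric(received, code))   # recovers the 417-sample code phase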

  6. Do Clouds Compute? A Framework for Estimating the Value of Cloud Computing

    NASA Astrophysics Data System (ADS)

    Klems, Markus; Nimis, Jens; Tai, Stefan

    On-demand provisioning of scalable and reliable compute services, along with a cost model that charges consumers based on actual service usage, has been an objective in distributed computing research and industry for a while. Cloud Computing promises to deliver on this objective: consumers are able to rent infrastructure in the Cloud as needed, deploy applications and store data, and access them via Web protocols on a pay-per-use basis. The acceptance of Cloud Computing, however, depends on the ability for Cloud Computing providers and consumers to implement a model for business value co-creation. Therefore, a systematic approach to measure costs and benefits of Cloud Computing is needed. In this paper, we discuss the need for valuation of Cloud Computing, identify key components, and structure these components in a framework. The framework assists decision makers in estimating Cloud Computing costs and to compare these costs to conventional IT solutions. We demonstrate by means of representative use cases how our framework can be applied to real world scenarios.

  7. Feasibility study of RFID technology for construction load tracking.

    DOT National Transportation Integrated Search

    2010-12-01

    ADOT&PF is seeking more efficient business practices and processes to increase its speed in delivering supplies to work sites, optimize the workforce, and minimize : costs. The current tracking process uses a computer-generated ticket carried by the ...

  8. Efficient Conservative Reformulation Schemes for Lithium Intercalation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Urisanga, PC; Rife, D; De, S

    Porous electrode theory coupled with transport and reaction mechanisms is a widely used technique to model Li-ion batteries, employing an appropriate discretization or approximation for solid phase diffusion within electrode particles. One of the major difficulties in simulating Li-ion battery models is the need to account for solid phase diffusion in a second radial dimension r, which increases the computation time/cost to a great extent. Various methods that reduce the computational cost have been introduced to treat this phenomenon, but most of them do not guarantee mass conservation. The aim of this paper is to introduce an inherently mass conserving yet computationally efficient method for solid phase diffusion based on Lobatto IIIA quadrature. This paper also presents coupling of the new solid phase reformulation scheme with a macro-homogeneous porous electrode theory based pseudo-2D model for a Li-ion battery. (C) The Author(s) 2015. Published by ECS. All rights reserved.

  9. The impact of pharmacophore modeling in drug design.

    PubMed

    Guner, Osman F

    2005-07-01

    With the reliable use of computer simulations in scientific research, it is possible to achieve significant increases in productivity as well as a reduction in research costs compared with experimental approaches. For example, computer simulation can substantially enhance productivity by guiding the scientist toward better, more informed choices, while also driving the 'fail-early' concept to result in a significant reduction in cost. Pharmacophore modeling is a reliable computer-aided design tool used in the discovery of new classes of compounds for a given therapeutic category. This commentary will briefly review the benefits and applications of this technology in drug discovery and design, and will also highlight its historical evolution. The two most commonly used approaches for pharmacophore model development will be discussed, and several examples of how this technology was successfully applied to identify new potent leads will be provided. The article concludes with a brief outline of the controversial issue of patentability of pharmacophore models.

  10. Targeted post-mortem computed tomography cardiac angiography: proof of concept.

    PubMed

    Saunders, Sarah L; Morgan, Bruno; Raj, Vimal; Robinson, Claire E; Rutty, Guy N

    2011-07-01

    With the increasing use and availability of multi-detector computed tomography and magnetic resonance imaging in autopsy practice, there has been an international push towards the development of the so-called near virtual autopsy. However, currently, a significant obstacle to the consideration as to whether or not near virtual autopsies could one day replace the conventional invasive autopsy is the failure of post-mortem imaging to yield detailed information concerning the coronary arteries. To date, a cost-effective, practical solution to allow high throughput imaging has not been presented within the forensic literature. We present a proof of concept paper describing a simple, quick, cost-effective, manual, targeted in situ post-mortem cardiac angiography method using a minimally invasive approach, to be used with multi-detector computed tomography for high throughput cadaveric imaging which can be used in permanent or temporary mortuaries.

  11. Research on regional numerical weather prediction

    NASA Technical Reports Server (NTRS)

    Kreitzberg, C. W.

    1976-01-01

    Extension of the predictive power of dynamic weather forecasting to scales below the conventional synoptic or cyclonic scales in the near future is assessed. Lower costs per computation, more powerful computers, and a 100 km mesh over the North American area (with coarser mesh extending beyond it) are noted at present. Doubling the resolution even locally (to 50 km mesh) would entail a 16-fold increase in costs (including vertical resolution and halving the time interval), and constraints on domain size and length of forecast. Boundary conditions would be provided by the surrounding 100 km mesh, and time-varying lateral boundary conditions can be considered to handle moving phenomena. More physical processes to treat, more efficient numerical techniques, and faster computers (improved software and hardware) backing up satellite and radar data could produce further improvements in forecasting in the 1980s. Boundary layer modeling, initialization techniques, and quantitative precipitation forecasting are singled out among key tasks.
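
    The 16-fold figure quoted above follows from the usual scaling of explicit grid-point models, under the assumption (stated parenthetically) that the vertical resolution and the time step are refined along with the horizontal mesh: the work grows with the number of grid points per dimension and with the number of time steps, so halving all four spacings doubles the cost four times over. As a rough sketch:

      $$\text{cost} \;\propto\; \frac{1}{\Delta x}\cdot\frac{1}{\Delta y}\cdot\frac{1}{\Delta z}\cdot\frac{1}{\Delta t}
        \quad\Longrightarrow\quad
        \frac{\text{cost}(\Delta/2)}{\text{cost}(\Delta)} \;=\; 2^4 \;=\; 16 .$$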

  12. Can Disease Management Target Patients Most Likely to Generate High Costs? The Impact of Comorbidity

    PubMed Central

    Charlson, Robert E.; Briggs, William; Hollenberg, James

    2007-01-01

    Context: Disease management programs are increasingly used to manage costs of patients with chronic disease. Objective: We sought to examine the clinical characteristics and measure the health care expenditures of patients most likely to be targeted by disease management programs. Design: Retrospective analysis of prospectively obtained data. Setting: A general medicine practice with both faculty and residents at an urban academic medical center. Participants: Five thousand eight hundred sixty-one patients enrolled in the practice for at least 1 year. Main Outcomes: Annual cost of diseases targeted by disease management. Measurements: Patients' clinical and demographic information was collected from a computer system used to manage patients. Data included diagnostic information, medications, and resource usage over 1 year. We looked at 10 common diseases targeted by disease management programs. Results: Unadjusted annual median costs for chronic diseases ranged between $1,100 and $1,500. Congestive heart failure ($1,500), stroke ($1,500), diabetes ($1,500), and cancer ($1,400) were the most expensive. As comorbidity increased, annual adjusted costs increased exponentially. Those with comorbidity scores of 2 or more accounted for 26% of the population but 50% of the overall costs. Conclusions: Costs for individual chronic conditions vary within a relatively narrow range. However, the costs for patients with multiple coexisting medical conditions increase rapidly. Reducing health care costs will require focusing on patients with multiple comorbid diseases, not just single diseases. The overwhelming impact of comorbidity on costs raises significant concerns about the potential ability of disease management programs to limit the costs of care. PMID:17372794

  13. The Cost of Increasing Physical Activity and Maintaining Weight for Mid-Life Sedentary African American Women

    PubMed Central

    Johnson, Tricia; Schoeny, Michael; Fogg, Louis; Wilbur, JoEllen

    2015-01-01

    Objective To evaluate the marginal costs of increasing physical activity and maintaining weight for a lifestyle physical activity program targeting sedentary African American women. Methods Outcomes included change in minutes of total moderate to vigorous physical activity, leisure time moderate to vigorous physical activity and walking per week, and weight stability between baseline and maintenance at 48 weeks. Marginal cost effectiveness ratios (MCERs) were calculated for each outcome, and 95% confidence intervals were computed using a bootstrap method. The analysis was from the societal perspective and calculated in 2013 US dollars. Results For the 260 participants in the analysis, program costs were $165 ± 19, and participant costs were $164 ± 35, for a total cost of $329 ± 49. The MCER for change in walking was $1.50/min/wk (95% CI: 1.28, 1.87), $1.73/min/wk (95% CI: 1.41, 2.18) for change in moderate to vigorous physical activity, and $1.94/min/wk (95% CI: 1.58, 2.40) for leisure-time moderate to vigorous physical activity. The MCER for weight stability was $412 (95% CI: 399, 456). Discussion The Women's Lifestyle Physical Activity Program is a relatively low cost strategy for increasing physical activity. The marginal cost of increasing physical activity is lower than for weight stability. The participant costs related to time in the program were nearly half of the total costs, suggesting that practitioners and policy-makers should consider the participant cost when disseminating a lifestyle physical activity program into practice. PMID:26797232
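
    As a rough sketch of the percentile-bootstrap confidence intervals described above (the arrays below are placeholders, not the study's data, and the study's exact resampling procedure may differ):

        import numpy as np

        rng = np.random.default_rng(0)

        def mcer(costs, effects):
            """Marginal cost-effectiveness ratio: mean cost per unit of mean effect."""
            return costs.mean() / effects.mean()

        def bootstrap_ci(costs, effects, n_boot=2000, alpha=0.05):
            """Percentile bootstrap CI for the MCER, resampling participants."""
            n = len(costs)
            ratios = []
            for _ in range(n_boot):
                idx = rng.integers(0, n, n)
                ratios.append(mcer(costs[idx], effects[idx]))
            return np.percentile(ratios, [100 * alpha / 2, 100 * (1 - alpha / 2)])

        # Placeholder data standing in for 260 participants' total costs (USD)
        # and changes in weekly walking minutes; not the study's data.
        costs = rng.normal(329, 49, 260)
        effects = rng.normal(220, 60, 260)
        print(mcer(costs, effects), bootstrap_ci(costs, effects))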

  14. Evaluation of the long-term cost-effectiveness of liraglutide therapy for patients with type 2 diabetes in France.

    PubMed

    Roussel, Ronan; Martinez, Luc; Vandebrouck, Tom; Douik, Habiba; Emiel, Patrick; Guery, Matthieu; Hunt, Barnaby; Valentine, William J

    2016-01-01

    The present study aimed to compare the projected long-term clinical and cost implications associated with liraglutide, sitagliptin and glimepiride in patients with type 2 diabetes mellitus failing to achieve glycemic control on metformin monotherapy in France. Clinical input data for the modeling analysis were taken from two randomized, controlled trials (LIRA-DPP4 and LEAD-2). Long-term (patient lifetime) projections of clinical outcomes and direct costs (2013 Euros; €) were made using a validated computer simulation model of type 2 diabetes. Costs were taken from published France-specific sources. Future costs and clinical benefits were discounted at 3% annually. Sensitivity analyses were performed. Liraglutide was associated with an increase in quality-adjusted life expectancy of 0.25 quality-adjusted life years (QALYs) and an increase in mean direct healthcare costs of €2558 per patient compared with sitagliptin. In the comparison with glimepiride, liraglutide was associated with an increase in quality-adjusted life expectancy of 0.23 QALYs and an increase in direct costs of €4695. Based on these estimates, liraglutide was associated with an incremental cost-effectiveness ratio (ICER) of €10,275 per QALY gained vs sitagliptin and €20,709 per QALY gained vs glimepiride in France. Calculated ICERs for both comparisons fell below the commonly quoted willingness-to-pay threshold of €30,000 per QALY gained. Therefore, liraglutide is likely to be cost-effective vs sitagliptin and glimepiride from a healthcare payer perspective in France.
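
    The incremental cost-effectiveness ratios quoted above are incremental costs divided by incremental QALYs; the sketch below recomputes them from the rounded figures in the abstract (small differences from the reported ICERs reflect rounding of the underlying model outputs):

        # ICER = incremental cost / incremental QALYs, using the abstract's
        # rounded numbers (illustrative only).
        def icer(delta_cost, delta_qaly):
            return delta_cost / delta_qaly

        print(icer(2558, 0.25))   # vs sitagliptin: ~10,232 EUR/QALY (reported 10,275)
        print(icer(4695, 0.23))   # vs glimepiride: ~20,413 EUR/QALY (reported 20,709)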

  15. Is computer aided detection (CAD) cost effective in screening mammography? A model based on the CADET II study

    PubMed Central

    2011-01-01

    Background Single reading with computer aided detection (CAD) is an alternative to double reading for detecting cancer in screening mammograms. The aim of this study is to investigate whether the use of a single reader with CAD is more cost-effective than double reading. Methods Based on data from the CADET II study, the cost-effectiveness of single reading with CAD versus double reading was measured in terms of cost per cancer detected. Cost (Pound (£), year 2007/08) of single reading with CAD versus double reading was estimated assuming a health and social service perspective and a 7 year time horizon. As the equipment cost varies according to the unit size, a separate analysis was conducted for high, average and low volume screening units. One-way sensitivity analyses were performed by varying the reading time, equipment and assessment cost, recall rate and reader qualification. Results The introduction of CAD is cost-increasing for all sizes of screening unit, because the cost of CAD equipment, staff training and the higher assessment cost associated with CAD are greater than the saving in reading costs. The introduction of single reading with CAD, in place of double reading, would produce an additional cost of £227 and £253 per 1,000 women screened in high and average volume units respectively. In low volume screening units, the high cost of purchasing the equipment will result in an additional cost of £590 per 1,000 women screened. One-way sensitivity analysis showed that the factors having the greatest effect on the cost-effectiveness of CAD with single reading compared with double reading were the reading time and the reader's professional qualification (radiologist versus advanced practitioner). Conclusions Without improvements in CAD effectiveness (e.g., a decrease in the recall rate), CAD is unlikely to be a cost-effective alternative to double reading for mammography screening in the UK. This study provides updated estimates of CAD costs in a full-field digital system and assessment costs for women who are re-called after initial screening. However, the model is highly sensitive to various parameters, e.g., reading time, reader qualification, and equipment cost. PMID:21241473

  16. One approach for evaluating the Distributed Computing Design System (DCDS)

    NASA Technical Reports Server (NTRS)

    Ellis, J. T.

    1985-01-01

    The Distributed Computer Design System (DCDS) provides an integrated environment to support the life cycle of developing real-time distributed computing systems. The primary focus of DCDS is to significantly increase system reliability and software development productivity, and to minimize schedule and cost risk. DCDS consists of integrated methodologies, languages, and tools to support the life cycle of developing distributed software and systems. Smooth and well-defined transitions from phase to phase, language to language, and tool to tool provide a unique and unified environment. An approach to evaluating DCDS highlights its benefits.

  17. 2000 Numerical Propulsion System Simulation Review

    NASA Technical Reports Server (NTRS)

    Lytle, John; Follen, Greg; Naiman, Cynthia; Veres, Joseph; Owen, Karl; Lopez, Isaac

    2001-01-01

    The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with industry, academia, and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This concept is called the Numerical Propulsion System Simulation (NPSS). NPSS consists of three main elements: (1) engineering models that enable multidisciplinary analysis of large subsystems and systems at various levels of detail, (2) a simulation environment that maximizes designer productivity, and (3) a cost-effective, high-performance computing platform. A fundamental requirement of the concept is that the simulations must be capable of overnight execution on easily accessible computing platforms. This will greatly facilitate the use of large-scale simulations in a design environment. This paper describes the current status of the NPSS with specific emphasis on the progress made over the past year on air breathing propulsion applications. Major accomplishments include the first formal release of the NPSS object-oriented architecture (NPSS Version 1) and the demonstration of a one order of magnitude reduction in computing cost-to-performance ratio using a cluster of personal computers. The paper also describes the future NPSS milestones, which include the simulation of space transportation propulsion systems in response to increased emphasis on safe, low cost access to space within NASA's Aerospace Technology Enterprise. In addition, the paper contains a summary of the feedback received from industry partners on the fiscal year 1999 effort and the actions taken over the past year to respond to that feedback. NPSS was supported in fiscal year 2000 by the High Performance Computing and Communications Program.

  18. 2001 Numerical Propulsion System Simulation Review

    NASA Technical Reports Server (NTRS)

    Lytle, John; Follen, Gregory; Naiman, Cynthia; Veres, Joseph; Owen, Karl; Lopez, Isaac

    2002-01-01

    The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with industry, academia and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This concept is called the Numerical Propulsion System Simulation (NPSS). NPSS consists of three main elements: (1) engineering models that enable multidisciplinary analysis of large subsystems and systems at various levels of detail, (2) a simulation environment that maximizes designer productivity, and (3) a cost-effective, high-performance computing platform. A fundamental requirement of the concept is that the simulations must be capable of overnight execution on easily accessible computing platforms. This will greatly facilitate the use of large-scale simulations in a design environment. This paper describes the current status of the NPSS with specific emphasis on the progress made over the past year on air breathing propulsion applications. Major accomplishments include the first formal release of the NPSS object-oriented architecture (NPSS Version 1) and the demonstration of a one order of magnitude reduction in computing cost-to-performance ratio using a cluster of personal computers. The paper also describes the future NPSS milestones, which include the simulation of space transportation propulsion systems in response to increased emphasis on safe, low cost access to space within NASA's Aerospace Technology Enterprise. In addition, the paper contains a summary of the feedback received from industry partners on the fiscal year 2000 effort and the actions taken over the past year to respond to that feedback. NPSS was supported in fiscal year 2001 by the High Performance Computing and Communications Program.

  19. Human Factors in Aviation Maintenance. Phase 3. Volume 2. Progress Report

    DTIC Science & Technology

    1994-07-01

    Indexing excerpt (figure captions from the report): a computer-simulated NDI eddy-current task developed by Dr. Colin G. Drury and his colleagues at the State University of New York (SUNY) at Buffalo; "... to lower cost and increase performance"; Figure 3.2, comparison of development and delivery costs for CBT and traditional training; "... Active Female Aircraft Mechanics"; Figure 5.1, NDI Inspection Task Simulation (Drury et al., 1992); Figure 5.2, Mean Misses and ...

  20. Materials Genome Initiative Element

    NASA Technical Reports Server (NTRS)

    Vickers, John

    2015-01-01

    NASA is committed to developing new materials and manufacturing methods that can enable new missions with ever increasing mission demands. Typically, the development and certification of new materials and manufacturing methods in the aerospace industry has required more than 20 years of development time with a costly testing and certification program. To reduce the cost and time to mature these emerging technologies, NASA is developing computational materials tools to improve understanding of the material and guide the certification process.

  1. Computer Vision Hardware System for Automating Rough Mills of Furniture Plants

    Treesearch

    Richard W. Conners; Chong T. Ng; Thomas H. Drayer; Joe G. Tront; D. Earl Kline; C.J. Gatchell

    1990-01-01

    The rough mill of a hardwood furniture or fixture plant is the place where dried lumber is cut into the rough parts that will be used in the rest of the manufacturing process. Approximately a third of the cost of operating the rough mill is the cost of the raw material. Hence any increase in the number of rough parts produced from a given volume of raw material can...

  2. GPU Accelerated Prognostics

    NASA Technical Reports Server (NTRS)

    Gorospe, George E., Jr.; Daigle, Matthew J.; Sankararaman, Shankar; Kulkarni, Chetan S.; Ng, Eley

    2017-01-01

    Prognostic methods enable operators and maintainers to predict the future performance for critical systems. However, these methods can be computationally expensive and may need to be performed each time new information about the system becomes available. In light of these computational requirements, we have investigated the application of graphics processing units (GPUs) as a computational platform for real-time prognostics. Recent advances in GPU technology have reduced cost and increased the computational capability of these highly parallel processing units, making them more attractive for the deployment of prognostic software. We present a survey of model-based prognostic algorithms with considerations for leveraging the parallel architecture of the GPU and a case study of GPU-accelerated battery prognostics with computational performance results.
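
    As a hypothetical illustration of why such prognostic computations parallelize well (this is not the authors' algorithm), the sketch below runs a vectorized Monte Carlo end-of-discharge prediction; because every sample is independent, the same array code maps naturally onto a GPU, for example by swapping the NumPy arrays for CuPy or JAX arrays:

        import numpy as np

        # Sample uncertain battery capacity and load current, then compute the
        # time to end of discharge for every sample at once. All figures are
        # made-up placeholders for illustration.
        rng = np.random.default_rng(1)
        n_samples = 100_000

        capacity_ah = rng.normal(2.0, 0.1, n_samples)     # assumed capacity (Ah)
        load_a = rng.normal(1.5, 0.2, n_samples)          # assumed load (A)
        used_ah = 0.6                                     # charge already drawn (Ah)

        time_to_eod_h = (capacity_ah - used_ah) / load_a  # hours to end of discharge
        print(np.percentile(time_to_eod_h, [5, 50, 95]))  # prediction interval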

  3. Implicit Runge-Kutta Methods with Explicit Internal Stages

    NASA Astrophysics Data System (ADS)

    Skvortsov, L. M.

    2018-03-01

    The main computational costs of implicit Runge-Kutta methods are caused by solving a system of algebraic equations at every step. By introducing explicit stages, it is possible to increase the stage (or pseudo-stage) order of the method, which makes it possible to increase the accuracy and avoid reducing the order in solving stiff problems, without additional costs of solving algebraic equations. The paper presents implicit methods with an explicit first stage and one or two explicit internal stages. The results of solving test problems are compared with similar methods having no explicit internal stages.
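
    A minimal sketch of the idea, using the implicit trapezoidal rule written as a two-stage Runge-Kutta method whose first stage is explicit (this is an illustrative scheme, not one of the specific methods proposed in the paper):

        import numpy as np
        from scipy.optimize import fsolve

        def esdirk_trapezoid_step(f, t, y, h):
            """One step of the trapezoidal rule as a two-stage Runge-Kutta method
            with A = [[0, 0], [1/2, 1/2]]: the first stage is explicit, so only
            the second stage requires solving an algebraic system."""
            k1 = f(t, y)                                   # explicit first stage
            def residual(k2):
                return k2 - f(t + h, y + h * (k1 + k2) / 2.0)
            k2 = fsolve(residual, k1)                      # implicit second stage
            return y + h * (k1 + k2) / 2.0

        # Example on a mildly stiff scalar problem y' = -50 * (y - cos(t)).
        f = lambda t, y: -50.0 * (y - np.cos(t))
        t, y, h = 0.0, np.array([1.0]), 0.05
        for _ in range(10):
            y = esdirk_trapezoid_step(f, t, y, h)
            t += h
        print(t, y)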

  4. Logistics Control Facility: A Normative Model for Total Asset Visibility in the Air Force Logistics System

    DTIC Science & Technology

    1994-09-01

    Issue: Computers, information systems, and communication systems are being increasingly used in transportation, warehousing, order processing, materials...inventory levels, reduced order processing times, reduced order processing costs, and increased customer satisfaction. While purchasing and transportation...process, the speed with which orders are processed would increase significantly. Lowering the order processing time in turn lowers the lead time, which in

  5. Microprocessor-Based Systems Control for the Rigidized Inflatable Get-Away-Special Experiment

    DTIC Science & Technology

    2004-03-01

    communications and faster data throughput increase, satellites are becoming larger. Larger satellite antennas help to provide the needed gain to...increase communications in space. Compounding the performance and size trade-offs are the payload weight and size limit imposed by the launch vehicles...increased communications capacity, and reduce launch costs. This thesis develops and implements the computer control system and power system to

  6. 32 CFR 701.52 - Computation of fees.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... correspondence and preparation costs, these fees are not recoupable from the requester. (b) DD 2086, Record of... costs, as requesters may solicit a copy of that document to ensure accurate computation of fees. Costs... 32 National Defense 5 2010-07-01 2010-07-01 false Computation of fees. 701.52 Section 701.52...

  7. 12 CFR 1070.22 - Fees for processing requests for CFPB records.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... CFPB shall charge the requester for the actual direct cost of the search, including computer search time, runs, and the operator's salary. The fee for computer output will be the actual direct cost. For... and the cost of operating the computer to process a request) equals the equivalent dollar amount of...

  8. A Systems Approach to Costing in the Blood Bank

    PubMed Central

    Delon, Gerald L.; Smalley, Harold E.

    1969-01-01

    A macroscopic approach to departmental cost finding is combined with a microscopic approach to the weighting of laboratory tests in a mathematical model which, when incorporated into a relative unit value format, yields unit costs for such tests under a wide variety of operational conditions. The task of updating such costs to reflect changing conditions can be facilitated by a computer program incorporating the capability of pricing the various tests to achieve any desired profit or loss or to break even. Among other potential uses of such a technique, the effects on unit cost per test caused by increasing or decreasing the number of technicians or the volume of tests can be systematically examined, and pricing can be updated each year as hospital costs change. PMID:5799486
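
    A minimal sketch of relative-value-unit costing of the kind described above; the test names, weights, volumes, and department cost below are hypothetical, not the paper's figures:

        # Department cost is spread over tests in proportion to assigned weights,
        # yielding a unit cost per test that can be re-run as conditions change.
        def unit_costs(department_cost, weights, volumes):
            total_weighted_units = sum(weights[t] * volumes[t] for t in weights)
            cost_per_weighted_unit = department_cost / total_weighted_units
            return {t: weights[t] * cost_per_weighted_unit for t in weights}

        weights = {"ABO typing": 1.0, "crossmatch": 2.5, "antibody screen": 1.8}
        volumes = {"ABO typing": 1200, "crossmatch": 800, "antibody screen": 600}
        print(unit_costs(50_000.0, weights, volumes))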

  9. Demonstration of Cost-Effective, High-Performance Computing at Performance and Reliability Levels Equivalent to a 1994 Vector Supercomputer

    NASA Technical Reports Server (NTRS)

    Babrauckas, Theresa

    2000-01-01

    The Affordable High Performance Computing (AHPC) project demonstrated that high-performance computing based on a distributed network of computer workstations is a cost-effective alternative to vector supercomputers for running CPU- and memory-intensive design and analysis tools. The AHPC project created an integrated system called a Network Supercomputer. By connecting computer workstations through a network and utilizing the workstations when they are idle, the resulting distributed-workstation environment has the same performance and reliability levels as the Cray C90 vector supercomputer at less than 25 percent of the C90 cost. In fact, a cost comparison between a Cray C90 supercomputer and Sun workstations showed that the set of networked workstations equivalent to a C90 costs approximately 8 percent of the C90.

  10. SequenceL: Automated Parallel Algorithms Derived from CSP-NT Computational Laws

    NASA Technical Reports Server (NTRS)

    Cooke, Daniel; Rushton, Nelson

    2013-01-01

    With the introduction of new parallel architectures like the cell and multicore chips from IBM, Intel, AMD, and ARM, as well as the petascale processing available for high-end computing, a larger number of programmers will need to write parallel codes. Adding the parallel control structure to the sequence, selection, and iterative control constructs increases the complexity of code development, which often results in increased development costs and decreased reliability. SequenceL is a high-level programming language, that is, a programming language that is closer to a human's way of thinking than to a machine's. Historically, high-level languages have resulted in decreased development costs and increased reliability, at the expense of performance. In recent applications at JSC and in industry, SequenceL has demonstrated the usual advantages of high-level programming in terms of low cost and high reliability. SequenceL programs, however, have run at speeds typically comparable with, and in many cases faster than, their counterparts written in C and C++ when run on single-core processors. Moreover, SequenceL is able to generate parallel executables automatically for multicore hardware, gaining parallel speedups without any extra effort from the programmer beyond what is required to write the sequential/single-core code. A SequenceL-to-C++ translator has been developed that automatically renders readable multithreaded C++ from a combination of a SequenceL program and sample data input. The SequenceL language is based on two fundamental computational laws, Consume-Simplify-Produce (CSP) and Normalize-Transpose (NT), which enable it to automate the creation of parallel algorithms from high-level code that has no annotations of parallelism whatsoever. In our anecdotal experience, SequenceL development has been in every case less costly than development of the same algorithm in sequential (that is, single-core, single process) C or C++, and an order of magnitude less costly than development of comparable parallel code. Moreover, SequenceL not only automatically parallelizes the code, but since it is based on CSP-NT, it is provably race free, thus eliminating the largest quality challenge the parallelized software developer faces.

  11. A survey of computer search service costs in the academic health sciences library.

    PubMed Central

    Shirley, S

    1978-01-01

    The Norris Medical Library, University of Southern California, has recently completed an extensive survey of costs involved in the provision of computer search services beyond vendor charges for connect time and printing. In this survey costs for such items as terminal depreciation, repair contract, personnel time, and supplies are analyzed. Implications of this cost survey are discussed in relation to planning and price setting for computer search services. PMID:708953

  12. Effectiveness and cost effectiveness of television, radio and print advertisements in promoting the New York smokers' quitline.

    PubMed

    Farrelly, Matthew C; Hussin, Altijani; Bauer, Ursula E

    2007-12-01

    This study assessed the relative effectiveness and cost effectiveness of television, radio and print advertisements to generate calls to the New York smokers' quitline. Regression analysis was used to link total county level monthly quitline calls to television, radio and print advertising expenditures. Based on regression results, standardised measures of the relative effectiveness and cost effectiveness of expenditures were computed. There was a positive and statistically significant relation between call volume and expenditures for television (p<0.01) and radio (p<0.001) advertisements and a marginally significant effect for expenditures on newspaper advertisements (p<0.065). The largest effect was for television advertising. However, because of differences in advertising costs, for every $1000 increase in television, radio and newspaper expenditures, call volume increased by 0.1%, 5.7% and 2.8%, respectively. Television, radio and print media all effectively increased calls to the New York smokers' quitline. Although increases in expenditures for television were the most effective, their relatively high costs suggest they are not currently the most cost effective means to promote a quitline. This implies that a more efficient mix of media would place greater emphasis on radio than television. However, because the current study does not adequately assess the extent to which radio expenditures would sustain their effectiveness with substantial expenditure increases, it is not feasible to determine a more optimal mix of expenditures.
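
    An illustrative sketch of the kind of regression described above, fitting log call volume against spending on each medium by ordinary least squares; the data are simulated for the example, and the study's model also included county- and month-level structure that is omitted here:

        import numpy as np

        # Simulated monthly observations: spending in thousands of dollars and
        # log call volume (made-up data for illustration only).
        rng = np.random.default_rng(2)
        n = 120
        tv, radio, paper = rng.uniform(0, 50, (3, n))
        log_calls = 5 + 0.001 * tv + 0.057 * radio + 0.028 * paper \
                    + rng.normal(0, 0.1, n)

        X = np.column_stack([np.ones(n), tv, radio, paper])
        coef, *_ = np.linalg.lstsq(X, log_calls, rcond=None)
        # Each slope approximates the proportional change in calls per extra
        # $1000 of spending on that channel.
        print(dict(zip(["intercept", "tv", "radio", "print"], coef)))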

  13. Cost effectiveness of a medical digital library.

    PubMed

    Roussel, F; Darmoni, S J; Thirion, B

    2001-01-01

    The rapid increase in the price of electronic journals has made the optimization of collection management an urgent task. As there is currently no standard procedure for the evaluation of this problem, we applied the Reading Factor (RF), an electronically computed indicator used for consultation of individual articles. The aim of our study was to assess the cost effective impact of modifications in our digital library (i.e. change of access from the Intranet to the Internet or change in editorial policy). The digital OVID library at Rouen University Hospital continues to be cost-effective in comparison with the interlibrary loan costs. Moreover, when electronic versions are offered alongside a limited amount of interlibrary loans, a reduction in library costs was observed.

  14. Computational analysis of water entry of a circular section at constant velocity based on Reynold's averaged Navier-Stokes method

    NASA Astrophysics Data System (ADS)

    Uddin, M. Maruf; Fuad, Muzaddid-E.-Zaman; Rahaman, Md. Mashiur; Islam, M. Rabiul

    2017-12-01

    With the rapid decrease in the cost of computational infrastructure and more efficient algorithms for solving non-linear problems, Reynolds-averaged Navier-Stokes (RaNS) based Computational Fluid Dynamics (CFD) is now widely used. As a preliminary evaluation tool, CFD is used to calculate the hydrodynamic loads on offshore installations, ships, and other structures in the ocean at initial design stages. Traditionally, wedges have been studied more than circular cylinders because the cylinder section has zero deadrise angle at the instant of water impact, which increases with increasing submergence. In the present study, the RaNS-based commercial code ANSYS Fluent is used to simulate the water entry of a circular section at constant velocity. The present computational results are compared with experimental data and another numerical method.

  15. A Fresh Perspective on a Familiar Problem: Examining Disparities in Knee Osteoarthritis Using a Markov Model.

    PubMed

    Karmarkar, Taruja D; Maurer, Anne; Parks, Michael L; Mason, Thomas; Bejinez-Eastman, Ana; Harrington, Melvyn; Morgan, Randall; O'Connor, Mary I; Wood, James E; Gaskin, Darrell J

    2017-12-01

    Disparities in the presentation of knee osteoarthritis (OA) and in the utilization of treatment across sex, racial, and ethnic groups in the United States are well documented. We used a Markov model to calculate lifetime costs of knee OA treatment. We then used the model results to compute costs of disparities in treatment by race, ethnicity, sex, and socioeconomic status. We used the literature to construct a Markov Model of knee OA and publicly available data to create the model parameters and patient populations of interest. An expert panel of physicians, who treated a large number of patients with knee OA, constructed treatment pathways. Direct costs were based on the literature and indirect costs were derived from the Medical Expenditure Panel Survey. We found that failing to obtain effective treatment increased costs and limited benefits for all groups. Delaying treatment imposed a greater cost across all groups and decreased benefits. Lost income because of lower labor market productivity comprised a substantial proportion of the lifetime costs of knee OA. Population simulations demonstrated that as the diversity of the US population increases, the societal costs of racial and ethnic disparities in treatment utilization for knee OA will increase. Our results show that disparities in treatment of knee OA are costly. All stakeholders involved in treatment decisions for knee OA patients should consider costs associated with delaying and forgoing treatment, especially for disadvantaged populations. Such decisions may lead to higher costs and worse health outcomes.
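
    A minimal three-state Markov cohort model in the spirit of the analysis described above, with annual cycles, per-state costs, and discounting; the states, transition probabilities, and costs are placeholder assumptions, not the published model's parameters:

        import numpy as np

        # Rows are from-states, columns are to-states; each row sums to 1.
        states = ["moderate OA", "severe OA", "dead"]
        P = np.array([[0.85, 0.10, 0.05],
                      [0.00, 0.90, 0.10],
                      [0.00, 0.00, 1.00]])
        annual_cost = np.array([2_000.0, 6_000.0, 0.0])   # direct + indirect, per year
        discount = 0.03

        cohort = np.array([1.0, 0.0, 0.0])                # everyone starts moderate
        lifetime_cost = 0.0
        for year in range(40):
            lifetime_cost += (cohort @ annual_cost) / (1 + discount) ** year
            cohort = cohort @ P                           # advance one cycle
        print(round(lifetime_cost))                       # expected discounted cost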

  16. A cost/benefit analysis of commercial fusion-fission hybrid reactor development

    NASA Astrophysics Data System (ADS)

    Kostoff, Ronald N.

    1983-04-01

    A simple algorithm was developed that allows rapid computation of the ratio, R, of present worth of benefits to present worth of hybrid R&D program costs as a function of potential hybrid unit electricity cost savings, discount rate, electricity demand growth rate, total hybrid R&D program cost, and time to complete a demonstration reactor. In the sensitivity study, these variables were assigned nominal values (unit electricity cost savings of 4 mills/kW-hr, discount rate of 4%/year, growth rate of 2.25%/year, total R&D program cost of 20 billion, and time to complete a demonstration reactor of 30 years), and the variable of interest was varied about its nominal value. Results show that R increases with decreasing discount rate and increasing unit electricity savings and ranges from 4 to 94 as discount rate ranges from 5 to 3%/year and unit electricity savings range from 2 to 6 mills/kW-hr. R increases with increasing growth rate and ranges from 3 to 187 as growth rate ranges from 1 to 3.5%/year and unit electricity cost savings range from 2 to 6 mills/kW-hr. R attains a maximum value when plotted against time to complete a demonstration reactor. The location of this maximum value occurs at shorter completion times as discount rate increases, and this optimal completion time ranges from 20 years for a discount rate of 4%/year to 45 years for a discount rate of 3%/year.
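
    A rough, heavily simplified reconstruction of the kind of present-worth ratio described above; the electricity demand base, benefit horizon, and uniform R&D spending profile are assumptions made for this sketch, not inputs reported in the paper:

        # Benefits: annual electricity-cost savings on a demand base growing at
        # rate g, starting after the demonstration reactor is complete.
        # Costs: the R&D program spread evenly over the development period.
        def benefit_cost_ratio(savings_per_kwh, discount, growth, rd_cost,
                               years_to_demo, base_kwh_per_year, benefit_years=50):
            pw_costs = sum(rd_cost / years_to_demo / (1 + discount) ** t
                           for t in range(years_to_demo))
            pw_benefits = sum(savings_per_kwh * base_kwh_per_year
                              * (1 + growth) ** t / (1 + discount) ** t
                              for t in range(years_to_demo,
                                             years_to_demo + benefit_years))
            return pw_benefits / pw_costs

        # 4 mills/kWh = $0.004/kWh; nominal values from the abstract, with an
        # assumed demand base of 2.5e12 kWh/year.
        print(benefit_cost_ratio(0.004, 0.04, 0.0225, 20e9, 30, 2.5e12))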

  17. The Computer as a Tool for Learning

    PubMed Central

    Starkweather, John A.

    1986-01-01

    Experimenters from the beginning recognized the advantages computers might offer in medical education. Several medical schools have gained experience in such programs in automated instruction. Television images and graphic display combined with computer control and user interaction are effective for teaching problem solving. The National Board of Medical Examiners has developed patient-case simulation for examining clinical skills, and the National Library of Medicine has experimented with combining media. Advances from the field of artificial intelligence and the availability of increasingly powerful microcomputers at lower cost will aid further development. Computers will likely affect existing educational methods, adding new capabilities to laboratory exercises, to self-assessment and to continuing education. PMID:3544511

  18. Method and computer program product for maintenance and modernization backlogging

    DOEpatents

    Mattimore, Bernard G; Reynolds, Paul E; Farrell, Jill M

    2013-02-19

    According to one embodiment, a computer program product for determining future facility conditions includes a computer readable medium having computer readable program code stored therein. The computer readable program code includes computer readable program code for calculating a time period specific maintenance cost, for calculating a time period specific modernization factor, and for calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. In another embodiment, a computer-implemented method for calculating future facility conditions includes calculating a time period specific maintenance cost, calculating a time period specific modernization factor, and calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. Other embodiments are also presented.

  19. COMPUTER PROGRAM FOR CALCULATING THE COST OF DRINKING WATER TREATMENT SYSTEMS

    EPA Science Inventory

    This FORTRAN computer program calculates the construction and operation/maintenance costs for 45 centralized unit treatment processes for water supply. The calculated costs are based on various design parameters and raw water quality. These cost data are applicable to small size ...

  20. The modelling of the flow-induced vibrations of periodic flat and axial-symmetric structures with a wave-based method

    NASA Astrophysics Data System (ADS)

    Errico, F.; Ichchou, M.; De Rosa, S.; Bareille, O.; Franco, F.

    2018-06-01

    The stochastic response of periodic flat and axial-symmetric structures, subjected to random and spatially-correlated loads, is here analysed through an approach based on the combination of a wave finite element and a transfer matrix method. Although it has a lower computational cost, the present approach keeps the same accuracy as classic finite element methods. When dealing with homogeneous structures, the accuracy also extends to higher frequencies, without increasing the time of calculation. Depending on the complexity of the structure and the frequency range, the computational cost can be reduced by more than two orders of magnitude. The presented methodology is validated both for simple and complex structural shapes, under deterministic and random loads.

  1. Advanced On-Board Processor (AOP). [for future spacecraft applications

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The Advanced On-board Processor (AOP) uses large scale integration throughout and is the most advanced space qualified computer of its class in existence today. It was designed to satisfy most spacecraft requirements which are anticipated over the next several years. The AOP design utilizes custom metallized multigate arrays (CMMA) which have been designed specifically for this computer. This approach provides the most efficient use of circuits; it reduces volume, weight, and assembly costs, and provides a significant increase in reliability through the reduction in conventional circuit interconnections. The required 69 CMMA packages are assembled on a single multilayer printed circuit board which, together with associated connectors, constitutes the complete AOP. This approach also reduces conventional interconnections, thus further reducing weight, volume and assembly costs.

  2. Six degree of freedom sensor

    DOEpatents

    Vann, Charles S.

    1999-01-01

    This small, non-contact optical sensor increases the capability and flexibility of computer controlled machines by detecting its relative position to a workpiece in all six degrees of freedom (DOF). At a fraction of the cost, it is over 200 times faster and up to 25 times more accurate than competing 3-DOF sensors. Applications range from flexible manufacturing to a 6-DOF mouse for computers. Until now, highly agile and accurate machines have been limited by their inability to adjust to changes in their tasks. By enabling them to sense all six degrees of position, these machines can now adapt to new and complicated tasks without human intervention or delay--simplifying production, reducing costs, and enhancing the value and capability of flexible manufacturing.

  3. Six degree of freedom sensor

    DOEpatents

    Vann, C.S.

    1999-03-16

    This small, non-contact optical sensor increases the capability and flexibility of computer controlled machines by detecting its relative position to a workpiece in all six degrees of freedom (DOF). At a fraction of the cost, it is over 200 times faster and up to 25 times more accurate than competing 3-DOF sensors. Applications range from flexible manufacturing to a 6-DOF mouse for computers. Until now, highly agile and accurate machines have been limited by their inability to adjust to changes in their tasks. By enabling them to sense all six degrees of position, these machines can now adapt to new and complicated tasks without human intervention or delay--simplifying production, reducing costs, and enhancing the value and capability of flexible manufacturing. 3 figs.

  4. New protocol for construction of eyeglasses-supported provisional nasal prosthesis using CAD/CAM techniques.

    PubMed

    Ciocca, Leonardo; Fantini, Massimiliano; De Crescenzio, Francesca; Persiani, Franco; Scotti, Roberto

    2010-01-01

    A new protocol for making an immediate provisional eyeglasses-supported nasal prosthesis is presented that uses laser scanning, computer-aided design/computer-aided manufacturing procedures, and rapid prototyping techniques, reducing time and costs while increasing the quality of the final product. With this protocol, the eyeglasses were digitized, and the relative position of the nasal prosthesis was planned and evaluated in a virtual environment without any try-in appointment. This innovative method saves time, reduces costs, and restores the patient's aesthetic appearance after a disfiguration caused by ablation of the nasal pyramid better than conventional restoration methods. Moreover, the digital model of the designed nasal epithesis can be used to develop a definitive prosthesis anchored to osseointegrated craniofacial implants.

  5. Cost-effective use of minicomputers to solve structural problems

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Foster, E. P.

    1978-01-01

    Minicomputers are receiving increased use throughout the aerospace industry. Until recently, their use focused primarily on process control and numerically controlled tooling applications, while their exposure to and the opportunity for structural calculations has been limited. With the increased availability of this computer hardware, the question arises as to the feasibility and practicality of carrying out comprehensive structural analysis on a minicomputer. This paper presents results on the potential for using minicomputers for structural analysis by (1) selecting a comprehensive, finite-element structural analysis system in use on large mainframe computers; (2) implementing the system on a minicomputer; and (3) comparing the performance of the minicomputers with that of a large mainframe computer for the solution to a wide range of finite element structural analysis problems.

  6. Multiple operating system rotation environment moving target defense

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Nathaniel; Thompson, Michael

    Systems and methods for providing a multiple operating system rotation environment ("MORE") moving target defense ("MTD") computing system are described. The MORE-MTD system provides enhanced computer system security through a rotation of multiple operating systems. The MORE-MTD system increases attacker uncertainty, increases the cost of attacking the system, reduces the likelihood of an attacker locating a vulnerability, and reduces the exposure time of any located vulnerability. The MORE-MTD environment is effectuated by rotation of the operating systems at a given interval. The rotating operating systems create a consistently changing attack surface for remote attackers.

  7. Estimating costs and performance of systems for machine processing of remotely sensed data

    NASA Technical Reports Server (NTRS)

    Ballard, R. J.; Eastwood, L. F., Jr.

    1977-01-01

    This paper outlines a method for estimating computer processing times and costs incurred in producing information products from digital remotely sensed data. The method accounts for both computation and overhead, and may be applied to any serial computer. The method is applied to estimate the cost and computer time involved in producing Level II Land Use and Vegetative Cover Maps for a five-state midwestern region. The results show that the amount of data to be processed overloads some example computer systems, but that the processing is feasible on others.
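
    A minimal sketch of the kind of estimate such a method produces: processing time from pixel counts, operations per pixel, and machine speed, with a fixed overhead fraction, then cost from an hourly machine rate; all numbers below are illustrative assumptions, not values from the paper:

        # Time = computation plus a proportional overhead; cost = time x rate.
        def processing_time_hours(n_pixels, ops_per_pixel, ops_per_second,
                                  overhead_fraction=0.3):
            compute_seconds = n_pixels * ops_per_pixel / ops_per_second
            return compute_seconds * (1 + overhead_fraction) / 3600.0

        def processing_cost(n_pixels, ops_per_pixel, ops_per_second,
                            dollars_per_hour, overhead_fraction=0.3):
            hours = processing_time_hours(n_pixels, ops_per_pixel,
                                          ops_per_second, overhead_fraction)
            return hours * dollars_per_hour

        # e.g. a multi-state mosaic of ~2e9 pixels classified at 500 ops/pixel on
        # a 5 MIPS serial machine billed at $500/hour (all assumed figures).
        print(processing_time_hours(2e9, 500, 5e6),
              processing_cost(2e9, 500, 5e6, 500))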

  8. 26 CFR 7.57(d)-1 - Election with respect to straight line recovery of intangibles.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Tax Reform Act of 1976. Under this election taxpayers may use cost depletion to compute straight line... wells to which the election applies, cost depletion to compute straight line recovery of intangibles for... whether or not the taxpayer uses cost depletion in computing taxable income. (5) The election is made by a...

  9. The Processing Cost of Reference Set Computation: Acquisition of Stress Shift and Focus

    ERIC Educational Resources Information Center

    Reinhart, Tanya

    2004-01-01

    Reference set computation -- the construction of a (global) comparison set to determine whether a given derivation is appropriate in context -- comes with a processing cost. I argue that this cost is directly visible at the acquisition stage: In those linguistic areas in which it has been independently established that such computation is indeed…

  10. Some Useful Cost-Benefit Criteria for Evaluating Computer-Based Test Delivery Models and Systems

    ERIC Educational Resources Information Center

    Luecht, Richard M.

    2005-01-01

    Computer-based testing (CBT) is typically implemented using one of three general test delivery models: (1) multiple fixed testing (MFT); (2) computer-adaptive testing (CAT); or (3) multistage testing (MSTs). This article reviews some of the real cost drivers associated with CBT implementation--focusing on item production costs, the costs…

  11. Cost-effectiveness analysis of unsafe abortion and alternative first-trimester pregnancy termination strategies in Nigeria and Ghana.

    PubMed

    Hu, Delphine; Grossman, Daniel; Levin, Carol; Blanchard, Kelly; Adanu, Richard; Goldie, Sue J

    2010-06-01

    To explore the policy implications of increasing access to safe abortion in Nigeria and Ghana, we developed a computer-based decision analytic model which simulates induced abortion and its potential complications in a cohort of women, and comparatively assessed the cost-effectiveness of unsafe abortion and three first-trimester abortion modalities: hospital-based dilatation and curettage, hospital- and clinic-based manual vacuum aspiration (MVA), and medical abortion using misoprostol (MA). Assuming all modalities are equally available, clinic-based MVA is the most cost-effective option in Nigeria. If clinic-based MVA is not available, MA is the next best strategy. Conversely, in Ghana, MA is the most cost-effective strategy, followed by clinic-based MVA if MA is not available. From a real world policy perspective, increasing access to safe abortion in place of unsafe abortion is the single most important factor in saving lives and societal costs, and is more influential than the actual choice of safe abortion modality.

  12. Satellite broadcasting system study

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The study to develop a system model and computer program representative of broadcasting satellite systems employing community-type receiving terminals is reported. The program provides a user-oriented tool for evaluating performance/cost tradeoffs, synthesizing minimum cost systems for a given set of system requirements, and performing sensitivity analyses to identify critical parameters and technology. The performance/costing philosophy and what is meant by a minimum cost system is shown graphically. Topics discussed include: main line control program, ground segment model, space segment model, cost models and launch vehicle selection. Several examples of minimum cost systems resulting from the computer program are presented. A listing of the computer program is also included.

  13. Correlating Computed and Flight Instructor Assessments of Straight-In Landing Approaches by Novice Pilots on a Flight Simulator

    NASA Technical Reports Server (NTRS)

    Heath, Bruce E.; Khan, M. Javed; Rossi, Marcia; Ali, Syed Firasat

    2005-01-01

    The rising cost of flight training and the low cost of powerful computers have resulted in increasing use of PC-based flight simulators. This has prompted FAA standards regulating such use and allowing aspects of training on simulators meeting these standards to be substituted for flight time. However, the FAA regulations require an authorized flight instructor as part of the training environment. Thus, while costs associated with flight time have been reduced, the cost associated with the need for a flight instructor still remains. The obvious area of research, therefore, has been to develop intelligent simulators. However, the two main challenges of such attempts have been training strategies and assessment. The research reported in this paper was conducted to evaluate various performance metrics of a straight-in landing approach by 33 novice pilots flying a light single engine aircraft simulation. These metrics were compared to assessments of these flights by two flight instructors to establish a correlation between the two techniques in an attempt to determine a composite performance metric for this flight maneuver.

  14. Thin Client Architecture: The Promise and the Problems.

    ERIC Educational Resources Information Center

    Machovec, George S.

    1997-01-01

    Describes thin clients, a networking technology that allows organizations to provide software applications over networked workstations connected to a central server. Topics include corporate settings; major advantages, including cost effectiveness and increased computer security; problems; and possible applications for large public and academic…

  15. Microcomputers in the Anesthesia Library.

    ERIC Educational Resources Information Center

    Wright, A. J.

    The combination of computer technology and library operation is helping to alleviate such library problems as escalating costs, increasing collection size, deteriorating materials, unwieldy arrangement schemes, poor subject control, and the acquisition and processing of large numbers of rarely used documents. Small special libraries such as…

  16. Increases in muscle strength and balance using a resistance training program administered via a telecommunications system in older adults.

    PubMed

    Sparrow, David; Gottlieb, Daniel J; Demolles, Deborah; Fielding, Roger A

    2011-11-01

    Resistance training programs have been found to improve muscle strength, physical function, and depressive symptoms in middle-aged and older adults. These programs have typically been provided in clinical facilities, health clubs, and senior centers, which may be inconvenient and/or cost prohibitive for some older adults. The purpose of this study was to investigate the effectiveness of an automated telemedicine intervention that provides real-time guidance and monitoring of resistance training in the home. The study was a randomized clinical trial in 103 middle-aged or older participants. Participants were assigned to use of a theory-driven interactive voice response system designed to promote resistance training (Telephone-Linked Computer-based Long-term Interactive Fitness Trainer; n = 52) or to an attention control (n = 51) for a period of 12 months. Measurements of muscle strength, balance, walk distance, and mood were obtained at baseline, 3, 6, and 12 months. We observed greater strength, better balance, and fewer depressive symptoms in the intervention group than in the control group. Using generalized estimating equations modeling, group differences were statistically significant for knee flexion strength (p = .035), single-leg stance time (p = .029), and Beck Depression Inventory (p = .030). This computer-based telecommunications exercise intervention led to improvements in participants' strength, balance, and depressive symptoms. Because of their low cost and easy accessibility, computer-based interventions may be a cost-effective way of promoting exercise in the home.

  17. Solubility prediction, solvate and cocrystal screening as tools for rational crystal engineering.

    PubMed

    Loschen, Christoph; Klamt, Andreas

    2015-06-01

    The fact that novel drug candidates are becoming increasingly insoluble is a major problem of current drug development. Computational tools may address this issue by screening for suitable solvents or by identifying potential novel cocrystal formers that increase bioavailability. In contrast to other more specialized methods, the fluid phase thermodynamics approach COSMO-RS (conductor-like screening model for real solvents) allows for a comprehensive treatment of drug solubility, solvate and cocrystal formation and many other thermodynamics properties in liquids. This article gives an overview of recent COSMO-RS developments that are of interest for drug development and contains several new application examples for solubility prediction and solvate/cocrystal screening. For all property predictions COSMO-RS has been used. The basic concept of COSMO-RS consists of using the screening charge density as computed from first principles calculations in combination with fast statistical thermodynamics to compute the chemical potential of a compound in solution. The fast and accurate assessment of drug solubility and the identification of suitable solvents, solvate or cocrystal formers is nowadays possible and may be used to complement modern drug development. Efficiency is increased by avoiding costly quantum-chemical computations using a database of previously computed molecular fragments. COSMO-RS theory can be applied to a range of physico-chemical properties, which are of interest in rational crystal engineering. Most notably, in combination with experimental reference data, accurate quantitative solubility predictions in any solvent or solvent mixture are possible. Additionally, COSMO-RS can be extended to the prediction of cocrystal formation, which results in considerable predictive accuracy concerning coformer screening. In a recent variant costly quantum chemical calculations are avoided resulting in a significant speed-up and ease-of-use. © 2015 Royal Pharmaceutical Society.

  18. Neural basis of increased costly norm enforcement under adversity.

    PubMed

    Wu, Yan; Yu, Hongbo; Shen, Bo; Yu, Rongjun; Zhou, Zhiheng; Zhang, Guoping; Jiang, Yushi; Zhou, Xiaolin

    2014-12-01

    Humans are willing to punish norm violations even at a substantial personal cost. Using a variant of the ultimatum game and functional magnetic resonance imaging (fMRI), we investigated how the brain differentially responds to fairness in loss and gain domains. Participants (responders) received offers from anonymous partners indicating a division of an amount of monetary gain or loss. If they accept, both get their shares according to the division; if they reject, both get nothing or lose the entire stake. We used a computational model to derive perceived fairness of offers and participant-specific inequity aversion. Behaviorally, participants were more likely to reject unfair offers in the loss (vs gain) domain. Neurally, the positive correlation between fairness and activation in ventral striatum was reduced, whereas the negative correlations between fairness and activations in dorsolateral prefrontal cortex were enhanced in the loss domain. Moreover, rejection-related dorsal striatum activation was higher in the loss domain. Furthermore, the gain-loss domain modulated costly punishment only when unfair behavior was directed toward the participants and not when it was directed toward others. These findings provide neural and computational accounts of increased costly norm enforcement under adversity and advance our understanding of the context-dependent nature of fairness preference. © The Author (2014). Published by Oxford University Press.
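
    One common formulation of the kind of computational model described above is a Fehr-Schmidt inequity-aversion utility combined with a softmax choice rule; the sketch below is illustrative for the gain domain, and the parameter values and the paper's exact specification may differ:

        import numpy as np

        def responder_utility(own, other, alpha=1.0, beta=0.25):
            """Utility = own payoff minus penalties for disadvantageous (alpha)
            and advantageous (beta) inequity (Fehr-Schmidt form)."""
            return own - alpha * max(other - own, 0) - beta * max(own - other, 0)

        def p_accept(own, other, temperature=1.0):
            """Softmax over accepting the offer vs rejecting (both get 0 in the
            gain domain; in the loss domain rejection would instead mean both
            lose the entire stake, changing u_reject)."""
            u_accept = responder_utility(own, other)
            u_reject = responder_utility(0.0, 0.0)
            return 1.0 / (1.0 + np.exp(-(u_accept - u_reject) / temperature))

        # An 8/2 split of a 10-unit gain: the responder keeps 2, the proposer 8.
        print(p_accept(own=2.0, other=8.0))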

  19. An Analysis of Failure Handling in Chameleon, A Framework for Supporting Cost-Effective Fault Tolerant Services

    NASA Technical Reports Server (NTRS)

    Haakensen, Erik Edward

    1998-01-01

    The desire for low-cost reliable computing is increasing. Most current fault tolerant computing solutions are not very flexible, i.e., they cannot adapt to reliability requirements of newly emerging applications in business, commerce, and manufacturing. It is important that users have a flexible, reliable platform to support both critical and noncritical applications. Chameleon, under development at the Center for Reliable and High-Performance Computing at the University of Illinois, is a software framework for supporting cost-effective adaptable networked fault tolerant service. This thesis details a simulation of fault injection, detection, and recovery in Chameleon. The simulation was written in C++ using the DEPEND simulation library. The results obtained from the simulation included the amount of overhead incurred by the fault detection and recovery mechanisms supported by Chameleon. In addition, information about fault scenarios from which Chameleon cannot recover was gained. The results of the simulation showed that both critical and noncritical applications can be executed in the Chameleon environment with a fairly small amount of overhead. No single point of failure from which Chameleon could not recover was found. Chameleon was also found to be capable of recovering from several multiple failure scenarios.

  20. Brief report: a cost analysis of neuraxial anesthesia to facilitate external cephalic version for breech fetal presentation.

    PubMed

    Carvalho, Brendan; Tan, Jonathan M; Macario, Alex; El-Sayed, Yasser Y; Sultan, Pervez

    2013-07-01

    In this study, we sought to determine whether neuraxial anesthesia to facilitate external cephalic version (ECV) increased delivery costs for breech fetal presentation. Using a computer cost model, which considers possible outcomes and probability uncertainties at the same time, we estimated total expected delivery costs for breech presentation managed by a trial of ECV with and without neuraxial anesthesia. From published studies, the average probability of successful ECV with neuraxial anesthesia was 60% (with individual studies ranging from 44% to 87%) compared with 38% (with individual studies ranging from 31% to 58%) without neuraxial anesthesia. The mean expected total delivery costs, including the cost of attempting/performing ECV with anesthesia, equaled $8931 (2.5th-97.5th percentile prediction interval $8541-$9252). The cost was $9207 (2.5th-97.5th percentile prediction interval $8896-$9419) if ECV was attempted/performed without anesthesia. The expected mean incremental difference between the total cost of delivery that includes ECV with anesthesia and ECV without anesthesia was $-276 (2.5th-97.5th percentile prediction interval $-720 to $112). The total cost of delivery in women with breech presentation may be decreased (up to $720) or increased (up to $112) if ECV is attempted/performed with neuraxial anesthesia compared with ECV without neuraxial anesthesia. Increased ECV success with neuraxial anesthesia and the subsequent reduction in breech cesarean delivery rate offset the costs of providing anesthesia to facilitate ECV.
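
    A minimal two-branch expected-cost sketch in the spirit of the model described above, assuming a successful ECV leads to vaginal delivery and a failed ECV to cesarean delivery; the success probabilities are the pooled figures quoted in the abstract, while the ECV, anesthesia, and delivery cost inputs are placeholder assumptions, not the study's values:

        # Expected cost = ECV (and anesthesia, if used) cost plus the
        # probability-weighted cost of the resulting delivery mode.
        def expected_delivery_cost(p_ecv_success, ecv_cost,
                                   vaginal_cost=7_000.0, cesarean_cost=11_000.0):
            return (ecv_cost
                    + p_ecv_success * vaginal_cost
                    + (1 - p_ecv_success) * cesarean_cost)

        # ecv_cost for the anesthesia arm includes an assumed anesthesia charge.
        with_anesthesia = expected_delivery_cost(0.60, ecv_cost=1_500.0)
        without_anesthesia = expected_delivery_cost(0.38, ecv_cost=800.0)
        print(with_anesthesia, without_anesthesia,
              with_anesthesia - without_anesthesia)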

  1. Identifying Differences between Depressed Adolescent Suicide Ideators and Attempters

    PubMed Central

    Auerbach, Randy P.; Millner, Alexander J.; Stewart, Jeremy G.; Esposito, Erika

    2015-01-01

    Background Adolescent depression and suicide are pressing public health concerns, and identifying key differences among suicide ideators and attempters is critical. The goal of the current study is to test whether depressed adolescent suicide attempters report greater anhedonia severity and exhibit aberrant effort-cost computations in the face of uncertainty. Methods Depressed adolescents (n = 101) ages 13–19 years were administered structured clinical interviews to assess current mental health disorders and a history of suicidality (suicide ideators = 55, suicide attempters = 46). Then, participants completed self-report instruments assessing symptoms of suicidal ideation, depression, anhedonia, and anxiety as well as a computerized effort-cost computation task. Results Compared with depressed adolescent suicide ideators, attempters report greater anhedonia severity, even after concurrently controlling for symptoms of suicidal ideation, depression, and anxiety. Additionally, when completing the effort-cost computation task, suicide attempters are less likely to pursue the difficult, high value option when outcomes are uncertain. Follow-up, trial-level analyses of effort-cost computations suggest that receipt of reward does not influence future decision-making among suicide attempters, however, suicide ideators exhibit a win-stay approach when receiving rewards on previous trials. Limitations Findings should be considered in light of limitations including a modest sample size, which limits generalizability, and the cross-sectional design. Conclusions Depressed adolescent suicide attempters are characterized by greater anhedonia severity, which may impair the ability to integrate previous rewarding experiences to inform future decisions. Taken together, this may generate a feeling of powerlessness that contributes to increased suicidality and a needless loss of life. PMID:26233323

  2. Costs and benefits of integrating information between the cerebral hemispheres: a computational perspective.

    PubMed

    Belger, A; Banich, M T

    1998-07-01

    Because interaction of the cerebral hemispheres has been found to aid task performance under demanding conditions, the present study examined how this effect is moderated by computational complexity, the degree of lateralization for a task, and individual differences in asymmetric hemispheric activation (AHA). Computational complexity was manipulated across tasks either by increasing the number of inputs to be processed or by increasing the number of steps to a decision. Comparison of within- and across-hemisphere trials indicated that the size of the between-hemisphere advantage increased as a function of task complexity, except for a highly lateralized rhyme decision task that can only be performed by the left hemisphere. Measures of individual differences in AHA revealed that when task demands and an individual's AHA both load on the same hemisphere, the ability to divide the processing between the hemispheres is limited. Thus, interhemispheric division of processing improves performance at higher levels of computational complexity only when the required operations can be divided between the hemispheres.

  3. Estimating Software-Development Costs With Greater Accuracy

    NASA Technical Reports Server (NTRS)

    Baker, Dan; Hihn, Jairus; Lum, Karen

    2008-01-01

    COCOMOST is a computer program for use in estimating software development costs. The goal in the development of COCOMOST was to increase estimation accuracy in three ways: (1) develop a set of sensitivity software tools that return not only estimates of costs but also the estimation error; (2) using the sensitivity software tools, precisely define the quantities of data needed to adequately tune cost estimation models; and (3) build a repository of software-cost-estimation information that NASA managers can retrieve to improve the estimates of costs of developing software for their projects. COCOMOST implements a methodology, called '2cee', in which a unique combination of well-known pre-existing data-mining and software-development-effort-estimation techniques is used to increase the accuracy of estimates. COCOMOST utilizes multiple models to analyze historical data pertaining to software-development projects and performs an exhaustive data-mining search over the space of model parameters to improve the performances of effort-estimation models. Thus, it is possible to both calibrate and generate estimates at the same time. COCOMOST is written in the C language for execution in the UNIX operating system.
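    The details of COCOMOST's '2cee' methodology are not given in the abstract. The hypothetical Python sketch below only shows the flavor of the underlying idea: calibrate a COCOMO-style effort model (effort = a * KLOC^b) on historical project data and report both an estimate and a rough error band. The historical data points and the residual-based error measure are assumptions for illustration.

```python
import numpy as np

# Hypothetical historical projects: (size in KSLOC, actual effort in person-months).
history = np.array([
    (12.0, 38.0), (25.0, 96.0), (40.0, 170.0), (8.0, 22.0),
    (60.0, 280.0), (15.0, 50.0), (33.0, 120.0), (50.0, 230.0),
])
kloc, effort = history[:, 0], history[:, 1]

# Fit effort = a * KLOC**b by ordinary least squares in log space.
X = np.column_stack([np.ones_like(kloc), np.log(kloc)])
coef, *_ = np.linalg.lstsq(X, np.log(effort), rcond=None)
a, b = np.exp(coef[0]), coef[1]

residuals = np.log(effort) - X @ coef
sigma = residuals.std(ddof=2)  # log-space spread, a crude estimation-error measure

new_size = 30.0                # KSLOC of the project to be estimated
pm = a * new_size ** b
print(f"calibrated model: effort = {a:.2f} * KLOC^{b:.2f}")
print(f"estimate for {new_size} KSLOC: {pm:.0f} person-months "
      f"(rough 68% range {pm * np.exp(-sigma):.0f}-{pm * np.exp(sigma):.0f})")
```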

  4. Low-Budget Computer Programming in Your School (An Alternative to the Cost of Large Computers). Illinois Series on Educational Applications of Computers. No. 14.

    ERIC Educational Resources Information Center

    Dennis, J. Richard; Thomson, David

    This paper is concerned with a low cost alternative for providing computer experience to secondary school students. The brief discussion covers the programmable calculator and its relevance for teaching the concepts and the rudiments of computer programming and for computer problem solving. A list of twenty-five programming activities related to…

  5. Development of a low-cost virtual reality workstation for training and education

    NASA Technical Reports Server (NTRS)

    Phillips, James A.

    1996-01-01

    Virtual Reality (VR) is a set of breakthrough technologies that allow a human being to enter and fully experience a 3-dimensional, computer simulated environment. A true virtual reality experience meets three criteria: (1) it involves 3-dimensional computer graphics; (2) it includes real-time feedback and response to user actions; and (3) it must provide a sense of immersion. Good examples of a virtual reality simulator are the flight simulators used by all branches of the military to train pilots for combat in high performance jet fighters. The fidelity of such simulators is extremely high -- but so is the price tag, typically millions of dollars. Virtual reality teaching and training methods are manifestly effective, but the high cost of VR technology has limited its practical application to fields with big budgets, such as military combat simulation, commercial pilot training, and certain projects within the space program. However, in the last year there has been a revolution in the cost of VR technology. The speed of inexpensive personal computers has increased dramatically, especially with the introduction of the Pentium processor and the PCI bus for IBM-compatibles, and the cost of high-quality virtual reality peripherals has plummeted. The result is that many public schools, colleges, and universities can afford a PC-based workstation capable of running immersive virtual reality applications. My goal this summer was to assemble and evaluate such a system.

  6. Computers in Education: Their Use and Cost, Education Automation Monograph Number 2.

    ERIC Educational Resources Information Center

    American Data Processing, Inc., Detroit, MI.

    This monograph on the cost and use of computers in education consists of two parts. Part I is a report of the President's Science Advisory Committee concerning the cost and use of the computer in undergraduate, secondary, and higher education. In addition, the report contains a discussion of the interaction between research and educational uses of…

  7. A computer program for analysis of fuelwood harvesting costs

    Treesearch

    George B. Harpole; Giuseppe Rensi

    1985-01-01

    The fuelwood harvesting computer program (FHP) is written in FORTRAN 60 and designed to select a collection of harvest units and systems from among alternatives to satisfy specified energy requirements at a lowest cost per million Btu's as recovered in a boiler, or thousand pounds of H2O evaporative capacity kiln drying. Computed energy costs are used as a...

  8. 12 CFR Appendix K to Part 226 - Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 3 2014-01-01 2014-01-01 false Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions K Appendix K to Part 226 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED..., App. K Appendix K to Part 226—Total Annual Loan Cost Rate Computations for Reverse Mortgage...

  9. 12 CFR Appendix K to Part 226 - Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 3 2013-01-01 2013-01-01 false Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions K Appendix K to Part 226 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED..., App. K Appendix K to Part 226—Total Annual Loan Cost Rate Computations for Reverse Mortgage...

  10. Kinetic energy classification and smoothing for compact B-spline basis sets in quantum Monte Carlo

    DOE PAGES

    Krogel, Jaron T.; Reboredo, Fernando A.

    2018-01-25

    Quantum Monte Carlo calculations of defect properties of transition metal oxides have become feasible in recent years due to increases in computing power. As the system size has grown, availability of on-node memory has become a limiting factor. Saving memory while minimizing computational cost is now a priority. The main growth in memory demand stems from the B-spline representation of the single particle orbitals, especially for heavier elements such as transition metals where semi-core states are present. Despite the associated memory costs, splines are computationally efficient. In this paper, we explore alternatives to reduce the memory usage of splined orbitals without significantly affecting numerical fidelity or computational efficiency. We make use of the kinetic energy operator to both classify and smooth the occupied set of orbitals prior to splining. By using a partitioning scheme based on the per-orbital kinetic energy distributions, we show that memory savings of about 50% is possible for select transition metal oxide systems. Finally, for production supercells of practical interest, our scheme incurs a performance penalty of less than 5%.

  11. Data-driven train set crash dynamics simulation

    NASA Astrophysics Data System (ADS)

    Tang, Zhao; Zhu, Yunrui; Nie, Yinyu; Guo, Shihui; Liu, Fengjia; Chang, Jian; Zhang, Jianjun

    2017-02-01

    Traditional finite element (FE) methods are computationally expensive for simulating train crashes. This high computational cost limits their direct application to investigating the dynamic behaviour of an entire train set for crashworthiness design and structural optimisation. In contrast, multi-body modelling is widely used because of its low computational cost, with a trade-off in accuracy. In this study, a data-driven train crash modelling method is proposed to improve the performance of a multi-body dynamics simulation of a train set crash without increasing the computational burden. This is achieved by the parallel random forest algorithm, a machine learning approach that extracts useful patterns from force-displacement curves and predicts a force-displacement relation for a given collision condition from a collection of offline FE simulation data on various collision conditions, namely different crash velocities in our analysis. Using the FE simulation results as a benchmark, we compared our method with traditional multi-body modelling methods; the results show that our data-driven method improves accuracy over traditional multi-body models in train crash simulation while running at the same level of efficiency.
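    A minimal sketch of the surrogate-modelling idea described above, assuming scikit-learn's RandomForestRegressor in place of the paper's parallel random forest implementation: a forest is trained on (velocity, displacement) -> force samples standing in for offline FE results, then queried at an unseen crash velocity. The synthetic force law and the training conditions are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

def fe_force(velocity, displacement):
    """Synthetic force-displacement law standing in for real offline FE output."""
    return 1e3 * velocity * np.tanh(3.0 * displacement) + 50.0 * displacement ** 2

velocities = np.array([10.0, 15.0, 20.0, 25.0, 30.0])    # km/h, training collision conditions
disp = np.linspace(0.0, 1.2, 60)                          # m
V, D = np.meshgrid(velocities, disp)
X = np.column_stack([V.ravel(), D.ravel()])
y = fe_force(V, D).ravel() + rng.normal(0, 50.0, V.size)  # noisy "simulation" samples

surrogate = RandomForestRegressor(n_estimators=200, n_jobs=-1, random_state=0)
surrogate.fit(X, y)

# Predict the force-displacement relation for an unseen collision condition (22 km/h),
# which a multi-body model could then use as its contact-force characteristic.
query = np.column_stack([np.full_like(disp, 22.0), disp])
predicted_force = surrogate.predict(query)
print(predicted_force[:5])
```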

  12. Kinetic energy classification and smoothing for compact B-spline basis sets in quantum Monte Carlo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krogel, Jaron T.; Reboredo, Fernando A.

    Quantum Monte Carlo calculations of defect properties of transition metal oxides have become feasible in recent years due to increases in computing power. As the system size has grown, availability of on-node memory has become a limiting factor. Saving memory while minimizing computational cost is now a priority. The main growth in memory demand stems from the B-spline representation of the single particle orbitals, especially for heavier elements such as transition metals where semi-core states are present. Despite the associated memory costs, splines are computationally efficient. In this paper, we explore alternatives to reduce the memory usage of splined orbitals without significantly affecting numerical fidelity or computational efficiency. We make use of the kinetic energy operator to both classify and smooth the occupied set of orbitals prior to splining. By using a partitioning scheme based on the per-orbital kinetic energy distributions, we show that memory savings of about 50% is possible for select transition metal oxide systems. Finally, for production supercells of practical interest, our scheme incurs a performance penalty of less than 5%.

  13. On the precision of aero-thermal simulations for TMT

    NASA Astrophysics Data System (ADS)

    Vogiatzis, Konstantinos; Thompson, Hugh

    2016-08-01

    Environmental effects on the Image Quality (IQ) of the Thirty Meter Telescope (TMT) are estimated by aero-thermal numerical simulations. These simulations utilize Computational Fluid Dynamics (CFD) to estimate, among others, thermal (dome and mirror) seeing as well as wind jitter and blur. As the design matures, guidance obtained from these numerical experiments can influence significant cost-performance trade-offs and even component survivability. The stochastic nature of environmental conditions results in the generation of a large computational solution matrix in order to statistically predict Observatory Performance. Moreover, the relative contribution of selected key subcomponents to IQ increases the parameter space and thus computational cost, while dictating a reduced prediction error bar. The current study presents the strategy followed to minimize prediction time and computational resources, the subsequent physical and numerical limitations and finally the approach to mitigate the issues experienced. In particular, the paper describes a mesh-independence study, the effect of interpolation of CFD results on the TMT IQ metric, and an analysis of the sensitivity of IQ to certain important heat sources and geometric features.

  14. Comparison of different models for non-invasive FFR estimation

    NASA Astrophysics Data System (ADS)

    Mirramezani, Mehran; Shadden, Shawn

    2017-11-01

    Coronary artery disease is a leading cause of death worldwide. Fractional flow reserve (FFR), derived from invasively measuring the pressure drop across a stenosis, is considered the gold standard to diagnose disease severity and need for treatment. Non-invasive estimation of FFR has gained recent attention for its potential to reduce patient risk and procedural cost versus invasive FFR measurement. Non-invasive FFR can be obtained by using image-based computational fluid dynamics to simulate blood flow and pressure in a patient-specific coronary model. However, 3D simulations require extensive effort for model construction and numerical computation, which limits their routine use. In this study we compare (ordered by increasing computational cost/complexity): reduced-order algebraic models of pressure drop across a stenosis; 1D, 2D (multiring) and 3D CFD models; as well as 3D FSI for the computation of FFR in idealized and patient-specific stenosis geometries. We demonstrate the ability of an appropriate reduced order algebraic model to closely predict FFR when compared to FFR from a full 3D simulation. This work was supported by the NIH, Grant No. R01-HL103419.
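    A minimal sketch of the cheapest model class compared above: a reduced-order algebraic stenosis model with viscous and expansion terms, dP = Kv*Q + Kt*Q^2, from which FFR follows as the distal-to-aortic pressure ratio at hyperemic flow. The coefficient and flow values below are illustrative assumptions, not those of the study.

```python
# Reduced-order algebraic pressure-drop model and FFR estimate; all numbers are hypothetical.

def pressure_drop(q_ml_s, kv=2.0, kt=0.6):
    """Pressure drop (mmHg) across the stenosis at volumetric flow q (mL/s)."""
    return kv * q_ml_s + kt * q_ml_s ** 2

def ffr(pa_mmhg=93.0, q_hyperemic=4.5):
    """Fractional flow reserve from mean aortic pressure and hyperemic flow."""
    return (pa_mmhg - pressure_drop(q_hyperemic)) / pa_mmhg

# Values below ~0.80 are commonly taken to indicate a hemodynamically significant stenosis.
print(f"estimated FFR = {ffr():.2f}")
```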

  15. Kinetic energy classification and smoothing for compact B-spline basis sets in quantum Monte Carlo

    NASA Astrophysics Data System (ADS)

    Krogel, Jaron T.; Reboredo, Fernando A.

    2018-01-01

    Quantum Monte Carlo calculations of defect properties of transition metal oxides have become feasible in recent years due to increases in computing power. As the system size has grown, availability of on-node memory has become a limiting factor. Saving memory while minimizing computational cost is now a priority. The main growth in memory demand stems from the B-spline representation of the single particle orbitals, especially for heavier elements such as transition metals where semi-core states are present. Despite the associated memory costs, splines are computationally efficient. In this work, we explore alternatives to reduce the memory usage of splined orbitals without significantly affecting numerical fidelity or computational efficiency. We make use of the kinetic energy operator to both classify and smooth the occupied set of orbitals prior to splining. By using a partitioning scheme based on the per-orbital kinetic energy distributions, we show that memory savings of about 50% is possible for select transition metal oxide systems. For production supercells of practical interest, our scheme incurs a performance penalty of less than 5%.

  16. PCS: a pallet costing system for wood pallet manufacturers (version 1.0 for Windows®)

    Treesearch

    A. Jefferson, Jr. Palmer; Cynthia D. West; Bruce G. Hansen; Marshall S. White; Hal L. Mitchell

    2002-01-01

    The Pallet Costing System (PCS) is a computer-based, Microsoft Windows® application that computes the total and per-unit cost of manufacturing an order of wood pallets. Information about the manufacturing facility, along with the pallet-order requirements provided by the customer, is used in determining production cost. The major cost factors addressed by PCS...

  17. Dopamine Manipulation Affects Response Vigor Independently of Opportunity Cost.

    PubMed

    Zénon, Alexandre; Devesse, Sophie; Olivier, Etienne

    2016-09-14

    Dopamine is known to be involved in regulating effort investment in relation to reward, and the disruption of this mechanism is thought to be central in some pathological situations such as Parkinson's disease, addiction, and depression. According to an influential model, dopamine plays this role by encoding the opportunity cost, i.e., the average value of forfeited actions, which is an important parameter to take into account when making decisions about which action to undertake and how fast to execute it. We tested this hypothesis by asking healthy human participants to perform two effort-based decision-making tasks, following either placebo or levodopa intake in a double-blind, within-subject protocol. In the effort-constrained task, there was a trade-off between the amount of force exerted and the time spent in executing the task, such that investing more effort decreased the opportunity cost. In the time-constrained task, the effort duration was constant, but exerting more force allowed the subject to earn more substantial reward instead of saving time. Contrary to the model predictions, we found that levodopa caused an increase in the force exerted only in the time-constrained task, in which there was no trade-off between effort and opportunity cost. In addition, a computational model showed that dopamine manipulation left the opportunity cost factor unaffected but altered the ratio between the effort cost and reinforcement value. These findings suggest that dopamine does not represent the opportunity cost but rather modulates how much effort a given reward is worth. Dopamine has been proposed in a prevalent theory to signal the average reward rate, used to estimate the cost of investing time in an action, also referred to as opportunity cost. We contrasted the effect of dopamine manipulation in healthy participants in two tasks, in which increasing response vigor (i.e., the amount of effort invested in an action) allowed either to save time or to earn more reward. We found that levodopa, a synthetic precursor of dopamine, increases response vigor only in the latter situation, demonstrating that, rather than the opportunity cost, dopamine is involved in computing the expected value of effort.
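    The authors' computational model is not specified in the abstract; the sketch below is a generic effort-discounting model consistent with the stated conclusion, in which a dopamine manipulation shifts the ratio of reward weight to effort cost rather than an opportunity-cost term. The parameter values, offer set, and softmax choice rule are assumptions for illustration only.

```python
import numpy as np

def subjective_value(reward, effort, beta=1.0, k=1.0):
    """Effort-discounted value of an offer: reward weighted by beta minus a quadratic effort cost."""
    return beta * reward - k * effort ** 2

def p_accept(reward, effort, beta, k, temperature=0.5):
    """Softmax probability of accepting the effortful offer over doing nothing."""
    sv = subjective_value(reward, effort, beta, k)
    return 1.0 / (1.0 + np.exp(-sv / temperature))

offers = [(2.0, 1.2), (4.0, 2.0), (6.0, 2.6)]            # hypothetical (reward, effort) pairs
for reward, effort in offers:
    placebo = p_accept(reward, effort, beta=1.0, k=1.0)
    levodopa = p_accept(reward, effort, beta=1.3, k=1.0)  # dopamine modeled as higher reward weighting
    print(f"R={reward}, E={effort}: P(accept) placebo {placebo:.2f} vs levodopa {levodopa:.2f}")
```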

  18. Bio and health informatics meets cloud : BioVLab as an example.

    PubMed

    Chae, Heejoon; Jung, Inuk; Lee, Hyungro; Marru, Suresh; Lee, Seong-Whan; Kim, Sun

    2013-01-01

    The exponential increase in genomic data brought by the advent of next- or third-generation sequencing (NGS) technologies and the dramatic drop in sequencing cost have driven biological and medical sciences to become data-driven sciences. This revolutionary paradigm shift comes with challenges in terms of data transfer, storage, computation, and analysis of big bio/medical data. Cloud computing is a service model sharing a pool of configurable resources, which is a suitable workbench to address these challenges. From the medical or biological perspective, providing computing power and storage is the most attractive feature of cloud computing in handling the ever-increasing volume of biological data. As data increases in size, many research organizations are starting to experience a lack of computing power, which becomes a major hurdle in achieving research goals. In this paper, we review the features of publicly available bio and health cloud systems in terms of graphical user interface, external data integration, security and extensibility of features. We then discuss issues and limitations of current cloud systems and conclude with a suggestion of a biological cloud environment concept, which can be defined as a total workbench environment assembling computational tools and databases for analyzing bio/medical big data in particular application domains.

  19. SARANA: language, compiler and run-time system support for spatially aware and resource-aware mobile computing.

    PubMed

    Hari, Pradip; Ko, Kevin; Koukoumidis, Emmanouil; Kremer, Ulrich; Martonosi, Margaret; Ottoni, Desiree; Peh, Li-Shiuan; Zhang, Pei

    2008-10-28

    Increasingly, spatial awareness plays a central role in many distributed and mobile computing applications. Spatially aware applications rely on information about the geographical position of compute devices and their supported services in order to support novel functionality. While many spatial application drivers already exist in mobile and distributed computing, very little systems research has explored how best to program these applications, to express their spatial and temporal constraints, and to allow efficient implementations on highly dynamic real-world platforms. This paper proposes the SARANA system architecture, which includes language and run-time system support for spatially aware and resource-aware applications. SARANA allows users to express spatial regions of interest, as well as trade-offs between quality of result (QoR), latency and cost. The goal is to produce applications that use resources efficiently and that can be run on diverse resource-constrained platforms ranging from laptops to personal digital assistants and to smart phones. SARANA's run-time system manages QoR and cost trade-offs dynamically by tracking resource availability and locations, brokering usage/pricing agreements and migrating programs to nodes accordingly. A resource cost model permeates the SARANA system layers, permitting users to express their resource needs and QoR expectations in units that make sense to them. Although we are still early in the system development, initial versions have been demonstrated on a nine-node system prototype.

  20. Computer science: Key to a space program renaissance. The 1981 NASA/ASEE summer study on the use of computer science and technology in NASA. Volume 2: Appendices

    NASA Technical Reports Server (NTRS)

    Freitas, R. A., Jr. (Editor); Carlson, P. A. (Editor)

    1983-01-01

    Adoption of an aggressive computer science research and technology program within NASA will: (1) enable new mission capabilities such as autonomous spacecraft, reliability and self-repair, and low-bandwidth intelligent Earth sensing; (2) lower manpower requirements, especially in the areas of Space Shuttle operations, by making fuller use of control center automation, technical support, and internal utilization of state-of-the-art computer techniques; (3) reduce project costs via improved software verification, software engineering, enhanced scientist/engineer productivity, and increased managerial effectiveness; and (4) significantly improve internal operations within NASA with electronic mail, managerial computer aids, an automated bureaucracy and uniform program operating plans.

  1. 48 CFR 42.709-4 - Computing interest.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Computing interest. 42.709... MANAGEMENT CONTRACT ADMINISTRATION AND AUDIT SERVICES Indirect Cost Rates 42.709-4 Computing interest. For 42.709-1(a)(1)(ii), compute interest on any paid portion of the disallowed cost as follows: (a) Consider...

  2. 48 CFR 42.709-4 - Computing interest.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Computing interest. 42.709... MANAGEMENT CONTRACT ADMINISTRATION AND AUDIT SERVICES Indirect Cost Rates 42.709-4 Computing interest. For 42.709-1(a)(1)(ii), compute interest on any paid portion of the disallowed cost as follows: (a) Consider...

  3. 48 CFR 42.709-4 - Computing interest.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Computing interest. 42.709... MANAGEMENT CONTRACT ADMINISTRATION AND AUDIT SERVICES Indirect Cost Rates 42.709-4 Computing interest. For 42.709-1(a)(1)(ii), compute interest on any paid portion of the disallowed cost as follows: (a) Consider...

  4. An information-theoretic approach to motor action decoding with a reconfigurable parallel architecture.

    PubMed

    Craciun, Stefan; Brockmeier, Austin J; George, Alan D; Lam, Herman; Príncipe, José C

    2011-01-01

    Methods for decoding movements from neural spike counts using adaptive filters often rely on minimizing the mean-squared error. However, for non-Gaussian distributions of errors, this approach is not optimal for performance. Therefore, rather than using probabilistic modeling, we propose an alternate non-parametric approach. In order to extract more structure from the input signal (neuronal spike counts), we propose using minimum error entropy (MEE), an information-theoretic approach that minimizes the error entropy as part of an iterative cost function. However, the disadvantage of using MEE as the cost function for adaptive filters is the increase in computational complexity. In this paper we present a comparison between the decoding performance of the analytic Wiener filter and a linear filter trained with MEE, which is then mapped to a parallel architecture in reconfigurable hardware tailored to the computational needs of the MEE filter. We observe considerable speedup from the hardware design. The adaptation of filter weights for the multiple-input, multiple-output linear filters, necessary in motor decoding, is a highly parallelizable algorithm. It can be decomposed into many independent computational blocks with a parallel architecture readily mapped to a field-programmable gate array (FPGA) and scales to large numbers of neurons. By pipelining and parallelizing independent computations in the algorithm, the proposed parallel architecture has sublinear increases in execution time with respect to both window size and filter order.
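    A minimal software sketch of training a linear decoder with the minimum error entropy (MEE) criterion: the filter weights are updated by gradient ascent on the information potential (the pairwise Gaussian kernel sum over error differences), starting from the least-squares (Wiener) solution. The toy data, kernel width, and learning rate are assumptions; the point is the O(N^2) pairwise computation that motivates the FPGA parallelization discussed above.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy decoding problem: predict a 1-D movement signal from spike counts with a linear filter.
N, n_inputs = 300, 8
X = rng.poisson(3.0, (N, n_inputs)).astype(float)          # stand-in spike counts
w_true = rng.normal(0, 1, n_inputs)
y = X @ w_true + 0.5 * rng.standard_t(df=2, size=N)        # heavy-tailed (non-Gaussian) noise

def information_potential(w, sigma=1.0):
    """Mean pairwise Gaussian kernel over error differences; maximizing it minimizes error entropy."""
    e = y - X @ w
    de = e[:, None] - e[None, :]
    return np.exp(-de ** 2 / (2 * sigma ** 2)).mean()

def mee_gradient(w, sigma=1.0):
    """Gradient of the information potential with respect to the filter weights."""
    e = y - X @ w
    de = e[:, None] - e[None, :]                            # pairwise error differences
    kern = np.exp(-de ** 2 / (2 * sigma ** 2))
    dX = X[:, None, :] - X[None, :, :]                      # pairwise input differences
    return ((kern * de)[:, :, None] * dX).mean(axis=(0, 1)) / sigma ** 2

w = np.linalg.lstsq(X, y, rcond=None)[0]                    # start from the Wiener/LS solution
print("information potential (Wiener):", round(information_potential(w), 4))
for _ in range(300):                                        # gradient ascent on the information potential
    w += 0.2 * mee_gradient(w)
print("information potential (MEE):   ", round(information_potential(w), 4))
```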

  5. (CICT) Computing, Information, and Communications Technology Overview

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.

    2003-01-01

    The goal of the Computing, Information, and Communications Technology (CICT) program is to enable NASA's Scientific Research, Space Exploration, and Aerospace Technology Missions with greater mission assurance, for less cost, with increased science return through the development and use of advanced computing, information and communications technologies. This viewgraph presentation includes diagrams of how the political guidance behind CICT is structured. The presentation profiles each part of the NASA Mission in detail, and relates the Mission to the activities of CICT. CICT's Integrated Capability Goal is illustrated, and hypothetical missions which could be enabled by CICT are profiled. CICT technology development is profiled.

  6. Situational Awareness from a Low-Cost Camera System

    NASA Technical Reports Server (NTRS)

    Freudinger, Lawrence C.; Ward, David; Lesage, John

    2010-01-01

    A method gathers scene information from a low-cost camera system. Existing surveillance systems using sufficient cameras for continuous coverage of a large field necessarily generate enormous amounts of raw data. Digitizing and channeling that data to a central computer and processing it in real time is difficult when using low-cost, commercially available components. A newly developed system is located on a combined power and data wire to form a string-of-lights camera system. Each camera is accessible through this network interface using standard TCP/IP networking protocols. The cameras more closely resemble cell-phone cameras than traditional security camera systems. Processing capabilities are built directly onto the camera backplane, which helps maintain a low cost. The low power requirements of each camera allow the creation of a single imaging system comprising over 100 cameras. Each camera has built-in processing capabilities to detect events and cooperatively share this information with neighboring cameras. The location of the event is reported to the host computer in Cartesian coordinates computed from data correlation across multiple cameras. In this way, events in the field of view can present low-bandwidth information to the host rather than high-bandwidth bitmap data constantly being generated by the cameras. This approach offers greater flexibility than conventional systems without compromising performance, by using many small, low-cost cameras with overlapping fields of view. This means significantly increased viewing coverage without ignoring surveillance areas, which can occur when pan, tilt, and zoom cameras look away. Additionally, due to the sharing of a single cable for power and data, the installation costs are lower. The technology is targeted toward 3D scene extraction and automatic target tracking for military and commercial applications. Security systems and environmental/vehicular monitoring systems are also potential applications.

  7. Multidimensional Environmental Data Resource Brokering on Computational Grids and Scientific Clouds

    NASA Astrophysics Data System (ADS)

    Montella, Raffaele; Giunta, Giulio; Laccetti, Giuliano

    Grid computing has evolved widely over the past years, and its capabilities have found their way even into business products and are no longer relegated to scientific applications. Today, grid computing technology is not restricted to a set of specific grid open source or industrial products; rather, it comprises a set of capabilities that can be embedded in virtually any kind of software to create shared and highly collaborative production environments. These environments are focused on computational (workload) capabilities and the integration of information (data) into those computational capabilities. An active field of grid computing applications is the full virtualization of scientific instruments in order to increase their availability and decrease operational and maintenance costs. Computational and information grids allow real-world objects to be managed in a service-oriented way using widespread industrial standards.

  8. Educational Technology--The White Elephant.

    ERIC Educational Resources Information Center

    Molnar, Andrew R.

    A ten year experiment in educational technology sponsored under Title VII of the National Defense Education Act (NDEA) demonstrated the feasibility of large-scale educational systems which can extend education to all while permitting the individualization of instruction without significant increase in cost (through television, computer systems,…

  9. Home Banking Experiments: Do the Advantages Outweigh the Increased Costs for Customers?

    ERIC Educational Resources Information Center

    Immel, A. Richard

    1984-01-01

    Reviews services available through home banking via a personal computer, and discusses cash management, whether these services are worthwhile, and if they'll be successful. Two home banking services--Chemical Bank's Pronto and Bank of America's Homebanking--are described. (MBR)

  10. But Is It Nutritious? Computer Analysis Creates Healthier Meals.

    ERIC Educational Resources Information Center

    Corrigan, Kathleen A.; Aumann, Margaret B.

    1993-01-01

    A computerized menu-planning method, "Nutrient Standard Menu Planning" (NSMP), uses today's technology to create healthier menus. Field tested in 20 California school districts, the advantages of NSMP are cost effectiveness, increased flexibility, greater productivity, improved public relations, improved finances, and improved student…

  11. Design and operations technologies - Integrating the pieces. [for future space systems design

    NASA Technical Reports Server (NTRS)

    Eldred, C. H.

    1979-01-01

    As major elements of life-cycle costs (LCC) having critical impacts on the initiation and utilization of future space programs, the areas of vehicle design and operations are reviewed in order to identify technology requirements. Common to both areas is the requirement for efficient integration of broad, complex systems. Operations technologies focus on the extension of space-based capabilities and cost reduction through the combination of innovative design, low-maintenance hardware, and increased manpower productivity. Design technologies focus on computer-aided techniques which increase productivity while maintaining a high degree of flexibility which enhances creativity and permits graceful design changes.

  12. Computer Surveillance of Hospital-Acquired Infections: A 25 year Update

    PubMed Central

    Evans, R. Scott; Abouzelof, Rouett H.; Taylor, Caroline W.; Anderson, Vickie; Sumner, Sharon; Soutter, Sharon; Kleckner, Ruth; Lloyd, James F.

    2009-01-01

    Hospital-acquired infections (HAIs) are a significant cause of patient harm and increased healthcare cost. Many states have instituted mandatory hospital-wide reporting of HAIs, which will increase the workload of infection preventionists, and the Centers for Medicare and Medicaid Services is no longer paying hospitals to treat certain HAIs. These competing priorities for increased reporting and prevention have many hospitals worried. Manual surveillance of HAIs cannot provide the speed, accuracy and consistency of computerized surveillance. Computer tools can also improve the speed and accuracy of HAI analysis and reporting. Computerized surveillance for HAIs was implemented at LDS Hospital in 1984, but that system required manual entry of data for analysis and reporting. This paper reports on the current functionality and status of the updated computer system for HAI surveillance, analysis and reporting used at LDS Hospital and the 21 other Intermountain Healthcare hospitals. PMID:20351845

  13. Applying analytic hierarchy process to assess healthcare-oriented cloud computing service systems.

    PubMed

    Liao, Wen-Hwa; Qiu, Wan-Li

    2016-01-01

    Numerous differences exist between the healthcare industry and other industries. Difficulties in the business operation of the healthcare industry have continually increased because of the volatility and importance of health care, changes to and requirements of health insurance policies, and the statuses of healthcare providers, which are typically considered not-for-profit organizations. Moreover, because of the financial risks associated with constant changes in healthcare payment methods and constantly evolving information technology, healthcare organizations must continually adjust their business operation objectives; therefore, cloud computing presents both a challenge and an opportunity. As a response to aging populations and the prevalence of the Internet in fast-paced contemporary societies, cloud computing can be used to facilitate the task of balancing the quality and costs of health care. To evaluate cloud computing service systems for use in health care, providing decision makers with a comprehensive assessment method for prioritizing decision-making factors is highly beneficial. Hence, this study applied the analytic hierarchy process, compared items related to cloud computing and health care, executed a questionnaire survey, and then classified the critical factors influencing healthcare cloud computing service systems on the basis of statistical analyses of the questionnaire results. The results indicate that the primary factor affecting the design or implementation of optimal cloud computing healthcare service systems is cost effectiveness, with the secondary factors being practical considerations such as software design and system architecture.
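    A minimal sketch of the analytic hierarchy process step described above: priority weights are taken from the principal eigenvector of a pairwise comparison matrix and checked with a consistency ratio. The criteria and judgments below are hypothetical, not the questionnaire results of the study.

```python
import numpy as np

# Pairwise comparison matrix on the Saaty 1-9 scale: A[i, j] = how much criterion i dominates j.
criteria = ["cost effectiveness", "software design", "system architecture", "security"]
A = np.array([
    [1.0, 3.0, 4.0, 5.0],
    [1/3, 1.0, 2.0, 3.0],
    [1/4, 1/2, 1.0, 2.0],
    [1/5, 1/3, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                       # priority vector (principal eigenvector, normalized)

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)           # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]            # Saaty's random index
print(dict(zip(criteria, np.round(weights, 3))), "CR =", round(ci / ri, 3))  # CR < 0.1 is acceptable
```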

  14. Capabilities and Advantages of Cloud Computing in the Implementation of Electronic Health Record.

    PubMed

    Ahmadi, Maryam; Aslani, Nasim

    2018-01-01

    With regard to the high cost of the Electronic Health Record (EHR), in recent years the use of new technologies, in particular cloud computing, has increased. The purpose of this study was to review systematically the studies conducted in the field of cloud computing. The present study was a systematic review conducted in 2017. Search was performed in the Scopus, Web of Sciences, IEEE, Pub Med and Google Scholar databases by combination keywords. From the 431 article that selected at the first, after applying the inclusion and exclusion criteria, 27 articles were selected for surveyed. Data gathering was done by a self-made check list and was analyzed by content analysis method. The finding of this study showed that cloud computing is a very widespread technology. It includes domains such as cost, security and privacy, scalability, mutual performance and interoperability, implementation platform and independence of Cloud Computing, ability to search and exploration, reducing errors and improving the quality, structure, flexibility and sharing ability. It will be effective for electronic health record. According to the findings of the present study, higher capabilities of cloud computing are useful in implementing EHR in a variety of contexts. It also provides wide opportunities for managers, analysts and providers of health information systems. Considering the advantages and domains of cloud computing in the establishment of HER, it is recommended to use this technology.

  15. Capabilities and Advantages of Cloud Computing in the Implementation of Electronic Health Record

    PubMed Central

    Ahmadi, Maryam; Aslani, Nasim

    2018-01-01

    Background: With regard to the high cost of the Electronic Health Record (EHR), in recent years the use of new technologies, in particular cloud computing, has increased. The purpose of this study was to review systematically the studies conducted in the field of cloud computing. Methods: The present study was a systematic review conducted in 2017. Search was performed in the Scopus, Web of Sciences, IEEE, Pub Med and Google Scholar databases by combination keywords. From the 431 article that selected at the first, after applying the inclusion and exclusion criteria, 27 articles were selected for surveyed. Data gathering was done by a self-made check list and was analyzed by content analysis method. Results: The finding of this study showed that cloud computing is a very widespread technology. It includes domains such as cost, security and privacy, scalability, mutual performance and interoperability, implementation platform and independence of Cloud Computing, ability to search and exploration, reducing errors and improving the quality, structure, flexibility and sharing ability. It will be effective for electronic health record. Conclusion: According to the findings of the present study, higher capabilities of cloud computing are useful in implementing EHR in a variety of contexts. It also provides wide opportunities for managers, analysts and providers of health information systems. Considering the advantages and domains of cloud computing in the establishment of HER, it is recommended to use this technology. PMID:29719309

  16. RGCA: A Reliable GPU Cluster Architecture for Large-Scale Internet of Things Computing Based on Effective Performance-Energy Optimization

    PubMed Central

    Chen, Qingkui; Zhao, Deyu; Wang, Jingjuan

    2017-01-01

    This paper aims to develop a low-cost, high-performance and high-reliability computing system to process large-scale data using common data mining algorithms in the Internet of Things (IoT) computing environment. Considering the characteristics of IoT data processing, similar to mainstream high performance computing, we use a GPU (Graphics Processing Unit) cluster to achieve better IoT services. Firstly, we present an energy consumption calculation method (ECCM) based on WSNs. Then, using the CUDA (Compute Unified Device Architecture) programming model, we propose a Two-level Parallel Optimization Model (TLPOM) which exploits reasonable resource planning and common compiler optimization techniques to obtain the best blocks and threads configuration considering the resource constraints of each node. The key to this part is dynamic coupling of Thread-Level Parallelism (TLP) and Instruction-Level Parallelism (ILP) to improve the performance of the algorithms without additional energy consumption. Finally, combining the ECCM and the TLPOM, we use the Reliable GPU Cluster Architecture (RGCA) to obtain a high-reliability computing system considering the nodes' diversity, algorithm characteristics, etc. The results show that the performance of the algorithms significantly increased by 34.1%, 33.96% and 24.07% for Fermi, Kepler and Maxwell on average with the TLPOM, and the RGCA ensures that our IoT computing system provides low-cost and high-reliability services. PMID:28777325

  17. RGCA: A Reliable GPU Cluster Architecture for Large-Scale Internet of Things Computing Based on Effective Performance-Energy Optimization.

    PubMed

    Fang, Yuling; Chen, Qingkui; Xiong, Neal N; Zhao, Deyu; Wang, Jingjuan

    2017-08-04

    This paper aims to develop a low-cost, high-performance and high-reliability computing system to process large-scale data using common data mining algorithms in the Internet of Things (IoT) computing environment. Considering the characteristics of IoT data processing, similar to mainstream high performance computing, we use a GPU (Graphics Processing Unit) cluster to achieve better IoT services. Firstly, we present an energy consumption calculation method (ECCM) based on WSNs. Then, using the CUDA (Compute Unified Device Architecture) programming model, we propose a Two-level Parallel Optimization Model (TLPOM) which exploits reasonable resource planning and common compiler optimization techniques to obtain the best blocks and threads configuration considering the resource constraints of each node. The key to this part is dynamic coupling of Thread-Level Parallelism (TLP) and Instruction-Level Parallelism (ILP) to improve the performance of the algorithms without additional energy consumption. Finally, combining the ECCM and the TLPOM, we use the Reliable GPU Cluster Architecture (RGCA) to obtain a high-reliability computing system considering the nodes' diversity, algorithm characteristics, etc. The results show that the performance of the algorithms significantly increased by 34.1%, 33.96% and 24.07% for Fermi, Kepler and Maxwell on average with the TLPOM, and the RGCA ensures that our IoT computing system provides low-cost and high-reliability services.

  18. Advanced information processing system: Hosting of advanced guidance, navigation and control algorithms on AIPS using ASTER

    NASA Technical Reports Server (NTRS)

    Brenner, Richard; Lala, Jaynarayan H.; Nagle, Gail A.; Schor, Andrei; Turkovich, John

    1994-01-01

    This program demonstrated the integration of a number of technologies that can increase the availability and reliability of launch vehicles while lowering costs. Availability is increased with an advanced guidance algorithm that adapts trajectories in real-time. Reliability is increased with fault-tolerant computers and communication protocols. Costs are reduced by automatically generating code and documentation. This program was realized through the cooperative efforts of academia, industry, and government. The NASA-LaRC coordinated the effort, while Draper performed the integration. Georgia Institute of Technology supplied a weak Hamiltonian finite element method for optimal control problems. Martin Marietta used MATLAB to apply this method to a launch vehicle (FENOC). Draper supplied the fault-tolerant computing and software automation technology. The fault-tolerant technology includes sequential and parallel fault-tolerant processors (FTP & FTPP) and authentication protocols (AP) for communication. Fault-tolerant technology was incrementally incorporated. Development culminated with a heterogeneous network of workstations and fault-tolerant computers using AP. Draper's software automation system, ASTER, was used to specify a static guidance system based on FENOC, navigation, flight control (GN&C), models, and the interface to a user interface for mission control. ASTER generated Ada code for GN&C and C code for models. An algebraic transform engine (ATE) was developed to automatically translate MATLAB scripts into ASTER.

  19. Benchmark tests for a Formula SAE Student car prototyping

    NASA Astrophysics Data System (ADS)

    Mariasiu, Florin

    2011-12-01

    Aerodynamic characteristics of a vehicle are important elements in its design and construction. A low drag coefficient brings significant fuel savings and increased engine power efficiency. When vehicles are designed and developed through computer simulation, dedicated CFD (Computational Fluid Dynamics) software packages are used to determine their aerodynamic characteristics. However, the results obtained by this faster and cheaper method are validated by experiments in wind tunnel tests, which are expensive and require complex testing equipment at relatively high cost. Therefore, the emergence and development of new low-cost testing methods to validate CFD simulation results would bring great economic benefits to the vehicle prototyping process. This paper presents the initial development process of a Formula SAE Student race-car prototype using CFD simulation and also presents a measurement system based on low-cost sensors through which the CFD simulation results were experimentally validated. The CFD software package used for simulation was SolidWorks with the FloXpress add-on, and the experimental measurement system was built using four FlexiForce piezoresistive force sensors.

  20. A model to forecast data centre infrastructure costs.

    NASA Astrophysics Data System (ADS)

    Vernet, R.

    2015-12-01

    The computing needs in the HEP community are increasing steadily, but the current funding situation in many countries is tight. As a consequence experiments, data centres, and funding agencies have to rationalize resource usage and expenditures. CC-IN2P3 (Lyon, France) provides computing resources to many experiments including LHC, and is a major partner for astroparticle projects like LSST, CTA or Euclid. The financial cost to accommodate all these experiments is substantial and has to be planned well in advance for funding and strategic reasons. In that perspective, leveraging infrastructure expenses, electric power cost and hardware performance observed in our site over the last years, we have built a model that integrates these data and provides estimates of the investments that would be required to cater to the experiments for the mid-term future. We present how our model is built and the expenditure forecast it produces, taking into account the experiment roadmaps. We also examine the resource growth predicted by our model over the next years assuming a flat-budget scenario.
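    The CC-IN2P3 model itself is not described in detail above. The sketch below shows only the skeleton of such a forecast under a flat-budget scenario, where yearly purchases benefit from an assumed price/performance improvement and hardware is retired after a fixed lifetime. All parameter values are illustrative assumptions.

```python
# Illustrative flat-budget capacity projection (not CC-IN2P3's actual model).
BUDGET_PER_YEAR = 1_000_000          # EUR spent on compute each year (assumed)
PERF_PER_EUR_Y0 = 10.0               # benchmark units per euro in year 0 (assumed)
IMPROVEMENT = 1.15                   # assumed yearly price/performance gain
LIFETIME = 4                         # years before hardware is retired
YEARS = 10

purchases = []                       # capacity bought each year
for year in range(YEARS):
    perf_per_eur = PERF_PER_EUR_Y0 * IMPROVEMENT ** year
    purchases.append(BUDGET_PER_YEAR * perf_per_eur)
    # Installed capacity is the sum of purchases still within their service lifetime.
    installed = sum(purchases[max(0, year - LIFETIME + 1): year + 1])
    print(f"year {year}: installed capacity ~ {installed / 1e6:.1f} M benchmark units")
```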

  1. A Series of Computational Neuroscience Labs Increases Comfort with MATLAB.

    PubMed

    Nichols, David F

    2015-01-01

    Computational simulations allow for a low-cost, reliable means to demonstrate complex and often times inaccessible concepts to undergraduates. However, students without prior computer programming training may find working with code-based simulations to be intimidating and distracting. A series of computational neuroscience labs involving the Hodgkin-Huxley equations, an Integrate-and-Fire model, and a Hopfield Memory network were used in an undergraduate neuroscience laboratory component of an introductory level course. Using short focused surveys before and after each lab, student comfort levels were shown to increase drastically from a majority of students being uncomfortable or with neutral feelings about working in the MATLAB environment to a vast majority of students being comfortable working in the environment. Though change was reported within each lab, a series of labs was necessary in order to establish a lasting high level of comfort. Comfort working with code is important as a first step in acquiring computational skills that are required to address many questions within neuroscience.

  2. A Series of Computational Neuroscience Labs Increases Comfort with MATLAB

    PubMed Central

    Nichols, David F.

    2015-01-01

    Computational simulations allow for a low-cost, reliable means to demonstrate complex and often times inaccessible concepts to undergraduates. However, students without prior computer programming training may find working with code-based simulations to be intimidating and distracting. A series of computational neuroscience labs involving the Hodgkin-Huxley equations, an Integrate-and-Fire model, and a Hopfield Memory network were used in an undergraduate neuroscience laboratory component of an introductory level course. Using short focused surveys before and after each lab, student comfort levels were shown to increase drastically from a majority of students being uncomfortable or with neutral feelings about working in the MATLAB environment to a vast majority of students being comfortable working in the environment. Though change was reported within each lab, a series of labs was necessary in order to establish a lasting high level of comfort. Comfort working with code is important as a first step in acquiring computational skills that are required to address many questions within neuroscience. PMID:26557798

  3. Effectiveness and cost effectiveness of television, radio and print advertisements in promoting the New York smokers' quitline

    PubMed Central

    Farrelly, Matthew C; Hussin, Altijani; Bauer, Ursula E

    2007-01-01

    Objectives This study assessed the relative effectiveness and cost effectiveness of television, radio and print advertisements to generate calls to the New York smokers' quitline. Methods Regression analysis was used to link total monthly quitline calls at the county level to television, radio and print advertising expenditures. Based on regression results, standardised measures of the relative effectiveness and cost effectiveness of expenditures were computed. Results There was a positive and statistically significant relation between call volume and expenditures for television (p<0.01) and radio (p<0.001) advertisements and a marginally significant effect for expenditures on newspaper advertisements (p<0.065). The largest effect was for television advertising. However, because of differences in advertising costs, for every $1000 increase in television, radio and newspaper expenditures, call volume increased by 0.1%, 5.7% and 2.8%, respectively. Conclusions Television, radio and print media all effectively increased calls to the New York smokers' quitline. Although increases in expenditures for television were the most effective, their relatively high costs suggest they are not currently the most cost effective means to promote a quitline. This implies that a more efficient mix of media would place greater emphasis on radio than television. However, because the current study does not adequately assess the extent to which radio expenditures would sustain their effectiveness with substantial expenditure increases, it is not feasible to determine an optimal mix of expenditures. PMID:18048625
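    A sketch of the kind of county-month regression described above, assuming a log-linear specification so that each media coefficient approximates the fractional change in call volume per additional $1,000 of spending. The data below are synthetic, not the New York data, and the specification is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

n = 600                                                  # synthetic county-months
tv, radio, print_ = (rng.gamma(2.0, s, n) for s in (8.0, 2.0, 3.0))   # spending in $1,000s
log_calls = 3.0 + 0.001 * tv + 0.05 * radio + 0.03 * print_ + rng.normal(0, 0.3, n)

X = np.column_stack([np.ones(n), tv, radio, print_])
beta, *_ = np.linalg.lstsq(X, log_calls, rcond=None)
for name, b in zip(["intercept", "TV", "radio", "print"], beta):
    # Media coefficients are approximate fractional changes in calls per extra $1,000.
    print(f"{name:9s} {b:+.4f}")
```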

  4. [Partial reimbursement of prescription charges for generic drugs reduces costs for both health insurance and patients].

    PubMed

    Gouya, Ghazaleh; Reichardt, Berthold; Bidner, Anja; Weissenfels, Robert; Wolzt, Michael

    2008-01-01

    Rising costs of pharmaceuticals are a challenge to the public health care system. In collaboration with a company health insurance fund with 3143 members, we analysed the economic benefit of reduced prescription fees for generic drugs over a 12-month period. Within the observation period, 1 euro per prescription of a generic drug was reimbursed to the insurants. On the basis of 5 drug classes, the prescribed proportion of generic drugs and the change in prescription pattern were computed. The acceptance of the intervention by the insurants was assessed using anonymous questionnaires. 42,219 drug prescriptions for insurants of the health insurance company were registered, with an overall cost of euro 843,954.95. In the observation period there was a 45% increase in the proportion of overall costs spent on generic drugs, from euro 78,325.65 to euro 110,419.90, together with a 38% increase in prescriptions of generic drugs. The expenditures for reimbursements of prescription payments amounted to euro 9,984 (euro 1-74 to insurants). In the 5 selected drug classes the proportion of generic drugs increased from 23% before the observation period to 40%, whereby a cost reduction of euro 2.47 per prescription was achieved. Taking into account an overall increase in prescriptions of the selected drugs, a cost reduction from euro 188,811.45 to euro 173,677.15 was accomplished. This intervention was considered useful by 84% of all insurants. Financial incentives for insurants through partial reimbursement of prescription charges are effective for increasing the proportion of generic substitutes and for controlling drug costs.

  5. The Economic Value of Long-Lasting Insecticidal Nets and Indoor Residual Spraying Implementation in Mozambique.

    PubMed

    Lee, Bruce Y; Bartsch, Sarah M; Stone, Nathan T B; Zhang, Shufang; Brown, Shawn T; Chatterjee, Chandrani; DePasse, Jay V; Zenkov, Eli; Briët, Olivier J T; Mendis, Chandana; Viisainen, Kirsi; Candrinho, Baltazar; Colborn, James

    2017-06-01

    Malaria-endemic countries have to decide how much of their limited resources for vector control to allocate toward implementing long-lasting insecticidal nets (LLINs) versus indoor residual spraying (IRS). To help the Mozambique Ministry of Health use an evidence-based approach to determine funding allocation toward various malaria control strategies, the Global Fund convened the Mozambique Modeling Working Group which then used JANUS, a software platform that includes integrated computational economic, operational, and clinical outcome models that can link with different transmission models (in this case, OpenMalaria) to determine the economic value of vector control strategies. Any increase in LLINs (from 80% baseline coverage) or IRS (from 80% baseline coverage) would be cost-effective (incremental cost-effectiveness ratios ≤ $114/disability-adjusted life year averted). However, LLIN coverage increases tend to be more cost-effective than similar IRS coverage increases, except where both pyrethroid resistance is high and LLIN usage is low. In high-transmission northern regions, increasing LLIN coverage would be more cost-effective than increasing IRS coverage. In medium-transmission central regions, changing from LLINs to IRS would be more costly and less effective. In low-transmission southern regions, LLINs were more costly and less effective than IRS, due to low LLIN usage. In regions where LLINs are more cost-effective than IRS, it is worth considering prioritizing LLIN coverage and use. However, IRS may have an important role in insecticide resistance management and epidemic control. Malaria intervention campaigns are not a one-size-fits-all solution, and tailored approaches are necessary to account for the heterogeneity of malaria epidemiology.
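    The cost-effectiveness statements above rest on incremental cost-effectiveness ratios (ICERs). The sketch below shows the basic computation with hypothetical cost and DALY figures, not the JANUS/OpenMalaria outputs.

```python
def icer(cost_new, daly_new, cost_base, daly_base):
    """Incremental cost per disability-adjusted life year (DALY) averted versus the baseline."""
    return (cost_new - cost_base) / (daly_base - daly_new)   # DALYs averted = reduction in DALY burden

# Hypothetical comparison: baseline 80% LLIN coverage versus scaling LLINs up to 90% coverage.
baseline = {"cost": 4_000_000.0, "dalys": 150_000.0}
scale_up = {"cost": 4_900_000.0, "dalys": 140_000.0}

value = icer(scale_up["cost"], scale_up["dalys"], baseline["cost"], baseline["dalys"])
print(f"ICER = ${value:.0f} per DALY averted")   # compared against a willingness-to-pay threshold
```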

  6. Impact of an Advanced Imaging Utilization Review Program on Downstream Health Care Utilization and Costs for Low Back Pain.

    PubMed

    Graves, Janessa M; Fulton-Kehoe, Deborah; Jarvik, Jeffrey G; Franklin, Gary M

    2018-06-01

    Early magnetic resonance imaging (MRI) for acute low back pain (LBP) has been associated with increased costs, greater health care utilization, and longer disability duration in workers' compensation claimants. This study assessed the impact of a state policy implemented in June 2010 that required prospective utilization review (UR) for early MRI among workers' compensation claimants with LBP. The design was an interrupted time series covering 76,119 Washington State workers' compensation claimants with LBP between 2006 and 2014. Outcome measures were the proportion of workers receiving imaging per month (MRI, computed tomography, radiographs), lumbosacral injections and surgery; mean total health care costs per worker; and mean duration of disability per worker. Measures were aggregated monthly and attributed to injury month. After accounting for secular trends, decreases in early MRI [level change: -5.27 (95% confidence interval, -4.22 to -6.31); trend change: -0.06 (-0.01 to -0.12)], any MRI [-4.34 (-3.01 to -5.67); -0.10 (-0.04 to -0.17)], and injection [trend change: -0.12 (-0.06 to -0.18)] utilization were associated with the policy. Radiograph utilization increased in parallel [level change: 2.46 (1.24-3.67)]. In addition, the policy resulted in significant decreasing changes in mean costs per claim, mean disability duration, and proportion of workers who received disability benefits. The policy had no effect on computed tomography or surgery utilization. The UR policy had discernable effects on health care utilization, costs, and disability. Integrating evidence-based guidelines with UR can improve quality of care and patient outcomes, while reducing use of low-value health services.
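    A minimal sketch of the segmented regression behind an interrupted time series analysis like the one above: the outcome is modelled with a pre-policy trend plus level and trend changes at the policy date. The monthly data are synthetic, not the Washington State claims data.

```python
import numpy as np

rng = np.random.default_rng(4)

months = np.arange(108)                    # monthly injury cohorts, 2006-2014
policy = (months >= 54).astype(float)      # policy in effect from month 54 (mid-2010)
time_since = np.where(policy == 1, months - 54, 0.0)

# Synthetic outcome: % of workers with early MRI, with a level drop and trend change at the policy date.
true = 20.0 + 0.02 * months - 5.0 * policy - 0.06 * time_since
y = true + rng.normal(0, 0.8, months.size)

X = np.column_stack([np.ones_like(months), months, policy, time_since])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"pre-policy trend {b[1]:+.3f}/mo, level change {b[2]:+.2f}, trend change {b[3]:+.3f}/mo")
```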

  7. Feasibility study of an Integrated Program for Aerospace vehicle Design (IPAD). Volume 1A: Summary

    NASA Technical Reports Server (NTRS)

    Miller, R. E., Jr.; Redhed, D. D.; Kawaguchi, A. S.; Hansen, S. D.; Southall, J. W.

    1973-01-01

    IPAD was defined as a total system oriented to the product design process. This total system was designed to recognize the product design process, individuals and their design process tasks, and the computer-based IPAD System to aid product design. Principal elements of the IPAD System include the host computer and its interactive system software, new executive and data management software, and an open-ended IPAD library of technical programs to match the intended product design process. The basic goal of the IPAD total system is to increase the productivity of the product design organization. Increases in individual productivity were feasible through automation and computer support of routine information handling. Such proven automation can directly decrease cost and flowtime in the product design process.

  8. Developing science gateways for drug discovery in a grid environment.

    PubMed

    Pérez-Sánchez, Horacio; Rezaei, Vahid; Mezhuyev, Vitaliy; Man, Duhu; Peña-García, Jorge; den-Haan, Helena; Gesing, Sandra

    2016-01-01

    Methods for in silico screening of large databases of molecules increasingly complement and replace experimental techniques to discover novel compounds to combat diseases. As these techniques become more complex and computationally costly, providing the life-sciences research community with a convenient tool for high-throughput virtual screening on distributed computing resources becomes an increasingly pressing problem. To this end, we recently integrated the biophysics-based drug-screening program FlexScreen into a service, applicable for large-scale parallel screening and reusable in the context of scientific workflows. Our implementation is based on Pipeline Pilot and Simple Object Access Protocol and provides an easy-to-use graphical user interface to construct complex workflows, which can be executed on distributed computing resources, thus accelerating the throughput by several orders of magnitude.

  9. Processing power limits social group size: computational evidence for the cognitive costs of sociality

    PubMed Central

    Dávid-Barrett, T.; Dunbar, R. I. M.

    2013-01-01

    Sociality is primarily a coordination problem. However, the social (or communication) complexity hypothesis suggests that the kinds of information that can be acquired and processed may limit the size and/or complexity of social groups that a species can maintain. We use an agent-based model to test the hypothesis that the complexity of information processed influences the computational demands involved. We show that successive increases in the kinds of information processed allow organisms to break through the glass ceilings that otherwise limit the size of social groups: larger groups can only be achieved at the cost of more sophisticated kinds of information processing that are disadvantageous when optimal group size is small. These results simultaneously support both the social brain and the social complexity hypotheses. PMID:23804623

  10. Aether: leveraging linear programming for optimal cloud computing in genomics.

    PubMed

    Luber, Jacob M; Tierney, Braden T; Cofer, Evan M; Patel, Chirag J; Kostic, Aleksandar D

    2018-05-01

    Across biology, we are seeing rapid developments in scale of data production without a corresponding increase in data analysis capabilities. Here, we present Aether (http://aether.kosticlab.org), an intuitive, easy-to-use, cost-effective and scalable framework that uses linear programming to optimally bid on and deploy combinations of underutilized cloud computing resources. Our approach simultaneously minimizes the cost of data analysis and provides an easy transition from users' existing HPC pipelines. Data utilized are available at https://pubs.broadinstitute.org/diabimmune and with EBI SRA accession ERP005989. Source code is available at https://github.com/kosticlab/aether. Examples, documentation and a tutorial are available at http://aether.kosticlab.org. chirag_patel@hms.harvard.edu or aleksandar.kostic@joslin.harvard.edu. Supplementary data are available at Bioinformatics online.
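
    A minimal sketch of the kind of linear program such a framework solves: choose how many of each cloud instance type to provision so that total price is minimized while aggregate resource requirements are met. The instance specifications and spot prices below are made up for illustration and are not Aether's actual formulation.

```python
# Choose instance counts that minimize hourly spot cost subject to CPU and memory needs.
# A real deployment would also need integer counts and bid/availability constraints.
from scipy.optimize import linprog

price = [0.09, 0.17, 0.31]          # $/hour for hypothetical instance types A, B, C
cpus  = [4, 8, 16]                  # vCPUs per instance
mem   = [16, 32, 64]                # GiB per instance

need_cpus, need_mem = 256, 1024     # total requirements of the analysis job

# linprog minimizes c @ x subject to A_ub @ x <= b_ub; signs are flipped to express
# "at least" constraints (cpus @ x >= need_cpus and mem @ x >= need_mem).
res = linprog(c=price,
              A_ub=[[-c for c in cpus], [-m for m in mem]],
              b_ub=[-need_cpus, -need_mem],
              bounds=[(0, None)] * 3,
              method="highs")

print("instances of each type (LP relaxation):", res.x)
print("hourly cost:", res.fun)
```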

  11. Efficient searching in meshfree methods

    NASA Astrophysics Data System (ADS)

    Olliff, James; Alford, Brad; Simkins, Daniel C.

    2018-04-01

    Meshfree methods such as the Reproducing Kernel Particle Method and the Element Free Galerkin method have proven to be excellent choices for problems involving complex geometry, evolving topology, and large deformation, owing to their ability to model the problem domain without the constraints imposed on the Finite Element Method (FEM) meshes. However, meshfree methods have an added computational cost over FEM that comes from at least two sources: increased cost of shape function evaluation and the determination of adjacency or connectivity. The focus of this paper is to formally address the types of adjacency information that arise in various uses of meshfree methods; discuss available techniques for computing the various adjacency graphs; propose a new search algorithm and data structure; and finally compare the memory and run-time performance of the methods.
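
    A common baseline for determining particle adjacency in meshfree methods is a uniform background grid (cell list): points are binned into cells of size equal to the support radius, so only points in neighboring cells need to be tested. The sketch below illustrates that baseline only; it is not the data structure proposed in the paper.

```python
# Cell-list neighbor search: bin points into a uniform grid and test only adjacent cells.
from collections import defaultdict
import numpy as np

def build_cell_list(points, h):
    """Map each integer grid cell to the indices of the points it contains."""
    cells = defaultdict(list)
    for i, p in enumerate(points):
        cells[tuple((p // h).astype(int))].append(i)
    return cells

def neighbors(points, h, cells, i):
    """Indices j with |x_j - x_i| <= h, found by scanning the 27 adjacent cells."""
    ci = tuple((points[i] // h).astype(int))
    found = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for dz in (-1, 0, 1):
                for j in cells.get((ci[0] + dx, ci[1] + dy, ci[2] + dz), []):
                    if j != i and np.linalg.norm(points[j] - points[i]) <= h:
                        found.append(j)
    return found

pts = np.random.rand(1000, 3)     # random point cloud standing in for meshfree nodes
h = 0.1                           # support (search) radius
cells = build_cell_list(pts, h)
print(len(neighbors(pts, h, cells, 0)), "neighbors of point 0")
```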

  12. Recent advances in QM/MM free energy calculations using reference potentials.

    PubMed

    Duarte, Fernanda; Amrein, Beat A; Blaha-Nelson, David; Kamerlin, Shina C L

    2015-05-01

    Recent years have seen enormous progress in the development of methods for modeling (bio)molecular systems. This has allowed for the simulation of ever larger and more complex systems. However, as such complexity increases, the requirements needed for these models to be accurate and physically meaningful become more and more difficult to fulfill. The use of simplified models to describe complex biological systems has long been shown to be an effective way to overcome some of the limitations associated with this computational cost in a rational way. Hybrid QM/MM approaches have rapidly become one of the most popular computational tools for studying chemical reactivity in biomolecular systems. However, the high cost involved in performing high-level QM calculations has limited the applicability of these approaches when calculating free energies of chemical processes. In this review, we present some of the advances in using reference potentials and mean field approximations to accelerate high-level QM/MM calculations. We present illustrative applications of these approaches and discuss challenges and future perspectives for the field. The use of physically based simplifications has been shown to effectively reduce the cost of high-level QM/MM calculations. In particular, lower-level reference potentials enable one to reduce the cost of expensive free energy calculations, thus expanding the scope of problems that can be addressed. As was already demonstrated 40 years ago, the use of simplified models still allows one to obtain cutting-edge results with substantially reduced computational cost. This article is part of a Special Issue entitled Recent developments of molecular dynamics. Copyright © 2014. Published by Elsevier B.V.
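
    A hedged sketch of the central idea: the free energy is first computed on an inexpensive low-level (LL) reference potential and then perturbatively corrected to the high-level (HL) potential. One generic single-step correction, written in standard free energy perturbation form rather than the notation of any specific method in the review, is:

```latex
% Reference-potential correction from a low-level (LL) to a high-level (HL) surface.
\[
\Delta G_{\mathrm{LL} \rightarrow \mathrm{HL}}
  = -k_B T \,\ln \left\langle
      e^{-\left(E_{\mathrm{HL}} - E_{\mathrm{LL}}\right)/k_B T}
    \right\rangle_{\mathrm{LL}}
\]
% The ensemble average is taken over configurations sampled on the cheap LL surface,
% so expensive HL energies are required only for a subset of frames.
```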

  13. A streamline splitting pore-network approach for computationally inexpensive and accurate simulation of transport in porous media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mehmani, Yashar; Oostrom, Martinus; Balhoff, Matthew

    2014-03-20

    Several approaches have been developed in the literature for solving flow and transport at the pore-scale. Some authors use a direct modeling approach where the fundamental flow and transport equations are solved on the actual pore-space geometry. Such direct modeling, while very accurate, comes at a great computational cost. Network models are computationally more efficient because the pore-space morphology is approximated. Typically, a mixed cell method (MCM) is employed for solving the flow and transport system which assumes pore-level perfect mixing. This assumption is invalid at moderate to high Peclet regimes. In this work, a novel Eulerian perspective on modeling flow and transport at the pore-scale is developed. The new streamline splitting method (SSM) allows for circumventing the pore-level perfect mixing assumption, while maintaining the computational efficiency of pore-network models. SSM was verified with direct simulations and excellent matches were obtained against micromodel experiments across a wide range of pore-structure and fluid-flow parameters. The increase in the computational cost from MCM to SSM is shown to be minimal, while the accuracy of SSM is much higher than that of MCM and comparable to direct modeling approaches. Therefore, SSM can be regarded as an appropriate balance between incorporating detailed physics and controlling computational cost. The truly predictive capability of the model allows for the study of pore-level interactions of fluid flow and transport in different porous materials. In this paper, we apply SSM and MCM to study the effects of pore-level mixing on transverse dispersion in 3D disordered granular media.

  14. Secure data sharing in public cloud

    NASA Astrophysics Data System (ADS)

    Venkataramana, Kanaparti; Naveen Kumar, R.; Tatekalva, Sandhya; Padmavathamma, M.

    2012-04-01

    Secure multi-party protocols have been proposed for entities (organizations or individuals) that do not fully trust each other but need to share sensitive information. Many types of entities need to collect, analyze, and disseminate data rapidly and accurately without exposing sensitive information to unauthorized or untrusted parties. Solutions based on secure multiparty computation (SMC) guarantee privacy and correctness, but at an extra communication cost (often too high to be practical) and computation cost. This high overhead motivates extending SMC to the cloud environment, which provides the large computation and communication capacity needed to run SMC between multiple clouds (private, public, or hybrid). A cloud may encompass many high-capacity servers that act as hosts participating in the computation (IaaS and PaaS) of the final result, controlled by a Cloud Trusted Authority (CTA) for secret sharing within the cloud. Communication between two clouds is controlled by a High Level Trusted Authority (HLTA), one of the hosts in a cloud that provides MgaaS (Management as a Service). Because of the high security risk in clouds, the HLTA generates and distributes public and private keys using the Carmichael-R-Prime-RSA algorithm for the exchange of private data in SMC between itself and the clouds. Within a cloud, the CTA creates a group key for secure communication between the hosts, based on keys sent by the HLTA, for the exchange of intermediate values and shares used to compute the final result. Because this scheme exploits the high availability and scalability of clouds to increase computation power, SMC for privacy-preserving data mining can be implemented practically and at low cost for the clients.

  15. Economic analysis and assessment of syngas production using a modeling approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Hakkwan; Parajuli, Prem B.; Yu, Fei

    Economic analysis and modeling are essential for the development of current feedstock and process technology for bio-gasification. The objective of this study was to develop an economic model and apply it to predict the unit cost of syngas production from a micro-scale bio-gasification facility. The economic model was programmed in the C++ programming language and developed using a parametric cost approach, which included processes to calculate the total capital costs and the total operating costs. The model used measured economic data from the bio-gasification facility at Mississippi State University. The modeling results showed that the unit cost of syngas production was $1.217 for a bio-gasifier with a capacity of 60 Nm³ h⁻¹. The operating cost was the major part of the total production cost. The equipment purchase cost and the labor cost were the largest parts of the total capital cost and the total operating cost, respectively. Sensitivity analysis indicated that labor cost ranked highest, followed by equipment cost, loan life, feedstock cost, interest rate, utility cost, and waste treatment cost. The unit cost of syngas production increased with increases in all parameters except loan life. The annual costs for equipment, labor, feedstock, waste treatment, and utilities showed a linear relationship with percent changes, while loan life and annual interest rate showed a non-linear relationship. This study provides useful information for economic analysis and assessment of syngas production using a modeling approach.
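
    An illustrative parametric unit-cost calculation of the kind the abstract describes: capital cost is annualized with a capital recovery factor, combined with annual operating cost, and divided by annual syngas output. All numbers below are placeholders, not the measured Mississippi State data.

```python
# Parametric unit-cost sketch for a small gasifier (all inputs are hypothetical).
def capital_recovery_factor(interest_rate, loan_life_years):
    """Standard annuity factor converting a capital cost into equal annual payments."""
    i, n = interest_rate, loan_life_years
    return i * (1 + i) ** n / ((1 + i) ** n - 1)

def unit_cost(capital_cost, operating_cost_per_year, annual_output_nm3,
              interest_rate=0.06, loan_life_years=10):
    annualized_capital = capital_cost * capital_recovery_factor(interest_rate, loan_life_years)
    return (annualized_capital + operating_cost_per_year) / annual_output_nm3

# e.g. a 60 Nm^3/h gasifier running 6000 h/year (hypothetical utilization)
annual_output = 60 * 6000
print(f"${unit_cost(250_000, 300_000, annual_output):.3f} per Nm^3")
```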

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horowitz, Kelsey A. W.; Fu, Ran; Woodhouse, Michael

    This article examines current cost drivers and potential avenues to reduced cost for monolithic, glass-glass Cu(In,Ga)(Se,S)2 (CIGS) modules by constructing a comprehensive bottom-up cost model. For a reference case where sputtering plus batch sulfurization after selenization (SAS) is employed, we compute a manufacturing cost of $69/m2 if the modules are made in the United States at a 1 GW/year production volume. At 14% module efficiency, this corresponds to a manufacturing cost of $0.49/WDC and a minimum sustainable price (MSP) of $0.67/WDC. We estimate that MSP could vary within +/-20% of this value given the range of quoted input prices and existing variations in module design, manufacturing processes, and manufacturing location. Reduction in manufacturing costs to below $0.40/WDC may be possible if average production module efficiencies can be increased above 17% without increasing $/m2 costs; even lower costs could be achieved if $/m2 costs could be reduced, particularly via innovations in the CIGS deposition process or balance-of-module elements. We present the impact on cost of regional factors, CIGS deposition method, device design, and price fluctuations. One metric of competitiveness, the levelized cost of energy (LCOE), is also assessed for several U.S. locations and compared to that of standard multi-crystalline silicon (mc-Si) and cadmium telluride (CdTe).
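
    The conversion between area-based and power-based module cost used in such models is a simple identity: module power per square meter equals efficiency times the standard 1000 W/m² test irradiance. The sketch below reproduces the arithmetic of the reference case quoted above; the function itself is generic, not the authors' model.

```python
# $/m^2 to $/W conversion for a photovoltaic module at standard test irradiance.
def cost_per_watt(cost_per_m2, module_efficiency, irradiance_w_per_m2=1000.0):
    return cost_per_m2 / (module_efficiency * irradiance_w_per_m2)

print(round(cost_per_watt(69.0, 0.14), 2))   # ~0.49 $/W_DC, matching the reference case
print(round(cost_per_watt(69.0, 0.17), 2))   # ~0.41 $/W_DC at 17% efficiency and unchanged $/m^2
```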

  17. Algorithm For Optimal Control Of Large Structures

    NASA Technical Reports Server (NTRS)

    Salama, Moktar A.; Garba, John A.; Utku, Senol

    1989-01-01

    Cost of computation appears competitive with other methods. Problem to compute optimal control of forced response of structure with n degrees of freedom identified in terms of smaller number, r, of vibrational modes. Article begins with Hamilton-Jacobi formulation of mechanics and use of quadratic cost functional. Complexity reduced by alternative approach in which quadratic cost functional expressed in terms of control variables only. Leads to iterative solution of second-order time-integral matrix Volterra equation of second kind containing optimal control vector. Cost of algorithm, measured in terms of number of computations required, is of order of, or less than, cost of prior algorithms applied to similar problems.
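
    For context, a quadratic cost functional of the type mentioned above has the generic form below; the symbols are the standard ones and not necessarily the article's notation.

```latex
% Generic quadratic cost functional for optimal control of a structure,
% with state x(t), control u(t), and weighting matrices Q >= 0 and R > 0.
\[
J = \tfrac{1}{2}\int_{0}^{t_f}
      \left[ x^{T}(t)\,Q\,x(t) + u^{T}(t)\,R\,u(t) \right] dt
\]
% The approach described above eliminates the state and rewrites J in terms of the
% control variables only, leading to an integral (Volterra) equation for u(t).
```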

  18. The Fast Multipole Method and Fourier Convolution for the Solution of Acoustic Scattering on Regular Volumetric Grids

    PubMed Central

    Hesford, Andrew J.; Waag, Robert C.

    2010-01-01

    The fast multipole method (FMM) is applied to the solution of large-scale, three-dimensional acoustic scattering problems involving inhomogeneous objects defined on a regular grid. The grid arrangement is especially well suited to applications in which the scattering geometry is not known a priori and is reconstructed on a regular grid using iterative inverse scattering algorithms or other imaging techniques. The regular structure of unknown scattering elements facilitates a dramatic reduction in the amount of storage and computation required for the FMM, both of which scale linearly with the number of scattering elements. In particular, the use of fast Fourier transforms to compute Green's function convolutions required for neighboring interactions lowers the often-significant cost of finest-level FMM computations and helps mitigate the dependence of FMM cost on finest-level box size. Numerical results demonstrate the efficiency of the composite method as the number of scattering elements in each finest-level box is increased. PMID:20835366

  19. The fast multipole method and Fourier convolution for the solution of acoustic scattering on regular volumetric grids

    NASA Astrophysics Data System (ADS)

    Hesford, Andrew J.; Waag, Robert C.

    2010-10-01

    The fast multipole method (FMM) is applied to the solution of large-scale, three-dimensional acoustic scattering problems involving inhomogeneous objects defined on a regular grid. The grid arrangement is especially well suited to applications in which the scattering geometry is not known a priori and is reconstructed on a regular grid using iterative inverse scattering algorithms or other imaging techniques. The regular structure of unknown scattering elements facilitates a dramatic reduction in the amount of storage and computation required for the FMM, both of which scale linearly with the number of scattering elements. In particular, the use of fast Fourier transforms to compute Green's function convolutions required for neighboring interactions lowers the often-significant cost of finest-level FMM computations and helps mitigate the dependence of FMM cost on finest-level box size. Numerical results demonstrate the efficiency of the composite method as the number of scattering elements in each finest-level box is increased.

  20. The Fast Multipole Method and Fourier Convolution for the Solution of Acoustic Scattering on Regular Volumetric Grids.

    PubMed

    Hesford, Andrew J; Waag, Robert C

    2010-10-20

    The fast multipole method (FMM) is applied to the solution of large-scale, three-dimensional acoustic scattering problems involving inhomogeneous objects defined on a regular grid. The grid arrangement is especially well suited to applications in which the scattering geometry is not known a priori and is reconstructed on a regular grid using iterative inverse scattering algorithms or other imaging techniques. The regular structure of unknown scattering elements facilitates a dramatic reduction in the amount of storage and computation required for the FMM, both of which scale linearly with the number of scattering elements. In particular, the use of fast Fourier transforms to compute Green's function convolutions required for neighboring interactions lowers the often-significant cost of finest-level FMM computations and helps mitigate the dependence of FMM cost on finest-level box size. Numerical results demonstrate the efficiency of the composite method as the number of scattering elements in each finest-level box is increased.
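
    A minimal sketch of the FFT-convolution idea used for neighboring interactions on a regular grid: because the Green's function kernel is translation invariant, its convolution with gridded sources becomes a pointwise product in Fourier space, costing O(N log N) instead of O(N²). The free-space Helmholtz kernel, grid sizes, and regularization below are illustrative choices; the FMM bookkeeping of the papers is not reproduced.

```python
# FFT-based convolution of a gridded source distribution with a Green's function kernel.
import numpy as np

n, h, k = 64, 0.01, 2 * np.pi * 5           # grid size, spacing, wavenumber (illustrative)
coords = (np.arange(n) - n // 2) * h
x, y, z = np.meshgrid(coords, coords, coords, indexing="ij")
r = np.sqrt(x**2 + y**2 + z**2)
r[r == 0] = h / 2                            # crude regularization at the origin

green = np.exp(1j * k * r) / (4 * np.pi * r)       # 3-D Helmholtz Green's function samples
sources = np.zeros((n, n, n), dtype=complex)
sources[n // 2, n // 2, n // 2] = 1.0              # a single point source on the grid

# Zero-padded FFTs give a linear (non-circular) convolution of kernel and sources.
shape = [2 * n] * 3
field = np.fft.ifftn(np.fft.fftn(green, shape) * np.fft.fftn(sources, shape))[:n, :n, :n]
print(field.shape)
```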

  1. The Many-Headed Hydra: Information Networking at LAA.

    ERIC Educational Resources Information Center

    Winzenried, Arthur P.

    1997-01-01

    Describes an integrated computer library system installed at Lilydale Adventist Academy (LAA) in Melbourne (Australia) in response to a limited budget, increased demand, and greater user expectations. Topics include student workstations, cost effectiveness, CD-ROMS on local area networks, and student input regarding their needs. (Author/LRW)

  2. Desktop Virtualization in Action: Simplicity Is Power

    ERIC Educational Resources Information Center

    Fennell, Dustin

    2010-01-01

    Discover how your institution can better manage and increase access to instructional applications and desktops while providing a blended learning environment. Receive practical insight into how academic computing virtualization can be leveraged to enhance education at your institution while lowering Total Cost of Ownership (TCO) and reducing the…

  3. Performance limits and trade-offs in entropy-driven biochemical computers.

    PubMed

    Chu, Dominique

    2018-04-14

    It is now widely accepted that biochemical reaction networks can perform computations. Examples are kinetic proofreading, gene regulation, or signalling networks. For many of these systems it was found that their computational performance is limited by a trade-off between the metabolic cost, the speed and the accuracy of the computation. In order to gain insight into the origins of these trade-offs, we consider entropy-driven computers as a model of biochemical computation. Using tools from stochastic thermodynamics, we show that entropy-driven computation is subject to a trade-off between accuracy and metabolic cost, but does not involve time trade-offs. Time trade-offs appear when it is taken into account that the result of the computation needs to be measured in order to be known. We argue that this measurement process, although usually ignored, is a major contributor to the cost of biochemical computation. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. 75 FR 75911 - Adjustment of Monetary Threshold for Reporting Rail Equipment Accidents/Incidents for Calendar...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-07

    ...This rule increases the rail equipment accident/incident reporting threshold from $9,200 to $9,400 for certain railroad accidents/incidents involving property damage that occur during calendar year 2011. This action is needed to ensure that FRA's reporting requirements reflect cost increases that have occurred since the reporting threshold was last computed in December of 2009.

  5. Implications of the Third Industrial Revolution on the Elements of National Power and Their Impact on National Security Strategy

    DTIC Science & Technology

    1992-03-16

    "A Hidden U.S. Export: Higher Education." The Washington Post, 16 February 1992, H1 and H4. Brandin, David H., and Michael A. Harrison. The...frequent significant technological change now occurs within the individual person's working lifespan, life-long education is a necessity to remain...INDUSTRIAL REVOLUTION The phenomenal increase in speed and in raw power of computer processors, the shrinking size and cost of basic computing systems, the

  6. Dissociative Global and Local Task-Switching Costs Across Younger Adults, Middle-Aged Adults, Older Adults, and Very Mild Alzheimer Disease Individuals

    PubMed Central

    Huff, Mark J.; Balota, David A.; Minear, Meredith; Aschenbrenner, Andrew J.; Duchek, Janet M.

    2015-01-01

    A task-switching paradigm was used to examine differences in attentional control across younger adults, middle-aged adults, healthy older adults, and individuals classified in the earliest detectable stage of Alzheimer's disease (AD). A large sample of participants (570) completed a switching task in which participants were cued to classify the letter (consonant/vowel) or number (odd/even) task-set dimension of a bivalent stimulus (e.g., A 14), respectively. A Pure block consisting of single-task trials and a Switch block consisting of nonswitch and switch trials were completed. Local (switch vs. nonswitch trials) and global (nonswitch vs. pure trials) costs in mean error rates, mean response latencies, underlying reaction time distributions, along with stimulus-response congruency effects were computed. Local costs in errors were group invariant, but global costs in errors systematically increased as a function of age and AD. Response latencies yielded a strong dissociation: Local costs decreased across groups whereas global costs increased across groups. Vincentile distribution analyses revealed that the dissociation of local and global costs primarily occurred in the slowest response latencies. Stimulus-response congruency effects within the Switch block were particularly robust in accuracy in the very mild AD group. We argue that the results are consistent with the notion that the impaired groups show a reduced local cost because the task sets are not as well tuned, and hence produce minimal cost on switch trials. In contrast, global costs increase because of the additional burden on working memory of maintaining two task sets. PMID:26652720

  7. [Cost analysis for navigation in knee endoprosthetics].

    PubMed

    Cerha, O; Kirschner, S; Günther, K-P; Lützner, J

    2009-12-01

    Total knee arthroplasty (TKA) is one of the most frequent procedures in orthopaedic surgery. The outcome depends on a range of factors including alignment of the leg and the positioning of the implant in addition to patient-associated factors. Computer-assisted navigation systems can improve the restoration of a neutral leg alignment. This procedure has been established especially in Europe and North America. The additional expenses are not reimbursed in the German DRG system (Diagnosis Related Groups). In the present study a cost analysis of computer-assisted TKA compared to the conventional technique was performed. The acquisition expenses of various navigation systems (5- and 10-year depreciation), annual costs for maintenance and software updates as well as the accompanying costs per operation (consumables, additional operating time) were considered. The additional operating time was determined on the basis of a meta-analysis of the current literature. Situations with 25, 50, 100, 200 and 500 computer-assisted TKAs per year were simulated. The incremental costs of computer-assisted TKA depend mainly on the annual volume and the additional operating time. A relevant decrease in the incremental costs was detected between 50 and 100 procedures per year. In a model with 100 computer-assisted TKAs per year, an additional operating time of 14 minutes, and a 10-year depreciation of the investment costs, the incremental expenses amount to €300-395 per case, depending on the navigation system. Computer-assisted TKA is associated with additional costs. From an economic point of view, a volume of more than 50 procedures per year appears to be favourable. The cost-effectiveness could be estimated if long-term results show a reduction in revisions or a better clinical outcome.
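
    An illustrative calculation of the incremental cost per navigated TKA, following the cost components named above (straight-line depreciation of the navigation system, annual maintenance, consumables, and extra operating-room time). The input values are placeholders in the general range discussed, not the study's exact figures.

```python
# Incremental cost per navigated case = fixed costs spread over annual volume
# plus per-case consumables plus the cost of the extra operating-room minutes.
def incremental_cost_per_case(system_price, depreciation_years, annual_maintenance,
                              cases_per_year, consumables_per_case,
                              extra_or_minutes, or_cost_per_minute):
    fixed_per_case = (system_price / depreciation_years + annual_maintenance) / cases_per_year
    return fixed_per_case + consumables_per_case + extra_or_minutes * or_cost_per_minute

# e.g. 100 navigated TKAs per year, 10-year depreciation, 14 extra minutes per case
print(round(incremental_cost_per_case(
    system_price=60_000, depreciation_years=10, annual_maintenance=5_000,
    cases_per_year=100, consumables_per_case=120,
    extra_or_minutes=14, or_cost_per_minute=8), 2))
```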

  8. Increasing thermal efficiency of solar flat plate collectors

    NASA Astrophysics Data System (ADS)

    Pona, J.

    A study of methods to increase the efficiency of heat transfer in flat plate solar collectors is presented. In order to increase the heat transfer from the absorber plate to the working fluid inside the tubes, turbulent flow was induced by installing baffles within the tubes. The installation of the baffles resulted in a 7 to 12% increase in collector efficiency. Experiments were run on both 1 sq ft and 2 sq ft collectors, each fitted with either slotted baffles or tubular baffles. A computer program was run comparing the baffled collector to the standard collector. The results obtained from the computer program show that the baffled collectors have a 2.7% increase in life cycle cost (LCC) savings and a 3.6% increase in net cash flow for use in domestic hot water systems, and even greater increases when used in solar heating systems.

  9. Low-Cost Computer-Aided Instruction/Computer-Managed Instruction (CAI/CMI) System: Feasibility Study. Final Report.

    ERIC Educational Resources Information Center

    Lintz, Larry M.; And Others

    This study investigated the feasibility of a low cost computer-aided instruction/computer-managed instruction (CAI/CMI) system. Air Force instructors and training supervisors were surveyed to determine the potential payoffs of various CAI and CMI functions. Results indicated that a wide range of capabilities had potential for resident technical…

  10. Accelerating Astronomy & Astrophysics in the New Era of Parallel Computing: GPUs, Phi and Cloud Computing

    NASA Astrophysics Data System (ADS)

    Ford, Eric B.; Dindar, Saleh; Peters, Jorg

    2015-08-01

    The realism of astrophysical simulations and statistical analyses of astronomical data are set by the available computational resources. Thus, astronomers and astrophysicists are constantly pushing the limits of computational capabilities. For decades, astronomers benefited from massive improvements in computational power that were driven primarily by increasing clock speeds and required relatively little attention to details of the computational hardware. For nearly a decade, increases in computational capabilities have come primarily from increasing the degree of parallelism, rather than increasing clock speeds. Further increases in computational capabilities will likely be led by many-core architectures such as Graphical Processing Units (GPUs) and Intel Xeon Phi. Successfully harnessing these new architectures requires significantly more understanding of the hardware architecture, cache hierarchy, compiler capabilities and network characteristics. I will provide an astronomer's overview of the opportunities and challenges provided by modern many-core architectures and elastic cloud computing. The primary goal is to help an astronomical audience understand what types of problems are likely to yield more than order-of-magnitude speed-ups and which problems are unlikely to parallelize sufficiently efficiently to be worth the development time and/or costs. I will draw on my experience leading a team in developing the Swarm-NG library for parallel integration of large ensembles of small n-body systems on GPUs, as well as several smaller software projects. I will share lessons learned from collaborating with computer scientists, including both technical and soft skills. Finally, I will discuss the challenges of training the next generation of astronomers to be proficient in this new era of high-performance computing, drawing on experience teaching a graduate class on High-Performance Scientific Computing for Astrophysics and organizing a 2014 advanced summer school on Bayesian Computing for Astronomical Data Analysis with support of the Penn State Center for Astrostatistics and Institute for CyberScience.

  11. Network Computer Technology. Phase I: Viability and Promise within NASA's Desktop Computing Environment

    NASA Technical Reports Server (NTRS)

    Paluzzi, Peter; Miller, Rosalind; Kurihara, West; Eskey, Megan

    1998-01-01

    Over the past several months, major industry vendors have made a business case for the network computer as a win-win solution toward lowering total cost of ownership. This report provides results from Phase I of the Ames Research Center network computer evaluation project. It identifies factors to be considered for determining cost of ownership; further, it examines where, when, and how network computer technology might fit in NASA's desktop computing architecture.

  12. Computer programs for estimating civil aircraft economics

    NASA Technical Reports Server (NTRS)

    Maddalon, D. V.; Molloy, J. K.; Neubawer, M. J.

    1980-01-01

    Computer programs for calculating airline direct operating cost, indirect operating cost, and return on investment were developed to provide a means for determining commercial aircraft life cycle cost and economic performance. A representative wide body subsonic jet aircraft was evaluated to illustrate use of the programs.

  13. Cost-effectiveness of traffic enforcement: case study from Uganda.

    PubMed

    Bishai, D; Asiimwe, B; Abbas, S; Hyder, A A; Bazeyo, W

    2008-08-01

    In October 2004, the Ugandan Police department deployed enhanced traffic safety patrols on the four major roads to the capital Kampala. To assess the costs and potential effectiveness of increasing traffic enforcement in Uganda. Record review and key informant interviews were conducted at 10 police stations along the highways that were patrolled. Monthly data on traffic citations and casualties were reviewed for January 2001 to December 2005; time series (ARIMA) regression was used to assess for a statistically significant change in traffic deaths. Costs were computed from the perspective of the police department in 2005 US$. Cost offsets from savings to the health sector were not included. The annual cost of deploying the four squads of traffic patrols (20 officers, four vehicles, equipment, administration) is estimated at $72,000. Since deployment, the number of citations has increased substantially with a value of $327,311 annually. Monthly crash data pre- and post-intervention show a statistically significant 17% drop in road deaths after the intervention. The average cost-effectiveness of better road safety enforcement in Uganda is $603 per death averted or $27 per life year saved discounted at 3% (equivalent to 9% of Uganda's $300 GDP per capita). The costs of traffic safety enforcement are low in comparison to the potential number of lives saved and revenue generated. Increasing enforcement of existing traffic safety norms can prove to be an extremely cost-effective public health intervention in low-income countries, even from a government perspective.
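
    A sketch of the two summary ratios reported above: cost per death averted, and cost per discounted life year saved at the standard 3% rate. The deaths-averted count and remaining life expectancy below are hypothetical placeholders chosen only to show the arithmetic, not the study's underlying data.

```python
# Cost-effectiveness ratios with 3% discounting of future life years.
def discounted_life_years(years_gained_per_death, rate=0.03):
    """Sum of 1/(1+rate)^t for each future life year gained."""
    return sum(1.0 / (1.0 + rate) ** t for t in range(1, int(years_gained_per_death) + 1))

annual_program_cost = 72_000          # $ per year, as quoted for the four patrol squads
deaths_averted_per_year = 120         # hypothetical count
years_gained_per_death = 30           # hypothetical remaining life expectancy

cost_per_death = annual_program_cost / deaths_averted_per_year
cost_per_life_year = cost_per_death / discounted_life_years(years_gained_per_death)
print(round(cost_per_death), round(cost_per_life_year, 2))   # same order as $603 and $27
```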

  14. MC-GenomeKey: a multicloud system for the detection and annotation of genomic variants.

    PubMed

    Elshazly, Hatem; Souilmi, Yassine; Tonellato, Peter J; Wall, Dennis P; Abouelhoda, Mohamed

    2017-01-20

    Next-generation genome sequencing techniques have become affordable for massive sequencing efforts devoted to the clinical characterization of human diseases. However, the cost of providing cloud-based data analysis of the mounting datasets remains a concerning bottleneck for providing cost-effective clinical services. To address this computational problem, it is important to optimize the variant analysis workflow and the analysis tools used so as to reduce the overall computational processing time, and concomitantly reduce the processing cost. Furthermore, it is important to capitalize on recent developments in the cloud computing market, which has seen more providers competing in terms of products and prices. In this paper, we present a new package called MC-GenomeKey (Multi-Cloud GenomeKey) that efficiently executes the variant analysis workflow for detecting and annotating mutations using cloud resources from different commercial cloud providers. Our package supports Amazon, Google, and Azure clouds, as well as any other cloud platform based on OpenStack. Our package allows different scenarios of execution with different levels of sophistication, up to the one where a workflow can be executed using a cluster whose nodes come from different clouds. MC-GenomeKey also supports scenarios to exploit the spot instance model of Amazon in combination with the use of other cloud platforms to provide significant cost reduction. To the best of our knowledge, this is the first solution that optimizes the execution of the workflow using computational resources from different cloud providers. MC-GenomeKey provides an efficient multicloud based solution to detect and annotate mutations. The package can run in different commercial cloud platforms, which enables the user to seize the best offers. The package also provides a reliable means to make use of the low-cost spot instance model of Amazon, as it provides an efficient solution to the sudden termination of spot machines as a result of a sudden price increase. The package has a web-interface and it is available for free for academic use.

  15. Resource Provisioning in SLA-Based Cluster Computing

    NASA Astrophysics Data System (ADS)

    Xiong, Kaiqi; Suh, Sang

    Cluster computing is excellent for parallel computation. It has become increasingly popular. In cluster computing, a service level agreement (SLA) is a set of quality of services (QoS) and a fee agreed between a customer and an application service provider. It plays an important role in an e-business application. An application service provider uses a set of cluster computing resources to support e-business applications subject to an SLA. In this paper, the QoS includes percentile response time and cluster utilization. We present an approach for resource provisioning in such an environment that minimizes the total cost of cluster computing resources used by an application service provider for an e-business application that often requires parallel computation for high service performance, availability, and reliability while satisfying a QoS and a fee negotiated between a customer and the application service provider. Simulation experiments demonstrate the applicability of the approach.

  16. A parallel-processing approach to computing for the geographic sciences

    USGS Publications Warehouse

    Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Haga, Jim; Maddox, Brian; Feller, Mark

    2001-01-01

    The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting research into various areas, such as advanced computer architecture, algorithms to meet the processing needs for real-time image and data processing, the creation of custom datasets from seamless source data, rapid turn-around of products for emergency response, and support for computationally intense spatial and temporal modeling.

  17. Computational Fluid Dynamics of Whole-Body Aircraft

    NASA Astrophysics Data System (ADS)

    Agarwal, Ramesh

    1999-01-01

    The current state of the art in computational aerodynamics for whole-body aircraft flowfield simulations is described. Recent advances in geometry modeling, surface and volume grid generation, and flow simulation algorithms have led to accurate flowfield predictions for increasingly complex and realistic configurations. As a result, computational aerodynamics has emerged as a crucial enabling technology for the design and development of flight vehicles. Examples illustrating the current capability for the prediction of transport and fighter aircraft flowfields are presented. Unfortunately, accurate modeling of turbulence remains a major difficulty in the analysis of viscosity-dominated flows. In the future, inverse design methods, multidisciplinary design optimization methods, artificial intelligence technology, and massively parallel computer technology will be incorporated into computational aerodynamics, opening up greater opportunities for improved product design at substantially reduced costs.

  18. 10 CFR Appendix I to Part 504 - Procedures for the Computation of the Real Cost of Capital

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Procedures for the Computation of the Real Cost of Capital I Appendix I to Part 504 Energy DEPARTMENT OF ENERGY (CONTINUED) ALTERNATE FUELS EXISTING POWERPLANTS Pt. 504, App. I Appendix I to Part 504—Procedures for the Computation of the Real Cost of Capital (a) The firm's real after-tax weighted average...

  19. Grid connected integrated community energy system. Phase II: final state 2 report. Cost benefit analysis, operating costs and computer simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1978-03-22

    A grid-connected Integrated Community Energy System (ICES) with a coal-burning power plant located on the University of Minnesota campus is planned. The cost benefit analysis performed for this ICES, the cost accounting methods used, and a computer simulation of the operation of the power plant are described. (LCL)

  20. Credit Cards: What You Don't Know Can Cost You!

    ERIC Educational Resources Information Center

    Detweiler, Gerri

    1993-01-01

    The role of credit cards in personal finance has increased dramatically over the past two decades. Complex interest computation methods and additional fees often boost the price of credit card loans and help make credit cards the most profitable type of consumer loan for many lenders. (Author/JOW)

  1. Taking the Heat off the School Lunchroom.

    ERIC Educational Resources Information Center

    Lutz, Raymond P.; And Others

    The application of operations research techniques to a public school system's lunch program suggests a possible solution to the problem of rapidly increasing program costs. A computer-assisted menu planner was developed which generated a monthly set of menus satisfying nutritional and Federal standards, and food demand cycles. When compared to the…

  2. 77 FR 35466 - Pilot Project Grants in Support of Railroad Safety Risk Reduction Programs

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-13

    ... mobile telephones and laptop computers. This subpart was codified in response to an increase in the... FRA funding. Applications should include feasibility studies and cost estimates, if completed. FRA will more favorably consider applications that include these types of studies and estimates, as they...

  3. Robotic lower limb prosthesis design through simultaneous computer optimizations of human and prosthesis costs

    NASA Astrophysics Data System (ADS)

    Handford, Matthew L.; Srinivasan, Manoj

    2016-02-01

    Robotic lower limb prostheses can improve the quality of life for amputees. Development of such devices, currently dominated by long prototyping periods, could be sped up by predictive simulations. In contrast to some amputee simulations which track experimentally determined non-amputee walking kinematics, here, we explicitly model the human-prosthesis interaction to produce a prediction of the user's walking kinematics. We obtain simulations of an amputee using an ankle-foot prosthesis by simultaneously optimizing human movements and prosthesis actuation, minimizing a weighted sum of human metabolic and prosthesis costs. The resulting Pareto optimal solutions predict that increasing prosthesis energy cost, decreasing prosthesis mass, and allowing asymmetric gaits all decrease human metabolic rate for a given speed and alter human kinematics. The metabolic rates increase monotonically with speed. Remarkably, by performing an analogous optimization for a non-amputee human, we predict that an amputee walking with an appropriately optimized robotic prosthesis can have a lower metabolic cost, even lower than that predicted when the non-amputee's ankle torques are assumed to be cost-free.
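
    A generic statement of the weighted-sum formulation described above: sweeping the weight between the two cost terms traces out the Pareto front between human metabolic cost and prosthesis cost. The symbols are generic placeholders, not the authors' exact notation.

```latex
% Weighted-sum objective over human motion parameters and prosthesis actuation.
\[
\min_{\theta_{\mathrm{human}},\; u_{\mathrm{prosthesis}}}
  \; w\, C_{\mathrm{metabolic}}\!\left(\theta_{\mathrm{human}}\right)
  + (1 - w)\, C_{\mathrm{prosthesis}}\!\left(u_{\mathrm{prosthesis}}\right),
\qquad 0 \le w \le 1
\]
% Each value of w yields one Pareto-optimal gait; varying w maps the trade-off curve.
```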

  4. A Cost-Benefit Study of Doing Astrophysics On The Cloud: Production of Image Mosaics

    NASA Astrophysics Data System (ADS)

    Berriman, G. B.; Good, J. C.; Deelman, E.; Singh, G.; Livny, M.

    2009-09-01

    Utility grids such as the Amazon EC2 and Amazon S3 clouds offer computational and storage resources that can be used on-demand for a fee by compute- and data-intensive applications. The cost of running an application on such a cloud depends on the compute, storage and communication resources it will provision and consume. Different execution plans of the same application may result in significantly different costs. We studied via simulation the cost-performance trade-offs of different execution and resource provisioning plans by creating, under the Amazon cloud fee structure, mosaics with the Montage image mosaic engine, a widely used data- and compute-intensive application. Specifically, we studied the cost of building mosaics of 2MASS data that have sizes of 1, 2 and 4 square degrees, and a 2MASS all-sky mosaic. These are examples of mosaics commonly generated by astronomers. We also study these trade-offs in the context of the storage and communication fees of Amazon S3 when used for long-term application data archiving. Our results show that by provisioning the right amount of storage and compute resources, cost can be significantly reduced with no significant impact on application performance.
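
    A sketch of the cost model underlying such a study: a mosaic run's cloud bill is the sum of compute, storage, and data-transfer charges, so execution plans that trade recomputation against stored intermediate products land at different points on the cost curve. The rates below are placeholders, not the Amazon fee schedule used in the paper.

```python
# Simple cloud cost model: compute-hours plus storage-months plus egress.
def run_cost(cpu_hours, gb_stored_month, gb_transferred_out,
             rate_cpu_hour=0.10, rate_storage_gb_month=0.10, rate_transfer_gb=0.12):
    return (cpu_hours * rate_cpu_hour
            + gb_stored_month * rate_storage_gb_month
            + gb_transferred_out * rate_transfer_gb)

# Two hypothetical plans for the same mosaic: keep intermediates in storage,
# or recompute them and store less.
print(run_cost(cpu_hours=20, gb_stored_month=60, gb_transferred_out=5))
print(run_cost(cpu_hours=35, gb_stored_month=10, gb_transferred_out=5))
```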

  5. Wavelet-based multicomponent denoising on GPU to improve the classification of hyperspectral images

    NASA Astrophysics Data System (ADS)

    Quesada-Barriuso, Pablo; Heras, Dora B.; Argüello, Francisco; Mouriño, J. C.

    2017-10-01

    Supervised classification allows handling a wide range of remote sensing hyperspectral applications. Enhancing the spatial organization of the pixels over the image has proven to be beneficial for the interpretation of the image content, thus increasing the classification accuracy. Denoising in the spatial domain of the image has been shown as a technique that enhances the structures in the image. This paper proposes a multi-component denoising approach in order to increase the classification accuracy when a classification method is applied. It is computed on multicore CPUs and NVIDIA GPUs. The method combines feature extraction based on a 1D discrete wavelet transform (DWT) applied in the spectral dimension followed by an Extended Morphological Profile (EMP) and a classifier (SVM or ELM). The multi-component noise reduction is applied to the EMP just before the classification. The denoising recursively applies a separable 2D DWT after which the number of wavelet coefficients is reduced by using a threshold. Finally, inverse 2D-DWT filters are applied to reconstruct the noise-free original component. The computational cost of the classifiers as well as the cost of the whole classification chain is high but it is reduced achieving real-time behavior for some applications through their computation on NVIDIA multi-GPU platforms.
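
    A minimal sketch of the wavelet-threshold denoising step described above: a 2D DWT of one image component, soft-thresholding of the detail coefficients, and inverse DWT. The wavelet, decomposition level, and threshold are arbitrary illustrative choices, and no GPU implementation is shown; the paper's full chain (spectral 1D DWT, EMP, and classifier) is not reproduced here.

```python
# 2D wavelet soft-threshold denoising of a single image component (CPU-only sketch).
import numpy as np
import pywt

def denoise_component(band, wavelet="db2", level=2, threshold=0.1):
    coeffs = pywt.wavedec2(band, wavelet, level=level)
    approx, details = coeffs[0], coeffs[1:]
    # Shrink only the detail coefficients (horizontal, vertical, diagonal) at each level.
    shrunk = [tuple(pywt.threshold(d, threshold, mode="soft") for d in trio)
              for trio in details]
    return pywt.waverec2([approx] + shrunk, wavelet)

noisy = np.random.rand(128, 128)    # stand-in for one EMP component
clean = denoise_component(noisy)
print(clean.shape)
```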

  6. Construction and field test of a programmable and self-cleaning auto-sampler controlled by a low-cost one-board computer

    NASA Astrophysics Data System (ADS)

    Stadler, Philipp; Farnleitner, Andreas H.; Zessner, Matthias

    2016-04-01

    This presentation describes in depth how a low-cost micro-computer was used to substantially improve established measuring systems through the construction and implementation of a purpose-built complementary device for on-site sample pretreatment. A fully automated on-site device was developed and field-tested that enables water sampling with simultaneous filtration as well as an effective cleaning procedure of the device's components. The described auto-sampler is controlled by a low-cost one-board computer and designed for sample pre-treatment, with minimal sample alteration, to meet requirements of on-site measurement devices that cannot handle coarse suspended solids within the measurement procedure or -cycle. The automated sample pretreatment was tested for over one year for rapid and on-site enzymatic activity (beta-D-glucuronidase, GLUC) determination in sediment-laden stream water. The formerly used proprietary sampling set-up was assumed to lead to a significant damping of the measurement signal due to its susceptibility to clogging, debris and biofilm accumulation. Results show that the installation of the developed apparatus considerably enhanced error-free running time of connected measurement devices and increased the measurement accuracy to a previously unmatched quality.

  7. Systems cost/performance analysis (study 2.3). Volume 3: Programmer's manual and user's guide. [for unmanned spacecraft

    NASA Technical Reports Server (NTRS)

    Janz, R. F.

    1974-01-01

    The systems cost/performance model was implemented as a digital computer program to perform initial program planning, cost/performance tradeoffs, and sensitivity analyses. The computer is described along with the operating environment in which the program was written and checked, the program specifications such as discussions of logic and computational flow, the different subsystem models involved in the design of the spacecraft, and routines involved in the nondesign area such as costing and scheduling of the design. Preliminary results for the DSCS-II design are also included.

  8. Long range Debye-Hückel correction for computation of grid-based electrostatic forces between biomacromolecules

    PubMed Central

    2014-01-01

    Background Brownian dynamics (BD) simulations can be used to study very large molecular systems, such as models of the intracellular environment, using atomic-detail structures. Such simulations require strategies to contain the computational costs, especially for the computation of interaction forces and energies. A common approach is to compute interaction forces between macromolecules by precomputing their interaction potentials on three-dimensional discretized grids. For long-range interactions, such as electrostatics, grid-based methods are subject to finite size errors. We describe here the implementation of a Debye-Hückel correction to the grid-based electrostatic potential used in the SDA BD simulation software that was applied to simulate solutions of bovine serum albumin and of hen egg white lysozyme. Results We found that the inclusion of the long-range electrostatic correction increased the accuracy of both the protein-protein interaction profiles and the protein diffusion coefficients at low ionic strength. Conclusions An advantage of this method is the low additional computational cost required to treat long-range electrostatic interactions in large biomacromolecular systems. Moreover, the implementation described here for BD simulations of protein solutions can also be applied in implicit solvent molecular dynamics simulations that make use of gridded interaction potentials. PMID:25045516
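
    A sketch of the screened (Debye-Hückel) Coulomb interaction used to correct grid-based electrostatics at long range: beyond the precomputed grid, the potential between effective charges decays as exp(-κr)/r, where κ is the inverse Debye length set by the ionic strength. This is the generic formula in SI units, not the SDA implementation.

```python
# Screened Coulomb (Debye-Hueckel) interaction energy between two point charges.
import math

EPS0 = 8.8541878128e-12        # vacuum permittivity, F/m

def debye_huckel_energy(q1, q2, r, eps_r, kappa):
    """Interaction energy (J) of charges q1, q2 (C) at distance r (m) in a solvent with
    relative permittivity eps_r and inverse Debye length kappa (1/m)."""
    return q1 * q2 * math.exp(-kappa * r) / (4.0 * math.pi * EPS0 * eps_r * r)

E = 1.602176634e-19            # elementary charge, C
# Two unit charges 5 nm apart in water (eps_r ~ 78) at roughly 100 mM ionic strength,
# i.e. a Debye length of about 1 nm (kappa ~ 1e9 1/m):
print(debye_huckel_energy(E, E, 5e-9, 78.0, 1e9))
```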

  9. One laptop per child, local refurbishment or overseas donations? Sustainability assessment of computer supply scenarios for schools in Colombia.

    PubMed

    Streicher-Porte, Martin; Marthaler, Christian; Böni, Heinz; Schluep, Mathias; Camacho, Angel; Hilty, Lorenz M

    2009-08-01

    With the intention of bridging the 'digital divide', many programmes have been launched to provide computers for educational institutions, ranging from refurbishing second-hand computers to delivering low-cost new computers. The fast and economical provision of large quantities of equipment is one of the many challenges faced by such programmes. If an increase is to be achieved in the sustainability of computer supplies for schools, not only must equipment be provided, but also suitable training and maintenance delivered. Furthermore, appropriate recycling has to be ensured, so that end-of-life equipment can be dealt with properly. This study has evaluated the suitability of three computer supply scenarios for schools in Colombia: (i) 'Colombian refurbishment', refurbishment of computers donated in Colombia; (ii) 'Overseas refurbishment', import of computers donated and refurbished abroad; and (iii) 'XO Laptop', purchase of low-cost computers manufactured in Korea. The methods applied were Material Flow Assessment (to assess the quantities), Life Cycle Assessment (to assess the environmental impacts), and Multiple Attribute Utility Theory (to analyse, evaluate and compare the different scenarios). The most sustainable solution proved to be the local refurbishment of second-hand computers of Colombian origin to an appropriate technical standard. The environmental impacts of such practices need to be evaluated carefully, as second-hand appliances have to be maintained, require spare parts and sometimes use more energy than newer equipment. Providing schools with second-hand computers from overseas and through programmes such as 'One Laptop Per Child' has the disadvantage that the potential for social improvements, such as creation of jobs and local industry involvement, is very low.

  10. Micromagnetics on high-performance workstation and mobile computational platforms

    NASA Astrophysics Data System (ADS)

    Fu, S.; Chang, R.; Couture, S.; Menarini, M.; Escobar, M. A.; Kuteifan, M.; Lubarda, M.; Gabay, D.; Lomakin, V.

    2015-05-01

    The feasibility of using high-performance desktop and embedded mobile computational platforms is presented, including multi-core Intel central processing units, Nvidia desktop graphics processing units, and the Nvidia Jetson TK1 platform. The FastMag finite element method-based micromagnetic simulator is used as a testbed, showing high efficiency on all the platforms. Optimization aspects of improving the performance of the mobile systems are discussed. The high performance, low cost, low power consumption, and rapid performance increase of the embedded mobile systems make them a promising candidate for micromagnetic simulations. Such architectures can be used as standalone systems or can be built as low-power computing clusters.

  11. 75 FR 75393 - Schools and Libraries Universal Service Support Mechanism and A National Broadband Plan for Our...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-03

    ... anchors, both as centers for digital literacy and as hubs for access to public computers. While their... expansion of computer labs, and facilitated deployment of new educational applications that would not have... computer fees to help defray the cost of computers or training fees to help cover the cost of training...

  12. Exploring Discretization Error in Simulation-Based Aerodynamic Databases

    NASA Technical Reports Server (NTRS)

    Aftosmis, Michael J.; Nemec, Marian

    2010-01-01

    This work examines the level of discretization error in simulation-based aerodynamic databases and introduces strategies for error control. Simulations are performed using a parallel, multi-level Euler solver on embedded-boundary Cartesian meshes. Discretization errors in user-selected outputs are estimated using the method of adjoint-weighted residuals, and we use adaptive mesh refinement to reduce these errors to specified tolerances. Using this framework, we examine the behavior of discretization error throughout a token database computed for a NACA 0012 airfoil consisting of 120 cases. We compare the cost and accuracy of two approaches for aerodynamic database generation. In the first approach, mesh adaptation is used to compute all cases in the database to a prescribed level of accuracy. The second approach conducts all simulations using the same computational mesh without adaptation. We quantitatively assess the error landscape and computational costs in both databases. This investigation highlights sensitivities of the database under a variety of conditions. The presence of transonic shocks or the stiffness in the governing equations near the incompressible limit are shown to dramatically increase discretization error requiring additional mesh resolution to control. Results show that such pathologies lead to error levels that vary by over a factor of 40 when using a fixed mesh throughout the database. Alternatively, controlling this sensitivity through mesh adaptation leads to mesh sizes which span two orders of magnitude. We propose strategies to minimize simulation cost in sensitive regions and discuss the role of error estimation in database quality.
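
    For reference, an adjoint-weighted residual estimate of the error in an output has the generic form below; the notation is the standard textbook one, not necessarily that of the paper.

```latex
% Generic adjoint-weighted residual estimate of the discretization error in an output J:
% psi_h is the discrete adjoint solution associated with J, and R_h(Q) is the residual of
% the governing equations evaluated with a reconstructed (finer-space) solution Q.
\[
\delta J \;\approx\; -\,\psi_h^{T}\, R_h\!\left(Q\right)
\]
% Cells contributing large weighted residuals are flagged for refinement until the
% estimated error in J falls below the prescribed tolerance.
```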

  13. Conference summary: computers in respiratory care.

    PubMed

    Nelson, Steven B

    2004-05-01

    Computers and data management in respiratory care reflect the larger practices of hospital information systems: the diversity of conference topics provides evidence. Respiratory care computing has shown a steady, slow progression from writing programs that calculate shunt equations to departmental management systems. Wider acceptance and utilization have been stifled by costs, both initial and on-going. Several authors pointed out the savings that were realized from information systems exceeded the costs of implementation and maintenance. The most significant finding from one of the presentations was that no other structure or skilled personnel could provide respiratory care more efficiently or cost-effectively than respiratory therapists. Online information resources have increased, in forms ranging from peer-reviewed journals to corporate-sponsored advertising posing as authoritative treatment regimens. Practitioners and patients need to know how to use these resources as well as how to judge the value of information they present. Departments are using computers for training on a schedule that is more convenient for the staff, providing information in a timely manner and potentially in more useful formats. Portable devices, such as personal digital assistants (PDAs) have improved the ability not only to share data to dispersed locations, but also to collect data at the point of care, thus greatly improving data capture. Ventilators are changing from simple automated bellows to complex systems collecting numerous respiratory parameters and offering feedback to improve ventilation. Clinical databases routinely collect information from a wide variety of resources and can be used for analysis to improve patient outcomes. What could possibly go wrong?

  14. IUWare and Computing Tools: Indiana University's Approach to Low-Cost Software.

    ERIC Educational Resources Information Center

    Sheehan, Mark C.; Williams, James G.

    1987-01-01

    Describes strategies for providing low-cost microcomputer-based software for classroom use on college campuses. Highlights include descriptions of the software (IUWare and Computing Tools); computing center support; license policies; documentation; promotion; distribution; staff, faculty, and user training; problems; and future plans. (LRW)

  15. The Next Generation of Personal Computers.

    ERIC Educational Resources Information Center

    Crecine, John P.

    1986-01-01

    Discusses factors converging to create high-capacity, low-cost nature of next generation of microcomputers: a coherent vision of what graphics workstation and future computing environment should be like; hardware developments leading to greater storage capacity at lower costs; and development of software and expertise to exploit computing power…

  16. GT-WGS: an efficient and economic tool for large-scale WGS analyses based on the AWS cloud service.

    PubMed

    Wang, Yiqi; Li, Gen; Ma, Mark; He, Fazhong; Song, Zhuo; Zhang, Wei; Wu, Chengkun

    2018-01-19

    Whole-genome sequencing (WGS) plays an increasingly important role in clinical practice and public health. Due to the large data volumes involved, WGS data analysis is usually compute-intensive and IO-intensive. Currently it usually takes 30 to 40 h to finish a 50× WGS analysis task, which is far from the ideal speed required by the industry. Furthermore, the high-end infrastructure required by WGS computing is costly in terms of time and money. In this paper, we aim to improve the time efficiency of WGS analysis and minimize the cost by elastic cloud computing. We developed a distributed system, GT-WGS, for large-scale WGS analyses utilizing the Amazon Web Services (AWS). Our system won first prize in the Wind and Cloud challenge held by the Genomics and Cloud Technology Alliance (GCTA) conference committee. The system makes full use of the dynamic pricing mechanism of AWS. We evaluate the performance of GT-WGS with a 55× WGS dataset (400GB fastq) provided by the GCTA 2017 competition. In the best case, it took only 18.4 min to finish the analysis, and the AWS cost of the whole process was only 16.5 US dollars. The accuracy of GT-WGS is 99.9% consistent with that of the Genome Analysis Toolkit (GATK) best practice. We also evaluated the performance of GT-WGS on a real-world dataset provided by the XiangYa hospital, which consists of 5× whole-genome data for 500 samples; on average, GT-WGS finished one 5× WGS analysis task in 2.4 min at a cost of $3.6. WGS is already playing an important role in guiding therapeutic intervention. However, its application is limited by the time cost and computing cost. GT-WGS excelled as an efficient and affordable WGS analysis tool to address this problem. The demo video and supplementary materials of GT-WGS can be accessed at https://github.com/Genetalks/wgs_analysis_demo.

  17. Integration of active pauses and pattern of muscular activity during computer work.

    PubMed

    St-Onge, Nancy; Samani, Afshin; Madeleine, Pascal

    2017-09-01

    Submaximal isometric muscle contractions have been reported to increase variability of muscle activation during computer work; however, other types of active contractions may be more beneficial. Our objective was to determine which type of active pause vs. rest is more efficient in changing muscle activity pattern during a computer task. Asymptomatic regular computer users performed a standardised 20-min computer task four times, integrating a different type of pause: sub-maximal isometric contraction, dynamic contraction, postural exercise and rest. Surface electromyographic (SEMG) activity was recorded bilaterally from five neck/shoulder muscles. Root-mean-square decreased with isometric pauses in the cervical paraspinals, upper trapezius and middle trapezius, whereas it increased with rest. Variability in the pattern of muscular activity was not affected by any type of pause. Overall, no detrimental effects on the level of SEMG during active pauses were found suggesting that they could be implemented without a cost on activation level or variability. Practitioner Summary: We aimed to determine which type of active pause vs. rest is best in changing muscle activity pattern during a computer task. Asymptomatic computer users performed a standardised computer task integrating different types of pauses. Muscle activation decreased with isometric pauses in neck/shoulder muscles, suggesting their implementation during computer work.

  18. Overuse of Diagnostic Imaging for Work-Related Injuries.

    PubMed

    Clendenin, Brianna Rebecca; Conlon, Helen Acree; Burns, Candace

    2017-02-01

    Overuse of health care in the United States is a growing concern. This article addresses the use of diagnostic imaging for work-related injuries. Diagnostic imaging drives substantial cost increases in workers' compensation. Despite guidelines published by the American College of Radiology, the American College of Occupational Medicine, and the Official Disability Guidelines, practitioners are ordering imaging sooner than recommended. Workers are exposed to unnecessary radiation and are incurring increasing costs without evidence of better outcomes. Practitioners caring for workers and submitting workers' compensation claims should adhere to official guidelines, using their professional judgment to consider the financial impact and health outcomes of diagnostic imaging, including computed tomography, magnetic resonance imaging, nuclear medicine imaging, radiography, and ultrasound.

  19. Computationally Efficient Characterization of Potential Energy Surfaces Based on Fingerprint Distances

    NASA Astrophysics Data System (ADS)

    Schaefer, Bastian; Goedecker, Stefan; Goedecker Group Team

    Based on Lennard-Jones, Silicon, Sodium-Chloride and Gold clusters, it was found that uphill barrier energies of transition states between directly connected minima tend to increase with increasing structural differences of the two minima. Building on this insight, it also turned out that post-processing minima hopping data at a negligible computational cost allows one to obtain qualitative topological information on potential energy surfaces that can be stored in so-called qualitative connectivity databases. These qualitative connectivity databases are used for generating fingerprint disconnectivity graphs that allow one to obtain a first qualitative idea of the thermodynamic and kinetic properties of a system of interest. This research was supported by the NCCR MARVEL, funded by the Swiss National Science Foundation. Computer time was provided by the Swiss National Supercomputing Centre (CSCS) under Project ID No. s499.

  20. Cone beam computed tomography and intraoral radiography for diagnosis of dental abnormalities in dogs and cats

    PubMed Central

    Silva, Luiz Antonio F.; Barriviera, Mauricio; Januário, Alessandro L.; Bezerra, Ana Cristina B.; Fioravanti, Maria Clorinda S.

    2011-01-01

    The development of veterinary dentistry has substantially improved the ability to diagnose canine and feline dental abnormalities. Consequently, examinations previously performed only on humans are now available for small animals, thus improving the diagnostic quality. This has increased the need for technical qualification of veterinary professionals and increased technological investments. This study evaluated the use of cone beam computed tomography and intraoral radiography as complementary exams for diagnosing dental abnormalities in dogs and cats. Cone beam computed tomography provided faster image acquisition with high image quality, was associated with low ionizing radiation levels, enabled image editing, and reduced the exam duration. Our results showed that intraoral radiography was an effective method for dental examination with low cost and fast execution times, and can be performed during surgical procedures. PMID:22122905

  1. 12 CFR 1402.21 - Categories of requesters-fees.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... searches made by computer, the Farm Credit System Insurance Corporation will determine the hourly cost of... the cost of search (including the operator time and the cost of operating the computer to process a... 1402.21 Banks and Banking FARM CREDIT SYSTEM INSURANCE CORPORATION RELEASING INFORMATION Fees for...

  2. 12 CFR 1402.21 - Categories of requesters-fees.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... searches made by computer, the Farm Credit System Insurance Corporation will determine the hourly cost of... the cost of search (including the operator time and the cost of operating the computer to process a... 1402.21 Banks and Banking FARM CREDIT SYSTEM INSURANCE CORPORATION RELEASING INFORMATION Fees for...

  3. 12 CFR 1402.21 - Categories of requesters-fees.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... searches made by computer, the Farm Credit System Insurance Corporation will determine the hourly cost of... the cost of search (including the operator time and the cost of operating the computer to process a... 1402.21 Banks and Banking FARM CREDIT SYSTEM INSURANCE CORPORATION RELEASING INFORMATION Fees for...

  4. 12 CFR 1402.21 - Categories of requesters-fees.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... searches made by computer, the Farm Credit System Insurance Corporation will determine the hourly cost of... the cost of search (including the operator time and the cost of operating the computer to process a... 1402.21 Banks and Banking FARM CREDIT SYSTEM INSURANCE CORPORATION RELEASING INFORMATION Fees for...

  5. 12 CFR 1402.21 - Categories of requesters-fees.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... searches made by computer, the Farm Credit System Insurance Corporation will determine the hourly cost of... the cost of search (including the operator time and the cost of operating the computer to process a... 1402.21 Banks and Banking FARM CREDIT SYSTEM INSURANCE CORPORATION RELEASING INFORMATION Fees for...

  6. Analysis of contemporary HIV/AIDS health care costs in Germany

    PubMed Central

    Treskova, Marina; Kuhlmann, Alexander; Bogner, Johannes; Hower, Martin; Heiken, Hans; Stellbrink, Hans-Jürgen; Mahlich, Jörg; von der Schulenburg, Johann-Matthias Graf; Stoll, Matthias

    2016-01-01

    To analyze contemporary costs of HIV health care and the cost distribution across lines of combination antiretroviral therapy (cART). To identify variations in expenditures with patient characteristics and to identify main cost determinants. To compute cost ratios between patients with varying characteristics. Empirical data on costs are collected in Germany within a 2-year prospective observational noninterventional multicenter study. The database contains information for 1154 HIV-infected patients from 8 medical centers. Means and standard deviations of the total costs are estimated for each cost fraction and across cART lines and regimens. The costs are regressed against various patient characteristics using a generalized linear model. Relative costs are calculated using the resultant coefficients. The average annual total costs (SD) per patient are €22,231.03 (8786.13) with a maximum of €83,970. cART medication is the major cost fraction (83.8%) with a mean of €18,688.62 (5289.48). The major cost-driving factors are cART regimen, CD4-T cell count, cART drug resistance, and concomitant diseases. Viral load, pathology tests, and demographics have no significant impact. Standard non-nucleoside reverse transcriptase inhibitor-based regimens induce 28% lower total costs compared with standard PI/r regimens. Resistance to 3 or more antiretroviral classes induces a significant increase in costs. HIV treatment in Germany continues to be expensive. The majority of costs are attributable to cART. Main cost determinants are CD4-T cell count, comorbidity, genotypic antiviral resistance, and therapy regimen. Combinations of characteristics associated with higher expenditures enhance the increasing effect on costs and induce high-cost cases. PMID:27367993
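    As a minimal sketch of the modelling step described above (a generalized linear model for costs whose exponentiated coefficients act as relative costs), the following uses synthetic data and hypothetical covariate names; it is not the study's dataset or model specification.

```python
# Hedged sketch: a Gamma GLM (log link) for annual costs whose exponentiated
# coefficients act as multiplicative cost ratios. Data and covariate names are
# synthetic stand-ins, not the study's variables.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "low_cd4":     rng.integers(0, 2, n),   # CD4-T cell count below a threshold
    "resistance":  rng.integers(0, 2, n),   # genotypic drug resistance present
    "comorbidity": rng.integers(0, 2, n),   # concomitant disease present
})
lin = np.log(18000) + 0.15 * df["low_cd4"] + 0.30 * df["resistance"] + 0.20 * df["comorbidity"]
df["cost"] = rng.gamma(5.0, np.exp(lin.to_numpy()) / 5.0)   # synthetic annual costs

fit = smf.glm("cost ~ low_cd4 + resistance + comorbidity", data=df,
              family=sm.families.Gamma(link=sm.families.links.Log())).fit()
print(np.exp(fit.params))   # relative costs, e.g. exp(coef) for resistance
```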

  7. The Economic Value of Long-Lasting Insecticidal Nets and Indoor Residual Spraying Implementation in Mozambique

    PubMed Central

    Lee, Bruce Y.; Bartsch, Sarah M.; Stone, Nathan T. B.; Zhang, Shufang; Brown, Shawn T.; Chatterjee, Chandrani; DePasse, Jay V.; Zenkov, Eli; Briët, Olivier J. T.; Mendis, Chandana; Viisainen, Kirsi; Candrinho, Baltazar; Colborn, James

    2017-01-01

    Malaria-endemic countries have to decide how much of their limited resources for vector control to allocate toward implementing long-lasting insecticidal nets (LLINs) versus indoor residual spraying (IRS). To help the Mozambique Ministry of Health use an evidence-based approach to determine funding allocation toward various malaria control strategies, the Global Fund convened the Mozambique Modeling Working Group which then used JANUS, a software platform that includes integrated computational economic, operational, and clinical outcome models that can link with different transmission models (in this case, OpenMalaria) to determine the economic value of vector control strategies. Any increase in LLINs (from 80% baseline coverage) or IRS (from 80% baseline coverage) would be cost-effective (incremental cost-effectiveness ratios ≤ $114/disability-adjusted life year averted). However, LLIN coverage increases tend to be more cost-effective than similar IRS coverage increases, except where both pyrethroid resistance is high and LLIN usage is low. In high-transmission northern regions, increasing LLIN coverage would be more cost-effective than increasing IRS coverage. In medium-transmission central regions, changing from LLINs to IRS would be more costly and less effective. In low-transmission southern regions, LLINs were more costly and less effective than IRS, due to low LLIN usage. In regions where LLINs are more cost-effective than IRS, it is worth considering prioritizing LLIN coverage and use. However, IRS may have an important role in insecticide resistance management and epidemic control. Malaria intervention campaigns are not a one-size-fits-all solution, and tailored approaches are necessary to account for the heterogeneity of malaria epidemiology. PMID:28719286

  8. Design and implementation of a UNIX based distributed computing system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Love, J.S.; Michael, M.W.

    1994-12-31

    We have designed, implemented, and are running a corporate-wide distributed processing batch queue on a large number of networked workstations using the UNIX® operating system. Atlas Wireline researchers and scientists have used the system for over a year. The large increase in available computer power has greatly reduced the time required for nuclear and electromagnetic tool modeling. Use of remote distributed computing has simultaneously reduced computation costs and increased usable computer time. The system integrates equipment from different manufacturers, using various CPU architectures, distinct operating system revisions, and even multiple processors per machine. Various differences between the machines have to be accounted for in the master scheduler. These differences include shells, command sets, swap spaces, memory sizes, CPU sizes, and OS revision levels. Remote processing across a network must be performed in a manner that is seamless from the users' perspective. The system currently uses IBM RISC System/6000®, SPARCstation™, HP9000s700, HP9000s800, and DEC Alpha AXP™ machines. Each CPU in the network has its own speed rating, allowed working hours, and workload parameters. The system is designed so that all of the computers in the network can be optimally scheduled without adversely impacting the primary users of the machines. The increase in the total usable computational capacity by means of distributed batch computing can change corporate computing strategy. The integration of disparate computer platforms eliminates the need to buy one type of computer for computations, another for graphics, and yet another for day-to-day operations. It might be possible, for example, to meet all research and engineering computing needs with existing networked computers.

  9. From computers to ubiquitous computing by 2010: health care.

    PubMed

    Aziz, Omer; Lo, Benny; Pansiot, Julien; Atallah, Louis; Yang, Guang-Zhong; Darzi, Ara

    2008-10-28

    Over the past decade, miniaturization and cost reduction in semiconductors have led to computers smaller in size than a pinhead with powerful processing abilities that are affordable enough to be disposable. Similar advances in wireless communication, sensor design and energy storage have meant that the concept of a truly pervasive 'wireless sensor network', used to monitor environments and objects within them, has become a reality. The need for a wireless sensor network designed specifically for human body monitoring has led to the development of wireless 'body sensor network' (BSN) platforms composed of tiny integrated microsensors with on-board processing and wireless data transfer capability. The ubiquitous computing abilities of BSNs offer the prospect of continuous monitoring of human health in any environment, be it home, hospital, outdoors or the workplace. This pervasive technology comes at a time when Western world health care costs have sharply risen, reflected by increasing expenditure on health care as a proportion of gross domestic product over the last 20 years. Drivers of this rise include an ageing post 'baby boom' population, higher incidence of chronic disease and the need for earlier diagnosis. This paper outlines the role of pervasive health care technologies in providing more efficient health care.

  10. Aether: leveraging linear programming for optimal cloud computing in genomics

    PubMed Central

    Luber, Jacob M; Tierney, Braden T; Cofer, Evan M; Patel, Chirag J

    2018-01-01

    Abstract Motivation Across biology, we are seeing rapid developments in scale of data production without a corresponding increase in data analysis capabilities. Results Here, we present Aether (http://aether.kosticlab.org), an intuitive, easy-to-use, cost-effective and scalable framework that uses linear programming to optimally bid on and deploy combinations of underutilized cloud computing resources. Our approach simultaneously minimizes the cost of data analysis and provides an easy transition from users’ existing HPC pipelines. Availability and implementation Data utilized are available at https://pubs.broadinstitute.org/diabimmune and with EBI SRA accession ERP005989. Source code is available at (https://github.com/kosticlab/aether). Examples, documentation and a tutorial are available at http://aether.kosticlab.org. Contact chirag_patel@hms.harvard.edu or aleksandar.kostic@joslin.harvard.edu Supplementary information Supplementary data are available at Bioinformatics online. PMID:29228186
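    The optimization idea named in the abstract (linear programming over combinations of cloud resources) can be sketched as a small allocation problem. The instance names, prices, and resource requirements below are illustrative assumptions, not Aether's actual formulation or real AWS pricing.

```python
# Hedged sketch of the general idea only (not Aether's actual formulation or
# real AWS prices): pick an instance mix of minimum hourly cost that covers
# the workload's aggregate vCPU and memory requirements.
from scipy.optimize import linprog   # integrality support needs SciPy >= 1.9

instances = {                 # name: (vCPUs, memory_GB, hourly_price) -- assumed
    "small":  (4,   16, 0.12),
    "medium": (8,   32, 0.22),
    "large":  (16, 128, 0.55),
}
need_cpu, need_mem = 256, 1024          # assumed workload requirements

c = [price for _, _, price in instances.values()]            # minimize hourly cost
A_ub = [[-cpu for cpu, _, _ in instances.values()],          # -sum(cpu) <= -need_cpu
        [-mem for _, mem, _ in instances.values()]]          # -sum(mem) <= -need_mem
b_ub = [-need_cpu, -need_mem]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * len(instances),
              integrality=[1] * len(instances), method="highs")
print(dict(zip(instances, res.x)), f"hourly cost: {res.fun:.2f}")
```

    In practice the cost vector would reflect current spot prices and the constraints the pipeline's resource profile; the point here is only the shape of the optimization.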

  11. An evaluation of superminicomputers for thermal analysis

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Vidal, J. B.; Jones, G. K.

    1982-01-01

    The use of superminicomputers for solving a series of increasingly complex thermal analysis problems is investigated. The approach involved (1) installation and verification of the SPAR thermal analyzer software on superminicomputers at Langley Research Center and Goddard Space Flight Center, (2) solution of six increasingly complex thermal problems on this equipment, and (3) comparison of solution (accuracy, CPU time, turnaround time, and cost) with solutions on large mainframe computers.

  12. Using multi-level remote sensing and ground data to estimate forest biomass resources in remote regions: a case study in the boreal forests of interior Alaska

    Treesearch

    Hans-Erik Andersen; Strunk Jacob; Hailemariam Temesgen; Donald Atwood; Ken Winterberger

    2012-01-01

    The emergence of a new generation of remote sensing and geopositioning technologies, as well as increased capabilities in image processing, computing, and inferential techniques, have enabled the development and implementation of increasingly efficient and cost-effective multilevel sampling designs for forest inventory. In this paper, we (i) describe the conceptual...

  13. Costs of cloud computing for a biometry department. A case study.

    PubMed

    Knaus, J; Hieke, S; Binder, H; Schwarzer, G

    2013-01-01

    "Cloud" computing providers, such as the Amazon Web Services (AWS), offer stable and scalable computational resources based on hardware virtualization, with short, usually hourly, billing periods. The idea of pay-as-you-use seems appealing for biometry research units which have only limited access to university or corporate data center resources or grids. This case study compares the costs of an existing heterogeneous on-site hardware pool in a Medical Biometry and Statistics department to a comparable AWS offer. The "total cost of ownership", including all direct costs, is determined for the on-site hardware, and hourly prices are derived, based on actual system utilization during the year 2011. Indirect costs, which are difficult to quantify are not included in this comparison, but nevertheless some rough guidance from our experience is given. To indicate the scale of costs for a methodological research project, a simulation study of a permutation-based statistical approach is performed using AWS and on-site hardware. In the presented case, with a system utilization of 25-30 percent and 3-5-year amortization, on-site hardware can result in smaller costs, compared to hourly rental in the cloud dependent on the instance chosen. Renting cloud instances with sufficient main memory is a deciding factor in this comparison. Costs for on-site hardware may vary, depending on the specific infrastructure at a research unit, but have only moderate impact on the overall comparison and subsequent decision for obtaining affordable scientific computing resources. Overall utilization has a much stronger impact as it determines the actual computing hours needed per year. Taking this into ac count, cloud computing might still be a viable option for projects with limited maturity, or as a supplement for short peaks in demand.

  14. Economic evaluation comparing intraoperative cone beam CT-based navigation and conventional fluoroscopy for the placement of spinal pedicle screws: a patient-level data cost-effectiveness analysis.

    PubMed

    Dea, Nicolas; Fisher, Charles G; Batke, Juliet; Strelzow, Jason; Mendelsohn, Daniel; Paquette, Scott J; Kwon, Brian K; Boyd, Michael D; Dvorak, Marcel F S; Street, John T

    2016-01-01

    Pedicle screws are routinely used in contemporary spinal surgery. Screw misplacement may be asymptomatic but is also correlated with potential adverse events. Computer-assisted surgery (CAS) has been associated with improved screw placement accuracy rates. However, this technology has substantial acquisition and maintenance costs. Despite its increasing usage, no rigorous full economic evaluation comparing this technology to current standard of care has been reported. Medical costs are exploding in an unsustainable way. Health economic theory requires that medical equipment costs be compared with expected benefits. To answer this question for computer-assisted spinal surgery, we present an economic evaluation looking specifically at symptomatic misplaced screws leading to reoperation secondary to neurologic deficits or biomechanical concerns. The study design was an observational case-control study from prospectively collected data of consecutive patients treated with the aid of CAS (treatment group) compared with a matched historical cohort of patients treated with conventional fluoroscopy (control group). The patient sample consisted of consecutive patients treated surgically at a quaternary academic center. The primary effectiveness measure studied was the number of reoperations for misplaced screws within 1 year of the index surgery. Secondary outcome measures included were total adverse event rate and postoperative computed tomography usage for pedicle screw examination. A patient-level data cost-effectiveness analysis from the hospital perspective was conducted to determine the value of a navigation system coupled with intraoperative 3-D imaging (O-arm Imaging and the StealthStation S7 Navigation Systems, Medtronic, Louisville, CO, USA) in adult spinal surgery. The capital costs for both alternatives were reported as equivalent annual costs based on the annuitization of capital expenditures method using a 3% discount rate and a 7-year amortization period. Annual maintenance costs were also added. Finally, reoperation costs using a micro-costing approach were calculated for both groups. An incremental cost-effectiveness ratio was calculated and reported as cost per reoperation avoided. Based on reoperation costs in Canada and in the United States, a minimal caseload was calculated for the more expensive alternative to be cost saving. Sensitivity analyses were also conducted. A total of 5,132 pedicle screws were inserted in 502 patients during the study period: 2,682 screws in 253 patients in the treatment group and 2,450 screws in 249 patients in the control group. Overall accuracy rates were 95.2% for the treatment group and 86.9% for the control group. Within 1 year post treatment, two patients (0.8%) required a revision surgery in the treatment group compared with 15 patients (6%) in the control group. An incremental cost-effectiveness ratio of $15,961 per reoperation avoided was calculated for the CAS group. Based on a reoperation cost of $12,618, this new technology becomes cost saving for centers performing more than 254 instrumented spinal procedures per year. Computer-assisted spinal surgery has the potential to reduce reoperation rates and thus to have serious cost-effectiveness and policy implications. High acquisition and maintenance costs of this technology can be offset by equally high reoperation costs. 
Our cost-effectiveness analysis showed that for high-volume centers with a similar case complexity to the studied population, this technology is economically justified. Copyright © 2015 Elsevier Inc. All rights reserved.
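    The capital-cost treatment described above (equivalent annual cost by annuitization, then cost per reoperation avoided and a break-even caseload) can be sketched as follows. The discount rate, amortization period, reoperation cost, and misplacement-rate difference are taken from the abstract, while the acquisition and maintenance figures are placeholders, not the study's prices.

```python
# Hedged sketch of the evaluation's arithmetic; capital and maintenance costs
# are placeholders, the 3% / 7-year / $12,618 / 6%-vs-0.8% figures are quoted
# in the abstract.
def equivalent_annual_cost(capital, rate=0.03, years=7):
    """Annuitize a capital expenditure: EAC = capital * r / (1 - (1 + r)**-n)."""
    return capital * rate / (1.0 - (1.0 + rate) ** -years)

capital_cost = 1_000_000      # placeholder acquisition cost (imaging + navigation)
annual_maint = 50_000         # placeholder maintenance contract
eac = equivalent_annual_cost(capital_cost) + annual_maint

# Incremental cost per reoperation avoided, given an assumed effectiveness gain
reops_avoided_per_year = 10
icer = eac / reops_avoided_per_year

# Break-even caseload: reoperations prevented (6% - 0.8% rate difference) must
# offset the technology's annual cost at $12,618 per reoperation.
reop_cost = 12_618
rate_difference = 0.06 - 0.008
breakeven_cases = eac / (rate_difference * reop_cost)
print(f"EAC {eac:,.0f}, ICER {icer:,.0f} per reoperation avoided, "
      f"break-even at {breakeven_cases:.0f} cases/year")
```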

  15. Cost-Effectiveness and Cost-Utility of Internet-Based Computer Tailoring for Smoking Cessation

    PubMed Central

    Evers, Silvia MAA; de Vries, Hein; Hoving, Ciska

    2013-01-01

    Background Although effective smoking cessation interventions exist, information is limited about their cost-effectiveness and cost-utility. Objective To assess the cost-effectiveness and cost-utility of an Internet-based multiple computer-tailored smoking cessation program and tailored counseling by practice nurses working in Dutch general practices compared with an Internet-based multiple computer-tailored program only and care as usual. Methods The economic evaluation was embedded in a randomized controlled trial, for which 91 practice nurses recruited 414 eligible smokers. Smokers were randomized to receive multiple tailoring and counseling (n=163), multiple tailoring only (n=132), or usual care (n=119). Self-reported cost and quality of life were assessed during a 12-month follow-up period. Prolonged abstinence and 24-hour and 7-day point prevalence abstinence were assessed at 12-month follow-up. The trial-based economic evaluation was conducted from a societal perspective. Uncertainty was accounted for by bootstrapping (1000 times) and sensitivity analyses. Results No significant differences were found between the intervention arms with regard to baseline characteristics or effects on abstinence, quality of life, and addiction level. However, participants in the multiple tailoring and counseling group reported significantly more annual health care–related costs than participants in the usual care group. Cost-effectiveness analysis, using prolonged abstinence as the outcome measure, showed that the mere multiple computer-tailored program had the highest probability of being cost-effective. Compared with usual care, in this group €5100 had to be paid for each additional abstinent participant. With regard to cost-utility analyses, using quality of life as the outcome measure, usual care was probably most efficient. Conclusions To our knowledge, this was the first study to determine the cost-effectiveness and cost-utility of an Internet-based smoking cessation program with and without counseling by a practice nurse. Although the Internet-based multiple computer-tailored program seemed to be the most cost-effective treatment, the cost-utility was probably highest for care as usual. However, to ease the interpretation of cost-effectiveness results, future research should aim at identifying an acceptable cutoff point for the willingness to pay per abstinent participant. PMID:23491820
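    A minimal sketch of the bootstrap step mentioned above (1000 replicates of the incremental cost per abstinent participant), using synthetic costs and abstinence indicators rather than the trial data:

```python
# Hedged sketch of bootstrapping an incremental cost-effectiveness ratio;
# the per-participant costs and abstinence rates below are synthetic.
import numpy as np

rng = np.random.default_rng(1)
cost_int, abst_int = rng.gamma(3, 400, 150), rng.random(150) < 0.25  # intervention arm
cost_uc,  abst_uc  = rng.gamma(3, 250, 120), rng.random(120) < 0.10  # usual care arm

def icer(ci, ai, cu, au):
    """Extra cost per additional abstinent participant."""
    return (ci.mean() - cu.mean()) / (ai.mean() - au.mean())

boot = []
for _ in range(1000):                       # 1000 replicates, as in the study
    i = rng.integers(0, len(cost_int), len(cost_int))
    u = rng.integers(0, len(cost_uc),  len(cost_uc))
    boot.append(icer(cost_int[i], abst_int[i], cost_uc[u], abst_uc[u]))

print("point estimate:", round(icer(cost_int, abst_int, cost_uc, abst_uc)))
print("bootstrap 2.5-97.5 percentiles:", np.percentile(boot, [2.5, 97.5]).round())
```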

  16. Stratified cost-utility analysis of C-Leg versus mechanical knees: Findings from an Italian sample of transfemoral amputees.

    PubMed

    Cutti, Andrea Giovanni; Lettieri, Emanuele; Del Maestro, Martina; Radaelli, Giovanni; Luchetti, Martina; Verni, Gennero; Masella, Cristina

    2017-06-01

    The fitting rate of the C-Leg electronic knee (Otto-Bock, D) has increased steadily over the last 15 years. Current cost-utility studies, however, have not considered the patients' characteristics. To complete a cost-utility analysis involving C-Leg and mechanical knee users; "age at the time of enrollment," "age at the time of first prosthesis," and "experience with the current type of prosthesis" are assumed as non-nested stratification parameters. Retrospective cohort. In all, 70 C-Leg and 57 mechanical knee users were selected. For each stratification criterion, we evaluated the cost-utility of C-Leg versus mechanical knees by computing the incremental cost-utility ratio, that is, the ratio of the "difference in cost" and the "difference in utility" of the two technologies. Cost consisted of acquisition, maintenance, transportation, and lodging expenses. Utility was measured in terms of quality-adjusted life years, computed on the basis of participants' answers to the EQ-5D questionnaire. Patients over 40 years at the time of first prosthesis were the only group featuring an incremental cost-utility ratio (88,779 €/quality-adjusted life year) above the National Institute for Health and Care Excellence practical cost-utility threshold (54,120 €/quality-adjusted life year): C-Leg users experience a significant improvement of "mobility," but limited outcomes on "usual activities," "self-care," "depression/anxiety," and reduction of "pain/discomfort." The stratified cost-utility results have relevant clinical implications and provide useful information for practitioners in tailoring interventions. Clinical relevance A cost-utility analysis that considered patient characteristics provided insights on the "affordability" of C-Leg compared to mechanical knees. In particular, results suggest that C-Leg has a significant impact on "mobility" for first-time prosthetic users over 40 years, but implementation of specific low-cost physical/psychosocial interventions is required to return within cost-utility thresholds.

  17. Efficient experimental design for uncertainty reduction in gene regulatory networks.

    PubMed

    Dehghannasiri, Roozbeh; Yoon, Byung-Jun; Dougherty, Edward R

    2015-01-01

    An accurate understanding of interactions among genes plays a major role in developing therapeutic intervention methods. Gene regulatory networks often contain a significant amount of uncertainty. The process of prioritizing biological experiments to reduce the uncertainty of gene regulatory networks is called experimental design. Under such a strategy, the experiments with high priority are suggested to be conducted first. The authors have already proposed an optimal experimental design method based upon the objective for modeling gene regulatory networks, such as deriving therapeutic interventions. The experimental design method utilizes the concept of mean objective cost of uncertainty (MOCU). MOCU quantifies the expected increase of cost resulting from uncertainty. The optimal experiment to be conducted first is the one which leads to the minimum expected remaining MOCU subsequent to the experiment. In the process, one must find the optimal intervention for every gene regulatory network compatible with the prior knowledge, which can be prohibitively expensive when the size of the network is large. In this paper, we propose a computationally efficient experimental design method. This method incorporates a network reduction scheme by introducing a novel cost function that takes into account the disruption in the ranking of potential experiments. We then estimate the approximate expected remaining MOCU at a lower computational cost using the reduced networks. Simulation results based on synthetic and real gene regulatory networks show that the proposed approximate method has close performance to that of the optimal method but at lower computational cost. The proposed approximate method also outperforms the random selection policy significantly. A MATLAB software implementing the proposed experimental design method is available at http://gsp.tamu.edu/Publications/supplementary/roozbeh15a/.

  18. Efficient experimental design for uncertainty reduction in gene regulatory networks

    PubMed Central

    2015-01-01

    Background An accurate understanding of interactions among genes plays a major role in developing therapeutic intervention methods. Gene regulatory networks often contain a significant amount of uncertainty. The process of prioritizing biological experiments to reduce the uncertainty of gene regulatory networks is called experimental design. Under such a strategy, the experiments with high priority are suggested to be conducted first. Results The authors have already proposed an optimal experimental design method based upon the objective for modeling gene regulatory networks, such as deriving therapeutic interventions. The experimental design method utilizes the concept of mean objective cost of uncertainty (MOCU). MOCU quantifies the expected increase of cost resulting from uncertainty. The optimal experiment to be conducted first is the one which leads to the minimum expected remaining MOCU subsequent to the experiment. In the process, one must find the optimal intervention for every gene regulatory network compatible with the prior knowledge, which can be prohibitively expensive when the size of the network is large. In this paper, we propose a computationally efficient experimental design method. This method incorporates a network reduction scheme by introducing a novel cost function that takes into account the disruption in the ranking of potential experiments. We then estimate the approximate expected remaining MOCU at a lower computational cost using the reduced networks. Conclusions Simulation results based on synthetic and real gene regulatory networks show that the proposed approximate method has close performance to that of the optimal method but at lower computational cost. The proposed approximate method also outperforms the random selection policy significantly. A MATLAB software implementing the proposed experimental design method is available at http://gsp.tamu.edu/Publications/supplementary/roozbeh15a/. PMID:26423515
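    The MOCU concept described above can be illustrated with a toy discrete example: an unknown model drawn from a small uncertainty class, interventions with model-dependent costs, and candidate experiments scored by the expected remaining MOCU. This is only a conceptual sketch, not the authors' gene-regulatory-network implementation or their network-reduction scheme.

```python
# Toy discrete illustration of MOCU-based experimental design (conceptual
# sketch only; model names, priors and costs are invented for illustration).
import numpy as np

models = ["net0", "net1", "net2"]
prior  = np.array([0.5, 0.3, 0.2])
# cost[i, j]: cost of applying intervention j when the true model is i
cost = np.array([[1.0, 3.0, 4.0],
                 [4.0, 1.0, 3.0],
                 [3.0, 4.0, 1.0]])

def mocu(p):
    """Expected extra cost of the best-on-average (robust) intervention
    relative to the intervention that is optimal for the true model."""
    robust = np.argmin(p @ cost)                 # best intervention on average
    return p @ cost[:, robust] - p @ cost.min(axis=1)

def expected_remaining_mocu(k, p):
    """Experiment k reveals whether the true model is models[k]."""
    p_yes = p[k]
    post_yes = np.zeros_like(p); post_yes[k] = 1.0
    post_no = p.copy(); post_no[k] = 0.0; post_no /= post_no.sum()
    return p_yes * mocu(post_yes) + (1.0 - p_yes) * mocu(post_no)

scores = [expected_remaining_mocu(k, prior) for k in range(len(models))]
print("MOCU before any experiment:", round(mocu(prior), 3))
print("expected remaining MOCU per experiment:", np.round(scores, 3))
print("run the experiment on", models[int(np.argmin(scores))], "first")
```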

  19. Recent advances in QM/MM free energy calculations using reference potentials☆

    PubMed Central

    Duarte, Fernanda; Amrein, Beat A.; Blaha-Nelson, David; Kamerlin, Shina C.L.

    2015-01-01

    Background Recent years have seen enormous progress in the development of methods for modeling (bio)molecular systems. This has allowed for the simulation of ever larger and more complex systems. However, as such complexity increases, the requirements needed for these models to be accurate and physically meaningful become more and more difficult to fulfill. The use of simplified models to describe complex biological systems has long been shown to be an effective way to overcome some of the limitations associated with this computational cost in a rational way. Scope of review Hybrid QM/MM approaches have rapidly become one of the most popular computational tools for studying chemical reactivity in biomolecular systems. However, the high cost involved in performing high-level QM calculations has limited the applicability of these approaches when calculating free energies of chemical processes. In this review, we present some of the advances in using reference potentials and mean field approximations to accelerate high-level QM/MM calculations. We present illustrative applications of these approaches and discuss challenges and future perspectives for the field. Major conclusions The use of physically-based simplifications has shown to effectively reduce the cost of high-level QM/MM calculations. In particular, lower-level reference potentials enable one to reduce the cost of expensive free energy calculations, thus expanding the scope of problems that can be addressed. General significance As was already demonstrated 40 years ago, the usage of simplified models still allows one to obtain cutting edge results with substantially reduced computational cost. This article is part of a Special Issue entitled Recent developments of molecular dynamics. PMID:25038480
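    The reference-potential strategy discussed above is often written as a Zwanzig-style free energy perturbation from the cheap potential to the expensive one. A numerical sketch, with synthetic energy gaps standing in for actual QM/MM evaluations:

```python
# Hedged numerical sketch of the reference-potential correction via Zwanzig's
# free energy perturbation: dA = -kT * ln < exp(-(E_high - E_low)/kT) >_low.
# The energy gaps below are synthetic stand-ins for QM/MM evaluations.
import numpy as np

rng = np.random.default_rng(0)
kT = 0.593  # kcal/mol at ~298 K

# energy gaps E_high - E_low on configurations sampled with the cheap potential
dE = rng.normal(loc=1.5, scale=0.8, size=5000)

dA_correction = -kT * np.log(np.mean(np.exp(-dE / kT)))

A_low = -10.0            # free energy from the cheap reference potential (assumed)
A_high_estimate = A_low + dA_correction
print(f"low-level A = {A_low:.2f}, correction = {dA_correction:.2f}, "
      f"high-level estimate = {A_high_estimate:.2f} kcal/mol")
```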

  20. Assessing Cost-effectiveness of Green Infrastructures in response to Large Storm Events at Household Scale

    NASA Astrophysics Data System (ADS)

    Chui, T. F. M.; Liu, X.; Zhan, W.

    2015-12-01

    Green infrastructures (GI) are becoming more important for urban stormwater control worldwide. However, relatively few studies focus on researching the specific designs of GI at household scale. This study assesses the hydrological performance and cost-effectiveness of different GI designs, namely green roofs, bioretention systems and porous pavements. It aims to generate generic insights by comparing the optimal designs of each GI in 2-year and 50-year storms of Hong Kong, China and Seattle, US. EPA SWMM is first used to simulate the hydrologic performance, in particular, the peak runoff reduction of thousands of GI designs. Then, life cycle costs of the designs are computed and their effectiveness, in terms of peak runoff reduction percentage per thousand dollars, is compared. The peak runoff reduction increases almost linearly with costs for green roofs. However, for bioretention systems and porous pavements, peak runoff reduction only increases significantly with costs in the mid values. For achieving the same peak runoff reduction percentage, the optimal soil depth of green roofs increases with the design storm, while surface area does not change significantly. On the other hand, for bioretention systems and porous pavements, the optimal surface area increases with the design storm, while thickness does not change significantly. In general, the cost effectiveness of porous pavements is highest, followed by bioretention systems and then green roofs. The cost effectiveness is higher for a smaller storm, and is thus higher for 2-year storm than 50-year storm, and is also higher for Seattle when compared to Hong Kong. This study allows us to better understand the hydrological performance and cost-effectiveness of different GI designs. It facilitates the implementation of optimal choice and design of each specific GI for stormwater mitigation.
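    A minimal sketch of the cost-effectiveness metric used above (peak-runoff reduction percentage per thousand dollars of life-cycle cost); the capital, maintenance, and performance numbers are illustrative placeholders, not SWMM results for Hong Kong or Seattle.

```python
# Hedged sketch: life-cycle cost as capital plus present value of maintenance,
# and effectiveness as peak-runoff reduction (%) per $1000. All numbers are
# illustrative placeholders.
def life_cycle_cost(capital, annual_maint, years=30, discount=0.03):
    pv_maint = sum(annual_maint / (1 + discount) ** t for t in range(1, years + 1))
    return capital + pv_maint

designs = {                  # name: (capital $, annual maintenance $, peak reduction %)
    "green_roof":   (60000, 1200, 35.0),
    "bioretention": (25000,  800, 35.0),
    "porous_pave":  (18000,  600, 35.0),
}

for name, (cap, maint, reduction) in designs.items():
    lcc = life_cycle_cost(cap, maint)
    effectiveness = reduction / (lcc / 1000.0)   # % reduction per $1000
    print(f"{name:14s} LCC ${lcc:9,.0f}  {effectiveness:.2f} %/k$")
```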

  1. Reliability and cost analysis methods

    NASA Technical Reports Server (NTRS)

    Suich, Ronald C.

    1991-01-01

    In the design phase of a system, how does a design engineer or manager choose between a subsystem with .990 reliability and a more costly subsystem with .995 reliability? When is the increased cost justified? High reliability is not necessarily an end in itself but may be desirable in order to reduce the expected cost due to subsystem failure. However, this may not be the wisest use of funds since the expected cost due to subsystem failure is not the only cost involved. The subsystem itself may be very costly. We should not consider either the cost of the subsystem or the expected cost due to subsystem failure separately but should minimize the total of the two costs, i.e., the total of the cost of the subsystem plus the expected cost due to subsystem failure. This final report discusses the Combined Analysis of Reliability, Redundancy, and Cost (CARRAC) methods which were developed under Grant Number NAG 3-1100 from the NASA Lewis Research Center. CARRAC methods and a CARRAC computer program employ five models which can be used to cover a wide range of problems. The models contain an option which can include repair of failed modules.
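    The trade-off posed in the opening question can be written directly as a comparison of total expected costs. In the sketch below, only the 0.990 and 0.995 reliabilities come from the text; the subsystem prices and failure cost are assumed for illustration and are not CARRAC outputs.

```python
# Direct sketch of the trade-off above: minimize subsystem cost plus expected
# cost due to subsystem failure. Dollar figures are assumptions; only the
# .990 / .995 reliabilities come from the text.
def total_expected_cost(subsystem_cost, reliability, failure_cost):
    return subsystem_cost + (1.0 - reliability) * failure_cost

failure_cost = 2_000_000        # assumed cost incurred if the subsystem fails
options = {
    "baseline (R=0.990)": (100_000, 0.990),
    "upgraded (R=0.995)": (140_000, 0.995),
}
for name, (cost, rel) in options.items():
    print(name, "->", total_expected_cost(cost, rel, failure_cost))
# The upgrade is justified only when the 0.005 reliability gain saves more in
# expected failure cost than the extra subsystem cost (here 0.005 * 2e6 =
# 10,000 < 40,000, so the cheaper subsystem wins under these assumed numbers).
```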

  2. Self-organizing maps for learning the edit costs in graph matching.

    PubMed

    Neuhaus, Michel; Bunke, Horst

    2005-06-01

    Although graph matching and graph edit distance computation have become areas of intensive research recently, the automatic inference of the cost of edit operations has remained an open problem. In the present paper, we address the issue of learning graph edit distance cost functions for numerically labeled graphs from a corpus of sample graphs. We propose a system of self-organizing maps (SOMs) that represent the distance measuring spaces of node and edge labels. Our learning process is based on the concept of self-organization. It adapts the edit costs in such a way that the similarity of graphs from the same class is increased, whereas the similarity of graphs from different classes decreases. The learning procedure is demonstrated on two different applications involving line drawing graphs and graphs representing diatoms, respectively.

  3. The Next Generation of Lab and Classroom Computing - The Silver Lining

    DTIC Science & Technology

    2016-12-01

    A virtual desktop infrastructure (VDI) solution, as well as the computing solutions at three universities, was selected as the basis for comparison. Keywords: virtual desktop infrastructure, VDI, hardware cost, software cost, manpower, availability, cloud computing, private cloud, bring your own device, BYOD, thin client.

  4. Visual Information (6)

    DTIC Science & Technology

    1987-12-01

    definition 33., below). 7. Commercial VI Production. A completed VI production, purchased off-the-shelf; i.e., from the stocks of a vendor. 8. Computer-Generated Graphics. The production of graphics through an electronic medium based on a computer or computer techniques. 9. Contract VI Production. A VI...displays, presentations, and exhibits prepared manually, by machine, or by computer. 16. Indirect Costs. An item of cost (or the aggregate thereof) that is

  5. A computer vision for animal ecology.

    PubMed

    Weinstein, Ben G

    2018-05-01

    A central goal of animal ecology is to observe species in the natural world. The cost and challenge of data collection often limit the breadth and scope of ecological study. Ecologists often use image capture to bolster data collection in time and space. However, the ability to process these images remains a bottleneck. Computer vision can greatly increase the efficiency, repeatability and accuracy of image review. Computer vision uses image features, such as colour, shape and texture to infer image content. I provide a brief primer on ecological computer vision to outline its goals, tools and applications to animal ecology. I reviewed 187 existing applications of computer vision and divided articles into ecological description, counting and identity tasks. I discuss recommendations for enhancing the collaboration between ecologists and computer scientists and highlight areas for future growth of automated image analysis. © 2017 The Author. Journal of Animal Ecology © 2017 British Ecological Society.

  6. Low-cost data analysis systems for processing multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Whitely, S. L.

    1976-01-01

    The basic hardware and software requirements are described for four low cost analysis systems for computer generated land use maps. The data analysis systems consist of an image display system, a small digital computer, and an output recording device. Software is described together with some of the display and recording devices, and typical costs are cited. Computer requirements are given, and two approaches are described for converting black-white film and electrostatic printer output to inexpensive color output products. Examples of output products are shown.

  7. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    PubMed

    Yazar, Seyhan; Gooden, George E C; Mackey, David A; Hewitt, Alex W

    2014-01-01

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.

  8. Benchmarking Undedicated Cloud Computing Providers for Analysis of Genomic Datasets

    PubMed Central

    Yazar, Seyhan; Gooden, George E. C.; Mackey, David A.; Hewitt, Alex W.

    2014-01-01

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5–78.2) for E.coli and 53.5% (95% CI: 34.4–72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5–303.1) and 173.9% (95% CI: 134.6–213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE. PMID:25247298

  9. Standardized ultrasound templates for diagnosing appendicitis reduce annual imaging costs.

    PubMed

    Nordin, Andrew B; Sales, Stephen; Nielsen, Jason W; Adler, Brent; Bates, David Gregory; Kenney, Brian

    2018-01-01

    Ultrasound is preferred over computed tomography (CT) for diagnosing appendicitis in children to avoid undue radiation exposure. We previously reported our experience in instituting a standardized appendicitis ultrasound template, which decreased CT rates by 67.3%. In this analysis, we demonstrate the ongoing cost savings associated with using this template. Retrospective chart review for the time period preceding template implementation (June 2012-September 2012) was combined with prospective review through December 2015 for all patients in the emergency department receiving diagnostic imaging for appendicitis. The type of imaging was recorded, and imaging rates and ultrasound test statistics were calculated. Estimated annual imaging costs based on pretemplate ultrasound and CT utilization rates were compared with post-template annual costs to calculate annual and cumulative savings. In the pretemplate period, ultrasound and CT rates were 80.2% and 44.3%, respectively, resulting in a combined annual cost of $300,527.70. Similar calculations were performed for each succeeding year, accounting for changes in patient volume. Using pretemplate rates, our projected 2015 imaging cost was $371,402.86; however, our ultrasound rate had increased to 98.3%, whereas the CT rate declined to 9.6%, yielding an annual estimated cost of $224,853.00 and a savings of $146,549.86. Since implementation, annual savings have steadily increased for a cumulative cost savings of $336,683.83. Standardizing ultrasound reports for appendicitis not only reduces the use of CT scans and the associated radiation exposure but also decreases annual imaging costs despite increased numbers of imaging studies. Continued cost reduction may be possible by using diagnostic algorithms. Copyright © 2017 Elsevier Inc. All rights reserved.
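    The savings calculation described above amounts to projecting imaging costs at pre-template utilization rates for the current volume and subtracting actual post-template costs. A sketch using the utilization rates quoted in the abstract with placeholder unit costs and patient volume:

```python
# Hedged sketch of the savings arithmetic; the 80.2%/44.3% and 98.3%/9.6%
# utilization rates are quoted above, while unit costs and volume are
# placeholders, not the hospital's figures.
def annual_imaging_cost(n_patients, us_rate, ct_rate, us_cost, ct_cost):
    return n_patients * (us_rate * us_cost + ct_rate * ct_cost)

n_patients, us_cost, ct_cost = 1500, 150.0, 500.0   # assumed volume / unit costs

projected = annual_imaging_cost(n_patients, 0.802, 0.443, us_cost, ct_cost)  # pre-template rates
actual    = annual_imaging_cost(n_patients, 0.983, 0.096, us_cost, ct_cost)  # post-template rates
print(f"projected ${projected:,.0f}, actual ${actual:,.0f}, "
      f"savings ${projected - actual:,.0f}")
```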

  10. A Low Cost Micro-Computer Based Local Area Network for Medical Office and Medical Center Automation

    PubMed Central

    Epstein, Mel H.; Epstein, Lynn H.; Emerson, Ron G.

    1984-01-01

    A Low Cost Micro-computer based Local Area Network for medical office automation is described that makes use of an array of multiple, different personal computers interconnected by a local area network. Each computer on the network functions as a fully potent workstation for data entry and report generation. The network allows each workstation complete access to the entire database. Additionally, designated computers may serve as access ports for remote terminals. Through “Gateways” the network may serve as a front end for a large mainframe, or may interface with another network. The system provides for the medical office environment the expandability and flexibility of a multi-terminal mainframe system at a far lower cost without sacrifice of performance.

  11. Influence of computational fluid dynamics on experimental aerospace facilities: A fifteen year projection

    NASA Technical Reports Server (NTRS)

    1983-01-01

    An assessment was made of the impact of developments in computational fluid dynamics (CFD) on the traditional role of aerospace ground test facilities over the next fifteen years. With improvements in CFD and more powerful scientific computers projected over this period, CFD is expected to provide the capability to compute the flow over a complete aircraft at a unit cost three orders of magnitude lower than presently possible. Over the same period improvements in ground test facilities will progress by application of computational techniques including CFD to data acquisition, facility operational efficiency, and simulation of the flight envelope; however, no dramatic change in unit cost is expected as greater efficiency will be countered by higher energy and labor costs.

  12. Minnesota Computer Aided Library System (MCALS); University of Minnesota Subsystem Cost/Benefits Analysis.

    ERIC Educational Resources Information Center

    Lourey, Eugene D., Comp.

    The Minnesota Computer Aided Library System (MCALS) provides a basis of unification for library service program development in Minnesota for eventual linkage to the national information network. A prototype plan for communications functions is illustrated. A cost/benefits analysis was made to show the cost/effectiveness potential for MCALS. System…

  13. Data Bases at a State Institution--Costs, Uses and Needs. AIR Forum Paper 1978.

    ERIC Educational Resources Information Center

    McLaughlin, Gerald W.

    The cost-benefit of administrative data at a state college is placed in perspective relative to the institutional involvement in computer use. The costs of computer operations, personnel, and peripheral equipment expenses related to instruction are analyzed. Data bases and systems support institutional activities, such as registration, and aid…

  14. Computer assisted yarding cost analysis.

    Treesearch

    Ronald W. Mifflin

    1980-01-01

    Programs for a programable calculator and a desk-top computer are provided for quickly determining yarding cost and comparing the economics of alternative yarding systems. The programs emphasize the importance of the relationship between production rate and machine rate, which is the hourly cost of owning and operating yarding equipment. In addition to generating the...

  15. 7 CFR 993.159 - Payments for services performed with respect to reserve tonnage prunes.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... overhead costs, which include those for supervision, indirect labor, fuel, power and water, taxes and... tonnage prunes. The Committee will compute the average industry cost for holding reserve pool prunes by... choose to exclude the high and low data in computing an industry average. The industry average costs may...

  16. 7 CFR 993.159 - Payments for services performed with respect to reserve tonnage prunes.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... overhead costs, which include those for supervision, indirect labor, fuel, power and water, taxes and... tonnage prunes. The Committee will compute the average industry cost for holding reserve pool prunes by... choose to exclude the high and low data in computing an industry average. The industry average costs may...

  17. 7 CFR 993.159 - Payments for services performed with respect to reserve tonnage prunes.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... overhead costs, which include those for supervision, indirect labor, fuel, power and water, taxes and... tonnage prunes. The Committee will compute the average industry cost for holding reserve pool prunes by... choose to exclude the high and low data in computing an industry average. The industry average costs may...

  18. Overview of the LINCS architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fletcher, J.G.; Watson, R.W.

    1982-01-13

    Computing at the Lawrence Livermore National Laboratory (LLNL) has evolved over the past 15 years toward a computer network based resource sharing environment. The increasing use of low cost and high performance micro, mini and midi computers and commercially available local networking systems will accelerate this trend. Further, even the large scale computer systems, on which much of the LLNL scientific computing depends, are evolving into multiprocessor systems. It is our belief that the most cost effective use of this environment will depend on the development of application systems structured into cooperating concurrent program modules (processes) distributed appropriately over different nodes of the environment. A node is defined as one or more processors with a local (shared) high speed memory. Given the latter view, the environment can be characterized as consisting of: multiple nodes communicating over noisy channels with arbitrary delays and throughput, heterogeneous base resources and information encodings, no single administration controlling all resources, distributed system state, and no uniform time base. The system design problem is how to turn the heterogeneous base hardware/firmware/software resources of this environment into a coherent set of resources that facilitate development of cost effective, reliable, and human engineered applications. We believe the answer lies in developing a layered, communication oriented distributed system architecture; layered and modular to support ease of understanding, reconfiguration, extensibility, and hiding of implementation or nonessential local details; communication oriented because that is a central feature of the environment. The Livermore Interactive Network Communication System (LINCS) is a hierarchical architecture designed to meet the above needs. While having characteristics in common with other architectures, it differs in several respects.

  19. Non-intrusive uncertainty quantification of computational fluid dynamics simulations: notes on the accuracy and efficiency

    NASA Astrophysics Data System (ADS)

    Zimoń, Małgorzata; Sawko, Robert; Emerson, David; Thompson, Christopher

    2017-11-01

    Uncertainty quantification (UQ) is increasingly becoming an indispensable tool for assessing the reliability of computational modelling. Efficient handling of stochastic inputs, such as boundary conditions, physical properties or geometry, increases the utility of model results significantly. We discuss the application of non-intrusive generalised polynomial chaos techniques in the context of fluid engineering simulations. Deterministic and Monte Carlo integration rules are applied to a set of problems, including ordinary differential equations and the computation of aerodynamic parameters subject to random perturbations. In particular, we analyse acoustic wave propagation in a heterogeneous medium to study the effects of mesh resolution, transients, number and variability of stochastic inputs. We consider variants of multi-level Monte Carlo and perform a novel comparison of the methods with respect to numerical and parametric errors, as well as computational cost. The results provide a comprehensive view of the necessary steps in UQ analysis and demonstrate some key features of stochastic fluid flow systems.
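
    The non-intrusive character of these methods means the underlying solver is only ever evaluated at sampled or quadrature input points. A minimal sketch under stated assumptions (a hypothetical one-dimensional scalar model f standing in for a CFD quantity of interest, not the paper's solver) contrasts Monte Carlo sampling with a deterministic Gauss-Hermite rule:

    ```python
    # Minimal sketch (not the authors' code): contrasting Monte Carlo sampling
    # with a deterministic Gauss-Hermite rule for propagating one Gaussian
    # uncertain input through a hypothetical model f.
    import numpy as np
    from numpy.polynomial.hermite_e import hermegauss

    def f(x):
        # Hypothetical scalar model response to one uncertain input x ~ N(0, 1).
        return np.sin(x) + 0.1 * x**2

    # Monte Carlo integration: slow O(N^-1/2) convergence, trivially non-intrusive.
    rng = np.random.default_rng(0)
    samples = f(rng.standard_normal(100_000))
    mc_mean, mc_var = samples.mean(), samples.var()

    # Deterministic quadrature (probabilists' Hermite nodes and weights): very few
    # model evaluations when the input dimension is small and f is smooth.
    nodes, weights = hermegauss(10)          # weights sum to sqrt(2*pi)
    weights = weights / np.sqrt(2 * np.pi)   # normalise against the N(0,1) density
    q_mean = np.sum(weights * f(nodes))
    q_var = np.sum(weights * (f(nodes) - q_mean) ** 2)

    print(f"Monte Carlo : mean={mc_mean:.4f}, var={mc_var:.4f}")
    print(f"Quadrature  : mean={q_mean:.4f}, var={q_var:.4f}")
    ```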

  20. Solution of 3-dimensional time-dependent viscous flows. Part 3: Application to turbulent and unsteady flows

    NASA Technical Reports Server (NTRS)

    Weinberg, B. C.; Mcdonald, H.

    1982-01-01

    A numerical scheme is developed for solving the time dependent, three dimensional compressible viscous flow equations to be used as an aid in the design of helicopter rotors. In order to further investigate the numerical procedure, the computer code developed to solve an approximate form of the three dimensional unsteady Navier-Stokes equations employing a linearized block implicit technique in conjunction with a QR operator scheme is tested. Results of calculations are presented for several two dimensional boundary layer flows including steady turbulent and unsteady laminar cases. A comparison of fourth order and second order solutions indicates that increased accuracy can be obtained without any significant increases in cost (run time). The results of the computations also indicate that the computer code can be applied to more complex flows such as those encountered on rotating airfoils. The geometry of a symmetric NACA four digit airfoil is considered and the appropriate geometrical properties are computed.

  1. A descriptive feast but an evaluative famine: systematic review of published articles on primary care computing during 1980-97.

    PubMed

    Mitchell, E; Sullivan, F

    2001-02-03

    To appraise findings from studies examining the impact of computers on primary care consultations. Systematic review of world literature from 1980 to 1997. 5475 references were identified from electronic databases (Medline, Science Citation Index, Social Sciences Citation Index, Index of Scientific and Technical Proceedings, Embase, OCLC FirstSearch Proceedings), bibliographies, books, identified articles, and by authors active in the field. 1892 eligible abstracts were independently rated, and 89 studies met the inclusion criteria. Effect on doctors' performance and patient outcomes; attitudes towards computerisation. 61 studies examined effects of computers on practitioners' performance, 17 evaluated their impact on patient outcome, and 20 studied practitioners' or patients' attitudes. Computer use during consultations lengthened the consultation. Reminder systems for preventive tasks and disease management improved process rates, although some returned to pre-intervention levels when reminders were stopped. Use of computers for issuing prescriptions increased prescribing of generic drugs, and use of computers for test ordering led to cost savings and fewer unnecessary tests. There were no negative effects on those patient outcomes evaluated. Doctors and patients were generally positive about use of computers, but issues of concern included their impact on privacy, the doctor-patient relationship, cost, time, and training needs. Primary care computing systems can improve practitioner performance, particularly for health promotion interventions. This may be at the expense of patient initiated activities, making many practitioners suspicious of the negative impact on relationships with patients. There remains a dearth of evidence evaluating effects on patient outcomes.

  2. A descriptive feast but an evaluative famine: systematic review of published articles on primary care computing during 1980-97

    PubMed Central

    Mitchell, Elizabeth; Sullivan, Frank

    2001-01-01

    Objectives To appraise findings from studies examining the impact of computers on primary care consultations. Design Systematic review of world literature from 1980 to 1997. Data sources 5475 references were identified from electronic databases (Medline, Science Citation Index, Social Sciences Citation Index, Index of Scientific and Technical Proceedings, Embase, OCLC FirstSearch Proceedings), bibliographies, books, identified articles, and by authors active in the field. 1892 eligible abstracts were independently rated, and 89 studies met the inclusion criteria. Main outcome measures Effect on doctors' performance and patient outcomes; attitudes towards computerisation. Results 61 studies examined effects of computers on practitioners' performance, 17 evaluated their impact on patient outcome, and 20 studied practitioners' or patients' attitudes. Computer use during consultations lengthened the consultation. Reminder systems for preventive tasks and disease management improved process rates, although some returned to pre-intervention levels when reminders were stopped. Use of computers for issuing prescriptions increased prescribing of generic drugs, and use of computers for test ordering led to cost savings and fewer unnecessary tests. There were no negative effects on those patient outcomes evaluated. Doctors and patients were generally positive about use of computers, but issues of concern included their impact on privacy, the doctor-patient relationship, cost, time, and training needs. Conclusions Primary care computing systems can improve practitioner performance, particularly for health promotion interventions. This may be at the expense of patient initiated activities, making many practitioners suspicious of the negative impact on relationships with patients. There remains a dearth of evidence evaluating effects on patient outcomes. PMID:11157532

  3. Localized overlap algorithm for unexpanded dispersion energies

    NASA Astrophysics Data System (ADS)

    Rob, Fazle; Misquitta, Alston J.; Podeszwa, Rafał; Szalewicz, Krzysztof

    2014-03-01

    A first-principles-based, linearly scaling algorithm has been developed for calculations of dispersion energies from frequency-dependent density susceptibility (FDDS) functions with account of charge-overlap effects. The transition densities in FDDSs are fitted by a set of auxiliary atom-centered functions. The terms in the dispersion energy expression involving products of such functions are computed either from the unexpanded (exact) formula or from inexpensive asymptotic expansions, depending on the location of these functions relative to the dimer configuration. This approach leads to significant savings of computational resources. In particular, for a dimer consisting of two elongated monomers with 81 atoms each in a head-to-head configuration, the most favorable case for our algorithm, a 43-fold speedup has been achieved while the approximate dispersion energy differs by less than 1% from that computed using the standard unexpanded approach. In contrast, the dispersion energy computed from the distributed asymptotic expansion differs by dozens of percent in the van der Waals minimum region. A further increase of the size of each monomer would result in only a small increase in cost since all the additional terms would be computed from the asymptotic expansion.

  4. Computational methods in drug discovery

    PubMed Central

    Leelananda, Sumudu P

    2016-01-01

    The process for drug discovery and development is challenging, time consuming and expensive. Computer-aided drug discovery (CADD) tools can act as a virtual shortcut, assisting in the expedition of this long process and potentially reducing the cost of research and development. Today CADD has become an effective and indispensable tool in therapeutic development. The human genome project has made available a substantial amount of sequence data that can be used in various drug discovery projects. Additionally, increasing knowledge of biological structures, as well as increasing computer power have made it possible to use computational methods effectively in various phases of the drug discovery and development pipeline. The importance of in silico tools is greater than ever before and has advanced pharmaceutical research. Here we present an overview of computational methods used in different facets of drug discovery and highlight some of the recent successes. In this review, both structure-based and ligand-based drug discovery methods are discussed. Advances in virtual high-throughput screening, protein structure prediction methods, protein–ligand docking, pharmacophore modeling and QSAR techniques are reviewed. PMID:28144341

  5. Computational methods in drug discovery.

    PubMed

    Leelananda, Sumudu P; Lindert, Steffen

    2016-01-01

    The process for drug discovery and development is challenging, time consuming and expensive. Computer-aided drug discovery (CADD) tools can act as a virtual shortcut, assisting in the expedition of this long process and potentially reducing the cost of research and development. Today CADD has become an effective and indispensable tool in therapeutic development. The human genome project has made available a substantial amount of sequence data that can be used in various drug discovery projects. Additionally, increasing knowledge of biological structures, as well as increasing computer power have made it possible to use computational methods effectively in various phases of the drug discovery and development pipeline. The importance of in silico tools is greater than ever before and has advanced pharmaceutical research. Here we present an overview of computational methods used in different facets of drug discovery and highlight some of the recent successes. In this review, both structure-based and ligand-based drug discovery methods are discussed. Advances in virtual high-throughput screening, protein structure prediction methods, protein-ligand docking, pharmacophore modeling and QSAR techniques are reviewed.

  6. Comparison of automated satellite systems with conventional systems for hydrologic data collection in west-central Florida

    USGS Publications Warehouse

    Woodham, W.M.

    1982-01-01

    This report provides results of reliability and cost-effectiveness studies of the GOES satellite data-collection system used to operate a small hydrologic data network in west-central Florida. The GOES system, in its present state of development, was found to be about as reliable as conventional methods of data collection. Benefits of using the GOES system include some cost and manpower reduction, improved data accuracy, near real-time data availability, and direct computer storage and analysis of data. The GOES system could allow annual manpower reductions of 19 to 23 percent, with reductions in cost for some single-parameter sites, such as streamflow, rainfall, and ground-water monitoring stations, and increases in cost for others. Manpower reductions of 46 percent or more appear possible for multiple-parameter sites. Implementation of expected improvements in instrumentation and data handling procedures should further reduce costs. (USGS)

  7. 42 CFR 417.588 - Computation of adjusted average per capita cost (AAPCC).

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., COMPETITIVE MEDICAL PLANS, AND HEALTH CARE PREPAYMENT PLANS Medicare Payment: Risk Basis § 417.588 Computation... 42 Public Health 3 2012-10-01 2012-10-01 false Computation of adjusted average per capita cost (AAPCC). 417.588 Section 417.588 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF...

  8. 42 CFR 417.588 - Computation of adjusted average per capita cost (AAPCC).

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... MEDICAL PLANS, AND HEALTH CARE PREPAYMENT PLANS Medicare Payment: Risk Basis § 417.588 Computation of... 42 Public Health 3 2011-10-01 2011-10-01 false Computation of adjusted average per capita cost (AAPCC). 417.588 Section 417.588 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF...

  9. 42 CFR 417.588 - Computation of adjusted average per capita cost (AAPCC).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... MEDICAL PLANS, AND HEALTH CARE PREPAYMENT PLANS Medicare Payment: Risk Basis § 417.588 Computation of... 42 Public Health 3 2010-10-01 2010-10-01 false Computation of adjusted average per capita cost (AAPCC). 417.588 Section 417.588 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF...

  10. hPIN/hTAN: Low-Cost e-Banking Secure against Untrusted Computers

    NASA Astrophysics Data System (ADS)

    Li, Shujun; Sadeghi, Ahmad-Reza; Schmitz, Roland

    We propose hPIN/hTAN, a low-cost token-based e-banking protection scheme for the setting in which the adversary has full control over the user's computer. Compared with existing hardware-based solutions, hPIN/hTAN depends on neither a second trusted channel, nor a secure keypad, nor a computationally expensive encryption module.

  11. Ambient Human-to-Human Communication

    NASA Astrophysics Data System (ADS)

    Härmä, Aki

    In the current technological landscape, colored by environmental and security concerns, the logic of replacing traveling by technical means of communication is indisputable. For example, consider a comparison between a normal family car and a video conference system with two laptop computers connected over the Internet. The power consumption of the car is approximately 25 kW, while the two computers and their share of the power consumption in the intermediate routers total about 50 W. Therefore, to meet a person using a car at a one-hour driving distance is equivalent to 1000 hours of video conference. The difference in the costs is also increasing. An estimate of the same cost difference between travel and video conference twenty years ago gave only three days of continuous video conference for the same situation [29]. The cost of a video conference depends on the duration of the session, while traveling depends only on the distance. However, in a strict economic and environmental sense, even a five-minute trip by car in 2008 becomes more economical than a video conference only when the meeting lasts more than three and a half days.
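
    Hedging on the reading that meeting someone at a one-hour driving distance implies a two-hour round trip, the stated equivalence follows from equating the two energy budgets:

    ```latex
    t_{\mathrm{vc}} \;=\; \frac{P_{\mathrm{car}}\, t_{\mathrm{car}}}{P_{\mathrm{vc}}}
                    \;=\; \frac{25\,\mathrm{kW} \times 2\,\mathrm{h}}{50\,\mathrm{W}}
                    \;=\; 1000\,\mathrm{h}.
    ```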

  12. The feasibility of a public-private long-term care financing plan.

    PubMed

    Arling, G; Hagan, S; Buhaug, H

    1992-08-01

    In this study, the feasibility of a public-private long-term care (LTC) financing plan that would combine private LTC insurance with special Medicaid eligibility requirements was assessed. The plan would also raise the Medicaid asset limit from the current $2,000 to the value of an individual's insurance benefits. After using benefits the individual could enroll in Medicaid. Thus, insurance would substitute for asset spend-down, protecting individuals against catastrophic costs. This financing plan was analyzed through a computer model that simulated lifetime LTC use for a middle-income age cohort beginning at 65 years of age. LTC payments from Medicaid, personal income and assets, Medicare, and insurance were projected by the model. Assuming that LTC use and costs would not grow beyond current projections, the proposed plan would provide asset protection for the cohort without increasing Medicaid expenditures. In contrast, private insurance alone, with no change in Medicaid eligibility, would offer only limited asset protection. The results must be qualified, however, because even a modest increase in LTC cost growth or use of care (beyond current projections) could result in substantially higher Medicaid expenditures. Also, private insurance might increase personal LTC expenditures because of the added cost of insuring.

  13. Quantum analogue computing.

    PubMed

    Kendon, Vivien M; Nemoto, Kae; Munro, William J

    2010-08-13

    We briefly review what a quantum computer is, what it promises to do for us and why it is so hard to build one. Among the first applications anticipated to bear fruit is the quantum simulation of quantum systems. While most quantum computation is an extension of classical digital computation, quantum simulation differs fundamentally in how the data are encoded in the quantum computer. To perform a quantum simulation, the Hilbert space of the system to be simulated is mapped directly onto the Hilbert space of the (logical) qubits in the quantum computer. This type of direct correspondence is how data are encoded in a classical analogue computer. There is no binary encoding, and increasing precision becomes exponentially costly: an extra bit of precision doubles the size of the computer. This has important consequences for both the precision and error-correction requirements of quantum simulation, and significant open questions remain about its practicality. It also means that the quantum version of analogue computers, continuous-variable quantum computers, becomes an equally efficient architecture for quantum simulation. Lessons from past use of classical analogue computers can help us to build better quantum simulators in future.
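
    The precision argument above can be summarised in one line (a sketch of the standard scaling argument, with notation chosen here for illustration): an analogue (amplitude) encoding to b bits of precision needs resources that grow exponentially in b, whereas a digital register grows only linearly,

    ```latex
    N_{\mathrm{analogue}}(b) \;\propto\; 2^{b}, \qquad
    N_{\mathrm{digital}}(b) \;\propto\; b,
    \qquad\text{so that}\qquad
    N_{\mathrm{analogue}}(b+1) \;=\; 2\, N_{\mathrm{analogue}}(b).
    ```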

  14. Virtualization in education: Information Security lab in your hands

    NASA Astrophysics Data System (ADS)

    Karlov, A. A.

    2016-09-01

    The growing demand for qualified specialists in advanced information technologies poses serious challenges to the education and training of young personnel for science, industry and social problems. Virtualization as a way to isolate the user from the physical characteristics of computing resources (processors, servers, operating systems, networks, applications, etc.), has, in particular, an enormous influence in the field of education, increasing its efficiency, reducing the cost, making it more widely and readily available. The study of Information Security of computer systems is considered as an example of use of virtualization in education.

  15. New double-byte error-correcting codes for memory systems

    NASA Technical Reports Server (NTRS)

    Feng, Gui-Liang; Wu, Xinen; Rao, T. R. N.

    1996-01-01

    Error-correcting or error-detecting codes have been used in the computer industry to increase reliability, reduce service costs, and maintain data integrity. The single-byte error-correcting and double-byte error-detecting (SbEC-DbED) codes have been successfully used in computer memory subsystems. There are many methods to construct double-byte error-correcting (DBEC) codes. In the present paper we construct a class of double-byte error-correcting codes, which are more efficient than those known to be optimum, and a decoding procedure for our codes is also considered.

  16. Photo-reconnaissance applications of computer processing of images.

    NASA Technical Reports Server (NTRS)

    Billingsley, F. C.

    1972-01-01

    Discussion of image processing techniques for enhancement and calibration of Jet Propulsion Laboratory imaging experiment pictures returned from NASA space vehicles such as Ranger, Mariner and Surveyor. Particular attention is given to data transmission, resolution vs recognition, and color aspects of digital data processing. The effectiveness of these techniques in applications to images from a wide variety of sources is noted. It is anticipated that the use of computer processing for enhancement of imagery will increase with the improvement and cost reduction of these techniques in the future.

  17. Distributed sensor networks: a cellular nonlinear network perspective.

    PubMed

    Haenggi, Martin

    2003-12-01

    Large-scale networks of integrated wireless sensors are becoming increasingly tractable. Advances in hardware technology and engineering design have led to dramatic reductions in size, power consumption, and cost for digital circuitry and wireless communications. Networking, self-organization, and distributed operation are crucial ingredients to harness the sensing, computing, and communication capabilities of the nodes into a complete system. This article shows that those networks can be considered as cellular nonlinear networks (CNNs), and that their analysis and design may greatly benefit from the rich theoretical results available for CNNs.

  18. Model Order Reduction Algorithm for Estimating the Absorption Spectrum

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Beeumen, Roel; Williams-Young, David B.; Kasper, Joseph M.

    The ab initio description of the spectral interior of the absorption spectrum poses both a theoretical and computational challenge for modern electronic structure theory. Due to the often spectrally dense character of this domain in the quantum propagator’s eigenspectrum for medium-to-large sized systems, traditional approaches based on the partial diagonalization of the propagator often encounter oscillatory and stagnating convergence. Electronic structure methods which solve the molecular response problem through the solution of spectrally shifted linear systems, such as the complex polarization propagator, offer an alternative approach which is agnostic to the underlying spectral density or domain location. This generality comes at a seemingly high computational cost associated with solving a large linear system for each spectral shift in some discretization of the spectral domain of interest. In this work, we present a novel, adaptive solution to this high computational overhead based on model order reduction techniques via interpolation. Model order reduction reduces the computational complexity of mathematical models and is ubiquitous in the simulation of dynamical systems and control theory. The efficiency and effectiveness of the proposed algorithm in the ab initio prediction of X-ray absorption spectra is demonstrated using a test set of challenging water clusters which are spectrally dense in the neighborhood of the oxygen K-edge. On the basis of a single, user defined tolerance we automatically determine the order of the reduced models and approximate the absorption spectrum up to the given tolerance. We also illustrate that, for the systems studied, the automatically determined model order increases logarithmically with the problem dimension, compared to a linear increase of the number of eigenvalues within the energy window. Furthermore, we observed that the computational cost of the proposed algorithm only scales quadratically with respect to the problem dimension.

  19. Transformational electronics: a powerful way to revolutionize our information world

    NASA Astrophysics Data System (ADS)

    Rojas, Jhonathan P.; Torres Sevilla, Galo A.; Ghoneim, Mohamed T.; Hussain, Aftab M.; Ahmed, Sally M.; Nassar, Joanna M.; Bahabry, Rabab R.; Nour, Maha; Kutbee, Arwa T.; Byas, Ernesto; Al-Saif, Bidoor; Alamri, Amal M.; Hussain, Muhammad M.

    2014-06-01

    With the emergence of cloud computation, we are facing the rising waves of big data. It is time to leverage this opportunity by increasing data usage both by man and machine. We need ultra-mobile computation with high data processing speed, ultra-large memory, energy efficiency and multi-functionality. Additionally, we have to deploy energy-efficient multi-functional 3D ICs for robust cyber-physical system establishment. To achieve such lofty goals we have to mimic the human brain, which is inarguably the world's most powerful and energy efficient computer. The brain's cortex has a folded architecture to increase surface area in an ultra-compact space to contain its neurons and synapses. Therefore, it is imperative to overcome two integration challenges: (i) finding a low-cost 3D IC fabrication process and (ii) creating foldable substrates with ultra-large-scale integration of high performance, energy efficient electronics. Hence, we show a low-cost generic batch process based on trench-protect-peel-recycle to fabricate rigid and flexible 3D ICs as well as high performance flexible electronics. As of today we have made every single component needed for a fully flexible computer, including non-planar state-of-the-art FinFETs. Additionally we have demonstrated various solid-state memory, movable MEMS devices, energy harvesting and storage components. To show the versatility of our process, we have extended it towards other inorganic semiconductor substrates such as silicon germanium and III-V materials. Finally, we report the first fully flexible programmable silicon-based microprocessor towards foldable brain computation and a wirelessly programmable, stretchable and flexible thermal patch for pain management for smart bionics.

  20. Specialized computer architectures for computational aerodynamics

    NASA Technical Reports Server (NTRS)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relative high cost of performing these computations on commercially available general purpose computers, a cost high with respect to dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities to be dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  1. Measuring costs of data collection at village clinics by village doctors for a syndromic surveillance system-a cross sectional survey from China.

    PubMed

    Ding, Yan; Fei, Yang; Xu, Biao; Yang, Jun; Yan, Weirong; Diwan, Vinod K; Sauerborn, Rainer; Dong, Hengjin

    2015-07-25

    Studies into the costs of syndromic surveillance systems are rare, especially for estimating the direct costs involved in implementing and maintaining these systems. An Integrated Surveillance System in rural China (ISSC project), with the aim of providing an early warning system for outbreaks, was implemented; village clinics were the main surveillance units. Village doctors expressed their willingness to join in the surveillance if a proper subsidy was provided. This study aims to measure the costs of data collection by village clinics to provide a reference regarding the subsidy level required for village clinics to participate in data collection. We conducted a cross-sectional survey with a village clinic questionnaire and a staff questionnaire using a purposive sampling strategy. We tracked reported events using the ISSC internal database. Cost data included staff time, and the annual depreciation and opportunity costs of computers. We measured the village doctors' time costs for data collection by multiplying the number of full time employment equivalents devoted to the surveillance by the village doctors' annual salaries and benefits, which equaled their net incomes. We estimated the depreciation and opportunity costs of computers by calculating the equivalent annual computer cost and then allocating this to the surveillance based on the percentage usage. The estimated total annual cost of collecting data was 1,423 Chinese Renminbi (RMB) in 2012 (P25 = 857, P75 = 3284), including 1,250 RMB (P25 = 656, P75 = 3000) staff time costs and 134 RMB (P25 = 101, P75 = 335) depreciation and opportunity costs of computers. The total costs of collecting data from the village clinics for the syndromic surveillance system was calculated to be low compared with the individual net income in County A.
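
    A minimal sketch of the cost arithmetic described above (function names and figures are illustrative placeholders, not the survey's data): staff time is valued as full-time-equivalents multiplied by annual net income, and the computer contribution is its equivalent annual cost multiplied by the share of use devoted to surveillance.

    ```python
    # Hypothetical illustration of the cost components described above:
    # staff time valued as full-time-equivalents (FTE) times annual net income,
    # plus the surveillance share of the equivalent annual cost of a computer.

    def staff_time_cost(fte_on_surveillance: float, annual_net_income: float) -> float:
        """Staff time cost = FTE devoted to surveillance x annual salary/benefits."""
        return fte_on_surveillance * annual_net_income

    def equivalent_annual_computer_cost(purchase_price: float, useful_life_years: int,
                                        discount_rate: float) -> float:
        """Spread the purchase price over its useful life (capital-recovery annuity)."""
        r = discount_rate
        return purchase_price * r / (1.0 - (1.0 + r) ** -useful_life_years)

    def surveillance_computer_cost(purchase_price, useful_life_years,
                                   discount_rate, share_used_for_surveillance):
        annual = equivalent_annual_computer_cost(purchase_price, useful_life_years,
                                                 discount_rate)
        return annual * share_used_for_surveillance

    # Illustrative numbers only (RMB), not the survey's results.
    total = (staff_time_cost(0.05, 25_000)
             + surveillance_computer_cost(3_000, 5, 0.03, 0.3))
    print(f"Estimated annual data-collection cost: {total:.0f} RMB")
    ```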

  2. Image stacking approach to increase sensitivity of fluorescence detection using a low cost complementary metal-oxide-semiconductor (CMOS) webcam.

    PubMed

    Balsam, Joshua; Bruck, Hugh Alan; Kostov, Yordan; Rasooly, Avraham

    2012-01-01

    Optical technologies are important for biological analysis. Current biomedical optical analyses rely on high-cost, high-sensitivity optical detectors such as photomultipliers, avalanche photodiodes or cooled CCD cameras. In contrast, Webcams, mobile phones and other popular consumer electronics use lower-sensitivity, lower-cost optical components such as photodiodes or CMOS sensors. In order for consumer electronics devices, such as webcams, to be useful for biomedical analysis, they must have increased sensitivity. We combined two strategies to increase the sensitivity of a CMOS-based fluorescence detector. We captured hundreds of low sensitivity images using a Webcam in video mode, instead of the single image typically used in cooled CCD devices. We then used a computational approach consisting of an image stacking algorithm to remove the noise by combining all of the images into a single image. While video mode is widely used for dynamic scene imaging (e.g. movies or time-lapse photography), it is not normally used to capture a single static image; stacking the video frames into one image removes noise and increases sensitivity by more than thirty-fold. The portable, battery-operated Webcam-based fluorometer system developed here consists of five modules: (1) a low cost CMOS Webcam to monitor light emission, (2) a plate to perform assays, (3) filters and multi-wavelength LED illuminator for fluorophore excitation, (4) a portable computer to acquire and analyze images, and (5) image stacking software for image enhancement. The samples consisted of various concentrations of fluorescein, ranging from 30 μM to 1000 μM, in a 36-well miniature plate. In the single frame mode, the fluorometer's limit-of-detection (LOD) for fluorescein is ∼1000 μM, which is relatively insensitive. However, when used in video mode combined with image stacking enhancement, the LOD is dramatically reduced to 30 μM, sensitivity which is similar to that of state-of-the-art ELISA plate photomultiplier-based readers. Numerous medical diagnostics assays rely on optical and fluorescence readers. Our novel combination of detection technologies, which is new to biodetection, may enable the development of new low cost optical detectors based on an inexpensive Webcam (<$10). It has the potential to form the basis for high sensitivity, low cost medical diagnostics in resource-poor settings.
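
    The stacking step itself amounts to averaging many registered frames of a static scene; a minimal sketch (synthetic data, not the authors' software) shows how background noise shrinks as frames are combined:

    ```python
    # Minimal sketch of image stacking: average many noisy frames of a static
    # scene so that zero-mean sensor noise cancels while the signal is preserved.
    import numpy as np

    rng = np.random.default_rng(42)
    true_scene = np.zeros((64, 64))
    true_scene[24:40, 24:40] = 5.0          # a faint fluorescent spot

    def capture_frame(scene, noise_sigma=20.0):
        """Simulate one low-sensitivity video frame: scene plus sensor noise."""
        return scene + rng.normal(0.0, noise_sigma, scene.shape)

    def stack(frames):
        """Combine frames into a single image by averaging (image stacking)."""
        return np.mean(frames, axis=0)

    single = capture_frame(true_scene)
    stacked = stack([capture_frame(true_scene) for _ in range(400)])

    for name, img in [("single frame", single), ("400-frame stack", stacked)]:
        noise = np.std(img[:16, :16])                 # background patch
        signal = img[24:40, 24:40].mean()
        print(f"{name:16s} signal={signal:5.2f}  noise={noise:5.2f}  SNR={signal/noise:5.1f}")
    ```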

  3. Image stacking approach to increase sensitivity of fluorescence detection using a low cost complementary metal-oxide-semiconductor (CMOS) webcam

    PubMed Central

    Balsam, Joshua; Bruck, Hugh Alan; Kostov, Yordan; Rasooly, Avraham

    2013-01-01

    Optical technologies are important for biological analysis. Current biomedical optical analyses rely on high-cost, high-sensitivity optical detectors such as photomultipliers, avalanche photodiodes or cooled CCD cameras. In contrast, Webcams, mobile phones and other popular consumer electronics use lower-sensitivity, lower-cost optical components such as photodiodes or CMOS sensors. In order for consumer electronics devices, such as webcams, to be useful for biomedical analysis, they must have increased sensitivity. We combined two strategies to increase the sensitivity of a CMOS-based fluorescence detector. We captured hundreds of low sensitivity images using a Webcam in video mode, instead of the single image typically used in cooled CCD devices. We then used a computational approach consisting of an image stacking algorithm to remove the noise by combining all of the images into a single image. While video mode is widely used for dynamic scene imaging (e.g. movies or time-lapse photography), it is not normally used to capture a single static image; stacking the video frames into one image removes noise and increases sensitivity by more than thirty-fold. The portable, battery-operated Webcam-based fluorometer system developed here consists of five modules: (1) a low cost CMOS Webcam to monitor light emission, (2) a plate to perform assays, (3) filters and multi-wavelength LED illuminator for fluorophore excitation, (4) a portable computer to acquire and analyze images, and (5) image stacking software for image enhancement. The samples consisted of various concentrations of fluorescein, ranging from 30 μM to 1000 μM, in a 36-well miniature plate. In the single frame mode, the fluorometer's limit-of-detection (LOD) for fluorescein is ∼1000 μM, which is relatively insensitive. However, when used in video mode combined with image stacking enhancement, the LOD is dramatically reduced to 30 μM, sensitivity which is similar to that of state-of-the-art ELISA plate photomultiplier-based readers. Numerous medical diagnostics assays rely on optical and fluorescence readers. Our novel combination of detection technologies, which is new to biodetection, may enable the development of new low cost optical detectors based on an inexpensive Webcam (<$10). It has the potential to form the basis for high sensitivity, low cost medical diagnostics in resource-poor settings. PMID:23990697

  4. Processor Would Find Best Paths On Map

    NASA Technical Reports Server (NTRS)

    Eberhardt, Silvio P.

    1990-01-01

    Proposed very-large-scale integrated (VLSI) circuit image-data processor finds path of least cost from specified origin to any destination on map. Cost of traversal assigned to each picture element of map. Path of least cost from originating picture element to every other picture element computed as path that preserves as much as possible of signal transmitted by originating picture element. Dedicated microprocessor at each picture element stores cost of traversal and performs its share of computations of paths of least cost. Least-cost-path problem occurs in research, military maneuvers, and in planning routes of vehicles.
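
    The computation each picture element performs corresponds to a shortest-path relaxation over the traversal-cost map. A brief sequential sketch of the same least-cost-path idea (a conventional Dijkstra pass, not the proposed parallel VLSI circuit):

    ```python
    # Sketch of the least-cost-path computation on a cost map: Dijkstra's
    # algorithm from one origin pixel to every other pixel. The VLSI processor
    # described above performs the analogous relaxation in parallel, with one
    # dedicated element per pixel.
    import heapq

    def least_cost_paths(cost_map, origin):
        """Return a grid of minimal cumulative traversal costs from `origin`."""
        rows, cols = len(cost_map), len(cost_map[0])
        best = [[float("inf")] * cols for _ in range(rows)]
        r0, c0 = origin
        best[r0][c0] = cost_map[r0][c0]
        heap = [(best[r0][c0], r0, c0)]
        while heap:
            d, r, c = heapq.heappop(heap)
            if d > best[r][c]:
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    nd = d + cost_map[nr][nc]   # cost of entering the neighbour
                    if nd < best[nr][nc]:
                        best[nr][nc] = nd
                        heapq.heappush(heap, (nd, nr, nc))
        return best

    terrain = [[1, 1, 5, 1],
               [1, 9, 5, 1],
               [1, 1, 1, 1]]
    for row in least_cost_paths(terrain, (0, 0)):
        print(row)
    ```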

  5. A parallel implementation of an off-lattice individual-based model of multicellular populations

    NASA Astrophysics Data System (ADS)

    Harvey, Daniel G.; Fletcher, Alexander G.; Osborne, James M.; Pitt-Francis, Joe

    2015-07-01

    As computational models of multicellular populations include ever more detailed descriptions of biophysical and biochemical processes, the computational cost of simulating such models limits their ability to generate novel scientific hypotheses and testable predictions. While developments in microchip technology continue to increase the power of individual processors, parallel computing offers an immediate increase in available processing power. To make full use of parallel computing technology, it is necessary to develop specialised algorithms. To this end, we present a parallel algorithm for a class of off-lattice individual-based models of multicellular populations. The algorithm divides the spatial domain between computing processes and comprises communication routines that ensure the model is correctly simulated on multiple processors. The parallel algorithm is shown to accurately reproduce the results of a deterministic simulation performed using a pre-existing serial implementation. We test the scaling of computation time, memory use and load balancing as more processes are used to simulate a cell population of fixed size. We find approximate linear scaling of both speed-up and memory consumption on up to 32 processor cores. Dynamic load balancing is shown to provide speed-up for non-regular spatial distributions of cells in the case of a growing population.
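
    A minimal sketch of the decomposition step under stated assumptions (a strip partition along one axis with hypothetical Cell objects; the authors' implementation and communication routines are not reproduced here): each process owns one strip, and cells within an interaction radius of a strip boundary are flagged for exchange with the neighbouring process.

    ```python
    # Sketch of strip-wise domain decomposition for an off-lattice cell population:
    # each process owns one strip of the x-axis; cells within `interaction_radius`
    # of a strip boundary are flagged as halo cells to be shared with the
    # neighbouring process each time step. (Illustrative only.)
    from dataclasses import dataclass

    @dataclass
    class Cell:
        x: float
        y: float

    def decompose(cells, domain_width, n_procs, interaction_radius):
        strip = domain_width / n_procs
        owned = [[] for _ in range(n_procs)]
        halos = [[] for _ in range(n_procs)]   # copies destined for each process
        for cell in cells:
            p = min(int(cell.x // strip), n_procs - 1)
            owned[p].append(cell)
            # Near the right boundary of strip p: neighbour p+1 needs a copy.
            if p + 1 < n_procs and (p + 1) * strip - cell.x < interaction_radius:
                halos[p + 1].append(cell)
            # Near the left boundary of strip p: neighbour p-1 needs a copy.
            if p > 0 and cell.x - p * strip < interaction_radius:
                halos[p - 1].append(cell)
        return owned, halos

    cells = [Cell(x=i * 0.9, y=0.0) for i in range(12)]
    owned, halos = decompose(cells, domain_width=10.0, n_procs=4, interaction_radius=1.0)
    print([len(o) for o in owned], [len(h) for h in halos])
    ```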

  6. Pressure ulcers: implementation of evidence-based nursing practice.

    PubMed

    Clarke, Heather F; Bradley, Chris; Whytock, Sandra; Handfield, Shannon; van der Wal, Rena; Gundry, Sharon

    2005-03-01

    A 2-year project was carried out to evaluate the use of multi-component, computer-assisted strategies for implementing clinical practice guidelines. This paper describes the implementation of the project and lessons learned. The evaluation and outcomes of implementing clinical practice guidelines to prevent and treat pressure ulcers will be reported in a separate paper. The prevalence and incidence rates of pressure ulcers, coupled with the cost of treatment, constitute a substantial burden for our health care system. It is estimated that treating a pressure ulcer can increase nursing time up to 50%, and that treatment costs per ulcer can range from US$10,000 to $86,000, with median costs of $27,000. Although evidence-based guidelines for prevention and optimum treatment of pressure ulcers have been developed, there is little empirical evidence about the effectiveness of implementation strategies. The study was conducted across the continuum of care (primary, secondary and tertiary) in a Canadian urban Health Region involving seven health care organizations (acute, home and extended care). Trained surveyors (Registered Nurses) determined the prevalence and incidence of pressure ulcers among patients in these organizations. The use of a computerized decision-support system assisted staff to select optimal, evidence-based care strategies, record information and analyse individual and aggregate data. Evaluation indicated an increase in knowledge relating to pressure ulcer prevention, treatment strategies, resources required, and the role of the interdisciplinary team. Lack of visible senior nurse leadership; time required to acquire computer skills and to implement new guidelines; and difficulties with the computer system were identified as barriers. There is a need for a comprehensive, supported and sustained approach to implementation of evidence-based practice for pressure ulcer prevention and treatment, greater understanding of organization-specific barriers, and mechanisms for addressing the barriers.

  7. Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources

    NASA Astrophysics Data System (ADS)

    Evans, D.; Fisk, I.; Holzman, B.; Melo, A.; Metson, S.; Pordes, R.; Sheldon, P.; Tiradani, A.

    2011-12-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost-structure and transient nature of EC2 services makes them inappropriate for some CMS production services and functions. We also found that the resources are not truly "on-demand" as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a University, and conclude that it is most cost effective to purchase dedicated resources for the "base-line" needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during times when spikes in usage are required.

  8. Semi-exact concentric atomic density fitting: Reduced cost and increased accuracy compared to standard density fitting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hollman, David S.; Department of Chemistry, Virginia Tech, Blacksburg, Virginia 24061; Schaefer, Henry F.

    2014-02-14

    A local density fitting scheme is considered in which atomic orbital (AO) products are approximated using only auxiliary AOs located on one of the nuclei in that product. The possibility of variational collapse to an unphysical “attractive electron” state that can affect such density fitting [P. Merlot, T. Kjærgaard, T. Helgaker, R. Lindh, F. Aquilante, S. Reine, and T. B. Pedersen, J. Comput. Chem. 34, 1486 (2013)] is alleviated by including atom-wise semidiagonal integrals exactly. Our approach leads to a significant decrease in the computational cost of density fitting for Hartree–Fock theory while still producing results with errors 2–5 times smaller than standard, nonlocal density fitting. Our method allows for large Hartree–Fock and density functional theory computations with exact exchange to be carried out efficiently on large molecules, which we demonstrate by benchmarking our method on 200 of the most widely used prescription drug molecules. Our new fitting scheme leads to smooth and artifact-free potential energy surfaces and the possibility of relatively simple analytic gradients.

  9. VCF-Explorer: filtering and analysing whole genome VCF files.

    PubMed

    Akgün, Mete; Demirci, Hüseyin

    2017-11-01

    The decreasing cost of high-throughput technologies has led to a number of sequencing projects consisting of thousands of whole genomes. The paradigm shift from exome to whole genome brings a significant increase in the size of output files. Most of the existing tools developed to analyse exome files are not adequate for the larger VCF files produced by whole genome studies. In this work we present VCF-Explorer, a variant analysis software capable of handling large files. Memory efficiency and the avoidance of a computationally costly pre-processing step allow the analysis to be performed with ordinary computers. VCF-Explorer provides an easy to use environment where users can define various types of queries based on variant and sample genotype level annotations. VCF-Explorer can be run in different environments and computational platforms ranging from a standard laptop to a high performance server. VCF-Explorer is freely available at: http://vcfexplorer.sourceforge.net/. mete.akgun@tubitak.gov.tr. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
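
    As an illustration of the memory-frugal, streaming style of analysis that whole-genome VCF files demand (a generic sketch with an invented predicate, not VCF-Explorer's actual query interface or code):

    ```python
    # Generic sketch of streaming VCF filtering: read one record at a time so
    # that whole-genome files never need to be loaded into memory. The example
    # predicate (QUAL threshold plus a PASS filter) is illustrative only.
    import gzip

    def stream_vcf(path):
        """Yield the tab-separated fields of each data record in a (optionally gzipped) VCF."""
        opener = gzip.open if path.endswith(".gz") else open
        with opener(path, "rt") as handle:
            for line in handle:
                if line.startswith("#"):      # skip meta-information and header lines
                    continue
                yield line.rstrip("\n").split("\t")

    def filter_variants(path, min_qual=30.0):
        for fields in stream_vcf(path):
            chrom, pos, _id, ref, alt, qual, flt = fields[:7]
            if flt == "PASS" and qual != "." and float(qual) >= min_qual:
                yield chrom, int(pos), ref, alt, float(qual)

    # Example usage (hypothetical file name):
    # for record in filter_variants("sample.vcf.gz", min_qual=50.0):
    #     print(record)
    ```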

  10. Upgrading NASA/DOSE laser ranging system control computers

    NASA Technical Reports Server (NTRS)

    Ricklefs, Randall L.; Cheek, Jack; Seery, Paul J.; Emenheiser, Kenneth S.; Hanrahan, William P., III; Mcgarry, Jan F.

    1993-01-01

    Laser ranging systems now managed by the NASA Dynamics of the Solid Earth (DOSE) and operated by the Bendix Field Engineering Corporation, the University of Hawaii, and the University of Texas have produced a wealth of interdisciplinary scientific data over the last three decades. Despite upgrades to most of the ranging station subsystems, the control computers remain a mix of 1970's vintage minicomputers. These encompass a wide range of vendors, operating systems, and languages, making hardware and software support increasingly difficult. Current technology allows replacement of controller computers at a relatively low cost while maintaining excellent processing power and a friendly operating environment. The new controller systems are now being designed using IBM-PC-compatible 80486-based microcomputers, a real-time Unix operating system (LynxOS), and X-windows/Motif IB, and serial interfaces have been chosen. This design supports minimizing short and long term costs by relying on proven standards for both hardware and software components. Currently, the project is in the design and prototyping stage with the first systems targeted for production in mid-1993.

  11. 78 FR 47336 - Privacy Act of 1974; Computer Matching Program Between the Department of Housing and Urban...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-05

    ... provides an updated cost/benefit analysis providing an assessment of the benefits attained by HUD through... the scope of the existing computer matching program to now include the updated cost/ benefit analysis... change, and find a continued favorable examination of benefit/cost results; and (2) All parties certify...

  12. 26 CFR 1.179-5 - Time and manner of making election.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... desktop computer costing $1,500. On Taxpayer's 2003 Federal tax return filed on April 15, 2004, Taxpayer elected to expense under section 179 the full cost of the laptop computer and the full cost of the desktop... provided by the Internal Revenue Code, the regulations under the Code, or other guidance published in the...

  13. Costs, needs must be balanced when buying computer systems.

    PubMed

    Krantz, G M; Doyle, J J; Stone, S G

    1989-06-01

    A healthcare institution must carefully examine its internal needs and external requirements before selecting an information system. The system's costs must be carefully weighed because significant computer cost overruns can cripple overall hospital finances. A New Jersey hospital carefully studied these issues and determined that a contract with a regional data center was its best option.

  14. Cost-effectiveness of implementing computed tomography screening for lung cancer in Taiwan.

    PubMed

    Yang, Szu-Chun; Lai, Wu-Wei; Lin, Chien-Chung; Su, Wu-Chou; Ku, Li-Jung; Hwang, Jing-Shiang; Wang, Jung-Der

    2017-06-01

    A screening program for lung cancer requires more empirical evidence. Based on the experience of the National Lung Screening Trial (NLST), we developed a method to adjust lead-time bias and quality-of-life changes for estimating the cost-effectiveness of implementing computed tomography (CT) screening in Taiwan. The target population was high-risk (≥30 pack-years) smokers between 55 and 75 years of age. From a nation-wide, 13-year follow-up cohort, we estimated quality-adjusted life expectancy (QALE), loss-of-QALE, and lifetime healthcare expenditures per case of lung cancer stratified by pathology and stage. Cumulative stage distributions for CT-screening and no-screening were assumed equal to those for CT-screening and radiography-screening in the NLST to estimate the savings of loss-of-QALE and additional costs of lifetime healthcare expenditures after CT screening. Costs attributable to screen-negative subjects, false-positive cases and radiation-induced lung cancer were included to obtain the incremental cost-effectiveness ratio from the public payer's perspective. The incremental costs were US$22,755 per person. After dividing this by savings of loss-of-QALE (1.16 quality-adjusted life year (QALY)), the incremental cost-effectiveness ratio was US$19,683 per QALY. This ratio would fall to US$10,947 per QALY if the stage distribution for CT-screening was the same as that of screen-detected cancers in the NELSON trial. Low-dose CT screening for lung cancer among high-risk smokers would be cost-effective in Taiwan. As only about 5% of our women are smokers, future research is necessary to identify the high-risk groups among non-smokers and increase the coverage. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
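
    The headline figure follows from the standard incremental cost-effectiveness ratio; using the rounded values quoted above,

    ```latex
    \mathrm{ICER} \;=\; \frac{\Delta C}{\Delta E}
                  \;\approx\; \frac{\$22{,}755}{1.16\ \mathrm{QALY}}
                  \;\approx\; \$19{,}600 \text{ per QALY},
    ```

    which matches the reported US$19,683 per QALY to within the rounding of the published inputs.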

  15. The Additional Costs and Health Effects of a Patient Having Overweight or Obesity: A Computational Model.

    PubMed

    Fallah-Fini, Saeideh; Adam, Atif; Cheskin, Lawrence J; Bartsch, Sarah M; Lee, Bruce Y

    2017-10-01

    This paper estimates specific additional disease outcomes and costs that could be prevented by helping a patient go from an obesity or overweight category to a normal weight category at different ages. This information could help physicians, other health care workers, patients, and third-party payers determine how to prioritize weight reduction. A computational Markov model was developed that represented the BMI status, chronic health states, health outcomes, and associated costs (from various perspectives) for an adult at different age points throughout his or her lifetime. Incremental costs were calculated for adult patients with obesity or overweight (vs. normal weight) at different starting ages. For example, for a metabolically healthy 20-year-old, having obesity (vs. normal weight) added lifetime third-party payer costs averaging $14,059 (95% range: $13,956-$14,163), productivity losses of $14,141 ($13,969-$14,312), and total societal costs of $28,020 ($27,751-$28,289); having overweight vs. normal weight added $5,055 ($4,967-$5,144), $5,358 ($5,199-$5,518), and $10,365 ($10,140-$10,590). For a metabolically healthy 50-year-old, having obesity added $15,925 ($15,831-$16,020), $20,120 ($19,887-$20,352), and $36,278 ($35,977-$36,579); having overweight added $5,866 ($5,779-$5,953), $10,205 ($9,980-$10,429), and $16,169 ($15,899-$16,438). Incremental lifetime costs of a patient with obesity or overweight (vs. normal weight) increased with the patient's age, peaked at age 50, and decreased with older ages. However, weight reduction even in older adults still yielded incremental cost savings. © 2017 The Obesity Society.
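
    A toy sketch of the kind of Markov cohort accounting such a model performs (the states, transition probabilities, annual costs, and discount rate below are invented for illustration and are not the study's parameters):

    ```python
    # Toy Markov cohort model: advance a cohort through health states year by
    # year and accumulate discounted costs. All numbers are illustrative placeholders.
    import numpy as np

    states = ["healthy", "chronic_disease", "dead"]
    # Hypothetical one-year transition matrix (rows sum to 1).
    P = np.array([[0.92, 0.06, 0.02],
                  [0.00, 0.90, 0.10],
                  [0.00, 0.00, 1.00]])
    annual_cost = np.array([500.0, 8_000.0, 0.0])   # cost incurred in each state
    discount = 0.03

    def lifetime_cost(start_age=20, horizon=80):
        cohort = np.array([1.0, 0.0, 0.0])          # everyone starts healthy
        total = 0.0
        for year in range(horizon - start_age):
            total += (cohort @ annual_cost) / (1.0 + discount) ** year
            cohort = cohort @ P                      # advance the cohort one year
        return total

    print(f"Discounted lifetime cost per person: ${lifetime_cost():,.0f}")
    ```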

  16. Low-dose chest computed tomography for lung cancer screening among Hodgkin lymphoma survivors: a cost-effectiveness analysis.

    PubMed

    Wattson, Daniel A; Hunink, M G Myriam; DiPiro, Pamela J; Das, Prajnan; Hodgson, David C; Mauch, Peter M; Ng, Andrea K

    2014-10-01

    Hodgkin lymphoma (HL) survivors face an increased risk of treatment-related lung cancer. Screening with low-dose computed tomography (LDCT) may allow detection of early stage, resectable cancers. We developed a Markov decision-analytic and cost-effectiveness model to estimate the merits of annual LDCT screening among HL survivors. Population databases and HL-specific literature informed key model parameters, including lung cancer rates and stage distribution, cause-specific survival estimates, and utilities. Relative risks accounted for radiation therapy (RT) technique, smoking status (>10 pack-years or current smokers vs not), age at HL diagnosis, time from HL treatment, and excess radiation from LDCTs. LDCT assumptions, including expected stage-shift, false-positive rates, and likely additional workup were derived from the National Lung Screening Trial and preliminary results from an internal phase 2 protocol that performed annual LDCTs in 53 HL survivors. We assumed a 3% discount rate and a willingness-to-pay (WTP) threshold of $50,000 per quality-adjusted life year (QALY). Annual LDCT screening was cost effective for all smokers. A male smoker treated with mantle RT at age 25 achieved maximum QALYs by initiating screening 12 years post-HL, with a life expectancy benefit of 2.1 months and an incremental cost of $34,841/QALY. Among nonsmokers, annual screening produced a QALY benefit in some cases, but the incremental cost was not below the WTP threshold for any patient subsets. As age at HL diagnosis increased, earlier initiation of screening improved outcomes. Sensitivity analyses revealed that the model was most sensitive to the lung cancer incidence and mortality rates and expected stage-shift from screening. HL survivors are an important high-risk population that may benefit from screening, especially those treated in the past with large radiation fields including mantle or involved-field RT. Screening may be cost effective for all smokers but possibly not for nonsmokers despite a small life expectancy benefit. Copyright © 2014 Elsevier Inc. All rights reserved.

  17. A parallel method of atmospheric correction for multispectral high spatial resolution remote sensing images

    NASA Astrophysics Data System (ADS)

    Zhao, Shaoshuai; Ni, Chen; Cao, Jing; Li, Zhengqiang; Chen, Xingfeng; Ma, Yan; Yang, Leiku; Hou, Weizhen; Qie, Lili; Ge, Bangyu; Liu, Li; Xing, Jin

    2018-03-01

    Remote sensing images are usually contaminated by atmospheric components, especially aerosol particles. For quantitative remote sensing applications, atmospheric correction based on a radiative transfer model is used to retrieve surface reflectance by decoupling the atmosphere and the surface, which requires long computation times. Parallel computing is one way to accelerate this step. A parallel strategy in which multiple CPUs work simultaneously is designed to perform atmospheric correction for a multispectral remote sensing image. The flow of the parallel framework and the main parallel body of the atmospheric correction are described. A multispectral remote sensing image from the Chinese Gaofen-2 satellite is then used to test the acceleration efficiency. As the number of CPUs increases from 1 to 8, the computational speed also increases. The largest speedup is 6.5. With 8 CPUs, atmospheric correction of the whole image takes 4 minutes.
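
    A minimal sketch of the data-parallel pattern described above, using Python multiprocessing in place of the authors' framework; the per-pixel "correction" is a placeholder rather than a radiative-transfer computation:

    ```python
    # Sketch of data-parallel atmospheric correction: split the image into row
    # blocks and correct the blocks on several CPUs at once. The correction
    # applied here is a placeholder for the radiative-transfer-based lookup.
    import numpy as np
    from multiprocessing import Pool

    def correct_block(block):
        # Placeholder per-pixel correction (e.g. a gain/offset taken from a lookup table).
        gain, offset = 1.12, -0.03
        return gain * block + offset

    def parallel_correction(image, n_procs=8):
        blocks = np.array_split(image, n_procs, axis=0)   # one row block per worker
        with Pool(processes=n_procs) as pool:
            corrected = pool.map(correct_block, blocks)
        return np.vstack(corrected)

    if __name__ == "__main__":
        radiance = np.random.rand(4000, 4000).astype(np.float32)
        reflectance = parallel_correction(radiance, n_procs=8)
        print(reflectance.shape)
    ```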

  18. Augmenting Surgery via Multi-scale Modeling and Translational Systems Biology in the Era of Precision Medicine: A Multidisciplinary Perspective

    PubMed Central

    Kassab, Ghassan S.; An, Gary; Sander, Edward A.; Miga, Michael; Guccione, Julius M.; Ji, Songbai; Vodovotz, Yoram

    2016-01-01

    In this era of tremendous technological capabilities and increased focus on improving clinical outcomes, decreasing costs, and increasing precision, there is a need for a more quantitative approach to the field of surgery. Multiscale computational modeling has the potential to bridge the gap to the emerging paradigms of Precision Medicine and Translational Systems Biology, in which quantitative metrics and data guide patient care through improved stratification, diagnosis, and therapy. Achievements by multiple groups have demonstrated the potential for 1) multiscale computational modeling, at a biological level, of diseases treated with surgery and the surgical procedure process at the level of the individual and the population; along with 2) patient-specific, computationally-enabled surgical planning, delivery, and guidance and robotically-augmented manipulation. In this perspective article, we discuss these concepts, and cite emerging examples from the fields of trauma, wound healing, and cardiac surgery. PMID:27015816

  19. Security Approaches in Using Tablet Computers for Primary Data Collection in Clinical Research

    PubMed Central

    Wilcox, Adam B.; Gallagher, Kathleen; Bakken, Suzanne

    2013-01-01

    Next-generation tablets (iPads and Android tablets) may potentially improve the collection and management of clinical research data. The widespread adoption of tablets, coupled with decreased software and hardware costs, has led to increased consideration of tablets for primary research data collection. When using tablets for the Washington Heights/Inwood Infrastructure for Comparative Effectiveness Research (WICER) project, we found that the devices give rise to inherent security issues associated with the potential use of cloud-based data storage approaches. This paper identifies and describes major security considerations for primary data collection with tablets; proposes a set of architectural strategies for implementing data collection forms with tablet computers; and discusses the security, cost, and workflow of each strategy. The paper briefly reviews the strategies with respect to their implementation for three primary data collection activities for the WICER project. PMID:25848559

  20. Sampling schemes and parameter estimation for nonlinear Bernoulli-Gaussian sparse models

    NASA Astrophysics Data System (ADS)

    Boudineau, Mégane; Carfantan, Hervé; Bourguignon, Sébastien; Bazot, Michael

    2016-06-01

    We address the sparse approximation problem in the case where the data are approximated by the linear combination of a small number of elementary signals, each of these signals depending non-linearly on additional parameters. Sparsity is explicitly expressed through a Bernoulli-Gaussian hierarchical model in a Bayesian framework. Posterior mean estimates are computed using Markov chain Monte Carlo algorithms. We generalize the partially marginalized Gibbs sampler proposed in the linear case in [1], and build a hybrid Hastings-within-Gibbs algorithm in order to account for the nonlinear parameters. All model parameters are then estimated in an unsupervised procedure. The resulting method is evaluated on a sparse spectral analysis problem. It is shown to converge more efficiently than the classical joint estimation procedure, with only a slight increase of the computational cost per iteration, consequently reducing the global cost of the estimation procedure.
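
    For concreteness, one standard way to write the Bernoulli-Gaussian hierarchy invoked above (symbols chosen here for illustration; the paper's exact notation may differ): each coefficient is active with probability λ and Gaussian when active, and the data are a sum of elementary signals that depend non-linearly on their parameters, plus noise,

    ```latex
    q_k \sim \mathrm{Bernoulli}(\lambda), \qquad
    x_k \mid q_k \;\sim\; q_k\,\mathcal{N}(0,\sigma_x^2) + (1-q_k)\,\delta_0,
    \qquad
    \mathbf{y} \;=\; \sum_k x_k\,\mathbf{h}(\theta_k) + \boldsymbol{\varepsilon},
    \quad \boldsymbol{\varepsilon} \sim \mathcal{N}(\mathbf{0},\sigma_n^2\mathbf{I}),
    ```

    where h(θ_k) denotes an elementary signal depending non-linearly on the parameters θ_k.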

  1. Security approaches in using tablet computers for primary data collection in clinical research.

    PubMed

    Wilcox, Adam B; Gallagher, Kathleen; Bakken, Suzanne

    2013-01-01

    Next-generation tablets (iPads and Android tablets) may potentially improve the collection and management of clinical research data. The widespread adoption of tablets, coupled with decreased software and hardware costs, has led to increased consideration of tablets for primary research data collection. When using tablets for the Washington Heights/Inwood Infrastructure for Comparative Effectiveness Research (WICER) project, we found that the devices give rise to inherent security issues associated with the potential use of cloud-based data storage approaches. This paper identifies and describes major security considerations for primary data collection with tablets; proposes a set of architectural strategies for implementing data collection forms with tablet computers; and discusses the security, cost, and workflow of each strategy. The paper briefly reviews the strategies with respect to their implementation for three primary data collection activities for the WICER project.

  2. Accelerated simulation of stochastic particle removal processes in particle-resolved aerosol models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis, J.H.; Michelotti, M.D.; Riemer, N.

    2016-10-01

    Stochastic particle-resolved methods have proven useful for simulating multi-dimensional systems such as composition-resolved aerosol size distributions. While particle-resolved methods have substantial benefits for highly detailed simulations, these techniques suffer from high computational cost, motivating efforts to improve their algorithmic efficiency. Here we formulate an algorithm for accelerating particle removal processes by aggregating particles of similar size into bins. We present the Binned Algorithm for particle removal processes and analyze its performance with application to the atmospherically relevant process of aerosol dry deposition. We show that the Binned Algorithm can dramatically improve the efficiency of particle removals, particularly for low removal rates, and that computational cost is reduced without introducing additional error. In simulations of aerosol particle removal by dry deposition in atmospherically relevant conditions, we demonstrate an approximately 50-fold increase in algorithm efficiency.
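
    The paper's Binned Algorithm is not reproduced here, but the sketch below illustrates the underlying idea: group particles into size bins so that a removal sweep can draw the number of particles removed per bin from a binomial distribution instead of testing every particle individually. The bin edges, the removal-rate function, and the time step are made-up placeholders, not values from the study.

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical particle population: diameters in micrometres.
        diam = rng.lognormal(mean=-1.0, sigma=0.8, size=100_000)

        # Placeholder size-dependent removal rate (1/s); a real dry-deposition
        # velocity parameterization would replace this.
        def removal_rate(d_um):
            return 1e-5 * (1.0 + d_um**2)

        dt = 60.0                                  # time step [s]
        edges = np.logspace(-2, 1, 31)             # size-bin edges [um]
        bin_idx = np.digitize(diam, edges)

        removed = np.zeros_like(diam, dtype=bool)
        for b in np.unique(bin_idx):
            members = np.flatnonzero(bin_idx == b)
            # One representative rate per bin (here: at the mean diameter of the bin).
            p_remove = 1.0 - np.exp(-removal_rate(diam[members].mean()) * dt)
            # Draw how many particles leave this bin, then pick them at random,
            # instead of drawing a uniform random number for every particle.
            n_remove = rng.binomial(members.size, p_remove)
            removed[rng.choice(members, size=n_remove, replace=False)] = True

        print(f"removed {removed.sum()} of {diam.size} particles in one step")

    Sampling one binomial per bin replaces per-particle random draws, which is where the speedup comes from; the accuracy of such a scheme depends on how well a single rate represents each bin.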

  3. From photons to big-data applications: terminating terabits

    PubMed Central

    2016-01-01

    Computer architectures have entered a watershed as the quantity of network data generated by user applications exceeds the data-processing capacity of any individual computer end-system. It will become impossible to scale existing computer systems while a gap grows between the quantity of networked data and the capacity for per system data processing. Despite this, the growth in demand in both task variety and task complexity continues unabated. Networked computer systems provide a fertile environment in which new applications develop. As networked computer systems become akin to infrastructure, any limitation upon the growth in capacity and capabilities becomes an important constraint of concern to all computer users. Considering a networked computer system capable of processing terabits per second, as a benchmark for scalability, we critique the state of the art in commodity computing, and propose a wholesale reconsideration in the design of computer architectures and their attendant ecosystem. Our proposal seeks to reduce costs, save power and increase performance in a multi-scale approach that has potential application from nanoscale to data-centre-scale computers. PMID:26809573

  4. From photons to big-data applications: terminating terabits.

    PubMed

    Zilberman, Noa; Moore, Andrew W; Crowcroft, Jon A

    2016-03-06

    Computer architectures have entered a watershed as the quantity of network data generated by user applications exceeds the data-processing capacity of any individual computer end-system. It will become impossible to scale existing computer systems while a gap grows between the quantity of networked data and the capacity for per system data processing. Despite this, the growth in demand in both task variety and task complexity continues unabated. Networked computer systems provide a fertile environment in which new applications develop. As networked computer systems become akin to infrastructure, any limitation upon the growth in capacity and capabilities becomes an important constraint of concern to all computer users. Considering a networked computer system capable of processing terabits per second, as a benchmark for scalability, we critique the state of the art in commodity computing, and propose a wholesale reconsideration in the design of computer architectures and their attendant ecosystem. Our proposal seeks to reduce costs, save power and increase performance in a multi-scale approach that has potential application from nanoscale to data-centre-scale computers. © 2016 The Authors.

  5. DEP : a computer program for evaluating lumber drying costs and investments

    Treesearch

    Stewart Holmes; George B. Harpole; Edward Bilek

    1983-01-01

    The DEP computer program is a modified discounted cash flow program designed for economic analysis of wood drying processes. Wood drying processes differ from other processes because of the large amounts of working capital required to finance inventories, and because of the relatively large share of costs charged to inventory...

  6. Thermodynamic cost of computation, algorithmic complexity and the information metric

    NASA Technical Reports Server (NTRS)

    Zurek, W. H.

    1989-01-01

    Algorithmic complexity is discussed as a computational counterpart to the second law of thermodynamics. It is shown that algorithmic complexity, which is a measure of randomness, sets limits on the thermodynamic cost of computations and casts a new light on the limitations of Maxwell's demon. Algorithmic complexity can also be used to define distance between binary strings.
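
    Algorithmic (Kolmogorov) complexity is uncomputable, so any hands-on illustration must rely on a proxy. The sketch below uses compressed length as that proxy and evaluates the normalized compression distance between binary strings; this is a common practical approximation of an information distance, not the construction analyzed in the paper.

        import random
        import zlib

        def c(data: bytes) -> int:
            """Compressed length as a crude, computable stand-in for algorithmic complexity."""
            return len(zlib.compress(data, level=9))

        def ncd(x: bytes, y: bytes) -> float:
            """Normalized compression distance, an approximation of information distance."""
            cx, cy, cxy = c(x), c(y), c(x + y)
            return (cxy - min(cx, cy)) / max(cx, cy)

        random_bits = bytes(random.Random(0).getrandbits(8) for _ in range(4000))
        regular_bits = b"01" * 2000

        print("distance(random, random):  ", round(ncd(random_bits, random_bits), 3))
        print("distance(regular, regular):", round(ncd(regular_bits, regular_bits), 3))
        print("distance(random, regular): ", round(ncd(random_bits, regular_bits), 3))

    The incompressible (random) string plays the role of a high-complexity object, the periodic string a low-complexity one; distances between a string and itself come out near zero, while distances between unrelated strings approach one.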

  7. Weight and cost estimating relationships for heavy lift airships

    NASA Technical Reports Server (NTRS)

    Gray, D. W.

    1979-01-01

    Weight and cost estimating relationships, including additional parameters that influence the cost and performance of heavy-lift airships (HLA), are discussed. Inputs to a closed-loop computer program, consisting of useful load, forward speed, lift-module positive or negative thrust, and rotors and propellers, are examined. Attention is given to the HLA cost and weight program (HLACW), which computes component weights, vehicle size, buoyancy lift, rotor and propeller thrust, and engine horsepower. This program solves the problem of interrelating the different aerostat, rotor, engine, and propeller sizes. Six sets of 'default parameters' are left for the operator to change during each computer run, enabling slight data manipulation without altering the program.

  8. Cost-Benefit Analysis for ECIA Chapter 1 and State DPPF Programs Comparing Groups Receiving Regular Program Instruction and Groups Receiving Computer Assisted Instruction/Computer Management System (CAI/CMS). 1986-87.

    ERIC Educational Resources Information Center

    Chamberlain, Ed

    A cost benefit study was conducted to determine the effectiveness of a computer assisted instruction/computer management system (CAI/CMS) as an alternative to conventional methods of teaching reading within Chapter 1 and DPPF funded programs of the Columbus (Ohio) Public Schools. The Chapter 1 funded Compensatory Language Experiences and Reading…

  9. Application of the System Identification Technique to Goal-Directed Saccades.

    DTIC Science & Technology

    1984-07-30

    1983 to May 31, 1984 by the AFOSR under Grant No. AFOSR-83-0187. Budget summary: 1. Salaries & Wages, $7,257; 2. Employee Benefits, $4,186; 3. Indirect Costs, $1,177; 4. Equipment, $2,127 (DEC VT100 terminal, computer terminal table and chair, computer interface); 5. Travel, $672; 6. Miscellaneous Expenses, $281 (computer costs, telephone, xeroxing, report costs). Total: $12,000.

  10. Computer program to perform cost and weight analysis of transport aircraft. Volume 2: Technical volume

    NASA Technical Reports Server (NTRS)

    1973-01-01

    An improved method for estimating aircraft weight and cost using a unique and fundamental approach was developed. The results of this study were integrated into a comprehensive digital computer program, which is intended for use at the preliminary design stage of aircraft development. The program provides a means of computing absolute values for weight and cost, and enables the user to perform trade studies with a sensitivity to detail design and overall structural arrangement. Both batch and interactive graphics modes of program operation are available.

  11. [Health technology assessment report. Use of liquid-based cytology for cervical cancer precursors screening].

    PubMed

    Ronco, Guglielmo; Confortini, Massimo; Maccallini, Vincenzo; Naldoni, Carlo; Segnan, Nereo; Sideri, Mario; Zappa, Marco; Zorzi, Manuel; Calvia, Maria; Giorgi Rossi, Paolo

    2012-01-01

    OBJECTIVE OF THE PROJECT: Purpose of this Report is to evaluate the impact of the introduction of liquid-based cytology (LBC) in cervical cancer screening in terms of efficacy, undesired effects, costs and implications for organisation. EFFICACY AND UNDESIRED EFFECTS: LBC WITH MANUAL INTERPRETATION: The estimates of cross-sectional accuracy for high-grade intraepithelial neoplasia (CIN2 or more severe and CIN3 or more severe) obtained by a systematic review and meta-analysis published in 2008 were used. This review considered only studies in which all women underwent colposcopy or randomised controlled trials (RCTs) with complete verification of test positives. A systematic search of RCTs published thereafter was performed. Three RCTs were identified. One of these studies was conducted in 6 Italian regions and was of large size (45,174 women randomised); a second one was conducted in another Italian region (Abruzzo) and was of smaller size (8,654 women randomised); a third RCT was conducted in the Netherlands and was of large size (89,784 women randomised). No longitudinal study was available. There is currently no clear evidence that LBC increases the sensitivity of cytology and even less that its introduction increases the efficacy of cervical screening in preventing invasive cancers. The Italian randomised study NTCC showed a decrease in specificity, which was not observed in the other two RCTs available. In addition, the 2008 meta-analysis observed a reduction - even if minimal - in specificity just at the ASC-US cytological cut-off, but also a remarkable heterogeneity between studies. These results suggest that the effect of LBC on specificity is variable and plausibly related to the local style of cytology interpretation. There is evidence that LBC reduces the proportion of unsatisfactory slides, although the size of this effect varies remarkably. LBC WITH COMPUTER-ASSISTED INTERPRETATION: An Australian study, based on double testing, showed a statistically significant increase of the sensitivity for CIN2 or more of LBC with computer-assisted interpretation vs. conventional cytology with manual interpretation. However, an English RCT estimated that LBC with computer-assisted interpretation has a lower sensitivity than LBC with manual interpretation. COST AND ECONOMIC EVALUATION: In the current Italian situation the use of liquid-based cytology for primary screening is estimated to increase the costs of cytological screening. Liquid-based cytology needs shorter time for interpretation than conventional cytology. However, in the Italian situation, savings obtained from this time reduction and from the decreased number of repeats due to unsatisfactory slides are not currently sufficient to compensate the cost increase due to the prices currently applied by producers and to a possible greater number of colposcopies caused by LBC. In any case, at current prices, cost is estimated to increase even when assuming a referral rate to colposcopy with LBC similar or slightly lower than that with conventional cytology. For the costs of computer-assisted interpretation of liquid-based cytology, readers are referred to the relative HTA report (Epidemiol Prev 2012;36(5) Suppl 3:e1-43). ORGANISATIONAL AND ETHICAL ASPECTS: Ethical, legal and communication problems are judged to remain unchanged when compared to screening with conventional cytology. After having used the test for some time, interpreters prefer liquid-based to conventional cytology. 
    Reduced time for interpretation makes the adoption of LBC a possible approach to deal with the shortage of cytology interpreters that is occurring in Italy. However, alternative solutions, such as computer-assisted interpretation of cytology and the use of HPV as the primary screening test, should be considered. Liquid-based cytology also allows molecular tests to be performed, in particular the HPV test. This makes it possible to triage women with borderline or mild cytology by "reflex" molecular or immunocytochemical tests without recalling them. LBC sampling can also be used if HPV is applied as the primary screening test, allowing "reflex" triage of HPV-positive women by cytology without recalling them or taking two samples, one for HPV testing and one for conventional cytology. This represents a remarkable advantage in terms of organization. However, costs are high because only 5-7% of women screened with this approach need interpretation of cytology. In addition, HPV testing with the Hybrid Capture assay on material preserved in LBC transport media needs a preliminary conversion phase, which limits the use of LBC for triaging HPV-positive women. It is advisable that in the near future industry develops sampling/transport systems that allow performing both the HPV test and cytology or other validated triage tests without additional manipulations and at sustainable costs.

  12. HIV cure strategies: how good must they be to improve on current antiretroviral therapy?

    PubMed

    Sax, Paul E; Sypek, Alexis; Berkowitz, Bethany K; Morris, Bethany L; Losina, Elena; Paltiel, A David; Kelly, Kathleen A; Seage, George R; Walensky, Rochelle P; Weinstein, Milton C; Eron, Joseph; Freedberg, Kenneth A

    2014-01-01

    We examined efficacy, toxicity, relapse, cost, and quality-of-life thresholds of hypothetical HIV cure interventions that would make them cost-effective compared to life-long antiretroviral therapy (ART). We used a computer simulation model to assess three HIV cure strategies: Gene Therapy, Chemotherapy, and Stem Cell Transplantation (SCT), each compared to ART. Efficacy and cost parameters were varied widely in sensitivity analysis. Outcomes included quality-adjusted life expectancy, lifetime cost, and cost-effectiveness in dollars/quality-adjusted life year ($/QALY) gained. Strategies were deemed cost-effective with incremental cost-effectiveness ratios <$100,000/QALY. For patients on ART, discounted quality-adjusted life expectancy was 16.4 years and lifetime costs were $591,400. Gene Therapy was cost-effective with efficacy of 10%, relapse rate 0.5%/month, and cost $54,000. Chemotherapy was cost-effective with efficacy of 88%, relapse rate 0.5%/month, and cost $12,400/month for 24 months. At $150,000/procedure, SCT was cost-effective with efficacy of 79% and relapse rate 0.5%/month. Moderate efficacy increases and cost reductions made Gene Therapy cost-saving, but substantial efficacy/cost changes were needed to make Chemotherapy or SCT cost-saving. Depending on efficacy, relapse rate, and cost, cure strategies could be cost-effective compared to current ART and potentially cost-saving. These results may help provide performance targets for developing cure strategies for HIV.
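
    The cost-effectiveness arithmetic behind statements such as "strategies were deemed cost-effective with incremental cost-effectiveness ratios <$100,000/QALY" is simple to reproduce. The sketch below uses the ART comparator values quoted in the abstract ($591,400 and 16.4 QALYs) together with a purely hypothetical cure strategy; the cure cost and QALY figures are illustrative assumptions, not results from the model.

        # Comparator (lifelong ART) values quoted in the abstract.
        art_cost, art_qaly = 591_400.0, 16.4

        # Hypothetical cure strategy, for illustration only (not the paper's numbers).
        cure_cost, cure_qaly = 450_000.0, 17.5

        threshold = 100_000.0  # willingness-to-pay threshold in $/QALY used in the study

        d_cost = cure_cost - art_cost
        d_qaly = cure_qaly - art_qaly

        if d_cost <= 0 and d_qaly >= 0:
            verdict = "cost-saving (dominates ART)"
        elif d_qaly > 0:
            icer = d_cost / d_qaly
            verdict = (f"ICER = ${icer:,.0f}/QALY -> "
                       f"{'cost-effective' if icer < threshold else 'not cost-effective'}")
        else:
            verdict = "dominated by ART (more costly and/or less effective)"

        print(verdict)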

  13. VCSEL-based optical transceiver module for high-speed short-reach interconnect

    NASA Astrophysics Data System (ADS)

    Yagisawa, Takatoshi; Oku, Hideki; Mori, Tatsuhiro; Tsudome, Rie; Tanaka, Kazuhiro; Daikuhara, Osamu; Komiyama, Takeshi; Ide, Satoshi

    2017-02-01

    Interconnects have become increasingly important in high-performance computing systems and high-end servers, alongside improvements in computing capability. Recently, active optical cables (AOCs) have started being used for this purpose instead of conventional copper cables. An AOC dramatically extends the transmission distance of high-speed signals thanks to its broadband characteristics; however, it tends to increase cost. In this paper, we report a quad small form-factor pluggable (QSFP) AOC we developed using cost-effective optical-module technologies. These include a structure that combines a commonly used flexible printed circuit (FPC) with an optical waveguide, enabling low-cost, high-precision assembly with passive alignment; a lens-integrated ferrule that improves productivity by eliminating the polishing process needed for physical contact of a standard PMT connector for the optical waveguide; and an overdrive technology that enables 100 Gb/s (25 Gb/s × 4-channel) operation with a low-cost 14 Gb/s vertical-cavity surface-emitting laser (VCSEL) array. The QSFP AOC demonstrated clear eye opening and error-free operation at 100 Gb/s with a high yield rate, even though the 14 Gb/s VCSEL was used, thanks to the low coupling loss resulting from the high-precision alignment of the optical devices and the overdrive technology.

  14. The role of radiation hard solar cells in minimizing the costs of global satellite communications systems

    NASA Technical Reports Server (NTRS)

    Summers, Geoffrey P.; Walters, Robert J.; Messenger, Scott R.; Burke, Edward A.

    1995-01-01

    An analysis embodied in a PC computer program is presented which quantitatively demonstrates how the availability of radiation-hard solar cells can minimize the cost of a global satellite communication system. The chief distinction between the currently proposed systems, such as Iridium, Odyssey, and Ellipsat, is the number of satellites employed and their operating altitudes. Analysis of the major costs associated with implementing these systems shows that operation within the earth's radiation belts can reduce the total system cost by as much as a factor of two, so long as radiation-hard components, including solar cells, can be used. A detailed evaluation of several types of planar solar cells is given, including commercially available Si and GaAs/Ge cells, and InP/Si cells which are under development. The computer program calculates the end-of-life (EOL) power density of solar arrays taking into account the cell geometry, coverglass thickness, support frame, electrical interconnects, etc. The EOL power density can be determined for any altitude from low earth orbit (LEO) to geosynchronous (GEO) and for equatorial to polar planes of inclination. The mission duration can be varied over the entire range planned for the proposed satellite systems. An algorithm is included in the program for determining the degradation of cell efficiency for different cell technologies due to proton and electron irradiation. The program can be used to determine the optimum configuration for any cell technology for a particular orbit and for a specified mission life. Several examples of applying the program are presented, in which it is shown that the EOL power density of different technologies can vary by an order of magnitude for certain missions. Therefore, although a relatively radiation-soft technology can be made to provide the required EOL power by simply increasing the size of the array, the impact on the total system budget could be unacceptable, due to increased launch and hardware costs. In aggregate these factors can account for more than a 10% increase in the total system cost. Since the estimated total costs of proposed global coverage systems range from $1 billion to $9 billion, the availability of radiation-hard solar cells could make a decisive difference in the selection of a particular constellation architecture.
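
    The efficiency-degradation step described above is often expressed as a semi-logarithmic fit of remaining output versus equivalent fluence. The sketch below uses that generic form, P/P0 = 1 - C*log10(1 + phi/phi_x), with made-up fit constants and a made-up mission fluence, purely to show how an end-of-life to beginning-of-life power ratio would be computed; it is not the algorithm embodied in the program described here.

        import math

        def remaining_power_fraction(fluence, c_fit, phi_x):
            """Generic semi-log degradation fit: P/P0 = 1 - C*log10(1 + phi/phi_x).

            c_fit and phi_x are technology-dependent constants taken from radiation
            test data; the values used below are placeholders, not measured numbers.
            """
            return max(0.0, 1.0 - c_fit * math.log10(1.0 + fluence / phi_x))

        # Hypothetical mission: equivalent 1 MeV electron fluence accumulated over life.
        fluence = 3.0e14          # [e/cm^2], placeholder
        bol_power_density = 250.0  # [W/m^2], placeholder beginning-of-life array output

        frac = remaining_power_fraction(fluence, c_fit=0.2, phi_x=1.0e13)
        print(f"EOL/BOL power ratio: {frac:.2f}")
        print(f"EOL power density:   {bol_power_density * frac:.0f} W/m^2")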

  15. A simplified financial model for automatic meter reading

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ward, S.M.

    1994-01-15

    The financial model proposed here (which can be easily adapted for electric, gas, or water) combines aspects of "life cycle," "consumer value," and "revenue based" approaches and addresses intangible benefits. A simple value tree of one-word descriptions clarifies the relationship between level of investment and level of value, visually relating increased value to increased cost. The model computes the numerical present values of capital costs, recurring costs, and revenue benefits over a 15-year period for the seven configurations: manual reading of existing or replacement standard meters (MMR), manual reading using electronic, hand-held retrievers (EMR), remote reading of inaccessible meters via hard-wired receptacles (RMR), remote reading of meters adapted with pulse generators (RMR-P), remote reading of meters adapted with absolute dial encoders (RMR-E), offsite reading over a few hundred feet with mobile radio (OMR), and fully automatic reading using telephone or an equivalent network (AMR). In the model, of course, the costs of installing the configurations are clearly listed under each column. The model requires only four annualized inputs and seven fixed-cost inputs that are rather easy to obtain.
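
    A minimal sketch, assuming hypothetical cash flows, of the present-value bookkeeping such a model performs: discount the capital cost, recurring costs, and revenue benefits of each configuration over a 15-year period and compare the results. The configuration labels follow the abstract, but every dollar figure and the discount rate are placeholders.

        def npv(rate, cashflows):
            """Present value of a list of yearly cash flows (year 0 = today)."""
            return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

        def config_npv(capital, recurring_per_year, benefit_per_year, years=15, rate=0.08):
            """Net present value of one metering configuration over a 15-year period."""
            flows = [-capital] + [benefit_per_year - recurring_per_year] * years
            return npv(rate, flows)

        # Placeholder per-meter figures for three of the seven configurations named above.
        configs = {
            "MMR (manual reading)":  dict(capital=0.0,   recurring_per_year=12.0, benefit_per_year=0.0),
            "OMR (mobile radio)":    dict(capital=60.0,  recurring_per_year=2.0,  benefit_per_year=6.0),
            "AMR (network reading)": dict(capital=120.0, recurring_per_year=1.0,  benefit_per_year=10.0),
        }

        for name, kwargs in configs.items():
            print(f"{name:23s} 15-year NPV per meter: ${config_npv(**kwargs):8.2f}")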

  16. Advanced vehicles: Costs, energy use, and macroeconomic impacts

    NASA Astrophysics Data System (ADS)

    Wang, Guihua

    Advanced vehicles and alternative fuels could play an important role in reducing oil use and changing the structure of the economy. We developed the Costs for Advanced Vehicles and Energy (CAVE) model to investigate a vehicle portfolio scenario in California during 2010-2030. We then employed a computable general equilibrium model to estimate the macroeconomic impacts of the advanced vehicle scenario on the economy of California. Results indicate that, due to slow fleet turnover, conventional vehicles are expected to continue to dominate the on-road fleet and gasoline to remain the major transportation fuel over the next two decades. However, alternative fuels could play an increasingly important role in gasoline displacement. Advanced vehicle costs are expected to decrease dramatically with production volume and technological progress; e.g., incremental costs for fuel cell vehicles and hydrogen could break even with gasoline savings in 2028. Overall, the vehicle portfolio scenario is estimated to have a slightly negative influence on California's economy, because advanced vehicles are very costly and, therefore, the resulting gasoline savings generally cannot offset the high incremental expenditure on vehicles and alternative fuels. Sensitivity analysis shows that an increase in gasoline price or a drop in alternative fuel prices could offset a portion of the negative impact.

  17. New reflective symmetry design capability in the JPL-IDEAS Structure Optimization Program

    NASA Technical Reports Server (NTRS)

    Strain, D.; Levy, R.

    1986-01-01

    The JPL-IDEAS antenna structure analysis and design optimization computer program was modified to process half structure models of symmetric structures subjected to arbitrary external static loads, synthesize the performance, and optimize the design of the full structure. Significant savings in computation time and cost (more than 50%) were achieved compared to the cost of full model computer runs. The addition of the new reflective symmetry analysis design capabilities to the IDEAS program allows processing of structure models whose size would otherwise prevent automated design optimization. The new program produced synthesized full model iterative design results identical to those of actual full model program executions at substantially reduced cost, time, and computer storage.

  18. The updated algorithm of the Energy Consumption Program (ECP): A computer model simulating heating and cooling energy loads in buildings

    NASA Technical Reports Server (NTRS)

    Lansing, F. L.; Strain, D. M.; Chai, V. W.; Higgins, S.

    1979-01-01

    The Energy Consumption Program (ECP) was developed to simulate building heating and cooling loads and to compute thermal and electric energy consumption and cost. This article reports on the new algorithms and modifications made in an effort to widen the areas of application. The program structure was rewritten accordingly to refine and advance the building model and to further reduce processing time and cost. The program is noted for its very low cost and ease of use compared to other available codes. The accuracy of the computations is not sacrificed, however, since the results are expected to lie within ±10% of actual energy meter readings.

  19. The Quake Catcher Network: Cyberinfrastructure Bringing Seismology into Schools and Homes

    NASA Astrophysics Data System (ADS)

    Lawrence, J. F.; Cochran, E. S.

    2007-12-01

    We propose to implement a high density, low cost strong-motion network for rapid response and early warning by placing sensors in schools, homes, and offices. The Quake Catcher Network (QCN) will employ existing networked laptops and desktops to form the world's largest high-density, distributed computing seismic network. Costs for this network will be minimal because the QCN will use 1) strong motion sensors (accelerometers) already internal to many laptops and 2) nearly identical low-cost universal serial bus (USB) accelerometers for use with desktops. The Berkeley Open Infrastructure for Network Computing (BOINC!) provides a free, proven paradigm for involving the public in large-scale computational research projects. As evidenced by the SETI@home program and others, individuals are especially willing to donate their unused computing power to projects that they deem relevant, worthwhile, and educational. The client- and server-side software will rapidly monitor incoming seismic signals, detect the magnitudes and locations of significant earthquakes, and may even provide early warnings to other computers and users before they can feel the earthquake. The software will provide the client-user with a screen-saver displaying seismic data recorded on their laptop, recently detected earthquakes, and general information about earthquakes and the geosciences. Furthermore, this project will install USB sensors in K-12 classrooms as an educational tool for teaching science. Through a variety of interactive experiments students will learn about earthquakes and the hazards earthquakes pose. For example, students can learn how the vibrations of an earthquake decrease with distance by jumping up and down at increasing distances from the sensor and plotting the decreased amplitude of the seismic signal measured on their computer. We hope to include an audio component so that students can hear and better understand the difference between low and high frequency seismic signals. The QCN will provide a natural way to engage students and the public in earthquake detection and research.

  20. A Systems Engineering Framework for Implementing a Security and Critical Patch Management Process in Diverse Environments (Academic Departments' Workstations)

    NASA Astrophysics Data System (ADS)

    Mohammadi, Hadi

    Use of the Patch Vulnerability Management (PVM) process should be seriously considered for any networked computing system. The PVM process prevents the operating system (OS) and software applications from being attacked due to security vulnerabilities, which lead to system failures and critical data leakage. The purpose of this research is to create and design a Security and Critical Patch Management Process (SCPMP) framework based on Systems Engineering (SE) principles. This framework will assist Information Technology Department Staff (ITDS) to reduce IT operating time and costs and mitigate the risk of security and vulnerability attacks. Further, this study evaluates implementation of the SCPMP in the networked computing systems of an academic environment in order to: 1. Meet patch management requirements by applying SE principles. 2. Reduce the cost of IT operations and PVM cycles. 3. Improve the current PVM methodologies to prevent networked computing systems from becoming the targets of security vulnerability attacks. 4. Embed a Maintenance Optimization Tool (MOT) in the proposed framework. The MOT allows IT managers to make the most practicable choice of methods for deploying and installing released patches and vulnerability remediation. In recent years, there has been a variety of frameworks for security practices in every networked computing system to protect computer workstations from becoming compromised or vulnerable to security attacks, which can expose important information and critical data. I have developed a new mechanism for implementing PVM for maximizing security-vulnerability maintenance, protecting OS and software packages, and minimizing SCPMP cost. To increase computing system security in any diverse environment, particularly in academia, one must apply SCPMP. I propose an optimal maintenance policy that will allow ITDS to measure and estimate the variation of PVM cycles based on their department's requirements. My results demonstrate that MOT optimizes the process of implementing SCPMP in academic workstations.

  1. Scalability of Parallel Spatial Direct Numerical Simulations on Intel Hypercube and IBM SP1 and SP2

    NASA Technical Reports Server (NTRS)

    Joslin, Ronald D.; Hanebutte, Ulf R.; Zubair, Mohammad

    1995-01-01

    The implementation and performance of a parallel spatial direct numerical simulation (PSDNS) approach on the Intel iPSC/860 hypercube and IBM SP1 and SP2 parallel computers are documented. Spatially evolving disturbances associated with the laminar-to-turbulent transition in boundary-layer flows are computed with the PSDNS code. The feasibility of using the PSDNS to perform transition studies on these computers is examined. The results indicate that the PSDNS approach can be parallelized effectively on a distributed-memory parallel machine by remapping the distributed data structure during the course of the calculation. Scalability information is provided to estimate computational costs and to match them against the actual costs as the number of grid points changes. As the number of processors is increased, slower-than-linear speedups are achieved with optimized (machine-dependent library) routines. This slower-than-linear speedup arises because the computational cost is dominated by the FFT routine, which yields less-than-ideal speedups. By using appropriate compile options and optimized library routines on the SP1, the serial code achieves 52-56 Mflops on a single node of the SP1 (45 percent of theoretical peak performance). The actual performance of the PSDNS code on the SP1 is evaluated with a "real world" simulation that consists of 1.7 million grid points. One time step of this simulation is calculated on eight nodes of the SP1 in the same time as required by a Cray Y/MP supercomputer. For the same simulation, 32 nodes of the SP1 and SP2 are required to reach the performance of a Cray C-90. A 32-node SP1 (SP2) configuration is 2.9 (4.6) times faster than a Cray Y/MP for this simulation, while the hypercube is roughly 2 times slower than the Y/MP for this application. KEY WORDS: Spatial direct numerical simulations; incompressible viscous flows; spectral methods; finite differences; parallel computing.
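
    As a side note on the scalability figures quoted above, speedup and parallel efficiency are derived directly from measured wall-clock times. The sketch below shows that bookkeeping with made-up timings; the numbers are not taken from the report.

        # Hypothetical wall-clock times (seconds per time step) for one fixed problem size.
        timings = {1: 512.0, 8: 70.0, 16: 38.0, 32: 21.0}   # processors -> time, placeholders

        t1 = timings[1]
        for p, tp in sorted(timings.items()):
            speedup = t1 / tp                # how much faster than the serial run
            efficiency = speedup / p         # fraction of ideal linear speedup achieved
            print(f"{p:3d} procs: speedup {speedup:5.1f}x, parallel efficiency {efficiency:5.1%}")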

  2. Distributed GPU Computing in GIScience

    NASA Astrophysics Data System (ADS)

    Jiang, Y.; Yang, C.; Huang, Q.; Li, J.; Sun, M.

    2013-12-01

    Geoscientists strive to discover the principles and patterns hidden inside ever-growing Big Data for scientific discoveries. To better achieve this objective, more capable computing resources are required to process, analyze and visualize Big Data (Ferreira et al., 2003; Li et al., 2013). Current CPU-based computing techniques cannot promptly meet the computing challenges posed by the increasing amount of data from different domains, such as social media, earth observation, and environmental sensing (Li et al., 2013). Meanwhile, CPU-based computing resources structured as clusters or supercomputers are costly. In the past several years, with GPU-based technology maturing in both capability and performance, GPU-based computing has emerged as a new computing paradigm. Compared to the traditional microprocessor, the modern GPU, as a compelling alternative, offers outstanding parallel processing capability with cost-effectiveness and efficiency (Owens et al., 2008), although it was initially designed for graphical rendering in the visualization pipeline. This presentation reports a distributed GPU computing framework for integrating GPU-based computing within a distributed environment. Within this framework, 1) for each single computer, both GPU-based and CPU-based computing resources can be fully utilized to improve the performance of visualizing and processing Big Data; 2) within a network environment, a variety of computers can be used to build a virtual supercomputer to support CPU-based and GPU-based computing in a distributed computing environment; and 3) GPUs, as graphics-targeted devices, are used to greatly improve rendering efficiency in distributed geo-visualization, especially for 3D/4D visualization. Key words: Geovisualization, GIScience, Spatiotemporal Studies. References: 1. Ferreira de Oliveira, M. C., & Levkowitz, H. (2003). From visual data exploration to visual data mining: A survey. Visualization and Computer Graphics, IEEE Transactions on, 9(3), 378-394. 2. Li, J., Jiang, Y., Yang, C., Huang, Q., & Rice, M. (2013). Visualizing 3D/4D Environmental Data Using Many-core Graphics Processing Units (GPUs) and Multi-core Central Processing Units (CPUs). Computers & Geosciences, 59(9), 78-89. 3. Owens, J. D., Houston, M., Luebke, D., Green, S., Stone, J. E., & Phillips, J. C. (2008). GPU computing. Proceedings of the IEEE, 96(5), 879-899.

  3. Eleven quick tips for architecting biomedical informatics workflows with cloud computing.

    PubMed

    Cole, Brian S; Moore, Jason H

    2018-03-01

    Cloud computing has revolutionized the development and operations of hardware and software across diverse technological arenas, yet academic biomedical research has lagged behind despite the numerous and weighty advantages that cloud computing offers. Biomedical researchers who embrace cloud computing can reap rewards in cost reduction, decreased development and maintenance workload, increased reproducibility, ease of sharing data and software, enhanced security, horizontal and vertical scalability, high availability, a thriving technology partner ecosystem, and much more. Despite these advantages that cloud-based workflows offer, the majority of scientific software developed in academia does not utilize cloud computing and must be migrated to the cloud by the user. In this article, we present 11 quick tips for architecting biomedical informatics workflows on compute clouds, distilling knowledge gained from experience developing, operating, maintaining, and distributing software and virtualized appliances on the world's largest cloud. Researchers who follow these tips stand to benefit immediately by migrating their workflows to cloud computing and embracing the paradigm of abstraction.

  4. Eleven quick tips for architecting biomedical informatics workflows with cloud computing

    PubMed Central

    Moore, Jason H.

    2018-01-01

    Cloud computing has revolutionized the development and operations of hardware and software across diverse technological arenas, yet academic biomedical research has lagged behind despite the numerous and weighty advantages that cloud computing offers. Biomedical researchers who embrace cloud computing can reap rewards in cost reduction, decreased development and maintenance workload, increased reproducibility, ease of sharing data and software, enhanced security, horizontal and vertical scalability, high availability, a thriving technology partner ecosystem, and much more. Despite these advantages that cloud-based workflows offer, the majority of scientific software developed in academia does not utilize cloud computing and must be migrated to the cloud by the user. In this article, we present 11 quick tips for architecting biomedical informatics workflows on compute clouds, distilling knowledge gained from experience developing, operating, maintaining, and distributing software and virtualized appliances on the world’s largest cloud. Researchers who follow these tips stand to benefit immediately by migrating their workflows to cloud computing and embracing the paradigm of abstraction. PMID:29596416

  5. GPU-accelerated computing for Lagrangian coherent structures of multi-body gravitational regimes

    NASA Astrophysics Data System (ADS)

    Lin, Mingpei; Xu, Ming; Fu, Xiaoyu

    2017-04-01

    Based on a well-established theoretical foundation, Lagrangian Coherent Structures (LCSs) have elicited widespread research on the intrinsic structures of dynamical systems in many fields, including the field of astrodynamics. Although the application of LCSs in dynamical problems seems straightforward theoretically, its associated computational cost is prohibitive. We propose a block decomposition algorithm developed on Compute Unified Device Architecture (CUDA) platform for the computation of the LCSs of multi-body gravitational regimes. In order to take advantage of GPU's outstanding computing properties, such as Shared Memory, Constant Memory, and Zero-Copy, the algorithm utilizes a block decomposition strategy to facilitate computation of finite-time Lyapunov exponent (FTLE) fields of arbitrary size and timespan. Simulation results demonstrate that this GPU-based algorithm can satisfy double-precision accuracy requirements and greatly decrease the time needed to calculate final results, increasing speed by approximately 13 times. Additionally, this algorithm can be generalized to various large-scale computing problems, such as particle filters, constellation design, and Monte-Carlo simulation.
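
    To make the quantity being accelerated concrete, the sketch below computes a small finite-time Lyapunov exponent (FTLE) field for the standard double-gyre test flow using plain serial NumPy; it illustrates what an FTLE field is, not the CUDA block-decomposition algorithm proposed in the paper.

        import numpy as np

        A, EPS, OMEGA = 0.1, 0.25, 2.0 * np.pi / 10.0     # standard double-gyre parameters

        def velocity(x, y, t):
            """Time-dependent double-gyre velocity field on [0,2] x [0,1]."""
            st = EPS * np.sin(OMEGA * t)
            f = st * x**2 + (1.0 - 2.0 * st) * x
            dfdx = 2.0 * st * x + (1.0 - 2.0 * st)
            u = -np.pi * A * np.sin(np.pi * f) * np.cos(np.pi * y)
            v = np.pi * A * np.cos(np.pi * f) * np.sin(np.pi * y) * dfdx
            return u, v

        def flow_map(x0, y0, t0, T, steps=300):
            """Advect a grid of tracers with classical RK4 and return their final positions."""
            x, y, dt = x0.copy(), y0.copy(), T / steps
            for i in range(steps):
                t = t0 + i * dt
                k1u, k1v = velocity(x, y, t)
                k2u, k2v = velocity(x + 0.5 * dt * k1u, y + 0.5 * dt * k1v, t + 0.5 * dt)
                k3u, k3v = velocity(x + 0.5 * dt * k2u, y + 0.5 * dt * k2v, t + 0.5 * dt)
                k4u, k4v = velocity(x + dt * k3u, y + dt * k3v, t + dt)
                x = x + dt / 6.0 * (k1u + 2.0 * k2u + 2.0 * k3u + k4u)
                y = y + dt / 6.0 * (k1v + 2.0 * k2v + 2.0 * k3v + k4v)
            return x, y

        nx, ny, T = 201, 101, 15.0
        X, Y = np.meshgrid(np.linspace(0.0, 2.0, nx), np.linspace(0.0, 1.0, ny))
        XT, YT = flow_map(X, Y, t0=0.0, T=T)

        # Flow-map gradient -> right Cauchy-Green tensor -> FTLE = ln(sqrt(lambda_max)) / |T|.
        dx, dy = 2.0 / (nx - 1), 1.0 / (ny - 1)
        dXT_dy, dXT_dx = np.gradient(XT, dy, dx)
        dYT_dy, dYT_dx = np.gradient(YT, dy, dx)
        c11 = dXT_dx**2 + dYT_dx**2
        c12 = dXT_dx * dXT_dy + dYT_dx * dYT_dy
        c22 = dXT_dy**2 + dYT_dy**2
        trace, det = c11 + c22, c11 * c22 - c12**2
        lam_max = 0.5 * (trace + np.sqrt(np.maximum(trace**2 - 4.0 * det, 0.0)))
        ftle = np.log(np.sqrt(np.maximum(lam_max, 1e-300))) / abs(T)

        print("FTLE field shape:", ftle.shape, " max FTLE:", float(ftle.max()))

    Every grid point is independent of the others, which is exactly why the computation maps well onto a GPU; the paper's contribution is how to decompose such grids into blocks so that fields of arbitrary size and timespan fit the device memory.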

  6. A Mass Computation Model for Lightweight Brayton Cycle Regenerator Heat Exchangers

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert J.

    2010-01-01

    Based on a theoretical analysis of convective heat transfer across large internal surface areas, this paper discusses the design implications for generating lightweight gas-gas heat exchanger designs by packaging such areas into compact three-dimensional shapes. Allowances are made for hot and cold inlet and outlet headers for assembly of completed regenerator (or recuperator) heat exchanger units into closed cycle gas turbine flow ducting. Surface area and resulting volume and mass requirements are computed for a range of heat exchanger effectiveness values and internal heat transfer coefficients. Benefit cost curves show the effect of increasing heat exchanger effectiveness on Brayton cycle thermodynamic efficiency on the plus side, while also illustrating the cost in heat exchanger required surface area, volume, and mass requirements as effectiveness is increased. The equations derived for counterflow and crossflow configurations show that as effectiveness values approach unity, or 100 percent, the required surface area, and hence heat exchanger volume and mass tend toward infinity, since the implication is that heat is transferred at a zero temperature difference. To verify the dimensional accuracy of the regenerator mass computational procedure, calculation of a regenerator specific mass, that is, heat exchanger weight per unit working fluid mass flow, is performed in both English and SI units. Identical numerical values for the specific mass parameter, whether expressed in lb/(lb/sec) or kg/(kg/sec), show the dimensional consistency of overall results.
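
    The blow-up in required area as effectiveness approaches unity can be seen directly from the standard counterflow effectiveness-NTU relation for balanced capacity rates, NTU = eps/(1 - eps). The sketch below inverts that relation and converts NTU to surface area for placeholder values of the overall heat-transfer coefficient and minimum capacity rate; the numbers are illustrative, not taken from the paper.

        # Counterflow regenerator with balanced capacity rates (Cr = 1):
        #   effectiveness = NTU / (1 + NTU)   =>   NTU = eps / (1 - eps)
        # Required area follows from NTU = U * A / C_min.

        U = 120.0       # overall heat-transfer coefficient [W/m^2-K], placeholder
        C_min = 500.0   # minimum capacity rate [W/K], placeholder

        def required_area(effectiveness):
            ntu = effectiveness / (1.0 - effectiveness)
            return ntu * C_min / U

        for eps in (0.80, 0.90, 0.95, 0.99, 0.999):
            print(f"effectiveness {eps:5.3f} -> NTU {eps / (1 - eps):7.1f}, "
                  f"area {required_area(eps):9.1f} m^2")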

  7. A Mass Computation Model for Lightweight Brayton Cycle Regenerator Heat Exchangers

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert J.

    2010-01-01

    Based on a theoretical analysis of convective heat transfer across large internal surface areas, this paper discusses the design implications for generating lightweight gas-gas heat exchanger designs by packaging such areas into compact three-dimensional shapes. Allowances are made for hot and cold inlet and outlet headers for assembly of completed regenerator (or recuperator) heat exchanger units into closed cycle gas turbine flow ducting. Surface area and resulting volume and mass requirements are computed for a range of heat exchanger effectiveness values and internal heat transfer coefficients. Benefit cost curves show the effect of increasing heat exchanger effectiveness on Brayton cycle thermodynamic efficiency on the plus side, while also illustrating the cost in heat exchanger required surface area, volume, and mass requirements as effectiveness is increased. The equations derived for counterflow and crossflow configurations show that as effectiveness values approach unity, or 100 percent, the required surface area, and hence heat exchanger volume and mass tend toward infinity, since the implication is that heat is transferred at a zero temperature difference. To verify the dimensional accuracy of the regenerator mass computational procedure, calculation of a regenerator specific mass, that is, heat exchanger weight per unit working fluid mass flow, is performed in both English and SI units. Identical numerical values for the specific mass parameter, whether expressed in lb/(lb/sec) or kg/ (kg/sec), show the dimensional consistency of overall results.

  8. 14 CFR 152.319 - Monitoring and reporting of program performance.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... established for the period, made, if applicable, on a quantitative basis related to cost data for computation... established for the period, made, if applicable, on a quantitative basis related to costs for computation of...

  9. 14 CFR 152.319 - Monitoring and reporting of program performance.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... established for the period, made, if applicable, on a quantitative basis related to cost data for computation... established for the period, made, if applicable, on a quantitative basis related to costs for computation of...

  10. 14 CFR 152.319 - Monitoring and reporting of program performance.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... established for the period, made, if applicable, on a quantitative basis related to cost data for computation... established for the period, made, if applicable, on a quantitative basis related to costs for computation of...

  11. 14 CFR 152.319 - Monitoring and reporting of program performance.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... established for the period, made, if applicable, on a quantitative basis related to cost data for computation... established for the period, made, if applicable, on a quantitative basis related to costs for computation of...

  12. 14 CFR 152.319 - Monitoring and reporting of program performance.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... established for the period, made, if applicable, on a quantitative basis related to cost data for computation... established for the period, made, if applicable, on a quantitative basis related to costs for computation of...

  13. Cost-effectiveness of Standard vs. a Navigated Intervention on Colorectal Cancer Screening Use in Primary Care

    PubMed Central

    Lairson, David; DiCarlo, Melissa; Deshmuk, Ashish A.; Fagan, Heather B.; Sifri, Randa; Katurakes, Nora; Cocroft, James; Sendecki, Jocelyn; Swan, Heidi; Vernon, Sally W.; Myers, Ronald E.

    2014-01-01

    Background: Colorectal cancer (CRC) screening is cost-effective but underutilized. This study aimed to determine the cost-effectiveness of a mailed standard intervention (SI) and a tailored navigation intervention (TNI) to increase CRC screening use in the context of a randomized trial among primary care patients. Methods: Participants (n=945) were randomized to either a usual-care Control Group (n=317), SI Group (n=316), or TNI Group (n=312). The SI Group was sent both colonoscopy instructions and stool blood tests irrespective of baseline preference. TNI Group participants were sent instructions for scheduling a colonoscopy, a stool blood test, or both based on their test preference as determined at baseline, and then received a navigation telephone call. Activity cost estimation was used to determine the cost of each intervention and compute incremental cost-effectiveness ratios. Statistical uncertainty within the base case was assessed with 95 percent confidence intervals derived from net benefit regression analysis. Effects of uncertain parameters such as the cost of planning, training, and involvement of those receiving "investigator salaries" were assessed with sensitivity analyses. Results: Program costs of the SI were $167 per participant. Average cost of the TNI was $289 per participant. Conclusion: The TNI was more effective than the SI, but substantially increased the cost per additional person screened. Decision-makers need to consider the cost structure, the level of planning and training required to implement these two intervention strategies, and their willingness to pay for additional persons screened, to determine whether tailored navigation would be justified and feasible. PMID:24435411
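
    For readers who want the arithmetic behind "cost per additional person screened", the sketch below combines the per-participant program costs reported in the abstract ($167 for SI, $289 for TNI) with hypothetical screening proportions (the trial's actual screening rates are not quoted in this abstract) to compute an incremental cost-effectiveness ratio and a net monetary benefit at several willingness-to-pay levels.

        # Program costs per participant, from the abstract.
        cost_si, cost_tni = 167.0, 289.0

        # Hypothetical screening proportions, for illustration only (not the trial's results).
        screen_si, screen_tni = 0.33, 0.43

        d_cost = cost_tni - cost_si
        d_effect = screen_tni - screen_si          # additional persons screened per participant

        icer = d_cost / d_effect                   # $ per additional person screened
        print(f"incremental cost per additional person screened: ${icer:,.0f}")

        # Net monetary benefit at a decision-maker's willingness-to-pay (WTP) per person screened.
        for wtp in (500.0, 1000.0, 2000.0):
            nmb = wtp * d_effect - d_cost
            print(f"WTP ${wtp:>6,.0f}: net monetary benefit per participant = ${nmb:+.2f}")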

  14. Spaceborne Processor Array

    NASA Technical Reports Server (NTRS)

    Chow, Edward T.; Schatzel, Donald V.; Whitaker, William D.; Sterling, Thomas

    2008-01-01

    A Spaceborne Processor Array in Multifunctional Structure (SPAMS) can lower the total mass of the electronic and structural overhead of spacecraft, resulting in reduced launch costs, while increasing the science return through dynamic onboard computing. SPAMS integrates the multifunctional structure (MFS) and the Gilgamesh Memory, Intelligence, and Network Device (MIND) multi-core in-memory computer architecture into a single-system super-architecture. This transforms every inch of a spacecraft into a sharable, interconnected, smart computing element to increase computing performance while simultaneously reducing mass. The MIND in-memory architecture provides a foundation for high-performance, low-power, and fault-tolerant computing. The MIND chip has an internal structure that includes memory, processing, and communication functionality. The Gilgamesh is a scalable system comprising multiple MIND chips interconnected to operate as a single, tightly coupled, parallel computer. The array of MIND components shares a global, virtual name space for program variables and tasks that are allocated at run time to the distributed physical memory and processing resources. Individual processor- memory nodes can be activated or powered down at run time to provide active power management and to configure around faults. A SPAMS system is comprised of a distributed Gilgamesh array built into MFS, interfaces into instrument and communication subsystems, a mass storage interface, and a radiation-hardened flight computer.

  15. Pharmacy costs associated with nonformulary drug requests.

    PubMed

    Sweet, B V; Stevenson, J G

    2001-09-15

    Pharmacy costs associated with handling nonformulary drug requests were studied. Data for all nonformulary drug orders received at a university hospital between August 1 and October 31, 1999, were evaluated to determine their outcome and the cost differential between the nonformulary drug and formulary alternative. Two sets of data were used to analyze medication costs: data from nonformulary medication request forms, which allowed the cost of nonformulary drugs and their formulary alternatives to be calculated, and data from the pharmacy computer system, which enabled actual nonformulary drug use to be captured. Labor costs associated with processing these requests were determined through time analysis, which included the potential for orders to be received at different times of the day and with different levels of technician and pharmacist support. Economic analysis revealed that the greatest cost saving occurred when converting nonformulary injectable products to formulary alternatives. Interventions were least costly during normal business hours, when all the satellite pharmacies were open and fully staffed. Pharmacists' interventions in oral product orders resulted in a net increase in expenditures. Incremental pharmacy costs associated with processing nonformulary medication requests in an inpatient setting are greater than the drug acquisition cost saving for most agents, particularly oral medications.

  16. [Orthogonal Vector Projection Algorithm for Spectral Unmixing].

    PubMed

    Song, Mei-ping; Xu, Xing-wei; Chang, Chein-I; An, Ju-bai; Yao, Li

    2015-12-01

    Spectral unmixing is an important part of hyperspectral technology and is essential for quantitative material analysis in hyperspectral imagery. Most linear unmixing algorithms require matrix multiplication together with matrix inversion or matrix determinants. These operations are difficult to program and especially hard to realize in hardware, and their computational cost increases significantly as the number of endmembers grows. Here, based on the traditional Orthogonal Subspace Projection algorithm, a new method called Orthogonal Vector Projection is proposed using the orthogonality principle. It simplifies the process by avoiding matrix multiplication and inversion. It first computes, via a Gram-Schmidt process, a final orthogonal vector for each endmember spectrum. These orthogonal vectors are then used as projection vectors for the pixel signature. The unconstrained abundance can be obtained directly by projecting the signature onto the projection vectors and computing the ratio of the projected vector length to the orthogonal vector length. Compared to the Orthogonal Subspace Projection and Least Squares Error algorithms, this method does not need matrix inversion, which is computationally costly and hard to implement in hardware. It completes the orthogonalization through repeated vector operations, making it easy to apply in both parallel computation and hardware. The soundness of the algorithm is demonstrated through its relationship to the Orthogonal Subspace Projection and Least Squares Error algorithms. Its computational complexity is also compared with that of the other two algorithms and is the lowest of the three. Finally, experimental results on synthetic and real images are provided, giving further evidence of the method's effectiveness.
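
    A minimal NumPy sketch of the procedure as described in this abstract, assuming a small synthetic endmember matrix: each endmember's component orthogonal to the span of the other endmembers is built with repeated Gram-Schmidt vector operations, and the unconstrained abundance is the ratio of the pixel's projection onto that orthogonal vector to the endmember's own projection onto it. This follows the abstract's description rather than the authors' exact implementation.

        import numpy as np

        rng = np.random.default_rng(2)

        def orthogonal_vectors(M):
            """For each endmember (column of M), return its component orthogonal to the others,
            built with repeated vector operations (Gram-Schmidt) rather than matrix inversion."""
            n_bands, p = M.shape
            V = np.empty((n_bands, p))
            for i in range(p):
                basis = []
                for j in range(p):
                    if j == i:
                        continue
                    u = M[:, j].astype(float).copy()
                    for b in basis:                     # orthonormalize the other endmembers
                        u -= (b @ u) * b
                    basis.append(u / np.linalg.norm(u))
                v = M[:, i].astype(float).copy()
                for b in basis:                         # remove the span of the other endmembers
                    v -= (b @ v) * b
                V[:, i] = v
            return V

        def ovp_abundances(M, r):
            """Unconstrained abundances: ratio of the pixel's projection onto each orthogonal
            vector to the corresponding endmember's own projection onto it."""
            V = orthogonal_vectors(M)
            return np.array([(V[:, i] @ r) / (V[:, i] @ M[:, i]) for i in range(M.shape[1])])

        # Synthetic example: a 50-band pixel mixed from 3 endmembers plus a little noise.
        bands, p = 50, 3
        M = np.abs(rng.standard_normal((bands, p)))          # endmember spectra as columns
        a_true = np.array([0.6, 0.3, 0.1])
        r = M @ a_true + 0.001 * rng.standard_normal(bands)  # observed pixel signature

        print("true abundances:     ", a_true)
        print("estimated abundances:", np.round(ovp_abundances(M, r), 3))

    Because the orthogonal vector for each endmember is annihilated by the others, the ratio isolates that endmember's contribution; the estimate coincides with the unconstrained least-squares solution but uses only vector products.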

  17. 12 CFR Appendix L to Part 226 - Assumed Loan Periods for Computations of Total Annual Loan Cost Rates

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Annual Loan Cost Rates L Appendix L to Part 226 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM TRUTH IN LENDING (REGULATION Z) Pt. 226, App. L Appendix L to Part 226—Assumed Loan Periods for Computations of Total Annual Loan Cost Rates (a) Required...

  18. 26 CFR 1.611-2 - Rules applicable to mines, oil and gas wells, and other natural deposits.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Rules applicable to mines, oil and gas wells, and other natural deposits. (a) Computation of cost depletion of mines, oil and gas wells, and other natural deposits. (1) The basis upon which cost depletion... for the taxable year, the cost depletion for that year shall be computed by dividing such amount by...

  19. Development of instructional, interactive, multimedia anatomy dissection software: a student-led initiative.

    PubMed

    Inwood, Matthew J; Ahmad, Jamil

    2005-11-01

    Although dissection provides an unparalleled means of teaching gross anatomy, it constitutes a significant logistical and financial investment for educational institutions. The increasing availability and waning cost of computer equipment has enabled many institutions to supplement their anatomy curriculum with Computer Aided Learning (CAL) software. At the Royal College of Surgeons in Ireland, two undergraduate medical students designed and produced instructional anatomy dissection software for use by first and second year medical students. The software consists of full-motion, narrated, QuickTime MPG movies presented in a Macromedia environment. Forty-four movies, between 1-11 min in duration, were produced. Each movie corresponds to a dissection class and precisely demonstrates the dissection and educational objectives for that class. The software is distributed to students free of charge and they are encouraged to install it on their Apple iBook computers. Results of a student evaluation indicated that the software was useful, easy to use, and improved the students' experience in the dissection classes. The evaluation also indicated that only a minority of students regularly used the software or had it installed on their laptop computers. Accordingly, effort should also be directed toward making the software more accessible and increasing students' comfort and familiarity with novel instructional media. The successful design and implementation of this software demonstrates that CAL software can be employed to augment, enhance and improve anatomy instruction. In addition, effective, high quality, instructional multimedia software can be tailored to an educational institution's requirements and produced by novice programmers at minimal cost. Copyright 2005 Wiley-Liss, Inc

  20. A Scalable, Out-of-Band Diagnostics Architecture for International Space Station Systems Support

    NASA Technical Reports Server (NTRS)

    Fletcher, Daryl P.; Alena, Rick; Clancy, Daniel (Technical Monitor)

    2002-01-01

    The computational infrastructure of the International Space Station (ISS) is a dynamic system that supports multiple vehicle subsystems such as Caution and Warning, Electrical Power Systems, and Command and Data Handling (C&DH), as well as scientific payloads of varying size and complexity. The dynamic nature of the ISS configuration, coupled with the increased demand for payload support, places a significant burden on the inherently resource-constrained computational infrastructure of the ISS. Onboard system diagnostics applications are hosted on computers that are elements of the avionics network, while ground-based diagnostic applications receive only a subset of available telemetry, down-linked via S-band communications. In this paper we propose a scalable, out-of-band diagnostics architecture for ISS systems support that uses a read-only connection for C&DH data acquisition, which provides a lower cost of deployment and maintenance (versus a higher-criticality read-write connection). The diagnostics processing burden is off-loaded from the avionics network to elements of the on-board LAN that have a lower overall cost of operation and increased computational capacity. A superset of diagnostic data, richer in content than the configured telemetry, is made available to Advanced Diagnostic System (ADS) clients running on wireless handheld devices, affording the crew greater mobility for troubleshooting and providing improved insight into vehicle state. The superset of diagnostic data is made available to the ground in near real-time via an out-of-band downlink, providing a high level of fidelity between vehicle state and test, training, and operational facilities on the ground.
