Synthetic mixed-signal computation in living cells
Rubens, Jacob R.; Selvaggio, Gianluca; Lu, Timothy K.
2016-01-01
Living cells implement complex computations on the continuous environmental signals that they encounter. These computations involve both analogue- and digital-like processing of signals to give rise to complex developmental programs, context-dependent behaviours and homeostatic activities. In contrast to natural biological systems, synthetic biological systems have largely focused on either digital or analogue computation separately. Here we integrate analogue and digital computation to implement complex hybrid synthetic genetic programs in living cells. We present a framework for building comparator gene circuits to digitize analogue inputs based on different thresholds. We then demonstrate that comparators can be predictably composed together to build band-pass filters, ternary logic systems and multi-level analogue-to-digital converters. In addition, we interface these analogue-to-digital circuits with other digital gene circuits to enable concentration-dependent logic. We expect that this hybrid computational paradigm will enable new industrial, diagnostic and therapeutic applications with engineered cells. PMID:27255669
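For illustration, the comparator-to-ADC composition described above can be caricatured numerically. The sketch below is a toy Python model, not the authors' genetic implementation; the thresholds, inducer range, and thermometer-code readout are illustrative assumptions.

```python
# Toy numerical sketch (not the authors' genetic circuits): composing ideal
# threshold comparators into a band-pass response and a multi-level
# analog-to-digital converter. Thresholds and input range are assumptions.
import numpy as np

def comparator(signal, threshold):
    """Digitize an analog input: 1 if the signal exceeds the threshold, else 0."""
    return (signal > threshold).astype(int)

inducer = np.logspace(-2, 2, 200)          # arbitrary analog input range

# Band-pass behaviour: ON only between a low and a high threshold.
low, high = 0.1, 10.0
band_pass = comparator(inducer, low) & (1 - comparator(inducer, high))

# Multi-level ADC: three comparators with increasing thresholds give a
# thermometer code that maps the analog input to levels 0..3.
thresholds = [0.1, 1.0, 10.0]
levels = sum(comparator(inducer, t) for t in thresholds)

print(levels[::50])   # coarse digital readout across the input range
```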
ERIC Educational Resources Information Center
Anderson, Greg; And Others
1996-01-01
Describes the Computer Science Technical Report Project, one of the earliest investigations into the system engineering of digital libraries which pioneered multiinstitutional collaborative research into technical, social, and legal issues related to the development and implementation of a large, heterogeneous, distributed digital library. (LRW)
Digital Maps, Matrices and Computer Algebra
ERIC Educational Resources Information Center
Knight, D. G.
2005-01-01
The way in which computer algebra systems, such as Maple, have made the study of complex problems accessible to undergraduate mathematicians with modest computational skills is illustrated by some large matrix calculations, which arise from representing the Earth's surface by digital elevation models. Such problems are often considered to lie in…
YF-12 cooperative airframe/propulsion control system program, volume 1
NASA Technical Reports Server (NTRS)
Anderson, D. L.; Connolly, G. F.; Mauro, F. M.; Reukauf, P. J.; Marks, R. (Editor)
1980-01-01
Several YF-12C airplane analog control systems were converted to a digital system. Included were the air data computer, autopilot, inlet control system, and autothrottle systems. This conversion was performed to allow assessment of digital technology applications to supersonic cruise aircraft. The digital system was composed of a digital computer and specialized interface unit. A large scale mathematical simulation of the airplane was used for integration testing and software checkout.
NASA Astrophysics Data System (ADS)
Grubert, Emily; Siders, Anne
2016-09-01
Digitally-aided reviews of large bodies of text-based information, such as academic literature, are growing in capability but are not yet common in environmental fields. Environmental sciences and studies can benefit from application of digital tools to create comprehensive, replicable, interdisciplinary reviews that provide rapid, up-to-date, and policy-relevant reports of existing work. This work reviews the potential for applications of computational text mining and analysis tools originating in the humanities to environmental science and policy questions. Two process-oriented case studies of digitally-aided environmental literature reviews and meta-analyses illustrate potential benefits and limitations. A medium-sized, medium-resolution review (∼8000 journal abstracts and titles) focuses on topic modeling as a rapid way to identify thematic changes over time. A small, high-resolution review (∼300 full text journal articles) combines collocation and network analysis with manual coding to synthesize and question empirical field work. We note that even small digitally-aided analyses are close to the upper limit of what can be done manually. Established computational methods developed in humanities disciplines and refined by humanities and social science scholars to interrogate large bodies of textual data are applicable and useful in environmental sciences but have not yet been widely applied. Two case studies provide evidence that digital tools can enhance insight. Two major conclusions emerge. First, digital tools enable scholars to engage large literatures rapidly and, in some cases, more comprehensively than is possible manually. Digital tools can confirm manually identified patterns or identify additional patterns visible only at a large scale. Second, digital tools allow for more replicable and transparent conclusions to be drawn from literature reviews and meta-analyses. The methodological subfields of digital humanities and computational social sciences will likely continue to create innovative tools for analyzing large bodies of text, providing opportunities for interdisciplinary collaboration with the environmental fields.
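As a hedged illustration of the topic-modeling pass described in the first case study, the sketch below runs latent Dirichlet allocation over a handful of stand-in abstracts; scikit-learn is an assumed dependency and the `abstracts` list is hypothetical, not the ~8000-abstract corpus.

```python
# Minimal sketch of a topic-modeling pass over abstracts, in the spirit of the
# medium-resolution case study. scikit-learn is an assumed dependency; the
# `abstracts` list is a tiny stand-in for the corpus described above.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

abstracts = [
    "carbon emissions policy energy transition",
    "wetland hydrology restoration monitoring",
    "energy policy renewable subsidies markets",
    "wetland species habitat field survey",
]

vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(abstracts)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(counts)

terms = vectorizer.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[-4:][::-1]]
    print(f"topic {k}: {', '.join(top)}")
```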
Sandgren, Buster; Crafoord, Joakim; Garellick, Göran; Carlsson, Lars; Weidenhielm, Lars; Olivecrona, Henrik
2013-10-01
Digital radiographic images in the anterior-posterior and lateral view have been the gold standard for evaluation of peri-acetabular osteolysis for patients with an uncemented hip replacement. We compared digital radiographic images and computed tomography in the detection of peri-acetabular osteolysis and devised a classification system based on computed tomography. Digital radiographs were compared with computed tomography in 206 hips, with a mean follow-up of 10 years after surgery. The patients had no clinical signs of osteolysis and none were planned for revision surgery. On digital radiographs, 192 cases had no osteolysis and only 14 cases had osteolysis. When using computed tomography, there were 184 cases showing small or large osteolysis and only 22 patients had no osteolysis. A classification system for peri-acetabular osteolysis is proposed based on computed tomography that is easy to use at standard follow-up evaluation. Copyright © 2013 Elsevier Inc. All rights reserved.
Study of a hybrid multispectral processor
NASA Technical Reports Server (NTRS)
Marshall, R. E.; Kriegler, F. J.
1973-01-01
A hybrid processor is described offering enough handling capacity and speed to process efficiently the large quantities of multispectral data that can be gathered by scanner systems such as MSDS, SKYLAB, ERTS, and ERIM M-7. Combinations of general-purpose and special-purpose hybrid computers were examined to include both analog and digital types as well as all-digital configurations. The current trend toward lower costs for medium-scale digital circuitry suggests that the all-digital approach may offer the better solution within the time frame of the next few years. The study recommends and defines such a hybrid digital computing system in which both special-purpose and general-purpose digital computers would be employed. The tasks of recognizing surface objects would be performed in a parallel, pipeline digital system while the tasks of control and monitoring would be handled by a medium-scale minicomputer system. A program to design and construct a small, prototype, all-digital system has been started.
MDA-image: an environment of networked desktop computers for teleradiology/pathology.
Moffitt, M E; Richli, W R; Carrasco, C H; Wallace, S; Zimmerman, S O; Ayala, A G; Benjamin, R S; Chee, S; Wood, P; Daniels, P
1991-04-01
MDA-Image, a project of The University of Texas M. D. Anderson Cancer Center, is an environment of networked desktop computers for teleradiology/pathology. Radiographic film is digitized with a film scanner and histopathologic slides are digitized using a red, green, and blue (RGB) video camera connected to a microscope. Digitized images are stored on a data server connected to the institution's computer communication network (Ethernet) and can be displayed from authorized desktop computers connected to Ethernet. Images are digitized for cases presented at the Bone Tumor Management Conference, a multidisciplinary conference in which treatment options are discussed among clinicians, surgeons, radiologists, pathologists, radiotherapists, and medical oncologists. These radiographic and histologic images are shown on a large screen computer monitor during the conference. They are available for later review for follow-up or representation.
Digital Image Access & Retrieval.
ERIC Educational Resources Information Center
Heidorn, P. Bryan, Ed.; Sandore, Beth, Ed.
Recent technological advances in computing and digital imaging technology have had immediate and permanent consequences for visual resource collections. Libraries are involved in organizing and managing large visual resource collections. The central challenges in working with digital image collections mirror those that libraries have sought to…
Issues for bringing digital libraries into public use
NASA Technical Reports Server (NTRS)
Flater, David W.; Yesha, Yelena
1993-01-01
In much the same way that the field of artificial intelligence produced a cult which fervently believed that computers would soon think like human beings, the existence of electronic books has resurrected the paperless society as a utopian vision to some, an apocalyptic horror to others. In this essay we have attempted to provide realistic notions of what digital libraries are likely to become if they are a popular success. E-books are capable of subsuming most of the media we use today and have the potential for added functionality by being interactive. The environmental impact of having millions more computers will be offset to some degree, perhaps even exceeded, by the fact that televisions, stereos, VCR's, CD players, newspapers, magazines, and books will become part of the computer system or be made redundant. On the whole, large-scale use of digital libraries is likely to be a winning proposition. Whether or not this comes to pass depends on the directions taken by today's researchers and software developers. By involving the public, the effort being put into digital libraries can be leveraged into something which is big enough to make a real change for the better. If digital libraries remain the exclusive property of government, universities, and large research firms, then large parts of the world will remain without digital libraries for years to come, just as they have remained without digital phone service for far too long. If software companies try to scuttle the project by patenting crucial algorithms and using proprietary data formats, all of us will suffer. Let us reverse the errors of the past and create a truly open digital library system.
NASA Technical Reports Server (NTRS)
1973-01-01
Techniques are considered which would be used to characterize aerospace computers with the space shuttle application as end usage. The system level digital problems which have been encountered and documented are surveyed. From the large cross section of tests, an optimum set is recommended that has a high probability of discovering documented system level digital problems within laboratory environments. A baseline hardware/software system, required as a laboratory tool to test aerospace computers, is defined. Hardware and software baselines and additions necessary to interface the UTE to aerospace computers for test purposes are outlined.
Fly-by-Wire Systems Enable Safer, More Efficient Flight
NASA Technical Reports Server (NTRS)
2012-01-01
Using the ultra-reliable Apollo Guidance Computer that enabled the Apollo Moon missions, Dryden Flight Research Center engineers, in partnership with industry leaders such as Cambridge, Massachusetts-based Draper Laboratory, demonstrated that digital computers could be used to fly aircraft. Digital fly-by-wire systems have since been incorporated into large airliners, military jets, revolutionary new aircraft, and even cars and submarines.
ERIC Educational Resources Information Center
Mikelic Preradovic, Nives; Lešin, Gordana; Šagud, Mirjana
2016-01-01
The aim of this study is to investigate perceptions of parents in Croatia towards advantages and disadvantages of computer use in general as well as their children's computer use and to reveal parents' concerns and opinions about digital technology (DT) education in kindergarten. The paper reports on research findings from one of the large public…
Storage and distribution of pathology digital images using integrated web-based viewing systems.
Marchevsky, Alberto M; Dulbandzhyan, Ronda; Seely, Kevin; Carey, Steve; Duncan, Raymond G
2002-05-01
Health care providers have expressed increasing interest in incorporating digital images of gross pathology specimens and photomicrographs in routine pathology reports. This report describes the multiple technical and logistical challenges involved in the integration of the various components needed for the development of a system for integrated Web-based viewing, storage, and distribution of digital images in a large health system. An Oracle version 8.1.6 database was developed to store, index, and deploy pathology digital photographs via our Intranet. The database allows for retrieval of images by patient demographics or by SNOMED code information. The setting is the Intranet of a large health system, accessible from multiple computers located within the medical center and at distant private physician offices. The images can be viewed using any of the workstations of the health system that have authorized access to our Intranet, using a standard browser or a browser configured with an external viewer or inexpensive plug-in software, such as Prizm 2.0. The images can be printed on paper or transferred to film using a digital film recorder. Digital images can also be displayed at pathology conferences by using wireless local area network (LAN) and secure remote technologies. The standardization of technologies and the adoption of a Web interface for all our computer systems allows us to distribute digital images from a pathology database to a potentially large group of users distributed in multiple locations throughout a large medical center.
Computers in Electrical Engineering Education at Virginia Polytechnic Institute.
ERIC Educational Resources Information Center
Bennett, A. Wayne
1982-01-01
Discusses use of computers in Electrical Engineering (EE) at Virginia Polytechnic Institute. Topics include: departmental background, level of computing power using large scale systems, mini and microcomputers, use of digital logic trainers and analog/hybrid computers, comments on integrating computers into EE curricula, and computer use in…
Low-cost space-varying FIR filter architecture for computational imaging systems
NASA Astrophysics Data System (ADS)
Feng, Guotong; Shoaib, Mohammed; Schwartz, Edward L.; Dirk Robinson, M.
2010-01-01
Recent research demonstrates the advantage of designing electro-optical imaging systems by jointly optimizing the optical and digital subsystems. The optical systems designed using this joint approach intentionally introduce large and often space-varying optical aberrations that produce blurry optical images. Digital sharpening restores reduced contrast due to these intentional optical aberrations. Computational imaging systems designed in this fashion have several advantages including extended depth-of-field, lower system costs, and improved low-light performance. Currently, most consumer imaging systems lack the necessary computational resources to compensate for these optical systems with large aberrations in the digital processor. Hence, the exploitation of the advantages of the jointly designed computational imaging system requires low-complexity algorithms enabling space-varying sharpening. In this paper, we describe a low-cost algorithmic framework and associated hardware enabling the space-varying finite impulse response (FIR) sharpening required to restore largely aberrated optical images. Our framework leverages the space-varying properties of optical images formed using rotationally-symmetric optical lens elements. First, we describe an approach to leverage the rotational symmetry of the point spread function (PSF) about the optical axis allowing computational savings. Second, we employ a specially designed bank of sharpening filters tuned to the specific radial variation common to optical aberrations. We evaluate the computational efficiency and image quality achieved by using this low-cost space-varying FIR filter architecture.
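The radial filter-bank idea can be sketched as follows. This is an illustrative Python/NumPy toy, not the paper's hardware architecture; the zone radii, kernel strengths, and 3x3 unsharp kernels are assumptions.

```python
# Sketch of the radial filter-bank idea above: partition the image plane into
# annular zones by distance from the optical axis and apply a different small
# FIR sharpening kernel in each zone. Kernels and zone radii are illustrative;
# convolving the whole image per zone is wasteful but keeps the sketch short.
import numpy as np
from scipy.ndimage import convolve

def unsharp_kernel(strength):
    """3x3 sharpening kernel; larger strength compensates stronger blur."""
    k = -strength * np.ones((3, 3)) / 8.0
    k[1, 1] = 1.0 + strength
    return k

def radial_sharpen(image, radii=(40, 80, 1e9), strengths=(0.2, 0.6, 1.2)):
    h, w = image.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - h / 2, xx - w / 2)   # distance from assumed optical axis
    out = np.zeros_like(image, dtype=float)
    inner = 0.0
    for radius, strength in zip(radii, strengths):
        zone = (r >= inner) & (r < radius)
        out[zone] = convolve(image.astype(float), unsharp_kernel(strength))[zone]
        inner = radius
    return out

blurry = np.random.default_rng(2).random((128, 128))
print(radial_sharpen(blurry).shape)
```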
A digital gigapixel large-format tile-scan camera.
Ben-Ezra, M
2011-01-01
Although the resolution of single-lens reflex (SLR) and medium-format digital cameras has increased in recent years, applications for cultural-heritage preservation and computational photography require even higher resolutions. Addressing this issue, a large-format camera's large image plane can achieve very high resolution without compromising pixel size and thus can provide high-quality, high-resolution images. This digital large-format tile-scan camera can acquire high-quality, high-resolution images of static scenes. It employs unique calibration techniques and a simple algorithm for focal-stack processing of very large images with significant magnification variations. The camera automatically collects overlapping focal stacks and processes them into a high-resolution, extended-depth-of-field image.
Scalable hybrid computation with spikes.
Sarpeshkar, Rahul; O'Halloran, Micah
2002-09-01
We outline a hybrid analog-digital scheme for computing with three important features that enable it to scale to systems of large complexity: First, like digital computation, which uses several one-bit precise logical units to collectively compute a precise answer to a computation, the hybrid scheme uses several moderate-precision analog units to collectively compute a precise answer to a computation. Second, frequent discrete signal restoration of the analog information prevents analog noise and offset from degrading the computation. And, third, a state machine enables complex computations to be created using a sequence of elementary computations. A natural choice for implementing this hybrid scheme is one based on spikes because spike-count codes are digital, while spike-time codes are analog. We illustrate how spikes afford easy ways to implement all three components of scalable hybrid computation. First, as an important example of distributed analog computation, we show how spikes can create a distributed modular representation of an analog number by implementing digital carry interactions between spiking analog neurons. Second, we show how signal restoration may be performed by recursive spike-count quantization of spike-time codes. And, third, we use spikes from an analog dynamical system to trigger state transitions in a digital dynamical system, which reconfigures the analog dynamical system using a binary control vector; such feedback interactions between analog and digital dynamical systems create a hybrid state machine (HSM). The HSM extends and expands the concept of a digital finite-state-machine to the hybrid domain. We present experimental data from a two-neuron HSM on a chip that implements error-correcting analog-to-digital conversion with the concurrent use of spike-time and spike-count codes. We also present experimental data from silicon circuits that implement HSM-based pattern recognition using spike-time synchrony. We outline how HSMs may be used to perform learning, vector quantization, spike pattern recognition and generation, and how they may be reconfigured.
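A heavily hedged numerical caricature of two of the ideas above, with no spiking dynamics modeled: an analog number distributed over several moderate-precision units with digital carry interactions, and signal restoration by quantization. The base, unit count, and noise level are illustrative assumptions.

```python
# Toy caricature (no spiking dynamics) of two ideas above: (1) an analog number
# distributed over several moderate-precision units with digital carry
# interactions, and (2) signal restoration by quantization. Base, unit count,
# and noise level are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
BASE = 8            # each unit holds one base-8 "analog digit" in [0, 8)

def encode(x, n_units=4):
    """Distribute integer x over n_units analog digits (least significant first)."""
    digits = []
    for _ in range(n_units):
        digits.append(float(x % BASE))
        x //= BASE
    return np.array(digits)

def restore(digits):
    """Signal restoration: quantize each noisy analog digit, propagate carries."""
    digits = np.rint(digits)
    for i in range(len(digits) - 1):
        carry = np.floor(digits[i] / BASE)   # digital carry interaction
        digits[i] -= carry * BASE
        digits[i + 1] += carry
    return digits

def decode(digits):
    return int(sum(d * BASE**i for i, d in enumerate(digits)))

x = 1234
noisy = encode(x) + rng.normal(0, 0.2, size=4)   # analog noise on each unit
print(decode(restore(noisy)))                     # 1234 recovered despite noise
```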
On the impact of approximate computation in an analog DeSTIN architecture.
Young, Steven; Lu, Junjie; Holleman, Jeremy; Arel, Itamar
2014-05-01
Deep machine learning (DML) holds the potential to revolutionize machine learning by automating rich feature extraction, which has become the primary bottleneck of human engineering in pattern recognition systems. However, the heavy computational burden renders DML systems implemented on conventional digital processors impractical for large-scale problems. The highly parallel computations required to implement large-scale deep learning systems are well suited to custom hardware. Analog computation has demonstrated power efficiency advantages of multiple orders of magnitude relative to digital systems while performing nonideal computations. In this paper, we investigate typical error sources introduced by analog computational elements and their impact on system-level performance in DeSTIN--a compositional deep learning architecture. These inaccuracies are evaluated on a pattern classification benchmark, clearly demonstrating the robustness of the underlying algorithm to the errors introduced by analog computational elements. A clear understanding of the impacts of nonideal computations is necessary to fully exploit the efficiency of analog circuits.
The use of self-organising maps for anomalous behaviour detection in a digital investigation.
Fei, B K L; Eloff, J H P; Olivier, M S; Venter, H S
2006-10-16
The dramatic increase in crime relating to the Internet and computers has caused a growing need for digital forensics. Digital forensic tools have been developed to assist investigators in conducting a proper investigation into digital crimes. In general, the bulk of the digital forensic tools available on the market permit investigators to analyse data that has been gathered from a computer system. However, current state-of-the-art digital forensic tools simply cannot handle large volumes of data in an efficient manner. With the advent of the Internet, many employees have been given access to new and more interesting possibilities via their desktop. Consequently, excessive Internet usage for non-job purposes and even blatant misuse of the Internet have become a problem in many organisations. Since storage media are steadily growing in size, the process of analysing multiple computer systems during a digital investigation can easily consume an enormous amount of time. Identifying a single suspicious computer from a set of candidates can therefore reduce human processing time and monetary costs involved in gathering evidence. The focus of this paper is to demonstrate how, in a digital investigation, digital forensic tools and the self-organising map (SOM)--an unsupervised neural network model--can aid investigators to determine anomalous behaviours (or activities) among employees (or computer systems) in a far more efficient manner. By analysing the different SOMs (one for each computer system), anomalous behaviours are identified and investigators are assisted to conduct the analysis more efficiently. The paper will demonstrate how the easy visualisation of the SOM enhances the ability of the investigators to interpret and explore the data generated by digital forensic tools so as to determine anomalous behaviours.
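A minimal from-scratch sketch of the general approach, assuming per-machine activity summaries as feature vectors; the synthetic vectors below are stand-ins, not the output of any particular forensic tool. It trains a small SOM on normal machines and flags the machine whose best-matching-unit distance is largest.

```python
# Minimal from-scratch self-organising map in the spirit of the approach above:
# train a small SOM on per-machine activity summaries, then flag machines whose
# best-matching-unit (BMU) distance is unusually large.
import numpy as np

rng = np.random.default_rng(1)

def train_som(data, rows=5, cols=5, iters=2000, lr0=0.5, sigma0=2.0):
    dim = data.shape[1]
    weights = rng.random((rows, cols, dim))
    grid = np.dstack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"))
    for t in range(iters):
        x = data[rng.integers(len(data))]
        d = np.linalg.norm(weights - x, axis=2)
        bmu = np.unravel_index(np.argmin(d), d.shape)
        frac = t / iters
        lr = lr0 * (1 - frac)                      # decaying learning rate
        sigma = sigma0 * (1 - frac) + 0.5          # shrinking neighbourhood
        dist2 = ((grid - np.array(bmu)) ** 2).sum(axis=2)
        h = np.exp(-dist2 / (2 * sigma**2))[..., None]
        weights += lr * h * (x - weights)
    return weights

def bmu_distance(weights, x):
    return np.linalg.norm(weights - x, axis=2).min()

# 50 "normal" machines plus one outlier with very different usage statistics.
normal = rng.normal(0.3, 0.05, size=(50, 6))
outlier = np.full((1, 6), 0.9)
data = np.vstack([normal, outlier])

som = train_som(normal)
scores = np.array([bmu_distance(som, x) for x in data])
print("most anomalous machine index:", int(scores.argmax()))   # expect 50
```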
Do pre-trained deep learning models improve computer-aided classification of digital mammograms?
NASA Astrophysics Data System (ADS)
Aboutalib, Sarah S.; Mohamed, Aly A.; Zuley, Margarita L.; Berg, Wendie A.; Luo, Yahong; Wu, Shandong
2018-02-01
Digital mammography screening is an important exam for the early detection of breast cancer and reduction in mortality. False positives leading to high recall rates, however, result in unnecessary negative consequences to patients and health care systems. In order to better aid radiologists, computer-aided tools can be utilized to improve distinction between image classifications and thus potentially reduce false recalls. The emergence of deep learning has shown promising results in the area of biomedical imaging data analysis. This study aimed to investigate deep learning and transfer learning methods that can improve digital mammography classification performance. In particular, we evaluated the effect of pre-training deep learning models with other imaging datasets in order to boost classification performance on a digital mammography dataset. Two types of datasets were used for pre-training: (1) a digitized film mammography dataset, and (2) a very large non-medical imaging dataset. By using either of these datasets to pre-train the network initially, and then fine-tuning with the digital mammography dataset, we found an increase in overall classification performance in comparison to a model without pre-training, with the very large non-medical dataset performing the best in improving the classification accuracy.
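A sketch of the generic pre-train-then-fine-tune pattern the study evaluates, using a torchvision ResNet-18 pretrained on ImageNet as a stand-in for pre-training on a large non-medical dataset; the data loader, learning rate, and architecture are placeholders, not the authors' exact setup.

```python
# Sketch of the pre-train-then-fine-tune pattern described above. A torchvision
# ResNet-18 pretrained on ImageNet stands in for pre-training on a large
# non-medical dataset; the mammography DataLoader is a placeholder.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)  # pre-trained
model.fc = nn.Linear(model.fc.in_features, 2)   # e.g. recall vs. negative

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

def fine_tune(model, mammo_loader, epochs=5):
    """Fine-tune all layers on the (placeholder) digital-mammography loader."""
    model.train()
    for _ in range(epochs):
        for images, labels in mammo_loader:
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
    return model
```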
Using Swarming Agents for Scalable Security in Large Network Environments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crouse, Michael; White, Jacob L.; Fulp, Errin W.
2011-09-23
The difficulty of securing computer infrastructures increases as they grow in size and complexity. Network-based security solutions such as IDS and firewalls cannot scale because of exponentially increasing computational costs inherent in detecting the rapidly growing number of threat signatures. Host-based solutions like virus scanners and IDS suffer similar issues, and these are compounded when enterprises try to monitor these in a centralized manner. Swarm-based autonomous agent systems like digital ants and artificial immune systems can provide a scalable security solution for large network environments. The digital ants approach offers a biologically inspired design where each ant in the virtual colony can detect atoms of evidence that may help identify a possible threat. By assembling the atomic evidences from different ant types the colony may detect the threat. This decentralized approach can require, on average, fewer computational resources than traditional centralized solutions; however there are limits to its scalability. This paper describes how dividing a large infrastructure into smaller managed enclaves allows the digital ant framework to effectively operate in larger environments. Experimental results will show that using smaller enclaves allows for more consistent distribution of agents and results in faster response times.
Evaluation of nucleus segmentation in digital pathology images through large scale image synthesis
NASA Astrophysics Data System (ADS)
Zhou, Naiyun; Yu, Xiaxia; Zhao, Tianhao; Wen, Si; Wang, Fusheng; Zhu, Wei; Kurc, Tahsin; Tannenbaum, Allen; Saltz, Joel; Gao, Yi
2017-03-01
Digital histopathology images with more than 1 gigapixel are drawing more and more attention in clinical, biomedical research, and computer vision fields. Among the multiple observable features spanning multiple scales in the pathology images, nuclear morphology is one of the central criteria for diagnosis and grading. As a result, it is also the most studied target in image computing. A large number of research papers have been devoted to the problem of extracting nuclei from digital pathology images, which is the foundation of any further correlation study. However, the validation and evaluation of nucleus extraction have not yet been formulated rigorously and systematically. Some studies report a human-verified segmentation with thousands of nuclei, whereas a single whole slide image may contain up to a million. The main obstacle lies in the difficulty of obtaining such a large number of validated nuclei, which is essentially an impossible task for a pathologist. We propose a systematic validation and evaluation approach based on large scale image synthesis. This could facilitate a more quantitatively validated study for current and future histopathology image analysis.
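One way such synthesized ground truth enables quantitative evaluation is through overlap scores; the sketch below computes a Dice coefficient on tiny synthetic masks. The metric choice and masks are illustrative assumptions, not necessarily the paper's exact protocol.

```python
# Minimal sketch of evaluating a nucleus segmentation against a synthesized
# ground-truth mask using the Dice overlap score. The masks here are tiny
# synthetic arrays; the paper's exact metrics and synthesis pipeline may differ.
import numpy as np

def dice(truth, pred):
    truth, pred = truth.astype(bool), pred.astype(bool)
    inter = np.logical_and(truth, pred).sum()
    return 2.0 * inter / (truth.sum() + pred.sum())

ground_truth = np.zeros((64, 64), dtype=int)
ground_truth[20:40, 20:40] = 1          # synthesized nucleus

prediction = np.zeros((64, 64), dtype=int)
prediction[22:42, 22:42] = 1            # slightly shifted segmentation result

print(f"Dice = {dice(ground_truth, prediction):.3f}")
```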
Computers as an Instrument for Data Analysis. Technical Report No. 11.
ERIC Educational Resources Information Center
Muller, Mervin E.
A review of statistical data analysis involving computers as a multi-dimensional problem provides the perspective for consideration of the use of computers in statistical analysis and the problems associated with large data files. An overall description of STATJOB, a particular system for doing statistical data analysis on a digital computer,…
Computer Exercises in Systems and Fields Experiments
ERIC Educational Resources Information Center
Bacon, C. M.; McDougal, J. R.
1971-01-01
Laboratory activities give students an opportunity to interact with computers in modes ranging from remote terminal use in laboratory experimentation to the direct hands-on use of a small digital computer with disk memory and on-line plotter, and finally to the use of a large computer under closed-shop operation. (Author/TS)
NASA Technical Reports Server (NTRS)
Ratner, R. S.; Shapiro, E. B.; Zeidler, H. M.; Wahlstrom, S. E.; Clark, C. B.; Goldberg, J.
1973-01-01
This final report summarizes the work on the design of a fault tolerant digital computer for aircraft. Volume 2 is composed of two parts. Part 1 is concerned with the computational requirements associated with an advanced commercial aircraft. Part 2 reviews the technology that will be available for the implementation of the computer in the 1975-1985 period. With regard to the computation task, 26 computations have been categorized according to computational load, memory requirements, criticality, permitted down-time, and the need to save data in order to effect a roll-back. The technology part stresses the impact of large scale integration (LSI) on the realization of logic and memory. Also considered were module interconnection possibilities so as to minimize fault propagation.
Reliability modeling of fault-tolerant computer based systems
NASA Technical Reports Server (NTRS)
Bavuso, Salvatore J.
1987-01-01
Digital fault-tolerant computer-based systems have become commonplace in military and commercial avionics. These systems hold the promise of increased availability, reliability, and maintainability over conventional analog-based systems through the application of replicated digital computers arranged in fault-tolerant configurations. Three tightly coupled factors of paramount importance, ultimately determining the viability of these systems, are reliability, safety, and profitability. Reliability, the major driver, affects virtually every aspect of design, packaging, and field operations, and eventually produces profit for commercial applications or increased national security. However, the utilization of digital computer systems makes the task of producing a credible reliability assessment a formidable one for the reliability engineer. The root of the problem lies in the digital computer's unique adaptability to changing requirements, computational power, and ability to test itself efficiently. Addressed here are the nuances of modeling the reliability of systems with large state sizes, in the Markov sense, which result from systems based on replicated redundant hardware, and the modeling of factors which can reduce reliability without concomitant depletion of hardware. Advanced fault-handling models are described and methods of acquiring and measuring parameters for these models are delineated.
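A miniature example of the Markov-style reliability modeling discussed above: a toy continuous-time Markov chain for a triple-redundant computer with imperfect fault coverage, solved with a matrix exponential (SciPy assumed). The failure rate and coverage values are illustrative.

```python
# Toy continuous-time Markov reliability model of a triple-redundant computer
# with imperfect fault coverage, in the spirit of the Markov models discussed
# above (at miniature scale). Failure rate and coverage values are assumptions.
import numpy as np
from scipy.linalg import expm

lam = 1e-4      # per-hour failure rate of one computer (assumed)
c = 0.99        # probability a failure is detected and reconfigured around

# States: 0 = three good units, 1 = two good units, 2 = system failed.
Q = np.array([
    [-3 * lam,  3 * lam * c,  3 * lam * (1 - c)],
    [0.0,      -2 * lam,      2 * lam],
    [0.0,       0.0,          0.0],
])

p0 = np.array([1.0, 0.0, 0.0])
for t in (10.0, 100.0, 1000.0):            # mission times in hours
    p = p0 @ expm(Q * t)                   # state probabilities at time t
    print(f"t = {t:6.0f} h   reliability = {1 - p[2]:.6f}")
```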
Does Mood Change How We Organize Digital Files?
ERIC Educational Resources Information Center
Massey, Charlotte
2017-01-01
Retrieving files from one's computer is done daily and is an essential part of completing most tasks at work, yet surprisingly little research has examined the ways that people structure and organize their files. Management of personal digital information is a challenging task that users approach idiosyncratically. Large individual differences…
NASA Astrophysics Data System (ADS)
Saxena, Nishank; Hofmann, Ronny; Alpak, Faruk O.; Berg, Steffen; Dietderich, Jesse; Agarwal, Umang; Tandon, Kunj; Hunter, Sander; Freeman, Justin; Wilson, Ove Bjorn
2017-11-01
We generate a novel reference dataset to quantify the impact of numerical solvers, boundary conditions, and simulation platforms. We consider a variety of microstructures ranging from idealized pipes to digital rocks. Pore throats of the digital rocks considered are large enough to be well resolved with state-of-the-art micro-computerized tomography technology. Permeability is computed using multiple numerical engines, 12 in total, including, Lattice-Boltzmann, computational fluid dynamics, voxel based, fast semi-analytical, and known empirical models. Thus, we provide a measure of uncertainty associated with flow computations of digital media. Moreover, the reference and standards dataset generated is the first of its kind and can be used to test and improve new fluid flow algorithms. We find that there is an overall good agreement between solvers for idealized cross-section shape pipes. As expected, the disagreement increases with increase in complexity of the pore space. Numerical solutions for pipes with sinusoidal variation of cross section show larger variability compared to pipes of constant cross-section shapes. We notice relatively larger variability in computed permeability of digital rocks with coefficient of variation (of up to 25%) in computed values between various solvers. Still, these differences are small given other subsurface uncertainties. The observed differences between solvers can be attributed to several causes including, differences in boundary conditions, numerical convergence criteria, and parameterization of fundamental physics equations. Solvers that perform additional meshing of irregular pore shapes require an additional step in practical workflows which involves skill and can introduce further uncertainty. Computation times for digital rocks vary from minutes to several days depending on the algorithm and available computational resources. We find that more stringent convergence criteria can improve solver accuracy but at the expense of longer computation time.
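The cross-solver variability quoted above is typically summarized by a coefficient of variation; the sketch below computes it for hypothetical permeability values from several solvers (the numbers are made up).

```python
# Simple sketch of the cross-solver variability measure quoted above: the
# coefficient of variation of permeability values computed by different
# numerical engines for one digital rock sample. Values are illustrative.
import numpy as np

# Hypothetical permeabilities (millidarcy) reported by different solvers.
permeability = np.array([112.0, 118.0, 96.0, 105.0, 140.0, 122.0])

cv = permeability.std(ddof=1) / permeability.mean()
print(f"coefficient of variation = {100 * cv:.1f}%")
```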
Digital hand atlas and computer-aided bone age assessment via the Web
NASA Astrophysics Data System (ADS)
Cao, Fei; Huang, H. K.; Pietka, Ewa; Gilsanz, Vicente
1999-07-01
A frequently used assessment method of bone age is atlas matching by a radiological examination of a hand image against a reference set of atlas patterns of normal standards. We are in a process of developing a digital hand atlas with a large standard set of normal hand and wrist images that reflect the skeletal maturity, race and sex difference, and current child development. The digital hand atlas will be used for a computer-aided bone age assessment via Web. We have designed and partially implemented a computer-aided diagnostic (CAD) system for Web-based bone age assessment. The system consists of a digital hand atlas, a relational image database and a Web-based user interface. The digital atlas is based on a large standard set of normal hand and wrist images with extracted bone objects and quantitative features. The image database uses a content-based indexing to organize the hand images and their attributes and present to users in a structured way. The Web-based user interface allows users to interact with the hand image database from browsers. Users can use a Web browser to push a clinical hand image to the CAD server for a bone age assessment. Quantitative features on the examined image, which reflect the skeletal maturity, will be extracted and compared with patterns from the atlas database to assess the bone age. The relevant reference images and the final assessment report will be sent back to the user's browser via Web. The digital atlas will remove the disadvantages of the currently out-of-date one and allow the bone age assessment to be computerized and done conveniently via Web. In this paper, we present the system design and Web-based client-server model for computer-assisted bone age assessment and our initial implementation of the digital atlas database.
Removing the center from computing: biology's new mode of digital knowledge production.
November, Joseph
2011-06-01
This article shows how the USA's National Institutes of Health (NIH) helped to bring about a major shift in the way computers are used to produce knowledge and in the design of computers themselves as a consequence of its early 1960s efforts to introduce information technology to biologists. Starting in 1960 the NIH sought to reform the life sciences by encouraging researchers to make use of digital electronic computers, but despite generous federal support biologists generally did not embrace the new technology. Initially the blame fell on biologists' lack of appropriate (i.e. digital) data for computers to process. However, when the NIH consulted MIT computer architect Wesley Clark about this problem, he argued that the computer's quality as a device that was centralized posed an even greater challenge to potential biologist users than did the computer's need for digital data. Clark convinced the NIH that if the agency hoped to effectively computerize biology, it would need to satisfy biologists' experimental and institutional needs by providing them the means to use a computer without going to a computing center. With NIH support, Clark developed the 1963 Laboratory Instrument Computer (LINC), a small, real-time interactive computer intended to be used inside the laboratory and controlled entirely by its biologist users. Once built, the LINC provided a viable alternative to the 1960s norm of large computers housed in computing centers. As such, the LINC not only became popular among biologists, but also served in later decades as an important precursor of today's computing norm in the sciences and far beyond, the personal computer.
Extraction of drainage networks from large terrain datasets using high throughput computing
NASA Astrophysics Data System (ADS)
Gong, Jianya; Xie, Jibo
2009-02-01
Advanced digital photogrammetry and remote sensing technology produces large terrain datasets (LTD). How to process and use these LTD has become a big challenge for GIS users. Extracting drainage networks, which are basic for hydrological applications, from LTD is one of the typical applications of digital terrain analysis (DTA) in geographical information applications. Existing serial drainage algorithms cannot deal with large data volumes in a timely fashion, and few GIS platforms can process LTD beyond the GB size. High throughput computing (HTC), a distributed parallel computing mode, is proposed to improve the efficiency of drainage networks extraction from LTD. Drainage network extraction using HTC involves two key issues: (1) how to decompose the large DEM datasets into independent computing units and (2) how to merge the separate outputs into a final result. A new decomposition method is presented in which the large datasets are partitioned into independent computing units using natural watershed boundaries instead of using regular 1-dimensional (strip-wise) and 2-dimensional (block-wise) decomposition. Because the distribution of drainage networks is strongly related to watershed boundaries, the new decomposition method is more effective and natural. The method to extract natural watershed boundaries was improved by using multi-scale DEMs instead of single-scale DEMs. A HTC environment is employed to test the proposed methods with real datasets.
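For orientation, the serial core that such pipelines decompose and parallelize can be sketched with a D8 flow-direction and flow-accumulation pass on a tiny DEM; the watershed-based decomposition, merging, and HTC scheduling described above are not shown, and D8 is only one common choice of routing rule.

```python
# Sketch of the serial core that drainage-extraction pipelines parallelize:
# D8 flow directions and flow accumulation on a tiny DEM. The watershed-based
# decomposition and high-throughput scheduling described above are not shown.
import numpy as np

dem = np.array([
    [9, 8, 7, 6],
    [8, 7, 5, 5],
    [7, 6, 4, 3],
    [6, 5, 3, 1],
], dtype=float)

rows, cols = dem.shape
neighbors = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
             (0, 1), (1, -1), (1, 0), (1, 1)]

def d8_downstream(r, c):
    """Return the steepest downslope neighbour of (r, c), or None at a pit/edge."""
    best, drop_best = None, 0.0
    for dr, dc in neighbors:
        rr, cc = r + dr, c + dc
        if 0 <= rr < rows and 0 <= cc < cols:
            drop = (dem[r, c] - dem[rr, cc]) / np.hypot(dr, dc)
            if drop > drop_best:
                best, drop_best = (rr, cc), drop
    return best

# Flow accumulation: visit cells from highest to lowest and push their area
# (plus anything already accumulated) to the downstream cell.
acc = np.ones_like(dem)
order = np.argsort(dem, axis=None)[::-1]
for idx in order:
    r, c = np.unravel_index(idx, dem.shape)
    ds = d8_downstream(r, c)
    if ds is not None:
        acc[ds] += acc[r, c]

print(acc)   # large values trace the drainage network toward the outlet
```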
Review of integrated digital systems: evolution and adoption
NASA Astrophysics Data System (ADS)
Fritz, Lawrence W.
The factors that are influencing the evolution of photogrammetric and remote sensing technology to transition into fully integrated digital systems are reviewed. These factors include societal pressures for new, more timely digital products from the Spatial Information Sciences and the adoption of rapid technological advancements in digital processing hardware and software. Current major developments in leading government mapping agencies of the USA, such as the Digital Production System (DPS) modernization programme at the Defense Mapping Agency, and the Automated Nautical Charting System II (ANCS-II) programme and Integrated Digital Photogrammetric Facility (IDPF) at NOAA/National Ocean Service, illustrate the significant benefits to be realized. These programmes are examples of different levels of integrated systems that have been designed to produce digital products. They provide insights to the management complexities to be considered for very large integrated digital systems. In recognition of computer industry trends, a knowledge-based architecture for managing the complexity of the very large spatial information systems of the future is proposed.
NASA Technical Reports Server (NTRS)
Kriegler, F. J.
1974-01-01
The MIDAS System is described as a third-generation fast multispectral recognition system able to keep pace with the large quantity and high rates of data acquisition from present and projected sensors. A principal objective of the MIDAS program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turnaround time and significant gains in throughput. The hardware and software are described. The system contains a mini-computer to control the various high-speed processing elements in the data path, and a classifier which implements an all-digital prototype multivariate-Gaussian maximum likelihood decision algorithm operating at 200,000 pixels/sec. Sufficient hardware was developed to perform signature extraction from computer-compatible tapes, compute classifier coefficients, control the classifier operation, and diagnose operation.
Digital processing of mesoscale analysis and space sensor data
NASA Technical Reports Server (NTRS)
Hickey, J. S.; Karitani, S.
1985-01-01
The mesoscale analysis and space sensor (MASS) data management and analysis system on the research computer system is presented. The MASS data base management and analysis system was implemented on the research computer system, which provides a wide range of capabilities for processing and displaying large volumes of conventional and satellite-derived meteorological data. The research computer system consists of three primary computers (HP-1000F, Harris/6, and Perkin-Elmer 3250), each of which performs a specific function according to its unique capabilities. The software, data base management, and display capabilities of the research computer system are described in terms of providing a very effective interactive research tool for the digital processing of mesoscale analysis and space sensor data.
For operation of the Computer Software Management and Information Center (COSMIC)
NASA Technical Reports Server (NTRS)
Carmon, J. L.
1983-01-01
Computer programs for large systems of normal equations, an interactive digital signal process, structural analysis of cylindrical thrust chambers, swirling turbulent axisymmetric recirculating flows in practical isothermal combustor geometrics, computation of three dimensional combustor performance, a thermal radiation analysis system, transient response analysis, and a software design analysis are summarized.
Large-scale feature searches of collections of medical imagery
NASA Astrophysics Data System (ADS)
Hedgcock, Marcus W.; Karshat, Walter B.; Levitt, Tod S.; Vosky, D. N.
1993-09-01
Large scale feature searches of accumulated collections of medical imagery are required for multiple purposes, including clinical studies, administrative planning, epidemiology, teaching, quality improvement, and research. To perform a feature search of large collections of medical imagery, one can either search text descriptors of the imagery in the collection (usually the interpretation), or (if the imagery is in digital format) the imagery itself. At our institution, text interpretations of medical imagery are all available in our VA Hospital Information System. These are downloaded daily into an off-line computer. The text descriptors of most medical imagery are usually formatted as free text, and so require a user friendly database search tool to make searches quick and easy for any user to design and execute. We are tailoring such a database search tool (Liveview), developed by one of the authors (Karshat). To further facilitate search construction, we are constructing (from our accumulated interpretation data) a dictionary of medical and radiological terms and synonyms. If the imagery database is digital, the imagery which the search discovers is easily retrieved from the computer archive. We describe our database search user interface, with examples, and compare the efficacy of computer assisted imagery searches from a clinical text database with manual searches. Our initial work on direct feature searches of digital medical imagery is outlined.
Science Teachers' Response to the Digital Education Revolution
ERIC Educational Resources Information Center
Nielsen, Wendy; Miller, K. Alex; Hoban, Garry
2015-01-01
We report a case study of two highly qualified science teachers as they implemented laptop computers in their Years 9 and 10 science classes at the beginning of the "Digital Education Revolution," Australia's national one-to-one laptop program initiated in 2009. When a large-scale investment is made in a significant educational change,…
Matsushima, Kyoji; Sonobe, Noriaki
2018-01-01
Digitized holography techniques are used to reconstruct three-dimensional (3D) images of physical objects using large-scale computer-generated holograms (CGHs). The object field is captured at three wavelengths over a wide area at high densities. Synthetic aperture techniques using single sensors are used for image capture in phase-shifting digital holography. The captured object field is incorporated into a virtual 3D scene that includes nonphysical objects, e.g., polygon-meshed CG models. The synthetic object field is optically reconstructed as a large-scale full-color CGH using red-green-blue color filters. The CGH has a wide full-parallax viewing zone and reconstructs a deep 3D scene with natural motion parallax.
Web-Based Consumer Health Information: Public Access, Digital Division, and Remainders
Lorence, Daniel; Park, Heeyoung
2006-01-01
Public access Internet portals and decreasing costs of personal computers have created a growing consensus that unequal access to information, or a “digital divide,” has largely disappeared for US consumers. A series of technology initiatives in the late 1990s were believed to have largely eliminated the divide. For healthcare patients, access to information is an essential part of the consumer-centric framework outlined in the recently proposed national health information initiative. Data from a recent study of health information-seeking behaviors on the Internet suggest that a “digitally underserved group” persists, effectively limiting the planned national health information infrastructure to wealthier Americans. PMID:16926743
NASA Technical Reports Server (NTRS)
Kriegler, F. J.; Christenson, D.; Gordon, M.; Kistler, R.; Lampert, S.; Marshall, R.; Mclaughlin, R.
1974-01-01
The Midas System is a third-generation, fast, multispectral recognition system able to keep pace with the large quantity and high rates of data acquisition from present and projected sensors. A principal objective of the MIDAS Program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turn-around time and significant gains in throughput. The hardware and software generated in Phase I of the overall program are described. The system contains a mini-computer to control the various high-speed processing elements in the data path and a classifier which implements an all-digital prototype multivariate-Gaussian maximum likelihood decision algorithm operating at 2 x 100,000 pixels/sec. Sufficient hardware was developed to perform signature extraction from computer-compatible tapes, compute classifier coefficients, control the classifier operation, and diagnose operation. The MIDAS construction and wiring diagrams are given.
NASA Technical Reports Server (NTRS)
Kriegler, F. J.; Christenson, D.; Gordon, M.; Kistler, R.; Lampert, S.; Marshall, R.; Mclaughlin, R.
1974-01-01
The MIDAS System is a third-generation, fast, multispectral recognition system able to keep pace with the large quantity and high rates of data acquisition from present and projected sensors. A principal objective of the MIDAS Program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turn-around time and significant gains in throughput. The hardware and software generated in Phase I of the overall program are described. The system contains a mini-computer to control the various high-speed processing elements in the data path and a classifier which implements an all-digital prototype multivariate-Gaussian maximum likelihood decision algorithm operating at 2 x 10^5 pixels/sec. Sufficient hardware was developed to perform signature extraction from computer-compatible tapes, compute classifier coefficients, control the classifier operation, and diagnose operation. Diagnostic programs used to test MIDAS' operations are presented.
ERIC Educational Resources Information Center
Smolinski, Tomasz G.
2010-01-01
Computer literacy plays a critical role in today's life sciences research. Without the ability to use computers to efficiently manipulate and analyze large amounts of data resulting from biological experiments and simulations, many of the pressing questions in the life sciences could not be answered. Today's undergraduates, despite the ubiquity of…
Local sharpening and subspace wavefront correction with predictive dynamic digital holography
NASA Astrophysics Data System (ADS)
Sulaiman, Sennan; Gibson, Steve
2017-09-01
Digital holography holds several advantages over conventional imaging and wavefront sensing, chief among these being significantly fewer and simpler optical components and the retrieval of the complex field. Consequently, many imaging and sensing applications including microscopy and optical tweezing have turned to using digital holography. A significant obstacle for digital holography in real-time applications, such as wavefront sensing for high energy laser systems and high speed imaging for target tracking, is the fact that digital holography is computationally intensive; it requires iterative virtual wavefront propagation and hill-climbing to optimize some sharpness criterion. It has been shown recently that minimum-variance wavefront prediction can be integrated with digital holography and image sharpening to significantly reduce the large number of costly sharpening iterations required to achieve near-optimal wavefront correction. This paper demonstrates further gains in computational efficiency with localized sharpening in conjunction with predictive dynamic digital holography for real-time applications. The method optimizes the sharpness of local regions in a detector plane by parallel independent wavefront correction on reduced-dimension subspaces of the complex field in a spectral plane.
[Animal experimentation, computer simulation and surgical research].
Carpentier, Alain
2009-11-01
We live in a digital world. In medicine, computers are providing new tools for data collection, imaging, and treatment. During research and development of complex technologies and devices such as artificial hearts, computer simulation can provide more reliable information than experimentation on large animals. In these specific settings, animal experimentation should serve more to validate computer models of complex devices than to demonstrate their reliability.
Low power signal processing research at Stanford
NASA Technical Reports Server (NTRS)
Burr, J.; Williamson, P. R.; Peterson, A.
1991-01-01
This paper gives an overview of the research being conducted at Stanford University's Space, Telecommunications, and Radioscience Laboratory in the area of low energy computation. It discusses the work we are doing in large scale digital VLSI neural networks, interleaved processor and pipelined memory architectures, energy estimation and optimization, multichip module packaging, and low voltage digital logic.
NASA Technical Reports Server (NTRS)
Kandelman, A.; Nelson, D. J.
1977-01-01
A simplified mathematical model simulates large hydraulic systems on either analog or digital computers. Models of pumps, servoactuators, reservoirs, accumulators, and valves are connected to generate systems containing six hundred elements.
External validation of Medicare claims codes for digital mammography and computer-aided detection.
Fenton, Joshua J; Zhu, Weiwei; Balch, Steven; Smith-Bindman, Rebecca; Lindfors, Karen K; Hubbard, Rebecca A
2012-08-01
While Medicare claims are a potential resource for clinical mammography research or quality monitoring, the validity of key data elements remains uncertain. Claims codes for digital mammography and computer-aided detection (CAD), for example, have not been validated against a credible external reference standard. We matched Medicare mammography claims for women who received bilateral mammograms from 2003 to 2006 to corresponding mammography data from the Breast Cancer Surveillance Consortium (BCSC) registries in four U.S. states (N = 253,727 mammograms received by 120,709 women). We assessed the accuracy of the claims-based classifications of bilateral mammograms as either digital versus film and CAD versus non-CAD relative to a reference standard derived from BCSC data. Claims data correctly classified the large majority of film and digital mammograms (97.2% and 97.3%, respectively), yielding excellent agreement beyond chance (κ = 0.90). Claims data correctly classified the large majority of CAD mammograms (96.6%) but a lower percentage of non-CAD mammograms (86.7%). Agreement beyond chance remained high for CAD classification (κ = 0.83). From 2003 to 2006, the predictive values of claims-based digital and CAD classifications increased as the sample prevalences of each technology increased. Medicare claims data can accurately distinguish film and digital bilateral mammograms and mammograms conducted with and without CAD. The validity of Medicare claims data regarding film versus digital mammography and CAD suggests that these data elements can be useful in research and quality improvement. ©2012 AACR.
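The agreement statistics quoted above (percent agreement and Cohen's kappa) come from 2x2 tables of claims-based versus BCSC reference classifications; the sketch below shows the computation on illustrative counts, not the study's actual data.

```python
# How the agreement figures quoted above (percent agreement and Cohen's kappa)
# are computed from a 2x2 table of claims-based vs. registry (BCSC) labels.
# The counts below are illustrative, not the study's actual data.
import numpy as np

# Rows: claims say digital / film; columns: BCSC reference digital / film.
table = np.array([[60000, 1700],
                  [1600, 190000]], dtype=float)

n = table.sum()
observed = np.trace(table) / n
expected = (table.sum(axis=1) * table.sum(axis=0)).sum() / n**2
kappa = (observed - expected) / (1 - expected)

print(f"percent agreement = {100 * observed:.1f}%")
print(f"Cohen's kappa     = {kappa:.2f}")
```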
Study on the application of mobile internet cloud computing platform
NASA Astrophysics Data System (ADS)
Gong, Songchun; Fu, Songyin; Chen, Zheng
2012-04-01
The innovative development of computer technology promotes the application of the cloud computing platform, which is essentially a new model of resource service delivery that meets users' needs for different resources after adjustments in multiple respects. Cloud computing offers advantages in many respects: it not only reduces the difficulty of operating the system but also makes it easy for users to search, acquire, and process resources. Accordingly, the author takes the management of digital libraries as the research focus of this paper and analyzes the key technologies of the mobile internet cloud computing platform in the operation process. The popularization of computer technology has driven the creation of digital library models, whose core idea is to optimize the management of library resource information through computers and to construct a high-performance inquiry and search platform that allows users to access the necessary information resources at any time. Cloud computing, moreover, distributes computations across a large number of networked computers and hence implements a connection service among multiple computers. Digital libraries, as a typical application of cloud computing, can be used to analyze its key technologies.
NASA Technical Reports Server (NTRS)
Strong, J. P., III
1973-01-01
Tse computers have the potential of operating four or five orders of magnitude faster than present digital computers. The computers of the new design use binary images as their basic computational entity. The word 'tse' is the transliteration of the Chinese word for 'pictograph character.' Tse computers are large collections of devices that perform logical operations on binary images. The operations on binary images are to be performed over the entire image simultaneously.
PLANNING FOR OPTICAL DISK TECHNOLOGY WITH DIGITAL CARTOGRAPHY.
Light, Donald L.
1984-01-01
Progress in the computer field continues to suggest that the transition from traditional analog mapping systems to digital systems has become a practical possibility. A major shortfall that still exists in digital systems is the need for very large mass storage capacity. The decade of the 1980's has introduced laser optical disk storage technology, which may be the breakthrough needed for mass storage. This paper addresses system concepts for digital cartography during the transition period. Emphasis is placed on determining U. S. Geological Survey mass storage requirements and introducing laser optical disk technology for handling storage problems for digital data in this decade.
Image Harvest: an open-source platform for high-throughput plant image processing and analysis
Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal
2016-01-01
High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917
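The digital traits that IH extracts are quantitative descriptors computed from segmented plant pixels. As a minimal illustration (not IH's actual pipeline, whose functions and parameters are documented with the software), the Python sketch below estimates a projected shoot-area trait by simple green-dominance thresholding; the file name and margin parameter are hypothetical assumptions.

    import numpy as np
    from PIL import Image

    def shoot_area(path, green_margin=20):
        # Count pixels where green clearly dominates red and blue: a crude
        # proxy for projected shoot area, one example of a digital trait.
        rgb = np.asarray(Image.open(path).convert("RGB")).astype(int)
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        mask = (g > r + green_margin) & (g > b + green_margin)
        return int(mask.sum())

    print(shoot_area("rice_plant.png"))  # hypothetical input image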
Xu, Qun; Wang, Xianchao; Xu, Chao
2017-06-01
Multiplication on traditional electronic computers suffers from limited calculation accuracy and long computation delays. To overcome these problems, a modified signed digit (MSD) multiplication routine is established based on the MSD system and the carry-free adder, and its parallel algorithm and optimization techniques are studied in detail. Exploiting the characteristics of a ternary optical computer, a structured data processor is designed especially for the multiplication routine. Several ternary optical operators are constructed to perform M transformations and summations in parallel, which accelerates the iterative process of multiplication. In particular, the routine allocates the data bits of the ternary optical processor according to the digits of the multiplication inputs, so the accuracy of the results can always satisfy the user. Finally, the routine is verified by simulation experiments, and the results fully comply with expectations. Compared with an electronic computer, the MSD multiplication routine is not only well suited to large-value data and high-precision arithmetic but also maintains lower power consumption and shorter calculation delays.
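The MSD system underlying this routine is a redundant binary representation with digits in {-1, 0, 1}, which is what makes carry-free addition possible. The Python sketch below only illustrates the representation itself, converting an integer to one MSD (non-adjacent-form) digit string and back; the parallel optical transformations and summations described in the paper are not modeled.

    def to_msd(n):
        # Non-adjacent-form MSD digits (least significant first), each in {-1, 0, 1}.
        digits = []
        while n != 0:
            if n % 2:
                d = 2 - (n % 4)      # picks +1 or -1 so the remaining value is even
                n -= d
            else:
                d = 0
            digits.append(d)
            n //= 2
        return digits

    def from_msd(digits):
        return sum(d * (2 ** i) for i, d in enumerate(digits))

    assert from_msd(to_msd(1234)) == 1234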
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Critical infrastructures around the world are at constant risk from earthquakes. Most of these critical structures were designed using archaic seismic simulation methods developed for the early digital computers of the 1970s. Idaho National Laboratory's Seismic Research Group is working to modernize these simulation methods through computational research and large-scale laboratory experiments.
Image database for digital hand atlas
NASA Astrophysics Data System (ADS)
Cao, Fei; Huang, H. K.; Pietka, Ewa; Gilsanz, Vicente; Dey, Partha S.; Gertych, Arkadiusz; Pospiech-Kurkowska, Sywia
2003-05-01
Bone age assessment is a procedure frequently performed in pediatric patients to evaluate growth disorders. A commonly used method is atlas matching by visual comparison of a hand radiograph with a small reference set from the old Greulich-Pyle atlas. We have developed a new digital hand atlas with a large set of clinically normal hand images from diverse ethnic groups. In this paper, we present our system design and implementation of the digital atlas database to support computer-aided atlas matching for bone age assessment. The system consists of a hand atlas image database, a computer-aided diagnostic (CAD) software module for image processing and atlas matching, and a Web user interface. Users can use a Web browser to push DICOM images, directly or indirectly from PACS, to the CAD server for a bone age assessment. Quantitative features on the examined image, which reflect the skeletal maturity, are then extracted and compared with patterns from the atlas image database to assess the bone age. The digital atlas method, built on a large image database and current Internet technology, provides an alternative to supplement or replace the traditional method for quantitative, accurate and cost-effective assessment of bone age.
A high-accuracy optical linear algebra processor for finite element applications
NASA Technical Reports Server (NTRS)
Casasent, D.; Taylor, B. K.
1984-01-01
Optical linear processors are computationally efficient computers for solving matrix-matrix and matrix-vector oriented problems. Optical system errors limit their dynamic range to 30-40 dB, which limits their accuracy to 9-12 bits. Large problems, such as the finite element problem in structural mechanics (with tens or hundreds of thousands of variables), which can exploit the speed of optical processors, require the 32-bit accuracy obtainable from digital machines. To obtain this required 32-bit accuracy with an optical processor, the data can be digitally encoded, thereby reducing the dynamic range requirements of the optical system (i.e., decreasing the effect of optical errors on the data) while providing increased accuracy. This report describes a new digitally encoded optical linear algebra processor architecture for solving finite element and banded matrix-vector problems. A linear static plate bending case study is described which quantifies the processor requirements. Multiplication by digital convolution is explained, and the digitally encoded optical processor architecture is advanced.
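Multiplication by digital convolution rests on the fact that the digit sequence of a product equals the convolution of the factors' digit sequences followed by carry propagation. A minimal Python sketch of that identity follows; it illustrates the arithmetic only, not the optical encoding discussed in the report.

    import numpy as np

    def multiply_by_convolution(a_digits, b_digits, base=2):
        # Convolve digit sequences (least significant digit first),
        # then propagate carries to obtain the product's digits.
        raw = np.convolve(a_digits, b_digits)
        out, carry = [], 0
        for d in raw:
            carry, digit = divmod(int(d) + carry, base)
            out.append(digit)
        while carry:
            carry, digit = divmod(carry, base)
            out.append(digit)
        return out

    # 6 * 5 = 30: binary digits given LSB-first
    print(multiply_by_convolution([0, 1, 1], [1, 0, 1]))   # [0, 1, 1, 1, 1] = 30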
Digital Mammography with a Mosaic of CCD-Arrays
NASA Technical Reports Server (NTRS)
Jalink, Antony, Jr. (Inventor); McAdoo, James A. (Inventor)
1996-01-01
The present invention relates generally to a mammography device and method and more particularly to a novel digital mammography device and method to detect microcalcifications of precancerous tissue. A digital mammography device uses a mosaic of electronic digital imaging arrays to scan an x-ray image. The mosaic of arrays is repositioned several times to expose different portions of the image, until the entire image is scanned. The data generated by the arrays during each exposure is stored in a computer. After the final exposure, the computer combines data of the several partial images to produce a composite of the original x-ray image. An aperture plate is used to reduce scatter and the overall exposure of the patient to x-rays. The novelty of this invention is that it provides a digital mammography device with large field coverage, high spatial resolution, scatter rejection, excellent contrast characteristics and lesion detectability under clinical conditions. This device also shields the patient from excessive radiation, can detect extremely small calcifications and allows manipulation and storage of the image.
NASA Technical Reports Server (NTRS)
Shiva, S. G.; Shah, A. M.
1980-01-01
The details of digital systems can be conveniently input into a design automation system by means of a hardware description language (HDL). The computer aided design and test (CADAT) system at NASA MSFC is used for LSI design. The digital design language (DDL) was selected as the HDL for the CADAT system. DDL translator output can be used for the hardware implementation of the digital design. Problems of selecting standard cells from the CADAT standard cell library to realize the logic implied by the DDL description of the system are addressed.
Information collection and processing of dam distortion in digital reservoir system
NASA Astrophysics Data System (ADS)
Liang, Yong; Zhang, Chengming; Li, Yanling; Wu, Qiulan; Ge, Pingju
2007-06-01
The "digital reservoir" is usually understood as describing the whole reservoir with digital information technology to make it serve the human existence and development furthest. Strictly speaking, the "digital reservoir" is referred to describing vast information of the reservoir in different dimension and space-time by RS, GPS, GIS, telemetry, remote-control and virtual reality technology based on computer, multi-media, large-scale memory and wide-band networks technology for the human existence, development and daily work, life and entertainment. The core of "digital reservoir" is to realize the intelligence and visibility of vast information of the reservoir through computers and networks. The dam is main building of reservoir, whose safety concerns reservoir and people's safety. Safety monitoring is important way guaranteeing the dam's safety, which controls the dam's running through collecting the dam's information concerned and developing trend. Safety monitoring of the dam is the process from collection and processing of initial safety information to forming safety concept in the brain. The paper mainly researches information collection and processing of the dam by digital means.
Grajski, Kamil A.
2016-01-01
Mechanisms underlying the emergence and plasticity of representational discontinuities in the mammalian primary somatosensory cortical representation of the hand are investigated in a computational model. The model consists of an input lattice organized as a three-digit hand forward-connected to a lattice of cortical columns each of which contains a paired excitatory and inhibitory cell. Excitatory and inhibitory synaptic plasticity of feedforward and lateral connection weights is implemented as a simple covariance rule and competitive normalization. Receptive field properties are computed independently for excitatory and inhibitory cells and compared within and across columns. Within digit representational zones intracolumnar excitatory and inhibitory receptive field extents are concentric, single-digit, small, and unimodal. Exclusively in representational boundary-adjacent zones, intracolumnar excitatory and inhibitory receptive field properties diverge: excitatory cell receptive fields are single-digit, small, and unimodal; and the paired inhibitory cell receptive fields are bimodal, double-digit, and large. In simulated syndactyly (webbed fingers), boundary-adjacent intracolumnar receptive field properties reorganize to within-representation type; divergent properties are reacquired following syndactyly release. This study generates testable hypotheses for assessment of cortical laminar-dependent receptive field properties and plasticity within and between cortical representational zones. For computational studies, present results suggest that concurrent excitatory and inhibitory plasticity may underlie novel emergent properties. PMID:27504086
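The plasticity mechanism described here combines a covariance learning rule with competitive normalization of afferent weights. A minimal numpy sketch of that combination follows; the lattice sizes, learning rate and stimulation pattern are illustrative assumptions, not the parameters of the published model.

    import numpy as np

    rng = np.random.default_rng(0)
    sites_per_digit, n_digits, n_columns = 5, 3, 12
    n_in = sites_per_digit * n_digits
    W = rng.uniform(0.0, 1.0, (n_columns, n_in))
    W /= W.sum(axis=1, keepdims=True)               # competitive normalization

    eta = 0.02
    for step in range(2000):
        x = np.zeros(n_in)
        d = rng.integers(n_digits)                  # stimulate one digit at a time
        x[d * sites_per_digit:(d + 1) * sites_per_digit] = rng.uniform(0.5, 1.0, sites_per_digit)
        y = W @ x                                   # column responses
        W += eta * np.outer(y - y.mean(), x - x.mean())   # covariance rule
        W = np.clip(W, 0.0, None)
        W /= W.sum(axis=1, keepdims=True) + 1e-12   # keep each column's total weight fixed

    # Each column's receptive field is the set of input sites with large weights.
    print(np.argmax(W, axis=1) // sites_per_digit)  # preferred digit per column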
NASA Astrophysics Data System (ADS)
Wang, Rui
It is known that high intensity radiated fields (HIRF) can produce upsets in digital electronics, and thereby degrade the performance of digital flight control systems. Such upsets, either from natural or man-made sources, can change data values on digital buses and memory and affect CPU instruction execution. HIRF environments are also known to trigger common-mode faults, affecting nearly-simultaneously multiple fault containment regions, and hence reducing the benefits of n-modular redundancy and other fault-tolerant computing techniques. Thus, it is important to develop models which describe the integration of the embedded digital system, where the control law is implemented, as well as the dynamics of the closed-loop system. In this dissertation, theoretical tools are presented to analyze the relationship between the design choices for a class of distributed recoverable computing platforms and the tracking performance degradation of a digital flight control system implemented on such a platform while operating in a HIRF environment. Specifically, a tractable hybrid performance model is developed for a digital flight control system implemented on a computing platform inspired largely by the NASA family of fault-tolerant, reconfigurable computer architectures known as SPIDER (scalable processor-independent design for enhanced reliability). The focus will be on the SPIDER implementation, which uses the computer communication system known as ROBUS-2 (reliable optical bus). A physical HIRF experiment was conducted at the NASA Langley Research Center in order to validate the theoretical tracking performance degradation predictions for a distributed Boeing 747 flight control system subject to a HIRF environment. An extrapolation of these results for scenarios that could not be physically tested is also presented.
A parallel algorithm for viewshed analysis in three-dimensional Digital Earth
NASA Astrophysics Data System (ADS)
Feng, Wang; Gang, Wang; Deji, Pan; Yuan, Liu; Liuzhong, Yang; Hongbo, Wang
2015-02-01
Viewshed analysis, often supported by geographic information systems, is widely used in three-dimensional (3D) Digital Earth systems. Many of the analyses involve the siting of features and real-time decision-making. Viewshed analysis is usually performed at a large scale, which poses substantial computational challenges as geographic datasets continue to become increasingly large. Previous research on viewshed analysis has generally been limited to a single data structure (i.e., the DEM), which cannot be used to analyze viewsheds in complicated scenes. In this paper, a real-time algorithm for viewshed analysis in Digital Earth is presented using the parallel computing of graphics processing units (GPUs). An occlusion for each geometric entity in the neighbor space of the viewshed point is generated according to line-of-sight. The region within the occlusion is marked by a stencil buffer within the programmable 3D visualization pipeline, and the marked region is concurrently drawn in red. In contrast to traditional algorithms based on line-of-sight, the new algorithm, in which the viewshed calculation is integrated with the rendering module, is more efficient and stable. The proposed method of viewshed generation is closer to the reality of the virtual geographic environment, and no DEM interpolation, which is seen as a computational burden, is needed. The algorithm was implemented in a 3D Digital Earth system (GeoBeans3D) with the DirectX application programming interface (API) and has been widely used in a range of applications.
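For contrast with the GPU occlusion approach proposed here, the traditional DEM-based viewshed computation that the authors move away from reduces to a line-of-sight check per target cell. The Python sketch below shows that conventional baseline test under the assumption of a regular DEM grid; it is included only to make the baseline explicit, not to represent the paper's algorithm.

    import numpy as np

    def visible(dem, vx, vy, tx, ty, eye_height=1.7):
        # Target cell (tx, ty) is visible from viewer cell (vx, vy) if no
        # intermediate cell rises above the straight sight line between them.
        n = max(abs(tx - vx), abs(ty - vy))
        if n == 0:
            return True
        z0 = dem[vy, vx] + eye_height
        z1 = dem[ty, tx]
        for i in range(1, n):
            t = i / n
            x = int(round(vx + t * (tx - vx)))
            y = int(round(vy + t * (ty - vy)))
            if dem[y, x] > z0 + t * (z1 - z0):   # terrain blocks the sight line
                return False
        return True

    dem = np.zeros((50, 50)); dem[25, 25] = 100.0   # a single obstacle
    print(visible(dem, 0, 25, 49, 25))              # False: blocked by the obstacle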
Scaling up digital circuit computation with DNA strand displacement cascades.
Qian, Lulu; Winfree, Erik
2011-06-03
To construct sophisticated biochemical circuits from scratch, one needs to understand how simple the building blocks can be and how robustly such circuits can scale up. Using a simple DNA reaction mechanism based on a reversible strand displacement process, we experimentally demonstrated several digital logic circuits, culminating in a four-bit square-root circuit that comprises 130 DNA strands. These multilayer circuits include thresholding and catalysis within every logical operation to perform digital signal restoration, which enables fast and reliable function in large circuits with roughly constant switching time and linear signal propagation delays. The design naturally incorporates other crucial elements for large-scale circuitry, such as general debugging tools, parallel circuit preparation, and an abstraction hierarchy supported by an automated circuit compiler.
Image Harvest: an open-source platform for high-throughput plant image processing and analysis.
Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal
2016-05-01
High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.
Image analysis and machine learning in digital pathology: Challenges and opportunities.
Madabhushi, Anant; Lee, George
2016-10-01
With the rise in whole slide scanner technology, large numbers of tissue slides are being scanned and represented and archived digitally. While digital pathology has substantial implications for telepathology, second opinions, and education there are also huge research opportunities in image computing with this new source of "big data". It is well known that there is fundamental prognostic data embedded in pathology images. The ability to mine "sub-visual" image features from digital pathology slide images, features that may not be visually discernible by a pathologist, offers the opportunity for better quantitative modeling of disease appearance and hence possibly improved prediction of disease aggressiveness and patient outcome. However the compelling opportunities in precision medicine offered by big digital pathology data come with their own set of computational challenges. Image analysis and computer assisted detection and diagnosis tools previously developed in the context of radiographic images are woefully inadequate to deal with the data density in high resolution digitized whole slide images. Additionally there has been recent substantial interest in combining and fusing radiologic imaging and proteomics and genomics based measurements with features extracted from digital pathology images for better prognostic prediction of disease aggressiveness and patient outcome. Again there is a paucity of powerful tools for combining disease specific features that manifest across multiple different length scales. The purpose of this review is to discuss developments in computational image analysis tools for predictive modeling of digital pathology images from a detection, segmentation, feature extraction, and tissue classification perspective. We discuss the emergence of new handcrafted feature approaches for improved predictive modeling of tissue appearance and also review the emergence of deep learning schemes for both object detection and tissue classification. We also briefly review some of the state of the art in fusion of radiology and pathology images and also combining digital pathology derived image measurements with molecular "omics" features for better predictive modeling. The review ends with a brief discussion of some of the technical and computational challenges to be overcome and reflects on future opportunities for the quantitation of histopathology. Copyright © 2016 Elsevier B.V. All rights reserved.
Teaching and Learning Physics in a 1:1 Laptop School
NASA Astrophysics Data System (ADS)
Zucker, Andrew A.; Hug, Sarah T.
2008-12-01
1:1 laptop programs, in which every student is provided with a personal computer to use during the school year, permit increased and routine use of powerful, user-friendly computer-based tools. Growing numbers of 1:1 programs are reshaping the roles of teachers and learners in science classrooms. At the Denver School of Science and Technology, a public charter high school where a large percentage of students come from low-income families, 1:1 laptops are used often by teachers and students. This article describes the school's use of laptops, the Internet, and related digital tools, especially for teaching and learning physics. The data are from teacher and student surveys, interviews, classroom observations, and document analyses. Physics students and teachers use an interactive digital textbook; Internet-based simulations (some developed by a Nobel Prize winner); word processors; digital drop boxes; email; formative electronic assessments; computer-based and stand-alone graphing calculators; probes and associated software; and digital video cameras to explore hypotheses, collaborate, engage in scientific inquiry, and to identify strengths and weaknesses of students' understanding of physics. Technology provides students at DSST with high-quality tools to explore scientific concepts and the experiences of teachers and students illustrate effective uses of digital technology for high school physics.
NASA Technical Reports Server (NTRS)
Belcastro, C. M.
1983-01-01
Flight-critical computer-based control systems designed for advanced aircraft must exhibit ultrareliable performance in lightning-charged environments. Digital system upset can occur as a result of lightning-induced electrical transients, and a methodology was developed to test specific digital systems for upset susceptibility. Initial upset data indicate that there are several distinct upset modes and that the occurrence of upset is related to the relative synchronization of the transient input with the processing state of the digital system. A large upset test data base will aid in the formulation and verification of analytical upset reliability modeling techniques which are being developed.
The role of digital cartographic data in the geosciences
Guptill, S.C.
1983-01-01
The increasing demand of the Nation's natural resource developers for the manipulation, analysis, and display of large quantities of earth-science data has necessitated the use of computers and the building of geoscience information systems. These systems require, in digital form, the spatial data on map products. The basic cartographic data shown on quadrangle maps provide a foundation for the addition of geological and geophysical data. If geoscience information systems are to realize their full potential, large amounts of digital cartographic base data must be available. A major goal of the U.S. Geological Survey is to create, maintain, manage, and distribute a national cartographic and geographic digital database. This unified database will contain numerous categories (hydrography, hypsography, land use, etc.) that, through the use of standardized data-element definitions and formats, can be used easily and flexibly to prepare cartographic products and perform geoscience analysis. © 1983.
Digital quantum simulators in a scalable architecture of hybrid spin-photon qubits
Chiesa, Alessandro; Santini, Paolo; Gerace, Dario; Raftery, James; Houck, Andrew A.; Carretta, Stefano
2015-01-01
Resolving quantum many-body problems represents one of the greatest challenges in physics and physical chemistry, due to the prohibitively large computational resources that would be required by using classical computers. A solution has been foreseen by directly simulating the time evolution through sequences of quantum gates applied to arrays of qubits, i.e. by implementing a digital quantum simulator. Superconducting circuits and resonators are emerging as an extremely promising platform for quantum computation architectures, but a digital quantum simulator proposal that is straightforwardly scalable, universal, and realizable with state-of-the-art technology is presently lacking. Here we propose a viable scheme to implement a universal quantum simulator with hybrid spin-photon qubits in an array of superconducting resonators, which is intrinsically scalable and allows for local control. As representative examples we consider the transverse-field Ising model, a spin-1 Hamiltonian, and the two-dimensional Hubbard model and we numerically simulate the scheme by including the main sources of decoherence. PMID:26563516
Digital mammography, cancer screening: Factors important for image compression
NASA Technical Reports Server (NTRS)
Clarke, Laurence P.; Blaine, G. James; Doi, Kunio; Yaffe, Martin J.; Shtern, Faina; Brown, G. Stephen; Winfield, Daniel L.; Kallergi, Maria
1993-01-01
The use of digital mammography for breast cancer screening poses several novel problems such as development of digital sensors, computer assisted diagnosis (CAD) methods for image noise suppression, enhancement, and pattern recognition, compression algorithms for image storage, transmission, and remote diagnosis. X-ray digital mammography using novel direct digital detection schemes or film digitizers results in large data sets and, therefore, image compression methods will play a significant role in the image processing and analysis by CAD techniques. In view of the extensive compression required, the relative merit of 'virtually lossless' versus lossy methods should be determined. A brief overview is presented here of the developments of digital sensors, CAD, and compression methods currently proposed and tested for mammography. The objective of the NCI/NASA Working Group on Digital Mammography is to stimulate the interest of the image processing and compression scientific community for this medical application and identify possible dual use technologies within the NASA centers.
NASA Astrophysics Data System (ADS)
Gonzalez, Javier
A full-field method for visualizing deformation around the crack tip in a fracture process with large strains is developed. A digital image correlation (DIC) program is used to incrementally compute strains and displacements between two consecutive images of a deformation process. Values of strain and displacement for consecutive deformations are summed, thereby avoiding convergence problems in the DIC algorithm when large deformations are investigated. The method is used to investigate the strain distribution within 1 mm of the crack tip in a particulate composite solid (propellant) using microscopic visualization of the deformation process.
NASA Technical Reports Server (NTRS)
Christenson, D.; Gordon, M.; Kistler, R.; Kriegler, F.; Lampert, S.; Marshall, R.; Mclaughlin, R.
1977-01-01
A third-generation, fast, low-cost, multispectral recognition system (MIDAS) able to keep pace with the large quantity and high rates of data acquisition from large regions with present and projected sensors is described. The program can process a complete ERTS frame in forty seconds and provide a color map of sixteen constituent categories in a few minutes. A principal objective of the MIDAS program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turn-around time and significant gains in throughput. The hardware and software generated in the overall program are described. The system contains a midi-computer to control the various high speed processing elements in the data path, a preprocessor to condition data, and a classifier which implements an all-digital prototype multivariate Gaussian maximum likelihood or Bayesian decision algorithm. Sufficient software was developed to perform signature extraction, control the preprocessor, compute classifier coefficients, control the classifier operation, operate the color display and printer, and diagnose operation.
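The classifier stage described above implements a multivariate Gaussian maximum likelihood decision rule. A compact numpy sketch of that rule is shown below; the per-class means, covariances and priors would come from the signature-extraction step, and the variable names here are illustrative only.

    import numpy as np

    def gaussian_ml_classify(pixels, means, covs, priors=None):
        # pixels: (n, bands); means[k]: (bands,); covs[k]: (bands, bands).
        # Returns the index of the class with the highest (log) likelihood per pixel.
        n_classes = len(means)
        scores = np.empty((pixels.shape[0], n_classes))
        for k in range(n_classes):
            diff = pixels - means[k]
            inv = np.linalg.inv(covs[k])
            maha = np.einsum('ij,jk,ik->i', diff, inv, diff)
            _, logdet = np.linalg.slogdet(covs[k])
            scores[:, k] = -0.5 * (maha + logdet)
            if priors is not None:               # Bayesian variant: add the log prior
                scores[:, k] += np.log(priors[k])
        return scores.argmax(axis=1)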
CCD Systems for Searching for Near-Earth Asteroids
NASA Technical Reports Server (NTRS)
Harris, A.
1994-01-01
Large format CCD systems are superior to photographic systems in terms of quantum efficiency and in that they yield digital output directly, which can be computer-analyzed to detect moving objects and to obtain astrometric measurements.
Lystrom, David J.
1972-01-01
Various methods of verifying real-time streamflow data are outlined in part II. Relatively large errors (those greater than 20-30 percent) can be detected readily by use of well-designed verification programs for a digital computer, and smaller errors can be detected only by discharge measurements and field observations. The capability to substitute a simulated discharge value for missing or erroneous data is incorporated in some of the verification routines described. The routines represent concepts ranging from basic statistical comparisons to complex watershed modeling and provide a selection from which real-time data users can choose a suitable level of verification.
Open Source Live Distributions for Computer Forensics
NASA Astrophysics Data System (ADS)
Giustini, Giancarlo; Andreolini, Mauro; Colajanni, Michele
Current distributions of open source forensic software provide digital investigators with a large set of heterogeneous tools. Their use is not always focused on the target and requires high technical expertise. We present a new GNU/Linux live distribution, named CAINE (Computer Aided INvestigative Environment) that contains a collection of tools wrapped up into a user friendly environment. The CAINE forensic framework introduces novel important features, aimed at filling the interoperability gap across different forensic tools. Moreover, it provides a homogeneous graphical interface that drives digital investigators during the acquisition and analysis of electronic evidence, and it offers a semi-automatic mechanism for the creation of the final report.
Rossi, Ernest; Mortimer, Jane; Rossi, Kathryn
2013-04-01
Culturomics is a new scientific discipline of the digital humanities: the use of computer algorithms to search for meaning in large databases of text and media. This new digital discipline is used to explore 200 years of the history of hypnosis and psychotherapy in over five million digitized books from more than 40 university libraries around the world. It graphically compares the frequencies of English words about hypnosis, hypnotherapy, psychoanalysis, psychotherapy, and their founders from 1800 to 2008. This new perspective explores issues such as: Who were the major innovators in the history of therapeutic hypnosis, psychoanalysis, and psychotherapy? How well does this new digital approach to the humanities correspond to traditional histories of hypnosis and psychotherapy?
S-1 project. Volume I. Architecture. 1979 annual report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1979-01-01
The US Navy is one of the world's largest users of digital computing equipment having a procurement cost of at least $50,000, and is the single largest such computer customer in the Department of Defense. Its projected acquisition plan for embedded computer systems during the first half of the 80s contemplates the installation of over 10,000 such systems at an estimated cost of several billions of dollars. This expenditure, though large, is dwarfed by the 85 billion dollars which DOD is projected to spend during the next half-decade on computer software, the near-majority of which will be spent by the Navy; the life-cycle costs of the 700,000+ lines of software for a single large Navy weapons systems application (e.g., AEGIS) have been conservatively estimated at most of a billion dollars. The S-1 Project is dedicated to realizing potentially large improvements in the efficiency with which such very large sums may be spent, so that greater military effectiveness may be secured earlier, and with smaller expenditures. The fundamental objectives of the S-1 Project's work are first to enable the Navy to quickly, reliably and inexpensively evaluate at any time what is available from the state-of-the-art in digital processing systems and what the relevance of such systems may be to Navy data processing applications; and second to provide reference prototype systems to support possible competitive procurement action leading to deployment of such systems.
The future of medical diagnostics: large digitized databases.
Kerr, Wesley T; Lau, Edward P; Owens, Gwen E; Trefler, Aaron
2012-09-01
The electronic health record mandate within the American Recovery and Reinvestment Act of 2009 will have a far-reaching effect on medicine. In this article, we provide an in-depth analysis of how this mandate is expected to stimulate the production of large-scale, digitized databases of patient information. There is evidence to suggest that millions of patients and the National Institutes of Health will fully support the mining of such databases to better understand the process of diagnosing patients. This data mining likely will reaffirm and quantify known risk factors for many diagnoses. This quantification may be leveraged to further develop computer-aided diagnostic tools that weigh risk factors and provide decision support for health care providers. We expect that creation of these databases will stimulate the development of computer-aided diagnostic support tools that will become an integral part of modern medicine.
NASA Astrophysics Data System (ADS)
Broccard, Frédéric D.; Joshi, Siddharth; Wang, Jun; Cauwenberghs, Gert
2017-08-01
Objective. Computation in nervous systems operates with different computational primitives, and on different hardware, than traditional digital computation and is thus subjected to different constraints from its digital counterpart regarding the use of physical resources such as time, space and energy. In an effort to better understand neural computation on a physical medium with similar spatiotemporal and energetic constraints, the field of neuromorphic engineering aims to design and implement electronic systems that emulate in very large-scale integration (VLSI) hardware the organization and functions of neural systems at multiple levels of biological organization, from individual neurons up to large circuits and networks. Mixed analog/digital neuromorphic VLSI systems are compact, consume little power and operate in real time independently of the size and complexity of the model. Approach. This article highlights the current efforts to interface neuromorphic systems with neural systems at multiple levels of biological organization, from the synaptic to the system level, and discusses the prospects for future biohybrid systems with neuromorphic circuits of greater complexity. Main results. Single silicon neurons have been interfaced successfully with invertebrate and vertebrate neural networks. This approach allowed the investigation of neural properties that are inaccessible with traditional techniques while providing a realistic biological context not achievable with traditional numerical modeling methods. At the network level, populations of neurons are envisioned to communicate bidirectionally with neuromorphic processors of hundreds or thousands of silicon neurons. Recent work on brain-machine interfaces suggests that this is feasible with current neuromorphic technology. Significance. Biohybrid interfaces between biological neurons and VLSI neuromorphic systems of varying complexity have started to emerge in the literature. Primarily intended as a computational tool for investigating fundamental questions related to neural dynamics, the sophistication of current neuromorphic systems now allows direct interfaces with large neuronal networks and circuits, resulting in potentially interesting clinical applications for neuroengineering systems, neuroprosthetics and neurorehabilitation.
Modeling of a latent fault detector in a digital system
NASA Technical Reports Server (NTRS)
Nagel, P. M.
1978-01-01
Methods of modeling the detection time or latency period of a hardware fault in a digital system are proposed that explain how a computer detects faults in a computational mode. The objectives were to study how software reacts to a fault, to account for as many variables as possible affecting detection, and to forecast a given program's detecting ability prior to computation. A series of experiments were conducted on a small emulated microprocessor with fault injection capability. Results indicate that the detecting capability of a program largely depends on the instruction subset used during computation and the frequency of its use, and has little direct dependence on such variables as fault mode, number set, degree of branching and program length. A model is discussed which employs an analogy with balls in an urn to explain the rate at which subsequent repetitions of an instruction or instruction set detect a given fault.
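The balls-in-an-urn analogy amounts to treating each instruction execution as a draw that may or may not exercise the faulty logic. Under the simplest reading of that analogy (independent draws with a fixed detecting fraction, an assumption made here rather than the paper's fitted model), fault latency follows a geometric distribution, as the Python sketch below illustrates.

    import numpy as np

    rng = np.random.default_rng(1)

    def mean_latency(detecting_fraction, n_faults=10000):
        # Each executed instruction is a draw; a draw from the detecting subset
        # exposes the fault. Latency is the number of draws until detection.
        return rng.geometric(detecting_fraction, size=n_faults).mean()

    # If 5% of executed instructions exercise the faulty logic, the expected
    # latency is about 1 / 0.05 = 20 instruction executions.
    print(mean_latency(0.05))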
Ethics Regulation in Social Computing Research: Examining the Role of Institutional Review Boards.
Vitak, Jessica; Proferes, Nicholas; Shilton, Katie; Ashktorab, Zahra
2017-12-01
The parallel rise of pervasive data collection platforms and computational methods for collecting, analyzing, and drawing inferences from large quantities of user data has advanced social computing research, investigating digital traces to understand mediated behaviors of individuals, groups, and societies. At the same time, methods employed to access these data have raised questions about ethical research practices. This article provides insights into U.S. institutional review boards' (IRBs) attitudes and practices regulating social computing research. Through descriptive and inferential analysis of survey data from staff at 59 IRBs at research universities, we examine how IRBs evaluate the growing variety of studies using pervasive digital data. Findings unpack the difficulties IRB staff face evaluating increasingly technical research proposals while highlighting the belief in their ability to surmount these difficulties. They also indicate a lack of consensus among IRB staff about what should be reviewed and a willingness to work closely with researchers.
Computer Use by Preschool Children: Rethinking Practice as Digital Natives Come to Preschool
ERIC Educational Resources Information Center
Zevenbergen, Robyn; Logan, Helen
2008-01-01
This paper reports on the outcomes of a survey implemented in a large regional community of Australia. The survey was completed by parents of children aged four-five years and attending local early childhood centres. The survey identified the types of access and use of computers by preschool children. It was found that the children of the…
A Study on Fast Gates for Large-Scale Quantum Simulation with Trapped Ions
Taylor, Richard L.; Bentley, Christopher D. B.; Pedernales, Julen S.; Lamata, Lucas; Solano, Enrique; Carvalho, André R. R.; Hope, Joseph J.
2017-01-01
Large-scale digital quantum simulations require thousands of fundamental entangling gates to construct the simulated dynamics. Despite success in a variety of small-scale simulations, quantum information processing platforms have hitherto failed to demonstrate the combination of precise control and scalability required to systematically outmatch classical simulators. We analyse how fast gates could enable trapped-ion quantum processors to achieve the requisite scalability to outperform classical computers without error correction. We analyze the performance of a large-scale digital simulator, and find that fidelity of around 70% is realizable for π-pulse infidelities below 10−5 in traps subject to realistic rates of heating and dephasing. This scalability relies on fast gates: entangling gates faster than the trap period. PMID:28401945
A Study on Fast Gates for Large-Scale Quantum Simulation with Trapped Ions.
Taylor, Richard L; Bentley, Christopher D B; Pedernales, Julen S; Lamata, Lucas; Solano, Enrique; Carvalho, André R R; Hope, Joseph J
2017-04-12
Large-scale digital quantum simulations require thousands of fundamental entangling gates to construct the simulated dynamics. Despite success in a variety of small-scale simulations, quantum information processing platforms have hitherto failed to demonstrate the combination of precise control and scalability required to systematically outmatch classical simulators. We analyse how fast gates could enable trapped-ion quantum processors to achieve the requisite scalability to outperform classical computers without error correction. We analyze the performance of a large-scale digital simulator, and find that fidelity of around 70% is realizable for π-pulse infidelities below 10 -5 in traps subject to realistic rates of heating and dephasing. This scalability relies on fast gates: entangling gates faster than the trap period.
NASA Technical Reports Server (NTRS)
Low, M. D.; Baker, M.; Ferguson, R.; Frost, J. D., Jr.
1972-01-01
This paper describes a complete electroencephalographic acquisition and transmission system, designed to meet the needs of a large hospital with multiple critical care patient monitoring units. The system provides rapid and prolonged access to a centralized recording and computing area from remote locations within the hospital complex, and from locations in other hospitals and other cities. The system includes quick-on electrode caps, amplifier units and cable transmission for access from within the hospital, and EEG digitization and telephone transmission for access from other hospitals or cities.
NASA Astrophysics Data System (ADS)
Wen, Sy-Bor; Bhaskar, Arun; Zhang, Hongjie
2018-07-01
A scanning digital lithography system using a computer-controlled digital spatial light modulator, a spatial filter, an infinity-corrected optical microscope and a high-precision translation stage is proposed and examined. By utilizing the spatial filter to limit the orders of diffraction modes for light delivered from the spatial light modulator, we are able to achieve diffraction-limited, deep sub-micron spatial resolution with the scanning digital lithography system using standard one-inch optical components at reasonable prices. Raster scanning of the system with a high-speed, high-precision x-y translation stage and a piezo mount that adjusts the focal position of the objective lens in real time allows us to achieve large-area, sub-micron-resolved patterning at high speed (compared with e-beam lithography). It is determined in this study that, to achieve high-quality stitching of lithography patterns with raster scanning, a high-resolution rotation stage is required to ensure that the x and y directions of the projected pattern match the x and y translation directions of the nanometer-precision x-y translation stage.
Digital robust control law synthesis using constrained optimization
NASA Technical Reports Server (NTRS)
Mukhopadhyay, Vivekananda
1989-01-01
Development of digital robust control laws for active control of high performance flexible aircraft and large space structures is a research area of significant practical importance. The flexible system is typically modeled by a large order state space system of equations in order to accurately represent the dynamics. The active control law must satisfy multiple conflicting design requirements and maintain certain stability margins, yet should be simple enough to be implementable on an onboard digital computer. Described here is an application of a generic digital control law synthesis procedure for such a system, using optimal control theory and a constrained optimization technique. A linear quadratic Gaussian type cost function is minimized by updating the free parameters of the digital control law, while trying to satisfy a set of constraints on the design loads, responses and stability margins. Analytical expressions for the gradients of the cost function and the constraints with respect to the control law design variables are used to facilitate rapid numerical convergence. These gradients can be used for sensitivity study and may be integrated into a simultaneous structure and control optimization scheme.
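The synthesis procedure minimizes a quadratic cost over the free parameters of the digital control law subject to constraints. A toy Python/scipy sketch of that idea follows for a scalar discrete-time plant with a single feedback gain; the plant, weights and margin constraint are invented for illustration and are far simpler than the flexible-aircraft models discussed above.

    import numpy as np
    from scipy.optimize import minimize

    a, b = 1.05, 0.5          # unstable scalar plant: x[t+1] = a*x[t] + b*u[t]
    q, r = 1.0, 0.1           # state and control weights in the quadratic cost

    def quadratic_cost(K, horizon=200, x0=1.0):
        x, J = x0, 0.0
        for _ in range(horizon):
            u = -K[0] * x
            J += q * x**2 + r * u**2
            x = a * x + b * u
        return J

    # Constraint: keep the closed-loop pole inside radius 0.9 (a stability margin).
    cons = {'type': 'ineq', 'fun': lambda K: 0.9 - abs(a - b * K[0])}
    result = minimize(quadratic_cost, x0=[0.5], constraints=cons, method='SLSQP')
    print(result.x)           # constrained optimal feedback gain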
NASA Technical Reports Server (NTRS)
Mcmurtry, G. J.; Petersen, G. W. (Principal Investigator)
1975-01-01
The author has identified the following significant results. It was found that the high speed man machine interaction capability is a distinct advantage of the image 100; however, the small size of the digital computer in the system is a definite limitation. The system can be highly useful in an analysis mode in which it complements a large general purpose computer. The image 100 was found to be extremely valuable in the analysis of aircraft MSS data where the spatial resolution begins to approach photographic quality and the analyst can exercise interpretation judgements and readily interact with the machine.
Single stock dynamics on high-frequency data: from a compressed coding perspective.
Fushing, Hsieh; Chen, Shu-Chun; Hwang, Chii-Ruey
2014-01-01
High-frequency return, trading volume and transaction number are digitally coded via a nonparametric computing algorithm, called hierarchical factor segmentation (HFS), and then are coupled together to reveal a single stock dynamics without global state-space structural assumptions. The base-8 digital coding sequence, which is capable of revealing contrasting aggregation against sparsity of extreme events, is further compressed into a shortened sequence of state transitions. This compressed digital code sequence vividly demonstrates that the aggregation of large absolute returns is the primary driving force for stimulating both the aggregations of large trading volumes and transaction numbers. The state of system-wise synchrony is manifested with very frequent recurrence in the stock dynamics. And this data-driven dynamic mechanism is seen to correspondingly vary as the global market transits in and out of contraction-expansion cycles. These results not only elaborate the stock dynamics of interest to a fuller extent, but also contradict some classical theories in finance. Overall this version of stock dynamics is potentially more coherent and realistic, especially when the current financial market is increasingly powered by high-frequency trading via computer algorithms, rather than by individual investors.
Single Stock Dynamics on High-Frequency Data: From a Compressed Coding Perspective
Fushing, Hsieh; Chen, Shu-Chun; Hwang, Chii-Ruey
2014-01-01
High-frequency return, trading volume and transaction number are digitally coded via a nonparametric computing algorithm, called hierarchical factor segmentation (HFS), and then are coupled together to reveal a single stock dynamics without global state-space structural assumptions. The base-8 digital coding sequence, which is capable of revealing contrasting aggregation against sparsity of extreme events, is further compressed into a shortened sequence of state transitions. This compressed digital code sequence vividly demonstrates that the aggregation of large absolute returns is the primary driving force for stimulating both the aggregations of large trading volumes and transaction numbers. The state of system-wise synchrony is manifested with very frequent recurrence in the stock dynamics. And this data-driven dynamic mechanism is seen to correspondingly vary as the global market transits in and out of contraction-expansion cycles. These results not only elaborate the stock dynamics of interest to a fuller extent, but also contradict some classical theories in finance. Overall this version of stock dynamics is potentially more coherent and realistic, especially when the current financial market is increasingly powered by high-frequency trading via computer algorithms, rather than by individual investors. PMID:24586235
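The base-8 code arises from coupling three binary indicator sequences (extreme return, extreme volume, extreme transaction count) digit-wise. The Python sketch below shows one way to form such a code; the z-score thresholding stands in for the HFS algorithm used in the paper and is an assumption made purely for illustration, as are the simulated input series.

    import numpy as np

    def base8_code(returns, volumes, counts, thr=2.0):
        # Flag extreme events in each series, then couple the three binary
        # indicators into a single base-8 digit per time step (0..7).
        z = lambda s: (s - s.mean()) / s.std()
        r = (np.abs(z(returns)) > thr).astype(int)
        v = (z(volumes) > thr).astype(int)
        n = (z(counts) > thr).astype(int)
        return 4 * r + 2 * v + n

    rng = np.random.default_rng(2)
    code = base8_code(rng.normal(size=500), rng.lognormal(size=500), rng.poisson(50, 500))
    print(np.bincount(code, minlength=8))   # frequency of each base-8 symbol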
Landes, Constantin A; Weichert, Frank; Geis, Philipp; Wernstedt, Katrin; Wilde, Anja; Fritsch, Helga; Wagner, Mathias
2005-08-01
This study analyses tissue-plastinated vs. celloidin-embedded large serial sections, their inherent artefacts and aptitude with common video, analog or digital photographic on-screen reproduction. Subsequent virtual 3D microanatomical reconstruction will increase our knowledge of normal and pathological microanatomy for cleft-lip-palate (clp) reconstructive surgery. Of 18 fetal (six clp, 12 control) specimens, six randomized specimens (two clp) were BiodurE12-plastinated, sawn, burnished 90 microm thick transversely (five) or frontally (one), stained with azureII/methylene blue, and counterstained with basic-fuchsin (TP-AMF). Twelve remaining specimens (four clp) were celloidin-embedded, microtome-sectioned 75 microm thick transversely (ten) or frontally (two), and stained with haematoxylin-eosin (CE-HE). Computed-planimetry gauged artefacts, structure differentiation was compared with light microscopy on video, analog and digital photography. Total artefact was 0.9% (TP-AMF) and 2.1% (CE-HE); TP-AMF showed higher colour contrast, gamut and luminance, and CE-HE more red contrast, saturation and hue (P < 0.4). All (100%) structures of interest were light microscopically discerned, 83% on video, 76% on analog photography and 98% in digital photography. Computed image analysis assessed the greatest colour contrast, gamut, luminance and saturation on video; the most detailed, colour-balanced and sharpest images were obtained with digital photography (P < 0.02). TP-AMF retained spatial oversight, covered the entire area of interest and should be combined in different specimens with CE-HE which enables more refined muscle fibre reproduction. Digital photography is preferred for on-screen analysis.
Processing Ocean Images to Detect Large Drift Nets
NASA Technical Reports Server (NTRS)
Veenstra, Tim
2009-01-01
A computer program processes the digitized outputs of a set of downward-looking video cameras aboard an aircraft flying over the ocean. The purpose served by this software is to facilitate the detection of large drift nets that have been lost, abandoned, or jettisoned. The development of this software and of the associated imaging hardware is part of a larger effort to develop means of detecting and removing large drift nets before they cause further environmental damage to the ocean and to shores on which they sometimes impinge. The software is capable of near-realtime processing of as many as three video feeds at a rate of 30 frames per second. After a user sets the parameters of an adjustable algorithm, the software analyzes each video stream, detects any anomaly, issues a command to point a high-resolution camera toward the location of the anomaly, and, once the camera has been so aimed, issues a command to trigger the camera shutter. The resulting high-resolution image is digitized, and the resulting data are automatically uploaded to the operator's computer for analysis.
Modern Methods for fast generation of digital holograms
NASA Astrophysics Data System (ADS)
Tsang, P. W. M.; Liu, J. P.; Cheung, K. W. K.; Poon, T.-C.
2010-06-01
With the advancement of computers, digital holography (DH) has become an area of interest that has gained much popularity. Research findings derived from this technology enables holograms representing three dimensional (3-D) scenes to be acquired with optical means, or generated with numerical computation. In both cases, the holograms are in the form of numerical data that can be recorded, transmitted, and processed with digital techniques. On top of that, the availability of high capacity digital storage and wide-band communication technologies also cast light on the emergence of real time video holographic systems, enabling animated 3-D contents to be encoded as holographic data, and distributed via existing medium. At present, development in DH has reached a reasonable degree of maturity, but at the same time the heavy computation involved also imposes difficulty in practical applications. In this paper, a summary on a number of successful accomplishments that have been made recently in overcoming this problem is presented. Subsequently, we shall propose an economical framework that is suitable for real time generation and transmission of holographic video signals over existing distribution media. The proposed framework includes an aspect of extending the depth range of the object scene, which is important for the display of large-scale objects.
An automated procedure for developing hybrid computer simulations of turbofan engines
NASA Technical Reports Server (NTRS)
Szuch, J. R.; Krosel, S. M.
1980-01-01
A systematic, computer-aided, self-documenting methodology for developing hybrid computer simulations of turbofan engines is presented. The methodology makes use of a host program that can run on a large digital computer and a machine-dependent target (hybrid) program. The host program performs all of the calculations and data manipulations needed to transform user-supplied engine design information to a form suitable for the hybrid computer. The host program also trims the self-contained engine model to match specified design point information. A test case is described and comparisons between hybrid simulation and specified engine performance data are presented.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-06
..., "Configuration Management Plans for Digital Computer Software Used in Safety Systems of Nuclear Power Plants... Digital Computer Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory..., Reviews, and Audits for Digital Computer Software Used in Safety Systems of Nuclear Power Plants." This...
A system for the input and storage of data in the Besm-6 digital computer
NASA Technical Reports Server (NTRS)
Schmidt, K.; Blenke, L.
1975-01-01
Computer programs used for the decoding and storage of large volumes of data on the BESM-6 computer are described. The following factors are discussed: the programming control language allows the programs to be run as part of a modular programming system used in data processing; data control is executed in a hierarchically built file on magnetic tape with sequential index storage; and the programs are not dependent on the structure of the data.
Exploring digital professionalism.
Ellaway, Rachel H; Coral, Janet; Topps, David; Topps, Maureen
2015-01-01
The widespread use of digital media (both computing devices and the services they access) has blurred the boundaries between our personal and professional lives. Contemporary students are the last to remember a time before the widespread use of the Internet and they will be the first to practice in a largely e-health environment. This article explores concepts of digital professionalism and their place in contemporary medical education, and proposes a series of principles of digital professionalism to guide teaching, learning and practice in the healthcare professions. Despite the many risks and fears surrounding their use, digital media are not an intrinsic threat to medical professionalism. Professionals should maintain the capacity for deliberate, ethical, and accountable practice when using digital media. The authors describe a digital professionalism framework structured around concepts of proficiency, reputation, and responsibility. Digital professionalism can be integrated into medical education using strategies based on awareness, alignment, assessment, and accountability. These principles of digital professionalism provide a way for medical students and medical practitioners to embrace the positive aspects of digital media use while being mindful and deliberate in its use to avoid or minimize any negative consequences.
The Identity Mapping Project: Demographic differences in patterns of distributed identity.
Gilbert, Richard L; Dionisio, John David N; Forney, Andrew; Dorin, Philip
2015-01-01
The advent of cloud computing and a multi-platform digital environment is giving rise to a new phase of human identity called "The Distributed Self." In this conception, aspects of the self are distributed into a variety of 2D and 3D digital personas with the capacity to reflect any number of combinations of now malleable personality traits. In this way, the source of human identity remains internal and embodied, but the expression or enactment of the self becomes increasingly external, disembodied, and distributed on demand. The Identity Mapping Project (IMP) is an interdisciplinary collaboration between psychology and computer science designed to empirically investigate the development of distributed forms of identity. Methodologically, it collects a large database of "identity maps," computerized graphical representations of how active someone is online and how their identity is expressed and distributed across 7 core digital domains: email, blogs/personal websites, social networks, online forums, online dating sites, character-based digital games, and virtual worlds. The current paper reports on gender and age differences in online identity based on an initial database of distributed identity profiles.
NASA Technical Reports Server (NTRS)
Savaglio, Clare
1989-01-01
A realistic simulation of an aircraft in flight using the AD 100 digital computer is presented. The implementation of three model features is specifically discussed: (1) a large aerodynamic data base (130,000 function values) which is evaluated using function interpolation to obtain the aerodynamic coefficients; (2) an option to trim the aircraft in longitudinal flight; and (3) a flight control system which includes a digital controller. Since the model includes a digital controller, the simulation implements not only continuous time equations but also discrete time equations; thus the model has a mixed-data structure.
Escott, Edward J; Rubinstein, David
2004-01-01
It is often necessary for radiologists to use digital images in presentations and conferences. Most imaging modalities produce images in the Digital Imaging and Communications in Medicine (DICOM) format. The image files tend to be large and thus cannot be directly imported into most presentation software, such as Microsoft PowerPoint; the large files also consume storage space. There are many free programs that allow viewing and processing of these files on a personal computer, including conversion to more common file formats such as the Joint Photographic Experts Group (JPEG) format. Free DICOM image viewing and processing software for computers running on the Microsoft Windows operating system has already been evaluated. However, many people use the Macintosh (Apple Computer) platform, and a number of programs are available for these users. The World Wide Web was searched for free DICOM image viewing or processing software that was designed for the Macintosh platform or is written in Java and is therefore platform independent. The features of these programs and their usability were evaluated. There are many free programs for the Macintosh platform that enable viewing and processing of DICOM images. (c) RSNA, 2004.
Converting laserdisc video to digital video: a demonstration project using brain animations.
Jao, C S; Hier, D B; Brint, S U
1995-01-01
Interactive laserdiscs are of limited value in large group learning situations due to the expense of establishing multiple workstations. The authors implemented an alternative to laserdisc video by using indexed digital video combined with an expert system. High-quality video was captured from a laserdisc player and combined with waveform audio into an audio-video-interleave (AVI) file format in the Microsoft Video-for-Windows environment (Microsoft Corp., Seattle, WA). With the use of an expert system, a knowledge-based computer program provided random access to these indexed AVI files. The program can be played on any multimedia computer without the need for laserdiscs. This system offers a high level of interactive video without the overhead and cost of a laserdisc player.
Kelleher, Maureen E; Puchalski, Sarah M; Drake, Christiana; le Jeune, Sarah S
2014-07-01
To evaluate the sensitivity and specificity of direct digital abdominal radiography for the diagnosis of enterolithiasis in equids and to assess the effect of the number and anatomic location of enteroliths and gas distention of the gastrointestinal tract on diagnostic sensitivity of the technique. Retrospective case series. 238 horses and ponies ≥ 1 year old that underwent digital abdominal radiography with subsequent exploratory celiotomy or postmortem examination. For each case, 3 reviewers independently evaluated radiographic views. Radiographic images were evaluated for presence or absence and location of enteroliths and the degree of gas distention. Signalment, definitive diagnosis based on exploratory celiotomy or postmortem examination findings, and number and anatomic location of enteroliths were obtained from the medical records. 70 of the 238 (29.4%) equids had confirmed enterolithiasis. With regard to diagnosis of enterolithiasis via digital radiography, overall sensitivity and specificity for the 3 reviewers were 84% and 96%, respectively. Sensitivity was lower for small colon enteroliths (61.5%) than for large colon enteroliths (88.9%) and was negatively affected by gas distention of the gastrointestinal tract. Sensitivity was not affected by the number of enteroliths. Sensitivity and specificity of digital radiography for the diagnosis of large colon enterolithiasis in equids was high. Sensitivity of digital radiography for detection of small colon enteroliths was lower than that for large colon enteroliths, but was higher than that typically associated with computed radiography. In geographic regions in which enterolithiasis in equids is endemic, digital abdominal radiography could be used as a diagnostic test for equids with colic.
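For readers less familiar with the diagnostic-accuracy figures quoted above, sensitivity and specificity follow directly from confusion-matrix counts. The short Python sketch below shows the standard calculation; the counts used are made up for illustration and are not the study's reader-level data.

def sensitivity_specificity(tp, fn, tn, fp):
    """Standard definitions behind figures like the 84% sensitivity and 96%
    specificity reported above. The counts passed in below are hypothetical,
    not the study's actual data."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical reading: 59 of 70 enterolith cases called positive, 161 of 168
# enterolith-free cases called negative.
print(sensitivity_specificity(tp=59, fn=11, tn=161, fp=7))
# -> approximately (0.843, 0.958)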
Hybrid techniques for the digital control of mechanical and optical systems
NASA Astrophysics Data System (ADS)
Acernese, Fausto; Barone, Fabrizio; De Rosa, Rosario; Eleuteri, Antonio; Milano, Leopoldo; Pardi, Silvio; Ricciardi, Iolanda; Russo, Guido
2004-07-01
One of the main requirements of a digital system for the control of interferometric detectors of gravitational waves is computing power, a direct consequence of the increasing complexity of the digital algorithms needed to generate the control signals. For this specific task many specialised, non-standard real-time architectures have been developed, often very expensive and difficult to upgrade. On the other hand, such computing power is generally fully available for off-line applications on standard PC-based systems. A possible and obvious solution is therefore to integrate the real-time and off-line architectures into a hybrid control system built from standard, available components, combining the perfect data synchronization provided by real-time systems with the large computing power available on PC-based systems. Such integration can be achieved by linking the two architectures through a standard Ethernet network, whose data transfer speed has been increasing rapidly in recent years, using the TCP/IP and UDP protocols. In this paper we describe the architecture of a hybrid Ethernet-based real-time control system prototype that we implemented in Napoli, discussing its characteristics and performance. Finally, we discuss a possible application to the real-time control of a suspended mass of the mode cleaner of the 3 m prototype optical interferometer for gravitational wave detection (IDGW-3P) operating in Napoli.
'I'm good, but not that good': digitally-skilled young people's identity in computing
NASA Astrophysics Data System (ADS)
Wong, Billy
2016-12-01
Computers and information technology are fast becoming a part of young people's everyday life. However, there remains a difference between the majority who can use computers and the minority who are computer scientists or professionals. Drawing on 32 semi-structured interviews with digitally skilled young people (aged 13-19), we explore their views and aspirations in computing, with a focus on the identities and discourses that these youngsters articulate in relation to this field. Our findings suggest that, even among digitally skilled young people, the traditional image of computing people as clever but antisocial still prevails, which can be unattractive for youths, especially girls. Digitally skilled youths identify with computing in different ways and for different reasons. Most enjoy doing computing, but few aspire to being a 'computer person'. Implications of our findings for computing education are discussed, especially the continued need to broaden identities in computing, even for the digitally skilled.
NASA Astrophysics Data System (ADS)
Gilbert, B. K.; Robb, R. A.; Chu, A.; Kenue, S. K.; Lent, A. H.; Swartzlander, E. E., Jr.
1981-02-01
Rapid advances during the past ten years of several forms of computer-assisted tomography (CT) have resulted in the development of numerous algorithms to convert raw projection data into cross-sectional images. These reconstruction algorithms are either 'iterative,' in which a large matrix algebraic equation is solved by successive approximation techniques; or 'closed form'. Continuing evolution of the closed form algorithms has allowed the newest versions to produce excellent reconstructed images in most applications. This paper will review several computer software and special-purpose digital hardware implementations of closed form algorithms, either proposed during the past several years by a number of workers or actually implemented in commercial or research CT scanners. The discussion will also cover a number of recently investigated algorithmic modifications which reduce the amount of computation required to execute the reconstruction process, as well as several new special-purpose digital hardware implementations under development in laboratories at the Mayo Clinic.
ERIC Educational Resources Information Center
Benedis-Grab, Gregory
2011-01-01
Computers have changed the landscape of scientific research in profound ways. Technology has always played an important role in scientific experimentation--through the development of increasingly sophisticated tools, the measurement of elusive quantities, and the processing of large amounts of data. However, the advent of social networking and the…
Large-Scale Document Automation: The Systems Integration Issue.
ERIC Educational Resources Information Center
Kalthoff, Robert J.
1985-01-01
Reviews current technologies for electronic imaging and its recording and transmission, including digital recording, optical data disks, automated image-delivery micrographics, high-density-magnetic recording, and new developments in telecommunications and computers. The role of the document automation systems integrator, who will bring these…
Design of a high-speed digital processing element for parallel simulation
NASA Technical Reports Server (NTRS)
Milner, E. J.; Cwynar, D. S.
1983-01-01
A prototype of a custom designed computer to be used as a processing element in a multiprocessor based jet engine simulator is described. The purpose of the custom design was to give the computer the speed and versatility required to simulate a jet engine in real time. Real time simulations are needed for closed loop testing of digital electronic engine controls. The prototype computer has a microcycle time of 133 nanoseconds. This speed was achieved by: prefetching the next instruction while the current one is executing, transporting data using high speed data busses, and using state of the art components such as a very large scale integration (VLSI) multiplier. Included are discussions of processing element requirements, design philosophy, the architecture of the custom designed processing element, the comprehensive instruction set, the diagnostic support software, and the development status of the custom design.
Exploration of operator method digital optical computers for application to NASA
NASA Technical Reports Server (NTRS)
1990-01-01
Digital optical computer design has focused primarily on parallel (single point-to-point interconnection) implementations. This architecture is compared to currently developing VHSIC systems. Using demonstrated multichannel acousto-optic devices, a figure of merit can be formulated. The focus is on a figure of merit termed Gate Interconnect Bandwidth Product (GIBP). Conventional parallel optical digital computer architecture demonstrates only marginal competitiveness at best when compared to projected semiconductor implementations. Global, analog global, quasi-digital, and full digital interconnects are briefly examined as alternatives to parallel digital computer architecture. Digital optical computing is becoming a very tough competitor to semiconductor technology since it can support a very high degree of three dimensional interconnect density and high degrees of Fan-In without capacitive loading effects at very low power consumption levels.
Large Scale Document Inversion using a Multi-threaded Computing System
Jung, Sungbo; Chang, Dar-Jen; Park, Juw Won
2018-01-01
Current microprocessor architecture is moving towards multi-core/multi-threaded systems. This trend has led to a surge of interest in using multi-threaded computing devices, such as the Graphics Processing Unit (GPU), for general-purpose computing. The GPU can be used as a massively parallel coprocessor because it consists of multiple cores, and it is an affordable, attractive, and user-programmable commodity. A vast amount of information is now flooding into the digital domain around the world: digital libraries, social networking services, e-commerce product data, reviews, and similar collections are produced or gathered every moment and grow dramatically in size. Although the inverted index is a useful data structure for full-text search and document retrieval, creating the index for a large number of documents requires a tremendous amount of time. The performance of document inversion can be improved with multi-threaded or multi-core GPUs. Our approach is to implement a linear-time, hash-based, single program multiple data (SPMD) document inversion algorithm on the NVIDIA GPU/CUDA programming platform, exploiting the huge computational power of the GPU to develop high-performance solutions for document indexing. Our proposed parallel document inversion system runs 2-3 times faster than a sequential system on two different test datasets drawn from PubMed abstracts and e-commerce product reviews. CCS Concepts: •Information systems➝Information retrieval •Computing methodologies➝Massively parallel and high-performance simulations. PMID:29861701
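The abstract above describes a hash-based, single program multiple data document inversion scheme on a GPU. Purely as a software illustration of the underlying idea (not the authors' CUDA implementation), the following Python sketch builds an inverted index by partitioning documents across worker processes and merging the per-partition hash maps; all names and parameters are illustrative.

from collections import defaultdict
from multiprocessing import Pool

def invert_partition(args):
    """Build a partial inverted index (term -> list of doc ids) for one partition."""
    doc_ids, texts = args
    partial = defaultdict(list)
    for doc_id, text in zip(doc_ids, texts):
        for term in set(text.lower().split()):
            partial[term].append(doc_id)
    return partial

def build_inverted_index(docs, workers=4):
    """docs: dict of doc_id -> text. Returns a merged term -> sorted doc id list."""
    items = list(docs.items())
    chunk = max(1, len(items) // workers)
    partitions = [([d for d, _ in items[i:i + chunk]],
                   [t for _, t in items[i:i + chunk]])
                  for i in range(0, len(items), chunk)]
    with Pool(workers) as pool:
        partials = pool.map(invert_partition, partitions)
    index = defaultdict(list)
    for partial in partials:             # merge the per-partition hash maps
        for term, ids in partial.items():
            index[term].extend(ids)
    return {term: sorted(ids) for term, ids in index.items()}

if __name__ == "__main__":
    docs = {1: "digital library search", 2: "parallel document inversion",
            3: "digital document indexing"}
    print(build_inverted_index(docs)["digital"])   # -> [1, 3]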
Hardware realization of an SVM algorithm implemented in FPGAs
NASA Astrophysics Data System (ADS)
Wiśniewski, Remigiusz; Bazydło, Grzegorz; Szcześniak, Paweł
2017-08-01
The paper proposes a technique for hardware realization of space vector modulation (SVM) of switching state functions in a matrix converter (MC), oriented towards implementation in a single field programmable gate array (FPGA). In the MC, the SVM method is based on the instantaneous space-vector representation of input currents and output voltages. Traditional computation algorithms usually involve digital signal processors (DSPs), a large number of power transistors (18 transistors with 18 independent PWM outputs), and "non-standard positions of control pulses" during the switching sequence. Recently, hardware implementations have become popular since the computed operations can be executed much faster and more efficiently owing to the nature of digital devices (especially their concurrency). In the paper, we propose a hardware algorithm for SVM computation. In contrast to existing techniques, the presented solution applies the COordinate Rotation DIgital Computer (CORDIC) method to perform the trigonometric operations. Furthermore, the arithmetic modules (that is, sub-devices) used for intermediate calculations, such as code converters and sector selectors (for output voltages and input currents), are presented in detail. The proposed technique has been implemented as a design described in the Verilog hardware description language. Preliminary results of the logic implementation targeting a Xilinx FPGA (in particular, a low-cost device from the Artix-7 family) are also presented.
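The key algorithmic choice in the abstract above is replacing DSP trigonometric routines with CORDIC, which evaluates sine and cosine using only shift-and-add style iterations. A minimal software sketch of rotation-mode CORDIC is shown below; it illustrates the method in floating point and is not the authors' fixed-point Verilog design.

import math

def cordic_sin_cos(angle, iterations=32):
    """Rotation-mode CORDIC: compute (sin, cos) of angle (radians, |angle| <= pi/2)
    by rotating through a fixed set of micro-angles. Floating point is used here
    for clarity; an FPGA implementation would use fixed-point shifts."""
    atans = [math.atan(2.0 ** -i) for i in range(iterations)]
    K = 1.0
    for i in range(iterations):          # CORDIC gain compensation factor
        K /= math.sqrt(1.0 + 2.0 ** (-2 * i))
    x, y, z = 1.0, 0.0, angle
    for i in range(iterations):
        d = 1.0 if z >= 0 else -1.0
        x, y = x - d * y * (2.0 ** -i), y + d * x * (2.0 ** -i)
        z -= d * atans[i]
    return y * K, x * K                  # (sin(angle), cos(angle))

s, c = cordic_sin_cos(math.pi / 6)
print(round(s, 6), round(c, 6))          # approximately 0.5 and 0.866025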
Event management for large scale event-driven digital hardware spiking neural networks.
Caron, Louis-Charles; D'Haene, Michiel; Mailhot, Frédéric; Schrauwen, Benjamin; Rouat, Jean
2013-09-01
The interest in brain-like computation has led to the design of a plethora of innovative neuromorphic systems. Individually, spiking neural networks (SNNs), event-driven simulation and digital hardware neuromorphic systems get a lot of attention. Despite the popularity of event-driven SNNs in software, very few digital hardware architectures are found. This is because existing hardware solutions for event management scale badly with the number of events. This paper introduces the structured heap queue, a pipelined digital hardware data structure, and demonstrates its suitability for event management. The structured heap queue scales gracefully with the number of events, allowing the efficient implementation of large scale digital hardware event-driven SNNs. The scaling is linear for memory, logarithmic for logic resources and constant for processing time. The use of the structured heap queue is demonstrated on a field-programmable gate array (FPGA) with an image segmentation experiment and a SNN of 65,536 neurons and 513,184 synapses. Events can be processed at the rate of 1 every 7 clock cycles and a 406×158 pixel image is segmented in 200 ms. Copyright © 2013 Elsevier Ltd. All rights reserved.
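The structured heap queue described above is a pipelined hardware data structure. Purely as a software analogy of the event-management idea, the sketch below keeps pending spike deliveries in a priority queue ordered by delivery time; the connectivity, weights, and threshold are made-up toy values, not parameters from the paper.

import heapq

# Software analogy of event-driven spike handling with a heap-ordered event queue.
# All network parameters below are illustrative; the fanout is acyclic so the toy
# run terminates on its own.
THRESHOLD, DELAY, WEIGHT = 1.0, 2.0, 0.6
fanout = {0: [1, 2], 1: [2], 2: []}            # toy synaptic connectivity
potential = {n: 0.0 for n in fanout}

events = [(0.0, 0), (0.5, 0)]                  # (delivery_time_ms, target_neuron)
heapq.heapify(events)

while events:
    t, neuron = heapq.heappop(events)          # always the earliest pending event
    potential[neuron] += WEIGHT
    if potential[neuron] >= THRESHOLD:         # neuron fires and resets
        potential[neuron] = 0.0
        for target in fanout[neuron]:          # schedule delayed spike deliveries
            heapq.heappush(events, (t + DELAY, target))

print("final potentials:", potential)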
Application of computerized exercise ECG digitization. Interpretation in large clinical trials.
Caralis, D G; Shaw, L; Bilgere, B; Younis, L; Stocke, K; Wiens, R D; Chaitman, B R
1992-04-01
The authors report on a semiautomated program that incorporates both visual identification of fiducial points and digital determination of the ST-segment at 60 ms and 80 ms from the J point, ST slope, changes in R wave, and baseline drift. The off-line program can enhance the accuracy of detecting electrocardiographic (ECG) changes, as well as reproducibility of the exercise and postexercise ECG, as a marker of myocardial ischemia. The analysis program is written in Microsoft QuickBASIC 2.0 for an IBM personal computer interfaced to a Summagraphics mm1201 microgrid II digitizer. The program consists of the following components: (1) alphanumeric data entry, (2) ECG waveform digitization, (3) calculation of test results, (4) physician overread, and (5) editor function for remeasurements. This computerized exercise ECG digitization-interpretation program is accurate and reproducible for the quantitative assessment of ST changes and requires minimal time allotment for physician overread. The program is suitable for analysis and interpretation of large volumes of exercise tests in multicenter clinical trials and is currently utilized in the TIMI II, TIMI III, and BARI studies sponsored by the National Institutes of Health.
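The core measurement in the program above is the ST-segment level at 60 and 80 ms after the J point, referenced to an isoelectric baseline. A simplified Python sketch of that step on a digitized trace is given below; the sampling rate, fiducial indices, and waveform are assumptions for illustration, since in the described system the fiducial points are identified visually on the digitizing tablet.

import numpy as np

def st_measurements(ecg_mV, fs_hz, j_index, baseline_index):
    """Return the ST-segment level (mV) at 60 ms and 80 ms after the J point,
    relative to an isoelectric baseline sample, plus a crude ST slope.
    ecg_mV: digitized ECG samples in millivolts; fs_hz: sampling rate;
    j_index, baseline_index: sample indices of the fiducial points."""
    baseline = ecg_mV[baseline_index]
    st60 = ecg_mV[j_index + int(round(0.060 * fs_hz))] - baseline
    st80 = ecg_mV[j_index + int(round(0.080 * fs_hz))] - baseline
    slope = (st80 - st60) / 0.020              # mV per second between 60 and 80 ms
    return float(st60), float(st80), float(slope)

# Toy example with a synthetic 500 Hz trace (all values are illustrative only).
fs = 500
ecg = np.zeros(600)
ecg[300:] = -0.12                              # flat, depressed ST segment
print(st_measurements(ecg, fs, j_index=300, baseline_index=250))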
NASA Technical Reports Server (NTRS)
Habiby, Sarry F.
1987-01-01
The design and implementation of a digital (numerical) optical matrix-vector multiplier are presented. The objective is to demonstrate the operation of an optical processor designed to minimize computation time in performing a practical computing application. This is done by using the large array of processing elements in a Hughes liquid crystal light valve, and relying on the residue arithmetic representation, a holographic optical memory, and position coded optical look-up tables. In the design, all operations are performed in effectively one light valve response time regardless of matrix size. The features of the design allowing fast computation include the residue arithmetic representation, the mapping approach to computation, and the holographic memory. In addition, other features of the work include a practical light valve configuration for efficient polarization control, a model for recording multiple exposures in silver halides with equal reconstruction efficiency, and using light from an optical fiber for a reference beam source in constructing the hologram. The design can be extended to implement larger matrix arrays without increasing computation time.
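The processor above relies on a residue arithmetic representation so that each modulus channel reduces to small, position-coded look-up tables. The following purely software sketch shows residue encoding, table-driven modular multiply-accumulate, and Chinese-remainder decoding; the moduli and vectors are illustrative and unrelated to the optical hardware.

from math import prod

MODULI = (5, 7, 8, 9, 11)                    # pairwise coprime, illustrative

def to_residues(x):
    return tuple(x % m for m in MODULI)

def crt_decode(residues):
    """Chinese-remainder reconstruction of the integer from its residues."""
    M = prod(MODULI)
    total = 0
    for r, m in zip(residues, MODULI):
        Mi = M // m
        total += r * Mi * pow(Mi, -1, m)     # modular inverse of Mi mod m
    return total % M

# Each modulus channel uses a precomputed multiplication table, loosely mirroring
# the position-coded optical look-up tables described in the abstract.
mul_tables = {m: [[(a * b) % m for b in range(m)] for a in range(m)] for m in MODULI}

def residue_dot(vec_a, vec_b):
    """Dot product computed independently in every residue channel."""
    acc = [0] * len(MODULI)
    for a, b in zip(vec_a, vec_b):
        ra, rb = to_residues(a), to_residues(b)
        for i, m in enumerate(MODULI):
            acc[i] = (acc[i] + mul_tables[m][ra[i]][rb[i]]) % m
    return crt_decode(tuple(acc))

print(residue_dot([3, 1, 4], [2, 7, 1]))     # 3*2 + 1*7 + 4*1 = 17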
Khushi, Matloob; Edwards, Georgina; de Marcos, Diego Alonso; Carpenter, Jane E; Graham, J Dinny; Clarke, Christine L
2013-02-12
Virtual microscopy includes digitisation of histology slides and the use of computer technologies for complex investigation of diseases such as cancer. However, automated image analysis, or website publishing of such digital images, is hampered by their large file sizes. We have developed two Java based open source tools: Snapshot Creator and NDPI-Splitter. Snapshot Creator converts a portion of a large digital slide into a desired quality JPEG image. The image is linked to the patient's clinical and treatment information in a customised open source cancer data management software (Caisis) in use at the Australian Breast Cancer Tissue Bank (ABCTB) and then published on the ABCTB website (http://www.abctb.org.au) using Deep Zoom open source technology. Using the ABCTB online search engine, digital images can be searched by defining various criteria such as cancer type, or biomarkers expressed. NDPI-Splitter splits a large image file into smaller sections of TIFF images so that they can be easily analysed by image analysis software such as Metamorph or Matlab. NDPI-Splitter also has the capacity to filter out empty images. Snapshot Creator and NDPI-Splitter are novel open source Java tools. They convert digital slides into files of smaller size for further processing. In conjunction with other open source tools such as Deep Zoom and Caisis, this suite of tools is used for the management and archiving of digital microscopy images, enabling digitised images to be explored and zoomed online. Our online image repository also has the capacity to be used as a teaching resource. These tools also enable large files to be sectioned for image analysis. The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5330903258483934.
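NDPI-Splitter, as described above, cuts a very large slide image into smaller TIFF sections and filters out empty tiles. The authors' Java tools are not reproduced here; the Python/Pillow sketch below illustrates the same tiling-and-filtering idea with an assumed tile size, emptiness threshold, and file name.

from PIL import Image
import numpy as np

def split_slide(path, tile_px=2048, empty_std=2.0):
    """Split a large image into tile_px x tile_px TIFF tiles, skipping tiles whose
    pixel standard deviation is below empty_std (nearly blank background). The
    tile size and threshold are illustrative, not values from the paper."""
    Image.MAX_IMAGE_PIXELS = None            # allow very large images
    img = Image.open(path)
    width, height = img.size
    kept = 0
    for top in range(0, height, tile_px):
        for left in range(0, width, tile_px):
            tile = img.crop((left, top, min(left + tile_px, width),
                             min(top + tile_px, height)))
            if np.asarray(tile.convert("L")).std() < empty_std:
                continue                     # filter out empty tiles
            tile.save(f"tile_{top}_{left}.tiff")
            kept += 1
    return kept

# Example call (hypothetical file name):
# print(split_slide("slide_snapshot.tiff"))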
Optical transmission modules for multi-channel superconducting quantum interference device readouts.
Kim, Jin-Mok; Kwon, Hyukchan; Yu, Kwon-kyu; Lee, Yong-Ho; Kim, Kiwoong
2013-12-01
We developed an optical transmission module consisting of a 16-channel analog-to-digital converter (ADC), a digital-noise filter, and a one-line serial transmitter, which transferred Superconducting Quantum Interference Device (SQUID) readout data to a computer by a single optical cable. The 16-channel ADC sent out SQUID readout data as 32-bit serial words, each consisting of 8-bit channel and 24-bit voltage data, at a sample rate of 1.5 kSample/s. The digital-noise filter suppressed digital noises generated by digital clocks to obtain as large a SQUID modulation as possible. The one-line serial transmitter reformatted the 32-bit serial data into a modulated stream that contained both data and clock, and sent it through a single optical cable. When the optical transmission modules were applied to a 152-channel SQUID magnetoencephalography system, this system maintained a field noise level of 3 fT/√Hz @ 100 Hz.
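Each sample in the serial format described above occupies a 32-bit word holding an 8-bit channel number and 24-bit voltage data. The framing itself is done in hardware; the packing arithmetic can be sketched as follows, where the signed 24-bit scaling to a +/-10 V full scale is an assumption for illustration.

def pack_sample(channel, volts, full_scale=10.0):
    """Pack an 8-bit channel number and a 24-bit voltage code into one 32-bit
    word, mirroring the serial format described above (the signed 24-bit scaling
    to +/- full_scale volts is an assumption, not a documented spec)."""
    code = int(round(volts / full_scale * (2**23 - 1))) & 0xFFFFFF
    return ((channel & 0xFF) << 24) | code

def unpack_sample(word, full_scale=10.0):
    channel = (word >> 24) & 0xFF
    code = word & 0xFFFFFF
    if code & 0x800000:                      # sign-extend the 24-bit code
        code -= 1 << 24
    return channel, code / (2**23 - 1) * full_scale

word = pack_sample(channel=37, volts=-1.25)
print(hex(word), unpack_sample(word))        # channel 37, approximately -1.25 V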
Economizing Education: Assessment Algorithms and Calculative Agencies
ERIC Educational Resources Information Center
O'Keeffe, Cormac
2017-01-01
International Large Scale Assessments have been producing data about educational attainment for over 60 years. More recently however, these assessments as tests have become digitally and computationally complex and increasingly rely on the calculative work performed by algorithms. In this article I first consider the coordination of relations…
Digital information management: a progress report on the National Digital Mammography Archive
NASA Astrophysics Data System (ADS)
Beckerman, Barbara G.; Schnall, Mitchell D.
2002-05-01
Digital mammography creates very large images, which require new approaches to storage, retrieval, management, and security. The National Digital Mammography Archive (NDMA) project, funded by the National Library of Medicine (NLM), is developing a limited testbed that demonstrates the feasibility of a national breast imaging archive, with access to prior exams; patient information; computer aids for image processing, teaching, and testing tools; and security components to ensure confidentiality of patient information. There will be significant benefits to patients and clinicians in terms of accessible data with which to make a diagnosis and to researchers performing studies on breast cancer. Mammography was chosen for the project, because standards were already available for digital images, report formats, and structures. New standards have been created for communications protocols between devices, front-end portal and archive. NDMA is a distributed computing concept that provides for sharing and access across corporate entities. Privacy, auditing, and patient consent are all integrated into the system. Five sites, Universities of Pennsylvania, Chicago, North Carolina and Toronto, and BWXT Y12, are connected through high-speed networks to demonstrate functionality. We will review progress, including technical challenges, innovative research and development activities, standards and protocols being implemented, and potential benefits to healthcare systems.
Three-dimensional image signals: processing methods
NASA Astrophysics Data System (ADS)
Schiopu, Paul; Manea, Adrian; Craciun, Anca-Ileana; Craciun, Alexandru
2010-11-01
Over the years, extensive studies have been carried out to apply coherent optics methods to real-time processing, communications, and image transmission. This is especially true when a large amount of information needs to be processed, e.g., in high-resolution imaging. Recent progress in data-processing networks and communication systems has considerably increased the capacity for information exchange. We describe the results of a literature investigation into processing methods for three-dimensional image signals. All commercially available 3D technologies today are based on stereoscopic viewing. 3D technology was once the exclusive domain of skilled computer-graphics developers with high-end machines and software. Images captured with an advanced 3D digital camera can be displayed on the screen of a 3D digital viewer with or without special glasses. This requires considerable processing power and memory to create and render the complex mix of colors, textures, virtual lighting, and perspective necessary to make figures appear three-dimensional. Also, using a standard digital camera and a technique called phase-shift interferometry, we can capture "digital holograms": holograms that can be stored on a computer and transmitted over conventional networks. We present research methods for processing "digital holograms" for Internet transmission, together with results.
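The abstract above mentions capturing "digital holograms" with a standard camera using phase-shift interferometry. As a compact illustration of the standard four-step phase-shifting relation (a textbook formula, not necessarily the exact procedure of the cited work), the wrapped phase can be recovered from four intensity frames shifted by a quarter period:

import numpy as np

def four_step_phase(i1, i2, i3, i4):
    """Recover the wrapped optical phase from four interferograms taken with
    reference-beam phase shifts of 0, pi/2, pi, and 3*pi/2 (standard four-step
    phase-shifting interferometry; arguments are intensity images)."""
    return np.arctan2(i4 - i2, i1 - i3)      # wrapped to (-pi, pi]

# Synthetic check: build four frames from a known phase map and recover it.
phase_true = np.linspace(-1.0, 1.0, 5)[None, :] * np.ones((5, 1))
frames = [1.0 + 0.5 * np.cos(phase_true + k * np.pi / 2) for k in range(4)]
print(np.allclose(four_step_phase(*frames), phase_true, atol=1e-12))   # True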
Umeda, Akira; Iwata, Yasushi; Okada, Yasumasa; Shimada, Megumi; Baba, Akiyasu; Minatogawa, Yasuyuki; Yamada, Takayasu; Chino, Masao; Watanabe, Takafumi; Akaishi, Makoto
2004-12-01
The high cost of digital echocardiographs and the large size of data files hinder the adoption of remote diagnosis of digitized echocardiography data. We have developed a low-cost digital filing system for echocardiography data. In this system, data from a conventional analog echocardiograph are captured using a personal computer (PC) equipped with an analog-to-digital converter board. Motion picture data are promptly compressed using a Moving Picture Experts Group (MPEG) 4 codec. The digitized data with preliminary reports obtained in a rural hospital are then sent to cardiologists at distant urban general hospitals via the internet. The cardiologists can evaluate the data using widely available movie-viewing software (Windows Media Player). The diagnostic accuracy of this double-check system was confirmed by comparison with ordinary super-VHS videotapes. We have demonstrated that digitization of echocardiography data from a conventional analog echocardiograph and MPEG 4 compression can be performed using an ordinary PC-based system, and that this system enables highly efficient digital storage and remote diagnosis at low cost.
Large-Scale Simulations of Plastic Neural Networks on Neuromorphic Hardware
Knight, James C.; Tully, Philip J.; Kaplan, Bernhard A.; Lansner, Anders; Furber, Steve B.
2016-01-01
SpiNNaker is a digital, neuromorphic architecture designed for simulating large-scale spiking neural networks at speeds close to biological real-time. Rather than using bespoke analog or digital hardware, the basic computational unit of a SpiNNaker system is a general-purpose ARM processor, allowing it to be programmed to simulate a wide variety of neuron and synapse models. This flexibility is particularly valuable in the study of biological plasticity phenomena. A recently proposed learning rule based on the Bayesian Confidence Propagation Neural Network (BCPNN) paradigm offers a generic framework for modeling the interaction of different plasticity mechanisms using spiking neurons. However, it can be computationally expensive to simulate large networks with BCPNN learning since it requires multiple state variables for each synapse, each of which needs to be updated every simulation time-step. We discuss the trade-offs in efficiency and accuracy involved in developing an event-based BCPNN implementation for SpiNNaker based on an analytical solution to the BCPNN equations, and detail the steps taken to fit this within the limited computational and memory resources of the SpiNNaker architecture. We demonstrate this learning rule by learning temporal sequences of neural activity within a recurrent attractor network which we simulate at scales of up to 2.0 × 10^4 neurons and 5.1 × 10^7 plastic synapses: the largest plastic neural network ever to be simulated on neuromorphic hardware. We also run a comparable simulation on a Cray XC-30 supercomputer system and find that, if it is to match the run-time of our SpiNNaker simulation, the supercomputer system uses approximately 45× more power. This suggests that cheaper, more power efficient neuromorphic systems are becoming useful discovery tools in the study of plasticity in large-scale brain models. PMID:27092061
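A central point of the SpiNNaker work above is that an analytical solution lets synaptic state be updated only when spike events occur, rather than on every time-step. The sketch below shows a generic event-driven exponential trace updated in closed form at spike times; it only illustrates that idea and is neither the BCPNN equations nor the SpiNNaker code.

import math

class EventDrivenTrace:
    """Generic event-driven synaptic trace: instead of decaying the state every
    simulation time-step, the exponential decay is applied in closed form only
    when a spike event arrives. Illustrative only; not the BCPNN rule itself."""
    def __init__(self, tau_ms=20.0):
        self.tau = tau_ms
        self.value = 0.0
        self.last_t = 0.0

    def on_spike(self, t_ms, increment=1.0):
        # Decay over the elapsed interval analytically, then add the new spike.
        self.value *= math.exp(-(t_ms - self.last_t) / self.tau)
        self.value += increment
        self.last_t = t_ms
        return self.value

trace = EventDrivenTrace()
for t in (5.0, 12.0, 40.0):                  # spike times in milliseconds
    print(round(trace.on_spike(t), 4))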
Digital robust active control law synthesis for large order systems using constrained optimization
NASA Technical Reports Server (NTRS)
Mukhopadhyay, Vivek
1987-01-01
This paper presents a direct digital control law synthesis procedure for a large order, sampled data, linear feedback system, using constrained optimization techniques to meet multiple design requirements. A linear quadratic Gaussian type cost function is minimized while satisfying a set of constraints on the design loads and responses. General expressions for the gradients of the cost function and constraints with respect to the digital control law design variables are derived analytically and computed by solving a set of discrete Liapunov equations. The designer can choose the structure of the control law and the design variables; hence a stable classical control law as well as an estimator-based full or reduced order control law can be used as an initial starting point. Selected design responses can be treated as constraints instead of lumping them into the cost function. This feature can be used to modify a control law to meet individual root mean square response limitations as well as minimum singular value restrictions. Low order, robust digital control laws were synthesized for gust load alleviation of a flexible remotely piloted drone aircraft.
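The gradients in the synthesis procedure above are obtained by solving sets of discrete Lyapunov equations. As a minimal numerical illustration of the kind of quantity involved, the sketch below evaluates an LQG-type cost for a stable discrete-time closed loop from the steady-state covariance returned by SciPy's discrete Lyapunov solver; the matrices are arbitrary examples, not the drone model of the paper.

import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Illustrative closed-loop discrete-time system x[k+1] = A x[k] + G w[k], with
# unit-covariance white noise w. The steady-state state covariance X satisfies
# the discrete Lyapunov equation X = A X A' + G G', and an LQG-type cost
# J = E[x' Q x] equals trace(Q X).
A = np.array([[0.9, 0.10],
              [0.0, 0.85]])
G = np.array([[0.1],
              [0.2]])
Q = np.eye(2)

X = solve_discrete_lyapunov(A, G @ G.T)      # steady-state covariance
J = float(np.trace(Q @ X))
print(round(J, 5))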
Educational Video Recording and Editing for The Hand Surgeon
Rehim, Shady A.; Chung, Kevin C.
2016-01-01
Digital video recordings are increasingly used across various medical and surgical disciplines including hand surgery for documentation of patient care, resident education, scientific presentations and publications. In recent years, the introduction of sophisticated computer hardware and software technology has simplified the process of digital video production and improved means of disseminating large digital data files. However, the creation of high quality surgical video footage requires basic understanding of key technical considerations, together with creativity and sound aesthetic judgment of the videographer. In this article we outline the practical steps involved with equipment preparation, video recording, editing and archiving as well as guidance for the choice of suitable hardware and software equipment. PMID:25911212
A digitally implemented preambleless demodulator for maritime and mobile data communications
NASA Astrophysics Data System (ADS)
Chalmers, Harvey; Shenoy, Ajit; Verahrami, Farhad B.
The hardware design and software algorithms for a low-bit-rate, low-cost, all-digital preambleless demodulator are described. The demodulator operates under severe high-noise conditions, fast Doppler frequency shifts, large frequency offsets, and multipath fading. Sophisticated algorithms, including a fast Fourier transform (FFT)-based burst acquisition algorithm, a cycle-slip resistant carrier phase tracker, an innovative Doppler tracker, and a fast acquisition symbol synchronizer, were developed and extensively simulated for reliable burst reception. The compact digital signal processor (DSP)-based demodulator hardware uses a unique personal computer test interface for downloading test data files. The demodulator test results demonstrate a near-ideal performance within 0.2 dB of theory.
Flood damage assessment using computer-assisted analysis of color infrared photography
Anderson, William H.
1978-01-01
Use of digitized aerial photographs for flood damage assessment in agriculture is new and largely untested. However, under flooding circumstances similar to the 1975 Red River Valley flood, computer-assisted techniques can be extremely useful, especially if detailed crop damage estimates are needed within a relatively short period of time.Airphoto interpretation techniques, manual or computer-assisted, are not intended to replace conventional ground survey and sampling procedures. But their use should be considered a valuable addition to the tools currently available for assessing agricultural flood damage.
The factorization of large composite numbers on the MPP
NASA Technical Reports Server (NTRS)
Mckurdy, Kathy J.; Wunderlich, Marvin C.
1987-01-01
The continued fraction method for factoring large integers (CFRAC) was an ideal algorithm to be implemented on a massively parallel computer such as the Massively Parallel Processor (MPP). After much effort, the first 60-digit number was factored on the MPP using about 6 1/2 hours of array time. Although this result added about 10 digits to the size of numbers that could be factored using CFRAC on a serial machine, it was already badly beaten by the implementation of Davis and Holdridge on the CRAY-1 using the quadratic sieve, an algorithm which is clearly superior to CFRAC for large numbers. An algorithm is illustrated which is ideally suited to the single instruction multiple data (SIMD) massively parallel architecture, and some of the modifications which were needed to make the parallel implementation effective and efficient are described.
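CFRAC, referred to above, draws its candidate smooth residues from the continued-fraction expansion of sqrt(N), a simple recurrence that maps naturally onto SIMD hardware. The sketch below runs a few steps of that recurrence on a small composite and checks the congruence A^2 ≡ ±Q (mod N); it is a toy illustration, not a full factoring run and not the MPP implementation.

from math import isqrt

def cfrac_residues(N, steps=10):
    """Continued-fraction expansion of sqrt(N). Returns triples (A, Q, sign) with
    A^2 ≡ sign*Q (mod N); the small Q values are the candidate smooth residues
    used by CFRAC. N must not be a perfect square."""
    a0 = isqrt(N)
    P, Q, a = 0, 1, a0
    A_prev, A = 1, a0 % N                    # convergent numerators, reduced mod N
    out = []
    for i in range(1, steps + 1):
        P = a * Q - P
        Q = (N - P * P) // Q
        a = (a0 + P) // Q
        out.append((A, Q, (-1) ** i))        # A^2 ≡ (-1)^i * Q (mod N)
        A_prev, A = A, (a * A + A_prev) % N
    return out

N = 13290059                                 # small toy composite
for A, Q, sign in cfrac_residues(N, 5):
    assert (A * A - sign * Q) % N == 0       # check the congruence
    print(A, sign * Q)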
Organization and Management of Project Athena.
ERIC Educational Resources Information Center
Champine, George A.
1991-01-01
Project Athena is a $100 million, eight-year project to install a large network of high performance computer work stations for education and research at the Massachusetts Institute of Technology (MIT). Organizational, legal, and administrative aspects of the project allow two competitors (Digital Equipment Corporation and IBM) to work together…
Digital circuits for computer applications: A compilation
NASA Technical Reports Server (NTRS)
1972-01-01
The innovations in this updated series of compilations dealing with electronic technology represent a carefully selected collection of digital circuits which have direct application in computer oriented systems. In general, the circuits have been selected as representative items of each section and have been included on their merits of having universal applications in digital computers and digital data processing systems. As such, they should have wide appeal to the professional engineer and scientist who encounter the fundamentals of digital techniques in their daily activities. The circuits are grouped as digital logic circuits, analog to digital converters, and counters and shift registers.
ERIC Educational Resources Information Center
Onaral, Banu; And Others
This report describes the development of a Drexel University electrical and computer engineering course on digital filter design that used interactive computing and graphics, and was one of three courses in a senior-level sequence on digital signal processing (DSP). Interactive and digital analysis/design routines and the interconnection of these…
NASA Technical Reports Server (NTRS)
Cheng, Thomas D.; Angelici, Gary L.; Slye, Robert E.; Ma, Matt
1991-01-01
The USDA presently uses labor-intensive photographic interpretation procedures to delineate large geographical areas into manageable size sampling units for the estimation of domestic crop and livestock production. Computer software to automate the boundary delineation procedure, called the computer-assisted stratification and sampling (CASS) system, was developed using a Hewlett Packard color-graphics workstation. The CASS procedures display Thematic Mapper (TM) satellite digital imagery on a graphics display workstation as the backdrop for the onscreen delineation of sampling units. USGS Digital Line Graph (DLG) data for roads and waterways are displayed over the TM imagery to aid in identifying potential sample unit boundaries. Initial analysis conducted with three Missouri counties indicated that CASS was six times faster than the manual techniques in delineating sampling units.
Concepts and Plans towards fast large scale Monte Carlo production for the ATLAS Experiment
NASA Astrophysics Data System (ADS)
Ritsch, E.; Atlas Collaboration
2014-06-01
The huge success of the physics program of the ATLAS experiment at the Large Hadron Collider (LHC) during Run 1 relies upon a great number of simulated Monte Carlo events. This Monte Carlo production currently accounts for the largest share of the computing resources in use by ATLAS. In this document we describe the plans to overcome the computing resource limitations for large scale Monte Carlo production in the ATLAS Experiment for Run 2 and beyond. A number of fast detector simulation, digitization and reconstruction techniques are discussed, based upon a new flexible detector simulation framework. To optimally benefit from these developments, a redesigned ATLAS MC production chain is presented at the end of this document.
The Domain Shared by Computational and Digital Ontology: A Phenomenological Exploration and Analysis
ERIC Educational Resources Information Center
Compton, Bradley Wendell
2009-01-01
The purpose of this dissertation is to explore and analyze a domain of research thought to be shared by two areas of philosophy: computational and digital ontology. Computational ontology is philosophy used to develop information systems also called computational ontologies. Digital ontology is philosophy dealing with our understanding of Being…
Digital and biological computing in organizations.
Kampfner, Roberto R
2002-01-01
Michael Conrad unveiled many of the fundamental characteristics of biological computing. Underlying the behavioral variability and the adaptability of biological systems are these characteristics, including the ability of biological information processing to exploit quantum features at the atomic level, the powerful 3-D pattern recognition capabilities of macromolecules, the computational efficiency, and the ability to support biological function. Among many other things, Conrad formalized and explicated the underlying principles of biological adaptability, characterized the differences between biological and digital computing in terms of a fundamental tradeoff between adaptability and programmability of information processing, and discussed the challenges of interfacing digital computers and human society. This paper is about the encounter of biological and digital computing. The focus is on the nature of the biological information processing infrastructure of organizations and how it can be extended effectively with digital computing. In order to achieve this goal effectively, however, we need to embed properly digital computing into the information processing aspects of human and social behavior and intelligence, which are fundamentally biological. Conrad's legacy provides a firm, strong, and inspiring foundation for this endeavor.
A method for analyzing dynamic stall of helicopter rotor blades
NASA Technical Reports Server (NTRS)
Crimi, P.; Reeves, B. L.
1972-01-01
A model for each of the basic flow elements involved in the unsteady stall of a two-dimensional airfoil in incompressible flow is presented. The interaction of these elements is analyzed using a digital computer. Computations of the loading during transient and sinusoidal pitching motions are in good qualitative agreement with measured loads. The method was used to confirm that large torsional response of helicopter blades detected in flight tests can be attributed to dynamic stall.
Multi-discipline resource inventory of soils, vegetation and geology
NASA Technical Reports Server (NTRS)
Simonson, G. H. (Principal Investigator); Paine, D. P.; Lawrence, R. D.; Norgren, J. A.; Pyott, W. Y.; Herzog, J. H.; Murray, R. J.; Rogers, R.
1973-01-01
The author has identified the following significant results. Computer classification of natural vegetation, in the vicinity of Big Summit Prairie, Crook County, Oregon was carried out using MSS digital data. Impure training sets, representing eleven vegetation types plus water, were selected from within the area to be classified. Close correlations were visually observed between vegetation types mapped from the large scale photographs and the computer classification of the ERTS data (Frame 1021-18151, 13 August 1972).
Baranowski, Tom; Baranowski, Janice C; Watson, Kathleen B; Martin, Shelby; Beltran, Alicia; Islam, Noemi; Dadabhoy, Hafza; Adame, Su-heyla; Cullen, Karen; Thompson, Debbe; Buday, Richard; Subar, Amy
2011-03-01
To test the effect of image size and presence of size cues on the accuracy of portion size estimation by children. Children were randomly assigned to seeing images with or without food size cues (utensils and checked tablecloth) and were presented with sixteen food models (foods commonly eaten by children) in varying portion sizes, one at a time. They estimated each food model's portion size by selecting a digital food image. The same food images were presented in two ways: (i) as small, graduated portion size images all on one screen or (ii) by scrolling across large, graduated portion size images, one per sequential screen. Laboratory-based with computer and food models. Volunteer multi-ethnic sample of 120 children, equally distributed by gender and ages (8 to 13 years) in 2008-2009. Average percentage of correctly classified foods was 60·3 %. There were no differences in accuracy by any design factor or demographic characteristic. Multiple small pictures on the screen at once took half the time to estimate portion size compared with scrolling through large pictures. Larger pictures had more overestimation of size. Multiple images of successively larger portion sizes of a food on one computer screen facilitated quicker portion size responses with no decrease in accuracy. This is the method of choice for portion size estimation on a computer.
Carnegie Mellon University bioimaging day 2014: Challenges and opportunities in digital pathology
Rohde, Gustavo K.; Ozolek, John A.; Parwani, Anil V.; Pantanowitz, Liron
2014-01-01
Recent advances in digital imaging are impacting the practice of pathology. One of the key enabling technologies leading the way towards this transformation is whole slide imaging (WSI), which allows glass slides to be converted into large image files that can be shared, stored, and analyzed rapidly. Many applications of this novel technology have evolved in the last decade, including education, research and clinical applications. This publication highlights a collection of abstracts, each corresponding to a talk given at Carnegie Mellon University's (CMU) Bioimaging Day 2014, co-sponsored by the Biomedical Engineering and Lane Center for Computational Biology Departments at CMU. Topics related specifically to digital pathology are presented in this collection of abstracts. These include topics related to digital workflow implementation, imaging and artifacts, storage demands, and automated image analysis algorithms. PMID:25250190
NASA Astrophysics Data System (ADS)
Budiardja, R. D.; Lingerfelt, E. J.; Guidry, M. W.
2003-05-01
Wireless technology implemented with handheld devices has attractive features because of the potential to access large amounts of data and the prospect of on-the-fly computational analysis from a device that can be carried in a shirt pocket. We shall describe applications of such technology to the general paradigm of making digital wireless connections from the field to upload information and queries to network servers, executing (potentially complex) programs and controlling data analysis and/or database operations on fast network computers, and returning real-time information from this analysis to the handheld device in the field. As illustration, we shall describe several client/server programs that we have written for applications in teaching introductory astronomy. For example, one program allows static and dynamic properties of astronomical objects to be accessed in a remote observation laboratory setting using a digital cell phone or PDA. Another implements interactive quizzing over a cell phone or PDA using a 700-question introductory astronomy quiz database, thus permitting students to study for astronomy quizzes in any environment in which they have a few free minutes and a digital cell phone or wireless PDA. Another allows one to control and monitor a computation done on a Beowulf cluster by changing the parameters of the computation remotely and retrieving the result when the computation is done. The presentation will include hands-on demonstrations with real devices. *Managed by UT-Battelle, LLC, for the U.S. Department of Energy under contract DE-AC05-00OR22725.
Qualitative and quantitative interpretation of SEM image using digital image processing.
Saladra, Dawid; Kopernik, Magdalena
2016-10-01
The aim of this study is to improve qualitative and quantitative analysis of scanning electron microscope micrographs by developing a computer program that enables automatic crack analysis of scanning electron microscopy (SEM) micrographs. Micromechanical tests of pneumatic ventricular assist devices result in a large number of micrographs; therefore, the analysis must be automatic. Tests of athrombogenic titanium nitride/gold coatings deposited on polymeric substrates (Bionate II) are performed. These tests include microshear, microtension, and fatigue analysis. Anisotropic surface defects observed in the SEM micrographs require support for qualitative and quantitative interpretation. Improvement of qualitative analysis of scanning electron microscope images was achieved with a set of computational tools that includes binarization, simplified expanding, expanding, simple image statistic thresholding, Laplacian 1 and Laplacian 2 filters, Otsu thresholding, and reverse binarization. Several modifications of known image processing techniques and combinations of selected techniques were applied. The introduced quantitative analysis of digital scanning electron microscope images enables computation of stereological parameters such as area, crack angle, crack length, and total crack length per unit area. This study also compares the functionality of the developed digital image processing program with existing applications. The described pre- and postprocessing may be helpful in scanning electron microscopy and transmission electron microscopy surface investigations. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
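Among the tools listed above are Otsu thresholding and the computation of stereological parameters such as total crack length per unit area. The scikit-image sketch below approximates one such pipeline step (Otsu binarization followed by skeleton-based crack-length estimation); it illustrates the general approach rather than reproducing the authors' program, and the test image and parameters are synthetic.

import numpy as np
from skimage.filters import threshold_otsu
from skimage.morphology import skeletonize

def crack_length_per_area(gray, pixel_size_um=1.0):
    """Binarize a grey-level SEM micrograph with Otsu's threshold (dark cracks on
    a brighter background assumed), skeletonize the crack mask, and estimate the
    total crack length per unit area. Parameter choices are illustrative."""
    cracks = gray < threshold_otsu(gray)           # dark pixels = crack candidates
    skeleton = skeletonize(cracks)                 # one-pixel-wide crack centrelines
    length_um = skeleton.sum() * pixel_size_um     # crude length estimate in microns
    area_um2 = gray.size * pixel_size_um ** 2
    return length_um / area_um2                    # length per unit area (1/um)

# Synthetic micrograph: bright background with one dark, one-pixel-wide crack.
img = np.full((100, 100), 200, dtype=np.uint8)
img[50, 10:90] = 20
print(round(crack_length_per_area(img), 5))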
The Correlates of the Digital Divide and Their Impact on College Student Learning
ERIC Educational Resources Information Center
Tien, Flora F.; Fu, Tsu-Tan
2008-01-01
By focusing on two dimensions of the digital divide--computer use and computer knowledge, this study explores four research questions: (1) What are the undergraduates doing with the computers they use at colleges? (2) How do undergraduates perform in regard to computer knowledge and skills? (3) With what is the digital divide among college…
Two-Way Communication Using RFID Equipment and Techniques
NASA Technical Reports Server (NTRS)
Jedry, Thomas; Archer, Eric
2007-01-01
Equipment and techniques used in radio-frequency identification (RFID) would be extended, according to a proposal, to enable short-range, two-way communication between electronic products and host computers. In one example of a typical contemplated application, the purpose of the short-range radio communication would be to transfer image data from a user's digital still or video camera to the user's computer for recording and/or processing. The concept is also applicable to consumer electronic products other than digital cameras (for example, cellular telephones, portable computers, or motion sensors in alarm systems), and to a variety of industrial and scientific sensors and other devices that generate data. Until now, RFID has been used to exchange small amounts of mostly static information for identifying and tracking assets. Information pertaining to an asset (typically, an object in inventory to be tracked) is contained in miniature electronic circuitry in an RFID tag attached to the object. Conventional RFID equipment and techniques enable a host computer to read data from and, in some cases, to write data to, RFID tags, but they do not enable such additional functions as sending commands to, or retrieving possibly large quantities of dynamic data from, RFID-tagged devices. The proposal would enable such additional functions. The figure schematically depicts an implementation of the proposal for a sensory device (e.g., a digital camera) that includes circuitry that converts sensory information to digital data. In addition to the basic sensory device, there would be a controller and a memory that would store the sensor data and/or data from the controller. The device would also be equipped with a conventional RFID chipset and antenna, which would communicate with a host computer via an RFID reader. The controller would function partly as a communication interface, implementing two-way communication protocols at all levels (including RFID if needed) between the sensory device and the memory and between the host computer and the memory. The controller would perform power V
3D digitization methods based on laser excitation and active triangulation: a comparison
NASA Astrophysics Data System (ADS)
Aubreton, Olivier; Mériaudeau, Fabrice; Truchetet, Frédéric
2016-04-01
3D reconstruction of surfaces is an important topic in computer vision with a large field of applications: industrial inspection, reverse engineering, object recognition, biometry, archeology… Because of this variety of applications, many approaches can be found in the literature, which can be classified into two families: passive and active [1]. Certainly because of their reliability, active approaches, which use an imaging system with an additional controlled light source, seem to be the most commonly used in the industrial field. In this domain, 3D digitization based on active triangulation has developed considerably during the last ten years [2] and seems mature today, judging by the large number of systems offered by manufacturers. Unfortunately, the performance of active 3D scanners depends on the optical properties of the surface to be digitized. As an example, in Fig. 1a, a 3D shape with a diffuse surface has been digitized with the Comet V scanner (Steinbichler); the 3D reconstruction is presented in Fig. 1b. The same experiment was carried out on a similar object (same shape) with a specular surface (Fig. 1c and Fig. 1d); it can clearly be observed that the specularity influences the performance of the digitization.
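Active triangulation, as used by the scanners discussed above, recovers depth from the known geometry between the projected laser ray and the camera. A minimal worked relation for single-point laser triangulation (textbook geometry, not the processing of any particular commercial scanner) is sketched below.

import math

def triangulated_depth(x_image_mm, focal_mm, baseline_mm, laser_angle_deg):
    """Single-point laser triangulation under a textbook geometry: the camera sits
    at the origin looking along +Z, the laser source sits at (baseline, 0, 0) and
    emits in the X-Z plane at laser_angle from the Z axis towards the camera axis.
    x_image_mm is the imaged spot position on the sensor; returns the depth Z of
    the illuminated point, from Z = f*b / (x + f*tan(theta))."""
    theta = math.radians(laser_angle_deg)
    return focal_mm * baseline_mm / (x_image_mm + focal_mm * math.tan(theta))

# Example: 25 mm lens, 100 mm baseline, laser tilted 20 degrees; a spot imaged
# 3.2 mm off-centre corresponds to a depth of roughly 203 mm.
print(round(triangulated_depth(3.2, 25.0, 100.0, 20.0), 1))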
Digital computer technique for setup and checkout of an analog computer
NASA Technical Reports Server (NTRS)
Ambaruch, R.
1968-01-01
Computer program technique, called Analog Computer Check-Out Routine Digitally /ACCORD/, generates complete setup and checkout data for an analog computer. In addition, the correctness of the analog program implementation is validated.
Evolving Better Cars: Teaching Evolution by Natural Selection with a Digital Inquiry Activity
ERIC Educational Resources Information Center
Royer, Anne M.; Schultheis, Elizabeth H.
2014-01-01
Evolutionary experiments are usually difficult to perform in the classroom because of the large sizes and long timescales of experiments testing evolutionary hypotheses. Computer applications give students a window to observe evolution in action, allowing them to gain comfort with the process of natural selection and facilitating inquiry…
Design and Evaluation of Simulations for the Development of Complex Decision-Making Skills.
ERIC Educational Resources Information Center
Hartley, Roger; Varley, Glen
2002-01-01
Command and Control Training Using Simulation (CACTUS) is a computer digital mapping system used by police to manage large-scale public events. Audio and video records of adaptive training scenarios using CACTUS show how the simulation develops decision-making skills for strategic and tactical event management. (SK)
ERA 1103 UNIVAC 2 Calculating Machine
1955-09-21
The new 10-by 10-Foot Supersonic Wind Tunnel at the Lewis Flight Propulsion Laboratory included high tech data acquisition and analysis systems. The reliable gathering of pressure, speed, temperature, and other data from test runs in the facilities was critical to the research process. Throughout the 1940s and early 1950s female employees, known as computers, recorded all test data and performed initial calculations by hand. The introduction of punch card computers in the late 1940s gradually reduced the number of hands-on calculations. In the mid-1950s new computational machines were installed in the office building of the 10-by 10-Foot tunnel. The new systems included this UNIVAC 1103 vacuum tube computer—the lab’s first centralized computer system. The programming was done on paper tape and fed into the machine. The 10-by 10 computer center also included the Lewis-designed Computer Automated Digital Encoder (CADDE) and Digital Automated Multiple Pressure Recorder (DAMPR) systems which converted test data to binary-coded decimal numbers and recorded test pressures automatically, respectively. The systems primarily served the 10-by 10, but were also applied to the other large facilities. Engineering Research Associates (ERA) developed the initial UNIVAC computer for the Navy in the late 1940s. In 1952 the company designed a commercial version, the UNIVAC 1103. The 1103 was the first computer designed by Seymour Cray and the first commercially successful computer.
Three-dimensional holographic display of ultrasound computed tomograms
NASA Astrophysics Data System (ADS)
Andre, Michael P.; Janee, Helmar S.; Ysrael, Mariana Z.; Hodler, Jeurg; Olson, Linda K.; Leopold, George R.; Schulz, Raymond
1997-05-01
Breast ultrasound is a valuable adjunct to mammography but is limited by a very small field of view, particularly with high-resolution transducers necessary for breast diagnosis. We have been developing an ultrasound system based on a diffraction tomography method that provides slices through the breast on a large 20-cm diameter circular field of view. Eight to fifteen images are typically produced in sequential coronal planes from the nipple to the chest wall with either 0.25 or 0.5 mm pixels. As a means to simplify the interpretation of this large set of images, we report experience with 3D life-sized displays of the entire breast of human volunteers using a digital holographic technique. The compound 3D holographic images are produced from the digital image matrix, recorded on 14 X 17 inch transparency and projected on a special white-light viewbox. Holographic visualization of the entire breast has proved to be the preferred method for 3D display of ultrasound computed tomography images. It provides a unique perspective on breast anatomy and may prove useful for biopsy guidance and surgical planning.
Smolinski, Tomasz G
2010-01-01
Computer literacy plays a critical role in today's life sciences research. Without the ability to use computers to efficiently manipulate and analyze large amounts of data resulting from biological experiments and simulations, many of the pressing questions in the life sciences could not be answered. Today's undergraduates, despite the ubiquity of computers in their lives, seem to be largely unfamiliar with how computers are being used to pursue and answer such questions. This article describes an innovative undergraduate-level course, titled Computer Literacy for Life Sciences, that aims to teach students the basics of a computerized scientific research pursuit. The purpose of the course is for students to develop a hands-on working experience in using standard computer software tools as well as computer techniques and methodologies used in life sciences research. This paper provides a detailed description of the didactical tools and assessment methods used in and outside of the classroom as well as a discussion of the lessons learned during the first installment of the course taught at Emory University in fall semester 2009.
NASA Astrophysics Data System (ADS)
Brown, Gail Laverne
The presence of a digital divide, computer and information technology integration effectiveness, and barriers to continued usage of computer and information technology were investigated. Thirty-four African American and Caucasian American students (17 males and 17 females) in grades 9--11 from 2 Georgia high school science classes were exposed to 30 hours of hands-on computer and information technology skills. The purpose of the exposure was to improve students' computer and information technology skills. Pre-study and post-study skills surveys, and structured interviews were used to compare race, gender, income, grade-level, and age differences with respect to computer usage. A paired t-test and McNemar test determined mean differences between student pre-study and post-study perceived skills levels. The results were consistent with findings of the National Telecommunications and Information Administration (2000) that indicated the presence of a digital divide and digital inclusion. Caucasian American participants were found to have more at-home computer and Internet access than African American participants, indicating that there is a digital divide by ethnicity. Caucasian American females were found to have more computer and Internet access which was an indication of digital inclusion. Sophomores had more at-home computer access and Internet access than other levels indicating digital inclusion. Students receiving regular meals had more computer and Internet access than students receiving free/reduced meals. Older students had more computer and Internet access than younger students. African American males had been using computer and information technology the longest which is an indication of inclusion. The paired t-test and McNemar test revealed significant perceived student increases in all skills levels. Interviews did not reveal any barriers to continued usage of the computer and information technology skills.
Bone age maturity assessment using hand-held device
NASA Astrophysics Data System (ADS)
Ratib, Osman M.; Gilsanz, Vicente; Liu, Xiaodong; Boechat, M. I.
2004-04-01
Purpose: Assessment of bone maturity is traditionally performed through visual comparison of a hand and wrist radiograph with existing reference images in textbooks. Our goal was to develop a digital index based on idealized hand X-ray images that can be incorporated in a hand-held computer and used for visual assessment of bone age in patients. Material and methods: Due to the large variability in bone maturation in normals, we generated a set of "ideal" images obtained by computer combinations of images from our normal reference data sets. Software for hand-held PDA devices was developed for easy navigation through the set of images and visual selection of matching images. A formula based on our statistical analysis provides the standard deviation from normal based on the chronological age of the patient. The accuracy of the program was compared to traditional interpretation by two radiologists in a double-blind reading of 200 normal Caucasian children (100 boys, 100 girls). Results: Strong correlations were present between chronological age and bone age (r > 0.9), with no statistical difference between the digital and traditional assessment methods. Determinations of carpal bone maturity in adolescents were slightly more accurate using the digital system. Users praised the convenience and effectiveness of the digital Palm Index in clinical practice. Conclusion: An idealized digital Palm Bone Age Index provides a convenient and effective alternative to conventional atlases for the assessment of skeletal maturity.
High density processing electronics for superconducting tunnel junction x-ray detector arrays
NASA Astrophysics Data System (ADS)
Warburton, W. K.; Harris, J. T.; Friedrich, S.
2015-06-01
Superconducting tunnel junctions (STJs) are excellent soft x-ray (100-2000 eV) detectors, particularly for synchrotron applications, because of their ability to obtain energy resolutions below 10 eV at count rates approaching 10 kcps. In order to achieve useful solid detection angles with these very small detectors, they are typically deployed in large arrays - currently with 100+ elements, but with 1000 elements being contemplated. In this paper we review a 5-year effort to develop compact, computer controlled low-noise processing electronics for STJ detector arrays, focusing on the major issues encountered and our solutions to them. Of particular interest are our preamplifier design, which can set the STJ operating points under computer control and achieve 2.7 eV energy resolution; our low noise power supply, which produces only 2 nV/√Hz noise at the preamplifier's critical cascode node; our digital processing card that digitizes and digitally processes 32 channels; and an STJ I-V curve scanning algorithm that computes noise as a function of offset voltage, allowing an optimum operating point to be easily selected. With 32 preamplifiers laid out on a custom 3U EuroCard, and the 32 channel digital card in a 3U PXI card format, electronics for a 128 channel array occupy only two small chassis, each the size of a National Instruments 5-slot PXI crate, and allow full array control with simple extensions of existing beam line data collection packages.
Storage and retrieval of large digital images
Bradley, J.N.
1998-01-20
Image compression and viewing are implemented with (1) a method for performing DWT-based compression on a large digital image with a computer system possessing a two-level system of memory and (2) a method for selectively viewing areas of the image from its compressed representation at multiple resolutions and, if desired, in a client-server environment. The compression of a large digital image I(x,y) is accomplished by first defining a plurality of discrete tile image data subsets T_ij(x,y) that, upon superposition, form the complete set of image data I(x,y). A seamless wavelet-based compression process is effected on I(x,y) that is comprised of successively inputting the tiles T_ij(x,y) in a selected sequence to a DWT routine, and storing the resulting DWT coefficients in a first primary memory. These coefficients are periodically compressed and transferred to a secondary memory to maintain sufficient memory in the primary memory for data processing. The sequence of DWT operations on the tiles T_ij(x,y) effectively calculates a seamless DWT of I(x,y). Data retrieval consists of specifying a resolution and a region of I(x,y) for display. The subset of stored DWT coefficients corresponding to each requested scene is determined and then decompressed for input to an inverse DWT, the output of which forms the image display. The repeated process whereby image views are specified may take the form of an interaction with a computer pointing device on an image display from a previous retrieval. 6 figs.
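The abstract above outlines the tile-by-tile DWT scheme at a high level; the Python sketch below illustrates only the basic structure (tile, transform, threshold, and reconstruct a requested region) using PyWavelets. It is a simplified illustration: the seamless cross-tile DWT and the primary/secondary memory management described in the patent are not reproduced, and the tile size, wavelet, and threshold are arbitrary assumptions.

```python
# Simplified sketch of tile-based DWT compression and region retrieval.
# Not the patented seamless-DWT or two-level memory scheme; tile size,
# wavelet, and threshold are assumptions for illustration only.
import numpy as np
import pywt

TILE = 256          # assumed tile edge length
WAVELET = "db2"     # assumed wavelet
LEVELS = 3

def compress_image(image, threshold=5.0):
    """Split the image into tiles, DWT each tile, and keep thresholded coefficients."""
    tiles = {}
    for i in range(0, image.shape[0], TILE):
        for j in range(0, image.shape[1], TILE):
            tile = image[i:i + TILE, j:j + TILE].astype(float)
            coeffs = pywt.wavedec2(tile, WAVELET, level=LEVELS)
            # crude compression: zero out small detail coefficients
            coeffs = [coeffs[0]] + [
                tuple(np.where(np.abs(d) < threshold, 0.0, d) for d in band)
                for band in coeffs[1:]
            ]
            tiles[(i, j)] = coeffs
    return tiles

def retrieve_region(tiles, i, j):
    """Reconstruct one stored tile (a requested 'region') from its coefficients."""
    return pywt.waverec2(tiles[(i, j)], WAVELET)

if __name__ == "__main__":
    img = np.random.rand(512, 512) * 255
    store = compress_image(img)
    view = retrieve_region(store, 256, 0)
    print(view.shape)
```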
Wavelet-enabled progressive data Access and Storage Protocol (WASP)
NASA Astrophysics Data System (ADS)
Clyne, J.; Frank, L.; Lesperance, T.; Norton, A.
2015-12-01
Current practices for storing numerical simulation outputs hail from an era when the disparity between compute and I/O performance was not as great as it is today. The memory contents for every sample, computed at every grid point location, are simply saved at some prescribed temporal frequency. Though straightforward, this approach fails to take advantage of the coherency in neighboring grid points that invariably exists in numerical solutions to mathematical models. Exploiting such coherence is essential to digital multimedia; DVD-Video, digital cameras, streaming movies and audio are all possible today because of transform-based compression schemes that make substantial reductions in data possible by taking advantage of the strong correlation between adjacent samples in both space and time. Such methods can also be exploited to enable progressive data refinement in a manner akin to that used in ubiquitous digital mapping applications: views from far away are shown in coarsened detail to provide context, and can be progressively refined as the user zooms in on a localized region of interest. The NSF funded WASP project aims to provide a common, NetCDF-compatible software framework for supporting wavelet-based, multi-scale, progressive data, enabling interactive exploration of large data sets for the geoscience communities. This presentation will provide an overview of this work in progress to develop community cyber-infrastructure for the efficient analysis of very large data sets.
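As a rough illustration of the progressive refinement idea described above, the sketch below reconstructs a signal from its coarsest wavelet approximation and then adds detail levels one at a time, using PyWavelets. It demonstrates only the principle; it is not the WASP protocol or its NetCDF-compatible interface, and the test signal, wavelet, and decomposition depth are arbitrary choices.

```python
# Minimal sketch of wavelet-based progressive refinement: reconstruct a signal
# from its coarsest approximation, then add detail levels one at a time.
# Illustrative only; this is not the WASP framework itself.
import numpy as np
import pywt

signal = np.sin(np.linspace(0, 8 * np.pi, 1024)) + 0.1 * np.random.randn(1024)
coeffs = pywt.wavedec(signal, "db4", level=5)   # [cA5, cD5, cD4, ..., cD1]

for keep in range(len(coeffs)):
    # zero out the detail levels that have not been "transmitted" yet
    partial = [c if k <= keep else np.zeros_like(c) for k, c in enumerate(coeffs)]
    approx = pywt.waverec(partial, "db4")
    err = np.sqrt(np.mean((approx[:len(signal)] - signal) ** 2))
    print(f"levels received: {keep + 1:2d}  rms error: {err:.4f}")
```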
Digital ocular fundus imaging: a review.
Bernardes, Rui; Serranho, Pedro; Lobo, Conceição
2011-01-01
Ocular fundus imaging plays a key role in monitoring the health status of the human eye. Currently, a large number of imaging modalities allow the assessment and/or quantification of ocular changes from a healthy status. This review focuses on the main digital fundus imaging modality, color fundus photography, with a brief overview of complementary techniques, such as fluorescein angiography. While focusing on two-dimensional color fundus photography, the authors address the evolution from nondigital to digital imaging and its impact on diagnosis. They also compare several studies performed along the transitional path of this technology. Retinal image processing and analysis, automated disease detection and identification of the stage of diabetic retinopathy (DR) are addressed as well. The authors emphasize the problems of image segmentation, focusing on the major landmark structures of the ocular fundus: the vascular network, optic disk and the fovea. Several proposed approaches for the automatic detection of signs of disease onset and progression, such as microaneurysms, are surveyed. A thorough comparison is conducted among different studies with regard to the number of eyes/subjects, imaging modality, fundus camera used, field of view and image resolution to identify the large variation in characteristics from one study to another. Similarly, the main features of the proposed classifications and algorithms for the automatic detection of DR are compared, thereby addressing computer-aided diagnosis and computer-aided detection for use in screening programs. Copyright © 2011 S. Karger AG, Basel.
Correlative Feature Analysis for Multimodality Breast CAD
2009-09-01
Wetland mapping from digitized aerial photography. [Sheboygen Marsh, Sheboygen County, Wisconsin
NASA Technical Reports Server (NTRS)
Scarpace, F. L.; Quirk, B. K.; Kiefer, R. W.; Wynn, S. L.
1981-01-01
Computer assisted interpretation of small scale aerial imagery was found to be a cost effective and accurate method of mapping complex vegetation patterns if high resolution information is desired. This type of technique is suited for problems such as monitoring changes in species composition due to environmental factors and is a feasible method of monitoring and mapping large areas of wetlands. The technique has the added advantage of being in a computer compatible form which can be transformed into any georeference system of interest.
Program Processes Thermocouple Readings
NASA Technical Reports Server (NTRS)
Quave, Christine A.; Nail, William, III
1995-01-01
Digital Signal Processor for Thermocouples (DART) computer program implements precise and fast method of converting voltage to temperature for large-temperature-range thermocouple applications. Written using LabVIEW software. DART available only as object code for use on Macintosh II FX or higher-series computers running System 7.0 or later and IBM PC-series and compatible computers running Microsoft Windows 3.1. Macintosh version of DART (SSC-00032) requires LabVIEW 2.2.1 or 3.0 for execution. IBM PC version (SSC-00031) requires LabVIEW 3.0 for Windows 3.1. LabVIEW software product of National Instruments and not included with program.
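DART itself is distributed only as LabVIEW object code, so the following is merely a hedged sketch of the general technique such programs use: converting thermocouple EMF to temperature by evaluating piecewise polynomial fits over voltage ranges. The ranges and coefficients here are placeholders for illustration, not NIST reference tables and not anything taken from DART.

```python
# Sketch of voltage-to-temperature conversion for a thermocouple using
# piecewise polynomial evaluation, the general approach such programs take.
# The ranges and coefficients below are PLACEHOLDERS, not real calibration data.
import numpy as np

# (v_min_mV, v_max_mV, polynomial coefficients in ascending powers of mV)
SEGMENTS = [
    (-6.0,  0.0, [0.0, 25.0, -0.5]),   # hypothetical low-range fit
    ( 0.0, 55.0, [0.0, 24.0,  0.07]),  # hypothetical high-range fit
]

def mv_to_celsius(millivolts):
    """Convert thermocouple EMF (mV) to temperature (deg C), vectorized."""
    mv = np.asarray(millivolts, dtype=float)
    temp = np.full_like(mv, np.nan)
    for lo, hi, c in SEGMENTS:
        mask = (mv >= lo) & (mv <= hi)
        temp[mask] = np.polynomial.polynomial.polyval(mv[mask], c)
    return temp

print(mv_to_celsius([1.0, 10.0, 41.0]))
```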
Diamond, Alan; Nowotny, Thomas; Schmuker, Michael
2016-01-01
Neuromorphic computing employs models of neuronal circuits to solve computing problems. Neuromorphic hardware systems are now becoming more widely available and “neuromorphic algorithms” are being developed. As they are maturing toward deployment in general research environments, it becomes important to assess and compare them in the context of the applications they are meant to solve. This should encompass not just task performance, but also ease of implementation, speed of processing, scalability, and power efficiency. Here, we report our practical experience of implementing a bio-inspired, spiking network for multivariate classification on three different platforms: the hybrid digital/analog Spikey system, the digital spike-based SpiNNaker system, and GeNN, a meta-compiler for parallel GPU hardware. We assess performance using a standard hand-written digit classification task. We found that whilst a different implementation approach was required for each platform, classification performances remained in line. This suggests that all three implementations were able to exercise the model's ability to solve the task rather than exposing inherent platform limits, although differences emerged when capacity was approached. With respect to execution speed and power consumption, we found that for each platform a large fraction of the computing time was spent outside of the neuromorphic device, on the host machine. Time was spent in a range of combinations of preparing the model, encoding suitable input spiking data, shifting data, and decoding spike-encoded results. This is also where a large proportion of the total power was consumed, most markedly for the SpiNNaker and Spikey systems. We conclude that the simulation efficiency advantage of the assessed specialized hardware systems is easily lost in excessive host-device communication, or non-neuronal parts of the computation. These results emphasize the need to optimize the host-device communication architecture for scalability, maximum throughput, and minimum latency. Moreover, our results indicate that special attention should be paid to minimize host-device communication when designing and implementing networks for efficient neuromorphic computing. PMID:26778950
Computer Storage and Retrieval of Position - Dependent Data.
1982-06-01
This thesis covers the design of a new digital database system to replace the merged (observation and geographic location) record, one file per cruise... "The Digital Data Library System: Library Storage and Retrieval of Digital Geophysical Data" by Robert C. Groan) provided a relatively simple... dependent, 'geophysical' data. The system is operational on a Digital Equipment Corporation VAX-11/780 computer. Values of measured and computed
Improved digital filters for evaluating Fourier and Hankel transform integrals
Anderson, Walter L.
1975-01-01
New algorithms are described for evaluating Fourier (cosine, sine) and Hankel (J0, J1) transform integrals by means of digital filters. The filters have been designed with extended lengths so that a variable convolution operation can be applied to a large class of integral transforms having the same system transfer function. A lagged-convolution method is also presented to significantly decrease the computation time when computing a series of like transforms over a parameter set spaced the same as the filters. Accuracy of the new filters is comparable to Gaussian integration, provided moderate parameter ranges and well-behaved kernel functions are used. A collection of Fortran IV subprograms is included for both real and complex functions for each filter type. The algorithms have been successfully used in geophysical applications containing a wide variety of integral transforms.
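The filter method has a compact structure that a short sketch can make concrete: the kernel is sampled at log-spaced abscissae scaled by the output argument and combined with fixed weights. In the Python fragment below, the five abscissae and weights are placeholders only; the filters described above use long, carefully designed coefficient tables (supplied in the cited work as Fortran IV subprograms).

```python
# Structural sketch of evaluating a Hankel-transform integral
#   F(r) = integral of f(lam) * J0(lam * r) * lam dlam
# with a digital filter: sample the kernel at log-spaced abscissae and form a
# weighted sum. The abscissae/weights below are tiny PLACEHOLDERS, not the
# extended filters described in the abstract.
import numpy as np

FILTER_ABSCISSAE = np.array([0.1, 0.3, 1.0, 3.0, 10.0])   # hypothetical
FILTER_WEIGHTS   = np.array([0.02, 0.15, 0.60, 0.20, 0.03])  # hypothetical

def hankel_j0(kernel, r):
    """Approximate F(r) for one output argument r using the filter weights."""
    lam = FILTER_ABSCISSAE / r          # shift the filter to this value of r
    return np.sum(kernel(lam) * FILTER_WEIGHTS) / r

# The same convolution structure is reused for every r; in the lagged-convolution
# variant, kernel evaluations are shared across neighbouring values of r.
radii = np.logspace(-1, 1, 5)
values = [hankel_j0(lambda lam: np.exp(-lam), r) for r in radii]
print(values)
```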
Implementation of AN Unmanned Aerial Vehicle System for Large Scale Mapping
NASA Astrophysics Data System (ADS)
Mah, S. B.; Cryderman, C. S.
2015-08-01
Unmanned Aerial Vehicles (UAVs), digital cameras, powerful personal computers, and software have made it possible for geomatics professionals to capture aerial photographs and generate digital terrain models and orthophotographs without using full scale aircraft or hiring mapping professionals. This has been made possible by the availability of miniaturized computers and sensors, and software which has been driven, in part, by the demand for this technology in consumer items such as smartphones. The other force that is in play is the increasing number of Do-It-Yourself (DIY) people who are building UAVs as a hobby or for professional use. Building a UAV system for mapping is an alternative to purchasing a turnkey system. This paper describes factors to be considered when building a UAV mapping system, the choices made, and the test results of a project using this completed system.
Electromechanical quantum simulators
NASA Astrophysics Data System (ADS)
Tacchino, F.; Chiesa, A.; LaHaye, M. D.; Carretta, S.; Gerace, D.
2018-06-01
Digital quantum simulators are among the most appealing applications of a quantum computer. Here we propose a universal, scalable, and integrated quantum computing platform based on tunable nonlinear electromechanical nano-oscillators. It is shown that very high operational fidelities for single- and two-qubits gates can be achieved in a minimal architecture, where qubits are encoded in the anharmonic vibrational modes of mechanical nanoresonators, whose effective coupling is mediated by virtual fluctuations of an intermediate superconducting artificial atom. An effective scheme to induce large single-phonon nonlinearities in nanoelectromechanical devices is explicitly discussed, thus opening the route to experimental investigation in this direction. Finally, we explicitly show the very high fidelities that can be reached for the digital quantum simulation of model Hamiltonians, by using realistic experimental parameters in state-of-the-art devices, and considering the transverse field Ising model as a paradigmatic example.
Readout Electronics for the Central Drift Chamber of the Belle-II Detector
NASA Astrophysics Data System (ADS)
Uchida, Tomohisa; Taniguchi, Takashi; Ikeno, Masahiro; Iwasaki, Yoshihito; Saito, Masatoshi; Shimazaki, Shoichi; Tanaka, Manobu M.; Taniguchi, Nanae; Uno, Shoji
2015-08-01
We have developed readout electronics for the central drift chamber (CDC) of the Belle-II detector. The space near the endplate of the CDC for installation of the electronics was limited by the detector structure. Due to the large amounts of data generated by the CDC, a high-speed data link, with a greater than one gigabit transfer rate, was required to transfer the data to a back-end computer. A new readout module was required to satisfy these requirements. This module processes 48 signals from the CDC, converts them to digital data and transfers it directly to the computer. All functions that transfer digital data via the high speed link were implemented on the single module. We have measured its electrical characteristics and confirmed that the results satisfy the requirements of the Belle-II experiment.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-22
... Digital Computer Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory..., ``Verification, Validation, Reviews, and Audits for Digital Computer Software used in Safety Systems of Nuclear... NRC regulations promoting the development of, and compliance with, software verification and...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-22
... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Test Documentation for Digital Computer Software...) is issuing for public comment draft regulatory guide (DG), DG-1207, ``Test Documentation for Digital... practices for test documentation for software and computer systems as described in the Institute of...
Stapleton, Brandon M; Lin, Wei-Shao; Ntounis, Athanasios; Harris, Bryan T; Morton, Dean
2014-09-01
This clinical report demonstrated the use of an implant-supported fixed dental prosthesis fabricated with a contemporary digital approach. The digital diagnostic data acquisition was completed with a digital diagnostic impression with an intraoral scanner and cone-beam computed tomography with a prefabricated universal radiographic template to design a virtual prosthetically driven implant surgical plan. A surgical template fabricated with computer-aided design and computer-aided manufacturing (CAD/CAM) was used to perform computer-guided implant surgery. The definitive digital data were then used to design the definitive CAD/CAM-fabricated fixed dental prosthesis. Copyright © 2014 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
Spectroscopic analysis and control
Tate; , James D.; Reed, Christopher J.; Domke, Christopher H.; Le, Linh; Seasholtz, Mary Beth; Weber, Andy; Lipp, Charles
2017-04-18
Apparatus for spectroscopic analysis which includes a tunable diode laser spectrometer having a digital output signal and a digital computer for receiving the digital output signal from the spectrometer, the digital computer programmed to process the digital output signal using a multivariate regression algorithm. In addition, a spectroscopic method of analysis using such apparatus. Finally, a method for controlling an ethylene cracker hydrogenator.
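As a rough sketch of the processing step named above (a multivariate regression algorithm applied to the spectrometer's digital output), the code below calibrates an ordinary least-squares model mapping synthetic absorbance spectra to an analyte concentration and applies it to a new scan. All data and dimensions are made up; an industrial system of this kind would typically use a validated chemometric model such as PLS rather than this bare least-squares fit.

```python
# Minimal sketch of multivariate regression on spectrometer output: fit a
# linear model from spectra to concentration, then predict on a new scan.
# Synthetic data; not the calibration used by the patented apparatus.
import numpy as np

rng = np.random.default_rng(0)
n_train, n_channels = 400, 200
true_weights = np.zeros(n_channels)
true_weights[80:90] = 0.05                       # hypothetical absorption band

spectra = rng.normal(0.0, 0.01, (n_train, n_channels))
conc = rng.uniform(0.0, 1.0, n_train)
spectra += np.outer(conc, true_weights)          # concentration-dependent band

# calibrate: least-squares fit with an intercept term
X = np.hstack([spectra, np.ones((n_train, 1))])
beta, *_ = np.linalg.lstsq(X, conc, rcond=None)

def predict(spectrum):
    return float(np.dot(np.append(spectrum, 1.0), beta))

new_scan = rng.normal(0.0, 0.01, n_channels) + 0.42 * true_weights
print(f"predicted concentration: {predict(new_scan):.3f}")
```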
Digital signal processing algorithms for automatic voice recognition
NASA Technical Reports Server (NTRS)
Botros, Nazeih M.
1987-01-01
The current digital signal analysis algorithms are investigated that are implemented in automatic voice recognition algorithms. Automatic voice recognition means, the capability of a computer to recognize and interact with verbal commands. The digital signal is focused on, rather than the linguistic, analysis of speech signal. Several digital signal processing algorithms are available for voice recognition. Some of these algorithms are: Linear Predictive Coding (LPC), Short-time Fourier Analysis, and Cepstrum Analysis. Among these algorithms, the LPC is the most widely used. This algorithm has short execution time and do not require large memory storage. However, it has several limitations due to the assumptions used to develop it. The other 2 algorithms are frequency domain algorithms with not many assumptions, but they are not widely implemented or investigated. However, with the recent advances in the digital technology, namely signal processors, these 2 frequency domain algorithms may be investigated in order to implement them in voice recognition. This research is concerned with real time, microprocessor based recognition algorithms.
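To make the most widely used of the listed algorithms concrete, the sketch below computes LPC coefficients for one speech frame by the standard autocorrelation method with the Levinson-Durbin recursion. The frame length, sample rate, and model order are illustrative assumptions, not values from the cited study.

```python
# Hedged sketch of LPC analysis: window a frame, compute its short-time
# autocorrelation, and solve for predictor coefficients with Levinson-Durbin.
import numpy as np

def lpc(frame, order=10):
    """Return LPC polynomial coefficients a[0..order] (a[0] = 1) and residual energy."""
    frame = frame * np.hamming(len(frame))
    # short-time autocorrelation r[0..order]
    r = np.array([np.dot(frame[:len(frame) - k], frame[k:]) for k in range(order + 1)])
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for i in range(1, order + 1):                       # Levinson-Durbin recursion
        acc = r[i] + np.dot(a[1:i], r[i - 1:0:-1])
        k = -acc / err
        a[1:i] = a[1:i] + k * a[i - 1:0:-1]
        a[i] = k
        err *= (1.0 - k * k)
    return a, err

# toy frame: a vowel-like mixture of two tones plus a little noise (8 kHz, 30 ms)
t = np.arange(240) / 8000.0
frame = np.sin(2 * np.pi * 700 * t) + 0.5 * np.sin(2 * np.pi * 1200 * t)
frame += 0.01 * np.random.randn(len(frame))
coeffs, residual = lpc(frame, order=10)
print(coeffs)
```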
An application of digital network technology to medical image management.
Chu, W K; Smith, C L; Wobig, R K; Hahn, F A
1997-01-01
With the advent of network technology, there is considerable interest within the medical community in managing the storage and distribution of medical images by digital means. Higher workflow efficiency leading to better patient care is one of the commonly cited outcomes [1,2]. However, due to the size of medical image files and the unique requirements in detail and resolution, medical image management poses special challenges. Storage requirements are usually large, and the associated investment costs have made digital networking projects financially out of reach for many clinical institutions. New advances in network technology and telecommunication, in conjunction with the decreasing cost of computer devices, have made digital image management achievable. In our institution, we have recently completed a pilot project to distribute medical images both within the physical confines of the clinical enterprise and outside the medical center campus. The design concept and the configuration of a comprehensive digital image network are described in this report.
Environment and health: Probes and sensors for environment digital control
NASA Astrophysics Data System (ADS)
Schettini, Chiara
2014-05-01
The idea of studying the environment using New Technologies (NT) came from a MIUR (Ministry of Education of the Italian Government) notice that allocated funds for the realization of innovative school science projects. The "Environment and Health" project uses probes and sensors for digital control of the environment (water, air and soil). The working group was composed of 4 science teachers from the Liceo Statale G. Mazzini, under the coordination of teacher Chiara Schettini. The Didactic Section of Naples City of Sciences helped the teachers develop the project and organized a refresher course for them on the use of digital control sensors. The project connects environment and technology: the study of natural features and the analysis of chemical-physical parameters give students and teachers skills for studying the environment through the use of NT for computer-based data processing. During the practical part of the project, samples of air, water and soil were gathered in different contexts. Sample analysis was done in the school's scientific laboratory with digitally controlled sensors. The data are processed with specific software, and the results have been collected in a booklet and in a digital database. During the first year, the project involved 6 school classes (students aged 14-15 years), under the coordination of their science teachers. The project aims are: 1) making students more aware of environmental matters; 2) achieving basic skills for evaluating air, water and soil quality; 3) achieving strong skills in the use of digitally controlled sensors; 4) achieving computing skills for processing and presenting data. The project seeks to develop a broad environmental awareness and an appreciation that a healthy environment is needed to protect our health. Moreover, it promotes NT as an instrument of knowledge.
Implementing digital skills training in care homes: a literature review.
Wild, Deidre; Kydd, Angela; Szczepura, Ala
2016-05-01
This article is the first of a two-part series that informs and describes digital skills training using a dedicated console computer provided for staff and residents in a care home setting. This was part of a programme of culture change in a large care home with nursing in Glasgow, Scotland. The literature review shows that over the past decade there has been a gradual increase in the use of digital technology by staff and older people in community settings including care homes. Policy from the European Commission presents a persuasive argument for the advancement of technology-enabled care to counter the future impact of an increased number of people of advanced age on finite health and social care resources. The psychosocial and environmental issues that inhibit or enhance the acquisition of digital skills in care homes are considered and include the identification of exemplar schemes and the support involved.
A TREETOPS simulation of the Hubble Space Telescope-High Gain Antenna interaction
NASA Technical Reports Server (NTRS)
Sharkey, John P.
1987-01-01
Virtually any project dealing with the control of a Large Space Structure (LSS) will involve some level of verification by digital computer simulation. While the Hubble Space Telescope might not normally be included in a discussion of LSS, it is presented to highlight a recently developed simulation and analysis program named TREETOPS. TREETOPS provides digital simulation, linearization, and control system interaction of flexible, multibody spacecraft which admit to a point-connected tree topology. The HST application of TREETOPS is intended to familiarize the LSS community with TREETOPS by presenting a user perspective of its key features.
NASA Technical Reports Server (NTRS)
Neiner, G. H.; Cole, G. L.; Arpasi, D. J.
1972-01-01
Digital computer control of a mixed-compression inlet is discussed. The inlet was terminated with a choked orifice at the compressor face station to dynamically simulate a turbojet engine. Inlet diffuser exit airflow disturbances were used. A digital version of a previously tested analog control system was used for both normal shock and restart control. Digital computer algorithms were derived using z-transform and finite difference methods. Using a sample rate of 1000 samples per second, the digital normal shock and restart controls essentially duplicated the inlet analog computer control results. At a sample rate of 100 samples per second, the control system performed adequately but was less stable.
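The abstract does not give the control law itself, so the sketch below only illustrates the generic step it refers to: deriving a digital control algorithm from an analog one with a finite-difference approximation and running it at 1000 samples per second. The PI gains and the first-order plant are hypothetical; the actual normal-shock and restart control laws are in the cited report.

```python
# Sketch of converting an analog control law to a digital difference equation:
# a generic PI controller, u(t) = Kp*e(t) + Ki*integral(e), discretized with a
# backward-difference approximation and run at 1000 samples/s.
# Gains and plant are hypothetical, not the inlet control of the cited work.
FS = 1000.0          # sample rate, samples/s
DT = 1.0 / FS
KP, KI = 0.8, 5.0    # hypothetical gains

class DigitalPI:
    def __init__(self):
        self.integral = 0.0

    def update(self, error):
        self.integral += error * DT          # backward-difference integration
        return KP * error + KI * self.integral

# tiny closed-loop test against a hypothetical first-order plant
plant_state, setpoint = 0.0, 1.0
ctrl = DigitalPI()
for step in range(2000):
    u = ctrl.update(setpoint - plant_state)
    plant_state += DT * (-2.0 * plant_state + 2.0 * u)   # dx/dt = -2x + 2u
print(f"output after 2 s: {plant_state:.3f}")
```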
Product definition data interface
NASA Technical Reports Server (NTRS)
Birchfield, B.; Downey, P.
1984-01-01
The development and application of advanced Computer Aided Design/Computer Aided Manufacturing (CAD/CAM) technology in the aerospace industry is discussed. New CAD/CAM capabilities provide the engineer and production worker with tools to produce better products and significantly improve productivity. This technology is expanding in all phases of engineering and manufacturing, with large potential for improvements in productivity. The systematic integration of CAD and CAM to ensure maximum utility throughout the U.S. aerospace industry, its large community of supporting suppliers, and the Department of Defense aircraft overhaul and repair facilities is outlined. The need for a framework for the exchange of digital product definition data, which serves the function of the conventional engineering drawing, is emphasized.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-22
... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Unit Testing for Digital Computer Software...) is issuing for public comment draft regulatory guide (DG), DG-1208, ``Software Unit Testing for Digital Computer Software used in Safety Systems of Nuclear Power Plants.'' The DG-1208 is proposed...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-02
... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Unit Testing for Digital Computer Software... revised regulatory guide (RG), revision 1 of RG 1.171, ``Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants.'' This RG endorses American National Standards...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-24
... Digital Computer-Based Instrumentation and Control Systems.'' This BTP is to be cited as the acceptance criteria for Diversity and Defense-in-Depth in Digital Computer-Based Instrumentation and Control Systems... Evaluation of Diversity and Defense-in-Depth in Digital Computer-Based Instrumentation and Control Systems...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-22
... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Developing Software Life Cycle Processes for Digital... Software Life Cycle Processes for Digital Computer Software used in Safety Systems of Nuclear Power Plants... clarifications, the enhanced consensus practices for developing software life-cycle processes for digital...
2013-01-01
Background Virtual microscopy includes digitisation of histology slides and the use of computer technologies for complex investigation of diseases such as cancer. However, automated image analysis, or website publishing of such digital images, is hampered by their large file sizes. Results We have developed two Java based open source tools: Snapshot Creator and NDPI-Splitter. Snapshot Creator converts a portion of a large digital slide into a desired quality JPEG image. The image is linked to the patient’s clinical and treatment information in a customised open source cancer data management software (Caisis) in use at the Australian Breast Cancer Tissue Bank (ABCTB) and then published on the ABCTB website (http://www.abctb.org.au) using Deep Zoom open source technology. Using the ABCTB online search engine, digital images can be searched by defining various criteria such as cancer type, or biomarkers expressed. NDPI-Splitter splits a large image file into smaller sections of TIFF images so that they can be easily analysed by image analysis software such as Metamorph or Matlab. NDPI-Splitter also has the capacity to filter out empty images. Conclusions Snapshot Creator and NDPI-Splitter are novel open source Java tools. They convert digital slides into files of smaller size for further processing. In conjunction with other open source tools such as Deep Zoom and Caisis, this suite of tools is used for the management and archiving of digital microscopy images, enabling digitised images to be explored and zoomed online. Our online image repository also has the capacity to be used as a teaching resource. These tools also enable large files to be sectioned for image analysis. Virtual Slides The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5330903258483934 PMID:23402499
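Snapshot Creator and NDPI-Splitter are Java tools, so the following Python fragment is only a generic sketch of the tiling idea they implement: cut a large image into fixed-size tiles, skip tiles that are essentially empty background, and write the rest to disk. It assumes the image already fits in memory as a NumPy array, whereas real NDPI whole-slide files require a dedicated reader; the tile size and emptiness threshold are arbitrary choices.

```python
# Generic sketch of splitting a large image into tiles and filtering out
# near-empty ones. Not the NDPI-Splitter implementation; thresholds are assumed.
import numpy as np
from PIL import Image

def split_into_tiles(image, tile=1024, std_threshold=3.0, prefix="tile"):
    written = 0
    for i in range(0, image.shape[0], tile):
        for j in range(0, image.shape[1], tile):
            patch = image[i:i + tile, j:j + tile]
            if patch.std() < std_threshold:      # nearly uniform -> treat as empty
                continue
            Image.fromarray(patch).save(f"{prefix}_{i}_{j}.tif")
            written += 1
    return written

demo = np.full((2048, 2048), 255, dtype=np.uint8)      # mostly blank "slide"
demo[300:900, 300:900] = np.random.randint(0, 200, (600, 600), dtype=np.uint8)
print(split_into_tiles(demo), "non-empty tiles written")
```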
Computational imaging of sperm locomotion.
Daloglu, Mustafa Ugur; Ozcan, Aydogan
2017-08-01
Not only essential for scientific research, but also in the analysis of male fertility and for animal husbandry, sperm tracking and characterization techniques have been greatly benefiting from computational imaging. Digital image sensors, in combination with optical microscopy tools and powerful computers, have enabled the use of advanced detection and tracking algorithms that automatically map sperm trajectories and calculate various motility parameters across large data sets. Computational techniques are driving the field even further, facilitating the development of unconventional sperm imaging and tracking methods that do not rely on standard optical microscopes and objective lenses, which limit the field of view and volume of the semen sample that can be imaged. As an example, a holographic on-chip sperm imaging platform, only composed of a light-emitting diode and an opto-electronic image sensor, has emerged as a high-throughput, low-cost and portable alternative to lens-based traditional sperm imaging and tracking methods. In this approach, the sample is placed very close to the image sensor chip, which captures lensfree holograms generated by the interference of the background illumination with the light scattered from sperm cells. These holographic patterns are then digitally processed to extract both the amplitude and phase information of the spermatozoa, effectively replacing the microscope objective lens with computation. This platform has further enabled high-throughput 3D imaging of spermatozoa with submicron 3D positioning accuracy in large sample volumes, revealing various rare locomotion patterns. We believe that computational chip-scale sperm imaging and 3D tracking techniques will find numerous opportunities in both sperm related research and commercial applications. © The Authors 2017. Published by Oxford University Press on behalf of Society for the Study of Reproduction. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
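Lensfree on-chip imaging of the kind described relies on numerically back-propagating the recorded hologram to the object plane. The sketch below applies a generic angular-spectrum propagation to a stand-in hologram; it is not the authors' reconstruction pipeline, and the wavelength, pixel pitch, and propagation distance are placeholders.

```python
# Generic angular-spectrum back-propagation of an in-line hologram.
# Parameters are placeholders; this is not the cited platform's pipeline.
import numpy as np

def angular_spectrum_propagate(hologram, wavelength, pixel, z):
    """Numerically propagate a recorded intensity pattern back by distance z."""
    field = np.sqrt(hologram.astype(float))          # crude amplitude estimate
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel)
    fy = np.fft.fftfreq(ny, d=pixel)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    transfer = np.exp(-1j * kz * z) * (arg > 0)      # evanescent waves dropped
    return np.fft.ifft2(np.fft.fft2(field) * transfer)

holo = np.random.rand(512, 512)                      # stand-in for a real frame
obj = angular_spectrum_propagate(holo, wavelength=530e-9, pixel=1.12e-6, z=300e-6)
print(np.abs(obj).mean())
```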
Data Processing Factory for the Sloan Digital Sky Survey
NASA Astrophysics Data System (ADS)
Stoughton, Christopher; Adelman, Jennifer; Annis, James T.; Hendry, John; Inkmann, John; Jester, Sebastian; Kent, Steven M.; Kuropatkin, Nickolai; Lee, Brian; Lin, Huan; Peoples, John, Jr.; Sparks, Robert; Tucker, Douglas; Vanden Berk, Dan; Yanny, Brian; Yocum, Dan
2002-12-01
The Sloan Digital Sky Survey (SDSS) data handling presents two challenges: large data volume and timely production of spectroscopic plates from imaging data. A data processing factory, using technologies both old and new, handles this flow. Distribution to end users is via disk farms, to serve corrected images and calibrated spectra, and a database, to efficiently process catalog queries. For distribution of modest amounts of data from Apache Point Observatory to Fermilab, scripts use rsync to update files, while larger data transfers are accomplished by shipping magnetic tapes commercially. All data processing pipelines are wrapped in scripts to address consecutive phases: preparation, submission, checking, and quality control. We constructed the factory by chaining these pipelines together while using an operational database to hold processed imaging catalogs. The science database catalogs all imaging and spectroscopic objects, with pointers to the various external files associated with them. Diverse computing systems address particular processing phases. UNIX computers handle tape reading and writing, as well as calibration steps that require access to a large amount of data with relatively modest computational demands. Commodity CPUs process steps that require access to a limited amount of data with more demanding computational requirements. Disk servers optimized for cost per Gbyte serve terabytes of processed data, while servers optimized for disk read speed run SQLServer software to process queries on the catalogs. This factory produced data for the SDSS Early Data Release in June 2001, and it is currently producing Data Release One, scheduled for January 2003.
NASA Astrophysics Data System (ADS)
Acernese, Fausto; Barone, Fabrizio; De Rosa, Rosario; Eleuteri, Antonio; Milano, Leopoldo; Pardi, Silvio; Ricciardi, Iolanda; Russo, Guido
2004-09-01
One of the main requirements of a digital system for the control of interferometric detectors of gravitational waves is computing power, a direct consequence of the increasing complexity of the digital algorithms necessary for control signal generation. For this specific task many specialized, non-standard real-time architectures have been developed, often very expensive and difficult to upgrade. On the other hand, such computing power is generally fully available for off-line applications on standard PC-based systems. Therefore, a possible and obvious solution may be provided by the integration of the real-time and off-line architectures, resulting in a hybrid control system architecture based on standard, available components, which combines the precise data synchronization provided by real-time systems with the large computing power available on PC-based systems. Such integration may be provided by implementing the link between the two different architectures over the standard Ethernet network, whose data transfer speed has been increasing greatly in recent years, using the TCP/IP, UDP and raw Ethernet protocols. In this paper we describe the architecture of a hybrid Ethernet-based real-time control system prototype we implemented in Napoli, discussing its characteristics and performance. Finally, we discuss a possible application to the real-time control of a suspended mass of the mode cleaner of the 3 m prototype optical interferometer for gravitational wave detection (IDGW-3P) operational in Napoli.
Simple video format for mobile applications
NASA Astrophysics Data System (ADS)
Smith, John R.; Miao, Zhourong; Li, Chung-Sheng
2000-04-01
With the advent of pervasive computing, there is a growing demand for enabling multimedia applications on mobile devices. Large numbers of pervasive computing devices, such as personal digital assistants (PDAs), hand-held computers (HHCs), smart phones, portable audio players, automotive computing devices, and wearable computers are gaining access to online information sources. However, pervasive computing devices are often constrained along a number of dimensions, such as processing power, local storage, display size and depth, connectivity, and communication bandwidth, which makes it difficult to access rich image and video content. In this paper, we report on our initial efforts in designing a simple scalable video format with low decoding and transcoding complexity for pervasive computing. The goal is to enable image and video access for mobile applications such as electronic catalog shopping, video conferencing, remote surveillance and video mail using pervasive computing devices.
Collen, M F
1994-01-01
This article summarizes the origins of informatics, which is based on the science, engineering, and technology of computer hardware, software, and communications. In just four decades, from the 1950s to the 1990s, computer technology has progressed from slow, first-generation vacuum tubes, through the invention of the transistor and its incorporation into microprocessor chips, and ultimately, to fast, fourth-generation very-large-scale-integrated silicon chips. Programming has undergone a parallel transformation, from cumbersome, first-generation, machine languages to efficient, fourth-generation application-oriented languages. Communication has evolved from simple copper wires to complex fiberoptic cables in computer-linked networks. The digital computer has profound implications for the development and practice of clinical medicine. PMID:7719803
Creating Joint Representations of Collaborative Problem Solving with Multi-Touch Technology
ERIC Educational Resources Information Center
Mercier, E.; Higgins, S.
2014-01-01
Multi-touch surfaces have the potential to change the nature of computer-supported collaborative learning, allowing more equitable access to shared digital content. In this paper, we explore how large multi-touch tables can be used by groups of students as an external representation of their group interaction processes. Video data from 24 groups…
Teachers Left Behind: Acceptance and Use of Technology in Lebanese Public High Schools
ERIC Educational Resources Information Center
Baytiyeh, Hoda
2014-01-01
Nowadays, the use of computers in education is increasing worldwide. Information technology is deemed essential for the digital generation's classrooms. However, the adoption of technology in teaching and learning largely depends on the culture and social context. The aim of this research study is to evaluate the acceptance and use of technology…
New Media Literacy Education (NMLE): A Developmental Approach
ERIC Educational Resources Information Center
Graber, Diana
2012-01-01
The digital world is full of both possibility and peril, with rules of engagement being hashed out as we go. While schools are still "hesitant to embrace new technologies as a backlash from the significant, and largely ineffectual, investment in classroom computers as an instructional panacea during in the mid-1990's" (Collins and Halverson 2009),…
Recognizing User Identity by Touch on Tabletop Displays: An Interactive Authentication Method
ERIC Educational Resources Information Center
Torres Peralta, Raquel
2012-01-01
Multi-touch tablets allow users to interact with computers through intuitive, natural gestures and direct manipulation of digital objects. One advantage of these devices is that they can offer a large, collaborative space where several users can work on a task at the same time. However the lack of privacy in these situations makes standard…
Real-time simulation of the TF30-P-3 turbofan engine using a hybrid computer
NASA Technical Reports Server (NTRS)
Szuch, J. R.; Bruton, W. M.
1974-01-01
A real-time, hybrid-computer simulation of the TF30-P-3 turbofan engine was developed. The simulation was primarily analog in nature but used the digital portion of the hybrid computer to perform bivariate function generation associated with the performance of the engine's rotating components. FORTRAN listings and analog patching diagrams are provided. The hybrid simulation was controlled by a digital computer programmed to simulate the engine's standard hydromechanical control. Both steady-state and dynamic data obtained from the digitally controlled engine simulation are presented. Hybrid simulation data are compared with data obtained from a digital simulation provided by the engine manufacturer. The comparisons indicate that the real-time hybrid simulation adequately matches the baseline digital simulation.
Digital Documentation: Using Computers to Create Multimedia Reports.
ERIC Educational Resources Information Center
Speitel, Tom; And Others
1996-01-01
Describes methods for creating integrated multimedia documents using recent advances in print, audio, and video digitization that bring added usefulness to computers as data acquisition, processing, and presentation tools. Discusses advantages of digital documentation. (JRH)
A study of digital holographic filter generation
NASA Technical Reports Server (NTRS)
Calhoun, M.; Ingels, F.
1976-01-01
Problems associated with digital computer generation of holograms are discussed, along with criteria for producing optimum digital holograms. These criteria revolve around the amplitude resolution and spatial frequency limitations induced by the computer and plotter process.
Digital receiver study and implementation
NASA Technical Reports Server (NTRS)
Fogle, D. A.; Lee, G. M.; Massey, J. C.
1972-01-01
Computer software was developed which makes it possible to use any general purpose computer with A/D conversion capability as a PSK receiver for low data rate telemetry processing. Carrier tracking, bit synchronization, and matched filter detection are all performed digitally. To aid in the implementation of optimum computer processors, a study of general digital processing techniques was performed which emphasized various techniques for digitizing general analog systems. In particular, the phase-locked loop was extensively analyzed as a typical non-linear communication element. Bayesian estimation techniques for PSK demodulation were studied. A hardware implementation of the digital Costas loop was developed.
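One of the functions listed above, carrier tracking with a digital Costas loop, can be sketched compactly in software. The loop below tracks a BPSK carrier with one-pole arm filters and a proportional-plus-integral phase update; the sample rate, carrier frequency, filter coefficient, and loop gains are illustrative choices, not values from the cited study.

```python
# Minimal digital Costas loop for BPSK carrier tracking. Parameters are
# illustrative assumptions only.
import numpy as np

FS = 8000.0                 # sample rate (Hz), assumed
FC = 1000.0                 # nominal carrier (Hz), assumed
ALPHA = 0.05                # arm low-pass coefficient
KP, KI = 0.2, 0.01          # loop filter gains

def costas_loop(samples):
    phase, freq = 0.0, 2 * np.pi * FC / FS
    i_f = q_f = 0.0
    baseband = np.zeros(len(samples))
    for n, x in enumerate(samples):
        i_arm = x * np.cos(phase)
        q_arm = -x * np.sin(phase)
        i_f += ALPHA * (i_arm - i_f)        # one-pole low-pass, in-phase arm
        q_f += ALPHA * (q_arm - q_f)        # one-pole low-pass, quadrature arm
        err = i_f * q_f                     # Costas phase error for BPSK
        freq += KI * err                    # loop integrator
        phase += freq + KP * err            # NCO update
        baseband[n] = i_f
    return baseband

# test signal: BPSK with a small carrier frequency offset and phase offset
n = np.arange(16000)
bits = np.repeat(np.sign(np.random.randn(1000)), 16)
rx = bits * np.cos(2 * np.pi * (FC + 3.0) / FS * n + 0.7)
print(costas_loop(rx)[-8:])
```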
Automatic Mexican sign language and digits recognition using normalized central moments
NASA Astrophysics Data System (ADS)
Solís, Francisco; Martínez, David; Espinosa, Oscar; Toxqui, Carina
2016-09-01
This work presents a framework for automatic Mexican sign language and digit recognition based on a computer vision system using normalized central moments and artificial neural networks. Images are captured by a digital IP camera, with four LED reflectors and a green background in order to reduce computational costs and avoid the use of special gloves. 42 normalized central moments are computed per frame and used in a Multi-Layer Perceptron to recognize each database. Four versions per sign and digit were used in the training phase. Recognition rates of 93% and 95% were achieved for Mexican sign language and digits, respectively.
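Normalized central moments, the features named above, can be computed directly from their definition. The sketch below does so for a toy binary frame; the selection of the 42 moments actually used and the Multi-Layer Perceptron stage are not reproduced, and the test mask is made up.

```python
# Sketch of normalized central moments eta_pq computed from their definition.
# The classifier stage (MLP over 42 such moments) is omitted.
import numpy as np

def normalized_central_moments(image, orders):
    """Return eta_pq for each (p, q) in `orders`, with p + q >= 2."""
    img = image.astype(float)
    y, x = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    m00 = img.sum()
    xc, yc = (x * img).sum() / m00, (y * img).sum() / m00
    etas = []
    for p, q in orders:
        mu_pq = (((x - xc) ** p) * ((y - yc) ** q) * img).sum()  # central moment
        etas.append(mu_pq / m00 ** (1 + (p + q) / 2.0))          # scale-normalized
    return np.array(etas)

# toy binary "hand" mask
frame = np.zeros((120, 160))
frame[30:90, 50:100] = 1.0
print(normalized_central_moments(frame, [(2, 0), (0, 2), (1, 1), (3, 0), (0, 3)]))
```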
Conversion of cardiac performance data in analog form for digital computer entry
NASA Technical Reports Server (NTRS)
Miller, R. L.
1972-01-01
A system is presented which will reduce analog cardiac performance data and convert the results to digital form for direct entry into a commercial time-shared computer. Circuits are discussed which perform the measurement and digital conversion of instantaneous systolic and diastolic parameters from the analog blood pressure waveform. Digital averaging over a selected number of heart cycles is performed on these measurements, as well as those of flow and heart rate. The determination of average cardiac output and peripheral resistance, including trends, is the end result after processing by digital computer.
Non-parametric PCM to ADM conversion. [Pulse Code to Adaptive Delta Modulation
NASA Technical Reports Server (NTRS)
Locicero, J. L.; Schilling, D. L.
1977-01-01
An all-digital technique to convert pulse code modulated (PCM) signals into adaptive delta modulation (ADM) format is presented. The converter developed is shown to be independent of the statistical parameters of the encoded signal and can be constructed with only standard digital hardware. The structure of the converter is simple enough to be fabricated on a large scale integrated circuit where the advantages of reliability and cost can be optimized. A concise evaluation of this PCM to ADM translation technique is presented and several converters are simulated on a digital computer. A family of performance curves is given which displays the signal-to-noise ratio for sinusoidal test signals subjected to the conversion process, as a function of input signal power for several ratios of ADM rate to Nyquist rate.
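The cited converter is all-digital hardware and non-parametric; the Python sketch below only illustrates what an adaptive delta modulation stream is, using a generic textbook adaptation rule in which the step size grows when consecutive bits agree and shrinks when they alternate. The adaptation constants and test tone are arbitrary, and the paper's PCM-to-ADM translation logic is not reproduced.

```python
# Illustrative adaptive delta modulation (ADM) of a PCM sample stream:
# a 1-bit encoder with a simple step-size adaptation rule, plus the matching
# decoder. Generic textbook scheme, not the converter described above.
import numpy as np

def adm_encode(pcm, step0=0.05, grow=1.5, shrink=0.66):
    bits, estimate, step, prev_bit = [], 0.0, step0, 1
    for sample in pcm:
        bit = 1 if sample >= estimate else -1
        step = step * (grow if bit == prev_bit else shrink)   # adapt step size
        estimate += bit * step
        bits.append(bit)
        prev_bit = bit
    return np.array(bits)

def adm_decode(bits, step0=0.05, grow=1.5, shrink=0.66):
    out, estimate, step, prev_bit = [], 0.0, step0, 1
    for bit in bits:
        step = step * (grow if bit == prev_bit else shrink)   # mirror the encoder
        estimate += bit * step
        out.append(estimate)
        prev_bit = bit
    return np.array(out)

t = np.linspace(0, 1, 4000)
pcm = np.sin(2 * np.pi * 5 * t)                 # oversampled test tone
recon = adm_decode(adm_encode(pcm))
print("rms error:", np.sqrt(np.mean((recon - pcm) ** 2)))
```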
NASA Astrophysics Data System (ADS)
Naldi, G.; Bartolini, M.; Mattana, A.; Pupillo, G.; Hickish, J.; Foster, G.; Bianchi, G.; Lingua, A.; Monari, J.; Montebugnoli, S.; Perini, F.; Rusticelli, S.; Schiaffino, M.; Virone, G.; Zarb Adami, K.
In radio astronomy, Field Programmable Gate Array (FPGA) technology is largely used for the implementation of digital signal processing techniques applied to antenna arrays. This is mainly due to the good trade-off among computing resources, power consumption and cost offered by FPGA chips compared to other technologies such as ASIC, GPU and CPU. In recent years several digital backend systems based on such devices have been developed at the Medicina radio astronomical station (INAF-IRA, Bologna, Italy). Instruments like an FX correlator, a direct imager, a beamformer and a multi-beam system have been successfully designed and realized on CASPER (Collaboration for Astronomy Signal Processing and Electronics Research, https://casper.berkeley.edu) processing boards. In this paper we present the experience gained in this kind of application.
Large scale digital atlases in neuroscience
NASA Astrophysics Data System (ADS)
Hawrylycz, M.; Feng, D.; Lau, C.; Kuan, C.; Miller, J.; Dang, C.; Ng, L.
2014-03-01
Imaging in neuroscience has revolutionized our current understanding of brain structure, architecture and increasingly its function. Many characteristics of morphology, cell type, and neuronal circuitry have been elucidated through methods of neuroimaging. Combining this data in a meaningful, standardized, and accessible manner is the scope and goal of the digital brain atlas. Digital brain atlases are used today in neuroscience to characterize the spatial organization of neuronal structures, for planning and guidance during neurosurgery, and as a reference for interpreting other data modalities such as gene expression and connectivity data. The field of digital atlases is extensive and in addition to atlases of the human includes high quality brain atlases of the mouse, rat, rhesus macaque, and other model organisms. Using techniques based on histology, structural and functional magnetic resonance imaging as well as gene expression data, modern digital atlases use probabilistic and multimodal techniques, as well as sophisticated visualization software to form an integrated product. Toward this goal, brain atlases form a common coordinate framework for summarizing, accessing, and organizing this knowledge and will undoubtedly remain a key technology in neuroscience in the future. Since the development of its flagship project of a genome wide image-based atlas of the mouse brain, the Allen Institute for Brain Science has used imaging as a primary data modality for many of its large scale atlas projects. We present an overview of Allen Institute digital atlases in neuroscience, with a focus on the challenges and opportunities for image processing and computation.
NASA Technical Reports Server (NTRS)
Tranter, W. H.; Turner, M. D.
1977-01-01
Techniques are developed to estimate power gain, delay, signal-to-noise ratio, and mean square error in digital computer simulations of lowpass and bandpass systems. The techniques are applied to analog and digital communications. The signal-to-noise ratio estimates are shown to be maximum likelihood estimates in additive white Gaussian noise. The methods are seen to be especially useful for digital communication systems where the mapping from the signal-to-noise ratio to the error probability can be obtained. Simulation results show the techniques developed to be accurate and quite versatile in evaluating the performance of many systems through digital computer simulation.
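In the same spirit as the estimators described above, the sketch below measures delay, gain, and signal-to-noise ratio of a simulated channel by cross-correlating the output against the reference input and examining the least-squares residual. The test signals are synthetic, and the maximum-likelihood formulation of the cited work is not reproduced.

```python
# Sketch of post-simulation measurement: estimate delay from the peak of the
# input/output cross-correlation, gain by least squares on the aligned signals,
# and SNR from the residual power. Synthetic test data only.
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(5000)                    # reference input
true_delay, true_gain = 37, 0.8
y = true_gain * np.roll(x, true_delay)
y[:true_delay] = 0.0
y += 0.1 * rng.standard_normal(len(y))           # simulated noisy output

# delay estimate: lag of maximum cross-correlation
corr = np.correlate(y, x, mode="full")
delay = int(np.argmax(corr) - (len(x) - 1))

# gain estimate: least squares on the aligned signals
x_al, y_al = x[: len(x) - delay], y[delay:]
gain = float(np.dot(y_al, x_al) / np.dot(x_al, x_al))

# SNR estimate from the residual
residual = y_al - gain * x_al
snr_db = 10 * np.log10(np.mean((gain * x_al) ** 2) / np.mean(residual ** 2))
print(f"delay={delay}, gain={gain:.3f}, SNR={snr_db:.1f} dB")
```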
Digital video technology, today and tomorrow
NASA Astrophysics Data System (ADS)
Liberman, J.
1994-10-01
Digital video is probably computing's fastest moving technology today. Just three years ago, the zenith of digital video technology on the PC was the successful marriage of digital text and graphics with analog audio and video by means of expensive analog laser disc players and video overlay boards. The state of the art involves two different approaches to fully digital video on computers: hardware-assisted and software-only solutions.
NASA Technical Reports Server (NTRS)
Szuch, J. R.; Krosel, S. M.; Bruton, W. M.
1982-01-01
A systematic, computer-aided, self-documenting methodology for developing hybrid computer simulations of turbofan engines is presented. The methodology makes use of a host program that can run on a large digital computer and a machine-dependent target (hybrid) program. The host program performs all the calculations and data manipulations that are needed to transform user-supplied engine design information to a form suitable for the hybrid computer. The host program also trims the self-contained engine model to match specified design-point information. Part I contains a general discussion of the methodology, describes a test case, and presents comparisons between hybrid simulation and specified engine performance data. Part II, a companion document, contains documentation, in the form of computer printouts, for the test case.
Digital pathology in nephrology clinical trials, research, and pathology practice.
Barisoni, Laura; Hodgin, Jeffrey B
2017-11-01
In this review, we will discuss (i) how the recent advancements in digital technology and computational engineering are currently applied to nephropathology in the setting of clinical research, trials, and practice; (ii) the benefits of the new digital environment; (iii) how recognizing its challenges provides opportunities for transformation; and (iv) nephropathology in the upcoming era of kidney precision and predictive medicine. Recent studies highlighted how new standardized protocols facilitate the harmonization of digital pathology database infrastructure and morphologic, morphometric, and computer-aided quantitative analyses. Digital pathology enables robust protocols for clinical trials and research, with the potential to identify previously underused or unrecognized clinically useful parameters. The integration of digital pathology with molecular signatures is leading the way to establishing clinically relevant morpho-omic taxonomies of renal diseases. The introduction of digital pathology in clinical research and trials, and the progressive implementation of the modern software ecosystem, opens opportunities for the development of new predictive diagnostic paradigms and computer-aided algorithms, transforming the practice of renal disease into a modern computational science.
Utilization of KSC Present Broadband Communications Data System for Digital Video Services
NASA Technical Reports Server (NTRS)
Andrawis, Alfred S.
2002-01-01
This report covers a feasibility study of utilizing the present KSC broadband communications data system (BCDS) for digital video services. Digital video services include compressed digital TV delivery and video-on-demand. Furthermore, the study examines the possibility of providing interactive video on demand to desktop personal computers via the KSC computer network.
Digital Tools: Enhancing Painting Skills among Malaysian Secondary School Students
ERIC Educational Resources Information Center
Samah, Azimah A.; Putih, Abu Talib; Hussin, Zaharah
2016-01-01
Digital tools refer to software applications in the production of artworks particularly in painting. Digital art work is materialized by using computers, software and a combination of computer peripherals such as tablet support. With the aid of electronic equipment, digital artists manipulate pixels or coloring with light to compose the work and…
ERIC Educational Resources Information Center
van Langeveld, Mark Christensen
2009-01-01
Digital character production courses have traditionally been taught in art departments. The digital character production course at the University of Utah is centered, drawing uniformly from art and engineering disciplines. Its design has evolved to include a synergy of computer science, functional art and human anatomy. It gives students an…
Utilization of KSC Present Broadband Communications Data System For Digital Video Services
NASA Technical Reports Server (NTRS)
Andrawis, Alfred S.
2001-01-01
This report covers a feasibility study of utilizing the present KSC broadband communications data system (BCDS) for digital video services. Digital video services include compressed digital TV delivery and video-on-demand. Furthermore, the study examines the possibility of providing interactive video on demand to desktop personal computers via the KSC computer network.
Computational scalability of large size image dissemination
NASA Astrophysics Data System (ADS)
Kooper, Rob; Bajcsy, Peter
2011-01-01
We have investigated the computational scalability of image pyramid building needed for dissemination of very large image data. The sources of large images include high resolution microscopes and telescopes, remote sensing and airborne imaging, and high resolution scanners. The term 'large' is understood from a user perspective, meaning either larger than a display size or larger than the memory/disk available to hold the image data. The application drivers for our work are digitization projects such as the Lincoln Papers project (each image scan is about 100-150MB or about 5000x8000 pixels, with the total number of scans around 200,000) and the UIUC library scanning project for historical maps from the 17th and 18th centuries (a smaller number of larger images). The goal of our work is to understand the computational scalability of web-based dissemination using image pyramids for these large image scans, as well as the preservation aspects of the data. We report our computational benchmarks for (a) building image pyramids to be disseminated using the Microsoft Seadragon library, (b) a computation execution approach using hyper-threading to generate image pyramids and to utilize the underlying hardware, and (c) an image pyramid preservation approach using various hard drive configurations of Redundant Array of Independent Disks (RAID) drives for input/output operations. The benchmarks are obtained with a map (334.61 MB, JPEG format, 17591x15014 pixels). The discussion combines the speed and preservation objectives.
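A minimal sketch of the pyramid-building step benchmarked above, assuming Pillow, a 256-pixel tile size and JPEG output; the Deep Zoom/Seadragon file layout itself is not reproduced here.

```python
from PIL import Image

def build_pyramid(path, tile=256, out_prefix="level"):
    """Sketch of image-pyramid construction for web dissemination: halve the
    image repeatedly and cut each level into fixed-size tiles. Tile size,
    naming scheme and the use of Pillow are illustrative assumptions."""
    Image.MAX_IMAGE_PIXELS = None          # allow very large scans
    img = Image.open(path)
    level = 0
    while True:
        w, h = img.size
        for y in range(0, h, tile):
            for x in range(0, w, tile):
                img.crop((x, y, min(x + tile, w), min(y + tile, h))).save(
                    f"{out_prefix}{level}_{x // tile}_{y // tile}.jpg", quality=85)
        if max(w, h) <= tile:
            break
        img = img.resize((max(1, w // 2), max(1, h // 2)), Image.LANCZOS)
        level += 1
```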
Creating a standardized watersheds database for the lower Rio Grande/Rio Bravo, Texas
Brown, Julie R.; Ulery, Randy L.; Parcher, Jean W.
2000-01-01
This report describes the creation of a large-scale watershed database for the lower Rio Grande/Rio Bravo Basin in Texas. The watershed database includes watersheds delineated to all 1:24,000-scale mapped stream confluences and other hydrologically significant points, selected watershed characteristics, and hydrologic derivative datasets. Computer technology allows generation of preliminary watershed boundaries in a fraction of the time needed for manual methods. This automated process reduces development time and results in quality improvements in watershed boundaries and characteristics. These data can then be compiled in a permanent database, eliminating the time-consuming step of data creation at the beginning of a project and providing a stable base dataset that can give users greater confidence when further subdividing watersheds. A standardized dataset of watershed characteristics is a valuable contribution to the understanding and management of natural resources. Vertical integration of the input datasets used to automatically generate watershed boundaries is crucial to the success of such an effort. The optimum situation would be to use the digital orthophoto quadrangles as the source of all the input datasets. While the hydrographic data from the digital line graphs can be revised to match the digital orthophoto quadrangles, hypsography data cannot be revised to match the digital orthophoto quadrangles. Revised hydrography from the digital orthophoto quadrangle should be used to create an updated digital elevation model that incorporates the stream channels as revised from the digital orthophoto quadrangle. Computer-generated, standardized watersheds that are vertically integrated with existing digital line graph hydrographic data will continue to be difficult to create until revisions can be made to existing source datasets. Until such time, manual editing will be necessary to make adjustments for man-made features and changes in the natural landscape that are not reflected in the digital elevation model data.
Lewis hybrid computing system, users manual
NASA Technical Reports Server (NTRS)
Bruton, W. M.; Cwynar, D. S.
1979-01-01
The Lewis Research Center's Hybrid Simulation Lab contains a collection of analog, digital, and hybrid (combined analog and digital) computing equipment suitable for the dynamic simulation and analysis of complex systems. This report is intended as a guide to users of these computing systems. The report describes the available equipment and outlines procedures for its use. Particular attention is given to the operation of the PACER 100 digital processor. System software to accomplish the usual digital tasks, such as compiling and editing, and Lewis-developed special-purpose software are described.
Combining high performance simulation, data acquisition, and graphics display computers
NASA Technical Reports Server (NTRS)
Hickman, Robert J.
1989-01-01
Issues involved in the continuing development of an advanced simulation complex are discussed. This approach provides the capability to perform the majority of tests on advanced systems, non-destructively. The controlled test environments can be replicated to examine the response of the systems under test to alternative treatments of the system control design, or test the function and qualification of specific hardware. Field tests verify that the elements simulated in the laboratories are sufficient. The digital computer is hosted by a Digital Equipment Corp. MicroVAX computer with an Aptec Computer Systems Model 24 I/O computer performing the communication function. An Applied Dynamics International AD100 performs the high speed simulation computing and an Evans and Sutherland PS350 performs on-line graphics display. A Scientific Computer Systems SCS40 acts as a high performance FORTRAN program processor to support the complex, by generating numerous large files from programs coded in FORTRAN that are required for the real time processing. Four programming languages are involved in the process: FORTRAN, ADSIM, ADRIO, and STAPLE. FORTRAN is employed on the MicroVAX host to initialize and terminate the simulation runs on the system. The generation of the data files on the SCS40 also is performed with FORTRAN programs. ADSIM and ADRIO are used to program the processing elements of the AD100 and its IOCP processor. STAPLE is used to program the Aptec DIP and DIA processors.
High-resolution mapping of bifurcations in nonlinear biochemical circuits
NASA Astrophysics Data System (ADS)
Genot, A. J.; Baccouche, A.; Sieskind, R.; Aubert-Kato, N.; Bredeche, N.; Bartolo, J. F.; Taly, V.; Fujii, T.; Rondelez, Y.
2016-08-01
Analog molecular circuits can exploit the nonlinear nature of biochemical reaction networks to compute low-precision outputs with fewer resources than digital circuits. This analog computation is similar to that employed by gene-regulation networks. Although digital systems have a tractable link between structure and function, the nonlinear and continuous nature of analog circuits yields an intricate functional landscape, which makes their design counter-intuitive, their characterization laborious and their analysis delicate. Here, using droplet-based microfluidics, we map with high resolution and dimensionality the bifurcation diagrams of two synthetic, out-of-equilibrium and nonlinear programs: a bistable DNA switch and a predator-prey DNA oscillator. The diagrams delineate where function is optimal, dynamics bifurcates and models fail. Inverse problem solving on these large-scale data sets indicates interference from enzymatic coupling. Additionally, data mining exposes the presence of rare, stochastically bursting oscillators near deterministic bifurcations.
NASA Technical Reports Server (NTRS)
Masuoka, E.; Rose, J.; Quattromani, M.
1981-01-01
Recent developments related to microprocessor-based personal computers have made low-cost digital image processing systems a reality. Image analysis systems built around these microcomputers provide color image displays for images as large as 256 by 240 pixels in sixteen colors. Descriptive statistics can be computed for portions of an image, and supervised image classification can be obtained. The systems support Basic, Fortran, Pascal, and assembler language. A description is provided of a system which is representative of the new microprocessor-based image processing systems currently on the market. While small systems may never be truly independent of larger mainframes, because they lack 9-track tape drives, the independent processing power of the microcomputers will help alleviate some of the turn-around time problems associated with image analysis and display on the larger multiuser systems.
V/STOLAND digital avionics system for XV-15 tilt rotor
NASA Technical Reports Server (NTRS)
Liden, S.
1980-01-01
A digital flight control system for the tilt rotor research aircraft provides sophisticated navigation, guidance, control, display and data acquisition capabilities for performing terminal area navigation, guidance and control research. All functions of the XV-15 V/STOLAND system were demonstrated on the NASA-ARC S-19 simulation facility under a comprehensive dynamic acceptance test. The most noteworthy accomplishments of the system are: (1) automatic configuration control of a tilt-rotor aircraft over the total operating range; (2) total hands-off landing to touchdown on various selectable straight-in glide slopes and on a flight path that includes a two-revolution helix; (3) automatic guidance along a programmed three-dimensional reference flight path; (4) navigation data for the automatic guidance computed on board, based on VOR/DME, TACAN, or MLS navaid data; and (5) integration of a large set of functions in a single computer, utilizing 16k words of storage for programs and data.
High-performance computing in image registration
NASA Astrophysics Data System (ADS)
Zanin, Michele; Remondino, Fabio; Dalla Mura, Mauro
2012-10-01
Thanks to recent technological advances, a large variety of image data is at our disposal with variable geometric, radiometric and temporal resolution. In many applications the processing of such images needs high performance computing techniques in order to deliver timely responses, e.g. for rapid decisions or real-time actions. Thus, parallel or distributed computing methods, Digital Signal Processor (DSP) architectures, Graphical Processing Unit (GPU) programming and Field-Programmable Gate Array (FPGA) devices have become essential tools for the challenging issue of processing large amounts of geo-data. The article focuses on the processing and registration of large datasets of terrestrial and aerial images for 3D reconstruction, diagnostic purposes and monitoring of the environment. For the image alignment procedure, sets of corresponding feature points need to be automatically extracted in order to successively compute the geometric transformation that aligns the data. Feature extraction and matching are among the most computationally demanding operations in the processing chain; thus, a great degree of automation and speed is mandatory. The details of the implemented operations (named LARES) exploiting parallel architectures and the GPU are thus presented. The innovative aspects of the implementation are (i) the effectiveness on a large variety of unorganized and complex datasets, (ii) the capability to work with high-resolution images and (iii) the speed of the computations. Examples and comparisons with standard CPU processing are also reported and commented on.
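The LARES implementation itself is not reproduced here; the following OpenCV sketch only illustrates the generic pipeline the paragraph describes: extract corresponding feature points, match them, and estimate the aligning transform. The specific choices (ORB features, brute-force Hamming matching, RANSAC homography) are assumptions for illustration.

```python
import cv2
import numpy as np

def align_pair(path_ref, path_mov):
    """Generic feature-based alignment sketch: detect and describe features,
    match them, then robustly estimate the transform mapping the moving image
    onto the reference. Not the authors' LARES implementation."""
    ref = cv2.imread(path_ref, cv2.IMREAD_GRAYSCALE)
    mov = cv2.imread(path_mov, cv2.IMREAD_GRAYSCALE)
    orb = cv2.ORB_create(nfeatures=5000)
    k1, d1 = orb.detectAndCompute(ref, None)
    k2, d2 = orb.detectAndCompute(mov, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d1, d2), key=lambda m: m.distance)[:500]
    src = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H, int(inliers.sum())
```

The detector and matcher stages are the natural candidates for GPU acceleration, which is the computational bottleneck the article addresses.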
System design and implementation of digital-image processing using computational grids
NASA Astrophysics Data System (ADS)
Shen, Zhanfeng; Luo, Jiancheng; Zhou, Chenghu; Huang, Guangyu; Ma, Weifeng; Ming, Dongping
2005-06-01
As a special type of digital image, remotely sensed images are playing increasingly important roles in our daily lives. Because of the enormous amounts of data involved, and the difficulties of data processing and transfer, an important issue for current computer and geo-science experts is developing internet technology to implement rapid remotely sensed image processing. Computational grids are able to solve this problem effectively. These networks of computer workstations enable the sharing of data and resources, and are used by computer experts to solve imbalances of network resources and lopsided usage. In China, computational grids combined with spatial-information-processing technology have formed a new technology: namely, spatial-information grids. In the field of remotely sensed images, spatial-information grids work more effectively for network computing, data processing, resource sharing, task cooperation and so on. This paper focuses mainly on the application of computational grids to digital-image processing. Firstly, we describe the architecture of digital-image processing on the basis of computational grids; its implementation is then discussed in detail with respect to middleware technology. The whole network-based intelligent image-processing system is evaluated on the basis of the experimental analysis of remotely sensed image-processing tasks; the results confirm the feasibility of the application of computational grids to digital-image processing.
Classified one-step high-radix signed-digit arithmetic units
NASA Astrophysics Data System (ADS)
Cherri, Abdallah K.
1998-08-01
High-radix number systems enable higher information storage density, less complexity, fewer system components, and fewer cascaded gates and operations. A simple one-step fully parallel high-radix signed-digit arithmetic is proposed for parallel optical computing based on new joint spatial encodings. This reduces hardware requirements and improves throughput by reducing the space-bandwidth product needed. The high-radix signed-digit arithmetic operations are based on classifying the neighboring input digit pairs into various groups to reduce the computation rules. A new joint spatial encoding technique is developed to present both the operands and the computation rules. This technique increases the spatial bandwidth product of the spatial light modulators of the system. An optical implementation of the proposed high-radix signed-digit arithmetic operations is also presented. It is shown that our one-step trinary signed-digit and quaternary signed-digit arithmetic units are much simpler and better than all previously reported high-radix signed-digit techniques.
NASA Technical Reports Server (NTRS)
Ladson, C. L.; Brooks, Cuyler W., Jr.
1975-01-01
A computer program developed to calculate the ordinates and surface slopes of any thickness, symmetrical or cambered NACA airfoil of the 4-digit, 4-digit modified, 5-digit, and 16-series airfoil families is presented. The program produces plots of the airfoil nondimensional ordinates and a punch card output of ordinates in the input format of a readily available program for determining the pressure distributions of arbitrary airfoils in subsonic potential viscous flow.
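The NASA program itself is in FORTRAN and not reproduced here; the sketch below only evaluates the standard published NACA 4-digit thickness and camber equations (the simplest of the families the program handles), with cosine spacing of the chordwise stations.

```python
import numpy as np

def naca4(code="2412", n=101, closed_te=True):
    """Standard NACA 4-digit ordinate equations (not the NASA program itself):
    m = max camber, p = its chordwise position, t = thickness, as fractions of chord."""
    m, p, t = int(code[0]) / 100, int(code[1]) / 10, int(code[2:]) / 100
    x = 0.5 * (1 - np.cos(np.linspace(0, np.pi, n)))          # cosine spacing
    a4 = -0.1036 if closed_te else -0.1015
    yt = 5 * t * (0.2969 * np.sqrt(x) - 0.1260 * x - 0.3516 * x**2
                  + 0.2843 * x**3 + a4 * x**4)
    if p > 0:
        yc = np.where(x < p, m / p**2 * (2 * p * x - x**2),
                      m / (1 - p)**2 * ((1 - 2 * p) + 2 * p * x - x**2))
        dyc = np.where(x < p, 2 * m / p**2 * (p - x),
                       2 * m / (1 - p)**2 * (p - x))
    else:                                                      # symmetrical section
        yc, dyc = np.zeros_like(x), np.zeros_like(x)
    theta = np.arctan(dyc)
    xu, yu = x - yt * np.sin(theta), yc + yt * np.cos(theta)   # upper surface
    xl, yl = x + yt * np.sin(theta), yc - yt * np.cos(theta)   # lower surface
    return xu, yu, xl, yl

xu, yu, xl, yl = naca4("0012")   # symmetrical 12%-thick section
```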
Evaluation of Digital Technology and Software Use among Business Education Teachers
ERIC Educational Resources Information Center
Ellis, Richard S.; Okpala, Comfort O.
2004-01-01
Digital video cameras are part of the evolution of multimedia digital products that have positive applications for educators, students, and industry. Multimedia digital video can be utilized by any personal computer and it allows the user to control, combine, and manipulate different types of media, such as text, sound, video, computer graphics,…
Karavitis, G.A.
1984-01-01
The SIMSYS2D two-dimensional water-quality simulation system is a large-scale digital modeling software system used to simulate flow and transport of solutes in freshwater and estuarine environments. Due to the size, processing requirements, and complexity of the system, there is a need to easily move the system and its associated files between computer sites when required. A series of job control language (JCL) procedures was written to allow transferability between IBM and IBM-compatible computers. (USGS)
Distributed sensor networks: a cellular nonlinear network perspective.
Haenggi, Martin
2003-12-01
Large-scale networks of integrated wireless sensors are becoming increasingly tractable. Advances in hardware technology and engineering design have led to dramatic reductions in size, power consumption, and cost for digital circuitry and wireless communications. Networking, self-organization, and distributed operation are crucial ingredients to harness the sensing, computing, and communication capabilities of the nodes into a complete system. This article shows that those networks can be considered as cellular nonlinear networks (CNNs), and that their analysis and design may greatly benefit from the rich theoretical results available for CNNs.
Flight test validation of a design procedure for digital autopilots
NASA Technical Reports Server (NTRS)
Bryant, W. H.
1983-01-01
Commercially available general aviation autopilots are currently in transition from an analogue circuit system to a computer implemented digital flight control system. Well known advantages of the digital autopilot include enhanced modes, self-test capacity, fault detection, and greater computational capacity. A digital autopilot's computational capacity can be used to full advantage by increasing the sophistication of the digital autopilot's chief function, stability and control. NASA's Langley Research Center has been pursuing the development of direct digital design tools for aircraft stabilization systems for several years. This effort has most recently been directed towards the development and realization of multi-mode digital autopilots for GA aircraft, conducted under a SPIFR-related program called the General Aviation Terminal Operations Research (GATOR) Program. This presentation focuses on the implementation and testing of a candidate multi-mode autopilot designed using these newly developed tools.
Neuromorphic Hardware Architecture Using the Neural Engineering Framework for Pattern Recognition.
Wang, Runchun; Thakur, Chetan Singh; Cohen, Gregory; Hamilton, Tara Julia; Tapson, Jonathan; van Schaik, Andre
2017-06-01
We present a hardware architecture that uses the neural engineering framework (NEF) to implement large-scale neural networks on field programmable gate arrays (FPGAs) for performing massively parallel real-time pattern recognition. NEF is a framework that is capable of synthesising large-scale cognitive systems from subnetworks and we have previously presented an FPGA implementation of the NEF that successfully performs nonlinear mathematical computations. That work was developed based on a compact digital neural core, which consists of 64 neurons that are instantiated by a single physical neuron using a time-multiplexing approach. We have now scaled this approach up to build a pattern recognition system by combining identical neural cores together. As a proof of concept, we have developed a handwritten digit recognition system using the MNIST database and achieved a recognition rate of 96.55%. The system is implemented on a state-of-the-art FPGA and can process 5.12 million digits per second. The architecture and hardware optimisations presented offer high-speed and resource-efficient means for performing high-speed, neuromorphic, and massively parallel pattern recognition and classification tasks.
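As a software-level illustration of the NEF principle the architecture builds on (a population of heterogeneous tuning curves plus least-squares decoders), here is a small numpy sketch; the 64-neuron population echoes the neural core described above, but the rectified-linear tuning curves and all parameters are illustrative assumptions rather than the FPGA design.

```python
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_points = 64, 200                     # 64 neurons, as in one core above
x = np.linspace(-1, 1, n_points)                  # represented scalar value

# Randomized encoders, gains and biases give each neuron a distinct tuning curve
encoders = rng.choice([-1.0, 1.0], n_neurons)
gains = rng.uniform(0.5, 2.0, n_neurons)
biases = rng.uniform(-1.0, 1.0, n_neurons)
activity = np.maximum(0.0, gains[:, None] * encoders[:, None] * x[None, :] + biases[:, None])

# Least-squares decoders reconstruct a target function of x from the activities
target = x ** 2                                   # example: decode f(x) = x^2
decoders, *_ = np.linalg.lstsq(activity.T, target, rcond=None)
estimate = activity.T @ decoders
print(f"RMS decoding error: {np.sqrt(np.mean((estimate - target) ** 2)):.3f}")
```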
Performance of the all-digital data-transition tracking loop in the advanced receiver
NASA Astrophysics Data System (ADS)
Cheng, U.; Hinedi, S.
1989-11-01
The performance of the all-digital data-transition tracking loop (DTTL) with coherent or noncoherent sampling is described. The effects of few samples per symbol and of noncommensurate sampling rates and symbol rates are addressed and analyzed. Their impacts on the loop phase-error variance and the mean time to lose lock (MTLL) are quantified through computer simulations. The analysis and preliminary simulations indicate that with three to four samples per symbol, the DTTL can track with negligible jitter even in the presence of the Earth Doppler rate. Furthermore, the MTLL is also expected to be large enough to maintain lock over a Deep Space Network track.
Clinical evaluation of a 2K x 2K workstation for primary diagnosis in pediatric radiology
NASA Astrophysics Data System (ADS)
Razavi, Mahmood; Sayre, James W.; Simons, Margaret A.; Hamedaninia, Azar; Boechat, Maria I.; Hall, Theodore R.; Kangarloo, Hooshang; Taira, Ricky K.; Chuang, Keh-Shih; Kashifian, Payam
1991-07-01
Preliminary results of a large-scale ROC study evaluating the diagnostic performance of digital hardcopy film and 2K X 2K softcopy display for pediatric chest radiographs are presented. The pediatric disease categories studied were pneumothorax, linear atelectasis, air bronchograms, and interstitial disease. Digital images were obtained directly from a computed radiography system. Results from the readings of 239 chest radiographs by 4 radiologists show no significant difference between viewing images on film and softcopy display for the disease categories pneumothorax and air bronchograms. A slight performance edge for softcopy was seen for the disease categories of interstitial disease and linear atelectasis.
NASA Technical Reports Server (NTRS)
Kriegler, F. J.; Gordon, M. F.; Mclaughlin, R. H.; Marshall, R. E.
1975-01-01
The MIDAS (Multivariate Interactive Digital Analysis System) processor is a high-speed processor designed to process multispectral scanner data (from Landsat, EOS, aircraft, etc.) quickly and cost-effectively to meet the requirements of users of remote sensor data, especially from very large areas. MIDAS consists of a fast multipipeline preprocessor and classifier, an interactive color display and color printer, and a medium scale computer system for analysis and control. The system is designed to process data having as many as 16 spectral bands per picture element at rates of 200,000 picture elements per second into as many as 17 classes using a maximum likelihood decision rule.
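A compact sketch of a Gaussian maximum-likelihood decision rule of the kind mentioned above, assuming per-class mean vectors and covariance matrices estimated from training fields; this is a generic classifier, not the MIDAS pipeline.

```python
import numpy as np

def ml_classify(pixels, class_means, class_covs):
    """Maximum-likelihood decision rule for multispectral pixels: assign each
    pixel to the class with the highest Gaussian log-likelihood.
    pixels: (n_pixels, n_bands); class_means: (n_classes, n_bands);
    class_covs: (n_classes, n_bands, n_bands). Training statistics are assumed given."""
    scores = []
    for mean, cov in zip(class_means, class_covs):
        diff = pixels - mean
        inv = np.linalg.inv(cov)
        # Mahalanobis term plus log-determinant penalty (constants dropped)
        log_like = -0.5 * np.einsum("ij,jk,ik->i", diff, inv, diff) \
                   - 0.5 * np.log(np.linalg.det(cov))
        scores.append(log_like)
    return np.argmax(np.stack(scores, axis=1), axis=1)
```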
Nonlinear ordinary difference equations
NASA Technical Reports Server (NTRS)
Caughey, T. K.
1979-01-01
Future space vehicles will be relatively large and flexible, and active control will be necessary to maintain geometrical configuration. While the stresses and strains in these space vehicles are not expected to be excessively large, their cumulative effects will cause significant geometrical nonlinearities to appear in the equations of motion, in addition to the nonlinearities caused by material properties. Since the only effective tool for the analysis of such large complex structures is the digital computer, it will be necessary to gain a better understanding of the nonlinear ordinary difference equations which result from the time discretization of the semidiscrete equations of motion for such structures.
ERIC Educational Resources Information Center
Messina Dahlberg, Giulia; Bagga-Gupta, Sangeeta
2014-01-01
The use of digital tools like computers and tablets in institutional learning arenas give rise to forms of flexibility where time and space boundaries become diffuse. Online learning sites are understood as being crucial today, especially in large parts of the Global North, where anyone anywhere potentially can become a student and have access to…
Avionic Data Bus Integration Technology
1991-12-01
This report addresses the hardware-software interaction between a digital data bus and an avionic system, including Very Large Scale Integration (VLSI) ICs and multiversion programming. In 1984, the Sperry Corporation developed a fault-tolerant system which employed multiversion programming, voting, and monitoring for error detection. Multiversion (N-version) programming is the independent coding of a number, N, of redundant computer programs that perform the same function.
Systolic VLSI Reed-Solomon Decoder
NASA Technical Reports Server (NTRS)
Shao, H. M.; Truong, T. K.; Deutsch, L. J.; Yuen, J. H.
1986-01-01
Decoder for digital communications provides high-speed, pipelined Reed-Solomon (RS) error-correction decoding of data streams. Principal new feature of proposed decoder is modification of Euclid greatest-common-divisor algorithm to avoid need for time-consuming computations of inverse of certain Galois-field quantities. Decoder architecture suitable for implementation on very-large-scale integrated (VLSI) chips with negative-channel metal-oxide/silicon circuitry.
ERIC Educational Resources Information Center
Bodén, Linnea
2013-01-01
An increasing number of Swedish municipalities use digital software to manage the registration of students' school absences. The software is regarded as a problem-solving tool to make registration more efficient, but its effects on the educational setting have been largely neglected. Focusing on an event with two students from a class of…
A New Way of Using the Interactive Whiteboard in a High School Physics Classroom: A Case Study
ERIC Educational Resources Information Center
Gregorcic, Bor; Etkina, Eugenia; Planinsic, Gorazd
2018-01-01
In recent decades, the interactive whiteboard (IWB) has become a relatively common educational tool in Western schools. The IWB is essentially a large touch screen that enables the user to interact with digital content in ways that are not possible with an ordinary computer-projector-canvas setup. However, the unique possibilities of IWBs are…
Optical Fiber Transmission In A Picture Archiving And Communication System For Medical Applications
NASA Astrophysics Data System (ADS)
Aaron, Gilles; Bonnard, Rene
1984-03-01
In a hospital, the need for an electronic communication network is increasing along with the digitization of pictures. This local area network is intended to link picture sources such as digital radiography, computed tomography, nuclear magnetic resonance, ultrasound, etc., with an archiving system. Interactive displays can be used in examination rooms, physicians' offices and clinics. In such a system, three major requirements must be considered: bit-rate, cable length, and number of devices. The bit-rate is very important because a maximum response time of a few seconds must be guaranteed for pictures of several megabits. The distance between nodes may be a few kilometers in some large hospitals. The number of devices connected to the network is never greater than a few tens, because picture sources and computers represent important hardware, and simple displays can be concentrated. All these conditions are fulfilled by optical fiber transmission. Depending on the topology and the access protocol, two solutions are to be considered: an active ring, or an active or passive star. Finally, Thomson-CSF developments of optical transmission devices for large TV-distribution networks bring us technological support and a mass production which will cut down hardware costs.
Global detection of large lunar craters based on the CE-1 digital elevation model
NASA Astrophysics Data System (ADS)
Luo, Lei; Mu, Lingli; Wang, Xinyuan; Li, Chao; Ji, Wei; Zhao, Jinjin; Cai, Heng
2013-12-01
Craters, one of the most significant features of the lunar surface, have been widely researched because they offer us the relative age of the surface unit as well as crucial geological information. Research on crater detection algorithms (CDAs) for the Moon and other planetary bodies has concentrated on detecting them from imagery data, but the computational cost of detecting large craters using images makes these CDAs impractical. This paper presents a new approach to crater detection that utilizes a digital elevation model instead of images; this enables fully automatic global detection of large craters. Craters were delineated by terrain attributes, thresholded maps of terrain attributes were then used to transform the topographic data into a binary image, and finally craters were detected from the binary image using the Hough Transform. By using the proposed algorithm, we produced a catalog of all craters ⩾10 km in diameter on the lunar surface and analyzed their distribution and population characteristics.
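A hedged sketch of the overall recipe described (derive terrain attributes from the DEM, threshold them into a binary image, then detect circular rims with a Hough transform), using scikit-image; the slope attribute, edge detector and radius range are illustrative assumptions, not the authors' exact thresholds.

```python
import numpy as np
from skimage.feature import canny
from skimage.transform import hough_circle, hough_circle_peaks

def detect_craters(dem, min_r=20, max_r=120, n_craters=50):
    """DEM-based crater detection sketch: compute a slope attribute, turn it
    into a binary rim image, then search for circles with a Hough transform.
    Radii (in pixels) and thresholds are illustrative assumptions."""
    gy, gx = np.gradient(dem)
    slope = np.hypot(gx, gy)
    edges = canny(slope / slope.max())            # binary image of rim-like features
    radii = np.arange(min_r, max_r, 4)
    hough = hough_circle(edges, radii)
    _, cx, cy, r = hough_circle_peaks(hough, radii, total_num_peaks=n_craters)
    return np.column_stack([cx, cy, r])           # candidate crater centres and radii
```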
1991-03-31
AD-A232 768. Annual Report: Analysis of Polarizing Optical Systems for Digital Optical Computing with Symmetric Self Electrooptic Devices (grant AFOSR-89-0542; distribution unlimited). Abstract: Two architectural approaches have dominated the field of optical computing. The first approach uses…
ERIC Educational Resources Information Center
Ba, Harouna; Tally, Bill; Tsikalas, Kallen
The EDC (Educational Development Center) Center for Children and Technology (CCT) and Computers for Youth (CFY) completed a 1-year comparative study of children's use of computers in low- and middle-income homes. The study explores the digital divide as a literacy issue, rather than merely a technical one. Digital literacy is defined as a set of…
Digital data storage systems, computers, and data verification methods
Groeneveld, Bennett J.; Austad, Wayne E.; Walsh, Stuart C.; Herring, Catherine A.
2005-12-27
Digital data storage systems, computers, and data verification methods are provided. According to a first aspect of the invention, a computer includes an interface adapted to couple with a dynamic database; and processing circuitry configured to provide a first hash from digital data stored within a portion of the dynamic database at an initial moment in time, to provide a second hash from digital data stored within the portion of the dynamic database at a subsequent moment in time, and to compare the first hash and the second hash.
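The patented system is hardware-oriented, but the verification idea it describes can be sketched in a few lines: hash the same portion of a dynamic database at two moments in time and compare the digests. The serialization and the choice of SHA-256 below are assumptions for illustration.

```python
import hashlib
import json

def hash_portion(records):
    """Hash a portion of a dynamic database (here: a list of dict records) in a
    deterministic way; the JSON serialization is an illustrative assumption."""
    blob = json.dumps(records, sort_keys=True).encode("utf-8")
    return hashlib.sha256(blob).hexdigest()

# First hash at an initial moment, second hash later; equality implies the
# portion was not modified in between.
portion = [{"id": 1, "value": "alpha"}, {"id": 2, "value": "beta"}]
first = hash_portion(portion)
portion[1]["value"] = "gamma"          # simulated modification
second = hash_portion(portion)
print("unchanged" if first == second else "portion modified")
```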
Exploration for porphyry copper deposits in Pakistan using digital processing of Landsat-1 data
NASA Technical Reports Server (NTRS)
Schmidt, R. G.
1976-01-01
Rock-type classification by digital-computer processing of Landsat-1 multispectral scanner data has been used to select 23 prospecting targets in the Chagai District, Pakistan, five of which have proved to be large areas of hydrothermally altered porphyry containing pyrite. Empirical maximum and minimum apparent reflectance limits were selected for each multispectral scanner band in each rock type classified, and a relatively unrefined classification table was prepared. Where the values for all four bands fitted within the limits designated for a particular class, a symbol for the presumed rock type was printed by the computer at the appropriate location. Drainage channels, areas of mineralized quartz diorite, areas of pyrite-rich rock, and the approximate limit of propylitic alteration were very well delineated on the computer-generated map of the test area. The classification method was used to evaluate 2,100 sq km in the Mashki Chah region. The results of the experiment show that outcrops of hydrothermally altered and mineralized rock can be identified from Landsat-1 data under favorable conditions.
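The classification table described above amounts to a parallelepiped (box) rule: a pixel receives a rock-type symbol only when all four band values fall within the empirical minimum and maximum limits for that class. A minimal numpy sketch, with invented limits, follows.

```python
import numpy as np

def box_classify(bands, limits, unclassified=0):
    """Parallelepiped-style rule: `bands` is (rows, cols, n_bands); `limits`
    maps class id -> (min_per_band, max_per_band). A pixel gets a class symbol
    only when every band lies inside that class's empirical limits."""
    out = np.full(bands.shape[:2], unclassified, dtype=int)
    for cls, (lo, hi) in limits.items():
        inside = np.all((bands >= np.asarray(lo)) & (bands <= np.asarray(hi)), axis=-1)
        out[inside & (out == unclassified)] = cls
    return out

# Hypothetical reflectance limits for two classes over four MSS bands
limits = {1: ([10, 12, 20, 25], [18, 20, 30, 40]),    # altered rock (illustrative)
          2: ([30, 28, 15, 10], [45, 40, 25, 20])}    # drainage (illustrative)
```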
NASA Astrophysics Data System (ADS)
Cunningham, Sally Jo
The current crop of digital libraries for the computing community are strongly grounded in the conventional library paradigm: they provide indexes to support searching of collections of research papers. As such, these digital libraries are relatively impoverished; the present computing digital libraries omit many of the documents and resources that are currently available to computing researchers, and offer few browsing structures. These computing digital libraries were built 'top down': the resources and collection contents are forced to fit an existing digital library architecture. A 'bottom up' approach to digital library development would begin with an investigation of a community's information needs and available documents, and then design a library to organize those documents in such a way as to fulfill the community's needs. The 'home grown', informal information resources developed by and for the machine learning community are examined as a case study, to determine the types of information and document organizations 'native' to this group of researchers. The insights gained in this type of case study can be used to inform construction of a digital library tailored to this community.
A special planning technique for stream-aquifer systems
Jenkins, C.T.; Taylor, O. James
1974-01-01
The potential effects of water-management plans on stream-aquifer systems in several countries have been simulated using electric-analog or digital-computer models. Many of the electric-analog models require large amounts of hardware preparation for each problem to be solved and some become so bulky that they present serious space and access problems. Digital-computer models require no special hardware preparation, but often they require so many repetitive solutions of equations that the calculations become unwieldy and expensive, even on the latest generation of computers. Further, the more detailed digital models require a vast amount of core storage, leaving insufficient storage for evaluation of the many possible schemes of water-management. A concept introduced in 1968 by the senior author of this report offers a solution to these problems. The concept is that the effects on streamflow of ground-water withdrawal or recharge (stress) at any point in such a system can be approximated using two classical equations and a value of time that reflects the integrated effect of the following: irregular impermeable boundaries; stream meanders; aquifer properties and their areal variations; distance of the point from the stream; and imperfect hydraulic connection between the stream and the aquifer. The value of time is called the stream depletion factor (sdf). Results of a relatively few tests on detailed models can be summarized on maps showing lines through points of equal sdf. Sensitivity analyses of models of two large stream-aquifer systems in the State of Colorado show that the sdf technique described in this report provides results within tolerable ranges of error. The sdf technique is extremely versatile, allowing water managers to choose the degree of detail that best suits their needs and available computational hardware. Simple arithmetic, using, for example, only a slide rule and charts or tables of dimensionless values, will be sufficient for many calculations. If a large digital computer is available, detailed description of the system and its stresses will require only a fraction of the core storage, leaving the greater part of the storage available for sophisticated analyses, such as optimization. Once these analyses have been made, the model then is ready to perform its principal task--prediction of streamflow and changes in ground-water storage. In the two systems described in this report, direct diversion from the streams is the principal source of irrigation water, but it is supplemented by numerous wells. The streamflow depends largely on snowmelt. Estimates of both the amount and timing of runoff from snowmelt during the irrigation season are available on a monthly basis during the spring and early summer. These estimates become increasingly accurate as the season progresses; hence frequent changes of stress on the predictive model are necessary. The sdf technique is especially well suited to this purpose, because it is very easy to make such changes, resulting in more up-to-date estimates of the availability of streamflow and ground-water storage. These estimates can be made for any time and any location in the system.
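For orientation, one widely used classical form of the stream-depletion relation expresses the depletion rate as a fraction of the pumping rate through the complementary error function, with sdf standing in for a²S/T; the sketch below uses that Glover-type expression as an assumption and is not necessarily the exact pair of equations used in the report.

```python
from math import erfc, sqrt

def stream_depletion_fraction(sdf_days, t_days):
    """Rate of stream depletion as a fraction of the pumping rate after pumping
    continuously for t_days, using a classical Glover-type solution in which
    the stream depletion factor sdf (in days) replaces a^2*S/T. A sketch of the
    general idea, not the report's exact working equations."""
    return erfc(sqrt(sdf_days / (4.0 * t_days)))

# Example: a well with sdf = 30 days, pumped steadily for 100 days
print(f"{stream_depletion_fraction(30, 100):.2f} of the pumping rate comes from the stream")
```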
NASA Astrophysics Data System (ADS)
Cook, Perry R.
This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).
Arunyanak, Sirikarn P; Harris, Bryan T; Grant, Gerald T; Morton, Dean; Lin, Wei-Shao
2016-07-01
This report describes a digital approach for computer-guided surgery and immediate provisionalization in a partially edentulous patient. With diagnostic data obtained from cone-beam computed tomography and intraoral digital diagnostic scans, a digital pathway of virtual diagnostic waxing, a virtual prosthetically driven surgical plan, a computer-aided design and computer-aided manufacturing (CAD/CAM) surgical template, and implant-supported screw-retained interim restorations were realized with various open-architecture CAD/CAM systems. The optional CAD/CAM diagnostic casts with planned implant placement were also additively manufactured to facilitate preoperative inspection of the surgical template and customization of the CAD/CAM-fabricated interim restorations.
NASA Astrophysics Data System (ADS)
Sporea, R. A.; Trainor, M. J.; Young, N. D.; Shannon, J. M.; Silva, S. R. P.
2014-03-01
Ultra-large-scale integrated (ULSI) circuits have benefited from successive refinements in device architecture for enormous improvements in speed, power efficiency and areal density. In large-area electronics (LAE), however, the basic building-block, the thin-film field-effect transistor (TFT) has largely remained static. Now, a device concept with fundamentally different operation, the source-gated transistor (SGT) opens the possibility of unprecedented functionality in future low-cost LAE. With its simple structure and operational characteristics of low saturation voltage, stability under electrical stress and large intrinsic gain, the SGT is ideally suited for LAE analog applications. Here, we show using measurements on polysilicon devices that these characteristics lead to substantial improvements in gain, noise margin, power-delay product and overall circuit robustness in digital SGT-based designs. These findings have far-reaching consequences, as LAE will form the technological basis for a variety of future developments in the biomedical, civil engineering, remote sensing, artificial skin areas, as well as wearable and ubiquitous computing, or lightweight applications for space exploration.
Optical memories in digital computing
NASA Technical Reports Server (NTRS)
Alford, C. O.; Gaylord, T. K.
1979-01-01
High-capacity optical memories with relatively high data-transfer rates and multiport simultaneous-access capability may serve as the basis for new computer architectures. Several computer structures that might profitably use such memories are: a) a simultaneous record-access system, b) a simultaneously-shared memory computer system, and c) a parallel digital processing structure.
Two-dimensional radiant energy array computers and computing devices
NASA Technical Reports Server (NTRS)
Schaefer, D. H.; Strong, J. P., III (Inventor)
1976-01-01
Two dimensional digital computers and computer devices operate in parallel on rectangular arrays of digital radiant energy optical signal elements which are arranged in ordered rows and columns. Logic gate devices receive two input arrays and provide an output array having digital states dependent only on the digital states of the signal elements of the two input arrays at corresponding row and column positions. The logic devices include an array of photoconductors responsive to at least one of the input arrays for either selectively accelerating electrons to a phosphor output surface, applying potentials to an electroluminescent output layer, exciting an array of discrete radiant energy sources, or exciting a liquid crystal to influence crystal transparency or reflectivity.
Research on digital city geographic information common services platform
NASA Astrophysics Data System (ADS)
Chen, Dequan; Wu, Qunyong; Wang, Qinmin
2008-10-01
The traditional GIS (Geographic Information System) software development mode exposes many defects that largely slow down the progress of city informatization. There is an urgent need to build a common application infrastructure for informatization projects to speed up the development pace of the digital city. The advent of service-oriented architecture (SOA) has motivated the adoption of GIS functionality portals that can be executed in a distributed computing environment. Following the SOA principle, we propose and design a digital city geographic information common services platform which provides application development service interfaces for field users and can be further extended to relevant business applications. Finally, a public-oriented Web GIS is developed based on the platform to help public users query geographic information in their daily life. This indicates that our platform can be conveniently integrated into other applications.
Will the digital computer transform classical mathematics?
Rotman, Brian
2003-08-15
Mathematics and machines have influenced each other for millennia. The advent of the digital computer introduced a powerfully new element that promises to transform the relation between them. This paper outlines the thesis that the effect of the digital computer on mathematics, already widespread, is likely to be radical and far-reaching. To articulate this claim, an abstract model of doing mathematics is introduced based on a triad of actors of which one, the 'agent', corresponds to the function performed by the computer. The model is used to frame two sorts of transformation. The first is pragmatic and involves the alterations and progressive colonization of the content and methods of enquiry of various mathematical fields brought about by digital methods. The second is conceptual and concerns a fundamental antagonism between the infinity enshrined in classical mathematics and physics (continuity, real numbers, asymptotic definitions) and the inherently real and material limit of processes associated with digital computation. An example which lies in the intersection of classical mathematics and computer science, the P=NP problem, is analysed in the light of this latter issue.
[Overall digitalization: leading innovation of endodontics in big data era].
Ling, J Q
2016-04-09
In the big data era, digital technologies bring great challenges and opportunities to modern stomatology. The applications of digital technologies, such as cone-beam CT (CBCT), computer-aided design (CAD) and computer-aided manufacture (CAM), 3D printing, and digital approaches to education, provide new concepts and patterns for the treatment and study of endodontic diseases. This review provides an overview of the application and prospects of commonly used digital technologies in the development of endodontics.
Comparison of digital controllers used in magnetic suspension and balance systems
NASA Technical Reports Server (NTRS)
Kilgore, William A.
1990-01-01
Dynamic systems that were once controlled by analog circuits are now controlled by digital computers. Presented is a comparison of the digital controllers presently used with magnetic suspension and balance systems. The overall responses of the systems are compared using a computer simulation of the magnetic suspension and balance system and the digital controllers. The comparisons include responses to both simulated force and position inputs. A preferred digital controller is determined from the simulated responses.
Digital Workflow for Computer-Guided Implant Surgery in Edentulous Patients: A Case Report.
Oh, Ji-Hyeon; An, Xueyin; Jeong, Seung-Mi; Choi, Byung-Ho
2017-12-01
The purpose of this article was to describe a fully digital workflow used to perform computer-guided flapless implant placement in an edentulous patient without the use of conventional impressions, models, or a radiographic guide. Digital data for the workflow were acquired using an intraoral scanner and cone-beam computed tomography (CBCT). The image fusion of the intraoral scan data and CBCT data was performed by matching resin markers placed in the patient's mouth. The definitive digital data were used to design a prosthetically driven implant position, surgical template, and computer-aided design and computer-aided manufacturing fabricated fixed dental prosthesis. The authors believe this is the first published case describing such a technique in computer-guided flapless implant surgery for edentulous patients.
On the Rapid Computation of Various Polylogarithmic Constants
NASA Technical Reports Server (NTRS)
Bailey, David H.; Borwein, Peter; Plouffe, Simon
1996-01-01
We give algorithms for the computation of the d-th digit of certain transcendental numbers in various bases. These algorithms can be easily implemented (multiple precision arithmetic is not needed), require virtually no memory, and feature run times that scale nearly linearly with the order of the digit desired. They make it feasible to compute, for example, the billionth binary digit of log(2) or pi on a modest workstation in a few hours run time. We demonstrate this technique by computing the ten billionth hexadecimal digit of pi, the billionth hexadecimal digits of pi-squared, log(2) and log-squared(2), and the ten billionth decimal digit of log(9/10). These calculations rest on the observation that very special types of identities exist for certain numbers like pi, pi-squared, log(2) and log-squared(2). These are essentially polylogarithmic ladders in an integer base. A number of these identities that we derive in this work appear to be new, for example a critical identity for pi.
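The hexadecimal-digit algorithm for pi rests on the Bailey-Borwein-Plouffe identity pi = sum_k 16^(-k) [4/(8k+1) - 2/(8k+4) - 1/(8k+5) - 1/(8k+6)]; multiplying by 16^d and keeping only fractional parts (via modular exponentiation) yields digits beginning at position d+1 with ordinary floating-point arithmetic. A short Python sketch of this standard construction:

```python
def pi_hex_digits(d, n=8):
    """Return n hexadecimal digits of pi starting at position d+1 after the
    point, using the BBP formula; only modest-precision floats are needed."""
    def series(j):
        # fractional part of sum_k 16^(d-k) / (8k + j)
        s = 0.0
        for k in range(d + 1):
            s = (s + pow(16, d - k, 8 * k + j) / (8 * k + j)) % 1.0
        t, k = 0.0, d + 1
        while True:
            term = 16.0 ** (d - k) / (8 * k + j)
            if term < 1e-17:
                break
            t += term
            k += 1
        return (s + t) % 1.0

    frac = (4 * series(1) - 2 * series(4) - series(5) - series(6)) % 1.0
    digits = ""
    for _ in range(n):
        frac *= 16
        digits += "0123456789ABCDEF"[int(frac)]
        frac %= 1
    return digits

print(pi_hex_digits(0))   # first fractional hex digits of pi: 243F6A88...
```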
Matrix-vector multiplication using digital partitioning for more accurate optical computing
NASA Technical Reports Server (NTRS)
Gary, C. K.
1992-01-01
Digital partitioning offers a flexible means of increasing the accuracy of an optical matrix-vector processor. This algorithm can be implemented with the same architecture required for a purely analog processor, which gives optical matrix-vector processors the ability to perform high-accuracy calculations at speeds comparable with or greater than electronic computers as well as the ability to perform analog operations at a much greater speed. Digital partitioning is compared with digital multiplication by analog convolution, residue number systems, and redundant number representation in terms of the size and the speed required for an equivalent throughput as well as in terms of the hardware requirements. Digital partitioning and digital multiplication by analog convolution are found to be the most efficient algorithms if coding time and hardware are considered, and the architecture for digital partitioning permits the use of analog computations to provide the greatest throughput for a single processor.
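To make the partitioning idea concrete, the sketch below emulates it numerically: each matrix and vector entry is split into low-radix digits, the digit planes are multiplied as a limited-precision analog processor would, and the partial products are recombined digitally with the appropriate radix weights. The base, digit count and integer-only data are illustrative assumptions.

```python
import numpy as np

def partition_digits(x, base, n_digits):
    """Split arrays of non-negative integers into base-`base` digits (LSD first)."""
    digits, rem = [], x.copy()
    for _ in range(n_digits):
        digits.append(rem % base)
        rem //= base
    return digits

def partitioned_matvec(A, v, base=4, n_digits=4):
    """Emulate digital partitioning: low-precision products of digit planes
    (the analog stage) are recombined digitally with radix weights."""
    A_digits = partition_digits(A, base, n_digits)
    v_digits = partition_digits(v, base, n_digits)
    result = np.zeros(A.shape[0], dtype=np.int64)
    for i, Ai in enumerate(A_digits):
        for j, vj in enumerate(v_digits):
            result += (Ai @ vj) * base ** (i + j)
    return result

rng = np.random.default_rng(2)
A = rng.integers(0, 256, (5, 5))     # values representable with 4 base-4 digits
v = rng.integers(0, 256, 5)
assert np.array_equal(partitioned_matvec(A, v), A @ v)
```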
Digitized adiabatic quantum computing with a superconducting circuit.
Barends, R; Shabani, A; Lamata, L; Kelly, J; Mezzacapo, A; Las Heras, U; Babbush, R; Fowler, A G; Campbell, B; Chen, Yu; Chen, Z; Chiaro, B; Dunsworth, A; Jeffrey, E; Lucero, E; Megrant, A; Mutus, J Y; Neeley, M; Neill, C; O'Malley, P J J; Quintana, C; Roushan, P; Sank, D; Vainsencher, A; Wenner, J; White, T C; Solano, E; Neven, H; Martinis, John M
2016-06-09
Quantum mechanics can help to solve complex problems in physics and chemistry, provided they can be programmed in a physical device. In adiabatic quantum computing, a system is slowly evolved from the ground state of a simple initial Hamiltonian to a final Hamiltonian that encodes a computational problem. The appeal of this approach lies in the combination of simplicity and generality; in principle, any problem can be encoded. In practice, applications are restricted by limited connectivity, available interactions and noise. A complementary approach is digital quantum computing, which enables the construction of arbitrary interactions and is compatible with error correction, but uses quantum circuit algorithms that are problem-specific. Here we combine the advantages of both approaches by implementing digitized adiabatic quantum computing in a superconducting system. We tomographically probe the system during the digitized evolution and explore the scaling of errors with system size. We then let the full system find the solution to random instances of the one-dimensional Ising problem as well as problem Hamiltonians that involve more complex interactions. This digital quantum simulation of the adiabatic algorithm consists of up to nine qubits and up to 1,000 quantum logic gates. The demonstration of digitized adiabatic quantum computing in the solid state opens a path to synthesizing long-range correlations and solving complex computational problems. When combined with fault-tolerance, our approach becomes a general-purpose algorithm that is scalable.
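A toy numerical sketch of the digitized-adiabatic idea on a classical computer: the interpolated Hamiltonian is applied as a sequence of short Trotterized steps (each exponential standing in for a block of quantum logic gates), sweeping from a transverse-field start to a 1D Ising problem. The qubit count, step count and couplings are illustrative assumptions; this is a simulation, not the superconducting experiment.

```python
import numpy as np
from functools import reduce
from scipy.linalg import expm

I2, X, Z = np.eye(2), np.array([[0.0, 1.0], [1.0, 0.0]]), np.diag([1.0, -1.0])

def op_on(op, site, n):
    """Embed a single-qubit operator at `site` in an n-qubit tensor product."""
    return reduce(np.kron, [op if k == site else I2 for k in range(n)])

n = 4                                                    # qubits (illustrative)
H0 = -sum(op_on(X, k, n) for k in range(n))              # simple initial Hamiltonian
H1 = -sum(op_on(Z, k, n) @ op_on(Z, k + 1, n) for k in range(n - 1))  # 1D Ising

steps, T = 100, 10.0                                     # digitized steps, anneal time
dt = T / steps
psi = np.ones(2 ** n, dtype=complex) / np.sqrt(2 ** n)   # ground state of H0

for m in range(steps):
    s = (m + 1) / steps
    psi = expm(-1j * dt * (1 - s) * H0) @ psi            # each factor ~ a gate block
    psi = expm(-1j * dt * s * H1) @ psi

p_ground = abs(psi[0]) ** 2 + abs(psi[-1]) ** 2          # |00..0> and |11..1>
print(f"population in the Ising ground space: {p_ground:.3f}")
```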
A Digital Motion Control System for Large Telescopes
NASA Astrophysics Data System (ADS)
Hunter, T. R.; Wilson, R. W.; Kimberk, R.; Leiker, P. S.
2001-05-01
We have designed and programmed a digital motion control system for large telescopes, in particular, the 6-meter antennas of the Submillimeter Array on Mauna Kea. The system consists of a single robust, high-reliability microcontroller board which implements a two-axis velocity servo while monitoring and responding to critical safety parameters. Excellent tracking performance has been achieved with this system (0.3 arcsecond RMS at sidereal rate). The 24x24 centimeter four-layer printed circuit board contains a multitude of hardware devices: 40 digital inputs (for limit switches and fault indicators), 32 digital outputs (to enable/disable motor amplifiers and brakes), a quad 22-bit ADC (to read the motor tachometers), four 16-bit DACs (that provide torque signals to the motor amplifiers), a 32-LED status panel, a serial port to the LynxOS PowerPC antenna computer (RS422/460kbps), a serial port to the Palm Vx handpaddle (RS232/115kbps), and serial links to the low-resolution absolute encoders on the azimuth and elevation axes. Each section of the board employs independent ground planes and power supplies, with optical isolation on all I/O channels. The processor is an Intel 80C196KC 16-bit microcontroller running at 20MHz on an 8-bit bus. This processor executes an interrupt-driven, scheduler-based software system written in C and assembled into an EPROM with user-accessible variables stored in NVSRAM. Under normal operation, velocity update requests arrive at 100Hz from the position-loop servo process running independently on the antenna computer. A variety of telescope safety checks are performed at 279Hz including routine servicing of a 6 millisecond watchdog timer. Additional ADCs onboard the microcontroller monitor the winding temperature and current in the brushless three-phase drive motors. The PID servo gains can be dynamically changed in software. Calibration factors and software filters can be applied to the tachometer readings prior to the application of the servo gains in the torque computations. The Palm pilot handpaddle displays the complete status of the telescope and allows full local control of the drives in an intuitive, touchscreen user interface which is especially useful during reconfigurations of the antenna array.
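As a rough sketch of one 100 Hz update of the kind of velocity loop described above (the gains, rates and units are illustrative only; the actual controller also applies tachometer calibration, software filtering and the safety checks mentioned in the abstract):

    def velocity_pid_step(v_cmd, v_meas, state, kp=2.0, ki=0.5, kd=0.05, dt=0.01):
        # One servo update: velocity error in, torque command out.
        err = v_cmd - v_meas
        state["integral"] += err * dt
        derivative = (err - state["prev_err"]) / dt
        state["prev_err"] = err
        return kp * err + ki * state["integral"] + kd * derivative

    state = {"integral": 0.0, "prev_err": 0.0}
    torque = velocity_pid_step(v_cmd=0.25, v_meas=0.20, state=state)   # commanded vs measured rate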
Programming Wireless Handheld Devices for Applications in Teaching Astronomy
NASA Astrophysics Data System (ADS)
Budiardja, R.; Saranathan, V.; Guidry, M.
2002-12-01
Wireless technology implemented with handheld devices has attractive features because of the potential to access large amounts of data and the prospect of on-the-fly computational analysis from a device that can be carried in a shirt pocket. We shall describe applications of such technology to the general paradigm of making digital wireless connections from the field to upload information and queries to network servers, executing (potentially complex) data analysis and/or database operations on fast network computers, and returning real-time information from this analysis to the handheld device in the field. As illustration, we shall describe several client/server programs that we have written for applications in teaching introductory astronomy. For example, one program allows static and dynamic properties of astronomical objects to be accessed in a remote observation laboratory setting using a digital cell phone or PDA. Another implements interactive quizzing over a cell phone or PDA using a 700-question introductory astronomy quiz database, thus permitting students to study for astronomy quizzes in any environment in which they have a few free minutes and a digital cell phone or wireless PDA. The presentation will include hands-on demonstrations with real devices.
Use of digital technologies for nasal prosthesis manufacturing.
Palousek, David; Rosicky, Jiri; Koutny, Daniel
2014-04-01
Digital technologies are becoming more accessible for common use in medical applications; however, their uptake in prosthetic and orthotic laboratories is still limited because of the persistent perception that they are difficult to apply to real patients. This article aims to offer a real example in the area of human facial prostheses. It describes the use of optical digitization, computational modelling, rapid prototyping, mould fabrication and manufacturing of a nasal silicone prosthesis. This technical note defines the key points of the methodology and aspires to contribute to the introduction of a certified manufacturing procedure. The results show that the technologies used reduce the manufacturing time, reflect the patient's requirements and allow the manufacture of high-quality prostheses for missing asymmetric facial parts. The methodology provides a good basis for further development and is usable in clinical practice. Clinical relevance: the use of digital technologies in the facial prosthesis manufacturing process can improve patient comfort and production efficiency, but it requires a higher initial investment and experience with software tools.
Discrete-time stability of continuous-time controller designs for large space structures
NASA Technical Reports Server (NTRS)
Balas, M. J.
1982-01-01
In most stable control designs for flexible structures, continuous time is assumed. However, because the controllers are implemented on on-line digital computers, the discrete-time stability of such controllers is an important consideration. In the case of direct velocity feedback (DVFB), which involves negative feedback from collocated force actuators and velocity sensors, it is not immediately apparent how much delay due to digital implementation of DVFB can be tolerated without loss of stability. The present investigation is concerned with such questions. The discrete-time stability of DVFB is studied, taking into account the use of Euler's method to approximate the time derivative. The resulting bound indicates the acceptable time-step size for stable digital implementation of DVFB. A further result on the discrete-time stability of stable continuous-time systems provides a general condition under which a digital implementation of such a system remains stable.
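The flavour of the result can be reproduced with a one-mode sketch (the mode frequency, feedback gain and step sizes below are illustrative and not taken from the paper): Euler discretization of a continuously stable velocity-feedback loop remains stable only while the time step is small enough.

    import numpy as np

    def spectral_radius_euler(omega=2.0, gain=0.5, dt=0.01):
        # Closed-loop single mode: x'' + gain*x' + omega^2 * x = 0, with state [x, x'].
        A = np.array([[0.0, 1.0], [-omega**2, -gain]])
        Ad = np.eye(2) + dt * A               # Euler approximation of the time derivative
        return max(abs(np.linalg.eigvals(Ad)))

    for dt in (0.01, 0.1, 0.5, 1.5):
        print(dt, spectral_radius_euler(dt=dt))   # a radius above 1 signals instability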
ERIC Educational Resources Information Center
Morris, Jonathan Padraig
2011-01-01
Attempts to bridge the Digital Divide have seen vast investment in Information Communication Technology in schools. In the United Kingdom, the Computers for Pupils initiative has invested 60 million British Pounds of funds to help some of the most disadvantaged secondary school pupils by putting a computer in their home. This paper charts and…
1981-11-30
COMPUTER PROGRAM USER'S MANUAL FOR FIREFINDER DIGITAL TOPOGRAPHIC DATA VERIFICATION LIBRARY DUBBING SYSTEM, 30 November 1981, by Marie Ceres, Leslie R. ... This manual describes the computer programs for the FIREFINDER Digital Topographic Data Verification-Library-Dubbing System (FFDTDVLDS).
A Scoping Review of Digital Gaming Research Involving Older Adults Aged 85 and Older.
Marston, Hannah R; Freeman, Shannon; Bishop, Kristen A; Beech, Christian L
2016-06-01
Interest in the use of digital game technologies by older adults is growing across disciplines from health and gerontology to computer science and game studies. The objective of this scoping review was to examine research evidence involving the oldest old (persons 85 years of age or greater) and digital game technology. PubMed, CINHAL, and Scopus were searched, and 46 articles were included in this review. Results highlighted that 60 percent of articles were published in gerontological journals, whereas only 8.7 percent were published in computer science journals. No studies focused directly on the oldest old population. Few studies included sample sizes greater than 100 participants. Seven primary and 34 secondary themes were identified, of which Hardware Technology and Assessment were the most common. Existing evidence demonstrates the paucity of studies engaging older adults 85 years of age and above regarding the use of digital gaming and highlights a new understudied cohort for further research focus. Recommendations for future research include intentional recruitment and proportionate representation of participants ≥85 years of age, large sample sizes, and explicit mention of specific numbers of participants ≥85 years of age, which are necessary to advance knowledge in this area. Integrating a rigorous and robust mixed-methods approach including theoretical perspectives would lend itself to further in-depth understanding and knowledge generation in this field.
Computer Instructional Aids for Undergraduate Control Education. 1978 Edition.
ERIC Educational Resources Information Center
Volz, Richard A.; And Others
This work represents the development of computer tools for undergraduate students. Emphasis is on automatic control theory using hybrid and digital computation. The routine calculations of control system analysis are presented as students would use them on the University of Michigan's central digital computer and the time-shared graphic terminals…
Test and control computer user's guide for a digital beam former test system
NASA Technical Reports Server (NTRS)
Alexovich, Robert E.; Mallasch, Paul G.
1992-01-01
A Digital Beam Former Test System was developed to determine the effects of noise, interferers and distortions, and digital implementations of beam forming as applied to the Tracking and Data Relay Satellite 2 (TDRS 2) architectures. The investigation of digital beam forming with application to TDRS 2 architectures, as described in TDRS 2 advanced concept design studies, was conducted by the NASA/Lewis Research Center for NASA/Goddard Space Flight Center. A Test and Control Computer (TCC) was used as the main controlling element of the Digital Beam Former Test System. The Test and Control Computer User's Guide for a Digital Beam Former Test System provides an organized description of the Digital Beam Former Test System commands. It is written for users who wish to conduct tests of the Digital Beam Forming Test Processor using the TCC. The document describes the function, use, and syntax of the TCC commands available to the user while summarizing and demonstrating the use of the commands within DOS batch files.
An all digital phase locked loop for FM demodulation.
NASA Technical Reports Server (NTRS)
Greco, J.; Garodnick, J.; Schilling, D. L.
1972-01-01
A phase-locked loop designed with all-digital circuitry, which avoids certain problems, and a digital voltage-controlled oscillator algorithm are described. The system operates synchronously and performs all required digital calculations within one sampling period, thereby acting as a real-time special-purpose computer. The SNR is computed for frequency offsets and sinusoidal modulation, and experimental results verify the theoretical calculations.
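A schematic, software-only sketch of such a loop (a numerically controlled oscillator, a multiplier phase detector and a proportional-plus-integral loop filter; the structure is generic and the gains are illustrative, not those of the paper):

    import numpy as np

    def digital_pll_fm_demod(x, fs, fc, kp=0.2, ki=0.05):
        # x: real-valued FM signal samples; the loop-filter integrator tracks the
        # instantaneous frequency offset and so recovers the message.
        phase, integ = 0.0, 0.0
        out = np.empty(len(x))
        for n, s in enumerate(x):
            err = s * -np.sin(phase)               # multiplier phase detector
            integ += ki * err                      # integral branch of the loop filter
            phase += 2 * np.pi * fc / fs + kp * err + integ
            out[n] = integ
        return out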
28 CFR 75.6 - Statement describing location of books and records.
Code of Federal Regulations, 2011 CFR
2011-07-01
...- or computer-manipulated image, digital image, or picture, or other matter (including but not limited... the book, magazine, periodical, film, videotape, digitally- or computer-manipulated image, digital image, picture, or other matter to affix the statement. In this paragraph, the term “copy” includes...
Molecular computational elements encode large populations of small objects
NASA Astrophysics Data System (ADS)
Prasanna de Silva, A.; James, Mark R.; McKinney, Bernadine O. F.; Pears, David A.; Weir, Sheenagh M.
2006-10-01
Since the introduction of molecular computation, experimental molecular computational elements have grown to encompass small-scale integration, arithmetic and games, among others. However, the need for a practical application has been pressing. Here we present molecular computational identification (MCID), a demonstration that molecular logic and computation can be applied to a widely relevant issue. Examples of populations that need encoding in the microscopic world are cells in diagnostics or beads in combinatorial chemistry (tags). Taking advantage of the small size (about 1 nm) and large 'on/off' output ratios of molecular logic gates and using the great variety of logic types, input chemical combinations, switching thresholds and even gate arrays in addition to colours, we produce unique identifiers for members of populations of small polymer beads (about 100 μm) used for synthesis of combinatorial libraries. Many millions of distinguishable tags become available. This method should be extensible to far smaller objects, with the only requirement being a 'wash and watch' protocol. Our focus on converting molecular science into technology, which has so far concerned analog sensors, turns to digital logic devices in the present work.
ERIC Educational Resources Information Center
Lanham, Richard A.
1989-01-01
Traces the early history of the electronic digital computer and the viewpoints held concerning the computer from its inception to its present status. Highlights three key words ("mimesis," "topic," and "decorum") to develop the rhetoricality of the personal computer as a communications device. (KEH)
Janowczyk, Andrew; Doyle, Scott; Gilmore, Hannah; Madabhushi, Anant
2018-01-01
Deep learning (DL) has recently been successfully applied to a number of image analysis problems. However, DL approaches tend to be inefficient for segmentation on large image data, such as high-resolution digital pathology slide images. For example, typical breast biopsy images scanned at 40× magnification contain billions of pixels, of which usually only a small percentage belong to the class of interest. For a typical naïve deep learning scheme, parsing through and interrogating all the image pixels would represent hundreds if not thousands of hours of compute time using high performance computing environments. In this paper, we present a resolution adaptive deep hierarchical (RADHicaL) learning scheme wherein DL networks at lower resolutions are leveraged to determine if higher levels of magnification, and thus computation, are necessary to provide precise results. We evaluate our approach on a nuclear segmentation task with a cohort of 141 ER+ breast cancer images and show we can reduce computation time on average by about 85%. Expert annotations of 12,000 nuclei across these 141 images were employed for quantitative evaluation of RADHicaL. A head-to-head comparison with a naïve DL approach, operating solely at the highest magnification, yielded the following performance metrics: .9407 vs .9854 detection rate, .8218 vs .8489 F-score, .8061 vs .8364 true positive rate and .8822 vs .8932 positive predictive value. Our performance indices compare favourably with state of the art nuclear segmentation approaches for digital pathology images.
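The control flow of such a resolution-adaptive scheme can be sketched as follows (the downsampling, the stand-in models and the confidence threshold are placeholders for illustration and are not the RADHicaL implementation):

    import numpy as np

    def downsample(img, factor):
        # Block-averaging stand-in for reading a lower-magnification level.
        h, w = img.shape[0] // factor * factor, img.shape[1] // factor * factor
        return img[:h, :w].reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

    def adaptive_segment(tile, models, factors=(8, 4, 1), threshold=0.9):
        # Run cheap low-resolution models first; escalate to full resolution only
        # when the prediction is not confidently foreground or background.
        for factor, model in zip(factors, models):
            prob = model(downsample(tile, factor))
            if np.all((prob > threshold) | (prob < 1 - threshold)):
                return prob > 0.5
        return prob > 0.5

    # Toy usage: the "models" are stand-ins that return one probability per pixel.
    toy_models = [lambda img: 1 / (1 + np.exp(-(img - img.mean())))] * 3
    mask = adaptive_segment(np.random.rand(64, 64), toy_models)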
Multi-source Geospatial Data Analysis with Google Earth Engine
NASA Astrophysics Data System (ADS)
Erickson, T.
2014-12-01
The Google Earth Engine platform is a cloud computing environment for data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog is a multi-petabyte archive of georeferenced datasets that include images from Earth observing satellite and airborne sensors (examples: USGS Landsat, NASA MODIS, USDA NAIP), weather and climate datasets, and digital elevation models. Earth Engine supports both a just-in-time computation model that enables real-time preview and debugging during algorithm development for open-ended data exploration, and a batch computation mode for applying algorithms over large spatial and temporal extents. The platform automatically handles many traditionally-onerous data management tasks, such as data format conversion, reprojection, and resampling, which facilitates writing algorithms that combine data from multiple sensors and/or models. Although the primary use of Earth Engine, to date, has been the analysis of large Earth observing satellite datasets, the computational platform is generally applicable to a wide variety of use cases that require large-scale geospatial data analyses. This presentation will focus on how Earth Engine facilitates the analysis of geospatial data streams that originate from multiple separate sources (and often communities) and how it enables collaboration during algorithm development and data exploration. The talk will highlight current projects/analyses that are enabled by this functionality. https://earthengine.google.org
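A small illustration of the just-in-time model using the Earth Engine Python client (assuming the client library is installed and the account is authorized; the dataset ID, location and scale are examples only):

    import ee

    ee.Initialize()

    # These are server-side objects; nothing is computed until a result is requested.
    dem = ee.Image('USGS/SRTMGL1_003')                 # SRTM digital elevation model
    slope = ee.Terrain.slope(dem)
    region = ee.Geometry.Point([-155.47, 19.82]).buffer(5000)   # example area near Mauna Kea

    mean_slope = slope.reduceRegion(
        reducer=ee.Reducer.mean(), geometry=region, scale=30)
    print(mean_slope.getInfo())                        # triggers the server-side computation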
CMOS-compatible InP/InGaAs digital photoreceiver
Lovejoy, Michael L.; Rose, Benny H.; Craft, David C.; Enquist, Paul M.; Slater, Jr., David B.
1997-01-01
A digital photoreceiver is formed monolithically on an InP semiconductor substrate and comprises a p-i-n photodetector formed from a plurality of InP/InGaAs layers deposited by an epitaxial growth process and an adjacent heterojunction bipolar transistor (HBT) amplifier formed from the same InP/InGaAs layers. The photoreceiver amplifier operates in a large-signal mode to convert a detected photocurrent signal into an amplified output capable of directly driving integrated circuits such as CMOS. In combination with an optical transmitter, the photoreceiver may be used to establish a short-range channel of digital optical communications between integrated circuits with applications to multi-chip modules (MCMs). The photoreceiver may also be used with fiber optic coupling for establishing longer-range digital communications (i.e. optical interconnects) between distributed computers or the like. Arrays of digital photoreceivers may be formed on a common substrate for establishing a plurality of channels of digital optical communication, with each photoreceiver being spaced by less than about 1 mm and consuming less than about 20 mW of power, and preferably less than about 10 mW. Such photoreceiver arrays are useful for transferring huge amounts of digital data between integrated circuits at bit rates of up to about 1000 Mb/s or more.
ERIC Educational Resources Information Center
Hill, Linda L.; Crosier, Scott J.; Smith, Terrence R.; Goodchild, Michael; Iannella, Renato; Erickson, John S.; Reich, Vicky; Rosenthal, David S. H.
2001-01-01
Includes five articles. Topics include requirements for a content standard to describe computational models; architectures for digital rights management systems; access control for digital information objects; LOCKSS (Lots of Copies Keep Stuff Safe) that allows libraries to run Web caches for specific journals; and a Web site from the U.S.…
NASA Astrophysics Data System (ADS)
Zhang, Yunlu; Yan, Lei; Liou, Frank
2018-05-01
The quality of the initial guess of deformation parameters in digital image correlation (DIC) has a serious impact on the convergence, robustness, and efficiency of the subsequent subpixel-level search stage. In this work, an improved feature-based initial guess (FB-IG) scheme is presented to provide initial guesses for points of interest (POIs) inside a large region. Oriented FAST and Rotated BRIEF (ORB) features are semi-uniformly extracted from the region of interest (ROI) and matched to provide initial deformation information. False matched pairs are eliminated by the novel feature-guided Gaussian mixture model (FG-GMM) point set registration algorithm, and nonuniform deformation parameters of the versatile reproducing kernel Hilbert space (RKHS) function are calculated simultaneously. Validations on simulated images and a real-world mini tensile test verify that this scheme can robustly and accurately compute initial guesses with semi-subpixel-level accuracy in cases with small or large translation, deformation, or rotation.
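A reduced sketch of the feature-based initialization step (using OpenCV's ORB detector and a brute-force Hamming matcher; a plain RANSAC affine fit replaces the FG-GMM registration and RKHS deformation model described above):

    import cv2
    import numpy as np

    def orb_initial_guess(ref_img, def_img, n_features=2000):
        # Match ORB features between the reference and deformed images.
        orb = cv2.ORB_create(nfeatures=n_features)
        kp1, des1 = orb.detectAndCompute(ref_img, None)
        kp2, des2 = orb.detectAndCompute(def_img, None)
        matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
        src = np.float32([kp1[m.queryIdx].pt for m in matches])
        dst = np.float32([kp2[m.trainIdx].pt for m in matches])
        # Robust affine fit rejects false matches and yields a coarse deformation
        # estimate that seeds the subpixel search at every point of interest.
        warp, _ = cv2.estimateAffine2D(src, dst, method=cv2.RANSAC)
        return warp                                    # 2x3 matrix (rotation, stretch, shift)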
Divergence of Digital World of Teachers
ERIC Educational Resources Information Center
Uzunboylu, Huseyin; Tuncay, Nazime
2010-01-01
There exists great diversity in the teachers' digital world. Teachers are being discriminated based on numerous educational gaps. This paper seeks to assess the extent of the digital divide among the North Cyprus vocational teachers along the four axes: age, Internet access, computer access, and performance (computing knowledge/experience). A…
Finding a roadmap to achieve large neuromorphic hardware systems
Hasler, Jennifer; Marr, Bo
2013-01-01
Neuromorphic systems are gaining increasing importance in an era where CMOS digital computing techniques are reaching physical limits. These silicon systems mimic extremely energy efficient neural computing structures, potentially both for solving engineering applications as well as understanding neural computation. Toward this end, the authors provide a glimpse at what the technology evolution roadmap looks like for these systems so that Neuromorphic engineers may gain the same benefit of anticipation and foresight that IC designers gained from Moore's law many years ago. Scaling of energy efficiency, performance, and size will be discussed as well as how the implementation and application space of Neuromorphic systems are expected to evolve over time. PMID:24058330
ERIC Educational Resources Information Center
Mechling, Linda C.; Youhouse, Iva R.
2012-01-01
This investigation compared the ability of students with disabilities to complete fine motor tasks when presented with video models on a small personal digital assistant (PDA) screen and a traditional computer laptop screen. Two groups of elementary age students participated in the study: four with moderate intellectual disabilities (Moderate ID),…
NASA Astrophysics Data System (ADS)
Prodanovic, M.; Esteva, M.; Ketcham, R. A.
2017-12-01
Nanometer- to centimeter-scale imaging, such as (focused ion beam) scanning electron microscopy, magnetic resonance imaging and X-ray (micro)tomography, has since the 1990s produced 2D and 3D datasets of rock microstructure that allow investigation of nonlinear flow and mechanical phenomena on length scales that are otherwise inaccessible to laboratory measurements. The numerical approaches that use such images produce various upscaled parameters required by subsurface flow and deformation simulators. All of this has revolutionized our knowledge about grain-scale phenomena. However, a lack of data-sharing infrastructure among research groups makes it difficult to integrate different length scales. We have developed a sustainable, open and easy-to-use repository called the Digital Rocks Portal (https://www.digitalrocksportal.org) that (1) organizes images and related experimental measurements of different porous materials and (2) improves access to them for a wider community of engineering and geosciences researchers not necessarily trained in computer science or data analysis. The Digital Rocks Portal (NSF EarthCube Grant 1541008) is the first repository for imaged porous microstructure data. It is implemented within the reliable, 24/7-maintained High Performance Computing Infrastructure supported by the Texas Advanced Computing Center (University of Texas at Austin). Long-term storage is provided through the University of Texas System Research Cyberinfrastructure initiative. We show how the data can be documented, referenced in publications via digital object identifiers, visualized, searched for and linked to other repositories. We show the recently implemented integration of remote parallel visualization, bulk upload for large datasets, and a preliminary flow simulation workflow with the pore structures currently stored in the repository. We discuss the issues of collecting correct metadata, data discoverability and repository sustainability.
A computer program for obtaining airplane configuration plots from digital Datcom input data
NASA Technical Reports Server (NTRS)
Roy, M. L.; Sliwa, S. M.
1983-01-01
A computer program is described which reads the input file for the Stability and Control Digital Datcom program and generates plots from the aircraft configuration data. These plots can be used to verify the geometric input data to the Digital Datcom program. The program described interfaces with utilities available for plotting aircraft configurations by creating a file from the Digital Datcom input data.
High Performance Proactive Digital Forensics
NASA Astrophysics Data System (ADS)
Alharbi, Soltan; Moa, Belaid; Weber-Jahnke, Jens; Traore, Issa
2012-10-01
With the increase in the number of digital crimes and in their sophistication, High Performance Computing (HPC) is becoming a must in Digital Forensics (DF). According to the FBI annual report, the size of data processed during the 2010 fiscal year reached 3,086 TB (compared to 2,334 TB in 2009), and the number of agencies that requested Regional Computer Forensics Laboratory assistance increased from 689 in 2009 to 722 in 2010. Since most investigation tools are both I/O and CPU bound, the next-generation DF tools are required to be distributed and offer HPC capabilities. The need for HPC is even more evident in investigating crimes on clouds or when proactive DF analysis and on-site investigation, requiring semi-real-time processing, are performed. Although overcoming the performance challenge is a major goal in DF, as far as we know, there is almost no research on HPC-DF except for a few papers. As such, in this work, we extend our work on the need for a proactive system and present a high-performance automated proactive digital forensic system. The most expensive phase of the system, namely proactive analysis and detection, uses a parallel extension of the iterative z algorithm. It also implements new parallel information-based outlier detection algorithms to proactively and forensically handle suspicious activities. To analyse a large number of targets and events and continuously do so (to capture the dynamics of the system), we rely on a multi-resolution approach to explore the digital forensic space. A data set from the Honeynet Forensic Challenge in 2001 is used to evaluate the system from DF and HPC perspectives.
NASA Astrophysics Data System (ADS)
Guo, Jie; Zhu, Chang'an
2016-01-01
The development of optics and computer technologies enables the application of vision-based techniques that use digital cameras to the displacement measurement of large-scale structures. Compared with traditional contact measurements, the vision-based technique allows for remote measurement, is non-intrusive, and does not add mass to the structure. In this study, a high-speed camera system is developed to perform displacement measurement in real time. The system consists of a high-speed camera and a notebook computer. The high-speed camera can capture images at a speed of hundreds of frames per second. To process the captured images on the computer, the Lucas-Kanade template tracking algorithm from the field of computer vision is introduced. Additionally, a modified inverse compositional algorithm is proposed to reduce the computing time of the original algorithm and further improve its efficiency. The modified algorithm can accomplish one displacement extraction within 1 ms without having to install any pre-designed target panel onto the structure in advance. The accuracy and efficiency of the system in the remote measurement of dynamic displacement are demonstrated in experiments on a motion platform and a sound barrier on a suspension viaduct. Experimental results show that the proposed algorithm can extract accurate displacement signals and accomplish vibration measurement of large-scale structures.
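A compressed sketch of the tracking step (OpenCV's pyramidal Lucas-Kanade tracker stands in for the modified inverse-compositional algorithm of the paper; the tracked point and the pixel-to-millimetre scale are placeholders):

    import cv2
    import numpy as np

    def track_displacement(frames, point, mm_per_pixel=0.5):
        # frames: iterable of grayscale images; point: (x, y) of a feature on the structure.
        frames = iter(frames)
        prev = next(frames)
        p_init = np.array([[point]], dtype=np.float32)
        p_prev = p_init.copy()
        displacement = []
        for frame in frames:
            p_new, status, _ = cv2.calcOpticalFlowPyrLK(prev, frame, p_prev, None)
            if not status[0][0]:
                break                                  # track lost
            dx, dy = (p_new - p_init).ravel()
            displacement.append((dx * mm_per_pixel, dy * mm_per_pixel))
            prev, p_prev = frame, p_new
        return displacement                            # displacement history relative to frame 0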
The big data challenges of connectomics.
Lichtman, Jeff W; Pfister, Hanspeter; Shavit, Nir
2014-11-01
The structure of the nervous system is extraordinarily complicated because individual neurons are interconnected to hundreds or even thousands of other cells in networks that can extend over large volumes. Mapping such networks at the level of synaptic connections, a field called connectomics, began in the 1970s with the study of the small nervous system of a worm and has recently garnered general interest thanks to technical and computational advances that automate the collection of electron-microscopy data and offer the possibility of mapping even large mammalian brains. However, modern connectomics produces 'big data', unprecedented quantities of digital information at unprecedented rates, and will require, as with genomics at the time, breakthrough algorithmic and computational solutions. Here we describe some of the key difficulties that may arise and provide suggestions for managing them.
ProShare teleconferencing with KIDSAT participants
1997-02-27
STS081-378-012 (12-22 January 1997) --- Astronaut Marsha S. Ivins, mission specialist, looks at digital still photo imagery on a laptop computer on the Space Shuttle Atlantis' aft flight deck while communicating with students on Earth. Her activity is all part of the once-a-year shuttle participation in an educational endeavor called KidSat. The KidSat project allows students the opportunity to interact with the astronauts' real-time observations and photography of geographic points of interest. The Electronic Still Camera (ESC), which was handled largely by Ivins, can be seen near the computer.
Sizing of complex structure by the integration of several different optimal design algorithms
NASA Technical Reports Server (NTRS)
Sobieszczanski, J.
1974-01-01
Practical design of large-scale structures can be accomplished with the aid of the digital computer by bringing together in one computer program algorithms of nonlinear mathematical programming and optimality criteria with weight-strength and other so-called engineering methods. Applications of this approach to aviation structures are discussed with a detailed description of how the total problem of structural sizing can be broken down into subproblems for best utilization of each algorithm and for efficient organization of the program into iterative loops. Typical results are examined for a number of examples.
1982-01-29
COMPUTER PROGRAM USER'S MANUAL FOR FIREFINDER DIGITAL TOPOGRAPHIC DATA VERIFICATION LIBRARY DUBBING SYSTEM, VOLUME II: DUBBING, 29 January 1982. This manual describes the computer programs for the FIREFINDER Digital Topographic Data Verification Library Dubbing System, Volume II, Dubbing.
Microsoft C#.NET program and electromagnetic depth sounding for large loop source
NASA Astrophysics Data System (ADS)
Prabhakar Rao, K.; Ashok Babu, G.
2009-07-01
A program, in the C# (C Sharp) language with the Microsoft .NET Framework, is developed to compute the normalized vertical magnetic field of a horizontal rectangular loop source placed on the surface of an n-layered earth. The field can be calculated either inside or outside the loop. Five C# classes with member functions in each class are designed to compute the kernel, the Hankel transform integral, coefficients for cubic spline interpolation between computed values, and the normalized vertical magnetic field. The program computes the vertical magnetic field in the frequency domain using integral expressions evaluated by a combination of straightforward numerical integration and the digital filter technique. The code utilizes different object-oriented programming (OOP) features. It finally computes the amplitude and phase of the normalized vertical magnetic field. Computed results are presented for geometric and parametric soundings. The code was developed in Microsoft Visual Studio .NET 2003 and uses various system class libraries.
ECDSA B-233 with Precomputation 1.0 Beta Version
DOE Office of Scientific and Technical Information (OSTI.GOV)
Draelos, Timothy; Schroeppel, Richard; Schoeneman, Barry
2009-12-11
This software, written in C, performs two functions: 1) the generation of digital signatures using ECDSA with the B-233 curve and a table of precomputed values, and 2) the generation and encryption of a table of precomputed values to support the generation of many digital signatures. The computationally expensive operations of ECDSA signature generation are precomputed, stored in a table, and protected with AES encryption. This allows digital signatures to be generated in low-power, computationally-constrained environments, such as are often found in non-proliferation monitoring applications. The encrypted, precomputed table and digital signature generation software are used to provide public key data authentication for sensor data. When digital data is presented for signing, a set of values from the table is decrypted and used to generate an ECDSA digital signature.
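The split between the expensive precomputation phase and the cheap signing phase can be illustrated with a toy example (a textbook curve over F_17 with a 19-element group stands in for B-233, and the AES protection of the table is omitted; this is a sketch of the idea, not the OSTI software):

    import hashlib, secrets

    P, A, N, G = 17, 2, 19, (5, 1)        # toy curve y^2 = x^3 + 2x + 2 over F_17, ord(G) = 19

    def add(p, q):
        # Affine point addition / doubling; None represents the point at infinity.
        if p is None: return q
        if q is None: return p
        (x1, y1), (x2, y2) = p, q
        if x1 == x2 and (y1 + y2) % P == 0:
            return None
        if p == q:
            s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P
        else:
            s = (y2 - y1) * pow(x2 - x1, -1, P) % P
        x3 = (s * s - x1 - x2) % P
        return x3, (s * (x1 - x3) - y1) % P

    def mul(k, p):
        r = None
        while k:
            if k & 1:
                r = add(r, p)
            p, k = add(p, p), k >> 1
        return r

    def precompute(entries=8):
        # Expensive part, done ahead of time: the per-signature scalar products k*G.
        table = []
        while len(table) < entries:
            k = secrets.randbelow(N - 1) + 1
            r = mul(k, G)[0] % N
            if r:
                table.append((r, pow(k, -1, N)))
        return table

    def sign(msg, d, entry):
        # Cheap part, done in the field: only hashing and modular arithmetic.
        r, k_inv = entry
        z = int.from_bytes(hashlib.sha256(msg).digest(), 'big') % N
        return r, k_inv * (z + r * d) % N

    d = 7                                  # private key; the public key would be Q = d*G
    r, s = sign(b"sensor reading", d, precompute()[0])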
Huber, Stefan; Nuerk, Hans-Christoph; Reips, Ulf-Dietrich; Soltanlou, Mojtaba
2017-12-23
Symbolic magnitude comparison is one of the most well-studied cognitive processes in research on numerical cognition. However, while the cognitive mechanisms of symbolic magnitude processing have been intensively studied, previous studies have paid less attention to individual differences influencing symbolic magnitude comparison. Employing a two-digit number comparison task in an online setting, we replicated previous effects, including the distance effect, the unit-decade compatibility effect, and the effect of cognitive control on the adaptation to filler items, in a large-scale study in 452 adults. Additionally, we observed that the most influential individual differences were participants' first language, time spent playing computer games and gender, followed by reported alcohol consumption, age and mathematical ability. Participants who used a first language with a left-to-right reading/writing direction were faster than those who read and wrote in the right-to-left direction. Reported playing time for computer games was correlated with faster reaction times. Female participants showed slower reaction times and a larger unit-decade compatibility effect than male participants. Participants who reported never consuming alcohol showed overall slower response times than others. Older participants were slower, but more accurate. Finally, higher grades in mathematics were associated with faster reaction times. We conclude that typical experiments on numerical cognition that employ a keyboard as an input device can also be run in an online setting. Moreover, while individual differences have no influence on domain-specific magnitude processing (apart from age, which increases the decade distance effect), they generally influence performance on a two-digit number comparison task.
NASA Technical Reports Server (NTRS)
Giddings, L.; Boston, S.
1976-01-01
A method for digitizing zone maps is presented, starting with colored images and producing a final one-channel digitized tape. This method automates the work previously done interactively on the Image-100 and Data Analysis System computers of the Johnson Space Center (JSC) Earth Observations Division (EOD). A color-coded map was digitized through color filters on a scanner to form a digital tape in LARSYS-2 or JSC Universal format. The taped image was classified by the EOD LARSYS program on the basis of training fields included in the image. Numerical values were assigned to all pixels in a given class, and the resulting coded zone map was written on a LARSYS or Universal tape. A unique spatial filter option permitted zones to be made homogeneous and edges of zones to be abrupt transitions from one zone to the next. A zoom option allowed the output image to have arbitrary dimensions in terms of number of lines and number of samples on a line. Printouts of the computer program are given and the images that were digitized are shown.
Two schemes for rapid generation of digital video holograms using PC cluster
NASA Astrophysics Data System (ADS)
Park, Hanhoon; Song, Joongseok; Kim, Changseob; Park, Jong-Il
2017-12-01
Computer-generated holography (CGH), the process of generating digital holograms, is computationally expensive. Recently, several methods and systems for parallelizing the process using graphics processing units (GPUs) have been proposed. Indeed, the use of multiple GPUs or a personal computer (PC) cluster (each PC with GPUs) has enabled great improvements in processing speed. However, the existing literature has less often explored systems for the rapid generation of multiple digital holograms or systems specialized for the rapid generation of a digital video hologram. This study proposes a system that uses a PC cluster and is able to generate a video hologram more efficiently. The proposed system is designed to generate multiple frames simultaneously and to accelerate generation by parallelizing the CGH computations across a number of frames, as opposed to separately generating each individual frame while parallelizing the CGH computations within each frame. The proposed system also enables the subprocesses for generating each frame to execute in parallel through multithreading. With these two schemes, the proposed system significantly reduced the data communication time for generating a digital hologram when compared with that of the state-of-the-art system.
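The frame-level parallelization can be sketched in a few lines (a naive point-source hologram computation stands in for the full CGH pipeline, a process pool stands in for the PC cluster, and the wavelength, pixel pitch and geometry are placeholders):

    import numpy as np
    from concurrent.futures import ProcessPoolExecutor

    def frame_hologram(points, size=256, wavelength=532e-9, pitch=8e-6, z=0.2):
        # points: (N, 3) object-point coordinates for a single video frame.
        k = 2 * np.pi / wavelength
        ys, xs = np.mgrid[:size, :size] * pitch
        field = np.zeros((size, size), dtype=complex)
        for px, py, pz in points:
            r = np.sqrt((xs - px) ** 2 + (ys - py) ** 2 + (z + pz) ** 2)
            field += np.exp(1j * k * r) / r
        return np.angle(field)                         # phase-only hologram for this frame

    def video_holograms(frames_of_points, workers=4):
        # Whole frames are farmed out in parallel, rather than splitting a single frame.
        with ProcessPoolExecutor(max_workers=workers) as pool:
            return list(pool.map(frame_hologram, frames_of_points))

    if __name__ == "__main__":
        frames = [np.random.rand(50, 3) * 1e-3 for _ in range(8)]   # 8 frames, 50 points each
        holograms = video_holograms(frames)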
Universe creation on a computer
NASA Astrophysics Data System (ADS)
McCabe, Gordon
The purpose of this paper is to provide an account of the epistemology and metaphysics of universe creation on a computer. The paper begins with F.J. Tipler's argument that our experience is indistinguishable from the experience of someone embedded in a perfect computer simulation of our own universe, hence we cannot know whether or not we are part of such a computer program ourselves. Tipler's argument is treated as a special case of epistemological scepticism, in a similar vein to 'brain-in-a-vat' arguments. It is argued that Tipler's hypothesis that our universe is a program running on a digital computer in another universe, generates empirical predictions, and is therefore a falsifiable hypothesis. The computer program hypothesis is also treated as a hypothesis about what exists beyond the physical world, and is compared with Kant's metaphysics of noumena. It is argued that if our universe is a program running on a digital computer, then our universe must have compact spatial topology, and the possibilities of observationally testing this prediction are considered. The possibility of testing the computer program hypothesis with the value of the density parameter Ω0 is also analysed. The informational requirements for a computer to represent a universe exactly and completely are considered. Consequent doubt is thrown upon Tipler's claim that if a hierarchy of computer universes exists, we would not be able to know which 'level of implementation' our universe exists at. It is then argued that a digital computer simulation of a universe, or any other physical system, does not provide a realisation of that universe or system. It is argued that a digital computer simulation of a physical system is not objectively related to that physical system, and therefore cannot exist as anything else other than a physical process occurring upon the components of the computer. It is concluded that Tipler's sceptical hypothesis, and a related hypothesis from Bostrom, cannot be true: it is impossible that our own experience is indistinguishable from the experience of somebody embedded in a digital computer simulation because it is impossible for anybody to be embedded in a digital computer simulation.
Photographic memory: The storage and retrieval of data
NASA Technical Reports Server (NTRS)
Horton, J.
1984-01-01
The concept of density encoding digital data in a mass-storage computer peripheral is proposed. This concept requires that digital data be encoded as distinguishable density levels (DDLs) of the film to be used as the storage medium. These DDLs are then recorded on the film in relatively large pixels. Retrieval of the data would be accomplished by scanning the photographic record using a relatively small aperture. Multiplexing of the pixels is used to store data of a range greater than the number of DDLs supportable by the film in question. Although a cartographic application is used as an example for the photographic storage of data, any digital data can be stored in a like manner. When the data is inherently spatially-distributed, the aptness of the proposed scheme is even more evident. In such a case, human-readability is an advantage which can be added to those mentioned earlier: speed of acquisition, ease of implementation, and cost effectiveness.
Barisoni, Laura; Gimpel, Charlotte; Kain, Renate; Laurinavicius, Arvydas; Bueno, Gloria; Zeng, Caihong; Liu, Zhihong; Schaefer, Franz; Kretzler, Matthias; Holzman, Lawrence B; Hewitt, Stephen M
2017-04-01
The introduction of digital pathology to nephrology provides a platform for the development of new methodologies and protocols for visual, morphometric and computer-aided assessment of renal biopsies. Application of digital imaging to pathology made substantial progress over the past decade; it is now in use for education, clinical trials and translational research. Digital pathology evolved as a valuable tool to generate comprehensive structural information in digital form, a key prerequisite for achieving precision pathology for computational biology. The application of this new technology on an international scale is driving novel methods for collaborations, providing unique opportunities but also challenges. Standardization of methods needs to be rigorously evaluated and applied at each step, from specimen processing to scanning, uploading into digital repositories, morphologic, morphometric and computer-aided assessment, data collection and analysis. In this review, we discuss the status and opportunities created by the application of digital imaging to precision nephropathology, and present a vision for the near future.
Stone, J.J. Jr.; Bettis, E.S.; Mann, E.R.
1957-10-01
The electronic digital computer is designed to solve systems involving a plurality of simultaneous linear equations. The computer can solve a system which converges rather rapidly when using Von Seidel's method of approximation and performs the summations required for solving for the unknown terms by a method of successive approximations.
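In modern notation the successive-approximation scheme referred to is the Gauss-Seidel iteration; a minimal sketch (assuming a diagonally dominant system so that the iteration converges):

    import numpy as np

    def gauss_seidel(A, b, iterations=50):
        # Sweep the equations in order, always using the newest values of x.
        x = np.zeros_like(b, dtype=float)
        for _ in range(iterations):
            for i in range(len(b)):
                x[i] = (b[i] - A[i, :i] @ x[:i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
        return x

    A = np.array([[4.0, 1.0], [2.0, 3.0]])
    b = np.array([1.0, 2.0])
    print(gauss_seidel(A, b), np.linalg.solve(A, b))   # the two results should agree closely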
An improved data transfer and storage technique for hybrid computation
NASA Technical Reports Server (NTRS)
Hansing, A. M.
1972-01-01
An improved technique was developed for transferring and storing data at faster-than-real-time speeds on a hybrid computer. The predominant advantage is the combined use of electronic relays, track-and-store units, and the analog-to-digital and digital-to-analog conversion units of the hybrid computer.
Theoretical and experimental study of a new algorithm for factoring numbers
NASA Astrophysics Data System (ADS)
Tamma, Vincenzo
The security of codes, for example in credit card and government information, relies on the fact that the factorization of a large integer N is a rather costly process on a classical digital computer. Such security is endangered by Shor's algorithm, which employs entangled quantum systems to find, with a polynomial number of resources, the period of a function which is connected with the factors of N. We can surely expect a possible future realization of such a method for large numbers, but so far the period of Shor's function has only been computed for the number 15. Inspired by Shor's idea, our work aims at methods of factorization based on measuring the periodicity of a given continuous periodic "factoring function" which is physically implementable using an analogue computer. In particular, we have focused on both the theoretical and the experimental analysis of Gauss sums with continuous arguments, leading to a new factorization algorithm. The procedure allows, for the first time, factoring several numbers by measuring the periodicity of Gauss sums in first-order "factoring" interference processes. We experimentally implemented this idea by exploiting polychromatic optical interference in the visible range with a multi-path interferometer, and achieved the factorization of seven-digit numbers. The physical principle behind this "factoring" interference procedure can potentially be exploited also on entangled systems, such as multi-photon entangled states, in order to achieve a polynomial scaling in the number of resources.
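The underlying signal can be reproduced numerically (a digital stand-in for the optical interferometer; N, the truncation order and the acceptance threshold are illustrative): a truncated Gauss sum has unit magnitude exactly when the trial divisor divides N and remains visibly smaller otherwise.

    import numpy as np

    def gauss_sum_magnitude(N, l, M=20):
        # |(1/(M+1)) * sum_{m=0..M} exp(2*pi*i * m^2 * N / l)|
        m = np.arange(M + 1)
        return abs(np.exp(2j * np.pi * m**2 * N / l).sum()) / (M + 1)

    N = 9167                                        # = 89 * 103
    candidates = [l for l in range(2, 200) if gauss_sum_magnitude(N, l) > 0.9]
    print(candidates)                               # the divisors 89 and 103 stand out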
"I'm Good, but Not That Good": Digitally-Skilled Young People's Identity in Computing
ERIC Educational Resources Information Center
Wong, Billy
2017-01-01
Computers and information technology are fast becoming a part of young people's everyday life. However, there remains a difference between the majority who can use computers and the minority who are computer scientists or professionals. Drawing on 32 semi-structured interviews with digitally skilled young people (aged 13-19), we explore their…
NASA Astrophysics Data System (ADS)
Leidi, Tiziano; Scocchi, Giulio; Grossi, Loris; Pusterla, Simone; D'Angelo, Claudio; Thiran, Jean-Philippe; Ortona, Alberto
2012-11-01
In recent decades, finite element (FE) techniques have been extensively used for predicting effective properties of random heterogeneous materials. In the case of very complex microstructures, the choice of numerical methods for the solution of this problem can offer some advantages over classical analytical approaches, and it allows the use of digital images obtained from real material samples (e.g., using computed tomography). On the other hand, having a large number of elements is often necessary for properly describing complex microstructures, ultimately leading to extremely time-consuming computations and high memory requirements. With the final objective of reducing these limitations, we improved an existing freely available FE code for the computation of effective conductivity (electrical and thermal) of microstructure digital models. To allow execution on hardware combining multi-core CPUs and a GPU, we first translated the original algorithm from Fortran to C, and we subdivided it into software components. Then, we enhanced the C version of the algorithm for parallel processing with heterogeneous processors. With the goal of maximizing the obtained performance and limiting resource consumption, we utilized a software architecture based on stream processing, event-driven scheduling, and dynamic load balancing. The parallel-processing version of the algorithm has been validated using a simple microstructure consisting of a single sphere located at the centre of a cubic box, yielding consistent results. Finally, the code was used to calculate the effective thermal conductivity of a digital model of a real sample (a ceramic foam imaged using X-ray computed tomography). On a computer equipped with dual hexa-core Intel Xeon X5670 processors and an NVIDIA Tesla C2050, the parallel version of the application shows near-linear speed-up when using only the CPU cores, and it executes more than 20 times faster when the GPU is used as well.
Comprehensive Digital Imaging Network Project At Georgetown University Hospital
NASA Astrophysics Data System (ADS)
Mun, Seong K.; Stauffer, Douglas; Zeman, Robert; Benson, Harold; Wang, Paul; Allman, Robert
1987-10-01
The radiology practice is going through rapid changes due to the introduction of state-of-the-art computer-based technologies. For the last twenty years we have witnessed the introduction of many new medical diagnostic imaging systems such as x-ray computed tomography, digital subtraction angiography (DSA), computerized nuclear medicine, single photon emission computed tomography (SPECT), positron emission tomography (PET) and, more recently, computerized digital radiography and nuclear magnetic resonance imaging (MRI). In addition to the imaging systems, there has been a steady introduction of computer-based information systems for radiology departments and hospitals.
Amadasi, Alberto; Borgonovo, Simone; Brandone, Alberto; Di Giancamillo, Mauro; Cattaneo, Cristina
2014-05-01
The radiological search for gunshot residue (GSR) is crucial in burnt material, although it has rarely been tested. In this study, thirty-one bovine ribs were shot at near-contact range and burnt to calcination in an oven simulating a real combustion. Computed tomography (CT) and magnetic resonance (MR) imaging were performed before and after carbonization and compared with earlier analyses with digital radiography (DR), thus comparing the assistance the radiological methods can provide in the search for GSR in fresh and burnt bone. DR demonstrated the greatest ability to detect metallic residues, CT showed lower ability, while MR showed high sensitivity only in soft tissues. Thus, DR can be considered the most sensitive method for the detection of GSR in charred bones, whereas CT and MR proved much less reliable. Nonetheless, MR improves the analysis of gunshot wounds in other types of remains with large quantities of soft tissue. © 2013 American Academy of Forensic Sciences.
Huynh, Benjamin Q; Li, Hui; Giger, Maryellen L
2016-07-01
Convolutional neural networks (CNNs) show potential for computer-aided diagnosis (CADx) by learning features directly from the image data instead of using analytically extracted features. However, CNNs are difficult to train from scratch for medical images due to small sample sizes and variations in tumor presentations. Instead, transfer learning can be used to extract tumor information from medical images via CNNs originally pretrained for nonmedical tasks, alleviating the need for large datasets. Our database includes 219 breast lesions (607 full-field digital mammographic images). We compared support vector machine classifiers based on the CNN-extracted image features and our prior computer-extracted tumor features in the task of distinguishing between benign and malignant breast lesions. Five-fold cross validation (by lesion) was conducted with the area under the receiver operating characteristic (ROC) curve as the performance metric. Results show that classifiers based on CNN-extracted features (with transfer learning) perform comparably to those using analytically extracted features (area under the ROC curve [Formula: see text]).
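The transfer-learning pipeline can be sketched as follows (a generic ImageNet-pretrained Keras network stands in for the CNN used in the study, the SVM settings are defaults, and data loading is left abstract):

    from tensorflow.keras.applications import VGG16
    from tensorflow.keras.applications.vgg16 import preprocess_input
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    def cnn_features(images):
        # images: (n, 224, 224, 3) lesion regions of interest; the frozen CNN acts
        # purely as a fixed feature extractor, with no fine-tuning on the mammograms.
        base = VGG16(weights="imagenet", include_top=False, pooling="avg")
        return base.predict(preprocess_input(images.astype("float32")))

    # With X (lesion images) and y (benign/malignant labels) loaded elsewhere:
    # feats = cnn_features(X)
    # clf = SVC(kernel="rbf", probability=True)
    # auc = cross_val_score(clf, feats, y, cv=5, scoring="roc_auc")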
NASA Astrophysics Data System (ADS)
Blume, H.; Alexandru, R.; Applegate, R.; Giordano, T.; Kamiya, K.; Kresina, R.
1986-06-01
In a digital diagnostic imaging department, the majority of operations for handling and processing of images can be grouped into a small set of basic operations, such as image data buffering and storage, image processing and analysis, image display, image data transmission and image data compression. These operations occur in almost all nodes of the diagnostic imaging communications network of the department. An image processor architecture was developed in which each of these functions has been mapped into hardware and software modules. The modular approach has advantages in terms of economics, service, expandability and upgradeability. The architectural design is based on the principles of hierarchical functionality, distributed and parallel processing and aims at real-time response. Parallel processing and real-time response are facilitated in part by a dual bus system: a VME control bus and a high-speed image data bus consisting of 8 independent parallel 16-bit busses, capable of handling a combined throughput of up to 144 MBytes/sec. The presented image processor is versatile enough to meet the video-rate processing needs of digital subtraction angiography, the large pixel-matrix processing requirements of static projection radiography, or the broad range of manipulation and display needs of a multi-modality diagnostic workstation. Several hardware modules are described in detail. To illustrate the capabilities of the image processor, processed 2000 x 2000 pixel computed radiographs are shown and estimated computation times for executing the processing operations are presented.
Building the interspace: Digital library infrastructure for a University Engineering Community
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schatz, B.
A large-scale digital library is being constructed and evaluated at the University of Illinois, with the goal of bringing professional search and display to Internet information services. A testbed planned to grow to 10K documents and 100K users is being constructed in the Grainger Engineering Library Information Center, as a joint effort of the University Library and the National Center for Supercomputing Applications (NCSA), with evaluation and research by the Graduate School of Library and Information Science and the Department of Computer Science. The electronic collection will be articles from engineering and science journals and magazines, obtained directly from publishers in SGML format and displayed containing all text, figures, tables, and equations. The publisher partners include IEEE Computer Society, AIAA (Aerospace Engineering), American Physical Society, and Wiley & Sons. The software will be based upon NCSA Mosaic as a network engine connected to commercial SGML displayers and full-text searchers. The users will include faculty/students across the midwestern universities in the Big Ten, with evaluations via interviews, surveys, and transaction logs. Concurrently, research into scaling the testbed is being conducted. This includes efforts in computer science, information science, library science, and information systems. These efforts will evaluate different semantic retrieval technologies, including automatic thesaurus and subject classification graphs. New architectures will be designed and implemented for a next generation digital library infrastructure, the Interspace, which supports interaction with information spread across information spaces within the Net.
NASA Astrophysics Data System (ADS)
Graves, A. Palmer
This study examines the effect of increasing the visual complexity used in computer assisted instruction in general chemistry. Traditional recitation instruction was used as a control for the experiment. One tutorial presented a chemistry topic using 3-D animation showing molecular activity and symbolic representation of the macroscopic view of a chemical phenomenon. A second tutorial presented the same topic but simultaneously presented students with a digital video movie showing the phenomena and 3-D animation showing the molecular view of the phenomena. This experimental set-up was used in two different experiments during the first semester of a college-level general chemistry course. The topics covered were the molecular effect of heating water through the solid-liquid phase change and the kinetic molecular theory used in explaining pressure changes. The subjects were 236 college students enrolled in a freshman chemistry course at a large university. The data indicated that the simultaneous presentation of digital video, showing the solid to liquid phase change of water, with a molecular animation, showing the molecular behavior during the phase change, had a significant effect on student particulate understanding when compared to traditional recitation. Although the effect of the KMT tutorial was not statistically significant, there was a positive effect on student particulate understanding. The use of the computer tutorial also had a significant effect on student attitude toward their comprehension of the lesson.
Computer-aided diagnosis of malignant mammograms using Zernike moments and SVM.
Sharma, Shubhi; Khanna, Pritee
2015-02-01
This work is directed toward the development of a computer-aided diagnosis (CAD) system to detect abnormalities or suspicious areas in digital mammograms and classify them as malignant or nonmalignant. The original mammogram is preprocessed to separate the breast region from its background. To work on the suspicious area of the breast, region of interest (ROI) patches of a fixed size of 128×128 are extracted from the original large-sized digital mammograms. For training, patches are extracted manually from a preprocessed mammogram. For testing, patches are extracted from a highly dense area identified by a clustering technique. For all extracted patches corresponding to a mammogram, Zernike moments of different orders are computed and stored as a feature vector. A support vector machine (SVM) is used to classify the extracted ROI patches. The experimental study shows that the use of Zernike moments of order 20 with an SVM classifier gives better results compared with other studies. The proposed system is tested on the Image Retrieval In Medical Application (IRMA) reference dataset and the Digital Database for Screening Mammography (DDSM) database. On the IRMA reference dataset, it attains 99% sensitivity and 99% specificity, and on the DDSM database, it obtains 97% sensitivity and 96% specificity. To verify the applicability of Zernike moments as a fitting texture descriptor, the performance of the proposed CAD system is compared with other well-known texture descriptors, namely the gray-level co-occurrence matrix (GLCM) and the discrete cosine transform (DCT).
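A hedged sketch of the Zernike-plus-SVM pipeline, using the mahotas and scikit-learn libraries as one possible realization; the placeholder data, moment order, and kernel settings are illustrative rather than the paper's tuned configuration:

```python
import numpy as np
import mahotas
from sklearn.svm import SVC

def zernike_features(patch, degree=20):
    """Zernike moments of a 128x128 ROI patch, up to the given order."""
    radius = patch.shape[0] // 2                  # circle inscribed in the patch
    return mahotas.features.zernike_moments(patch, radius, degree=degree)

# Placeholder data standing in for real ROI patches and labels.
rng = np.random.default_rng(0)
train_patches = rng.random((20, 128, 128))
train_labels = rng.integers(0, 2, size=20)        # 1 = malignant, 0 = non-malignant

X = np.array([zernike_features(p) for p in train_patches])
clf = SVC(kernel="rbf").fit(X, train_labels)

test_patch = rng.random((128, 128))
print("prediction:", clf.predict(zernike_features(test_patch).reshape(1, -1)))
```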
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-22
... Computer Software and Complex Electronics Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear...-1209, ``Software Requirement Specifications for Digital Computer Software and Complex Electronics used... Electronics Engineers (ANSI/IEEE) Standard 830-1998, ``IEEE Recommended Practice for Software Requirements...
Challenges in Managing Trustworthy Large-scale Digital Science
NASA Astrophysics Data System (ADS)
Evans, B. J. K.
2017-12-01
The increased use of large-scale international digital science has opened a number of challenges for managing, handling, using and preserving scientific information. The large volumes of information are driven by three main categories - model outputs including coupled models and ensembles, data products that have been processed to a level of usability, and increasingly heuristically driven data analysis. These data products are increasingly the ones that are usable by the broad communities, and are far in excess of the raw instrument data outputs. The data, software and workflows are then shared and replicated to allow broad use at an international scale, which places further demands on infrastructure to support how the information is managed reliably across distributed resources. Users necessarily rely on these underlying "black boxes" so that they can be productive in producing new scientific outcomes. The software for these systems depends on computational infrastructure, software interconnected systems, and information capture systems. This ranges from the fundamentals of the reliability of the compute hardware, system software stacks and libraries, to the model software. Due to these complexities and the capacity of the infrastructure, there is an increased emphasis on transparency of the approach and robustness of the methods over full reproducibility. Furthermore, with large-volume data management, it is increasingly difficult to store the historical versions of all model and derived data. Instead, the emphasis is on the ability to access the updated products and the reliability with which previous outcomes remain relevant and can be updated with new information. We will discuss these challenges and some of the approaches underway that are being used to address these issues.
Very Large Scale Integrated Circuits for Military Systems.
1981-01-01
ABBREVIATIONS: A/D, analog-to-digital; AGC, automatic gain control; A/J, anti-jam; ASP, advanced signal processor; AU, arithmetic units; CAD, computer-aided... (ESM) equipments (Ref. 23); in lieu of an adequate automatic processing capability, the function is now performed manually (Ref. 24), which involves a human operator, displays, etc., and a sacrifice in performance (acquisition speed, saturation signal density). Various automatic processing
Full-color large-scaled computer-generated holograms for physical and non-physical objects
NASA Astrophysics Data System (ADS)
Matsushima, Kyoji; Tsuchiyama, Yasuhiro; Sonobe, Noriaki; Masuji, Shoya; Yamaguchi, Masahiro; Sakamoto, Yuji
2017-05-01
Several full-color high-definition CGHs are created for reconstructing 3D scenes including real-existing physical objects. The fields of the physical objects are generated or captured by employing three techniques: a 3D scanner, synthetic aperture digital holography, and multi-viewpoint images. Full-color reconstruction of the high-definition CGHs is realized by RGB color filters. The optical reconstructions are presented for verifying these techniques.
Acquisition of a High Performance Computing Instrument for Big Data Research and Education
2015-12-03
Security and Privacy, University of Texas at Dallas, TX, September 16-17, 2014. • Chopade, P., Zhan, J., Community Detection in Large Scale Big Data... Security and Privacy in Communication Networks, Beijing, China, September 24-26, 2014. • Pravin Chopade, Kenneth Flurchick, Justin Zhan and Marwan... Balkirat Kaur, Malcolm Blow, and Justin Zhan, Digital Image Authentication in Social Media, The Sixth ASE International Conference on Privacy
ERIC Educational Resources Information Center
Association for Education in Journalism and Mass Communication.
The Technology and the Media section of the proceedings contains the following 18 papers: "What's Wrong with This Picture?: Attitudes of Photographic Editors at Daily Newspapers and Their Tolerance toward Digital Manipulation" (Shiela Reaves); "Strategies for the Analysis of Large-Scale Databases in Computer-Assisted Investigative…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hopwood, J.E.; Affeldt, B.
An IBM personal computer (PC), a Gerber coordinate digitizer, and a collection of other instruments make up a system known as the Coordinate Digitizer Interactive Processor (CDIP). The PC extracts coordinate data from the digitizer through a special interface, and then, after reformatting, transmits the data to a remote VAX computer, a floppy disk, and a display terminal. This system has improved the efficiency of producing printed circuit-board artwork and extended the useful life of the Gerber GCD-1 Digitizer. 1 ref., 12 figs.
A Digital Divide? Class and Gender in the Computer Practices of Two Mexicano Families.
ERIC Educational Resources Information Center
Menard-Warwick, Julia; Dabach, Dafney Blanca
The purpose of this paper is to critically re-examine the popular concept of a developing "digital divide." Based on qualitative studies of the computer practices of two Mexicano families resident in California, the paper argues with Warschauer (2001) that the "digital divide" should be seen as a continuum of varying degrees of…
Simulated Laboratory in Digital Logic.
ERIC Educational Resources Information Center
Cleaver, Thomas G.
Design of computer circuits used to be a pencil-and-paper task followed by laboratory tests, but logic circuit design can now be done in half the time: the engineer accesses a program which simulates the behavior of real digital circuits and does all the wiring and testing on the computer screen. A simulated laboratory in digital logic has been…
SCANIT: centralized digitizing of forest resource maps or photographs
Elliot L. Amidon; E. Joyce Dye
1981-01-01
Spatial data on wildland resource maps and aerial photographs can be analyzed by computer after digitizing. SCANIT is a computerized system for encoding such data in digital form. The system, consisting of a collection of computer programs and subroutines, provides a powerful and versatile tool for a variety of resource analyses. SCANIT also may be converted easily to...
Enhancing Tele-robotics with Immersive Virtual Reality
2017-11-03
graduate and undergraduate students within the Digital Gaming and Simulation, Computer Science, and psychology programs have actively collaborated...investigates the use of artificial intelligence and visual computing. Numerous fields across the human-computer interaction and gaming research areas...invested in digital gaming and simulation to cognitively stimulate humans by computers, forming a $10.5B industry [1]. On the other hand, cognitive
Promoting Gender Equality in Digital Literacy
ERIC Educational Resources Information Center
Ertl, Bernhard; Helling, Kathrin
2011-01-01
This article deals with gender phenomena in the context of digital literacy. Studies show that computer use, computer skills, and computer-related self-concepts are subject to gender differences. These differences may affect classroom interactions as well as learning processes and have therefore to be considered carefully by teachers who apply…
Digital Data Transmission Via CATV.
ERIC Educational Resources Information Center
Stifle, Jack; And Others
A low cost communications network has been designed for use in the PLATO IV computer-assisted instruction system. Over 1,000 remote computer graphic terminals each requiring a 1200 bps channel are to be connected to one centrally located computer. Digital data are distributed to these terminals using standard commercial cable television (CATV)…
ERIC Educational Resources Information Center
Sargent, John
The Office of Technology Policy analyzed Bureau of Labor Statistics' growth projections for the core occupational classifications of IT (information technology) workers to assess future demand in the United States. Classifications studied were computer engineers, systems analysts, computer programmers, database administrators, computer support…
Sarpeshkar, R
2014-03-28
We analyse the pros and cons of analog versus digital computation in living cells. Our analysis is based on fundamental laws of noise in gene and protein expression, which set limits on the energy, time, space, molecular count and part-count resources needed to compute at a given level of precision. We conclude that analog computation is significantly more efficient in its use of resources than deterministic digital computation even at relatively high levels of precision in the cell. Based on this analysis, we conclude that synthetic biology must use analog, collective analog, probabilistic and hybrid analog-digital computational approaches; otherwise, even relatively simple synthetic computations in cells such as addition will exceed energy and molecular-count budgets. We present schematics for efficiently representing analog DNA-protein computation in cells. Analog electronic flow in subthreshold transistors and analog molecular flux in chemical reactions obey Boltzmann exponential laws of thermodynamics and are described by astoundingly similar logarithmic electrochemical potentials. Therefore, cytomorphic circuits can help to map circuit designs between electronic and biochemical domains. We review recent work that uses positive-feedback linearization circuits to architect wide-dynamic-range logarithmic analog computation in Escherichia coli using three transcription factors, nearly two orders of magnitude more efficient in parts than prior digital implementations.
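The claimed analogy can be illustrated with standard textbook relations (these are not equations quoted from the paper): the subthreshold drain current of a MOS transistor and the rate of a thermally activated chemical reaction both depend exponentially on a potential or free energy scaled by the thermal energy,

\[ I_{DS} \approx I_{0}\, e^{\, q V_{GS} / (n k_{B} T)}, \qquad k_{\mathrm{reaction}} \propto e^{-\Delta G^{\ddagger} / (k_{B} T)}, \]

so in both domains the natural state variable is a logarithmic (electro)chemical potential, which is the property that cytomorphic mapping between electronic and biochemical circuits exploits.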
Sarpeshkar, R.
2014-01-01
We analyse the pros and cons of analog versus digital computation in living cells. Our analysis is based on fundamental laws of noise in gene and protein expression, which set limits on the energy, time, space, molecular count and part-count resources needed to compute at a given level of precision. We conclude that analog computation is significantly more efficient in its use of resources than deterministic digital computation even at relatively high levels of precision in the cell. Based on this analysis, we conclude that synthetic biology must use analog, collective analog, probabilistic and hybrid analog–digital computational approaches; otherwise, even relatively simple synthetic computations in cells such as addition will exceed energy and molecular-count budgets. We present schematics for efficiently representing analog DNA–protein computation in cells. Analog electronic flow in subthreshold transistors and analog molecular flux in chemical reactions obey Boltzmann exponential laws of thermodynamics and are described by astoundingly similar logarithmic electrochemical potentials. Therefore, cytomorphic circuits can help to map circuit designs between electronic and biochemical domains. We review recent work that uses positive-feedback linearization circuits to architect wide-dynamic-range logarithmic analog computation in Escherichia coli using three transcription factors, nearly two orders of magnitude more efficient in parts than prior digital implementations. PMID:24567476
NASA Astrophysics Data System (ADS)
Cataldo, Franca
The world is at the dawn of a third industrial revolution, the digital revolution, that brings great changes the world over. Today, computing devices, the Internet, and the World Wide Web are vital technology tools that affect every aspect of everyday life and success. While computing technologies offer enormous benefits, there are equally enormous safety and security risks that have been growing exponentially since they became widely available to the public in 1994. Cybercriminals are increasingly implementing sophisticated and serious hack attacks and breaches upon our nation's government, financial institutions, organizations, communities, and private citizens. There is a great need for computer scientists to carry America's innovation and economic growth forward and for cybersecurity professionals to keep our nation safe from criminal hacking. In this digital age, computer science and cybersecurity are essential foundational ingredients of technological innovation, economic growth, and cybersecurity that span all industries. Yet, America's K-12 education institutions are not teaching the computer science and cybersecurity skills required to produce a technologically-savvy 21st century workforce. Education is the key to preparing students to enter the workforce and, therefore, American K-12 STEM education must be reformed to accommodate the teachings required in the digital age. Keywords: Cybersecurity Education, Cybersecurity Education Initiatives, Computer Science Education, Computer Science Education Initiatives, 21 st Century K-12 STEM Education Reform, 21st Century Digital Literacies, High-Tech Innovative Problem-Solving Skills, 21st Century Digital Workforce, Standardized Testing, Foreign Language and Culture Studies, Utica College, Professor Chris Riddell.
Autonomous Telemetry Collection for Single-Processor Small Satellites
NASA Technical Reports Server (NTRS)
Speer, Dave
2003-01-01
For the Space Technology 5 mission, which is being developed under NASA's New Millennium Program, a single spacecraft processor will be required to do on-board real-time computations and operations associated with attitude control, up-link and down-link communications, science data processing, solid-state recorder management, power switching and battery charge management, experiment data collection, health and status data collection, etc. Much of the health and status information is in analog form, and each of the analog signals must be routed to the input of an analog-to-digital converter, converted to digital form, and then stored in memory. If the micro-operations of the analog data collection process are implemented in software, the processor may use up a lot of time either waiting for the analog signal to settle, waiting for the analog-to-digital conversion to complete, or servicing a large number of high frequency interrupts. In order to off-load a very busy processor, the collection and digitization of all analog spacecraft health and status data will be done autonomously by a field-programmable gate array that can configure the analog signal chain, control the analog-to-digital converter, and store the converted data in memory.
NASA Astrophysics Data System (ADS)
Chen, R.; Xi, X.; Zhao, X.; He, L.; Yao, H.; Shen, R.
2016-12-01
Dense 3D magnetotelluric (MT) data acquisition offers the benefit of suppressing static shift and topography effects and can achieve high-precision, high-resolution inversion of underground structure. This method may play an important role in mineral exploration, geothermal resources exploration, and hydrocarbon exploration. It is necessary to greatly reduce the power consumption of an MT signal receiver for large-scale 3D MT data acquisition, while using a sensor network to monitor the data quality of deployed MT receivers. We adopted a series of technologies to realize the above goal. First, we designed a low-power embedded computer which couples tightly with the other parts of the MT receiver and supports a wireless sensor network. The power consumption of our embedded computer is less than 1 watt. Then we designed a 4-channel data acquisition subsystem which supports 24-bit analog-digital conversion, GPS synchronization, and real-time digital signal processing. Furthermore, we developed the power supply and power management subsystem for the MT receiver. Finally, a suite of software supporting data acquisition, calibration, the wireless sensor network, and testing was developed. The software, which runs on a personal computer, can monitor and control over 100 MT receivers in the field for data acquisition and quality control. The total power consumption of the receiver is about 2 watts at full operation. The standby power consumption is less than 0.1 watt. Our testing showed that the MT receiver can acquire good quality data at ground with an electric dipole length as short as 3 m. Over 100 MT receivers were made and used for large-scale geothermal exploration in China with great success.
Trelease, R B
1996-01-01
Advances in computer visualization and user interface technologies have enabled development of "virtual reality" programs that allow users to perceive and to interact with objects in artificial three-dimensional environments. Such technologies were used to create an image database and program for studying the human skull, a specimen that has become increasingly expensive and scarce. Stereoscopic image pairs of a museum-quality skull were digitized from multiple views. For each view, the stereo pairs were interlaced into a single, field-sequential stereoscopic picture using an image processing program. The resulting interlaced image files are organized in an interactive multimedia program. At run-time, gray-scale 3-D images are displayed on a large-screen computer monitor and observed through liquid-crystal shutter goggles. Users can then control the program and change views with a mouse and cursor to point-and-click on screen-level control words ("buttons"). For each view of the skull, an ID control button can be used to overlay pointers and captions for important structures. Pointing and clicking on "hidden buttons" overlying certain structures triggers digitized audio spoken word descriptions or mini lectures.
Real-time classification and sensor fusion with a spiking deep belief network.
O'Connor, Peter; Neil, Daniel; Liu, Shih-Chii; Delbruck, Tobi; Pfeiffer, Michael
2013-01-01
Deep Belief Networks (DBNs) have recently shown impressive performance on a broad range of classification problems. Their generative properties allow better understanding of the performance, and provide a simpler solution for sensor fusion tasks. However, because of their inherent need for feedback and parallel update of large numbers of units, DBNs are expensive to implement on serial computers. This paper proposes a method based on the Siegert approximation for Integrate-and-Fire neurons to map an offline-trained DBN onto an efficient event-driven spiking neural network suitable for hardware implementation. The method is demonstrated in simulation and by a real-time implementation of a 3-layer network with 2694 neurons used for visual classification of MNIST handwritten digits with input from a 128 × 128 Dynamic Vision Sensor (DVS) silicon retina, and sensory-fusion using additional input from a 64-channel AER-EAR silicon cochlea. The system is implemented through the open-source software in the jAER project and runs in real-time on a laptop computer. It is demonstrated that the system can recognize digits in the presence of distractions, noise, scaling, translation and rotation, and that the degradation of recognition performance by using an event-based approach is less than 1%. Recognition is achieved in an average of 5.8 ms after the onset of the presentation of a digit. By cue integration from both silicon retina and cochlea outputs we show that the system can be biased to select the correct digit from otherwise ambiguous input.
Weaves as an Interconnection Fabric for ASIM's and Nanosatellites
NASA Technical Reports Server (NTRS)
Gorlick, Michael M.
1995-01-01
Many of the micromachines under consideration require computer support; indeed, one of the appeals of this technology is the ability to intermix mechanical, optical, analog, and digital devices on the same substrate. The amount of computer power is rarely an issue; the sticking point is the complexity of the software required to make effective use of these devices. Micromachines are the nano-technologist's equivalent of 'golden screws'. In other words, they will be piece parts in larger assemblages. For example, a nano-satellite may be composed of stacked silicon wafers where each wafer contains hundreds to thousands of micromachines, digital controllers, general purpose computers, memories, and high-speed bus interconnects. Comparatively few of these devices will be custom designed; most will be stock parts selected from libraries and catalogs. The novelty will lie in the interconnections. For example, a digital accelerometer may be a component part in an adaptive suspension, a monitoring element embedded in the wrapper of a package, or a portion of the smart skin of a launch vehicle. In each case, this device must inter-operate with other devices and probes for the purposes of command, control, and communication. We propose a software technology called 'weaves' that will permit large collections of micromachines and their attendant computers to freely intercommunicate while preserving modularity, transparency, and flexibility. Weaves are composed of networks of communicating software components. The network, and the components comprising it, may be changed even while the software, and the devices it controls, are executing. This unusual degree of software plasticity permits micromachines to dynamically adapt the software to changing conditions and allows system engineers to rapidly and inexpensively develop special purpose software by assembling stock software components in custom configurations.
Bolesta, Michael J; Winslow, Lauren; Gill, Kevin
2010-06-01
A comparison of measurements of degenerative spondylolisthesis made on film and on computer workstations. To determine whether the 2 methodologies are comparable in some of the parameters used to assess lumbar degenerative spondylolisthesis. Digital radiology has been replacing analog radiographs. In scoliosis, several studies have shown that measurements made on digital and analog films are similar and that they are also similar to those made on computer workstations. Such work has not been done in spondylolisthesis. Twenty-four cases of lumbar degenerative spondylolisthesis were identified from our clinic practice. Three observers measured anterior displacement, sagittal rotation, and lumbar lordosis on digital films using the same protractor and pencil. The same parameters were measured on the same studies at clinical workstations. All measurements were repeated 2 weeks later. A statistician determined the intra- and interobserver reliability of the 2 measurement methods and the degree of agreement between the 2 methods. The differences between the first and second readings did reach statistical significance in some cases, but none of them were large enough to be clinically meaningful. The intraclass correlation coefficients (ICCs) were ≥0.80 except for one (0.67). The difference among the 3 observers was similarly statistically significant in a few instances but not enough to influence clinical decisions, with good ICCs (0.67 and better). Similarly, the differences between the 2 methods were small, and ICCs ranged from 0.69 to 0.98. This study supports the use of computer workstation measurements in lumbar degenerative spondylolisthesis. The parameters used in this study were comparable, whether measured on film or at clinical workstations.
NASA Astrophysics Data System (ADS)
Zhang, Chun-Sen; Zhang, Meng-Meng; Zhang, Wei-Xing
2017-01-01
This paper outlines a low-cost, user-friendly photogrammetric technique using nonmetric cameras to obtain excavation site digital sequence images, based on photogrammetry and computer vision. Digital camera calibration, automatic aerial triangulation, image feature extraction, image sequence matching, and dense digital differential rectification are used, combined with a certain number of global control points of the excavation site, to reconstruct high-precision measured three-dimensional (3-D) models. Using the acrobatic figurines in the Qin Shi Huang mausoleum excavation as an example, our method solves the problems of small base-to-height ratio, high inclination, unstable altitudes, and significant ground elevation changes affecting image matching. Compared to 3-D laser scanning, the 3-D color point cloud obtained by this method maintains the same visual result and has the advantages of low project cost, simple data processing, and high accuracy. Structure-from-motion (SfM) is often used to reconstruct 3-D models of large scenes and has lower accuracy when reconstructing a 3-D model of a small scene at close range. Results indicate that this method quickly achieves 3-D reconstruction of large archaeological sites and produces orthophotos of the heritage site distribution, providing a scientific basis for accurate location of cultural relics, archaeological excavations, investigation, and site protection planning. The proposed method has comprehensive application value.
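As a generic illustration of the image feature extraction and sequence matching steps, here is a small OpenCV sketch using ORB features; the library, detector, and synthetic images are stand-ins, not the operators actually used in the paper:

```python
import cv2
import numpy as np

# Synthetic stand-ins for two overlapping excavation photographs; real input
# would be loaded from the image sequence with cv2.imread(...).
rng = np.random.default_rng(0)
img1 = (rng.random((480, 640)) * 80).astype(np.uint8)        # textured background
cv2.rectangle(img1, (120, 100), (260, 220), 255, -1)          # bright rectangular "feature"
cv2.circle(img1, (420, 320), 70, 200, -1)
img2 = np.roll(img1, 40, axis=1)                              # shifted second view

orb = cv2.ORB_create(nfeatures=5000)                          # detect/describe local features
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Match descriptors between the two views; these correspondences would feed
# triangulation / bundle adjustment in a full 3-D reconstruction pipeline.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = []
if des1 is not None and des2 is not None:
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
print(f"{len(matches)} tentative correspondences between the two views")
```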
Chen, Yue; Fang, Zhao-Xiang; Ren, Yu-Xuan; Gong, Lei; Lu, Rong-De
2015-09-20
Optical vortices are associated with a spatial phase singularity. Such a beam with a vortex is valuable in optical microscopy, hyper-entanglement, and optical levitation. In these applications, vortex beams with a perfect circle shape and a large topological charge are highly desirable. But the generation of perfect vortices with high topological charges is challenging. We present a novel method to create perfect vortex beams with large topological charges using a digital micromirror device (DMD) through binary amplitude modulation and a narrow Gaussian approximation. The DMD with binary holograms encoding both the spatial amplitude and the phase could generate fast switchable, reconfigurable optical vortex beams with significantly high quality and fidelity. With either the binary Lee hologram or the superpixel binary encoding technique, we were able to generate the corresponding hologram with high fidelity and create a perfect vortex with topological charge as large as 90. The physical properties of the perfect vortex beam produced were characterized through measurements of propagation dynamics and the focusing fields. The measurements show good consistency with the theoretical simulation. The perfect vortex beam produced satisfies high-demand utilization in optical manipulation and control, momentum transfer, quantum computing, and biophotonics.
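A small numerical sketch of the target field under the narrow-Gaussian (annular) approximation of a perfect vortex mentioned above; the grid size, ring radius, and width are arbitrary placeholders, and the binary DMD hologram encoding itself is a separate step not shown here:

```python
import numpy as np

N, charge = 1024, 90              # grid size and topological charge
r0, w = 0.6, 0.05                 # ring radius and Gaussian ring width (arbitrary units)

x = np.linspace(-1, 1, N)
X, Y = np.meshgrid(x, x)
r, theta = np.hypot(X, Y), np.arctan2(Y, X)

# Perfect-vortex approximation: a narrow Gaussian annulus carrying phase charge*theta.
field = np.exp(-((r - r0) / w) ** 2) * np.exp(1j * charge * theta)

intensity = np.abs(field) ** 2    # ring-shaped intensity, independent of the charge
phase = np.angle(field)           # phase wraps 'charge' times around the ring
print("ring radius at peak intensity ≈", float(r.flat[np.argmax(intensity)]))
```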
Computational biomedicine: a challenge for the twenty-first century.
Coveney, Peter V; Shublaq, Nour W
2012-01-01
With the relentless increase of computer power and the widespread availability of digital patient-specific medical data, we are now entering an era when it is becoming possible to develop predictive models of human disease and pathology, which can be used to support and enhance clinical decision-making. The approach amounts to a grand challenge to computational science insofar as we need to be able to provide seamless yet secure access to large-scale heterogeneous personal healthcare data in a facile way, typically integrated into complex workflows (some parts of which may need to be run on high-performance computers) and into clinical decision support software. In this paper, we review the state of the art in terms of case studies drawn from neurovascular pathologies and HIV/AIDS. These studies are representative of a large number of projects currently being performed within the Virtual Physiological Human initiative. They make demands of information technology at many scales, from the desktop to national and international infrastructures for data storage and processing, linked by high performance networks.
McDermott, Kelly; Tieu, Lina; Rios, Christina; Gibson, Eliza; Sweet, Cynthia Castro; Payne, Mike
2016-01-01
Background. The feasibility of digital health programs to prevent and manage diabetes in low-income patients has not been adequately explored. Methods. Researchers collaborated with a digital health company to adapt a diabetes prevention program for low-income prediabetes patients at a large safety net clinic. We conducted focus groups to assess patient perspectives, revised lessons for improved readability and cultural relevance to low-income and Hispanic patients, conducted a feasibility study of the adapted program in English- and Spanish-speaking cohorts, and implemented real-time adaptations to the program for commercial use and for a larger trial in multiple safety net clinics. Results. The majority of focus group participants were receptive to the program. We modified the curriculum to a 5th-grade reading level and adapted content based on patient feedback. In the feasibility study, 54% of eligible contacted patients expressed interest in enrolling (n = 23). Although some participants' computer access and literacy made registration challenging, they were highly satisfied and engaged (80% logged in at least once/week). Conclusions. Underserved prediabetic patients displayed high engagement and satisfaction with a digital diabetes prevention program despite lower digital literacy skills. The collaboration between researchers and a digital health company enabled iterative improvements in technology implementation to address challenges in low-income populations. PMID:27868070
Digital optical conversion module
Kotter, D.K.; Rankin, R.A.
1988-07-19
A digital optical conversion module used to convert an analog signal to a computer compatible digital signal including a voltage-to-frequency converter, frequency offset response circuitry, and an electrical-to-optical converter. Also used in conjunction with the digital optical conversion module is an optical link and an interface at the computer for converting the optical signal back to an electrical signal. Suitable for use in hostile environments having high levels of electromagnetic interference, the conversion module retains high resolution of the analog signal while eliminating the potential for errors due to noise and interference. The module can be used to link analog output scientific equipment such as an electrometer used with a mass spectrometer to a computer. 2 figs.
Digital optical conversion module
Kotter, Dale K.; Rankin, Richard A.
1991-02-26
A digital optical conversion module used to convert an analog signal to a computer compatible digital signal including a voltage-to-frequency converter, frequency offset response circuitry, and an electrical-to-optical converter. Also used in conjunction with the digital optical conversion module is an optical link and an interface at the computer for converting the optical signal back to an electrical signal. Suitable for use in hostile environments having high levels of electromagnetic interference, the conversion module retains high resolution of the analog signal while eliminating the potential for errors due to noise and interference. The module can be used to link analog output scientific equipment such as an electrometer used with a mass spectrometer to a computer.
Color engineering in the age of digital convergence
NASA Astrophysics Data System (ADS)
MacDonald, Lindsay W.
1998-09-01
Digital color imaging has developed over the past twenty years from specialized scientific applications into the mainstream of computing. In addition to the phenomenal growth of computer processing power and storage capacity, great advances have been made in the capabilities and cost-effectiveness of color imaging peripherals. The majority of imaging applications, including the graphic arts, video and film have made the transition from analogue to digital production methods. Digital convergence of computing, communications and television now heralds new possibilities for multimedia publishing and mobile lifestyles. Color engineering, the application of color science to the design of imaging products, is an emerging discipline that poses exciting challenges to the international color imaging community for training, research and standards.
Tree-Structured Digital Organisms Model
NASA Astrophysics Data System (ADS)
Suzuki, Teruhiko; Nobesawa, Shiho; Tahara, Ikuo
Tierra and Avida are well-known models of digital organisms. They describe a life process as a sequence of computation codes. A linear sequence model may not be the only way to describe a digital organism, though it is very simple for a computer-based model. Thus we propose a new digital organism model based on a tree structure, which is rather similar to genetic programming. With our model, a life process is a combination of various functions, as life in the real world is. This implies that our model can easily describe the hierarchical structure of life, and it can simulate evolutionary computation through mutual interaction of functions. We verified through simulations that our model can be regarded as a digital organism model according to its definitions. Our model even succeeded in creating species such as viruses and parasites.
Digital computer processing of peach orchard multispectral aerial photography
NASA Technical Reports Server (NTRS)
Atkinson, R. J.
1976-01-01
Several methods of analysis using digital computers that are applicable to digitized multispectral aerial photography are described, with particular application to peach orchard test sites. This effort was stimulated by the recent premature death of peach trees in the Southeastern United States. The techniques discussed are: (1) correction of intensity variations by digital filtering, (2) automatic detection and enumeration of trees in five size categories, (3) determination of unhealthy foliage by infrared reflectances, and (4) four-band multispectral classification into healthy and declining categories.
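As one common way to implement technique (3), a normalized difference of near-infrared and red reflectances can flag declining foliage; the sketch below is offered only as an illustration under that assumption, not as the paper's actual procedure, and the threshold and arrays are placeholders:

```python
import numpy as np

# red and nir stand in for co-registered digitized bands of the orchard photograph;
# real data would come from the scanned multispectral film.
rng = np.random.default_rng(0)
red = rng.random((512, 512))
nir = rng.random((512, 512))

ndvi = (nir - red) / (nir + red + 1e-9)   # normalized difference vegetation index
declining = ndvi < 0.3                    # threshold is an arbitrary placeholder
print("fraction of pixels flagged as declining:", declining.mean())
```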
Pilot Study of Bovine Interdigital Cassetteless Computed Radiography
EL-SHAFAEY, El-Sayed Ahmed Awad; AOKI, Takahiro; ISHII, Mitsuo; YAMADA, Kazutaka
2013-01-01
Twenty-one limbs of bovine cadavers (42 digits) were exposed to an interdigital cassetteless imaging plate using computed radiography. The radiographic findings included exostosis, a rough planta surface, osteolysis of the apex of the distal phalanx and widening of the laminar zone between the distal phalanx and the hoof wall. All these findings were confirmed by computed tomography. The hindlimbs (19 digits) showed more changes than the forelimbs (10 digits), particularly in the lateral distal phalanx. The cassetteless computed radiography technique is expected to be an easily applicable method for the distal phalanx rather than a conventional cassette-plate and/or the film-screen cassetteless methods. PMID:23782542
From transistor to trapped-ion computers for quantum chemistry.
Yung, M-H; Casanova, J; Mezzacapo, A; McClean, J; Lamata, L; Aspuru-Guzik, A; Solano, E
2014-01-07
Over the last few decades, quantum chemistry has progressed through the development of computational methods based on modern digital computers. However, these methods can hardly fulfill the exponentially-growing resource requirements when applied to large quantum systems. As pointed out by Feynman, this restriction is intrinsic to all computational models based on classical physics. Recently, the rapid advancement of trapped-ion technologies has opened new possibilities for quantum control and quantum simulations. Here, we present an efficient toolkit that exploits both the internal and motional degrees of freedom of trapped ions for solving problems in quantum chemistry, including molecular electronic structure, molecular dynamics, and vibronic coupling. We focus on applications that go beyond the capacity of classical computers, but may be realizable on state-of-the-art trapped-ion systems. These results allow us to envision a new paradigm of quantum chemistry that shifts from the current transistor to a near-future trapped-ion-based technology.
The NASA computer aided design and test system
NASA Technical Reports Server (NTRS)
Gould, J. M.; Juergensen, K.
1973-01-01
A family of computer programs facilitating the design, layout, evaluation, and testing of digital electronic circuitry is described. CADAT (computer aided design and test system) is intended for use by NASA and its contractors and is aimed predominantly at providing cost effective microelectronic subsystems based on custom designed metal oxide semiconductor (MOS) large scale integrated circuits (LSIC's). CADAT software can be easily adopted by installations with a wide variety of computer hardware configurations. Its structure permits ease of update to more powerful component programs and to newly emerging LSIC technologies. The components of the CADAT system are described stressing the interaction of programs rather than detail of coding or algorithms. The CADAT system provides computer aids to derive and document the design intent, includes powerful automatic layout software, permits detailed geometry checks and performance simulation based on mask data, and furnishes test pattern sequences for hardware testing.
From transistor to trapped-ion computers for quantum chemistry
Yung, M.-H.; Casanova, J.; Mezzacapo, A.; McClean, J.; Lamata, L.; Aspuru-Guzik, A.; Solano, E.
2014-01-01
Over the last few decades, quantum chemistry has progressed through the development of computational methods based on modern digital computers. However, these methods can hardly fulfill the exponentially-growing resource requirements when applied to large quantum systems. As pointed out by Feynman, this restriction is intrinsic to all computational models based on classical physics. Recently, the rapid advancement of trapped-ion technologies has opened new possibilities for quantum control and quantum simulations. Here, we present an efficient toolkit that exploits both the internal and motional degrees of freedom of trapped ions for solving problems in quantum chemistry, including molecular electronic structure, molecular dynamics, and vibronic coupling. We focus on applications that go beyond the capacity of classical computers, but may be realizable on state-of-the-art trapped-ion systems. These results allow us to envision a new paradigm of quantum chemistry that shifts from the current transistor to a near-future trapped-ion-based technology. PMID:24395054
Performance analysis of a large-grain dataflow scheduling paradigm
NASA Technical Reports Server (NTRS)
Young, Steven D.; Wills, Robert W.
1993-01-01
A paradigm for scheduling computations on a network of multiprocessors using large-grain data flow scheduling at run time is described and analyzed. The computations to be scheduled must follow a static flow graph, while the schedule itself will be dynamic (i.e., determined at run time). Many applications characterized by static flow exist, and they include real-time control and digital signal processing. With the advent of computer-aided software engineering (CASE) tools for capturing software designs in dataflow-like structures, macro-dataflow scheduling becomes increasingly attractive, if not necessary. For parallel implementations, using the macro-dataflow method allows the scheduling to be insulated from the application designer and enables the maximum utilization of available resources. Further, by allowing multitasking, processor utilizations can approach 100 percent while they maintain maximum speedup. Extensive simulation studies are performed on 4-, 8-, and 16-processor architectures that reflect the effects of communication delays, scheduling delays, algorithm class, and multitasking on performance and speedup gains.
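A toy illustration of run-time large-grain dataflow scheduling on a static flow graph; the graph, task times, and processor count are invented for the example, and the paper's simulator is considerably more detailed:

```python
import heapq

# Static flow graph: node -> (execution time, list of successor nodes).
graph = {
    "read":   (1.0, ["filter"]),
    "filter": (2.0, ["fft", "gain"]),
    "fft":    (3.0, ["merge"]),
    "gain":   (1.5, ["merge"]),
    "merge":  (0.5, []),
}
indegree = {n: 0 for n in graph}
for _, succs in graph.values():
    for s in succs:
        indegree[s] += 1

processors = 2
ready = [n for n, d in indegree.items() if d == 0]   # nodes whose inputs are all present
running = []                                         # heap of (finish_time, node)
clock = 0.0
while ready or running:
    # Dispatch ready nodes onto free processors (a greedy run-time decision).
    while ready and len(running) < processors:
        node = ready.pop()
        heapq.heappush(running, (clock + graph[node][0], node))
    clock, done = heapq.heappop(running)
    print(f"t={clock:4.1f}: {done} finished")
    for s in graph[done][1]:                         # successors may now become ready
        indegree[s] -= 1
        if indegree[s] == 0:
            ready.append(s)
```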
A 21st Century Science, Technology, and Innovation Strategy for Americas National Security
2016-05-01
areas. Advanced Computing and Communications: The exponential growth of the digital economy, driven by ubiquitous computing and communication... weapons-focused R&D, many of the capabilities being developed have significant dual-use potential. Digital connectivity, for instance, brings... scale than traditional recombinant DNA techniques, and to share these designs digitally. Nanotechnology promises the ability to engineer entirely
ERIC Educational Resources Information Center
Watkins, Ryan; Engel, Laura C.; Hastedt, Dirk
2015-01-01
The rise of digital information and communication technologies (ICT) has made the acquisition of computer and information literacy (CIL) a leading factor in creating an engaged, informed, and employable citizenry. However, are young people, often described as "digital natives" or the "net generation," developing the necessary…
US GeoData: Digital cartographic and geographic data
,
1985-01-01
The increasing use of computers for storing and analyzing earth science information has sparked a growth in the demand for various types of cartographic data in digital form. The production of map data in computerized form is called digital cartography, and it involves the collection, storage, processing, analysis, and display of map data with the aid of computers. The U.S. Geological Survey, the Nation's largest earth science research agency, has expanded its national mapping program to incorporate operations associated with digital cartography, including the collection of planimetric, elevation, and geographic names information in digital form. This digital information is available for use in meeting the multipurpose needs and applications of the map user community.
NASA Astrophysics Data System (ADS)
Zhao, Ziyue; Gan, Xiaochuan; Zou, Zhi; Ma, Liqun
2018-01-01
The dynamic envelope measurement plays a very important role in the external dimension design of high-speed trains. At present there is no digital measurement system to solve this problem. This paper develops an optoelectronic measurement system using monocular digital cameras, and presents research on the measurement theory, visual target design, calibration algorithm design, software programming and so on. The system consists of several CMOS digital cameras, several luminous targets for measuring, a scale bar, data processing software and a terminal computer. The system has such advantages as large measurement scale, a high degree of automation, strong anti-interference ability, noise rejection and real-time measurement. In this paper, we address key technologies such as the transfer, storage and processing of the high-resolution digital images from multiple cameras. The experimental data show that the repeatability of the system is within 0.02 mm and the distance error of the system is within 0.12 mm in the whole workspace. The experiment verified the rationality of the system scheme and the correctness, precision and effectiveness of the relevant methods.
ATCA digital controller hardware for vertical stabilization of plasmas in tokamaks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Batista, A. J. N.; Sousa, J.; Varandas, C. A. F.
2006-10-15
The efficient vertical stabilization (VS) of plasmas in tokamaks requires a fast reaction of the VS controller, for example, after detection of edge localized modes (ELMs). For controlling the effects of very large ELMs, a new digital control hardware, based on the Advanced Telecommunications Computing Architecture (ATCA), is being developed, aiming to reduce the VS digital control loop cycle (down to an optimal value of 10 µs) and improve the algorithm performance. The system has 1 ATCA processor module and up to 12 ATCA control modules, each one with 32 analog input channels (12-bit resolution), 4 analog output channels (12-bit resolution), and 8 digital input/output channels. The Aurora and PCI Express communication protocols will be used for data transport between modules, with expected latencies below 2 µs. Control algorithms are implemented on an ix86-based processor with 6 Gflops and on field programmable gate arrays with 80 GMACS, interconnected by serial gigabit links in a full mesh topology.
Plans for wind energy system simulation
NASA Technical Reports Server (NTRS)
Dreier, M. E.
1978-01-01
A digital computer code and a special purpose hybrid computer were introduced. The digital computer program, the Root Perturbation Method (RPM), is an implementation of the classic Floquet procedure which circumvents numerical problems associated with the extraction of Floquet roots. The hybrid computer, the Wind Energy System Time-domain simulator (WEST), yields real-time loads and deformation information essential to design and system stability investigations.
Lee, W Anthony
2007-01-01
The gold standard for preoperative evaluation of an aortic aneurysm is a computed tomography angiogram (CTA). Three-dimensional reconstruction and analysis of the computed tomography data set is enormously helpful, and even sometimes essential, in proper sizing and planning for endovascular stent graft repair. To a large extent, it has obviated the need for conventional angiography for morphologic evaluation. The TeraRecon Aquarius workstation (San Mateo, Calif) represents a highly sophisticated but user-friendly platform utilizing a combination of task-specific hardware and software specifically designed to rapidly manipulate large Digital Imaging and Communications in Medicine (DICOM) data sets and provide surface-shaded and multiplanar renderings in real-time. This article discusses the basics of sizing and planning for endovascular abdominal aortic aneurysm repair and the role of 3-dimensional analysis using the TeraRecon workstation.
The big data challenges of connectomics
Lichtman, Jeff W; Pfister, Hanspeter; Shavit, Nir
2015-01-01
The structure of the nervous system is extraordinarily complicated because individual neurons are interconnected to hundreds or even thousands of other cells in networks that can extend over large volumes. Mapping such networks at the level of synaptic connections, a field called connectomics, began in the 1970s with the study of the small nervous system of a worm and has recently garnered general interest thanks to technical and computational advances that automate the collection of electron-microscopy data and offer the possibility of mapping even large mammalian brains. However, modern connectomics produces ‘big data’, unprecedented quantities of digital information at unprecedented rates, and will require, as with genomics at the time, breakthrough algorithmic and computational solutions. Here we describe some of the key difficulties that may arise and provide suggestions for managing them. PMID:25349911
The big data challenges of connectomics
Lichtman, Jeff W.; Pfister, Hanspeter; Shavit, Nir
2014-10-28
The structure of the nervous system is extraordinarily complicated because individual neurons are interconnected to hundreds or even thousands of other cells in networks that can extend over large volumes. Mapping such networks at the level of synaptic connections, a field called connectomics, began in the 1970s with the study of the small nervous system of a worm and has recently garnered general interest thanks to technical and computational advances that automate the collection of electron-microscopy data and offer the possibility of mapping even large mammalian brains. However, modern connectomics produces ‘big data’, unprecedented quantities of digital information at unprecedented rates, and will require, as with genomics at the time, breakthrough algorithmic and computational solutions. Here in this paper we describe some of the key difficulties that may arise and provide suggestions for managing them.
The big data challenges of connectomics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lichtman, Jeff W.; Pfister, Hanspeter; Shavit, Nir
The structure of the nervous system is extraordinarily complicated because individual neurons are interconnected to hundreds or even thousands of other cells in networks that can extend over large volumes. Mapping such networks at the level of synaptic connections, a field called connectomics, began in the 1970s with the study of the small nervous system of a worm and has recently garnered general interest thanks to technical and computational advances that automate the collection of electron-microscopy data and offer the possibility of mapping even large mammalian brains. However, modern connectomics produces ‘big data’, unprecedented quantities of digital information at unprecedented rates, and will require, as with genomics at the time, breakthrough algorithmic and computational solutions. Here in this paper we describe some of the key difficulties that may arise and provide suggestions for managing them.
Flight experience with flight control redundancy management
NASA Technical Reports Server (NTRS)
Szalai, K. J.; Larson, R. R.; Glover, R. D.
1980-01-01
Flight experience with both current and advanced redundancy management schemes was gained in recent flight research programs using the F-8 digital fly-by-wire aircraft. The flight performance of fault detection, isolation, and reconfiguration (FDIR) methods for sensors, computers, and actuators is reviewed. Results of induced failures as well as of actual random failures are discussed. Deficiencies in modeling and implementation techniques are also discussed. The paper also presents comparisons of multisensor tracking in smooth air, in turbulence, during large maneuvers, and during maneuvers typical of those of large commercial transport aircraft. The results of flight tests of an advanced analytic redundancy management algorithm are compared with the performance of a contemporary algorithm in terms of time to detection, false alarms, and missed alarms. The performance of computer redundancy management in both iron bird and flight tests is also presented.
NASA Astrophysics Data System (ADS)
Cook, Perry
This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.). Although most people would think that analog synthesizers and electronic music substantially predate the use of computers in music, many experiments and complete computer music systems were being constructed and used as early as the 1950s.
NASA Technical Reports Server (NTRS)
Greene, P. H.
1972-01-01
Both in practical engineering and in control of muscular systems, low level subsystems automatically provide crude approximations to the proper response. Through low level tuning of these approximations, the proper response variant can emerge from standardized high level commands. Such systems are expressly suited to emerging large scale integrated circuit technology. A computer, using symbolic descriptions of subsystem responses, can select and shape responses of low level digital or analog microcircuits. A mathematical theory that reveals significant informational units in this style of control and software for realizing such information structures are formulated.
On the Need for Research Evidence to Guide the Design of Computer Games for Learning
ERIC Educational Resources Information Center
Mayer, Richard E.
2015-01-01
Computer games for learning (also called video games or digital games) have potential to improve education. This is the intriguing idea that motivates this special issue of the "Educational Psychologist" on "Psychological Perspectives on Digital Games and Learning." Computer games for learning are games delivered via computer…
A Method for Identifying Contours in Processing Digital Images from Computer Tomograph
NASA Astrophysics Data System (ADS)
Roşu, Şerban; Pater, Flavius; Costea, Dan; Munteanu, Mihnea; Roşu, Doina; Fratila, Mihaela
2011-09-01
The first step in digital processing of two-dimensional computed tomography images is to identify the contour of component elements. This paper deals with the collective work of specialists in medicine and applied mathematics in computer science on elaborating new algorithms and methods in medical 2D and 3D imagery.
Mental Computation or Standard Algorithm? Children's Strategy Choices on Multi-Digit Subtractions
ERIC Educational Resources Information Center
Torbeyns, Joke; Verschaffel, Lieven
2016-01-01
This study analyzed children's use of mental computation strategies and the standard algorithm on multi-digit subtractions. Fifty-eight Flemish 4th graders of varying mathematical achievement level were individually offered subtractions that either stimulated the use of mental computation strategies or the standard algorithm in one choice and two…
Proceedings of the Fourth Annual Workshop on the Use of Digital Computers in Process Control.
ERIC Educational Resources Information Center
Smith, Cecil L., Ed.
Contents: Computer hardware testing (results of vendor-user interaction); CODIL (a new language for process control programing); the design and implementation of control systems utilizing CRT display consoles; the systems contractor - valuable professional or unnecessary middle man; power station digital computer applications; from inspiration to…
A Digital Computer Simulation of Cardiovascular and Renal Physiology.
ERIC Educational Resources Information Center
Tidball, Charles S.
1979-01-01
Presents the physiological MACPEE, one of a family of digital computer simulations used in Canada and Great Britain. A general description of the model is provided, along with a sample of computer output format, options for making interventions, advanced capabilities, an evaluation, and technical information for running a MAC model. (MA)
NASA Astrophysics Data System (ADS)
Xiong, Ting; He, Zhiwen
2017-06-01
Cloud computing was first proposed by Google in the United States as an Internet-centred way of providing standard, open network sharing services. With the rapid development of higher education in China, the educational resources that colleges and universities can provide fall well short of actual teaching needs. Cloud computing, which uses Internet technology to provide shared services, has therefore become an important means of sharing digital educational resources in current higher education. Working in a cloud computing environment, this paper analyses the existing problems in the sharing of digital educational resources among the independent colleges of Jiangxi Province. Drawing on cloud computing's characteristics of mass storage, efficient operation, and low cost, the authors explore and study the design of a sharing model for the digital educational resources of higher education in independent colleges. Finally, the model is put into practical application.
Digital model as an alternative to plaster model in assessment of space analysis
Kumar, A. Anand; Phillip, Abraham; Kumar, Sathesh; Rawat, Anuradha; Priya, Sakthi; Kumaran, V.
2015-01-01
Introduction: Digital three-dimensional models are widely used for orthodontic diagnosis. The purpose of this study was to appraise the accuracy of digital models obtained from computer-aided design/computer-aided manufacturing (CAD/CAM) and cone-beam computed tomography (CBCT) for tooth-width measurements and the Bolton analysis. Materials and Methods: Digital models (CAD/CAM, CBCT) and plaster models were made for each of 50 subjects. Tooth-width measurements on the digital models (CAD/CAM, CBCT) were compared with those on the corresponding plaster models. The anterior and overall Bolton ratios were calculated for each participant and for each method. The paired t-test was applied to determine validity. Results: Tooth-width measurements and the anterior and overall Bolton ratios of the CAD/CAM and CBCT digital models did not differ significantly from those on the plaster models. Conclusion: Both CBCT and CAD/CAM are therefore reliable, promising techniques that can replace plaster models, given their considerable advantages. PMID:26538899
NASA Astrophysics Data System (ADS)
Zacharovas, Stanislovas; Nikolskij, Andrej; Kuchin, Jevgenij
2011-02-01
We have created a programming tool that uses image data from a webcam connected to a personal computer and gives the user the ability to preview the future digital hologram on screen before sending the video data to a holographic printing company. To print a digital hologram, one needs a sequence of images of the same scene taken from different angles, and nowadays web cameras, whether stand-alone or incorporated into a mobile computer, can be an acceptable source of such image sequences. In this article we describe this DIY holographic imaging process in detail.
Near real-time digital holographic microscope based on GPU parallel computing
NASA Astrophysics Data System (ADS)
Zhu, Gang; Zhao, Zhixiong; Wang, Huarui; Yang, Yan
2018-01-01
A transmission near real-time digital holographic microscope with in-line and off-axis light paths is presented, in which digital holographic microscopy is combined with parallel computing based on the compute unified device architecture (CUDA). Compared to other holographic microscopes, which have to perform reconstruction in multiple focal planes and are therefore time-consuming, the reconstruction speed of the near real-time digital holographic microscope can be greatly improved with CUDA-based parallel computing, so it is especially suitable for measurements of particle fields at micrometer and nanometer scales. Simulations and experiments show that the proposed transmission digital holographic microscope can accurately measure and display the velocity of a particle field at micrometer scale, with an average velocity error lower than 10%. With graphics processing units (GPUs), the computing time for 100 reconstruction planes (512×512 grids) is below 120 ms, compared with 4.9 s for the traditional CPU-based reconstruction method, a speed-up of about 40 times. In other words, the system can handle holograms at 8.3 frames per second, and near real-time measurement and display of the particle velocity field are realized. Real-time three-dimensional reconstruction of the particle velocity field is expected to be achieved by further optimization of software and hardware.
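For readers who want to see the reconstruction step that is being parallelized, the following is a minimal single-plane angular spectrum propagation written in NumPy. The hologram array, wavelength, pixel pitch, and plane spacing are illustrative assumptions only; the CUDA implementation described above distributes many such planes across GPU threads rather than looping in Python.

import numpy as np

def angular_spectrum_reconstruct(hologram, wavelength, pitch, z):
    """Propagate a recorded hologram to a single focal plane z (angular spectrum method)."""
    ny, nx = hologram.shape
    fx = np.fft.fftfreq(nx, d=pitch)          # spatial frequencies (1/m)
    fy = np.fft.fftfreq(ny, d=pitch)
    FX, FY = np.meshgrid(fx, fy)
    # Free-space transfer function; evanescent components are suppressed.
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    H = np.exp(1j * 2 * np.pi / wavelength * z * np.sqrt(np.maximum(arg, 0.0)))
    H[arg < 0] = 0.0
    return np.fft.ifft2(np.fft.fft2(hologram) * H)

# Illustrative parameters (assumed): 632.8 nm laser, 3.45 um pixels, 50 um plane spacing.
holo = np.random.rand(512, 512)               # stand-in for a recorded hologram
planes = [np.abs(angular_spectrum_reconstruct(holo, 632.8e-9, 3.45e-6, k * 50e-6))
          for k in range(100)]                 # the per-plane loop the paper offloads to the GPU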
Metasurface optics for full-color computational imaging.
Colburn, Shane; Zhan, Alan; Majumdar, Arka
2018-02-01
Conventional imaging systems comprise large and expensive optical components that successively mitigate aberrations. Metasurface optics offers a route to miniaturize imaging systems by replacing bulky components with flat and compact implementations. The diffractive nature of these devices, however, induces severe chromatic aberrations, and current multiwavelength and narrowband achromatic metasurfaces cannot support full visible spectrum imaging (400 to 700 nm). We combine principles of both computational imaging and metasurface optics to build a system with a single metalens of numerical aperture ~0.45, which generates in-focus images under white light illumination. Our metalens exhibits a spectrally invariant point spread function that enables computational reconstruction of captured images with a single digital filter. This work connects computational imaging and metasurface optics and demonstrates the capabilities of combining these disciplines by simultaneously reducing aberrations and downsizing imaging systems using simpler optics.
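The abstract does not specify the single digital filter used for reconstruction, but a spectrally invariant point spread function makes one-filter deconvolution straightforward; a generic Wiener filter along the following lines is one plausible sketch (the PSF, noise-to-signal ratio, and input image are placeholders, not the authors' method).

import numpy as np

def wiener_deconvolve(blurred, psf, nsr=1e-2):
    """Recover an image from a shift-invariant blur with a single frequency-domain filter."""
    H = np.fft.fft2(np.fft.ifftshift(psf), s=blurred.shape)
    G = np.fft.fft2(blurred)
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)   # the single digital filter
    return np.real(np.fft.ifft2(W * G))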
NASA Technical Reports Server (NTRS)
Rickman, Douglas
2008-01-01
Remote sensing is measuring something without touching it. Most methods measure a portion of the electro-magnetic spectrum using energy reflected from or emitted by a material. Moving the instrument away makes it easier to see more at one time. Airplanes are good but satellites are much better. Many things can not be easily measured on the scale of an individual person. Example - measuring all the vegetation growing at one time in even the smallest country. A satellite can see things over large areas repeatedly and in a consistent way. Data from the detector is reported as digital values for a grid that covers some portion of the Earth. Because it is digital and consistent a computer can extract information or enhance the data for a specific purpose.
Low data rate digital space communications
NASA Technical Reports Server (NTRS)
Chen, C. H.
1973-01-01
The low available transmitter power and the large frequency uncertainty constrain the data rate to be low. An all-digital communication receiver is proposed, and its feasibility is established. Although coherent systems should be used whenever practical, the noncoherent MFSK system is more suitable for very low data rates. The effect of Rician fading on the performance of MFSK receiver is studied. Fading characteristics of the Venus channel are examined based on the exponential model and available experimental data on the Venus atmosphere. Because of the requirement of high communication efficiency, three codes are evaluated and compared. The rapidly varying phase error at low data rate has great effects on the tracking loop behaviors which are examined by extensive computer study of the phase plane trajectories.
NASA Astrophysics Data System (ADS)
Ceres, M.; Heselton, L. R., III
1981-11-01
This manual describes the computer programs for the FIREFINDER Digital Topographic Data Verification-Library-Dubbing System (FFDTDVLDS), and will assist in the maintenance of these programs. The manual contains detailed flow diagrams and associated descriptions for each computer program routine and subroutine. Complete computer program listings are also included. This information should be used when changes are made in the computer programs. The operating system has been designed to minimize operator intervention.
NASA Space Engineering Research Center for VLSI systems design
NASA Technical Reports Server (NTRS)
1991-01-01
This annual review reports the center's activities and findings on very large scale integration (VLSI) systems design for 1990, including project status, financial support, publications, the NASA Space Engineering Research Center (SERC) Symposium on VLSI Design, research results, and outreach programs. Processor chips completed or under development are listed. Research results summarized include a design technique to harden complementary metal oxide semiconductors (CMOS) memory circuits against single event upset (SEU); improved circuit design procedures; and advances in computer aided design (CAD), communications, computer architectures, and reliability design. Also described is a high school teacher program that exposes teachers to the fundamentals of digital logic design.
A FORTRAN program for determining aircraft stability and control derivatives from flight data
NASA Technical Reports Server (NTRS)
Maine, R. E.; Iliff, K. W.
1975-01-01
A digital computer program written in FORTRAN IV for the estimation of aircraft stability and control derivatives is presented. The program uses a maximum likelihood estimation method, and two associated programs for routine, related data handling are also included. The three programs form a package that can be used by relatively inexperienced personnel to process large amounts of data with a minimum of manpower. This package was used to successfully analyze 1500 maneuvers on 20 aircraft, and is designed to be used without modification on as many types of computers as feasible. Program listings and sample check cases are included.
Secure public cloud platform for medical images sharing.
Pan, Wei; Coatrieux, Gouenou; Bouslimi, Dalel; Prigent, Nicolas
2015-01-01
Cloud computing promises medical imaging services offering large storage and computing capabilities for limited costs. In this data outsourcing framework, one of the greatest issues to deal with is data security. To do so, we propose to secure a public cloud platform devoted to medical image sharing by defining and deploying a security policy so as to control various security mechanisms. This policy stands on a risk assessment we conducted so as to identify security objectives with a special interest for digital content protection. These objectives are addressed by means of different security mechanisms like access and usage control policy, partial-encryption and watermarking.
WE-AB-303-09: Rapid Projection Computations for On-Board Digital Tomosynthesis in Radiation Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iliopoulos, AS; Sun, X; Pitsianis, N
2015-06-15
Purpose: To facilitate fast and accurate iterative volumetric image reconstruction from limited-angle on-board projections. Methods: Intrafraction motion hinders the clinical applicability of modern radiotherapy techniques, such as lung stereotactic body radiation therapy (SBRT). The LIVE system may impact clinical practice by recovering volumetric information via Digital Tomosynthesis (DTS), thus entailing low time and radiation dose for image acquisition during treatment. The DTS is estimated as a deformation of prior CT via iterative registration with on-board images; this shifts the challenge to the computational domain, owing largely to repeated projection computations across iterations. We address this issue by composing efficient digital projection operators from their constituent parts. This allows us to separate the static (projection geometry) and dynamic (volume/image data) parts of projection operations by means of pre-computations, enabling fast on-board processing, while also relaxing constraints on underlying numerical models (e.g. regridding interpolation kernels). Further decoupling the projectors into simpler ones ensures the incurred memory overhead remains low, within the capacity of a single GPU. These operators depend only on the treatment plan and may be reused across iterations and patients. The dynamic processing load is kept to a minimum and maps well to the GPU computational model. Results: We have integrated efficient, pre-computable modules for volumetric ray-casting and FDK-based back-projection with the LIVE processing pipeline. Our results show a 60x acceleration of the DTS computations, compared to the previous version, using a single GPU; presently, reconstruction is attained within a couple of minutes. The present implementation allows for significant flexibility in terms of the numerical and operational projection model; we are investigating the benefit of further optimizations and accurate digital projection sub-kernels. Conclusion: Composable projection operators constitute a versatile research tool which can greatly accelerate iterative registration algorithms and may be conducive to the clinical applicability of LIVE. National Institutes of Health Grant No. R01-CA184173; GPU donation by NVIDIA Corporation.
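A simplified way to see the static/dynamic split described above is to precompute the projection geometry once as a sparse matrix and reuse it every iteration, so that only a matrix-vector product remains in the inner loop. The sketch below (NumPy/SciPy, with a toy pixel-driven parallel-beam weighting) is an assumption-laden illustration of that idea, not the LIVE pipeline itself.

import numpy as np
from scipy.sparse import coo_matrix

def build_projector(n, angles_deg, n_det):
    """Precompute a toy parallel-beam projection matrix.
    Static part: depends only on geometry, so it is built once and reused."""
    ys, xs = np.mgrid[0:n, 0:n]
    xc, yc = xs - (n - 1) / 2.0, ys - (n - 1) / 2.0
    rows, cols, vals = [], [], []
    for a, ang in enumerate(np.deg2rad(angles_deg)):
        t = xc * np.cos(ang) + yc * np.sin(ang)          # signed detector coordinate
        det = np.clip(np.round(t + n_det / 2).astype(int), 0, n_det - 1)
        rows.append(a * n_det + det.ravel())
        cols.append(np.arange(n * n))
        vals.append(np.ones(n * n))
    return coo_matrix((np.concatenate(vals),
                       (np.concatenate(rows), np.concatenate(cols))),
                      shape=(len(angles_deg) * n_det, n * n)).tocsr()

A = build_projector(n=64, angles_deg=np.linspace(0, 40, 11), n_det=96)  # built once, off-line
volume = np.random.rand(64 * 64)       # dynamic part: changes every iteration
projection = A @ volume                # fast forward projection in the inner loop
backproj = A.T @ projection            # matched back-projection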
Modified-Signed-Digit Optical Computing Using Fan-Out
NASA Technical Reports Server (NTRS)
Liu, Hua-Kuang; Zhou, Shaomin; Yeh, Pochi
1996-01-01
Experimental optical computing system containing optical fan-out elements implements modified signed-digit (MSD) arithmetic and logic. In comparison with previous optical implementations of MSD arithmetic, this one characterized by larger throughput, greater flexibility, and simpler optics.
Precision digital control systems
NASA Astrophysics Data System (ADS)
Vyskub, V. G.; Rozov, B. S.; Savelev, V. I.
This book is concerned with the characteristics of digital control systems of great accuracy. A classification of such systems is considered along with aspects of stabilization, programmable control applications, digital tracking systems and servomechanisms, and precision systems for the control of a scanning laser beam. Other topics explored are related to systems of proportional control, linear devices and methods for increasing precision, approaches for further decreasing the response time in the case of high-speed operation, possibilities for the implementation of a logical control law, and methods for the study of precision digital control systems. A description is presented of precision automatic control systems which make use of electronic computers, taking into account the existing possibilities for an employment of computers in automatic control systems, approaches and studies required for including a computer in such control systems, and an analysis of the structure of automatic control systems with computers. Attention is also given to functional blocks in the considered systems.
Intelligent Agents for the Digital Battlefield
1998-11-01
specific outcome of our long term research will be the development of a collaborative agent technology system, CATS, that will provide the underlying...software infrastructure needed to build large, heterogeneous, distributed agent applications. CATS will provide a software environment through which multiple...intelligent agents may interact with other agents, both human and computational. In addition, CATS will contain a number of intelligent agent components that will be useful for a wide variety of applications.
The future of consumer cameras
NASA Astrophysics Data System (ADS)
Battiato, Sebastiano; Moltisanti, Marco
2015-03-01
In the last two decades multimedia devices, and in particular imaging devices (camcorders, tablets, mobile phones, etc.), have spread dramatically. Moreover, their increasing computational performance, combined with higher storage capacity, allows them to process large amounts of data. In this paper an overview of current trends in the consumer camera market and technology is given, with some details about the recent past (from the digital still camera up to today) and forthcoming key issues.
NASA Technical Reports Server (NTRS)
Jeun, B. H.; Barger, G. L.
1977-01-01
A data base of synoptic meteorological information was compiled for the People's Republic of China, as an integral part of the Large Area Crop Inventory Experiment. A system description is provided, including hardware and software specifications, computation algorithms and an evaluation of output validity. Operations are also outlined, with emphasis placed on least squares interpolation.
NASA Technical Reports Server (NTRS)
Shiva, S. G.
1978-01-01
Several high level languages which evolved over the past few years for describing and simulating the structure and behavior of digital systems on digital computers are assessed. The characteristics of the four prominent languages (CDL, DDL, AHPL, ISP) are summarized. A criterion for selecting a suitable hardware description language for use in an automatic integrated circuit design environment is provided.
Trend Alert: A History Teacher's Guide to Using Podcasts in the Classroom
ERIC Educational Resources Information Center
Swan, Kathleen Owings; Hofer, Mark
2009-01-01
A "podcast" (an amalgam of the word broadcast and the iPod digital audio player) is essentially a broadcast of digital audio files on the web that users can listen to on their computer or digital audio player (e.g., iPod). Podcasts can be automatically delivered to an iPod or computer whenever new content is available. This unique feature of…
Histopathological Image Analysis: A Review
Gurcan, Metin N.; Boucheron, Laura; Can, Ali; Madabhushi, Anant; Rajpoot, Nasir; Yener, Bulent
2010-01-01
Over the past decade, dramatic increases in computational power and improvement in image analysis algorithms have allowed the development of powerful computer-assisted analytical approaches to radiological data. With the recent advent of whole slide digital scanners, tissue histopathology slides can now be digitized and stored in digital image form. Consequently, digitized tissue histopathology has now become amenable to the application of computerized image analysis and machine learning techniques. Analogous to the role of computer-assisted diagnosis (CAD) algorithms in medical imaging to complement the opinion of a radiologist, CAD algorithms have begun to be developed for disease detection, diagnosis, and prognosis prediction to complement the opinion of the pathologist. In this paper, we review the recent state-of-the-art CAD technology for digitized histopathology. This paper also briefly describes the development and application of novel image analysis technology for a few specific histopathology related problems being pursued in the United States and Europe. PMID:20671804
Tapie, L; Lebon, N; Mawussi, B; Fron Chabouis, H; Duret, F; Attal, J-P
2015-01-01
As digital technology infiltrates every area of daily life, including the field of medicine, so it is increasingly being introduced into dental practice. Apart from chairside practice, computer-aided design/computer-aided manufacturing (CAD/CAM) solutions are available for creating inlays, crowns, fixed partial dentures (FPDs), implant abutments, and other dental prostheses. CAD/CAM dental solutions can be considered a chain of digital devices and software for the almost automatic design and creation of dental restorations. However, dentists who want to use the technology often do not have the time or knowledge to understand it. A basic knowledge of the CAD/CAM digital workflow for dental restorations can help dentists to grasp the technology and purchase a CAM/CAM system that meets the needs of their office. This article provides a computer-science and mechanical-engineering approach to the CAD/CAM digital workflow to help dentists understand the technology.
A Medical Image Backup Architecture Based on a NoSQL Database and Cloud Computing Services.
Santos Simões de Almeida, Luan Henrique; Costa Oliveira, Marcelo
2015-01-01
The use of digital systems for storing medical images generates a huge volume of data. Digital images are commonly stored and managed on a Picture Archiving and Communication System (PACS), under the DICOM standard. However, PACS is limited because it is strongly dependent on the server's physical space. Alternatively, Cloud Computing arises as an extensive, low cost, and reconfigurable resource. However, medical images contain patient information that can not be made available in a public cloud. Therefore, a mechanism to anonymize these images is needed. This poster presents a solution for this issue by taking digital images from PACS, converting the information contained in each image file to a NoSQL database, and using cloud computing to store digital images.
Automated attendance accounting system
NASA Technical Reports Server (NTRS)
Chapman, C. P. (Inventor)
1973-01-01
An automated accounting system useful for applying data to a computer from any or all of a multiplicity of data terminals is disclosed. The system essentially includes a preselected number of data terminals which are each adapted to convert data words of decimal form to another form, i.e., binary, usable with the computer. Each data terminal may take the form of a keyboard unit having a number of depressable buttons or switches corresponding to selected data digits and/or function digits. A bank of data buffers, one of which is associated with each data terminal, is provided as a temporary storage. Data from the terminals is applied to the data buffers on a digit by digit basis for transfer via a multiplexer to the computer.
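As a rough software analogue of the hardware just described (decimal key presses converted digit by digit into binary words, buffered per terminal, and multiplexed to the computer), consider the following sketch; the terminal count, word width, and polling order are illustrative assumptions rather than details from the patent.

from collections import deque

N_TERMINALS = 4
buffers = [deque() for _ in range(N_TERMINALS)]   # one temporary data buffer per terminal

def key_pressed(terminal_id, decimal_digit):
    """Convert a decimal key press to a 4-bit binary word and store it in the terminal's buffer."""
    word = format(decimal_digit, "04b")
    buffers[terminal_id].append(word)

def multiplex_to_computer():
    """Poll each buffer in turn and transfer one buffered digit per scan (the multiplexer)."""
    frame = []
    for tid, buf in enumerate(buffers):
        if buf:
            frame.append((tid, buf.popleft()))
    return frame

key_pressed(0, 7); key_pressed(0, 3); key_pressed(2, 9)
print(multiplex_to_computer())   # [(0, '0111'), (2, '1001')]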
Characterization of a 16-Bit Digitizer for Lidar Data Acquisition
NASA Technical Reports Server (NTRS)
Williamson, Cynthia K.; DeYoung, Russell J.
2000-01-01
A 6-MHz 16-bit waveform digitizer was evaluated for use in atmospheric differential absorption lidar (DIAL) measurements of ozone. The digitizer noise characteristics were evaluated, and actual ozone DIAL atmospheric returns were digitized. This digitizer could replace computer-automated measurement and control (CAMAC)-based commercial digitizers and improve voltage accuracy.
Status of emerging standards for removable computer storage media and related contributions of NIST
NASA Technical Reports Server (NTRS)
Podio, Fernando L.
1992-01-01
Standards for removable computer storage media are needed so that users may reliably interchange data both within and among various computer installations. Furthermore, media interchange standards support competition in industry and prevent sole-source lock-in. NIST participates in magnetic tape and optical disk standards development through Technical Committees X3B5, Digital Magnetic Tapes, X3B11, Optical Digital Data Disk, and the Joint Technical Commission on Data Permanence. NIST also participates in other relevant national and international standards committees for removable computer storage media. Industry standards for digital magnetic tapes require the use of Standard Reference Materials (SRM's) developed and maintained by NIST. In addition, NIST has been studying care and handling procedures required for digital magnetic tapes. NIST has developed a methodology for determining the life expectancy of optical disks. NIST is developing care and handling procedures for optical digital data disks and is involved in a program to investigate error reporting capabilities of optical disk drives. This presentation reflects the status of emerging magnetic tape and optical disk standards, as well as NIST's contributions in support of these standards.
Mendikute, Alberto; Zatarain, Mikel; Bertelsen, Álvaro; Leizea, Ibai
2017-01-01
Photogrammetry methods are being used more and more as a 3D technique for large scale metrology applications in industry. Optical targets are placed on an object and images are taken around it, where measuring traceability is provided by precise off-process pre-calibrated digital cameras and scale bars. According to the 2D target image coordinates, target 3D coordinates and camera views are jointly computed. One of the applications of photogrammetry is the measurement of raw part surfaces prior to its machining. For this application, post-process bundle adjustment has usually been adopted for computing the 3D scene. With that approach, a high computation time is observed, leading in practice to time consuming and user dependent iterative review and re-processing procedures until an adequate set of images is taken, limiting its potential for fast, easy-to-use, and precise measurements. In this paper, a new efficient procedure is presented for solving the bundle adjustment problem in portable photogrammetry. In-process bundle computing capability is demonstrated on a consumer grade desktop PC, enabling quasi real time 2D image and 3D scene computing. Additionally, a method for the self-calibration of camera and lens distortion has been integrated into the in-process approach due to its potential for highest precision when using low cost non-specialized digital cameras. Measurement traceability is set only by scale bars available in the measuring scene, avoiding the uncertainty contribution of off-process camera calibration procedures or the use of special purpose calibration artifacts. The developed self-calibrated in-process photogrammetry has been evaluated both in a pilot case scenario and in industrial scenarios for raw part measurement, showing a total in-process computing time typically below 1 s per image up to a maximum of 2 s during the last stages of the computed industrial scenes, along with a relative precision of 1/10,000 (e.g., 0.1 mm error in 1 m) with an error RMS below 0.2 pixels at image plane, ranging at the same performance reported for portable photogrammetry with precise off-process pre-calibrated cameras. PMID:28891946
Mendikute, Alberto; Yagüe-Fabra, José A; Zatarain, Mikel; Bertelsen, Álvaro; Leizea, Ibai
2017-09-09
Photogrammetry methods are being used more and more as a 3D technique for large scale metrology applications in industry. Optical targets are placed on an object and images are taken around it, where measuring traceability is provided by precise off-process pre-calibrated digital cameras and scale bars. According to the 2D target image coordinates, target 3D coordinates and camera views are jointly computed. One of the applications of photogrammetry is the measurement of raw part surfaces prior to its machining. For this application, post-process bundle adjustment has usually been adopted for computing the 3D scene. With that approach, a high computation time is observed, leading in practice to time consuming and user dependent iterative review and re-processing procedures until an adequate set of images is taken, limiting its potential for fast, easy-to-use, and precise measurements. In this paper, a new efficient procedure is presented for solving the bundle adjustment problem in portable photogrammetry. In-process bundle computing capability is demonstrated on a consumer grade desktop PC, enabling quasi real time 2D image and 3D scene computing. Additionally, a method for the self-calibration of camera and lens distortion has been integrated into the in-process approach due to its potential for highest precision when using low cost non-specialized digital cameras. Measurement traceability is set only by scale bars available in the measuring scene, avoiding the uncertainty contribution of off-process camera calibration procedures or the use of special purpose calibration artifacts. The developed self-calibrated in-process photogrammetry has been evaluated both in a pilot case scenario and in industrial scenarios for raw part measurement, showing a total in-process computing time typically below 1 s per image up to a maximum of 2 s during the last stages of the computed industrial scenes, along with a relative precision of 1/10,000 (e.g. 0.1 mm error in 1 m) with an error RMS below 0.2 pixels at image plane, ranging at the same performance reported for portable photogrammetry with precise off-process pre-calibrated cameras.
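At the heart of the bundle adjustment discussed above is the joint minimisation of reprojection error over camera poses and target coordinates. The sketch below sets up that residual for a simplified pinhole model with scipy.optimize.least_squares; the camera parametrisation, focal length, and synthetic scene are all illustrative assumptions, not the authors' in-process solver or self-calibration scheme.

import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

f = 1000.0                                   # assumed focal length in pixels

def project(points, rvec, tvec):
    cam = Rotation.from_rotvec(rvec).apply(points) + tvec
    return f * cam[:, :2] / cam[:, 2:3]      # simple pinhole, no distortion

def residuals(params, n_cams, n_pts, observed):
    cams = params[:n_cams * 6].reshape(n_cams, 6)
    pts = params[n_cams * 6:].reshape(n_pts, 3)
    res = [project(pts, c[:3], c[3:]) - obs for c, obs in zip(cams, observed)]
    return np.concatenate(res).ravel()

# Tiny synthetic scene: 2 cameras, 8 optical targets (illustrative only).
rng = np.random.default_rng(1)
pts_true = rng.uniform(-1, 1, (8, 3)) + [0, 0, 5]
cams_true = np.array([[0, 0, 0, 0, 0, 0], [0, 0.2, 0, -0.5, 0, 0]])
observed = [project(pts_true, c[:3], c[3:]) for c in cams_true]

x0 = np.concatenate([cams_true.ravel() + 0.01, (pts_true + 0.05).ravel()])
sol = least_squares(residuals, x0, args=(2, 8, observed))   # joint refinement of poses and targets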
Flight test experience and controlled impact of a large, four-engine, remotely piloted airplane
NASA Technical Reports Server (NTRS)
Kempel, R. W.; Horton, T. W.
1985-01-01
A controlled impact demonstration (CID) program using a large, four engine, remotely piloted transport airplane was conducted. Closed loop primary flight control was performed from a ground based cockpit and digital computer in conjunction with an up/down telemetry link. Uplink commands were received aboard the airplane and transferred through uplink interface systems to a highly modified Bendix PB-20D autopilot. Both proportional and discrete commands were generated by the ground pilot. Prior to flight tests, extensive simulation was conducted during the development of ground based digital control laws. The control laws included primary control, secondary control, and racetrack and final approach guidance. Extensive ground checks were performed on all remotely piloted systems. However, manned flight tests were the primary method of verification and validation of control law concepts developed from simulation. The design, development, and flight testing of control laws and the systems required to accomplish the remotely piloted mission are discussed.
Mendonca, Derick A; Naidoo, Sybill D; Skolnick, Gary; Skladman, Rachel; Woo, Albert S
2013-07-01
Craniofacial anthropometry by direct caliper measurements is a common method of quantifying the morphology of the cranial vault. New digital imaging modalities including computed tomography and three-dimensional photogrammetry are similarly being used to obtain craniofacial surface measurements. This study sought to compare the accuracy of anthropometric measurements obtained by calipers versus 2 methods of digital imaging. Standard anterior-posterior, biparietal, and cranial index measurements were directly obtained on 19 participants with an age range of 1 to 20 months. Computed tomographic scans and three-dimensional photographs were both obtained on each child within 2 weeks of the clinical examination. Two analysts measured the anterior-posterior and biparietal distances on the digital images. Measures of reliability and bias between the modalities were calculated and compared. Caliper measurements were found to underestimate the anterior-posterior and biparietal distances as compared with those of the computed tomography and the three-dimensional photogrammetry (P < 0.001). Cranial index measurements between the computed tomography and the calipers differed by up to 6%. The difference between the 2 modalities was statistically significant (P = 0.021). The biparietal and cranial index results were similar between the digital modalities, but the anterior-posterior measurement was greater with the three-dimensional photogrammetry (P = 0.002). The coefficients of variation for repeated measures based on the computed tomography and the three-dimensional photogrammetry were 0.008 and 0.007, respectively. In conclusion, measurements based on digital modalities are generally reliable and interchangeable. Caliper measurements lead to underestimation of anterior-posterior and biparietal values compared with digital imaging.
ERIC Educational Resources Information Center
Kim, SugHee; Chung, KwangSik; Yu, HeonChang
2013-01-01
The purpose of this paper is to propose a training program for creative problem solving based on computer programming. The proposed program will encourage students to solve real-life problems through a creative thinking spiral related to cognitive skills with computer programming. With the goal of enhancing digital fluency through this proposed…
Developing Digital Immigrants' Computer Literacy: The Case of Unemployed Women
ERIC Educational Resources Information Center
Ktoridou, Despo; Eteokleous-Grigoriou, Nikleia
2011-01-01
Purpose: The purpose of this study is to evaluate the effectiveness of a 40-hour computer course for beginners provided to a group of unemployed women learners with no/minimum computer literacy skills who can be characterized as digital immigrants. The aim of the study is to identify participants' perceptions and experiences regarding technology,…
The evolvability of programmable hardware.
Raman, Karthik; Wagner, Andreas
2011-02-06
In biological systems, individual phenotypes are typically adopted by multiple genotypes. Examples include protein structure phenotypes, where each structure can be adopted by a myriad individual amino acid sequence genotypes. These genotypes form vast connected 'neutral networks' in genotype space. The size of such neutral networks endows biological systems not only with robustness to genetic change, but also with the ability to evolve a vast number of novel phenotypes that occur near any one neutral network. Whether technological systems can be designed to have similar properties is poorly understood. Here we ask this question for a class of programmable electronic circuits that compute digital logic functions. The functional flexibility of such circuits is important in many applications, including applications of evolutionary principles to circuit design. The functions they compute are at the heart of all digital computation. We explore a vast space of 10(45) logic circuits ('genotypes') and 10(19) logic functions ('phenotypes'). We demonstrate that circuits that compute the same logic function are connected in large neutral networks that span circuit space. Their robustness or fault-tolerance varies very widely. The vicinity of each neutral network contains circuits with a broad range of novel functions. Two circuits computing different functions can usually be converted into one another via few changes in their architecture. These observations show that properties important for the evolvability of biological systems exist in a commercially important class of electronic circuitry. They also point to generic ways to generate fault-tolerant, adaptable and evolvable electronic circuitry.
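To make the genotype/phenotype language concrete, a circuit genotype can be encoded as a list of gates wired to earlier signals and its phenotype read off as a truth table. The toy encoding below (NAND-only gates over three primary inputs) is our own illustrative assumption, vastly smaller than the 10^45 circuit space explored in the paper, but it shows how neutral one-mutation neighbours are identified.

from itertools import product

N_INPUTS = 3

def phenotype(genotype):
    """Genotype: list of (a, b) index pairs; gate i computes NAND of signals a and b,
    where indices 0..N_INPUTS-1 are primary inputs and later indices are earlier gates.
    Phenotype: the truth table of the last gate."""
    table = []
    for bits in product((0, 1), repeat=N_INPUTS):
        signals = list(bits)
        for a, b in genotype:
            signals.append(1 - (signals[a] & signals[b]))
        table.append(signals[-1])
    return tuple(table)

def neutral_neighbours(genotype):
    """Single rewiring changes that leave the computed logic function unchanged."""
    base, found = phenotype(genotype), []
    for i, (a, b) in enumerate(genotype):
        for pos, new in product((0, 1), range(N_INPUTS + i)):
            mutant = list(genotype)
            mutant[i] = (new, b) if pos == 0 else (a, new)
            if mutant[i] != (a, b) and phenotype(mutant) == base:
                found.append(mutant)
    return found

g = [(0, 1), (2, 3), (3, 4)]          # a 3-gate NAND circuit
print(phenotype(g), len(neutral_neighbours(g)))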
Towards pattern generation and chaotic series prediction with photonic reservoir computers
NASA Astrophysics Data System (ADS)
Antonik, Piotr; Hermans, Michiel; Duport, François; Haelterman, Marc; Massar, Serge
2016-03-01
Reservoir Computing is a bio-inspired computing paradigm for processing time dependent signals that is particularly well suited for analog implementations. Our team has demonstrated several photonic reservoir computers with performance comparable to digital algorithms on a series of benchmark tasks such as channel equalisation and speech recognition. Recently, we showed that our opto-electronic reservoir computer could be trained online with a simple gradient descent algorithm programmed on an FPGA chip. This setup makes it in principle possible to feed the output signal back into the reservoir, and thus highly enrich the dynamics of the system. This will allow us to tackle complex prediction tasks in hardware, such as pattern generation and chaotic and financial series prediction, which have so far only been studied in digital implementations. Here we report simulation results of our opto-electronic setup with an FPGA chip and output feedback applied to pattern generation and Mackey-Glass chaotic series prediction. The simulations take into account the major aspects of our experimental setup. We find that pattern generation can be easily implemented on the current setup with very good results. The Mackey-Glass series prediction task is more complex and requires a large reservoir and a more elaborate training algorithm. With these adjustments, promising results are obtained, and we now know what improvements are needed to match previously reported numerical results. These simulation results will serve as a basis of comparison for the experiments we will carry out in the coming months.
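For readers unfamiliar with the training scheme, the standard digital recipe is to drive a fixed random reservoir with the target signal (teacher forcing), fit a linear readout by ridge regression, and then close the loop so the reservoir runs on its own output. The NumPy sketch below uses an illustrative sine pattern and arbitrary reservoir sizes; it is not the authors' opto-electronic setup or their FPGA online training rule.

import numpy as np

rng = np.random.default_rng(0)
N, steps, ridge = 200, 3000, 1e-6
W_in = rng.uniform(-0.5, 0.5, N)
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # scale spectral radius below 1

target = np.sin(np.arange(steps + 1) * 0.2)        # pattern to be generated
x, states = np.zeros(N), []
for t in range(steps):                             # teacher forcing: feed in the true signal
    x = np.tanh(W @ x + W_in * target[t])
    states.append(x.copy())
X, Y = np.array(states[200:]), target[201:steps + 1]            # drop the warm-up transient
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ Y)   # ridge-regression readout

y = target[steps]                                  # free-running generation with output feedback
outputs = []
for _ in range(200):
    x = np.tanh(W @ x + W_in * y)
    y = x @ W_out
    outputs.append(y)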
The evolvability of programmable hardware
Raman, Karthik; Wagner, Andreas
2011-01-01
In biological systems, individual phenotypes are typically adopted by multiple genotypes. Examples include protein structure phenotypes, where each structure can be adopted by a myriad individual amino acid sequence genotypes. These genotypes form vast connected ‘neutral networks’ in genotype space. The size of such neutral networks endows biological systems not only with robustness to genetic change, but also with the ability to evolve a vast number of novel phenotypes that occur near any one neutral network. Whether technological systems can be designed to have similar properties is poorly understood. Here we ask this question for a class of programmable electronic circuits that compute digital logic functions. The functional flexibility of such circuits is important in many applications, including applications of evolutionary principles to circuit design. The functions they compute are at the heart of all digital computation. We explore a vast space of 1045 logic circuits (‘genotypes’) and 1019 logic functions (‘phenotypes’). We demonstrate that circuits that compute the same logic function are connected in large neutral networks that span circuit space. Their robustness or fault-tolerance varies very widely. The vicinity of each neutral network contains circuits with a broad range of novel functions. Two circuits computing different functions can usually be converted into one another via few changes in their architecture. These observations show that properties important for the evolvability of biological systems exist in a commercially important class of electronic circuitry. They also point to generic ways to generate fault-tolerant, adaptable and evolvable electronic circuitry. PMID:20534598
NASA Astrophysics Data System (ADS)
Schmalz, Mark S.; Ritter, Gerhard X.; Caimi, Frank M.
2001-12-01
A wide variety of digital image compression transforms developed for still imaging and broadcast video transmission are unsuitable for Internet video applications due to insufficient compression ratio, poor reconstruction fidelity, or excessive computational requirements. Examples include hierarchical transforms that require all, or a large portion of, a source image to reside in memory at one time, transforms that induce a significant blocking effect at operationally salient compression ratios, and algorithms that require large amounts of floating-point computation. The latter constraint holds especially for video compression by small mobile imaging devices for transmission to, and compression on, platforms such as palmtop computers or personal digital assistants (PDAs). As Internet video requirements for frame rate and resolution increase to produce more detailed, less discontinuous motion sequences, a new class of compression transforms will be needed, especially for small memory models and displays such as those found on PDAs. In this, the third series of papers, we discuss the EBLAST compression transform and its application to Internet communication. Leading transforms for compression of Internet video and still imagery are reviewed and analyzed, including GIF, JPEG, AWIC (wavelet-based), wavelet packets, and SPIHT, whose performance is compared with EBLAST. Performance analysis criteria include time and space complexity and quality of the decompressed image. The latter is determined by rate-distortion data obtained from a database of realistic test images. Discussion also includes issues such as robustness of the compressed format to channel noise. EBLAST has been shown to perform superiorly to JPEG and, unlike current wavelet compression transforms, supports fast implementation on embedded processors with small memory models.
Applications in Data-Intensive Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shah, Anuj R.; Adkins, Joshua N.; Baxter, Douglas J.
2010-04-01
This book chapter, to be published in Advances in Computers, Volume 78, in 2010, describes applications of data intensive computing (DIC). This is an invited chapter resulting from a previous publication on DIC. This work summarizes efforts coming out of PNNL's Data Intensive Computing Initiative. Advances in technology have empowered individuals with the ability to generate digital content with mouse clicks and voice commands. Digital pictures, emails, text messages, home videos, audio, and webpages are common examples of digital content that are generated on a regular basis. Data intensive computing facilitates human understanding of complex problems. Data-intensive applications provide timely and meaningful analytical results in response to exponentially growing data complexity and associated analysis requirements through the development of new classes of software, algorithms, and hardware.
Generalized dynamic engine simulation techniques for the digital computer
NASA Technical Reports Server (NTRS)
Sellers, J.; Teren, F.
1974-01-01
Recently advanced simulation techniques have been developed for the digital computer and used as the basis for development of a generalized dynamic engine simulation computer program, called DYNGEN. This computer program can analyze the steady state and dynamic performance of many kinds of aircraft gas turbine engines. Without changes to the basic program, DYNGEN can analyze one- or two-spool turbofan engines. The user must supply appropriate component performance maps and design-point information. Examples are presented to illustrate the capabilities of DYNGEN in the steady state and dynamic modes of operation. The analytical techniques used in DYNGEN are briefly discussed, and its accuracy is compared with a comparable simulation using the hybrid computer. The impact of DYNGEN and similar all-digital programs on future engine simulation philosophy is also discussed.
Generalized dynamic engine simulation techniques for the digital computers
NASA Technical Reports Server (NTRS)
Sellers, J.; Teren, F.
1975-01-01
Recently advanced simulation techniques have been developed for the digital computer and used as the basis for development of a generalized dynamic engine simulation computer program, called DYNGEN. This computer program can analyze the steady state and dynamic performance of many kinds of aircraft gas turbine engines. Without changes to the basic program, DYNGEN can analyze one- or two-spool turbofan engines. The user must supply appropriate component performance maps and design point information. Examples are presented to illustrate the capabilities of DYNGEN in the steady state and dynamic modes of operation. The analytical techniques used in DYNGEN are briefly discussed, and its accuracy is compared with a comparable simulation using the hybrid computer. The impact of DYNGEN and similar digital programs on future engine simulation philosophy is also discussed.
NASA Technical Reports Server (NTRS)
Peri, Frank, Jr.
1992-01-01
A flight digital data acquisition system that uses the MIL-STD-1553B bus for transmission of data to a host computer for control law processing is described. The instrument, the Remote Interface Unit (RIU), can accommodate up to 16 input channels and eight output channels. The RIU employs a digital signal processor to perform local digital filtering before sending data to the host. The system allows flexible sensor and actuator data organization to facilitate quick control law computations on the host computer. The instrument can also run simple control laws autonomously without host intervention. The RIU and host computer together have replaced a similar larger, ground minicomputer system with favorable results.
Large Scale Cross Drive Correlation Of Digital Media
2016-03-01
NAVAL POSTGRADUATE SCHOOL, MONTEREY, CALIFORNIA. Thesis: Large Scale Cross-Drive Correlation of Digital Media, by Joseph Van Bruaene, March 2016. ...the ability to make large scale cross-drive correlations among a large corpus of digital media becomes increasingly important. We propose a…
The digital health divide: evaluating online health information access and use among older adults.
Hall, Amanda K; Bernhardt, Jay M; Dodd, Virginia; Vollrath, Morgan W
2015-04-01
Innovations in health information technology (HIT) provide opportunities to reduce health care spending, improve quality of care, and improve health outcomes for older adults. However, concerns relating to older adults' limited access and use of HIT, including use of the Internet for health information, fuel the digital health divide debate. This study evaluated the potential digital health divide in relation to characteristic and belief differences between older adult users and nonusers of online health information sources. A cross-sectional survey design was conducted using a random sample of older adults. A total of 225 older adults (age range = 50-92 years, M = 68.9 years, SD = 10.4) participated in the study. Seventy-six percent of all respondents had Internet access. Users and nonusers of online health information differed significantly on age (M = 66.29 vs. M = 71.13), education, and previous experience with the health care system. Users and nonusers of online health information also differed significantly on Internet and technology access; however, a large percentage of nonusers had Internet access (56.3%), desktop computers (55.9%), and laptop computers or netbooks (43.2%). Users of online health information had higher mean scores on the Computer Self-Efficacy Measure than nonusers, t(159) = -7.29, p < .0001. This study found significant differences between older adult users and nonusers of online health information. Findings suggest strategies for reducing this divide and implications for health education programs to promote HIT use among older adults. © 2014 Society for Public Health Education.
The Digital Health Divide: Evaluating Online Health Information Access and Use Among Older Adults
Hall, Amanda K.; Bernhardt, Jay M.; Dodd, Virginia; Vollrath, Morgan W.
2015-01-01
Objective Innovations in health information technology (HIT) provide opportunities to reduce health care spending, improve quality of care, and improve health outcomes for older adults. However, concerns relating to older adults’ limited access and use of HIT, including use of the Internet for health information, fuel the digital health divide debate. This study evaluated the potential digital health divide in relation to characteristic and belief differences between older adult users and nonusers of online health information sources. Methods A cross-sectional survey design was conducted using a random sample of older adults. A total of 225 older adults (age range = 50–92 years, M = 68.9 years, SD = 10.4) participated in the study. Results Seventy-six percent of all respondents had Internet access. Users and nonusers of online health information differed significantly on age (M = 66.29 vs. M = 71.13), education, and previous experience with the health care system. Users and nonusers of online health information also differed significantly on Internet and technology access; however, a large percentage of nonusers had Internet access (56.3%), desktop computers (55.9%), and laptop computers or netbooks (43.2%). Users of online health information had higher mean scores on the Computer Self-Efficacy Measure than nonusers, t(159) = −7.29, p < .0001. Conclusion This study found significant differences between older adult users and nonusers of online health information. Findings suggest strategies for reducing this divide and implications for health education programs to promote HIT use among older adults. PMID:25156311
Target-Tracking Camera for a Metrology System
NASA Technical Reports Server (NTRS)
Liebe, Carl; Bartman, Randall; Chapsky, Jacob; Abramovici, Alexander; Brown, David
2009-01-01
An analog electronic camera that is part of a metrology system measures the varying direction to a light-emitting diode that serves as a bright point target. In the original application for which the camera was developed, the metrological system is used to determine the varying relative positions of radiating elements of an airborne synthetic aperture-radar (SAR) antenna as the airplane flexes during flight; precise knowledge of the relative positions as a function of time is needed for processing SAR readings. It has been common metrology system practice to measure the varying direction to a bright target by use of an electronic camera of the charge-coupled-device or active-pixel-sensor type. A major disadvantage of this practice arises from the necessity of reading out and digitizing the outputs from a large number of pixels and processing the resulting digital values in a computer to determine the centroid of a target: Because of the time taken by the readout, digitization, and computation, the update rate is limited to tens of hertz. In contrast, the analog nature of the present camera makes it possible to achieve an update rate of hundreds of hertz, and no computer is needed to determine the centroid. The camera is based on a position-sensitive detector (PSD), which is a rectangular photodiode with output contacts at opposite ends. PSDs are usually used in triangulation for measuring small distances. PSDs are manufactured in both one- and two-dimensional versions. Because it is very difficult to calibrate two-dimensional PSDs accurately, the focal-plane sensors used in this camera are two orthogonally mounted one-dimensional PSDs.
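The analog centroid referred to above follows directly from the PSD's two end currents; with the standard one-dimensional PSD relation (detector length and example currents assumed here for illustration), the spot position is obtained without any pixel readout or digitization.

def psd_position(i1, i2, length):
    """Spot position along a 1-D PSD of active length `length`,
    measured from the centre, from the two end-contact photocurrents."""
    return 0.5 * length * (i2 - i1) / (i1 + i2)

# Example: 10 mm PSD, end currents in microamps (illustrative values).
print(psd_position(3.0, 5.0, 10.0))   # 1.25 mm toward the i2 contact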
LabVIEW: a software system for data acquisition, data analysis, and instrument control.
Kalkman, C J
1995-01-01
Computer-based data acquisition systems play an important role in clinical monitoring and in the development of new monitoring tools. LabVIEW (National Instruments, Austin, TX) is a data acquisition and programming environment that allows flexible acquisition and processing of analog and digital data. The main feature that distinguishes LabVIEW from other data acquisition programs is its highly modular graphical programming language, "G," and a large library of mathematical and statistical functions. The advantage of graphical programming is that the code is flexible, reusable, and self-documenting. Subroutines can be saved in a library and reused without modification in other programs. This dramatically reduces development time and enables researchers to develop or modify their own programs. LabVIEW uses a large amount of processing power and computer memory, thus requiring a powerful computer. A large-screen monitor is desirable when developing larger applications. LabVIEW is excellently suited for testing new monitoring paradigms, analysis algorithms, or user interfaces. The typical LabVIEW user is the researcher who wants to develop a new monitoring technique, a set of new (derived) variables by integrating signals from several existing patient monitors, closed-loop control of a physiological variable, or a physiological simulator.
NASA Technical Reports Server (NTRS)
Nagle, H. T., Jr.
1972-01-01
A three part survey is made of the state-of-the-art in digital filtering. Part one presents background material including sampled data transformations and the discrete Fourier transform. Part two, digital filter theory, gives an in-depth coverage of filter categories, transfer function synthesis, quantization and other nonlinear errors, filter structures and computer aided design. Part three presents hardware mechanization techniques. Implementations by general purpose, mini-, and special-purpose computers are presented.
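As a concrete instance of the filter structures surveyed in part two, the following is a direct-form II second-order section (biquad) realised in software; the coefficients shown are arbitrary placeholders rather than values from the survey.

def biquad_direct_form_II(x, b, a):
    """y[n] = b0*x[n] + b1*x[n-1] + b2*x[n-2] - a1*y[n-1] - a2*y[n-2],
    realised with a single pair of delay elements (direct form II)."""
    b0, b1, b2 = b
    a1, a2 = a          # a0 assumed normalised to 1
    w1 = w2 = 0.0
    y = []
    for xn in x:
        w0 = xn - a1 * w1 - a2 * w2
        y.append(b0 * w0 + b1 * w1 + b2 * w2)
        w2, w1 = w1, w0
    return y

# Impulse response with placeholder coefficients, for illustration only.
print(biquad_direct_form_II([1, 0, 0, 0, 0], b=(0.2, 0.4, 0.2), a=(-0.5, 0.25)))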
NASA Technical Reports Server (NTRS)
Bailey, David H.; Borwein, Jonathan M.; Borwein, Peter B.; Plouffe, Simon
1996-01-01
This article gives a brief history of the analysis and computation of the mathematical constant Pi=3.14159 ..., including a number of the formulas that have been used to compute Pi through the ages. Recent developments in this area are then discussed in some detail, including the recent computation of Pi to over six billion decimal digits using high-order convergent algorithms, and a newly discovered scheme that permits arbitrary individual hexadecimal digits of Pi to be computed.
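The "newly discovered scheme" mentioned here is the Bailey-Borwein-Plouffe (BBP) series, pi = sum over k >= 0 of 16^(-k) * [4/(8k+1) - 2/(8k+4) - 1/(8k+5) - 1/(8k+6)]. Summing a handful of terms already reproduces pi to double precision, as the sketch below shows; the full hexadecimal-digit extraction additionally uses modular exponentiation to skip ahead to the n-th digit, which is omitted here.

import math

def bbp_pi(n_terms=12):
    """Approximate pi with the Bailey-Borwein-Plouffe series."""
    s = 0.0
    for k in range(n_terms):
        s += (4 / (8 * k + 1) - 2 / (8 * k + 4)
              - 1 / (8 * k + 5) - 1 / (8 * k + 6)) / 16 ** k
    return s

print(bbp_pi(), math.pi)   # agrees to roughly 15 significant digits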
The potential of multi-port optical memories in digital computing
NASA Technical Reports Server (NTRS)
Alford, C. O.; Gaylord, T. K.
1975-01-01
A high-capacity memory with a relatively high data transfer rate and multi-port simultaneous access capability may serve as the basis for new computer architectures. The implementation of a multi-port optical memory is discussed. Several computer structures are presented that might profitably use such a memory. These structures include (1) a simultaneous record access system, (2) a simultaneously shared memory computer system, and (3) a parallel digital processing structure.
Digital Ethics: Computers, Photographs, and the Manipulation of Pixels.
ERIC Educational Resources Information Center
Mercedes, Dawn
1996-01-01
Summarizes negative aspects of computer technology and problems inherent in the field of digital imaging. Considers the postmodernist response that borrowing and alteration are essential characteristics of the technology. Discusses the implications of this for education and research. (MJP)
Jargon that Computes: Today's PC Terminology.
ERIC Educational Resources Information Center
Crawford, Walt
1997-01-01
Discusses PC (personal computer) and telecommunications terminology in context: Integrated Services Digital Network (ISDN); Asymmetric Digital Subscriber Line (ADSL); cable modems; satellite downloads; T1 and T3 lines; magnitudes ("giga-,""nano-"); Central Processing Unit (CPU); Random Access Memory (RAM); Universal Serial Bus…
Digital Storytelling as an Interactive Digital Media Context
ERIC Educational Resources Information Center
Anderson, Kate T.; Chua, Puay Hoe
2010-01-01
Digital storytelling involves the creation of short, personal narratives combining images, sounds, and text in a multimedia computer-based platform. In education, digital storytelling has been used to foster learning in formal and informal spaces worldwide. The authors offer a critical discussion of claims about digital storytelling's usefulness…
ERIC Educational Resources Information Center
Prensky, Marc
2006-01-01
"Digital natives" refer to today's students because they are native speakers of technology, fluent in the digital language of computers, video games, and the Internet. Those who were not born into the digital world are referred to as digital immigrants. Educators, considered digital immigrants, have slid into the 21st century--and into the digital…
Cellular automaton supercomputing
NASA Technical Reports Server (NTRS)
Wolfram, Stephen
1987-01-01
Many of the models now used in science and engineering are over a century old. And most of them can be implemented on modern digital computers only with considerable difficulty. Some new basic models are discussed which are much more directly suitable for digital computer simulation. The fundamental principle is that the models considered herein are as suitable as possible for implementation on digital computers. It is then a matter of scientific analysis to determine whether such models can reproduce the behavior seen in physical and other systems. Such analysis was carried out in several cases, and the results are very encouraging.
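As an illustration of the kind of model meant here, a one-dimensional elementary cellular automaton maps directly onto digital computation; the following minimal Python sketch of rule 30 is generic and not taken from the article:

def step(cells, rule=30):
    """One update of an elementary cellular automaton with periodic boundaries."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

row = [0] * 31
row[15] = 1                      # single seed cell
for _ in range(16):
    print("".join(".#"[c] for c in row))
    row = step(row)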
The research of laser marking control technology
NASA Astrophysics Data System (ADS)
Zhang, Qiue; Zhang, Rong
2009-08-01
In laser marking, the conventional control method is to insert a control card into the computer's motherboard; this does not support hot swapping, is difficult to install, and ties each marking system to its own dedicated computer. During marking, that computer can do nothing other than transmit the marking data without affecting marking precision. To address the problems of these traditional control methods, a scheme is introduced in which the computer performs the marking graphic editing and data processing while a high-speed digital signal processor (DSP) controls the whole marking process. The laser marking controller mainly comprises a DSP2812, digital memory, a DAC (digital-to-analog conversion) unit circuit, a USB interface control circuit, a man-machine interface circuit, and other logic control circuitry. The marking information processed by the computer is downloaded to a USB flash disk; the DSP reads it through the USB interface when needed, processes it, uses its internal timer to control the marking time sequence, and outputs the scanner control signals through the D/A stage. Applying this technology enables offline marking, thereby reducing product cost and increasing production efficiency. The system has performed well in actual marking units, with a marking speed about 20 percent faster than that of a PCI control card. It has practical application value.
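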
ERIC Educational Resources Information Center
Choudhury, Sayeed; Hobbs, Benjamin; Lorie, Mark; Flores, Nicholas; Coleman, Anita; Martin, Mairead; Kuhlman, David L.; McNair, John H.; Rhodes, William A.; Tipton, Ron; Agnew, Grace; Nicholson, Dennis; Macgregor, George
2002-01-01
Includes four articles that address issues related to digital libraries. Highlights include a framework for evaluating digital library services, particularly academic research libraries; interdisciplinary approaches to education about digital libraries that includes library and information science and computing; digital rights management; and the…
Scalable digital hardware for a trapped ion quantum computer
NASA Astrophysics Data System (ADS)
Mount, Emily; Gaultney, Daniel; Vrijsen, Geert; Adams, Michael; Baek, So-Young; Hudek, Kai; Isabella, Louis; Crain, Stephen; van Rynbach, Andre; Maunz, Peter; Kim, Jungsang
2016-12-01
Many of the challenges of scaling quantum computer hardware lie at the interface between the qubits and the classical control signals used to manipulate them. Modular ion trap quantum computer architectures address scalability by constructing individual quantum processors interconnected via a network of quantum communication channels. Successful operation of such quantum hardware requires a fully programmable classical control system capable of frequency stabilizing the continuous wave lasers necessary for loading, cooling, initialization, and detection of the ion qubits, stabilizing the optical frequency combs used to drive logic gate operations on the ion qubits, providing a large number of analog voltage sources to drive the trap electrodes, and a scheme for maintaining phase coherence among all the controllers that manipulate the qubits. In this work, we describe scalable solutions to these hardware development challenges.
Seminar on Understanding Digital Control and Analysis in Vibration Test Systems
NASA Technical Reports Server (NTRS)
1975-01-01
The advantages of the digital methods over the analog vibration methods are demonstrated. The following topics are covered: (1) methods of computer-controlled random vibration and reverberation acoustic testing, (2) methods of computer-controlled sinewave vibration testing, and (3) methods of computer-controlled shock testing. General algorithms are described in the form of block diagrams and flow diagrams.
Real-time classification and sensor fusion with a spiking deep belief network
O'Connor, Peter; Neil, Daniel; Liu, Shih-Chii; Delbruck, Tobi; Pfeiffer, Michael
2013-01-01
Deep Belief Networks (DBNs) have recently shown impressive performance on a broad range of classification problems. Their generative properties allow better understanding of the performance, and provide a simpler solution for sensor fusion tasks. However, because of their inherent need for feedback and parallel update of large numbers of units, DBNs are expensive to implement on serial computers. This paper proposes a method based on the Siegert approximation for Integrate-and-Fire neurons to map an offline-trained DBN onto an efficient event-driven spiking neural network suitable for hardware implementation. The method is demonstrated in simulation and by a real-time implementation of a 3-layer network with 2694 neurons used for visual classification of MNIST handwritten digits with input from a 128 × 128 Dynamic Vision Sensor (DVS) silicon retina, and sensory-fusion using additional input from a 64-channel AER-EAR silicon cochlea. The system is implemented through the open-source software in the jAER project and runs in real-time on a laptop computer. It is demonstrated that the system can recognize digits in the presence of distractions, noise, scaling, translation and rotation, and that the degradation of recognition performance by using an event-based approach is less than 1%. Recognition is achieved in an average of 5.8 ms after the onset of the presentation of a digit. By cue integration from both silicon retina and cochlea outputs we show that the system can be biased to select the correct digit from otherwise ambiguous input. PMID:24115919
NASA Astrophysics Data System (ADS)
Clements, Logan W.; Collins, Jarrod A.; Wu, Yifei; Simpson, Amber L.; Jarnagin, William R.; Miga, Michael I.
2015-03-01
Soft tissue deformation represents a significant error source in current surgical navigation systems used for open hepatic procedures. While numerous algorithms have been proposed to rectify the tissue deformation that is encountered during open liver surgery, clinical validation of the proposed methods has been limited to surface based metrics and sub-surface validation has largely been performed via phantom experiments. Tracked intraoperative ultrasound (iUS) provides a means to digitize sub-surface anatomical landmarks during clinical procedures. The proposed method involves the validation of a deformation correction algorithm for open hepatic image-guided surgery systems via sub-surface targets digitized with tracked iUS. Intraoperative surface digitizations were acquired via a laser range scanner and an optically tracked stylus for the purposes of computing the physical-to-image space registration within the guidance system and for use in retrospective deformation correction. Upon completion of surface digitization, the organ was interrogated with a tracked iUS transducer where the iUS images and corresponding tracked locations were recorded. After the procedure, the clinician reviewed the iUS images to delineate contours of anatomical target features for use in the validation procedure. Mean closest point distances between the feature contours delineated in the iUS images and corresponding 3-D anatomical model generated from the preoperative tomograms were computed to quantify the extent to which the deformation correction algorithm improved registration accuracy. The preliminary results for two patients indicate that the deformation correction method resulted in a reduction in target error of approximately 50%.
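The validation metric described, a mean closest-point distance between iUS-derived contour points and the preoperative surface model, can be sketched generically as below; the point arrays are hypothetical and this is not the authors' registration pipeline:

import numpy as np
from scipy.spatial import cKDTree

def mean_closest_point_distance(contour_pts, model_pts):
    """Mean distance from each contour point to its nearest model vertex (mm if inputs are mm)."""
    tree = cKDTree(model_pts)
    dists, _ = tree.query(contour_pts)
    return dists.mean()

# Hypothetical example: a noisy subset of a small point cloud
model = np.random.rand(1000, 3) * 100.0
contour = model[:50] + np.random.normal(scale=0.5, size=(50, 3))
print(mean_closest_point_distance(contour, model))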
Berquist, Rachel M.; Gledhill, Kristen M.; Peterson, Matthew W.; Doan, Allyson H.; Baxter, Gregory T.; Yopak, Kara E.; Kang, Ning; Walker, H. J.; Hastings, Philip A.; Frank, Lawrence R.
2012-01-01
Museum fish collections possess a wealth of anatomical and morphological data that are essential for documenting and understanding biodiversity. Obtaining access to specimens for research, however, is not always practical and frequently conflicts with the need to maintain the physical integrity of specimens and the collection as a whole. Non-invasive three-dimensional (3D) digital imaging therefore serves a critical role in facilitating the digitization of these specimens for anatomical and morphological analysis as well as facilitating an efficient method for online storage and sharing of this imaging data. Here we describe the development of the Digital Fish Library (DFL, http://www.digitalfishlibrary.org), an online digital archive of high-resolution, high-contrast, magnetic resonance imaging (MRI) scans of the soft tissue anatomy of an array of fishes preserved in the Marine Vertebrate Collection of Scripps Institution of Oceanography. We have imaged and uploaded MRI data for over 300 marine and freshwater species, developed a data archival and retrieval system with a web-based image analysis and visualization tool, and integrated these into the public DFL website to disseminate data and associated metadata freely over the web. We show that MRI is a rapid and powerful method for accurately depicting the in-situ soft-tissue anatomy of preserved fishes in sufficient detail for large-scale comparative digital morphology. However these 3D volumetric data require a sophisticated computational and archival infrastructure in order to be broadly accessible to researchers and educators. PMID:22493695
A method for normalizing pathology images to improve feature extraction for quantitative pathology.
Tam, Allison; Barker, Jocelyn; Rubin, Daniel
2016-01-01
With the advent of digital slide scanning technologies and the potential proliferation of large repositories of digital pathology images, many research studies can leverage these data for biomedical discovery and to develop clinical applications. However, quantitative analysis of digital pathology images is impeded by batch effects generated by varied staining protocols and staining conditions of pathological slides. To overcome this problem, this paper proposes a novel, fully automated stain normalization method to reduce batch effects and thus aid research in digital pathology applications. The proposed method, intensity centering and histogram equalization (ICHE), normalizes a diverse set of pathology images by first scaling the centroids of the intensity histograms to a common point and then applying a modified version of contrast-limited adaptive histogram equalization. Normalization was performed on two datasets of digitized hematoxylin and eosin (H&E) slides of different tissue slices from the same lung tumor, and one immunohistochemistry dataset of digitized slides created by restaining one of the H&E datasets. The ICHE method was evaluated based on image intensity values, quantitative features, and the effect on downstream applications, such as a computer aided diagnosis. For comparison, three methods from the literature were reimplemented and evaluated using the same criteria. The authors found that ICHE not only improved performance compared with un-normalized images, but in most cases showed improvement compared with previous methods for correcting batch effects in the literature. ICHE may be a useful preprocessing step in a digital pathology image processing pipeline.
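A much-simplified sketch of the two ICHE stages (aligning the intensity histogram centroid to a common point, then applying contrast-limited adaptive histogram equalization) is given below using scikit-image; the parameter values and target centroid are hypothetical and this is not the authors' implementation:

import numpy as np
from skimage import exposure

def iche_like_normalize(gray, target_centroid=0.5):
    """Shift the image's mean intensity toward a common centroid, then apply CLAHE."""
    img = gray.astype(np.float64)
    img = img / img.max() if img.max() > 0 else img
    shifted = np.clip(img + (target_centroid - img.mean()), 0.0, 1.0)
    return exposure.equalize_adapthist(shifted, clip_limit=0.01)

# Usage with a synthetic image standing in for a digitized H&E tile
tile = np.random.beta(2.0, 5.0, size=(256, 256))
normalized = iche_like_normalize(tile)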
The BOEING 777 - concurrent engineering and digital pre-assembly
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abarbanel, B.
The processes created on the 777 for checking designs were called "digital pre-assembly". Using FlyThru(tm), a spin-off of a Boeing advanced computing research project, engineers were able to view up to 1500 models (15000 solids) in 3D, traversing that data at high speed. FlyThru(tm) was rapidly deployed in 1991 to meet the needs of the 777 for large-scale product visualization and verification. The digital pre-assembly process has had fantastic results: the 777 has had far fewer assembly and systems problems compared to previous airplane programs. Today, FlyThru(tm) is installed on hundreds of workstations on almost every airplane program, and is being used on Space Station, F22, AWACS, and other defense projects. Its applications have gone far beyond design review. In many ways, FlyThru is a data warehouse supported by advanced tools for analysis. It is today being integrated with knowledge-based engineering geometry generation tools.
High-resolution digital brain atlases: a Hubble telescope for the brain.
Jones, Edward G; Stone, James M; Karten, Harvey J
2011-05-01
We describe implementation of a method for digitizing at microscopic resolution brain tissue sections containing normal and experimental data and for making the content readily accessible online. Web-accessible brain atlases and virtual microscopes for online examination can be developed using existing computer and internet technologies. Resulting databases, made up of hierarchically organized, multiresolution images, enable rapid, seamless navigation through the vast image datasets generated by high-resolution scanning. Tools for visualization and annotation of virtual microscope slides enable remote and universal data sharing. Interactive visualization of a complete series of brain sections digitized at subneuronal levels of resolution offers fine grain and large-scale localization and quantification of many aspects of neural organization and structure. The method is straightforward and replicable; it can increase accessibility and facilitate sharing of neuroanatomical data. It provides an opportunity for capturing and preserving irreplaceable, archival neurohistological collections and making them available to all scientists in perpetuity, if resources could be obtained from hitherto uninterested agencies of scientific support. © 2011 New York Academy of Sciences.
Computer Graphics Instruction in VizClass
ERIC Educational Resources Information Center
Grimes, Douglas; Warschauer, Mark; Hutchinson, Tara; Kuester, Falko
2005-01-01
"VizClass" is a university classroom environment designed to offer students in computer graphics and engineering courses up-to-date visualization technologies. Three digital whiteboards and a three-dimensional stereoscopic display provide complementary display surfaces. Input devices include touchscreens on the digital whiteboards, remote…
ERIC Educational Resources Information Center
Negroponte, Nicholas
1995-01-01
According to the author's book "Being Digital," our world is shifting from atoms to bits. Digitally rendered information, combined with personal computing power and networks, will make computers active participants in our everyday lives. "Teaching-disabled" classrooms will move from passivity to active participation and…
Automatic measurements and computations for radiochemical analyses
Rosholt, J.N.; Dooley, J.R.
1960-01-01
In natural radioactive sources the most important radioactive daughter products useful for geochemical studies are protactinium-231, the alpha-emitting thorium isotopes, and the radium isotopes. To resolve the abundances of these thorium and radium isotopes by their characteristic decay and growth patterns, a large number of repeated alpha activity measurements on the two chemically separated elements were made over extended periods of time. Alpha scintillation counting with automatic measurements and sample changing is used to obtain the basic count data. Generation of the required theoretical decay and growth functions, varying with time, and the least squares solution of the overdetermined simultaneous count rate equations are done with a digital computer. Examples of the complex count rate equations which may be solved and results of a natural sample containing four alpha-emitting isotopes of thorium are illustrated. These methods facilitate the determination of the radioactive sources on the large scale required for many geochemical investigations.
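The least-squares step described above can be illustrated generically: with repeated count-rate measurements modeled as a sum of exponential decay terms, the component amplitudes follow from an overdetermined linear system. The decay constants below are hypothetical placeholders, not those of the thorium or radium isotopes:

import numpy as np

# Hypothetical decay constants (per day) for three components
lambdas = np.array([0.05, 0.20, 0.80])
t = np.linspace(0.0, 30.0, 40)                 # measurement times (days)

true_amplitudes = np.array([120.0, 60.0, 15.0])
design = np.exp(-np.outer(t, lambdas))         # columns: exp(-lambda_j * t_i)
counts = design @ true_amplitudes + np.random.normal(scale=2.0, size=t.size)

# Least-squares solution of the overdetermined system design @ a = counts
amplitudes, *_ = np.linalg.lstsq(design, counts, rcond=None)
print(amplitudes)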
CG2Real: Improving the Realism of Computer Generated Images Using a Large Collection of Photographs.
Johnson, Micah K; Dale, Kevin; Avidan, Shai; Pfister, Hanspeter; Freeman, William T; Matusik, Wojciech
2011-09-01
Computer-generated (CG) images have achieved high levels of realism. This realism, however, comes at the cost of long and expensive manual modeling, and often humans can still distinguish between CG and real images. We introduce a new data-driven approach for rendering realistic imagery that uses a large collection of photographs gathered from online repositories. Given a CG image, we retrieve a small number of real images with similar global structure. We identify corresponding regions between the CG and real images using a mean-shift cosegmentation algorithm. The user can then automatically transfer color, tone, and texture from matching regions to the CG image. Our system only uses image processing operations and does not require a 3D model of the scene, making it fast and easy to integrate into digital content creation workflows. Results of a user study show that our hybrid images appear more realistic than the originals.
USSR and Eastern Europe Scientific Abstracts, Engineering and Equipment. Number 25.
1976-10-29
...is necessary to consider the problem of diffraction at a cylindrical cavity. Some methods of solving this problem become very unwieldy when...applied to such a cavity of large wave dimensions, even with the aid of a digital computer. In the simpler Watson method, the series representing the...potential of cylindrical waves is transformed to an integral in the complex plane and evaluated as the sum of residues. A difficulty in this method...
1990-02-01
...elevation points, necessitating a large geometric data base that requires heavy computation loads for rendering. The second innovative technique is the...
Readout and DAQ for Pixel Detectors
NASA Astrophysics Data System (ADS)
Platkevic, Michal
2010-01-01
Data readout and acquisition control of pixel detectors demand the transfer of very large amounts of data between the detector and the computer. For this purpose, dedicated interfaces are used, designed with a focus on features such as speed, small dimensions, and flexibility of use, and built around digital signal processors, field-programmable gate arrays (FPGAs), and USB communication ports. This work summarizes the readout and DAQ system built for state-of-the-art pixel detectors of the Medipix family.
Mineral resource of the month: cultured quartz crystal
,
2008-01-01
The article presents information on cultured quartz crystals, a mineral used in mobile phones, computers, clocks and other devices controlled by digital circuits. Cultured quartz, which is synthetically produced in large pressurized vessels known as autoclaves, is useful in electronic circuits for precise filtration, frequency control and timing for consumer and military use. Several ingredients are used in producing cultured quartz, including seed crystals, lascas, a solution of sodium hydroxide or sodium carbonate, lithium salts and deionized water.
NASA Astrophysics Data System (ADS)
Papers are presented on such topics as the wireless data network in PCS, advances in digital mobile networks, ATM switching experiments, broadband applications, network planning, and advances in SONET/SDH implementations. Consideration is also given to gigabit computer networks, techniques for modeling large high-speed networks, coding and modulation, the next-generation lightwave system, signaling systems for broadband ISDN, satellite technologies, and advances in standardization of low-rate signal processing.
NASA Technical Reports Server (NTRS)
Reeves, P. M.; Campbell, G. S.; Ganzer, V. M.; Joppa, R. G.
1974-01-01
A method is described for generating time histories which model the frequency content and certain non-Gaussian probability characteristics of atmospheric turbulence, including the large gusts and patchy nature of turbulence. Methods for generating the time histories using either analog or digital computation are described. A STOL airplane was programmed into a 6-degree-of-freedom flight simulator, and turbulence time histories from several atmospheric turbulence models were introduced. The pilots' reactions are described.
NASA Technical Reports Server (NTRS)
Patterson, G.
1973-01-01
The data processing procedures and the computer programs were developed to predict structural responses using the Impulse Transfer Function (ITF) method. There are three major steps in the process: (1) analog-to-digital (A-D) conversion of the test data to produce Phase I digital tapes (2) processing of the Phase I digital tapes to extract ITF's and storing them in a permanent data bank, and (3) predicting structural responses to a set of applied loads. The analog to digital conversion is performed by a standard package which will be described later in terms of the contents of the resulting Phase I digital tape. Two separate computer programs have been developed to perform the digital processing.
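Step (3), predicting a structural response from stored impulse transfer functions, amounts to convolving each applied load history with the corresponding ITF and summing the contributions; a generic discrete-time sketch with made-up data (not the program's actual processing) is:

import numpy as np

def predict_response(loads, itfs, dt):
    """Sum of convolutions of each load time history with its impulse transfer function."""
    response = None
    for load, h in zip(loads, itfs):
        contrib = np.convolve(load, h)[: len(load)] * dt
        response = contrib if response is None else response + contrib
    return response

dt = 0.001                                           # sample interval, s (hypothetical)
t = np.arange(0, 1, dt)
itf = np.exp(-5 * t) * np.sin(2 * np.pi * 20 * t)    # made-up decaying impulse response
load = np.where(t < 0.05, 1.0, 0.0)                  # short pulse load
resp = predict_response([load], [itf], dt)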
Lin, Wei-Shao; Metz, Michael J; Pollini, Adrien; Ntounis, Athanasios; Morton, Dean
2014-12-01
This dental technique report describes a digital workflow with digital data acquisition at the implant level, computer-aided design and computer-aided manufacturing fabricated, tissue-colored, anodized titanium framework, individually luted zirconium oxide restorations, and autopolymerizing injection-molded acrylic resin to fabricate an implant-supported, metal-ceramic-resin fixed complete dental prosthesis in an edentulous mandible. The 1-step computer-aided design and computer-aided manufacturing fabrication of titanium framework and zirconium oxide restorations can provide a cost-effective alternative to the conventional metal-resin fixed complete dental prosthesis. Copyright © 2014 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
Simultaneous G-Quadruplex DNA Logic.
Bader, Antoine; Cockroft, Scott L
2018-04-03
A fundamental principle of digital computer operation is Boolean logic, where inputs and outputs are described by binary integer voltages. Similarly, inputs and outputs may be processed on the molecular level as exemplified by synthetic circuits that exploit the programmability of DNA base-pairing. Unlike modern computers, which execute large numbers of logic gates in parallel, most implementations of molecular logic have been limited to single computing tasks, or sensing applications. This work reports three G-quadruplex-based logic gates that operate simultaneously in a single reaction vessel. The gates respond to unique Boolean DNA inputs by undergoing topological conversion from duplex to G-quadruplex states that were resolved using a thioflavin T dye and gel electrophoresis. The modular, addressable, and label-free approach could be incorporated into DNA-based sensors, or used for resolving and debugging parallel processes in DNA computing applications. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Real-time depth processing for embedded platforms
NASA Astrophysics Data System (ADS)
Rahnama, Oscar; Makarov, Aleksej; Torr, Philip
2017-05-01
Obtaining depth information of a scene is an important requirement in many computer-vision and robotics applications. For embedded platforms, passive stereo systems have many advantages over their active counterparts (i.e. LiDAR, Infrared). They are power efficient, cheap, robust to lighting conditions and inherently synchronized to the RGB images of the scene. However, stereo depth estimation is a computationally expensive task that operates over large amounts of data. For embedded applications which are often constrained by power consumption, obtaining accurate results in real-time is a challenge. We demonstrate a computationally and memory efficient implementation of a stereo block-matching algorithm in FPGA. The computational core achieves a throughput of 577 fps at standard VGA resolution whilst consuming less than 3 Watts of power. The data is processed using an in-stream approach that minimizes memory-access bottlenecks and best matches the raster scan readout of modern digital image sensors.
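A software reference for the block-matching core (sum of absolute differences over a fixed window, searched along the epipolar line) can be sketched in a few lines of Python; this is a naive CPU version for clarity, not the FPGA in-stream design described above:

import numpy as np

def block_match_disparity(left, right, max_disp=32, win=4):
    """Naive SAD block matching on rectified grayscale images (float arrays)."""
    h, w = left.shape
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(win, h - win):
        for x in range(win + max_disp, w - win):
            patch = left[y - win:y + win + 1, x - win:x + win + 1]
            costs = [
                np.abs(patch - right[y - win:y + win + 1, x - d - win:x - d + win + 1]).sum()
                for d in range(max_disp)
            ]
            disp[y, x] = int(np.argmin(costs))
    return disp

left = np.random.rand(64, 96)
right = np.roll(left, -4, axis=1)                     # synthetic 4-pixel shift
d = block_match_disparity(left, right, max_disp=16, win=3)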
ERIC Educational Resources Information Center
Rasmusson, Maria; Åberg-Bengtsson, Lisbeth
2015-01-01
Data from a Swedish PISA-sample were used (1) to identify a digital reading factor, (2) to investigate gender differences in this factor (if found), and (3) to explore how computer game playing might relate to digital reading performance and gender. The analyses were conducted with structural equation modeling techniques. In addition to an overall…
User’s Manual for the Modular Analysis-Package Libraries ANAPAC and TRANL
1977-09-01
Keywords: computer software, Fourier transforms, computer software library, interpolation software, digitized data. ...disregarded to give the user a simplified plot. (b) The last digit of ISPACE determines the type of line to be drawn, provided KODE is not negative. If the last digit of ISPACE is 0, a solid line is drawn; 1, a dashed line (- - -); 2, a dotted line (....); 3, a dash-dot line.
Digital computer operation of a nuclear reactor
Colley, R.W.
1982-06-29
A method is described for the safe operation of a complex system such as a nuclear reactor using a digital computer. The computer is supplied with a data base containing a list of the safe states of the reactor and a list of operating instructions for achieving a safe state. When the actual state of the reactor does not correspond to a listed safe state, the computer selects operating instructions to return the reactor to a safe state.
Digital computer operation of a nuclear reactor
Colley, Robert W.
1984-01-01
A method is described for the safe operation of a complex system such as a nuclear reactor using a digital computer. The computer is supplied with a data base containing a list of the safe states of the reactor and a list of operating instructions for achieving a safe state. When the actual state of the reactor does not correspond to a listed safe state, the computer selects operating instructions to return the reactor to a safe state.
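In outline, the scheme is a table lookup: the data base pairs safe states with the operating instructions that reach them. A toy sketch of such a lookup follows; the states, conditions, and instructions are entirely invented for illustration and are not from the patent:

# Hypothetical illustration of a safe-state lookup; names and values are invented.
SAFE_STATES = {"shutdown", "low_power_steady", "full_power_steady"}
RECOVERY_INSTRUCTIONS = {
    "overtemperature": ["insert control rods", "increase coolant flow"],
    "low_coolant_flow": ["reduce power", "start auxiliary pump"],
}

def select_instructions(actual_state, condition):
    """Return recovery instructions when the actual state is not a listed safe state."""
    if actual_state in SAFE_STATES:
        return []
    return RECOVERY_INSTRUCTIONS.get(condition, ["trip reactor to shutdown"])

print(select_instructions("transient", "overtemperature"))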
NCSTRL: Design and Deployment of a Globally Distributed Digital Library.
ERIC Educational Resources Information Center
Davies, James R.; Lagoze, Carl
2000-01-01
Discusses the development of a digital library architecture that allows the creation of digital libraries within the World Wide Web. Describes a digital library, NCSTRL (Networked Computer Science Technical Research Library), within which the work has taken place and explains Dienst, a protocol and architecture for distributed digital libraries.…
,
1990-01-01
The development of geographic information systems (GIS) is a rapidly growing industry that supports natural resources studies, land management, environmental analysis, and urban and transportation planning. The increasing use of computers for storing and analyzing earth science information has greatly expanded the demand for digital cartographic and geographic data. Digital cartography involves the collection, storage, processing, analysis, and display of map data with the aid of computers. The U.S. Geological Survey (USGS), the Nation's largest earth science research agency, through its National Mapping Program, has expanded digital cartography operations to include the collection of elevation, planimetric, land use and land cover, and geographic names information in digital form. This digital information is available on 9-track magnetic tapes and, in the case of 1:2,000,000-scale planimetric digital line graph data, in Compact Disc Read Only Memory (CD-ROM) format. Digital information can be used with all types of geographic and land information systems.
A hardware implementation of the discrete Pascal transform for image processing
NASA Astrophysics Data System (ADS)
Goodman, Thomas J.; Aburdene, Maurice F.
2006-02-01
The discrete Pascal transform is a polynomial transform with applications in pattern recognition, digital filtering, and digital image processing. It already has been shown that the Pascal transform matrix can be decomposed into a product of binary matrices. Such a factorization leads to a fast and efficient hardware implementation without the use of multipliers, which consume large amounts of hardware. We recently developed a field-programmable gate array (FPGA) implementation to compute the Pascal transform. Our goal was to demonstrate the computational efficiency of the transform while keeping hardware requirements at a minimum. Images are uploaded into memory from a remote computer prior to processing, and the transform coefficients can be offloaded from the FPGA board for analysis. Design techniques like as-soon-as-possible scheduling and adder sharing allowed us to develop a fast and efficient system. An eight-point, one-dimensional transform completes in 13 clock cycles and requires only four adders. An 8x8 two-dimensional transform completes in 240 cycles and requires only a top-level controller in addition to the one-dimensional transform hardware. Finally, through minor modifications to the controller, the transform operations can be pipelined to achieve 100% utilization of the four adders, allowing one eight-point transform to complete every seven clock cycles.
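For reference, a software version of an eight-point Pascal transform can be written directly from the matrix definition (here assumed to be the lower-triangular signed-binomial form commonly used for the discrete Pascal transform; the binary-matrix factorization and FPGA scheduling described above are not reproduced):

import numpy as np
from math import comb

def pascal_matrix(n):
    """Lower-triangular signed-binomial Pascal matrix (one common definition)."""
    return np.array([[(-1) ** j * comb(i, j) if j <= i else 0 for j in range(n)]
                     for i in range(n)])

P = pascal_matrix(8)
x = np.arange(8, dtype=float)
y = P @ x                      # forward transform of an 8-point signal
print(np.allclose(P @ y, x))   # this signed form is an involution: applying P twice recovers x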
Digital Screen Media and Cognitive Development.
Anderson, Daniel R; Subrahmanyam, Kaveri
2017-11-01
In this article, we examine the impact of digital screen devices, including television, on cognitive development. Although we know that young infants and toddlers are using touch screen devices, we know little about their comprehension of the content that they encounter on them. In contrast, research suggests that children begin to comprehend child-directed television starting at ∼2 years of age. The cognitive impact of these media depends on the age of the child, the kind of programming (educational programming versus programming produced for adults), the social context of viewing, as well the particular kind of interactive media (eg, computer games). For children <2 years old, television viewing has mostly negative associations, especially for language and executive function. For preschool-aged children, television viewing has been found to have both positive and negative outcomes, and a large body of research suggests that educational television has a positive impact on cognitive development. Beyond the preschool years, children mostly consume entertainment programming, and cognitive outcomes are not well explored in research. The use of computer games as well as educational computer programs can lead to gains in academically relevant content and other cognitive skills. This article concludes by identifying topics and goals for future research and provides recommendations based on current research-based knowledge. Copyright © 2017 by the American Academy of Pediatrics.
Hardware synthesis from DDL. [Digital Design Language for computer aided design and test of LSI
NASA Technical Reports Server (NTRS)
Shah, A. M.; Shiva, S. G.
1981-01-01
The details of the digital systems can be conveniently input into the design automation system by means of Hardware Description Languages (HDL). The Computer Aided Design and Test (CADAT) system at NASA MSFC is used for the LSI design. The Digital Design Language (DDL) has been selected as HDL for the CADAT System. DDL translator output can be used for the hardware implementation of the digital design. This paper addresses problems of selecting the standard cells from the CADAT standard cell library to realize the logic implied by the DDL description of the system.
NASA Astrophysics Data System (ADS)
Yamaguchi, Masahiro; Haneishi, Hideaki; Fukuda, Hiroyuki; Kishimoto, Junko; Kanazawa, Hiroshi; Tsuchida, Masaru; Iwama, Ryo; Ohyama, Nagaaki
2006-01-01
In addition to the great advancement of high-resolution and large-screen imaging technology, the issue of color is now receiving considerable attention as another aspect than the image resolution. It is difficult to reproduce the original color of subject in conventional imaging systems, and that obstructs the applications of visual communication systems in telemedicine, electronic commerce, and digital museum. To breakthrough the limitation of conventional RGB 3-primary systems, "Natural Vision" project aims at an innovative video and still-image communication technology with high-fidelity color reproduction capability, based on spectral information. This paper summarizes the results of NV project including the development of multispectral and multiprimary imaging technologies and the experimental investigations on the applications to medicine, digital archives, electronic commerce, and computer graphics.
Cui, Yang; Hanley, Luke
2015-06-01
ChiMS is an open-source data acquisition and control software program written within LabVIEW for high speed imaging and depth profiling mass spectrometers. ChiMS can also transfer large datasets from a digitizer to computer memory at high repetition rate, save data to hard disk at high throughput, and perform high speed data processing. The data acquisition mode generally simulates a digital oscilloscope, but with peripheral devices integrated for control as well as advanced data sorting and processing capabilities. Customized user-designed experiments can be easily written based on several included templates. ChiMS is additionally well suited to non-laser based mass spectrometers imaging and various other experiments in laser physics, physical chemistry, and surface science.
Digital Textbooks. Research Brief
ERIC Educational Resources Information Center
Johnston, Howard
2011-01-01
Despite their growing popularity, digital alternatives to conventional textbooks are stirring up controversy. With the introduction of tablet computers, and the growing trend toward "cloud computing" and "open source" software, the trend is accelerating because costs are coming down and free or inexpensive materials are becoming more available.…
Computer considerations for real time simulation of a generalized rotor model
NASA Technical Reports Server (NTRS)
Howe, R. M.; Fogarty, L. E.
1977-01-01
Scaled equations were developed to meet requirements for real time computer simulation of the rotor system research aircraft. These equations form the basis for consideration of both digital and hybrid mechanization for real time simulation. For all-digital simulation, estimates of the required speed in terms of equivalent operations per second are developed based on the complexity of the equations and the required integration frame rates. For both conventional hybrid simulation and hybrid simulation using time-shared analog elements, the amount of required equipment is estimated along with a consideration of the dynamic errors. Conventional hybrid mechanization using analog simulation of those rotor equations which involve rotor-spin frequencies (this constitutes the bulk of the equations) requires too much analog equipment. Hybrid simulation using time-sharing techniques for the analog elements appears possible with a reasonable amount of analog equipment. All-digital simulation with affordable general-purpose computers is not possible because of speed limitations, but specially configured digital computers do have the required speed and constitute the recommended approach.
Helicopter rotor and engine sizing for preliminary performance estimation
NASA Technical Reports Server (NTRS)
Talbot, P. D.; Bowles, J. V.; Lee, H. C.
1986-01-01
Methods are presented for estimating some of the more fundamental design variables of single-rotor helicopters (tip speed, blade area, disk loading, and installed power) based on design requirements (speed, weight, fuselage drag, and design hover ceiling). The well-known constraints of advancing-blade compressibility and retreating-blade stall are incorporated into the estimation process, based on an empirical interpretation of rotor performance data from large-scale wind-tunnel tests. Engine performance data are presented and correlated with a simple model usable for preliminary design. When approximate results are required quickly, these methods may be more convenient to use and provide more insight than large digital computer programs.
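As a flavor of the kind of closed-form preliminary estimate involved (not the report's empirical correlations), ideal hover power from momentum theory follows directly from thrust, disk area, and air density; the aircraft numbers below are placeholders:

import math

def ideal_hover_power(weight_n, rotor_radius_m, rho=1.225):
    """Momentum-theory induced hover power: P = T**1.5 / sqrt(2 * rho * A)."""
    disk_area = math.pi * rotor_radius_m ** 2
    return weight_n ** 1.5 / math.sqrt(2.0 * rho * disk_area)

# Hypothetical light helicopter: 10 kN gross weight, 5 m rotor radius
power_kw = ideal_hover_power(10_000.0, 5.0) / 1000.0
print(power_kw, "kW (ideal induced power; excludes profile and installation losses)")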
NASA Technical Reports Server (NTRS)
Svalbonas, V.; Levine, H.
1975-01-01
The theoretical analysis background for the STARS-2P nonlinear inelastic program is discussed. The theory involved is amenable for the analysis of large deflection inelastic behavior in axisymmetric shells of revolution subjected to axisymmetric loadings. The analysis is capable of considering such effects as those involved in nonproportional and cyclic loading conditions. The following are also discussed: orthotropic nonlinear kinematic hardening theory; shell wall cross sections and discrete ring stiffeners; the coupled axisymmetric large deflection elasto-plastic torsion problem; and the provision for the inelastic treatment of smeared stiffeners, isogrid, and waffle wall constructions.
Memory Network For Distributed Data Processors
NASA Technical Reports Server (NTRS)
Bolen, David; Jensen, Dean; Millard, ED; Robinson, Dave; Scanlon, George
1992-01-01
Universal Memory Network (UMN) is modular, digital data-communication system enabling computers with differing bus architectures to share 32-bit-wide data between locations up to 3 km apart with less than one millisecond of latency. Makes it possible to design sophisticated real-time and near-real-time data-processing systems without data-transfer "bottlenecks". This enterprise network permits transmission of volume of data equivalent to an encyclopedia each second. Facilities benefiting from Universal Memory Network include telemetry stations, simulation facilities, power-plants, and large laboratories or any facility sharing very large volumes of data. Main hub of UMN is reflection center including smaller hubs called Shared Memory Interfaces.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Torquato, S.; Kim, I.C.; Cule, D.
1999-02-01
We generalize the Brownian motion simulation method of Kim and Torquato [J. Appl. Phys. 68, 3892 (1990)] to compute the effective conductivity, dielectric constant and diffusion coefficient of digitized composite media. This is accomplished by first generalizing the first-passage-time equations to treat first-passage regions of arbitrary shape. We then develop the appropriate first-passage-time equations for digitized media: first-passage squares in two dimensions and first-passage cubes in three dimensions. A severe test case to prove the accuracy of the method is the two-phase periodic checkerboard in which conduction, for sufficiently large phase contrasts, is dominated by corners that join two conducting-phase pixels. Conventional numerical techniques (such as finite differences or elements) do not accurately capture the local fields here for reasonable grid resolution and hence lead to inaccurate estimates of the effective conductivity. By contrast, we show that our algorithm yields accurate estimates of the effective conductivity of the periodic checkerboard for widely different phase conductivities. Finally, we illustrate our method by computing the effective conductivity of the random checkerboard for a wide range of volume fractions and several phase contrast ratios. These results always lie within rigorous four-point bounds on the effective conductivity. © 1999 American Institute of Physics.
El-Shafey, A; Kassab, A
2013-04-01
The purpose of the present study was to provide a detailed computed tomography (CT) and cross-sectional anatomic reference of the normal metatarsus and digits for the camel and buffalo, as well as to compare the metatarsus and digits of these animals to establish a basis for the diagnosis of their diseases. Advantages, including depiction of detailed cross-sectional anatomy, improved contrast resolution and computer reformatting, make CT a potentially valuable diagnostic technique. The hind limbs of 12 healthy adult camels and buffaloes were used. Clinically relevant anatomic structures were identified and labelled at each level in the corresponding images (CT and anatomic slices). CT images were used to identify the bony and soft tissue structures of the metatarsus and digits. The knowledge of the normal anatomy of the camel and buffalo metatarsus and digits would serve as an initial reference for the evaluation of CT images in these species. © 2012 Blackwell Verlag GmbH.
Digital Humanities: What Can Libraries Offer?
ERIC Educational Resources Information Center
Wong, Shun Han Rebekah
2016-01-01
The collaborative aspect of digital humanities is one of the core values of the field. Specialists and organizations involved in digital humanities partnerships may include individual scholars focusing on a particular area, multiple scholars across disciplines, computer scientists, or digital humanities centers. Through a quantitative analysis of…
NASA Technical Reports Server (NTRS)
1978-01-01
A triplex digital flight control system was installed in a NASA F-8C airplane to provide fail-operate, full-authority control. The triplex digital computers and interface circuitry process the pilot commands and aircraft motion feedback parameters according to the selected control laws, and they output the surface commands as an analog signal to the servoelectronics for position control of the aircraft's power actuators. The system and theory of operation of the computer bypass and servoelectronics are described, and an automated ground test for each axis is included.
The computation of pi to 29,360,000 decimal digits using Borweins' quartically convergent algorithm
NASA Technical Reports Server (NTRS)
Bailey, David H.
1988-01-01
The quartically convergent numerical algorithm developed by Borwein and Borwein (1987) for 1/pi is implemented via a prime-modulus-transform multiprecision technique on the NASA Ames Cray-2 supercomputer to compute the first 2.936 x 10 to the 7th digits of the decimal expansion of pi. The history of pi computations is briefly recalled; the most recent algorithms are characterized; the implementation procedures are described; and samples of the output listing are presented. Statistical analyses show that the present decimal expansion is completely random, with only acceptable numbers of long repeating strings and single-digit runs.
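For reference, Borweins' quartically convergent iteration for 1/pi can be reproduced at modest precision with an arbitrary-precision library; each pass roughly quadruples the number of correct digits. This sketch uses mpmath rather than the prime-modulus-transform multiprecision arithmetic described above:

from mpmath import mp, mpf, sqrt

mp.dps = 120                       # working precision (decimal digits)
y = sqrt(mpf(2)) - 1               # y_0
a = 6 - 4 * sqrt(mpf(2))           # a_0
for k in range(4):                 # a_k converges quartically to 1/pi
    r = (1 - y ** 4) ** mpf("0.25")
    y = (1 - r) / (1 + r)
    a = a * (1 + y) ** 4 - mpf(2) ** (2 * k + 3) * y * (1 + y + y ** 2)

print(1 / a)                       # over 100 correct digits of pi at this precision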
FPGA-Based Stochastic Echo State Networks for Time-Series Forecasting.
Alomar, Miquel L; Canals, Vincent; Perez-Mora, Nicolas; Martínez-Moll, Víctor; Rosselló, Josep L
2016-01-01
Hardware implementation of artificial neural networks (ANNs) allows exploiting the inherent parallelism of these systems. Nevertheless, they require a large amount of resources in terms of area and power dissipation. Recently, Reservoir Computing (RC) has arisen as a strategic technique to design recurrent neural networks (RNNs) with simple learning capabilities. In this work, we show a new approach to implement RC systems with digital gates. The proposed method is based on the use of probabilistic computing concepts to reduce the hardware required to implement different arithmetic operations. The result is the development of a highly functional system with low hardware resources. The presented methodology is applied to chaotic time-series forecasting.
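A conventional (non-stochastic, floating-point) echo state network of the kind the hardware above approximates can be sketched in a few lines; the reservoir size, scaling, and spectral-radius target are arbitrary choices for illustration:

import numpy as np

rng = np.random.default_rng(0)
n_res, n_in = 200, 1
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))      # scale spectral radius below 1

def run_reservoir(u):
    """Collect reservoir states for an input sequence u of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ u_t + W @ x)
        states.append(x.copy())
    return np.array(states)

# Train a linear readout to predict the next sample of a sine wave
t = np.arange(2000) * 0.05
u = np.sin(t).reshape(-1, 1)
X = run_reservoir(u[:-1])
W_out = np.linalg.lstsq(X[200:], u[201:], rcond=None)[0]   # discard warm-up states
print(X[-1] @ W_out, u[-1])                                # predicted vs. actual next sample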
Distributing digital video to multiple computers
Murray, James A.
2004-01-01
Video is an effective teaching tool, and live video microscopy is especially helpful in teaching dissection techniques and the anatomy of small neural structures. Digital video equipment is more affordable now and allows easy conversion from older analog video devices. I here describe a simple technique for bringing digital video from one camera to all of the computers in a single room. This technique allows students to view and record the video from a single camera on a microscope. PMID:23493464
Digital Immersive Virtual Environments and Instructional Computing
ERIC Educational Resources Information Center
Blascovich, Jim; Beall, Andrew C.
2010-01-01
This article reviews theory and research relevant to the development of digital immersive virtual environment-based instructional computing systems. The review is organized within the context of a multidimensional model of social influence and interaction within virtual environments that models the interaction of four theoretical factors: theory…
A system for automatic analysis of blood pressure data for digital computer entry
NASA Technical Reports Server (NTRS)
Miller, R. L.
1972-01-01
Operation of automatic blood pressure data system is described. Analog blood pressure signal is analyzed by three separate circuits, systolic, diastolic, and cycle defect. Digital computer output is displayed on teletype paper tape punch and video screen. Illustration of system is included.
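In modern terms, the per-beat extraction of systolic and diastolic values from the analog pressure waveform could be done digitally with simple peak detection; the sketch below uses a synthetic waveform and a hypothetical sampling rate, and is not the flight-hardware circuitry described above:

import numpy as np
from scipy.signal import find_peaks

fs = 200                                       # samples per second (hypothetical)
t = np.arange(0, 10, 1 / fs)
bp = 100 + 20 * np.sin(2 * np.pi * 1.2 * t)    # crude stand-in for an arterial waveform (~72 bpm)

sys_idx, _ = find_peaks(bp, distance=fs * 0.5)    # systolic peaks
dia_idx, _ = find_peaks(-bp, distance=fs * 0.5)   # diastolic troughs
print("systolic ~", bp[sys_idx].mean(), "diastolic ~", bp[dia_idx].mean())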
Micro computed tomography (CT) scanned anatomical gateway to insect pest bioinformatics
USDA-ARS?s Scientific Manuscript database
An international collaboration to establish an interactive Digital Video Library for a Systems Biology Approach to study the Asian citrus Psyllid and psyllid genomics/proteomics interactions is demonstrated. Advances in micro-CT, digital computed tomography (CT) scan uses X-rays to make detailed pic...
Peering into the Future of Advertising.
ERIC Educational Resources Information Center
Hsia, H. J.
All areas in mass communications (i.e., newspapers, magazines, television, radio, films, photos, and books) will be transformed because of the increasing sophistication of computer users, the decreasing costs for interactive computer systems, and the global adoption of integrated services digital networks (ISDN). ISDN refer to the digitization of…
Teaching Multimedia Data Protection through an International Online Competition
ERIC Educational Resources Information Center
Battisti, F.; Boato, G.; Carli, M.; Neri, A.
2011-01-01
Low-cost personal computers, wireless access technologies, the Internet, and computer-equipped classrooms allow the design of novel and complementary methodologies for teaching digital information security in electrical engineering curricula. The challenges of the current digital information era require experts who are effectively able to…
Photography/Digital Imaging: Parallel & Paradoxical Histories.
ERIC Educational Resources Information Center
Witte, Mary Stieglitz
With the introduction of photography and photomechanical printing processes in the 19th century, the first age of machine pictures and reproductions emerged. The 20th century introduced computer image processing systems, creating a digital imaging revolution. Rather than concentrating on the adversarial aspects of the computer's influence on…
NASA Technical Reports Server (NTRS)
1973-01-01
Design and development efforts for a spaceborne modular computer system are reported. An initial baseline description is followed by an interface design that includes definition of the overall system response to all classes of failure. Final versions for the register level designs for all module types were completed. Packaging, support and control executive software, including memory utilization estimates and design verification plan, were formalized to insure a soundly integrated design of the digital computer system.
Photogrammetry and computer-aided piping design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keneflick, J.F.; Chirillo, R.D.
1985-02-18
Three-dimensional measurements taken from photographs of a plant model can be digitized and linked with computer-aided piping design. This can short-cut the design and construction of new plants and expedite repair and retrofitting projects. Some designers bridge the gap between model and computer by digitizing from orthographic prints obtained via orthophotography or the laser scanning of model sections. The processing of the digitized valve and fitting data is described in this paper. The marriage of photogrammetry and computer-aided piping design can economically produce such numerical drawings.
Examples of challenges and opportunities in visual analysis in the digital humanities
NASA Astrophysics Data System (ADS)
Rushmeier, Holly; Pintus, Ruggero; Yang, Ying; Wong, Christiana; Li, David
2015-03-01
The massive digitization of books and manuscripts has converted millions of works that were once only physical into electronic documents. This conversion has made it possible for scholars to study large bodies of work, rather than just individual texts. This has offered new opportunities for scholarship in the humanities. Much previous work on digital collections has relied on optical character recognition and focused on the textual content of books. New work is emerging that is analyzing the visual layout and content of books and manuscripts. We present two different digital humanities projects in progress that present new opportunities for extracting data about the past, with new challenges for designing systems for scholars to interact with this data. The first project we consider is the layout and spectral content of thousands of pages from medieval manuscripts. We present the techniques used to study content variations in sets of similar manuscripts, and to study material variations that may indicate the location of manuscript production. The second project is the analysis of representations in the complete archive of Vogue magazine over 120 years. We present samples of applying computer vision techniques to understanding the changes in representation of women over time.
Digital terrain tapes: user guide
,
1980-01-01
DMATC's digital terrain tapes are a by-product of the agency's efforts to streamline the production of raised-relief maps. In the early 1960's DMATC developed the Digital Graphics Recorder (DGR) system that introduced new digitizing techniques and processing methods into the field of three-dimensional mapping. The DGR system consisted of an automatic digitizing table and a computer system that recorded a grid of terrain elevations from traces of the contour lines on standard topographic maps. A sequence of computer accuracy checks was performed and then the elevations of grid points not intersected by contour lines were interpolated. The DGR system produced computer magnetic tapes which controlled the carving of plaster forms used to mold raised-relief maps. It was realized almost immediately that this relatively simple tool for carving plaster molds had enormous potential for storing, manipulating, and selectively displaying (either graphically or numerically) a vast number of terrain elevations. As the demand for the digital terrain tapes increased, DMATC began developing increasingly advanced digitizing systems and now operates the Digital Topographic Data Collection System (DTDCS). With DTDCS, two types of data (elevations as contour lines and points, and stream and ridge lines) are sorted, matched, and resorted to obtain a grid of elevation values for every 0.01 inch on each map (approximately 200 feet on the ground). Undefined points on the grid are found by either linear or planar interpolation.
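Interpolating an undefined grid point from its neighbors, as the DGR system did, can be illustrated with simple bilinear interpolation on a small elevation grid; the elevation values and coordinates below are made up:

import numpy as np

def bilinear(grid, x, y):
    """Bilinear interpolation of elevation at fractional grid coordinates (x, y)."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    dx, dy = x - x0, y - y0
    return ((1 - dx) * (1 - dy) * grid[y0, x0] + dx * (1 - dy) * grid[y0, x0 + 1]
            + (1 - dx) * dy * grid[y0 + 1, x0] + dx * dy * grid[y0 + 1, x0 + 1])

dem = np.array([[100.0, 110.0], [120.0, 140.0]])   # 2x2 patch of elevations (feet)
print(bilinear(dem, 0.5, 0.5))                     # 117.5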
A new procedure of modal parameter estimation for high-speed digital image correlation
NASA Astrophysics Data System (ADS)
Huňady, Róbert; Hagara, Martin
2017-09-01
The paper deals with the use of 3D digital image correlation in determining the modal parameters of mechanical systems. It is a non-contact optical method that uses precise, high-resolution digital cameras to measure full-field spatial displacements and strains of bodies. Most often the method is used for testing components or determining the material properties of specimens. When high-speed cameras are used for the measurement, the correlation system can capture various dynamic behaviors, including vibration, which makes the method potentially useful in experimental modal analysis. For that purpose, the authors proposed a measuring chain for the correlation system Q-450 and developed a software application called DICMAN 3D, which allows the direct use of this system in modal testing. The application provides post-processing of the measured data and estimation of modal parameters. It has its own graphical user interface, in which several algorithms for determining natural frequencies, mode shapes and damping of particular modes of vibration are implemented. The paper describes the basic principle of the new estimation procedure, which is central to the post-processing. Since the FRF matrix resulting from the measurement is usually relatively large, estimating modal parameters directly from it may be time-consuming and may occupy a large part of computer memory. The procedure implemented in DICMAN 3D significantly reduces memory requirements and computational time while achieving high accuracy of the modal parameters. Its computational efficiency is particularly evident when the FRF matrix consists of thousands of measurement DOFs. The functionality of the application is demonstrated on a practical example in which the modal parameters of a composite plate excited by an impact hammer were determined. To verify the results, a separate experiment was conducted in which the vibration responses were measured with conventional acceleration sensors. In both cases, MIMO analysis was performed.
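The abstract does not spell out the reduction step, but a common way to shrink an FRF matrix with thousands of measurement DOFs before modal fitting is a truncated SVD. The sketch below illustrates that generic strategy under stated assumptions; it is not the actual DICMAN 3D procedure, and the peak-picking helper is deliberately crude.

```python
import numpy as np

def principal_frfs(H, n_keep=10):
    """Condense a large FRF matrix H (n_dof x n_freq) into a few principal
    response functions via a truncated SVD, so natural frequencies and
    damping can be estimated from far less data."""
    U, s, Vh = np.linalg.svd(H, full_matrices=False)
    # Each row is one principal response function across frequency.
    return s[:n_keep, None] * Vh[:n_keep, :], U[:, :n_keep]

def pick_peaks(frf, freqs, rel_threshold=0.3):
    """Crude peak picking on an FRF magnitude to locate candidate natural
    frequencies (a real solver would fit a modal model instead)."""
    mag = np.abs(frf)
    idx = [i for i in range(1, len(mag) - 1)
           if mag[i] > mag[i - 1] and mag[i] > mag[i + 1]
           and mag[i] > rel_threshold * mag.max()]
    return np.asarray(freqs)[idx]
```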
Floating-point system quantization errors in digital control systems
NASA Technical Reports Server (NTRS)
Phillips, C. L.; Vallely, D. P.
1978-01-01
This paper considers digital controllers (filters) operating in floating-point arithmetic in either open-loop or closed-loop systems. A quantization error analysis technique is developed, and is implemented by a digital computer program that is based on a digital simulation of the system. The program can be integrated into existing digital simulations of a system.
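A simulation-based quantization error analysis of the kind described can be illustrated by running the same digital filter twice, once at full precision and once with every intermediate result rounded to a reduced mantissa, and differencing the outputs. The sketch below is a hypothetical illustration of that idea; the filter coefficients, mantissa model, and signal are placeholders, not the paper's method or program.

```python
import numpy as np

def simulate_filter(x, b, a, quantize=None):
    """Direct-form I IIR filter; 'quantize' optionally rounds every
    intermediate result to emulate reduced-precision floating point."""
    q = quantize if quantize is not None else (lambda v: v)
    y = np.zeros(len(x))
    for n in range(len(x)):
        acc = 0.0
        for k in range(len(b)):
            if n - k >= 0:
                acc = q(acc + q(b[k] * x[n - k]))
        for k in range(1, len(a)):
            if n - k >= 0:
                acc = q(acc - q(a[k] * y[n - k]))
        y[n] = acc
    return y

def round_to_mantissa(bits):
    """Round a value to a given number of mantissa bits (crude float model)."""
    def q(v):
        if v == 0.0:
            return 0.0
        scale = 2.0 ** (bits - int(np.floor(np.log2(abs(v)))) - 1)
        return np.round(v * scale) / scale
    return q

# Quantization error = difference between full- and reduced-precision runs.
x = np.random.randn(256)
b, a = [0.2, 0.2], [1.0, -0.6]
err = simulate_filter(x, b, a) - simulate_filter(x, b, a, round_to_mantissa(12))
print("RMS quantization error:", np.sqrt(np.mean(err ** 2)))
```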
[Possibilities of use of digital imaging in forensic medicine].
Gaval'a, P; Ivicsics, I; Mlynár, J; Novomeský, F
2005-07-01
Based on daily practice with digital photography and documentation, the authors point out the benefits of implementing computer technologies in forensic medicine. Modern imaging methods, especially digital photography, offer a wide spectrum of uses in forensic medicine: digital documentation and archiving of autopsy findings, the possibility of immediate consultation of findings with other experts via the Internet, and many others. Another possibility is the creation of a digital photographic atlas of forensic medicine as a useful aid in pre- and postgraduate study. The application of state-of-the-art computer technologies to forensic medicine thus discloses previously unknown possibilities for the further development of this discipline of the human medical sciences.
Digital avionics design and reliability analyzer
NASA Technical Reports Server (NTRS)
1981-01-01
The description and specifications for a digital avionics design and reliability analyzer are given. Its basic function is to provide for the simulation and emulation of the various fault-tolerant digital avionic computer designs that are developed. It has been established that hardware emulation at the gate level will be utilized. The primary benefit of emulation to reliability analysis is that it provides the capability to model a system at a very detailed level. Emulation allows the direct insertion of faults into the system, rather than waiting for actual hardware failures to occur. This allows for controlled and accelerated testing of system reaction to hardware failures. A trade study led to the decision to specify a two-machine system consisting of an emulation computer connected to a general-purpose computer. Potential computers to serve as the emulation computer are also evaluated.
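The report's gate-level fault-insertion idea can be illustrated with a toy combinational netlist in which a chosen net is forced to a stuck-at value and the circuit is re-evaluated. The netlist representation, gate functions, and example circuit below are invented for illustration; they are not the analyzer's actual data structures.

```python
from typing import Callable, Dict, Tuple

Gate = Tuple[Callable[..., int], Tuple[str, ...]]  # (function, input nets)

def evaluate(netlist: Dict[str, Gate], inputs: Dict[str, int],
             stuck_at: Dict[str, int] = {}) -> Dict[str, int]:
    """Evaluate a combinational netlist (listed in topological order),
    forcing any net in 'stuck_at' to its faulty value (stuck-at-0/1)."""
    values = dict(inputs)
    values.update(stuck_at)                    # inject faults directly
    for net, (fn, ins) in netlist.items():
        if net not in stuck_at:                # faulty nets keep forced value
            values[net] = fn(*(values[i] for i in ins))
    return values

# Example: compare fault-free behavior with a stuck-at-0 fault on net n1.
netlist = {
    "n1": (lambda a, b: a & b, ("a", "b")),
    "out": (lambda n1, c: n1 | c, ("n1", "c")),
}
good = evaluate(netlist, {"a": 1, "b": 1, "c": 0})
bad = evaluate(netlist, {"a": 1, "b": 1, "c": 0}, stuck_at={"n1": 0})
print(good["out"], bad["out"])   # 1 vs 0: the fault is observable at the output
```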
Design requirements for ubiquitous computing environments for healthcare professionals.
Bång, Magnus; Larsson, Anders; Eriksson, Henrik
2004-01-01
Ubiquitous computing environments can support clinical administrative routines in new ways. Because the aim of such computing approaches is to enhance routine physical work, it is important to identify specific design requirements. We studied healthcare professionals in an emergency room and developed the computer-augmented environment NOSTOS to support teamwork in that setting. NOSTOS uses digital pens and paper-based media as the primary input interface for data capture and as a means of controlling the system. NOSTOS also includes a digital desk, walk-up displays, and sensor technology that allow the system to track documents and activities in the workplace. We propose a set of requirements and discuss the value of tangible user interfaces for healthcare personnel. Our results suggest that the key requirements are flexibility in terms of system usage and seamless integration between digital and physical components. We also discuss how ubiquitous computing approaches like NOSTOS can be beneficial in the medical workplace.
Visual Form Perception Can Be a Cognitive Correlate of Lower Level Math Categories for Teenagers.
Cui, Jiaxin; Zhang, Yiyun; Cheng, Dazhi; Li, Dawei; Zhou, Xinlin
2017-01-01
Numerous studies have assessed the cognitive correlates of performance in mathematics, but little research has been conducted to systematically examine the relations between visual perception as the starting point of visuospatial processing and typical mathematical performance. In the current study, we recruited 223 seventh graders to perform a visual form perception task (figure matching), numerosity comparison, digit comparison, exact computation, approximate computation, and curriculum-based mathematical achievement tests. Results showed that, after controlling for gender, age, and five general cognitive processes (choice reaction time, visual tracing, mental rotation, spatial working memory, and non-verbal matrices reasoning), visual form perception had unique contributions to numerosity comparison, digit comparison, and exact computation, but had no significant relation with approximate computation or curriculum-based mathematical achievement. These results suggest that visual form perception is an important independent cognitive correlate of lower level math categories, including the approximate number system, digit comparison, and exact computation.
Techniques for digital enhancement of Landsat MSS data using an Apple II+ microcomputer
NASA Technical Reports Server (NTRS)
Harrington, J. A., Jr.; Cartin, K. F.
1984-01-01
The information provided by remotely sensed data collected from orbiting platforms has been useful in many research fields. Digital data stored on computer-compatible tapes (CCTs) are particularly convenient for evaluation. The major advantages of CCTs are the quality of the data and their accessibility to computer manipulation. Minicomputer systems are widely used for the required processing operations. However, microprocessor-related technological advances now make it possible to process CCT data with computing systems available at a much lower price than minicomputer systems. A detailed description is provided of the design considerations of a microcomputer-based Digital Image Analysis System (DIAS). Particular attention is given to the algorithms incorporated for either edge enhancement or smoothing of Landsat multispectral scanner data.
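Edge enhancement and smoothing of this kind are usually small-kernel convolutions over each band's digital numbers. The sketch below shows that generic operation with conventional 3x3 kernels; the function and kernel choices are illustrative assumptions, not the algorithms actually implemented in DIAS.

```python
import numpy as np

def convolve3x3(band, kernel):
    """Apply a 3x3 kernel to a single Landsat MSS band (2-D array of DNs).
    Border pixels are left unmodified for simplicity."""
    out = band.astype(float).copy()
    k = np.asarray(kernel, dtype=float)
    for r in range(1, band.shape[0] - 1):
        for c in range(1, band.shape[1] - 1):
            out[r, c] = np.sum(band[r-1:r+2, c-1:c+2] * k)
    return np.clip(out, 0, 255).astype(np.uint8)

SMOOTH = [[1/9] * 3] * 3                          # low-pass (mean) smoothing
EDGE = [[-1, -1, -1], [-1, 9, -1], [-1, -1, -1]]  # high-pass edge enhancement
```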
A panning DLT procedure for three-dimensional videography.
Yu, B; Koh, T J; Hay, J G
1993-06-01
The direct linear transformation (DLT) method [Abdel-Aziz and Karara, APS Symposium on Photogrammetry. American Society of Photogrammetry, Falls Church, VA (1971)] is widely used in biomechanics to obtain three-dimensional space coordinates from film and video records. This method has some major shortcomings when used to analyze events which take place over large areas. To overcome these shortcomings, a three-dimensional data collection method based on the DLT method, and making use of panning cameras, was developed. Several small single control volumes were combined to construct a large total control volume. For each single control volume, a regression equation (calibration equation) is developed to express each of the 11 DLT parameters as a function of camera orientation, so that the DLT parameters can then be estimated from arbitrary camera orientations. Once the DLT parameters are known for at least two cameras, and the associated two-dimensional film or video coordinates of the event are obtained, the desired three-dimensional space coordinates can be computed. In a laboratory test, five single control volumes (in a total control volume of 24.40 x 2.44 x 2.44 m3) were used to test the effect of the position of the single control volume on the accuracy of the computed three dimensional space coordinates. Linear and quadratic calibration equations were used to test the effect of the order of the equation on the accuracy of the computed three dimensional space coordinates. For four of the five single control volumes tested, the mean resultant errors associated with the use of the linear calibration equation were significantly larger than those associated with the use of the quadratic calibration equation. The position of the single control volume had no significant effect on the mean resultant errors in computed three dimensional coordinates when the quadratic calibration equation was used. Under the same data collection conditions, the mean resultant errors in the computed three dimensional coordinates associated with the panning and stationary DLT methods were 17 and 22 mm, respectively. The major advantages of the panning DLT method lie in the large image sizes obtained and in the ease with which the data can be collected. The method also has potential for use in a wide variety of contexts. The major shortcoming of the method is the large amount of digitizing necessary to calibrate the total control volume. Adaptations of the method to reduce the amount of digitizing required are being explored.
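The reconstruction step common to both the stationary and panning variants is a least-squares solve for the 3-D point from the 11 DLT parameters of two or more cameras. The sketch below shows that standard DLT back-projection; parameter ordering follows the usual convention, and in the panning method the parameters would first be predicted from camera orientation via the calibration (regression) equations described above.

```python
import numpy as np

def reconstruct_point(dlt_params, image_coords):
    """Least-squares 3-D reconstruction from two or more cameras using the
    standard 11-parameter DLT. 'dlt_params' is a list of length-11 arrays
    (one per camera); 'image_coords' the matching (u, v) pairs."""
    A, b = [], []
    for L, (u, v) in zip(dlt_params, image_coords):
        # u = (L1 X + L2 Y + L3 Z + L4) / (L9 X + L10 Y + L11 Z + 1), etc.
        A.append([L[0] - u * L[8], L[1] - u * L[9], L[2] - u * L[10]])
        A.append([L[4] - v * L[8], L[5] - v * L[9], L[6] - v * L[10]])
        b.append(u - L[3])
        b.append(v - L[7])
    xyz, *_ = np.linalg.lstsq(np.asarray(A), np.asarray(b), rcond=None)
    return xyz
```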
NASA Technical Reports Server (NTRS)
Bechtel, R. D.; Mateos, M. A.; Lincoln, K. A.
1988-01-01
Briefly described are the essential features of a computer program designed to interface a personal computer with the fast, digital data acquisition system of a time-of-flight mass spectrometer. The instrumentation was developed to provide a time-resolved analysis of individual vapor pulses produced by the incidence of a pulsed laser beam on an ablative material. The high repetition rate spectrometer coupled to a fast transient recorder captures complete mass spectra every 20 to 35 microsecs, thereby providing the time resolution needed for the study of this sort of transient event. The program enables the computer to record the large amount of data generated by the system in short time intervals, and it provides the operator the immediate option of presenting the spectral data in several different formats. Furthermore, the system does this with a high degree of automation, including the tasks of mass labeling the spectra and logging pertinent instrumental parameters.
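One of the routine post-processing steps mentioned, mass labeling of spectra, rests on the standard time-of-flight relation m/z proportional to (t - t0) squared. The sketch below shows that generic calibration; the constants and two-point calibration routine are illustrative assumptions, not the instrument's actual software.

```python
import numpy as np

def label_masses(flight_times, a, t0):
    """Convert measured flight times to mass-to-charge values using the
    standard TOF relation m/z = a * (t - t0)**2."""
    t = np.asarray(flight_times, dtype=float)
    return a * (t - t0) ** 2

def calibrate(known_times, known_masses):
    """Solve for a and t0 from two reference peaks of known mass,
    using the fact that sqrt(m) is linear in flight time."""
    (t1, t2), (m1, m2) = known_times, known_masses
    sqrt_a = (np.sqrt(m2) - np.sqrt(m1)) / (t2 - t1)
    t0 = t1 - np.sqrt(m1) / sqrt_a
    return sqrt_a ** 2, t0
```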
Reduction of lithologic-log data to numbers for use in the digital computer
Morgan, C.O.; McNellis, J.M.
1971-01-01
The development of a standardized system for conveniently coding lithologic-log data for use in the digital computer has long been needed. The technique suggested involves a reduction of the original written alphanumeric log to a numeric log by use of computer programs. This numeric log can then be retrieved as a written log, interrogated for pertinent information, or analyzed statistically. © 1971 Plenum Publishing Corporation.
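A reduction of written lithologic descriptions to numeric codes can be pictured as a lookup from a standard code table applied to each depth interval. The table and record layout below are invented for illustration and are not the scheme proposed by Morgan and McNellis.

```python
# Hypothetical code table; a real system would use a published standard.
LITHOLOGY_CODES = {
    "shale": 10, "sandstone": 20, "limestone": 30,
    "dolomite": 40, "gravel": 50, "clay": 60,
}

def encode_log(entries):
    """entries: list of (top_depth, bottom_depth, description) tuples.
    Returns (top, bottom, numeric_code) records; unknown lithologies get 0."""
    coded = []
    for top, bottom, description in entries:
        word = (description.lower().replace(",", " ").split()[0]
                if description.strip() else "")
        coded.append((top, bottom, LITHOLOGY_CODES.get(word, 0)))
    return coded

print(encode_log([(0, 12, "clay, silty"), (12, 40, "sandstone, fine-grained")]))
```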
A Computational Model of Selection by Consequences
ERIC Educational Resources Information Center
McDowell, J. J.
2004-01-01
Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of…
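The selection-reproduction-mutation cycle described can be pictured with a toy repertoire of numeric "behaviors" evolved toward a target; the fitness rule, parameters, and target below are invented for illustration and are not McDowell's actual model.

```python
import random

def run_generations(n_behaviors=100, generations=50, mutation_rate=0.05):
    """A toy digital organism: a repertoire of integer 'behaviors' undergoes
    selection (fitter behaviors are likelier parents), reproduction, and
    mutation each generation. Fitness here is closeness to a target value."""
    target = 500
    repertoire = [random.randint(0, 1000) for _ in range(n_behaviors)]
    for _ in range(generations):
        weights = [1.0 / (1 + abs(b - target)) for b in repertoire]
        parents = random.choices(repertoire, weights=weights, k=n_behaviors)
        repertoire = [
            p + random.randint(-20, 20) if random.random() < mutation_rate else p
            for p in parents
        ]
    return repertoire

print(sum(run_generations()) / 100)   # mean behavior drifts toward the target
```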
An Undergraduate Experiment in Alarm System Design.
ERIC Educational Resources Information Center
Martini, R. A.; And Others
1988-01-01
Describes an experiment involving data acquisition by a computer, digital signal transmission from the computer to a digital logic circuit and signal interpretation by this circuit. The system is being used at the Illinois Institute of Technology. Discusses the fundamental concepts involved. Demonstrates the alarm experiment as it is used in…
NASA Technical Reports Server (NTRS)
Simon, M. K.
1980-01-01
A technique is presented for generating phase plane plots on a digital computer which circumvents the difficulties associated with more traditional methods of numerically solving nonlinear differential equations. In particular, the nonlinear differential equation of operation is formulated.
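The abstract does not give the equation or the plotting scheme, so the following is a generic sketch of producing a phase plane plot by numerically integrating a two-state nonlinear system from several initial conditions; the damped pendulum-like example equation is a stand-in, not the paper's loop equation.

```python
import numpy as np
import matplotlib.pyplot as plt

def phase_plane(f, initial_conditions, t_end=20.0, dt=0.01):
    """Integrate x' = f(x) (a 2-state nonlinear system) with fixed-step RK4
    from several initial conditions and plot the trajectories."""
    for x0 in initial_conditions:
        x = np.array(x0, dtype=float)
        traj = [x.copy()]
        for _ in range(int(t_end / dt)):
            k1 = f(x)
            k2 = f(x + 0.5 * dt * k1)
            k3 = f(x + 0.5 * dt * k2)
            k4 = f(x + dt * k3)
            x = x + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6
            traj.append(x.copy())
        traj = np.array(traj)
        plt.plot(traj[:, 0], traj[:, 1])
    plt.xlabel("phase"); plt.ylabel("phase rate"); plt.show()

# Placeholder dynamics: damped pendulum-like second-order system.
f = lambda x: np.array([x[1], -0.3 * x[1] - np.sin(x[0])])
phase_plane(f, [(-3, 0), (0, 2), (2, -1)])
```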
An Undergraduate Electrical Engineering Course on Computer Organization.
ERIC Educational Resources Information Center
Commission on Engineering Education, Washington, DC.
Outlined is an undergraduate electrical engineering course on computer organization designed to meet the need for electrical engineers familiar with digital system design. The program includes both hardware and software aspects of digital systems essential to design function and correlates design and organizational aspects of the subject. The…
Definition and trade-off study of reconfigurable airborne digital computer system organizations
NASA Technical Reports Server (NTRS)
Conn, R. B.
1974-01-01
A highly reliable, fault-tolerant, reconfigurable computer system for aircraft applications was developed. The development and application of reliability and fault-tolerance assessment techniques are described. Particular emphasis is placed on the needs of an all-digital, fly-by-wire control system appropriate for a passenger-carrying airplane.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-02
... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Requirement Specifications for Digital Computer Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory Commission... issuing a revised regulatory guide (RG), revision 1 of RG 1.172, ``Software Requirement Specifications for...
APQ-102 imaging radar digital image quality study
NASA Technical Reports Server (NTRS)
Griffin, C. R.; Estes, J. M.
1982-01-01
A modified APQ-102 sidelooking radar collected synthetic aperture radar (SAR) data which was digitized and recorded on wideband magnetic tape. These tapes were then ground processed into computer compatible tapes (CCT's). The CCT's may then be processed into high resolution radar images by software on the CYBER computer.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, David Edward
A description of the development of the mc_runjob software package used to manage large-scale computing tasks for the D0 Experiment at Fermilab is presented, along with a review of the Digital Front End Trigger electronics and the software used to control them. A tracking study performed on detector data demonstrates that the D0 Experiment can detect charged B mesons and that the results are consistent with current measurements. B mesons are found by searching for the decay channel B± → J/ψ K±.
Gaydos, Leonard
1978-01-01
The cost of classifying 5,607 square kilometers (2,165 sq. mi.) in the Portland area was less than 8 cents per square kilometer ($0.0788, or $0.2041 per square mile). Besides saving in costs, this and other signature extension techniques may be useful in completing land use and land cover mapping in other large areas where multispectral and multitemporal Landsat data are available in digital form but other source materials are generally lacking.
Theoretical and experimental studies in support of the geophysical fluid flow experiment
NASA Technical Reports Server (NTRS)
Hart, J.; Toomre, J.; Gilman, P.
1984-01-01
Computer programming was completed for digital acquisition of temperature and velocity data generated by the Geophysical Fluid Flow Cell (GFFC) during the upcoming Spacelab 3 mission. A set of scenarios was developed which covers basic electro-hydrodynamic instability, highly supercritical convection with isothermal boundaries, convection with imposed thermal forcing, and some stably stratified runs to look at large-scale thermohaline ocean circulations. The extent to which the GFFC experimental results apply to more complicated circumstances within the Sun or giant planets was assessed.
1976-08-01
extensive areas of good agreement with measured loadings where the prediction is based on acoustic theory. Acoustic theory as applied to thin airfoils... Acoustic theory has been demonstrated by references 12 through 18 to provide fairly good agreement with measured airloads due to blast and shock... is found to rise to large values near the leading edge. Higher observed values of Ac further rearward of the leading edge are found to compel
Surface features of central North America: a synoptic view from computer graphics
Pike, R.J.
1991-01-01
A digital shaded-relief image of the 48 contiguous United States shows the details of large- and small-scale landforms, including several linear trends. The features faithfully reflect tectonism, continental glaciation, fluvial activity, volcanism, and other surface-shaping events and processes. The new map not only depicts topography accurately and in its true complexity, but does so in one synoptic view that provides a regional context for geologic analysis unobscured by clouds, culture, vegetation, or artistic constraints. -Author
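A shaded-relief image of this kind is typically rendered from a digital elevation model with the standard Lambertian hillshade formula. The sketch below shows that conventional computation; the illumination defaults and the treatment of the row axis as north-south are assumptions for illustration, not the parameters of the published map.

```python
import numpy as np

def hillshade(dem, cellsize=1.0, azimuth_deg=315.0, altitude_deg=45.0):
    """Compute a shaded-relief (hillshade) image from a DEM using the
    standard slope/aspect illumination formula. Rows are treated as the
    y axis; sign conventions may need adjusting for a given dataset."""
    az = np.radians(360.0 - azimuth_deg + 90.0)
    alt = np.radians(altitude_deg)
    dzdy, dzdx = np.gradient(dem, cellsize)          # gradients along rows, cols
    slope = np.arctan(np.hypot(dzdx, dzdy))
    aspect = np.arctan2(dzdy, -dzdx)
    shaded = (np.sin(alt) * np.cos(slope)
              + np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(255 * shaded, 0, 255).astype(np.uint8)
```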
Development of a computerized atlas of neonatal surgery
NASA Astrophysics Data System (ADS)
Gill, Brijesh S.; Hardin, William D., Jr.
1995-05-01
Digital imaging is an evolving technology with significant potential for enhancing medical education and practice. Current teaching methodologies still rely on the time-honored traditions of group lectures, small group discussions, and clinical preceptorships. Educational content and value are variable. Utilization of electronic media is in its infancy but offers significant potential for enhancing if not replacing current teaching methodologies. This report details our experience with the creation of an interactive atlas on neonatal surgical conditions. The photographic atlas has been one of the classic tools of practice, reference, and especially of education in surgery. The major limitations on current atlases all stem from the fact that they are produced in book form. The limiting factors in the inclusion of large numbers of images in these volumes include the desire to limit the physical size of the book and the costs associated with high quality color reproduction of print images. The structure of the atlases usually makes them reference tools, rather than teaching tools. We have digitized a large number of clinical images dealing with the diagnosis and surgical management of all of the most common neonatal surgical conditions. The flexibility of the computer presentation environment allows the images to be organized in a number of different ways. In addition to a standard captioned atlas, the user may choose to review case histories of several of the more common conditions in neonates, complete with presenting conditions, imaging studies, surgery and pathology. Use of the computer offers the ability to choose multiple views of the images, including comparison views and transparent overlays that point out important anatomical and histopathological structures, and the ability to perform user self-tests. This atlas thus takes advantage of several aspects of data management unique to computerized digital imaging, particularly the ability to combine all aspects of medical imaging related to a single case for easy retrieval. This facet unique to digital imaging makes it the obvious choice for new methods of teaching such complex subjects as the clinical management of neonatal surgical conditions. We anticipate that many more subjects in the surgical, pathologic, and radiologic realms will eventually be presented in a similar manner.