NASA Technical Reports Server (NTRS)
Pototzky, Anthony; Wieseman, Carol; Hoadley, Sherwood Tiffany; Mukhopadhyay, Vivek
1991-01-01
Described here is the development and implementation of an on-line, near-real-time controller performance evaluation (CPE) capability. Briefly discussed are the structure of the data flow, the signal-processing methods used to process the data, and the software developed to generate the transfer functions. This methodology is generic in nature and can be used in any type of multi-input/multi-output (MIMO) digital controller application, including digital flight control systems, digitally controlled spacecraft structures, and actively controlled wind-tunnel models. Results of applying the CPE methodology to evaluate, in near real time, MIMO digital flutter suppression systems being tested on the Rockwell Active Flexible Wing (AFW) wind-tunnel model are presented to demonstrate the CPE capability.
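To make the transfer-function step concrete, the following is a minimal sketch of a single-input/single-output slice of such an estimate: the H1 frequency-response estimate H(f) = Sxy(f)/Sxx(f), computed from an injected excitation and a measured response with Welch-averaged spectra. The sample rate, the stand-in plant, and the signal names are illustrative assumptions, not details of the AFW experiment.

```python
# Hedged sketch: one entry of a MIMO transfer matrix estimated from
# excitation/response records, in the spirit of a CPE transfer function.
import numpy as np
from scipy import signal

fs = 200.0                                   # sample rate (Hz), assumed
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)
u = rng.standard_normal(t.size)              # excitation at one controller input

# Stand-in "plant": a lightly damped second-order resonance, discretised
b, a = signal.bilinear([1.0], [1.0, 2 * 0.05 * 30.0, 30.0 ** 2], fs)
y = signal.lfilter(b, a, u) + 0.01 * rng.standard_normal(t.size)

# H1 estimator: H(f) = Sxy(f) / Sxx(f) from Welch-averaged spectra
f, Sxy = signal.csd(u, y, fs=fs, nperseg=1024)
_, Sxx = signal.welch(u, fs=fs, nperseg=1024)
H = Sxy / Sxx                                # complex frequency response
```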
Digital Methodology to implement the ECOUTER engagement process.
Wilson, Rebecca C; Butters, Oliver W; Clark, Tom; Minion, Joel; Turner, Andrew; Murtagh, Madeleine J
2016-01-01
ECOUTER (Employing COnceptUal schema for policy and Translation Engagement in Research; from the French écouter, 'to listen') is a new stakeholder engagement method incorporating existing evidence to help participants draw upon their own knowledge of cognate issues and interact on a topic of shared concern. The results of an ECOUTER can form the basis of recommendations for research, governance, practice and/or policy. This paper describes the development of a digital methodology for the ECOUTER engagement process based on currently available mind-mapping freeware. The implementation of an ECOUTER process tailored to applications within health studies is outlined for both online and face-to-face scenarios. Limitations of the present digital methodology are discussed, highlighting the requirement for purpose-built software for ECOUTER research purposes.
NASA Astrophysics Data System (ADS)
Swartz, Charles S.
2003-05-01
The process of distributing and exhibiting a motion picture has changed little since the Lumière brothers presented the first motion picture to an audience in 1895. While this analog photochemical process is capable of producing screen images of great beauty and expressive power, more often the consumer experience is diminished by third generation prints and by the wear and tear of the mechanical process. Furthermore, the film industry globally spends approximately $1B annually manufacturing and shipping prints. Alternatively, distributing digital files would theoretically yield great benefits in terms of image clarity and quality, lower cost, greater security, and more flexibility in the cinema (e.g., multiple language versions). In order to understand the components of the digital cinema chain and evaluate the proposed technical solutions, the Entertainment Technology Center at USC in 2000 established the Digital Cinema Laboratory as a critical viewing environment, with the highest quality film and digital projection equipment. The presentation describes the infrastructure of the Lab, test materials, and testing methodologies developed for compression evaluation, and lessons learned up to the present. In addition to compression, the Digital Cinema Laboratory plans to evaluate other components of the digital cinema process as well.
Goede, Patricia A.; Lauman, Jason R.; Cochella, Christopher; Katzman, Gregory L.; Morton, David A.; Albertine, Kurt H.
2004-01-01
Use of digital medical images has become common over the last several years, coincident with the release of inexpensive, mega-pixel quality digital cameras and the transition to digital radiology operation by hospitals. One problem that clinicians, medical educators, and basic scientists encounter when handling images is the difficulty of using business and graphic arts commercial-off-the-shelf (COTS) software in multicontext authoring and interactive teaching environments. The authors investigated and developed software-supported methodologies to help clinicians, medical educators, and basic scientists become more efficient and effective in their digital imaging environments. The software that the authors developed provides the ability to annotate images based on a multispecialty methodology for annotation and visual knowledge representation. This annotation methodology is designed by consensus, with contributions from the authors and physicians, medical educators, and basic scientists in the Departments of Radiology, Neurobiology and Anatomy, Dermatology, and Ophthalmology at the University of Utah. The annotation methodology functions as a foundation for creating, using, reusing, and extending dynamic annotations in a context-appropriate, interactive digital environment. The annotation methodology supports the authoring process as well as output and presentation mechanisms. The annotation methodology is the foundation for a Windows implementation that allows annotated elements to be represented as structured eXtensible Markup Language and stored separate from the image(s). PMID:14527971
Digital redesign of anti-wind-up controller for cascaded analog system.
Chen, Y S; Tsai, J S H; Shieh, L S; Moussighi, M M
2003-01-01
The cascaded conventional anti-wind-up (CAW) design method for integral controllers is discussed. The prediction-based digital redesign methodology is then used to find a new pulse-amplitude-modulated (PAM) digital controller for effective digital control of an analog plant with an input saturation constraint. The desired digital controller is determined from an existing or pre-designed CAW analog controller. The proposed method provides a novel methodology for the indirect digital design of a continuous-time unity output-feedback system with a cascaded analog controller, as in the case of PID controllers for industrial control processes in the presence of actuator saturation. It enables an existing or pre-designed cascaded CAW analog controller to be implemented effectively via a digital controller.
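For reference, the sketch below shows the classical back-calculation form of anti-windup in a digital PI loop with actuator saturation. This illustrates the general CAW idea only; it is not the paper's prediction-based redesign, and all gains and limits are invented for the example.

```python
# Hedged sketch: digital PI control with back-calculation anti-windup.
import numpy as np

def step_pi_aw(e, integ, kp=2.0, ki=1.0, kaw=1.0, ts=0.01, u_min=-1.0, u_max=1.0):
    """Advance one sample; `integ` is the integrator state, `e` the error."""
    u_unsat = kp * e + integ
    u = np.clip(u_unsat, u_min, u_max)       # actuator saturation
    # Back-calculation: bleed the integrator by the saturation excess so it
    # does not wind up while the actuator is clipped.
    integ += ts * (ki * e + kaw * (u - u_unsat))
    return u, integ
```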
NASA Technical Reports Server (NTRS)
Fegley, K. A.; Hayden, J. H.; Rehmann, D. W.
1974-01-01
The feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems is investigated. It is shown that a digital computer may be used in an interactive mode for the design, modeling, analysis, and comparison of power processing systems.
Digital Literacy: Tools and Methodologies for Information Society
ERIC Educational Resources Information Center
Rivoltella, Pier Cesare, Ed.
2008-01-01
Currently in a state of cultural transition, global society is moving from a literary society to a digital one, adopting widespread use of advanced technologies such as the Internet and mobile devices. Digital media have an extraordinary impact on society's formative processes, forcing a pragmatic shift in their management and organization. This…
One Controller at a Time (1-CAT): A mimo design methodology
NASA Technical Reports Server (NTRS)
Mitchell, J. R.; Lucas, J. C.
1987-01-01
The One Controller at a Time (1-CAT) methodology for designing digital controllers for Large Space Structures (LSS's) is introduced and illustrated. The flexible mode problem is first discussed. Next, desirable features of a LSS control system design methodology are delineated. The 1-CAT approach is presented, along with an analytical technique for carrying out the 1-CAT process. Next, 1-CAT is used to design digital controllers for the proposed Space Based Laser (SBL). Finally, the SBL design is evaluated for dynamical performance, noise rejection, and robustness.
Nuclear power plant digital system PRA pilot study with the dynamic flow-graph methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yau, M.; Motamed, M.; Guarro, S.
2006-07-01
Current Probabilistic Risk Assessment (PRA) methodology is well established in analyzing hardware and some of the key human interactions. However, processes for analyzing the software functions of digital systems within a plant PRA framework, and accounting for the digital system contribution to the overall risk, are not generally available, nor are they well understood and established. A recent study reviewed a number of methodologies that have potential applicability to modeling and analyzing digital systems within a PRA framework. This study identified the Dynamic Flow-graph Methodology (DFM) and the Markov Methodology as the most promising tools. As a result of this study, a task was defined under the framework of a collaborative agreement between the U.S. Nuclear Regulatory Commission (NRC) and the Ohio State Univ. (OSU). The objective of this task is to set up benchmark systems representative of digital systems used in nuclear power plants and to evaluate DFM and the Markov methodology with these benchmark systems. The first benchmark system is a typical Pressurized Water Reactor (PWR) Steam Generator (SG) Feedwater System (FWS) level control system based on earlier ASCA work with the U.S. NRC, upgraded with modern control laws. ASCA, Inc. is currently under contract to OSU to apply DFM to this benchmark system. The goal is to investigate the feasibility of using DFM to analyze and quantify digital system risk, and to integrate the DFM analytical results back into the plant event tree/fault tree PRA model.
Applications of Landsat data and the data base approach
Lauer, D.T.
1986-01-01
A generalized methodology for applying digital Landsat data to resource inventory and assessment tasks is currently being used by several bureaux and agencies within the US Department of the Interior. The methodology includes definition of project objectives and output, identification of source materials, construction of the digital data base, performance of computer-assisted analyses, and generation of output. The USGS, Bureau of Land Management, US Fish and Wildlife Service, Bureau of Indian Affairs, Bureau of Reclamation, and National Park Service have used this generalized methodology to assemble comprehensive digital data bases for resource management. Advanced information processing techniques have been applied to these data bases for making regional environmental surveys on millions of acres of public lands at costs ranging from $0.01 to $0.08 an acre.
Floating-to-Fixed-Point Conversion for Digital Signal Processors
NASA Astrophysics Data System (ADS)
Menard, Daniel; Chillet, Daniel; Sentieys, Olivier
2006-12-01
Digital signal processing applications are specified with floating-point data types but they are usually implemented in embedded systems with fixed-point arithmetic to minimise cost and power consumption. Thus, methodologies which automatically establish the fixed-point specification are required to reduce the application time-to-market. In this paper, a new methodology for floating-to-fixed-point conversion is proposed for software implementations. The aim of our approach is to determine the fixed-point specification which minimises the code execution time for a given accuracy constraint. Compared to previous methodologies, our approach takes into account the DSP architecture to optimise the fixed-point formats, and the floating-to-fixed-point conversion process is coupled with the code generation process. The fixed-point data types and the position of the scaling operations are optimised to reduce the code execution time. To evaluate the fixed-point computation accuracy, an analytical approach is used to reduce the optimisation time compared to the existing methods based on simulation. The methodology stages are described and several experimental results are presented to underline the efficiency of this approach.
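A minimal sketch of the elementary operation underlying such a conversion flow is given below: quantising floating-point data to a two's-complement Qm.n fixed-point format with rounding and saturation. The word lengths are illustrative, not formats produced by the methodology.

```python
# Hedged sketch: float -> Qm.n fixed-point quantisation with saturation.
import numpy as np

def to_fixed(x, int_bits=3, frac_bits=12):
    """Round x to Q(int_bits).(frac_bits); saturate to the representable range."""
    scale = 2.0 ** frac_bits
    lo = -(2.0 ** int_bits)                  # most negative code (sign bit)
    hi = (2.0 ** int_bits) - 1.0 / scale     # most positive code
    q = np.round(np.asarray(x) * scale) / scale
    return np.clip(q, lo, hi)

x = np.sin(np.linspace(0, 2 * np.pi, 8))
err = x - to_fixed(x)                        # |err| <= 2**-(frac_bits + 1)
```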
ISSUES IN DIGITAL IMAGE PROCESSING OF AERIAL PHOTOGRAPHY FOR MAPPING SUBMERSED AQUATIC VEGETATION
The paper discusses the numerous issues that needed to be addressed when developing a methodology for mapping Submersed Aquatic Vegetation (SAV) from digital aerial photography. Specifically, we discuss 1) choice of film; 2) consideration of tide and weather constraints; 3) in-s...
Digital storytelling as a method in health research: a systematic review protocol.
Rieger, Kendra L; West, Christina H; Kenny, Amanda; Chooniedass, Rishma; Demczuk, Lisa; Mitchell, Kim M; Chateau, Joanne; Scott, Shannon D
2018-03-05
Digital storytelling is an arts-based research method with potential to elucidate complex narratives in a compelling manner, increase participant engagement, and enhance the meaning of research findings. This method involves the creation of a 3- to 5-min video that integrates multimedia materials including photos, participant voices, drawings, and music. Given the significant potential of digital storytelling to meaningfully capture and share participants' lived experiences, a systematic review of its use in healthcare research is crucial to develop an in-depth understanding of how researchers have used this method, with an aim to refine and further inform future iterations of its use. We aim to identify and synthesize evidence on the use, impact, and ethical considerations of using digital storytelling in health research. The review questions are as follows: (1) What is known about the purpose, definition, use (processes), and contexts of digital storytelling as part of the research process in health research? (2) What impact does digital storytelling have upon the research process, knowledge development, and healthcare practice? (3) What are the key ethical considerations when using digital storytelling within qualitative, quantitative, and mixed method research studies? Key databases and the grey literature will be searched from 1990 to the present for qualitative, quantitative, and mixed methods studies that utilized digital storytelling as part of the research process. Two independent reviewers will screen and critically appraise relevant articles with established quality appraisal tools. We will extract narrative data from all studies with a standardized data extraction form and conduct a thematic analysis of the data. To facilitate innovative dissemination through social media, we will develop a visual infographic and three digital stories to illustrate the review findings, as well as methodological and ethical implications. In collaboration with national and international experts in digital storytelling, we will synthesize key evidence about digital storytelling that is critical to the development of methodological and ethical expertise about arts-based research methods. We will also develop recommendations for incorporating digital storytelling in a meaningful and ethical manner into the research process. PROSPERO registry number CRD42017068002.
An engineering methodology for implementing and testing VLSI (Very Large Scale Integrated) circuits
NASA Astrophysics Data System (ADS)
Corliss, Walter F., II
1989-03-01
The engineering methodology for producing a fully tested VLSI chip from a design layout is presented. A 16-bit correlator, NPS CORN88, that was previously designed was used as a vehicle to demonstrate this methodology. The study of the design and simulation tools, MAGIC and MOSSIM II, was the focus of the design and validation process. The design was then implemented and the chip was fabricated by MOSIS. This fabricated chip was then used to develop a testing methodology for using the digital test facilities at NPS. NPS CORN88 was the first full-custom VLSI chip designed at NPS to be tested with the NPS digital analysis system, the Tektronix DAS 9100 series tester. The capabilities and limitations of these test facilities are examined. NPS CORN88 test results are included to demonstrate the capabilities of the digital test system. A translator, MOS2DAS, was developed to convert the MOSSIM II simulation program to the input files required by the DAS 9100 device verification software, 91DVS. Finally, a tutorial for using the digital test facilities, including the DAS 9100 and associated support equipment, is included as an appendix.
Institutionalizing human-computer interaction for global health
Gulliksen, Jan
2017-01-01
Digitalization is the societal change process in which new ICT-based solutions bring forward completely new ways of doing things, new businesses and new movements in the society. Digitalization also provides completely new ways of addressing issues related to global health. This paper provides an overview of the field of human-computer interaction (HCI) and the ways in which the field has contributed to international development in different regions of the world. Additionally, it outlines the United Nations' new sustainability goals from December 2015 and what these could contribute to the development of global health and its relationship to digitalization. Finally, it argues why and how HCI could be adopted and adapted to fit contextual needs, the need for localization and the development of new digital innovations. The research methodology is mostly qualitative, following an action research paradigm in which the actual change process that the digitalization is evoking is as important as the scientific conclusions that can be drawn. In conclusion, the paper argues that digitalization is fundamentally changing society through the development and use of digital technologies and may have a profound effect on the digital development of every country in the world, but it needs to be developed based on local practices, it needs international support, and it should not be limited by any technological constraints. Digitalization to support global health in particular requires a profound understanding of the users and their context, arguing for user-centred systems design methodologies as particularly suitable. PMID:28838309
Design methodology: edgeless 3D ASICs with complex in-pixel processing for pixel detectors
NASA Astrophysics Data System (ADS)
Fahim, Farah; Deptuch, Grzegorz W.; Hoff, James R.; Mohseni, Hooman
2015-08-01
The design methodology for the development of 3D integrated edgeless pixel detectors with in-pixel processing using Electronic Design Automation (EDA) tools is presented. A large area 3 tier 3D detector with one sensor layer and two ASIC layers containing one analog and one digital tier, is built for x-ray photon time of arrival measurement and imaging. A full custom analog pixel is 65μm x 65μm. It is connected to a sensor pixel of the same size on one side, and on the other side it has approximately 40 connections to the digital pixel. A 32 x 32 edgeless array without any peripheral functional blocks constitutes a sub-chip. The sub-chip is an indivisible unit, which is further arranged in a 6 x 6 array to create the entire 1.248cm x 1.248cm ASIC. Each chip has 720 bump-bond I/O connections, on the back of the digital tier to the ceramic PCB. All the analog tier power and biasing is conveyed through the digital tier from the PCB. The assembly has no peripheral functional blocks, and hence the active area extends to the edge of the detector. This was achieved by using a few flavors of almost identical analog pixels (minimal variation in layout) to allow for peripheral biasing blocks to be placed within pixels. The 1024 pixels within a digital sub-chip array have a variety of full custom, semi-custom and automated timing driven functional blocks placed together. The methodology uses a modified mixed-mode on-top digital implementation flow to not only harness the tool efficiency for timing and floor-planning but also to maintain designer control over compact parasitically aware layout. The methodology uses the Cadence design platform, however it is not limited to this tool.
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Miko, Joseph; Bradley, Damon; Heinzen, Katherine
2008-01-01
NASA Hubble Space Telescope (HST) and upcoming cosmology science missions carry instruments with multiple focal planes populated with many large sensor detector arrays. These sensors are passively cooled to low temperatures for low-level light (L3) and near-infrared (NIR) signal detection, and the sensor readout electronics circuitry must perform at extremely low noise levels to enable new required science measurements. Because we are at the technological edge of enhanced performance for sensors and readout electronics circuitry, as determined by the thermal noise level at a given temperature in the analog domain, we must find new ways of further compensating for the noise in the digital domain. To facilitate this new approach, state-of-the-art sensors are augmented at their array hardware boundaries by non-illuminated reference pixels, which can be used to reduce noise attributed to sensors. There are a few proposed methodologies for processing in the digital domain the information carried by reference pixels, as employed by the Hubble Space Telescope and the James Webb Space Telescope Projects. These methods involve using spatial and temporal statistical parameters derived from boundary reference pixel information to enhance the active (non-reference) pixel signals. To make a step beyond this heritage methodology, we apply the NASA-developed technology known as the Hilbert-Huang Transform Data Processing System (HHT-DPS) for reference pixel information processing and its utilization in reconfigurable hardware on-board a spaceflight instrument or in post-processing on the ground. The methodology examines signal processing for a 2-D domain, in which high-variance components of the thermal noise are carried by both active and reference pixels, similar to that in processing of low-voltage differential signals and subtraction of a single analog reference pixel from all active pixels on the sensor. Heritage methods using the aforementioned statistical parameters in the digital domain (such as statistical averaging of the reference pixels themselves) zero out the high-variance components, and the counterpart components in the active pixels remain uncorrected. This paper describes how the new methodology was demonstrated through analysis of fast-varying noise components using the Hilbert-Huang Transform Data Processing System tool (HHT-DPS) developed at NASA and the high-level programming language MATLAB (Trademark of MathWorks Inc.), as well as alternative methods for correcting for the high-variance noise component, using HgCdTe sensor data. The NASA Hubble Space Telescope data post-processing, as well as future deep-space cosmology projects' on-board instrument data processing from all the sensor channels, would benefit from this effort.
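For contrast with the HHT-based approach, here is a minimal sketch of the heritage correction described above: per-row statistical averaging of the boundary reference pixels, subtracted from every pixel in the row. The array geometry and reference-column count are assumptions for illustration.

```python
# Hedged sketch: heritage per-row reference-pixel correction.
import numpy as np

def ref_pixel_correct(frame, n_ref=4):
    """Subtract, row by row, the mean of the left/right reference columns."""
    ref = np.concatenate([frame[:, :n_ref], frame[:, -n_ref:]], axis=1)
    # Averaging the reference pixels zeroes out their fast-varying (high-
    # variance) components, which is exactly the limitation noted above.
    row_bias = ref.mean(axis=1, keepdims=True)
    return frame - row_bias
```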
ERIC Educational Resources Information Center
Dalbello, Marija
2005-01-01
The activities surrounding the National Digital Library Program (NDLP) at the Library of Congress (1995-2000) are used to study institutional processes associated with technological innovation in the library context. The study identified modalities of successful innovation and the characteristics of creative decision making. Theories of social…
ERIC Educational Resources Information Center
Quintans, C.; Colmenar, A.; Castro, M.; Moure, M. J.; Mandado, E.
2010-01-01
ADCs (analog-to-digital converters), especially Pipeline and Sigma-Delta converters, are designed using complex architectures in order to increase their sampling rate and/or resolution. Consequently, the learning of ADC devices also encompasses complex concepts such as multistage synchronization, latency, oversampling, modulation, noise shaping,…
Teachers' Pedagogical Reasoning and Reframing of Practice in Digital Contexts
ERIC Educational Resources Information Center
Holmberg, Jörgen; Fransson, Göran; Fors, Uno
2018-01-01
Purpose: The purpose of this paper is to advance the understanding of teachers' reframing of practice in digital contexts by analysing teachers' pedagogical reasoning processes as they explore ways of using information and communication technologies (ICT) to create added pedagogical value. Design/methodology/approach: A design-based research (DBR)…
NASA Astrophysics Data System (ADS)
Chang, S. S. L.
State of the art technology in circuits, fields, and electronics is discussed. The principles and applications of these technologies to industry, digital processing, microwave semiconductors, and computer-aided design are explained. Important concepts and methodologies in mathematics and physics are reviewed, and basic engineering sciences and associated design methods are dealt with, including: circuit theory and the design of magnetic circuits and active filter synthesis; digital signal processing, including FIR and IIR digital filter design; transmission lines, electromagnetic wave propagation and surface acoustic wave devices. Also considered are: electronics technologies, including power electronics, microwave semiconductors, GaAs devices, and magnetic bubble memories; digital circuits and logic design.
Digital Skills Acquisition: Future Trends among Older Adults
ERIC Educational Resources Information Center
Gilliam, Brian K.
2011-01-01
Purpose: The purpose of this study was to identify future trends and barriers that will either facilitate or impede the narrowing of the digital skills divide among older adults during the next 10 years. Methodology: To address the research questions, this study used a modified version of the Delphi process using a panel of experts who…
NASA Astrophysics Data System (ADS)
Hayes, J.; Fai, S.; Kretz, S.; Ouimet, C.; White, P.
2015-08-01
The emerging field of digital fabrication is a process where three-dimensional datasets can be directly transferred to fabrication equipment to create models or even 1:1 building elements. In this paper, we will discuss the results of a collaboration between the Carleton Immersive Media Studio (CIMS), the Dominion Sculptor of Canada, and the Heritage Conservation Directorate (HCD) of Public Works and Government Services Canada (PWGSC), that utilizes digital fabrication technologies in the development of a digitally-assisted stone carving process. The collaboration couples the distinguished skill of the Dominion Sculptor with the latest digital acquisition and digital fabrication technologies for the reconstruction of a deteriorated stone bas-relief on the façade of the East Block building of the Parliament Buildings National Historic Site of Canada. The intention of the research is to establish a workflow of hybrid digital/analogue methodologies from acquisition through rehabilitation and ultimately to the fabrication of stone elements.
Fast and Accurate Cell Tracking by a Novel Optical-Digital Hybrid Method
NASA Astrophysics Data System (ADS)
Torres-Cisneros, M.; Aviña-Cervantes, J. G.; Pérez-Careta, E.; Ambriz-Colín, F.; Tinoco, Verónica; Ibarra-Manzano, O. G.; Plascencia-Mora, H.; Aguilera-Gómez, E.; Ibarra-Manzano, M. A.; Guzman-Cabrera, R.; Debeir, Olivier; Sánchez-Mondragón, J. J.
2013-09-01
An innovative methodology to detect and track cells in microscope images enhanced by optical cross-correlation techniques is proposed in this paper. In order to increase the tracking sensitivity, image pre-processing has been implemented as a morphological operator on the microscope image. Results show that the pre-processing allows additional frames of cell tracking, thereby increasing robustness. The proposed methodology can be used to analyze different problems such as mitosis, cell collisions, and cell overlapping, ultimately aimed at identifying and treating illnesses and malignancies.
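A minimal sketch of a digital counterpart of this scheme follows, assuming a morphological opening as the pre-processing operator and normalised cross-correlation (template matching) as the detection step; parameters and function choices are illustrative, not the authors' implementation.

```python
# Hedged sketch: pre-process a frame, then locate a cell template by
# normalised cross-correlation.
import numpy as np
from scipy import ndimage
from skimage.feature import match_template

def track_cell(frame, template):
    """Return the (row, col) of the best template match in the frame."""
    cleaned = ndimage.grey_opening(frame, size=(3, 3))   # morphological pre-processing
    score = match_template(cleaned, template, pad_input=True)
    return np.unravel_index(np.argmax(score), score.shape)
```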
Stability and performance analysis of a jump linear control system subject to digital upsets
NASA Astrophysics Data System (ADS)
Wang, Rui; Sun, Hui; Ma, Zhen-Yang
2015-04-01
This paper focuses on the methodology for analyzing the stability and the corresponding tracking performance of a closed-loop digital jump linear control system with a stochastic switching signal. The method is applied to a flight control system. A distributed recoverable platform is implemented on the flight control system and subjected to independent digital upsets. The upset processes are used to simulate electromagnetic environments. Specifically, the paper presents scenarios in which the upset process is directly injected into the distributed flight control system, modeled by independent Markov upset processes and independent and identically distributed (IID) processes. A theoretical performance analysis and simulation modelling are both presented in detail for a more complete independent digital upset injection. Specific examples are proposed to verify the methodology of tracking performance analysis, and general analyses for different configurations are also proposed. Comparisons among different configurations are conducted to demonstrate the availability and the characteristics of the design. Project supported by the Young Scientists Fund of the National Natural Science Foundation of China (Grant No. 61403395), the Natural Science Foundation of Tianjin, China (Grant No. 13JCYBJC39000), the Scientific Research Foundation for the Returned Overseas Chinese Scholars, State Education Ministry, China, the Tianjin Key Laboratory of Civil Aircraft Airworthiness and Maintenance in Civil Aviation of China (Grant No. 104003020106), and the Fund for Scholars of Civil Aviation University of China (Grant No. 2012QD21x).
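To give a feel for the kind of model analysed, the sketch below simulates a scalar jump linear tracking loop whose closed-loop dynamics switch under a two-state Markov upset process; the transition probabilities and mode dynamics are invented for illustration, not taken from the paper.

```python
# Hedged sketch: tracking error of a scalar jump linear system driven by a
# two-state Markov upset process (state 0 = nominal, state 1 = upset).
import numpy as np

rng = np.random.default_rng(1)
P = np.array([[0.99, 0.01],                  # nominal -> {nominal, upset}
              [0.20, 0.80]])                 # upset   -> {nominal, upset}
a = {0: 0.95, 1: 1.02}                       # mode-dependent closed-loop pole
x, mode, ref = 0.0, 0, 1.0
err = []
for _ in range(2000):
    mode = rng.choice(2, p=P[mode])          # Markov switching signal
    x = a[mode] * x + (1 - a[mode]) * ref    # track the reference in each mode
    err.append(ref - x)
rms_tracking_error = float(np.sqrt(np.mean(np.square(err))))
```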
Use of digital technologies for nasal prosthesis manufacturing.
Palousek, David; Rosicky, Jiri; Koutny, Daniel
2014-04-01
Digital technology is becoming more accessible for common use in medical applications; however, its adoption in prosthetic and orthotic laboratories is limited because of the persistent perception that it is difficult to apply to real patients. This article aims to offer a real example in the area of human facial prostheses. It describes the use of optical digitization, computational modelling, rapid prototyping, mould fabrication and manufacturing of a nasal silicone prosthesis. This technical note defines the key points of the methodology and aspires to contribute to the introduction of a certified manufacturing procedure. The results show that the technologies used reduce manufacturing time, reflect the patient's requirements and allow the manufacture of high-quality prostheses for missing facial asymmetric parts. The methodology provides a good position for further development and is usable in clinical practice. Clinical relevance: Utilization of digital technologies in the facial prosthesis manufacturing process can contribute to higher patient comfort and production efficiency, albeit with a higher initial investment and a need for experience with software tools.
NASA Astrophysics Data System (ADS)
van Eycke, Yves-Rémi; Allard, Justine; Salmon, Isabelle; Debeir, Olivier; Decaestecker, Christine
2017-02-01
Immunohistochemistry (IHC) is a widely used technique in pathology to evidence protein expression in tissue samples. However, this staining technique is known for presenting inter-batch variations. Whole slide imaging in digital pathology offers a possibility to overcome this problem by means of image normalisation techniques. In the present paper we propose a methodology to objectively evaluate the need of image normalisation and to identify the best way to perform it. This methodology uses tissue microarray (TMA) materials and statistical analyses to evidence the possible variations occurring at colour and intensity levels as well as to evaluate the efficiency of image normalisation methods in correcting them. We applied our methodology to test different methods of image normalisation based on blind colour deconvolution that we adapted for IHC staining. These tests were carried out for different IHC experiments on different tissue types and targeting different proteins with different subcellular localisations. Our methodology enabled us to establish and to validate inter-batch normalisation transforms which correct the non-relevant IHC staining variations. The normalised image series were then processed to extract coherent quantitative features characterising the IHC staining patterns.
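As one concrete, non-blind variant of the colour-deconvolution step, the sketch below separates an IHC image into haematoxylin/eosin/DAB density channels using the fixed Ruifrok-Johnston stain matrix shipped with scikit-image; the methodology above instead estimates the stain matrix blindly, so this only illustrates the operation itself.

```python
# Hedged sketch: fixed-matrix colour deconvolution of an IHC image.
from skimage import color, data

ihc_rgb = data.immunohistochemistry()        # sample IHC image bundled with skimage
hed = color.rgb2hed(ihc_rgb)                 # haematoxylin, eosin, DAB densities
dab = hed[:, :, 2]                           # DAB channel: staining per pixel
```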
REVIEW ARTICLE: Spectrophotometric applications of digital signal processing
NASA Astrophysics Data System (ADS)
Morawski, Roman Z.
2006-09-01
Spectrophotometry is more and more often the method of choice not only in analysis of (bio)chemical substances, but also in the identification of physical properties of various objects and their classification. The applications of spectrophotometry include such diversified tasks as monitoring of optical telecommunications links, assessment of eating quality of food, forensic classification of papers, biometric identification of individuals, detection of insect infestation of seeds and classification of textiles. In all those applications, large numbers of data, generated by spectrophotometers, are processed by various digital means in order to extract measurement information. The main objective of this paper is to review the state-of-the-art methodology for digital signal processing (DSP) when applied to data provided by spectrophotometric transducers and spectrophotometers. First, a general methodology of DSP applications in spectrophotometry, based on DSP-oriented models of spectrophotometric data, is outlined. Then, the most important classes of DSP methods for processing spectrophotometric data—the methods for DSP-aided calibration of spectrophotometric instrumentation, the methods for the estimation of spectra on the basis of spectrophotometric data, the methods for the estimation of spectrum-related measurands on the basis of spectrophotometric data—are presented. Finally, the methods for preprocessing and postprocessing of spectrophotometric data are overviewed. Throughout the review, the applications of DSP are illustrated with numerous examples related to broadly understood spectrophotometry.
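As a small example of the preprocessing class surveyed above, the sketch below applies Savitzky-Golay smoothing and a first derivative to a synthetic spectrum, a common way to suppress noise and baseline drift before estimation; the window length and polynomial order are illustrative.

```python
# Hedged sketch: Savitzky-Golay preprocessing of spectrophotometric data.
import numpy as np
from scipy.signal import savgol_filter

wavelengths = np.linspace(400, 700, 601)                 # nm, assumed grid
spectrum = np.exp(-((wavelengths - 550.0) / 30.0) ** 2)  # synthetic band
spectrum += 0.02 * np.random.default_rng(2).standard_normal(wavelengths.size)

smooth = savgol_filter(spectrum, window_length=21, polyorder=3)
deriv = savgol_filter(spectrum, 21, 3, deriv=1)          # baseline-insensitive feature
```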
Dupree, Jean A.; Crowfoot, Richard M.
2012-01-01
The drainage basin is a fundamental hydrologic entity used for studies of surface-water resources and during planning of water-related projects. Numeric drainage areas published by the U.S. Geological Survey water science centers in Annual Water Data Reports and on the National Water Information Systems (NWIS) Web site are still primarily derived from hard-copy sources and by manual delineation of polygonal basin areas on paper topographic map sheets. To expedite numeric drainage area determinations, the Colorado Water Science Center developed a digital database structure and a delineation methodology based on the hydrologic unit boundaries in the National Watershed Boundary Dataset. This report describes the digital database architecture and delineation methodology and also presents the results of a comparison of the numeric drainage areas derived using this digital methodology with those derived using traditional, non-digital methods. (Please see report for full Abstract)
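A minimal sketch of the core GIS operation such a digital methodology rests on follows: selecting Watershed Boundary Dataset hydrologic units, dissolving them into one basin polygon, and measuring its area. The file name, field name, and unit selection are hypothetical placeholders, not the report's schema.

```python
# Hedged sketch: derive a numeric drainage area by dissolving WBD units.
import geopandas as gpd

hu = gpd.read_file("wbd_hu12.shp")                   # hypothetical WBD extract
basin = hu[hu["HUC12"].str.startswith("13010001")]   # units draining to a site
basin_poly = basin.dissolve()                        # merge shared boundaries
area_km2 = basin_poly.to_crs(epsg=5070).area.iloc[0] / 1e6  # equal-area CRS
```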
NASA Technical Reports Server (NTRS)
Belcastro, C. M.
1983-01-01
Flight-critical computer-based control systems designed for advanced aircraft must exhibit ultrareliable performance in lightning-charged environments. Digital system upset can occur as a result of lightning-induced electrical transients, and a methodology was developed to test specific digital systems for upset susceptibility. Initial upset data indicate that there are several distinct upset modes and that the occurrence of upset is related to the relative synchronization of the transient input with the processing state of the digital system. A large upset test data base will aid in the formulation and verification of analytical upset reliability modeling techniques which are being developed.
Valuing national effects of digital health investments: an applied method.
Hagens, Simon; Zelmer, Jennifer; Frazer, Cassandra; Gheorghiu, Bobby; Leaver, Chad
2015-01-01
This paper describes an approach which has been applied to value national outcomes of investments by federal, provincial and territorial governments, clinicians and healthcare organizations in digital health. Hypotheses are used to develop a model, which is revised and populated based upon the available evidence. Quantitative national estimates and qualitative findings are produced and validated through structured peer review processes. This methodology has been applied in four studies since 2008.
Evaluating Multi-Input/Multi-Output Digital Control Systems
NASA Technical Reports Server (NTRS)
Pototzky, Anthony S.; Wieseman, Carol D.; Hoadley, Sherwood T.; Mukhopadhyay, Vivek
1994-01-01
Controller-performance-evaluation (CPE) methodology for multi-input/multi-output (MIMO) digital control systems developed. Procedures identify potentially destabilizing controllers and confirm satisfactory performance of stabilizing ones. Methodology generic and used in many types of multi-loop digital-controller applications, including digital flight-control systems, digitally controlled spacecraft structures, and actively controlled wind-tunnel models. Also applicable to other complex, highly dynamic digital controllers, such as those in high-performance robot systems.
Automated image processing of LANDSAT 2 digital data for watershed runoff prediction
NASA Technical Reports Server (NTRS)
Sasso, R. R.; Jensen, J. R.; Estes, J. E.
1977-01-01
The U.S. Soil Conservation Service (SCS) model for watershed runoff prediction uses soil and land cover information as its major drivers. Kern County Water Agency (KCWA) is implementing the SCS model to predict runoff for 10,400 sq km of mountainous watershed in Kern County, California. The Remote Sensing Unit, University of California, Santa Barbara, was commissioned by KCWA to conduct a 230 sq km feasibility study in the Lake Isabella, California region to evaluate remote sensing methodologies which could ultimately be extrapolated to the entire 10,400 sq km Kern County watershed. Results indicate that digital image processing of Landsat 2 data will provide the usable land cover information required by KCWA as input to the SCS runoff model.
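For reference, the SCS relation that the land cover and soil inputs ultimately feed is compact enough to state directly; the curve number and storm depth below are illustrative values, not the study's calibration.

```python
# Hedged sketch: SCS curve-number direct-runoff relation (US customary units).
def scs_runoff(p_in, cn):
    """Direct runoff Q (inches) from storm rainfall P (inches) and curve number CN."""
    s = 1000.0 / cn - 10.0                   # potential maximum retention (in)
    ia = 0.2 * s                             # initial abstraction
    return 0.0 if p_in <= ia else (p_in - ia) ** 2 / (p_in + 0.8 * s)

q = scs_runoff(p_in=2.5, cn=75)              # ~0.65 in of runoff for this storm
```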
ERIC Educational Resources Information Center
Weiss, Charles J.
2017-01-01
An introduction to digital stochastic simulations for modeling a variety of physical and chemical processes is presented. Despite the importance of stochastic simulations in chemistry, the prevalence of turn-key software solutions can impose a layer of abstraction between the user and the underlying approach obscuring the methodology being…
2016-01-01
Digital single-molecule technologies are expanding diagnostic capabilities, enabling the ultrasensitive quantification of targets, such as viral load in HIV and hepatitis C infections, by directly counting single molecules. Replacing fluorescent readout with a robust visual readout that can be captured by any unmodified cell phone camera will facilitate the global distribution of diagnostic tests, including in limited-resource settings where the need is greatest. This paper describes a methodology for developing a visual readout system for digital single-molecule amplification of RNA and DNA by (i) selecting colorimetric amplification-indicator dyes that are compatible with the spectral sensitivity of standard mobile phones, and (ii) identifying an optimal ratiometric image-processing method for a selected dye to achieve a readout that is robust to lighting conditions and camera hardware and provides unambiguous quantitative results, even for colorblind users. We also include an analysis of the limitations of this methodology, and provide a microfluidic approach that can be applied to expand dynamic range and improve reaction performance, allowing ultrasensitive, quantitative measurements at volumes as low as 5 nL. We validate this methodology using SlipChip-based digital single-molecule isothermal amplification with λDNA as a model and hepatitis C viral RNA as a clinically relevant target. The innovative combination of isothermal amplification chemistry in the presence of a judiciously chosen indicator dye and ratiometric image processing with SlipChip technology allowed the sequence-specific visual readout of single nucleic acid molecules in nanoliter volumes with an unmodified cell phone camera. When paired with devices that integrate sample preparation and nucleic acid amplification, this hardware-agnostic approach will increase the affordability and the distribution of quantitative diagnostic and environmental tests. PMID:26900709
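A minimal sketch of a ratiometric readout in this spirit follows: each reaction well is scored by a ratio of colour channels, so uniform brightness changes from lighting or camera hardware cancel. The channel pair and threshold are illustrative, not the paper's calibrated values.

```python
# Hedged sketch: lighting-robust ratiometric classification of one well.
import numpy as np

def well_positive(rgb_patch, threshold=1.15):
    """Classify a well image patch from its mean green/blue channel ratio."""
    mean = rgb_patch.reshape(-1, 3).mean(axis=0).astype(float)
    ratio = mean[1] / (mean[2] + 1e-9)       # ratio cancels overall gain
    return ratio > threshold
```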
Digitizing Dissertations for an Institutional Repository: A Process and Cost Analysis*
Piorun, Mary; Palmer, Lisa A.
2008-01-01
Objective: This paper describes the Lamar Soutter Library's process and costs associated with digitizing 300 doctoral dissertations for a newly implemented institutional repository at the University of Massachusetts Medical School. Methodology: Project tasks included identifying metadata elements, obtaining and tracking permissions, converting the dissertations to an electronic format, and coordinating workflow between library departments. Each dissertation was scanned, reviewed for quality control, enhanced with a table of contents, processed through an optical character recognition function, and added to the institutional repository. Results: Three hundred and twenty dissertations were digitized and added to the repository for a cost of $23,562, or $0.28 per page. Seventy-four percent of the authors who were contacted (n = 282) granted permission to digitize their dissertations. Processing time per title was 170 minutes, for a total processing time of 906 hours. In the first 17 months, full-text dissertations in the collection were downloaded 17,555 times. Conclusion: Locally digitizing dissertations or other scholarly works for inclusion in institutional repositories can be cost effective, especially if small, defined projects are chosen. A successful project serves as an excellent recruitment strategy for the institutional repository and helps libraries build new relationships. Challenges include workflow, cost, policy development, and copyright permissions. PMID:18654648
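The reported unit figures can be cross-checked directly from the project totals; the few lines below use only numbers stated in the abstract.

```python
# Hedged sketch: reproducing the reported per-page cost and per-title time.
total_cost, titles, hours = 23562.0, 320, 906
pages = total_cost / 0.28                    # ~84,150 pages implied by $0.28/page
minutes_per_title = hours * 60 / titles      # ~170 minutes, as reported
```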
ERIC Educational Resources Information Center
Barajas-Saavedra, Arturo; Álvarez-Rodriguez, Francisco J.; Mendoza-González, Ricardo; Oviedo-De-Luna, Ana C.
2015-01-01
Development of digital resources is difficult due to their particular complexity relying on pedagogical aspects. Another aspect is the lack of well-defined development processes, experiences documented, and standard methodologies to guide and organize game development. Added to this, there is no documented technique to ensure correct…
NASA Technical Reports Server (NTRS)
Pototzky, Anthony S.; Wieseman, Carol; Hoadley, Sherwood Tiffany; Mukhopadhyay, Vivek
1990-01-01
The development of a controller performance evaluation (CPE) methodology for multi-input/multi-output (MIMO) digital control systems is described. The equations used to obtain the open-loop plant, controller transfer matrices, and return-difference matrices are given. Results of applying the CPE methodology to evaluate MIMO digital flutter suppression systems being tested on an active flexible wing wind-tunnel model are presented to demonstrate the CPE capability.
Toward a Digital Thread and Data Package for Metals-Additive Manufacturing.
Kim, D B; Witherell, P; Lu, Y; Feng, S
2017-01-01
Additive manufacturing (AM) has been envisioned by many as a driving factor of the next industrial revolution. Potential benefits of AM adoption include the production of low-volume, customized, complicated parts/products, supply chain efficiencies, shortened time-to-market, and environmental sustainability. Work remains, however, for AM to reach the status of a full production-ready technology. Whereas the ability to create unique 3D geometries has been generally proven, production challenges remain, including lack of (1) data manageability through information management systems, (2) traceability to promote product producibility, process repeatability, and part-to-part reproducibility, and (3) accountability through mature certification and qualification methodologies. To address these challenges in part, this paper discusses the building of data models to support the development of validation and conformance methodologies in AM. We present an AM information map that leverages informatics to facilitate part producibility, process repeatability, and part-to-part reproducibility in an AM process. We present three separate case studies to demonstrate the importance of establishing baseline data structures and part provenance through an AM digital thread.
Biagianti, Bruno; Hidalgo-Mazzei, Diego
2017-01-01
The rapidly expanding field of mobile health (mHealth) seeks to harness increasingly affordable and ubiquitous mobile digital technologies including smartphones, tablets, apps and wearable devices to enhance clinical care. Accumulating evidence suggests that mHealth interventions are increasingly being adopted and valued by people living with serious mental illnesses such as schizophrenia and bipolar disorder, as a means of better understanding and managing their condition. We draw on experiences from three geographically and methodologically distinct mHealth studies to provide a pragmatic overview of the key challenges and considerations relating to the process of developing digital interventions for this population. PMID:29025862
Design of neurophysiologically motivated structures of time-pulse coded neurons
NASA Astrophysics Data System (ADS)
Krasilenko, Vladimir G.; Nikolsky, Alexander I.; Lazarev, Alexander A.; Lobodzinska, Raisa F.
2009-04-01
This paper describes a general methodology for a biologically motivated concept of building sensor processing systems with parallel input, picture-operand processing, and time-pulse coding. The advantages of such coding for the creation of parallel programmable 2D-array structures for next-generation digital computers, which require untraditional numerical systems for processing analog, digital, hybrid and neuro-fuzzy operands, are shown. Simulation results for optoelectronic time-pulse coded intelligent neural elements (OETPCINE) and implementation results for a wide set of neuro-fuzzy logic operations are considered. The simulation results confirm the engineering advantages, intelligence, and circuit flexibility of OETPCINE for the creation of advanced 2D structures. The developed equivalentor-nonequivalentor neural element has a power consumption of 10 mW and a processing time of about 10-100 µs.
A methodology aimed at fostering and sustaining the development processes of an IE-based industry
NASA Astrophysics Data System (ADS)
Corallo, Angelo; Errico, Fabrizio; de Maggio, Marco; Giangreco, Enza
In the current competitive scenario, where business relationships are fundamental in building successful business models and inter/intra organizational business processes are progressively digitalized, an end-to-end methodology is required that is capable of guiding business networks through the Internetworked Enterprise (IE) paradigm: a new and innovative organizational model able to leverage Internet technologies to perform real-time coordination of intra and inter-firm activities, to create value by offering innovative and personalized products/services and reduce transaction costs. This chapter presents the TEKNE project Methodology of change that guides business networks, by means of a modular and flexible approach, towards the IE techno-organizational paradigm, taking into account the competitive environment of the network and how this environment influences its strategic, organizational and technological levels. Contingency, the business model, enterprise architecture and performance metrics are the key concepts that form the cornerstone of this methodological framework.
Barisoni, Laura; Gimpel, Charlotte; Kain, Renate; Laurinavicius, Arvydas; Bueno, Gloria; Zeng, Caihong; Liu, Zhihong; Schaefer, Franz; Kretzler, Matthias; Holzman, Lawrence B; Hewitt, Stephen M
2017-04-01
The introduction of digital pathology to nephrology provides a platform for the development of new methodologies and protocols for visual, morphometric and computer-aided assessment of renal biopsies. Application of digital imaging to pathology made substantial progress over the past decade; it is now in use for education, clinical trials and translational research. Digital pathology evolved as a valuable tool to generate comprehensive structural information in digital form, a key prerequisite for achieving precision pathology for computational biology. The application of this new technology on an international scale is driving novel methods for collaborations, providing unique opportunities but also challenges. Standardization of methods needs to be rigorously evaluated and applied at each step, from specimen processing to scanning, uploading into digital repositories, morphologic, morphometric and computer-aided assessment, data collection and analysis. In this review, we discuss the status and opportunities created by the application of digital imaging to precision nephropathology, and present a vision for the near future.
Code of Federal Regulations, 2011 CFR
2011-01-01
... integration of systems, technologies, programs, equipment, supporting processes, and implementing procedures...-in-depth methodologies to minimize the potential for an insider to adversely affect, either directly... protection of digital computer and communication systems and networks. (ii) Site-specific conditions that...
NASA Astrophysics Data System (ADS)
Marconi, S.; Conti, E.; Christiansen, J.; Placidi, P.
2018-05-01
The operating conditions of the High Luminosity upgrade of the Large Hadron Collider are very demanding for the design of next generation hybrid pixel readout chips in terms of particle rate, radiation level and data bandwidth. To this purpose, the RD53 Collaboration has developed for the ATLAS and CMS experiments a dedicated simulation and verification environment using industry-consolidated tools and methodologies, such as SystemVerilog and the Universal Verification Methodology (UVM). This paper presents how the so-called VEPIX53 environment has first guided the design of digital architectures, optimized for processing and buffering very high particle rates, and secondly how it has been reused for the functional verification of the first large scale demonstrator chip designed by the collaboration, which has recently been submitted.
NASA Astrophysics Data System (ADS)
Münster, S.; Kuroczyński, P.; Pfarr-Harfst, M.; Grellert, M.; Lengyel, D.
2015-08-01
The workgroup for Digital Reconstruction of the Digital Humanities in the German-speaking area association (Digital Humanities im deutschsprachigen Raum e.V.) was founded in 2014 as a cross-disciplinary scientific society dealing with all aspects of digital reconstruction of cultural heritage, and it currently involves more than 40 German researchers. The workgroup is also dedicated to synchronising and fostering methodological research on these topics. As one preliminary result, a memorandum was created to name urgent research challenges and prospects in condensed form and to assemble a research agenda proposing demands for further research and development activities within the next years. The version presented in this paper was originally created as a contribution to the so-called agenda development process initiated by the German Federal Ministry of Education and Research (BMBF) in 2014 and was amended during a joint meeting of the digital reconstruction workgroup in November 2014.
Automated Meteor Detection by All-Sky Digital Camera Systems
NASA Astrophysics Data System (ADS)
Suk, Tomáš; Šimberová, Stanislava
2017-12-01
We have developed a set of methods to detect meteor light traces captured by all-sky CCD cameras. Operating at small automatic observatories (stations), these cameras create a network spread over a large territory. Image data coming from these stations are merged in one central node. Since a vast amount of data is collected by the stations in a single night, robotic storage and analysis are essential to processing. The proposed methodology is adapted to data from a network of automatic stations equipped with digital fish-eye cameras and includes data capturing, preparation, pre-processing, analysis, and finally recognition of objects in time sequences. In our experiments we utilized real observed data from two stations.
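As an illustration of the kind of pre-processing and detection step such a pipeline might apply, the sketch below shows minimal frame differencing for bright transient traces. It is a generic illustration, not the authors' published algorithm; the 5-sigma threshold and the synthetic frames are assumptions.

```python
# Minimal sketch of transient-trace detection in all-sky frames.
# Generic frame differencing, not the authors' pipeline; the threshold
# and the synthetic input data are illustrative assumptions.
import numpy as np

def detect_transients(prev_frame: np.ndarray, frame: np.ndarray, k: float = 5.0):
    """Return a boolean mask of pixels that brightened significantly."""
    diff = frame.astype(np.float64) - prev_frame.astype(np.float64)
    # Robust noise estimate from the median absolute deviation.
    sigma = 1.4826 * np.median(np.abs(diff - np.median(diff)))
    return diff > k * sigma

# Usage: compare consecutive exposures from one station.
rng = np.random.default_rng(0)
night = rng.normal(100, 3, size=(2, 512, 512))   # two synthetic frames
night[1, 250:260, 100:400] += 50                 # inject a streak
mask = detect_transients(night[0], night[1])
print(mask.sum(), "candidate pixels")
```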
Movement measurement of isolated skeletal muscle using imaging microscopy
NASA Astrophysics Data System (ADS)
Elias, David; Zepeda, Hugo; Leija, Lorenzo S.; Sossa, Humberto; de la Rosa, Jose I.
1997-05-01
An imaging-microscopy methodology is presented for measuring contraction movement in chemically stimulated crustacean skeletal muscle, whose movement speed is about 0.02 mm/s. A CCD camera coupled to a microscope and a high-speed digital image acquisition system, capable of capturing 960 images per second, are used. The images are digitally processed in a PC and displayed on a video monitor. A maximal field of 0.198 X 0.198 mm2 and a spatial resolution of 3.5 micrometers are obtained.
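Taken at face value, these numbers imply that the per-frame displacement is far smaller than the spatial resolution, so the contraction must be tracked across many frames. A quick check, using only the figures quoted above:

```latex
\frac{0.02\ \text{mm/s}}{960\ \text{frames/s}} \approx 21\ \text{nm per frame},
\qquad
\frac{3.5\ \mu\text{m}}{21\ \text{nm/frame}} \approx 170\ \text{frames per resolution element}.
```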
Advanced reliability modeling of fault-tolerant computer-based systems
NASA Technical Reports Server (NTRS)
Bavuso, S. J.
1982-01-01
Two methodologies for the reliability assessment of fault-tolerant, digital-computer-based systems are discussed. Computer-aided reliability estimation 3 (CARE 3) and gate logic software simulation (GLOSS) are assessment technologies developed to mitigate a serious weakness in the design and evaluation process of ultrareliable digital systems: the unavailability of a sufficiently powerful modeling technique for comparing the stochastic attributes of one system against others. Some of the more interesting attributes are reliability, system survival, safety, and mission success.
Bagayoko, C-O; Bediang, G; Anne, A; Niang, M; Traoré, A-K; Geissbuhler, A
2017-11-01
It is generally agreed today that digital technology provides a lever for improving access to health care, care processes, and public health planning and activities such as education and prevention. In countries that have reached a given level of development, its use has taken place in a somewhat fragmented manner that raises important interoperability problems and sometimes makes synergy between different digital health projects impossible. This may be linked to several factors, principally the lack of a global vision of digital health and inadequate methodological knowledge that prevents the development and implementation of such a vision. The countries of Africa should be able to learn from these errors at the outset of digital health by moving toward systemic approaches, known standards, and tools appropriate to the realities on the ground. The aim of this work is to present the methodological approaches as well as the principal results of two relatively new centers of expertise in Mali and Cameroon, intended to cultivate this vision of digital governance in the domain of health and to train professionals to implement the projects. Both centers were created through initiatives of civil society organizations. The center in Mali developed into an economic interest group and then into collaboration with healthcare and university organizations. The same process is underway at the Cameroon center. The principal results from these centers span research, development, training, and implementation of digital health tools. They have produced dozens of scientific publications, doctoral dissertations, theses, and papers focused especially on subjects such as medicoeconomic evaluation tools for e-health and health information technology systems. In light of these results, we can conclude that these two centers of expertise are well and truly established. Their role may be decisive in the local training of participants, the culture of good governance of digital health projects, the development of operational strategies, and the implementation of projects.
A methodology for the semi-automatic digital image analysis of fragmental impactites
NASA Astrophysics Data System (ADS)
Chanou, A.; Osinski, G. R.; Grieve, R. A. F.
2014-04-01
A semi-automated digital image analysis method is developed for the comparative textural study of impact melt-bearing breccias. This method uses the freeware software ImageJ developed by the National Institute of Health (NIH). Digital image analysis is performed on scans of hand samples (10-15 cm across), based on macroscopic interpretations of the rock components. All image processing and segmentation are done semi-automatically, with the least possible manual intervention. The areal fraction of components is estimated and modal abundances can be deduced, where the physical optical properties (e.g., contrast, color) of the samples allow it. Other parameters that can be measured include, for example, clast size, clast-preferred orientations, average box-counting dimension or fragment shape complexity, and nearest neighbor distances (NnD). This semi-automated method allows the analysis of a larger number of samples in a relatively short time. Textures, granulometry, and shape descriptors are of considerable importance in rock characterization. The methodology is used to determine the variations of the physical characteristics of some examples of fragmental impactites.
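By way of illustration, a comparable measurement of areal fractions and simple size/shape descriptors can be sketched with scikit-image; the paper itself uses ImageJ, so the library, the Otsu threshold, and the synthetic image below are all assumptions of this sketch, not the authors' macros.

```python
# Hedged sketch of clast segmentation and descriptors with scikit-image,
# analogous in spirit to the paper's ImageJ workflow (not its actual steps).
import numpy as np
from skimage import draw, measure
from skimage.filters import threshold_otsu

image = np.zeros((300, 300))
rr, cc = draw.disk((100, 100), 30)
image[rr, cc] = 1.0                      # synthetic "clast"
rr, cc = draw.disk((220, 180), 18)
image[rr, cc] = 0.8

mask = image > threshold_otsu(image)     # semi-automatic segmentation step
labels = measure.label(mask)

areal_fraction = mask.sum() / mask.size  # modal abundance proxy
for p in measure.regionprops(labels):
    equiv_diameter = (4.0 * p.area / np.pi) ** 0.5
    print(f"clast {p.label}: area={p.area}, "
          f"equiv_diameter={equiv_diameter:.1f}, "
          f"orientation={p.orientation:.2f} rad")
print(f"areal fraction = {areal_fraction:.3f}")
```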
Winging It: Using Digital Imaging To Investigate Butterfly Metamorphosis
ERIC Educational Resources Information Center
Bowen, Anne; Bell, Randy L.
2004-01-01
One of the best ways to inspire interest in biology is through observations of living things. Unfortunately, this important component of science methodology is often left out because of the difficulty of including it in the classroom. Additionally, amazing processes occur in nature that few have the chance to observe. This article reviews a…
NASA Astrophysics Data System (ADS)
Li, Tianxing; Zhou, Junxiang; Deng, Xiaozhong; Li, Jubo; Xing, Chunrong; Su, Jianxin; Wang, Huiliang
2018-07-01
A manufacturing error of a cycloidal gear is the key factor affecting the transmission accuracy of a robot rotary vector (RV) reducer. A methodology is proposed to realize the digitized measurement and data processing of the cycloidal gear manufacturing error based on the gear measuring center, which can quickly and accurately measure and evaluate the manufacturing error of the cycloidal gear by using both the whole tooth profile measurement and a single tooth profile measurement. By analyzing the particularity of the cycloidal profile and its effect on the actual meshing characteristics of the RV transmission, the cycloid profile measurement strategy is planned, and the theoretical profile model and error measurement model of cycloid-pin gear transmission are established. Through the digital processing technology, the theoretical trajectory of the probe and the normal vector of the measured point are calculated. By means of precision measurement principle and error compensation theory, a mathematical model for the accurate calculation and data processing of manufacturing error is constructed, and the actual manufacturing error of the cycloidal gear is obtained by the optimization iterative solution. Finally, the measurement experiment of the cycloidal gear tooth profile is carried out on the gear measuring center and the HEXAGON coordinate measuring machine, respectively. The measurement results verify the correctness and validity of the measurement theory and method. This methodology will provide the basis for the accurate evaluation and the effective control of manufacturing precision of the cycloidal gear in a robot RV reducer.
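The step of computing the probe trajectory and the normal vector at each measured point can be illustrated generically: for any smooth parametric profile (x(t), y(t)), the unit normal is a 90-degree rotation of the unit tangent, and an offset curve at the probe-tip radius gives the probe-center path. The sketch below applies this to an epitrochoid used as a simple stand-in for a cycloidal profile; the radii, offset, and tip radius are illustrative assumptions, not the paper's gear data or its error model.

```python
# Generic sketch: unit normals and a probe-offset curve along a parametric
# profile, here an epitrochoid standing in for a cycloidal gear profile.
# R, r, d, rho below are assumed parameters, not the paper's gear data.
import numpy as np

R, r, d = 30.0, 3.0, 2.5                 # base radius, rolling radius, offset
t = np.linspace(0.0, 2.0 * np.pi, 1000)

x = (R + r) * np.cos(t) - d * np.cos((R + r) / r * t)
y = (R + r) * np.sin(t) - d * np.sin((R + r) / r * t)

dx, dy = np.gradient(x, t), np.gradient(y, t)   # numerical tangent
norm = np.hypot(dx, dy)
nx, ny = dy / norm, -dx / norm                  # rotate tangent by -90 deg

# A touch probe with tip radius rho is centered on the offset curve:
rho = 1.0
probe_x, probe_y = x + rho * nx, y + rho * ny
print(probe_x[:3], probe_y[:3])
```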
Recent activities within the Aeroservoelasticity Branch at the NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Noll, Thomas E.; Perry, Boyd, III; Gilbert, Michael G.
1989-01-01
The objective of research in aeroservoelasticity at the NASA Langley Research Center is to enhance the modeling, analysis, and multidisciplinary design methodologies for obtaining multifunction digital control systems for application to flexible flight vehicles. Recent accomplishments are discussed, and a status report on current activities within the Aeroservoelasticity Branch is presented. In the area of modeling, improvements to the Minimum-State Method of approximating unsteady aerodynamics are shown to provide precise, low-order aeroservoelastic models for design and simulation activities. Analytical methods based on Matched Filter Theory and Random Process Theory to provide efficient and direct predictions of the critical gust profile and the time-correlated gust loads for linear structural design considerations are also discussed. Two research projects leading towards improved design methodology are summarized. The first program is developing an integrated structure/control design capability based on hierarchical problem decomposition, multilevel optimization and analytical sensitivities. The second program provides procedures for obtaining low-order, robust digital control laws for aeroelastic applications. In terms of methodology validation and application the current activities associated with the Active Flexible Wing project are reviewed.
Asif, Muhammad; Guo, Xiangzhou; Zhang, Jing; Miao, Jungang
2018-04-17
Digital cross-correlation is central to many applications including but not limited to Digital Image Processing, Satellite Navigation and Remote Sensing. With recent advancements in digital technology, the computational demands of such applications have increased enormously. In this paper we are presenting a high throughput digital cross correlator, capable of processing 1-bit digitized stream, at the rate of up to 2 GHz, simultaneously on 64 channels i.e., approximately 4 Trillion correlation and accumulation operations per second. In order to achieve higher throughput, we have focused on frequency based partitioning of our design and tried to minimize and localize high frequency operations. This correlator is designed for a Passive Millimeter Wave Imager intended for the detection of contraband items concealed on human body. The goals are to increase the system bandwidth, achieve video rate imaging, improve sensitivity and reduce the size. Design methodology is detailed in subsequent sections, elaborating the techniques enabling high throughput. The design is verified for Xilinx Kintex UltraScale device in simulation and the implementation results are given in terms of device utilization and power consumption estimates. Our results show considerable improvements in throughput as compared to our baseline design, while the correlator successfully meets the functional requirements.
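For 64 channels there are 64 x 63 / 2 = 2016 channel pairs, and 2016 pairs x 2 GHz is about 4 x 10^12 correlate-and-accumulate operations per second, which is one plausible accounting of the quoted throughput. For 1-bit digitized streams, one correlation lag reduces to an XNOR followed by a population count. The sketch below shows that bit-level trick in software; it is purely illustrative (the real design is an FPGA implementation, and the 64-bit word packing here is an assumption).

```python
# Software sketch of 1-bit correlation: a multiply of +/-1 samples packed
# as bits becomes XNOR + popcount. Illustrative only; the paper's
# correlator is an FPGA design, and 64-bit packing is assumed.
def corr_1bit(a: int, b: int, nbits: int = 64) -> int:
    """Correlate two nbits-wide 1-bit sample words; result in [-nbits, nbits]."""
    mask = (1 << nbits) - 1
    agree = ~(a ^ b) & mask          # XNOR: 1 where sample signs agree
    n_agree = bin(agree).count("1")  # popcount
    return 2 * n_agree - nbits       # +1 per agreement, -1 per disagreement

x = 0xF0F0F0F0F0F0F0F0
print(corr_1bit(x, x))                    # perfectly correlated -> +64
print(corr_1bit(x, ~x & (1 << 64) - 1))   # anti-correlated -> -64
```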
Barone, Sandro; Paoli, Alessandro; Razionale, Armando Viviano
2015-07-01
In the field of orthodontic planning, the creation of a complete digital dental model to simulate and predict treatments is of utmost importance. Nowadays, orthodontists use panoramic radiographs (PAN) and dental crown representations obtained by optical scanning. However, these data do not contain any 3D information regarding tooth root geometries. A reliable orthodontic treatment should instead take into account entire geometrical models of dental shapes in order to better predict tooth movements. This paper presents a methodology to create complete 3D patient dental anatomies by combining digital mouth models and panoramic radiographs. The modeling process is based on using crown surfaces, reconstructed by optical scanning, and root geometries, obtained by adapting anatomical CAD templates over patient specific information extracted from radiographic data. The radiographic process is virtually replicated on crown digital geometries through the Discrete Radon Transform (DRT). The resulting virtual PAN image is used to integrate the actual radiographic data and the digital mouth model. This procedure provides the root references on the 3D digital crown models, which guide a shape adjustment of the dental CAD templates. The entire geometrical models are finally created by merging dental crowns, captured by optical scanning, and root geometries, obtained from the CAD templates. Copyright © 2015 Elsevier Ltd. All rights reserved.
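The virtual-PAN step rests on the Radon transform, which integrates an image along sets of parallel rays; a discrete version is available in scikit-image. The snippet below is a minimal, hedged illustration of producing such a projection from a synthetic slice; it is not the authors' reconstruction code, and the phantom and projection angles are assumptions.

```python
# Minimal illustration of a discrete Radon transform (sinogram),
# the operation behind the "virtual PAN" idea described above.
# The phantom and the projection angles are illustrative assumptions.
import numpy as np
from skimage.transform import radon

phantom = np.zeros((128, 128))
phantom[40:90, 55:75] = 1.0          # a crude "tooth crown" block

theta = np.linspace(0.0, 180.0, 90, endpoint=False)
sinogram = radon(phantom, theta=theta, circle=False)
print(sinogram.shape)                # (projection position, angle)
```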
NASA Astrophysics Data System (ADS)
McEwan, W.; Butterfield, J.
2011-05-01
The well established benefits of composite materials are driving a significant shift in design and manufacture strategies for original equipment manufacturers (OEMs). Thermoplastic composites have advantages over the traditional thermosetting materials with regards to sustainability and environmental impact, features which are becoming increasingly pertinent in the aerospace arena. However, when sustainability and environmental impact are considered as design drivers, integrated methods for part design and product development must be developed so that any benefits of sustainable composite material systems can be assessed during the design process. These methods must include mechanisms to account for process induced part variation and techniques related to re-forming, recycling and decommissioning, which are in their infancy. It is proposed in this paper that predictive techniques related to material specification, part processing and product cost of thermoplastic composite components, be integrated within a Through Life Management (TLM) product development methodology as part of a larger strategy of product system modeling to improve disciplinary concurrency, realistic part performance, and to place sustainability at the heart of the design process. This paper reports the enhancement of digital manufacturing tools as a means of drawing simulated part manufacturing scenarios, real time costing mechanisms, and broader lifecycle performance data capture into the design cycle. The work demonstrates predictive processes for sustainable composite product manufacture and how a Product-Process-Resource (PPR) structure can be customised and enhanced to include design intent driven by `Real' part geometry and consequent assembly.
Gregory, Katherine
2018-06-01
In the last 20 years, qualitative research scholars have begun to interrogate methodological and analytic issues concerning online research settings as both data sources and instruments for digital methods. This article examines the adaptation of parts of a qualitative research curriculum for understanding online communication settings. I propose methodological best practices for researchers and educators that I developed while teaching research methods to undergraduate and graduate students across disciplinary departments and discuss obstacles faced during my own research while gathering data from online sources. This article confronts issues concerning the disembodied aspects of applying what in practice should be rooted in a humanistic inquiry. Furthermore, as some approaches to online qualitative research as a digital method grow increasingly problematic with the development of new data mining technologies, I will also briefly touch upon borderline ethical practices involving data-scraping-based qualitative research.
A Case Study of Reverse Engineering Integrated in an Automated Design Process
NASA Astrophysics Data System (ADS)
Pescaru, R.; Kyratsis, P.; Oancea, G.
2016-11-01
This paper presents a design methodology which automates the generation of curves extracted from point clouds obtained by digitizing physical objects. The methodology is demonstrated on a product from the consumer goods industry, namely a footwear-type product with a complex shape and many curves. The final result is the automated generation of wrapping curves, surfaces and solids according to the characteristics of the customer's foot and the preferences for the chosen model, which leads to the development of customized products.
Vanegas, Fernando; Bratanov, Dmitry; Powell, Kevin; Weiss, John; Gonzalez, Felipe
2018-01-17
Recent advances in remote sensed imagery and geospatial image processing using unmanned aerial vehicles (UAVs) have enabled the rapid and ongoing development of monitoring tools for crop management and the detection/surveillance of insect pests. This paper describes a UAV remote sensing-based methodology to increase the efficiency of existing surveillance practices (human inspectors and insect traps) for detecting pest infestations (e.g., grape phylloxera in vineyards). The methodology uses a UAV integrated with advanced digital hyperspectral, multispectral, and RGB sensors. We implemented the methodology for the development of a predictive model for phylloxera detection. In this method, we explore the combination of airborne RGB, multispectral, and hyperspectral imagery with ground-based data at two separate time periods and under different levels of phylloxera infestation. We describe the technology used (the sensors, the UAV, and the flight operations), the processing workflow of the datasets from each imagery type, and the methods for combining multiple airborne with ground-based datasets. Finally, we present relevant results of correlation between the different processed datasets. The objective of this research is to develop a novel methodology for collecting, processing, analysing and integrating multispectral, hyperspectral, ground and spatial data to remote sense different variables in different applications, such as, in this case, plant pest surveillance. The development of such a methodology would provide researchers, agronomists, and UAV practitioners reliable data collection protocols and methods to achieve faster processing techniques and integrate multiple sources of data in diverse remote sensing applications. PMID:29342101
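As one concrete example of combining such bands, a vegetation index like NDVI is computed per pixel from the red and near-infrared channels of the multispectral imagery. The minimal sketch below assumes two aligned reflectance arrays; the band data are synthetic, and the paper's actual predictive model combines far more datasets than this illustration.

```python
# Hedged NDVI sketch from aligned red and near-infrared reflectance bands.
# Band arrays are synthetic; the paper's phylloxera model fuses many more
# datasets (hyperspectral, ground truth) than this illustration does.
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    denom = nir + red
    safe = np.where(denom == 0, 1.0, denom)     # guard against divide-by-zero
    return np.where(denom > 0, (nir - red) / safe, 0.0)

red = np.random.default_rng(1).uniform(0.05, 0.2, (100, 100))
nir = red * 4.0                      # healthy canopy reflects strongly in NIR
print(float(ndvi(red, nir).mean()))  # exactly 0.6 for this synthetic canopy
```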
Akkaynak, Derya; Treibitz, Tali; Xiao, Bei; Gürkan, Umut A.; Allen, Justine J.; Demirci, Utkan; Hanlon, Roger T.
2014-01-01
Commercial off-the-shelf digital cameras are inexpensive and easy-to-use instruments that can be used for quantitative scientific data acquisition if images are captured in raw format and processed so that they maintain a linear relationship with scene radiance. Here we describe the image-processing steps required for consistent data acquisition with color cameras. In addition, we present a method for scene-specific color calibration that increases the accuracy of color capture when a scene contains colors that are not well represented in the gamut of a standard color-calibration target. We demonstrate applications of the proposed methodology in the fields of biomedical engineering, artwork photography, perception science, marine biology, and underwater imaging. PMID:24562030
Accelerated numerical processing of electronically recorded holograms with reduced speckle noise.
Trujillo, Carlos; Garcia-Sucerquia, Jorge
2013-09-01
The numerical reconstruction of digitally recorded holograms suffers from speckle noise. An accelerated method that uses general-purpose computing in graphics processing units to reduce that noise is shown. The proposed methodology utilizes parallelized algorithms to record, reconstruct, and superimpose multiple uncorrelated holograms of a static scene. For the best tradeoff between reduction of the speckle noise and processing time, the method records, reconstructs, and superimposes six holograms of 1024 × 1024 pixels in 68 ms; for this case, the methodology reduces the speckle noise by 58% compared with that exhibited by a single hologram. The fully parallelized method running on a commodity graphics processing unit is one order of magnitude faster than the same technique implemented on a regular CPU using its multithreading capabilities. Experimental results are shown to validate the proposal.
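The reported figure is consistent with the standard statistics of speckle averaging: the contrast of the superposition of N uncorrelated speckle patterns falls as 1/sqrt(N), so for the six holograms used here

```latex
C_N = \frac{C_1}{\sqrt{N}}, \qquad 1 - \frac{1}{\sqrt{6}} \approx 0.59,
```

i.e. a predicted reduction of about 59%, close to the measured 58%.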
Updating National Topographic Data Base Using Change Detection Methods
NASA Astrophysics Data System (ADS)
Keinan, E.; Felus, Y. A.; Tal, Y.; Zilberstien, O.; Elihai, Y.
2016-06-01
The traditional method for updating a topographic database on a national scale is a complex process that requires human resources, time and the development of specialized procedures. In many National Mapping and Cadaster Agencies (NMCAs), the updating cycle takes a few years. Today, the reality is dynamic and changes occur every day; therefore, users expect the existing database to portray the current reality. Global mapping projects based on community volunteers, such as OSM, update their databases every day through crowdsourcing. In order to fulfil users' requirements for rapid updating, a new methodology that maps major interest areas while preserving associated decoding information should be developed. Until recently, automated processes did not yield satisfactory results, and a typical process included comparing images from different periods. The success rates in identifying the objects were low, and most were accompanied by a high percentage of false alarms. As a result, the automatic process required significant editorial work that made it uneconomical. In recent years, developments in mapping technologies, advances in image processing algorithms and computer vision, together with the development of digital aerial cameras with a NIR band and Very High Resolution satellites, have allowed the implementation of a cost-effective automated process. The automatic process is based on high-resolution Digital Surface Model analysis, Multi Spectral (MS) classification, MS segmentation, object analysis and shape-forming algorithms. This article reviews the results of a novel change detection methodology as a first step toward updating the NTDB at the Survey of Israel.
Vollmar, Horst Christian; Kramer, Ursula; Müller, Hardy; Griemmert, Maria; Noelle, Guido; Schrappe, Matthias
2017-12-01
The term "digital health" is currently the most comprehensive term that includes all information and communication technologies in healthcare, including e-health, mobile health, telemedicine, big data, health apps and others. Digital health can be seen as a good example of the use of the concept and methodology of health services research in the interaction between complex interventions and complex contexts. The position paper deals with 1) digital health as the subject of health services research; 2) digital health as a methodological and ethical challenge for health services research. The often-postulated benefits of digital health interventions should be demonstrated with good studies. First systematic evaluations of apps for "treatment support" show that risks are higher than benefits. The need for a rigorous proof applies even more to big data-assisted interventions that support decision-making in the treatment process with the support of artificial intelligence. Of course, from the point of view of health services research, it is worth participating as much as possible in data access available through digital health and "big data". However, there is the risk that a noncritical application of digital health and big data will lead to a return to a linear understanding of biomedical research, which, at best, accepts complex conditions assuming multivariate models but does not take complex facts into account. It is not just a matter of scientific ethical requirements in health services care research, for instance, better research instead of unnecessary research ("reducing waste"), but it is primarily a matter of anticipating the social consequences (system level) of scientific analysis and evaluation. This is both a challenge and an attractive option for health services research to present itself as a mature and responsible scientific discipline. © Georg Thieme Verlag KG Stuttgart · New York.
Digital Storytelling: A Novel Methodology for Sexual Health Promotion
ERIC Educational Resources Information Center
Guse, Kylene; Spagat, Andrea; Hill, Amy; Lira, Andrea; Heathcock, Stephen; Gilliam, Melissa
2013-01-01
Digital storytelling draws on the power of narrative for personal and social transformation. This technique has many desirable attributes for sexuality education, including a participatory methodology, provision of a "safe space" to collaboratively address stigmatized topics, and an emphasis on the social and political contexts that…
Towards a Methodology of Postmodern Assemblage: Adolescent Identity in the Age of Social Networking
ERIC Educational Resources Information Center
Barnett, Chad
2009-01-01
Adolescents who occupy virtual spaces construct identities for a dual audience, those intimate friends whose favor they seek and a broader public audience whose purpose for viewing cannot be known. The digital world of MySpace, Facebook, and Instant Messaging has simultaneously complicated and enhanced the process of identity construction. The…
A Systematic Software, Firmware, and Hardware Codesign Methodology for Digital Signal Processing
2014-03-01
Possible optimal leaf-nodes are mapped to design patterns for embedded system design; software and hardware partitioning is a very difficult challenge in this field.
Three-Dimensional Extension of a Digital Library Service System
ERIC Educational Resources Information Center
Xiao, Long
2010-01-01
Purpose: The paper aims to provide an overall methodology and case study for the innovation and extension of a digital library, especially the service system. Design/methodology/approach: Based on the three-dimensional structure theory of the information service industry, this paper combines a comprehensive analysis with the practical experiences…
A Design Methodology for Medical Processes.
Ferrante, Simona; Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara
2016-01-01
Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient's needs, the uncertainty of the patient's response, and the indeterminacy of patient's compliance to treatment. Also, the multiple actors involved in patient's care need clear and transparent communication to ensure care coordination. In this paper, we propose a methodology to model healthcare processes in order to break out complexity and provide transparency. The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests that was also implemented. Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently from the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution.
The State of Development of Digital Libraries in Poland
ERIC Educational Resources Information Center
Gorny, Miroslaw; Catlow, John; Lewandowski, Rafal
2010-01-01
Purpose: The purpose of this paper is to describe the state of development of Polish digital libraries. Design/methodology/approach: The paper describes the establishment of the first digital library in Poland, the creation of the Wielkopolska Digital Library and other regional digital libraries. The organisational and technological solutions used…
Duarte-Galvan, Carlos; Romero-Troncoso, Rene de J; Torres-Pacheco, Irineo; Guevara-Gonzalez, Ramon G; Fernandez-Jaramillo, Arturo A; Contreras-Medina, Luis M; Carrillo-Serrano, Roberto V; Millan-Almaraz, Jesus R
2014-10-09
Soil drought represents one of the most dangerous stresses for plants. It impacts the yield and quality of crops, and if it remains undetected for a long time, the entire crop could be lost. However, for some plants a certain amount of drought stress improves specific characteristics. In such cases, a device capable of detecting and quantifying the impact of drought stress in plants is desirable. This article focuses on testing whether monitoring physiological processes through a gas exchange methodology provides enough information to detect drought stress conditions in plants. The experiment consists of using a set of smart sensors based on Field Programmable Gate Arrays (FPGAs) to monitor a group of plants under controlled drought conditions. The main objective was to use different digital signal processing techniques such as the Discrete Wavelet Transform (DWT) to explore the response of plant physiological processes to drought. Also, an index-based methodology was utilized to compensate for the spatial variation inside the greenhouse. As a result, differences between treatments were determined to be independent of climate variations inside the greenhouse. Finally, after using the DWT as a digital filter, the results demonstrated that the proposed system is capable of rejecting high-frequency noise and detecting drought conditions.
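As a hedged illustration of using the DWT as a digital filter in this spirit, the sketch below zeroes the detail coefficients of a noisy signal and keeps only the coarse approximation. The wavelet choice, decomposition level, and synthetic signal are assumptions, and the paper's actual implementation runs on FPGAs, not in Python.

```python
# Hedged sketch: DWT-based low-pass filtering of a noisy sensor signal.
# Wavelet, level, and the synthetic signal are assumptions; the paper's
# system implements its filtering in FPGA-based smart sensors.
import numpy as np
import pywt

rng = np.random.default_rng(42)
t = np.linspace(0, 1, 1024)
trend = 2.0 + 0.5 * np.sin(2 * np.pi * 2 * t)   # slow physiological trend
signal = trend + rng.normal(0, 0.3, t.size)     # high-frequency noise

coeffs = pywt.wavedec(signal, "db4", level=5)
# Zero all detail coefficients, keeping only the coarse approximation.
filtered = pywt.waverec([coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]],
                        "db4")[: t.size]
print(float(np.abs(filtered - trend).mean()))   # residual error vs. trend
```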
NASA Technical Reports Server (NTRS)
Jones, Corey; LaPha, Steven
2013-01-01
This presentation will focus on the modernization of design and engineering practices through the use of Model Based Definition methodology. By gathering important engineering data into one 3D digital data set, applying model annotations, and setting up model view states directly in the 3D CAD model, model-specific information can be published to Windchill and CreoView for use during the Design Review Process. This presentation will describe the methods that have been incorporated into the modeling.
NASA Astrophysics Data System (ADS)
Attention is given to aspects of quality assurance methodologies in development life cycles, optical intercity transmission systems, multiaccess protocols, system and technology aspects in the case of regional/domestic satellites, advances in SSB-AM radio transmission over terrestrial and satellite networks, and development environments for telecommunications systems. Other subjects studied are concerned with business communication networks for voice and data, VLSI in local networks and communication protocols, product evaluation and support, an update regarding Videotex, topics in communication theory, topics in radio propagation, a status report regarding societal effects of technology in the workplace, digital image processing, and adaptive signal processing for communications. The management of the reliability function in the development process is considered, along with Giga-bit technologies for long-distance, large-capacity optical transmission equipment, the application of gallium arsenide analog and digital integrated circuits to high-speed fiber-optic communications, and a simple algorithm for image data coding.
NASA Astrophysics Data System (ADS)
Sokolov, M. A.
This handbook treats the design and analysis of pulsed radar receivers, with emphasis on elements (especially IC elements) that implement optimal and suboptimal algorithms. The design methodology is developed from the viewpoint of statistical communications theory. Particular consideration is given to the synthesis of single-channel and multichannel detectors, the design of analog and digital signal-processing devices, and the analysis of IF amplifiers.
ERIC Educational Resources Information Center
Exarchou, Evi; Klonari, Aikaterini; Lambrinos, Nikos; Vaitis, Michalis
2017-01-01
This study focused on the analysis of Grade-12 (Senior) students' sociocultural constructivist interactions using Web 2.0 applications during a geographical research process. In the study methodology context, a transdisciplinary case study (TdCS) with ethnographic and research action data was designed, implemented and analyzed in real teaching…
ERIC Educational Resources Information Center
Márquez, Manuel; Chaves, Beatriz
2016-01-01
The application of a methodology based on S.C. Dik's Functionalist Grammar linguistic principles to the teaching of Latin to secondary students has resulted in a quantitative improvement in students' knowledge acquisition. To do so, we have used a self-learning tool, an ad hoc dictionary, of which the use in…
NASA Astrophysics Data System (ADS)
Rasera, L. G.; Mariethoz, G.; Lane, S. N.
2017-12-01
Frequent acquisition of high-resolution digital elevation models (HR-DEMs) over large areas is expensive and difficult. Satellite-derived low-resolution digital elevation models (LR-DEMs) provide extensive coverage of Earth's surface but at coarser spatial and temporal resolutions. Although useful for large scale problems, LR-DEMs are not suitable for modeling hydrologic and geomorphic processes at scales smaller than their spatial resolution. In this work, we present a multiple-point geostatistical approach for downscaling a target LR-DEM based on available high-resolution training data and recurrent high-resolution remote sensing images. The method aims at generating several equiprobable HR-DEMs conditioned to a given target LR-DEM by borrowing small scale topographic patterns from an analogue containing data at both coarse and fine scales. An application of the methodology is demonstrated by using an ensemble of simulated HR-DEMs as input to a flow-routing algorithm. The proposed framework enables a probabilistic assessment of the spatial structures generated by natural phenomena operating at scales finer than the available terrain elevation measurements. A case study in the Swiss Alps is provided to illustrate the methodology.
Empirical Mode Decomposition and Neural Networks on FPGA for Fault Diagnosis in Induction Motors
Camarena-Martinez, David; Valtierra-Rodriguez, Martin; Garcia-Perez, Arturo; Osornio-Rios, Roque Alfredo; Romero-Troncoso, Rene de Jesus
2014-01-01
Nowadays, many industrial applications require online systems that combine several processing techniques in order to offer solutions to complex problems as the case of detection and classification of multiple faults in induction motors. In this work, a novel digital structure to implement the empirical mode decomposition (EMD) for processing nonstationary and nonlinear signals using the full spline-cubic function is presented; besides, it is combined with an adaptive linear network (ADALINE)-based frequency estimator and a feed forward neural network (FFNN)-based classifier to provide an intelligent methodology for the automatic diagnosis during the startup transient of motor faults such as: one and two broken rotor bars, bearing defects, and unbalance. Moreover, the overall methodology implementation into a field-programmable gate array (FPGA) allows an online and real-time operation, thanks to its parallelism and high-performance capabilities as a system-on-a-chip (SoC) solution. The detection and classification results show the effectiveness of the proposed fused techniques; besides, the high precision and minimum resource usage of the developed digital structures make them a suitable and low-cost solution for this and many other industrial applications. PMID:24678281
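For readers unfamiliar with EMD, its core is the "sifting" loop: envelopes through the local extrema are averaged and subtracted until an intrinsic mode function (IMF) remains. The sketch below is a deliberately simplified software version of one sifting pass; the stopping rule, spline handling, and test signal are assumptions, whereas the paper's contribution is a full spline-cubic EMD realized in FPGA hardware.

```python
# Simplified EMD sifting sketch (software only). The paper implements a
# full spline-cubic EMD in an FPGA; this shows just the core idea.
# The fixed iteration count and the test signal are assumptions.
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift_one_imf(x: np.ndarray, n_iter: int = 10) -> np.ndarray:
    h = x.copy()
    t = np.arange(x.size)
    for _ in range(n_iter):
        maxima = argrelextrema(h, np.greater)[0]
        minima = argrelextrema(h, np.less)[0]
        if maxima.size < 3 or minima.size < 3:
            break                      # too few extrema to build envelopes
        upper = CubicSpline(maxima, h[maxima])(t)
        lower = CubicSpline(minima, h[minima])(t)
        h = h - (upper + lower) / 2.0  # subtract the local mean envelope
    return h

t = np.linspace(0, 1, 2048)
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 5 * t)
imf1 = sift_one_imf(x)                 # should resemble the 50 Hz component
print(float(np.corrcoef(imf1, np.sin(2 * np.pi * 50 * t))[0, 1]))
```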
NASA Astrophysics Data System (ADS)
Ramanathan, Ramya; Guin, Arijit; Ritzi, Robert W.; Dominic, David F.; Freedman, Vicky L.; Scheibe, Timothy D.; Lunt, Ian A.
2010-04-01
A geometric-based simulation methodology was developed and incorporated into a computer code to model the hierarchical stratal architecture, and the corresponding spatial distribution of permeability, in braided channel belt deposits. The code creates digital models of these deposits as a three-dimensional cubic lattice, which can be used directly in numerical aquifer or reservoir models for fluid flow. The digital models have stratal units defined from the kilometer scale to the centimeter scale. These synthetic deposits are intended to be used as high-resolution base cases in various areas of computational research on multiscale flow and transport processes, including the testing of upscaling theories. The input parameters are primarily univariate statistics. These include the mean and variance for characteristic lengths of sedimentary unit types at each hierarchical level, and the mean and variance of log-permeability for unit types defined at only the lowest level (smallest scale) of the hierarchy. The code has been written for both serial and parallel execution. The methodology is described in part 1 of this paper. In part 2 (Guin et al., 2010), models generated by the code are presented and evaluated.
NASA Technical Reports Server (NTRS)
Liu, Hua-Kuang (Editor); Schenker, Paul (Editor)
1987-01-01
The papers presented in this volume provide an overview of current research in both optical and digital pattern recognition, with a theme of identifying overlapping research problems and methodologies. Topics discussed include image analysis and low-level vision, optical system design, object analysis and recognition, real-time hybrid architectures and algorithms, high-level image understanding, and optical matched filter design. Papers are presented on synthetic estimation filters for a control system; white-light correlator character recognition; optical AI architectures for intelligent sensors; interpreting aerial photographs by segmentation and search; and optical information processing using a new photopolymer.
Digital Video as Research Practice: Methodology for the Millennium
ERIC Educational Resources Information Center
Shrum, Wesley; Duque, Ricardo; Brown, Timothy
2005-01-01
This essay has its origin in a project on the globalization of science that rediscovered the wisdom of past research practices through the technology of the future. The main argument of this essay is that a convergence of digital video technologies with practices of social surveillance portends a methodological shift towards a new variety of…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chao, Y.A.; Chapman, D.M.; Hill, D.J.
2000-12-15
The dynamic rod worth measurement (DRWM) technique is a method of quickly validating the predicted bank worth of control rods and shutdown rods. The DRWM analytic method is based on three-dimensional, space-time kinetic simulations of the rapid rod movements. Its measurement data is processed with an advanced digital reactivity computer. DRWM has been used as the method of bank worth validation at numerous plant startups with excellent results. The process and methodology of DRWM are described, and the measurement results of using DRWM are presented.
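Digital reactivity computers conventionally invert the point kinetics equations to obtain reactivity from the measured neutron population n(t). A textbook form of that inversion, given here for orientation only (the paper's formulation, with its three-dimensional space-time corrections, is more elaborate), is

```latex
\rho(t) = \beta + \frac{\Lambda}{n(t)}\frac{dn}{dt}
        - \frac{\Lambda}{n(t)} \sum_{i=1}^{6} \lambda_i C_i(t),
\qquad
\frac{dC_i}{dt} = \frac{\beta_i}{\Lambda}\, n(t) - \lambda_i C_i(t),
```

with delayed-neutron fractions beta_i, precursor decay constants lambda_i, and prompt-neutron generation time Lambda.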
NASA Technical Reports Server (NTRS)
Yau, M.; Guarro, S.; Apostolakis, G.
1993-01-01
Dynamic Flowgraph Methodology (DFM) is a new approach developed to integrate the modeling and analysis of the hardware and software components of an embedded system. The objective is to complement the traditional approaches which generally follow the philosophy of separating out the hardware and software portions of the assurance analysis. In this paper, the DFM approach is demonstrated using the Titan 2 Space Launch Vehicle Digital Flight Control System. The hardware and software portions of this embedded system are modeled in an integrated framework. In addition, the time dependent behavior and the switching logic can be captured by this DFM model. In the modeling process, it is found that constructing decision tables for software subroutines is very time consuming. A possible solution is suggested. This approach makes use of a well-known numerical method, the Newton-Raphson method, to solve the equations implemented in the subroutines in reverse. Convergence can be achieved in a few steps.
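Newton-Raphson itself is simple to state: to solve f(x) = 0, iterate x_{k+1} = x_k - f(x_k)/f'(x_k). A minimal sketch of "running a subroutine in reverse" this way follows; the cubic example function is an assumption chosen only to show the mechanics, not a Titan 2 flight-control routine.

```python
# Minimal Newton-Raphson sketch: given a subroutine y = f(x), find the x
# that produces a target output. The cubic f below is an illustrative
# stand-in, not an actual flight-control subroutine.
def f(x: float) -> float:
    return x**3 + 2.0 * x - 5.0        # "forward" subroutine

def fprime(x: float) -> float:
    return 3.0 * x**2 + 2.0

def solve_reverse(target: float, x: float = 1.0, tol: float = 1e-12) -> float:
    for _ in range(50):
        err = f(x) - target
        if abs(err) < tol:
            break
        x -= err / fprime(x)           # Newton-Raphson update
    return x

x_star = solve_reverse(target=0.0)     # root of x^3 + 2x - 5
print(x_star, f(x_star))               # ~1.3283, ~0.0
```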
The Born Digital Graduate: Multiple Representations of and within Digital Humanities PhD Theses
ERIC Educational Resources Information Center
Webb, Sharon; Teehan, Aja; Keating, John
2013-01-01
This chapter examines the production and utilisation of digital tools to create and present a born-digital thesis, and in so doing, considers the changing function of traditional theses. It asks how (relatively) new technologies and methodologies should affect the representation and function of graduate scholarship in the Digital Humanities (DH),…
A Patient-Centered Framework for Evaluating Digital Maturity of Health Services: A Systematic Review
Callahan, Ryan; Darzi, Ara; Mayer, Erik
2016-01-01
Background Digital maturity is the extent to which digital technologies are used as enablers to deliver a high-quality health service. Extensive literature exists about how to assess the components of digital maturity, but it has not been used to design a comprehensive framework for evaluation. Consequently, the measurement systems that do exist are limited to evaluating digital programs within one service or care setting, meaning that digital maturity evaluation is not accounting for the needs of patients across their care pathways. Objective The objective of our study was to identify the best methods and metrics for evaluating digital maturity and to create a novel, evidence-based tool for evaluating digital maturity across patient care pathways. Methods We systematically reviewed the literature to find the best methods and metrics for evaluating digital maturity. We searched the PubMed database for all papers relevant to digital maturity evaluation. Papers were selected if they provided insight into how to appraise digital systems within the health service and if they indicated the factors that constitute or facilitate digital maturity. Papers were analyzed to identify methodology for evaluating digital maturity and indicators of digitally mature systems. We then used the resulting information about methodology to design an evaluation framework. Following that, the indicators of digital maturity were extracted and grouped into increasing levels of maturity and operationalized as metrics within the evaluation framework. Results We identified 28 papers as relevant to evaluating digital maturity, from which we derived 5 themes. The first theme concerned general evaluation methodology for constructing the framework (7 papers). The following 4 themes were the increasing levels of digital maturity: resources and ability (6 papers), usage (7 papers), interoperability (3 papers), and impact (5 papers). The framework includes metrics for each of these levels at each stage of the typical patient care pathway. Conclusions The framework uses a patient-centric model that departs from traditional service-specific measurements and allows for novel insights into how digital programs benefit patients across the health system. Trial Registration N/A PMID:27080852
Digital Initiatives and Metadata Use in Thailand
ERIC Educational Resources Information Center
SuKantarat, Wichada
2008-01-01
Purpose: This paper aims to provide information about various digital initiatives in libraries in Thailand and especially use of Dublin Core metadata in cataloguing digitized objects in academic and government digital databases. Design/methodology/approach: The author began researching metadata use in Thailand in 2003 and 2004 while on sabbatical…
Personal Name Identification in the Practice of Digital Repositories
ERIC Educational Resources Information Center
Xia, Jingfeng
2006-01-01
Purpose: To propose improvements to the identification of authors' names in digital repositories. Design/methodology/approach: Analysis of current name authorities in digital resources, particularly in digital repositories, and analysis of some features of existing repository applications. Findings: This paper finds that the variations of authors'…
Meaningful Engagements: Feminist Historiography and the Digital Humanities
ERIC Educational Resources Information Center
Enoch, Jessica; Bessette, Jean
2013-01-01
Recent surveys of feminist rhetorical historiography by Royster and Kirsch, Elizabeth Tasker and Frances B. Holt-Underwood, K. J. Rawson, Kathleen J. Ryan, and Jessica Enoch reveal that very few feminist historiographers have taken up digital methodologies or engaged digital humanist conversations. Thus while digital feminist scholars have…
Examining Factors of Engagement With Digital Interventions for Weight Management: Rapid Review
2017-01-01
Background Digital interventions for weight management provide a unique opportunity to target daily lifestyle choices and eating behaviors over a sustained period of time. However, recent evidence has demonstrated a lack of user engagement with digital health interventions, impacting on the levels of intervention effectiveness. Thus, it is critical to identify the factors that may facilitate user engagement with digital health interventions to encourage behavior change and weight management. Objective The aim of this study was to identify and synthesize the available evidence to gain insights about users’ perspectives on factors that affect engagement with digital interventions for weight management. Methods A rapid review methodology was adopted. The search strategy was executed in the following databases: Web of Science, PsycINFO, and PubMed. Studies were eligible for inclusion if they investigated users’ engagement with a digital weight management intervention and were published from 2000 onwards. A narrative synthesis of data was performed on all included studies. Results A total of 11 studies were included in the review. The studies were qualitative, mixed-methods, or randomized controlled trials. Some of the studies explored features influencing engagement when using a Web-based digital intervention, others specifically explored engagement when accessing a mobile phone app, and some looked at engagement after text message (short message service, SMS) reminders. Factors influencing engagement with digital weight management interventions were found to be both user-related (eg, perceived health benefits) and digital intervention–related (eg, ease of use and the provision of personalized information). Conclusions The findings highlight the importance of incorporating user perspectives during the digital intervention development process to encourage engagement. The review contributes to our understanding of what facilitates user engagement and points toward a coproduction approach for developing digital interventions for weight management. Particularly, it highlights the importance of thinking about user-related and digital tool–related factors from the very early stages of the intervention development process. PMID:29061557
Automated measurement of human body shape and curvature using computer vision
NASA Astrophysics Data System (ADS)
Pearson, Jeremy D.; Hobson, Clifford A.; Dangerfield, Peter H.
1993-06-01
A system to measure the surface shape of the human body has been constructed. The system uses a fringe pattern generated by projection of multi-stripe structured light. The optical methodology used is fully described and the algorithms used to process acquired digital images are outlined. The system has been applied to the measurement of the shape of the human back in scoliosis.
Comparative life cycle assessments: The case of paper and digital media
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bull, Justin G., E-mail: jgbull@gmail.com; Kozak, Robert A., E-mail: rob.kozak@ubc.ca
The consumption of the written word is changing, as media transitions from paper products to digital alternatives. We reviewed the life cycle assessment (LCA) research literature that compared the environmental footprint of digital and paper media. To validate the role of context in influencing LCA results, we assessed LCAs that did not compare paper and print, but focused on a product or component that is part of the Information and Communication Technology (ICT) sector. Using a framework that identifies problems in LCA conduct, we assessed whether the comparative LCAs were accurate expressions of the environmental footprints of paper and print. We hypothesized that the differences between the product systems that produce paper and digital media weaken LCA's ability to compare environmental footprints. We also hypothesized that the characteristics of ICT as an industrial sector weaken LCA as an environmental assessment methodology. We found that existing comparative LCAs offered problematic comparisons of paper and digital media for two reasons: the stark material differences between ICT products and paper products, and the unique characteristics of the ICT sector. We suggested that the context of the ICT sector, best captured by the concept of "Moore's Law", will continuously impede the ability of the LCA methodology to measure ICT products. -- Highlights: • We review the LCA research that compares paper and digital media. • We contrast the comparative LCAs with LCAs that examine only digital products. • Stark differences between paper and digital media weaken LCA findings. • Digital products in general challenge the LCA method's reliability. • The continuous innovation and global nature of digital products impede LCA methodology.
Cartography, new technologies and geographic education: theoretical approaches to research the field
NASA Astrophysics Data System (ADS)
Seneme do Canto, Tânia
2018-05-01
In order to understand the roles that digital mapping can play in cartographic and geographic education, this paper discusses the theoretical and methodological approach used in ongoing research on the education of geography teachers. To develop the study, we drew on the work of Lankshear and Knobel (2013) for a notion of new literacies that allows us to look at digital mapping practices from a sociocultural perspective. From this, we conclude that in order to understand the changes that digital cartography can foment in geography teaching, it is necessary to go beyond the substitution of means in the classroom and to explore what makes the new mapping practices different from others already consolidated in geography teaching. Therefore, we comment on some features of new forms of cartographic literacy that are in full development with digital technologies, but which are not determined solely by their use. The ideas of Kitchin and Dodge (2007) and Del Casino Junior and Hanna (2006) are also an important reference for the research. Methodologically, this approach helps us understand that, in seeking to comprehend maps and their meanings, irrespective of the medium used, we are dealing with a literacy process that is very particular and emergent, because it involves not only the characteristics of the map artifact and of the individual who produces or consumes it, but depends mainly on the diverse interconnections being built between them (map and individual) and the world.
Measuring user experience in digital gaming: theoretical and methodological issues
NASA Astrophysics Data System (ADS)
Takatalo, Jari; Häkkinen, Jukka; Kaistinen, Jyrki; Nyman, Göte
2007-01-01
There are innumerable concepts, terms and definitions for user experience. Few of them have a solid empirical foundation. In trying to understand user experience in interactive technologies such as computer games and virtual environments, reliable and valid concepts are needed for measuring relevant user reactions and experiences. Here we present our approach to creating theoretically and methodologically sound methods for quantifying the rich user experience in different digital environments. Our approach is based on the idea that the experience received from content presented with a specific technology is always the result of a complex psychological interpretation process, whose components should be understood. The main aim of our approach is to grasp the complex and multivariate nature of the experience and make it measurable. We present our two basic measurement frameworks, which have been developed and tested on a large data set (n=2182). The 15 measurement scales extracted from these models are applied to digital gaming with a head-mounted display and a table-top display. The results show how it is possible to map between experience, technology variables and the background of the user (e.g., gender). This approach can help to optimize, for example, the contents for specific viewing devices or viewing situations.
Conceptualizing Digital Literacies and Digital Ethics for Sustainability Education
ERIC Educational Resources Information Center
Brown, Susan A.
2014-01-01
Purpose: The purpose of this paper is to discuss the need for integrating a focus on digital literacies and digital ethics into sustainability education, proposing a conceptualization of these for sustainability education. Design/methodology/approach: The paper draws on relevant literature in the field of sustainability education and in the field…
Variable Star Signature Classification using Slotted Symbolic Markov Modeling
NASA Astrophysics Data System (ADS)
Johnston, K. B.; Peter, A. M.
2017-01-01
With the advent of digital astronomy, new benefits and new challenges have been presented to the modern day astronomer. No longer can the astronomer rely on manual processing; instead, the profession as a whole has begun to adopt more advanced computational means. This paper focuses on the construction and application of a novel time-domain signature extraction methodology and the development of a supporting supervised pattern classification algorithm for the identification of variable stars. A methodology for the reduction of stellar variable observations (time-domain data) into a novel feature space representation is introduced. The methodology presented will be referred to as Slotted Symbolic Markov Modeling (SSMM) and has a number of advantages which will be demonstrated to be beneficial, specifically to the supervised classification of stellar variables. It will be shown that the methodology outperformed a baseline standard methodology on a standardized set of stellar light curve data. The performance on a set of data derived from the LINEAR dataset will also be shown.
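The abstract gives SSMM only at a high level; the sketch below is a hedged illustration of the general idea (slotting an irregularly sampled light curve, symbolizing amplitudes, and using Markov transition statistics as features), not the authors' exact pipeline, and the slot width and symbol count are invented parameters:

```python
import numpy as np

def ssmm_features(times, mags, slot_width, n_symbols=4):
    """Illustrative SSMM-style feature extraction: slot the light curve onto
    a fixed time grid, quantize the slotted amplitudes into discrete symbols,
    and return the normalized symbol-to-symbol transition matrix."""
    # Slotting: average all observations falling into each time slot.
    slots = ((times - times.min()) // slot_width).astype(int)
    slotted = np.array([mags[slots == s].mean() for s in np.unique(slots)])
    # Symbolization: equal-width amplitude bins -> symbols 0..n_symbols-1.
    edges = np.linspace(slotted.min(), slotted.max(), n_symbols + 1)
    symbols = np.clip(np.digitize(slotted, edges) - 1, 0, n_symbols - 1)
    # First-order Markov transition counts between consecutive symbols.
    trans = np.zeros((n_symbols, n_symbols))
    for a, b in zip(symbols[:-1], symbols[1:]):
        trans[a, b] += 1
    row_sums = trans.sum(axis=1, keepdims=True)
    trans = np.divide(trans, row_sums, out=np.zeros_like(trans),
                      where=row_sums > 0)
    return trans.ravel()  # feed to any supervised classifier
```

The resulting fixed-length vector is what makes supervised classification straightforward, since light curves of very different lengths and samplings map to the same feature space.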
Variable Star Signature Classification using Slotted Symbolic Markov Modeling
NASA Astrophysics Data System (ADS)
Johnston, Kyle B.; Peter, Adrian M.
2016-01-01
With the advent of digital astronomy, new benefits and new challenges have been presented to the modern day astronomer. No longer can the astronomer rely on manual processing; instead, the profession as a whole has begun to adopt more advanced computational means. Our research focuses on the construction and application of a novel time-domain signature extraction methodology and the development of a supporting supervised pattern classification algorithm for the identification of variable stars. A methodology for the reduction of stellar variable observations (time-domain data) into a novel feature space representation is introduced. The methodology presented will be referred to as Slotted Symbolic Markov Modeling (SSMM) and has a number of advantages which will be demonstrated to be beneficial, specifically to the supervised classification of stellar variables. It will be shown that the methodology outperformed a baseline standard methodology on a standardized set of stellar light curve data. The performance on a set of data derived from the LINEAR dataset will also be shown.
Meta-analysis of digital game and study characteristics eliciting physiological stress responses.
van der Vijgh, Benny; Beun, Robbert-Jan; Van Rood, Maarten; Werkhoven, Peter
2015-08-01
Digital games have been used as stressors in a range of disciplines for decades. Nonetheless, the underlying characteristics of these stressors, and of the studies in which they were applied, are generally not recognized for their moderating effect on the measured physiological stress responses. We therefore conducted a meta-analysis of the effects of the characteristics of digital game stressors and of study design on heart rate and systolic and diastolic blood pressure, in studies carried out from 1976 to 2012. To assess the differing quality of study designs, a new scale, coined 'reliability of effect size', is developed and presented. The results show specific and consistent moderating functions of both game and study characteristics, on average accounting for around 43%, and in certain cases up to 57%, of the variance found in physiological stress responses. Possible cognitive and physiological processes underlying these moderating functions are discussed, and a new model integrating these processes with the moderating functions is presented. These findings indicate that a digital game stressor does not act as a stressor by virtue of being a game, but rather derives its stressor function from its characteristics and the methodology in which it is used. This finding, together with the size of the associated moderations, indicates the need for a standardization of digital game stressors.
wHospital: a web-based application with digital signature for drugs dispensing management.
Rossi, Lorenzo; Margola, Lorenzo; Manzelli, Vacia; Bandera, Alessandra
2006-01-01
wHospital is the result of an information technology research project based on a web-based application for managing hospital drug dispensing. Part of wHospital's backbone, and its key distinguishing characteristic, is the adoption of the digital signature system initially deployed by the Government of Lombardia, a Northern Italy region, through the distribution of smart cards to all healthcare and hospital staff. The developed system is a web-based application with a proposed Health Records Digital Signature (HReDS) handshake to comply with the national law and with the Joint Commission International standards. The prototype application, for a single hospital Operative Unit (OU), focused on data and process management related to drug therapy. Following a multi-faceted selection process, the Infective Disease OU of the Hospital in Busto Arsizio, Lombardia, was chosen for the development and prototype implementation. The project lead time, from user requirement analysis to training and deployment, was approximately 8 months. This paper highlights the applied project methodology, the system architecture, and the preliminary results achieved.
Design Of Combined Stochastic Feedforward/Feedback Control
NASA Technical Reports Server (NTRS)
Halyo, Nesim
1989-01-01
The methodology accommodates a variety of control structures and design techniques. In this methodology for combined stochastic feedforward/feedback control, the main objectives of the feedforward and feedback control laws are seen clearly. Inclusion of error-integral feedback, dynamic compensation, a rate-command control structure, and the like is an integral element of the methodology. Another advantage of the methodology is the flexibility to develop a variety of techniques for the design of feedback control with arbitrary structures: these include stochastic output feedback, multiconfiguration control, decentralized control, and frequency-domain and classical control methods. Control modes of the system include capture and tracking of localizer and glideslope, crab, decrab, and flare. By use of the recommended incremental implementation, the control laws were simulated on a digital computer and connected with a nonlinear digital simulation of the aircraft and its systems.
Semantic Modelling of Digital Forensic Evidence
NASA Astrophysics Data System (ADS)
Kahvedžić, Damir; Kechadi, Tahar
The reporting of digital investigation results is traditionally carried out in prose, and a large investigation may require successive communication of findings between different parties. Popular forensic suites aid the reporting process by storing provenance and positional data but do not automatically encode why the evidence is considered important. In this paper we introduce an evidence management methodology to encode the semantic information of evidence. A structured vocabulary of terms, an ontology, is used to model the results in a logical and predefined manner. The descriptions are application independent and automatically organised. The encoded descriptions aim to help the investigation in the tasks of report writing and evidence communication, and can be used in addition to existing evidence management techniques.
Separation of overlapping dental arch objects using digital records of illuminated plaster casts.
Yadollahi, Mohammadreza; Procházka, Aleš; Kašparová, Magdaléna; Vyšata, Oldřich; Mařík, Vladimír
2015-07-11
Plaster casts of individual patients are important for orthodontic specialists during the treatment process, and their analysis is still a standard diagnostic tool. However, the growing capabilities of information technology enable their replacement by digital models obtained by complex scanning systems. This paper presents the possibility of using a digital camera as a simple instrument for obtaining sets of digital images for analysis and evaluation of the treatment using appropriate mathematical tools of image processing. The methods studied in this paper include the segmentation of overlapping dental bodies and the use of different illumination sources to increase the reliability of the separation process. The circular Hough transform, region growing with multiple seed points, and the convex hull detection method are applied to the segmentation of orthodontic plaster cast images to identify dental arch objects and their sizes. The proposed algorithm improves the accuracy of segmentation of dental arch components using combined illumination sources. Dental arch parameters and distances between the canines and premolars for different segmentation methods were used as a measure to compare the results obtained. A new method of segmentation of overlapping dental arch components using digital records of illuminated plaster casts provides information with the precision required for orthodontic treatment. The distance between corresponding teeth was evaluated with a mean error of 1.38%, and the Dice similarity coefficient of the evaluated dental body boundaries reached 0.9436, with a false positive rate [Formula: see text] and false negative rate [Formula: see text].
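For the overlap metrics quoted at the end of the abstract, here is a small sketch using one common convention for the false positive and false negative rates; the record does not state which convention the authors used, so this is an assumption:

```python
import numpy as np

def dice_fpr_fnr(pred, truth):
    """Score a binary segmentation against a reference mask: Dice similarity
    coefficient plus false positive/negative rates (one common convention)."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    tp = np.logical_and(pred, truth).sum()
    fp = np.logical_and(pred, ~truth).sum()
    fn = np.logical_and(~pred, truth).sum()
    dice = 2 * tp / (2 * tp + fp + fn) if (tp + fp + fn) else 1.0
    fpr = fp / pred.sum() if pred.sum() else 0.0    # wrongly segmented fraction
    fnr = fn / truth.sum() if truth.sum() else 0.0  # missed reference fraction
    return dice, fpr, fnr
```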
NASA Astrophysics Data System (ADS)
Taboada, B.; Vega-Alvarado, L.; Córdova-Aguilar, M. S.; Galindo, E.; Corkidi, G.
2006-09-01
Characterization of multiphase systems occurring in fermentation processes is a time-consuming and tedious process when manual methods are used. This work describes a new semi-automatic methodology for the on-line assessment of diameters of oil drops and air bubbles occurring in a complex simulated fermentation broth. High-quality digital images were obtained from the interior of a mechanically stirred tank. These images were pre-processed to find segments of edges belonging to the objects of interest. The contours of air bubbles and oil drops were then reconstructed using an improved Hough transform algorithm which was tested in two, three and four-phase simulated fermentation model systems. The results were compared against those obtained manually by a trained observer, showing no significant statistical differences. The method was able to reduce the total processing time for the measurements of bubbles and drops in different systems by 21-50% and the manual intervention time for the segmentation procedure by 80-100%.
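The paper's improved Hough transform is not reproduced in the record; a minimal sketch of plain circular-Hough detection of bubbles and drops with OpenCV conveys the starting point. The file name and all parameter values below are placeholders to be tuned to the imaging setup:

```python
import cv2
import numpy as np

# Detect roughly circular bubbles/drops in a grayscale frame with the
# circular Hough transform.
frame = cv2.imread("broth_frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file
blurred = cv2.medianBlur(frame, 5)          # suppress noise before voting
circles = cv2.HoughCircles(
    blurred, cv2.HOUGH_GRADIENT, dp=1.2, minDist=15,
    param1=100,   # Canny high threshold used internally
    param2=30,    # accumulator threshold: lower -> more (possibly false) circles
    minRadius=5, maxRadius=60)
if circles is not None:
    for x, y, r in np.round(circles[0]).astype(int):
        print(f"object at ({x}, {y}), diameter {2 * r} px")
```

In practice the paper reconstructs contours from edge segments rather than relying on the plain transform, which is what allows overlapping objects to be separated.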
NASA Astrophysics Data System (ADS)
Singh, Mandeep; Khare, Kedar
2018-05-01
We describe a numerical processing technique that allows single-shot region-of-interest (ROI) reconstruction in image plane digital holographic microscopy with full pixel resolution. The ROI reconstruction is modelled as an optimization problem whose cost function consists of an L2-norm squared data-fitting term and a modified Huber penalty term, minimized alternately in an adaptive fashion. The technique can provide full-pixel-resolution complex-valued images of the selected ROI, which is not possible with the commonly used Fourier transform method. The technique can facilitate holographic reconstruction of individual cells of interest from large field-of-view digital holographic microscopy data. The complementary phase information, in addition to the absorption information already available from bright-field microscopy, can make the methodology attractive to the biomedical user community.
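The record names the cost function but not the exact alternating scheme, so the toy solver below only illustrates the flavor of minimizing an L2 data-fitting term plus a smooth Huber roughness penalty by gradient descent; the identity forward model, the boundary handling, and all parameter values are simplifications rather than the paper's method:

```python
import numpy as np

def huber_grad(t, delta):
    """Gradient of the Huber penalty: quadratic near zero, linear in the tails."""
    return np.where(np.abs(t) <= delta, t, delta * np.sign(t))

def roi_reconstruct(y, lam=0.1, delta=0.05, step=0.5, n_iter=200):
    """Toy gradient-descent solver for a cost of the general form
    0.5 * ||x - y||^2 + lam * sum(huber(Dx)). A real image-plane holographic
    ROI reconstruction would replace the identity forward model with the
    system's imaging operator and use the paper's adaptive alternation."""
    x = y.copy()
    for _ in range(n_iter):
        data_grad = x - y                             # grad of the L2 data term
        dx = np.diff(x, axis=1, append=x[:, -1:])     # horizontal differences
        dy = np.diff(x, axis=0, append=x[-1:, :])     # vertical differences
        gx, gy = huber_grad(dx, delta), huber_grad(dy, delta)
        # Approximate adjoint of the forward difference (negative divergence);
        # circular boundary handling is a simplification here.
        penalty_grad = -(gx - np.roll(gx, 1, axis=1)) - (gy - np.roll(gy, 1, axis=0))
        x -= step * (data_grad + lam * penalty_grad)
    return x
```

The Huber penalty is the key design choice: it smooths noise like a quadratic penalty in flat regions while behaving like a robust L1 penalty across cell boundaries.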
Ramírez-Miquet, Evelio E; Cabrera, Humberto; Grassi, Hilda C; de J Andrades, Efrén; Otero, Isabel; Rodríguez, Dania; Darias, Juan G
2017-08-01
This paper reports on the biospeckle processing of biological activity using a visualization scheme based upon digital imaging information technology. Activity related to bacterial growth in agar plates and to parasites affected by a drug is monitored via the speckle patterns generated by a coherent source incident on the microorganisms. We present experimental results to demonstrate the potential application of this methodology for following the activity in time. The digital imaging information technology is an alternative visualization enabling the study of speckle dynamics, which is correlated with the activity of bacteria and parasites. In this method, the changes in Red-Green-Blue (RGB) color component density are considered markers of bacterial growth and of parasite motility in the presence of a drug. The RGB data were used to generate a two-dimensional surface plot allowing an analysis of the color distribution on the speckle images. The proposed visualization is compared to the outcomes of the generalized differences and temporal difference methods. A quantification of the activity is performed using a parameterization of the temporal difference method. The adopted digital image processing technique has been found suitable for monitoring motility and morphological changes in the bacterial population over time and for detecting and distinguishing a short-term drug action on parasites.
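As a hedged sketch of the temporal-difference idea referred to above (not the authors' exact parameterization), activity maps can be built by accumulating frame-to-frame intensity changes per pixel, optionally per RGB channel:

```python
import numpy as np

def temporal_difference_map(stack):
    """Activity map from a time series of speckle images (frames, H, W):
    accumulate absolute frame-to-frame intensity changes per pixel, so that
    regions with active bacteria/parasites light up."""
    diffs = np.abs(np.diff(stack.astype(float), axis=0))
    return diffs.mean(axis=0)

def rgb_activity(stack_rgb):
    """Per-channel variant: treat changes in each RGB color component
    density as separate activity markers, in the spirit of the scheme
    described above (shape: frames, H, W, 3)."""
    return [temporal_difference_map(stack_rgb[..., c]) for c in range(3)]
```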
Resource selection for an interdisciplinary field: a methodology.
Jacoby, Beth E; Murray, Jane; Alterman, Ina; Welbourne, Penny
2002-10-01
The Health Sciences and Human Services Library of the University of Maryland developed and implemented a methodology to evaluate print and digital resources for social work. Although this methodology was devised for the interdisciplinary field of social work, the authors believe it may lend itself to resource selection in other interdisciplinary fields. The methodology was developed in response to the results of two separate surveys conducted in late 1999, which indicated improvement was needed in the library's graduate-level social work collections. Library liaisons evaluated the print collection by identifying forty-five locally relevant Library of Congress subject headings and then using these subjects or synonymous terms to compare the library's titles to collections of peer institutions, publisher catalogs, and Amazon.com. The collection also was compared to social work association bibliographies, ISI Journal Citation Reports, and major social work citation databases. An approval plan for social work books was set up to assist in identifying newly published titles. The library acquired new print and digital social work resources as a result of the evaluation, thus improving both print and digital collections for its social work constituents. Visibility of digital resources was increased by cataloging individual titles in aggregated electronic journal packages and listing each title on the library Web page.
Resource selection for an interdisciplinary field: a methodology*
Jacoby, Beth E.; Murray, Jane; Alterman, Ina; Welbourne, Penny
2002-01-01
The Health Sciences and Human Services Library of the University of Maryland developed and implemented a methodology to evaluate print and digital resources for social work. Although this methodology was devised for the interdisciplinary field of social work, the authors believe it may lend itself to resource selection in other interdisciplinary fields. The methodology was developed in response to the results of two separate surveys conducted in late 1999, which indicated improvement was needed in the library's graduate-level social work collections. Library liaisons evaluated the print collection by identifying forty-five locally relevant Library of Congress subject headings and then using these subjects or synonymous terms to compare the library's titles to collections of peer institutions, publisher catalogs, and Amazon.com. The collection also was compared to social work association bibliographies, ISI Journal Citation Reports, and major social work citation databases. An approval plan for social work books was set up to assist in identifying newly published titles. The library acquired new print and digital social work resources as a result of the evaluation, thus improving both print and digital collections for its social work constituents. Visibility of digital resources was increased by cataloging individual titles in aggregated electronic journal packages and listing each title on the library Web page. PMID:12398245
A simple landslide susceptibility analysis for hazard and risk assessment in developing countries
NASA Astrophysics Data System (ADS)
Guinau, M.; Vilaplana, J. M.
2003-04-01
In recent years, a number of techniques and methodologies have been developed for mitigating natural disasters. The complexity of these methodologies and the scarcity of material and data series justify the need for simple methodologies to obtain the information necessary for minimising the effects of catastrophic natural phenomena. Working with polygonal maps in a GIS allowed us to develop a simple methodology, applied in an area of 473 km2 in the Departamento de Chinandega (NW Nicaragua). This area was severely affected by a large number of landslides (mainly debris flows) triggered by the Hurricane Mitch rainfalls in October 1998. With the aid of aerial photography interpretation at 1:40,000 scale, enlarged to 1:20,000, and detailed field work, a landslide map at 1:10,000 scale was constructed. The failure zones of landslides were digitized in order to obtain a failure zone digital map. A terrain unit digital map, in which a series of physical-environmental terrain factors are represented, was also used. Dividing the studied area into two zones (A and B) with homogeneous physical and environmental characteristics allowed us to develop the proposed methodology and to validate it. In zone A, the failure zone digital map was superimposed onto the terrain unit digital map to establish the relationship between the different terrain factors and the failure zones. The numerical expression of this relationship enables us to classify the terrain by its landslide susceptibility; a sketch of one such numerical relationship is given below. In zone B, this numerical relationship was employed to obtain a landslide susceptibility map, obviating the need for a failure zone map. The validity of the methodology can be tested in this area by using the degree of superposition of the susceptibility map and the failure zone map. The implementation of the methodology in tropical countries with physical and environmental characteristics similar to those of the study area makes it possible to carry out landslide susceptibility analysis in areas where landslide records do not exist. This analysis is essential to landslide hazard and risk assessment, which is necessary to determine actions for mitigating landslide effects, e.g. land planning, emergency aid actions, etc.
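The abstract does not give the authors' numerical relationship; the frequency ratio below is one standard way to express such a relationship and is shown only as an illustration:

```python
import numpy as np

def susceptibility_weights(unit_ids, failure_mask):
    """Zone-A calibration step: for each terrain unit, compare the unit's
    share of failure-zone pixels with its share of the study area (a
    frequency ratio). Ratios > 1 flag units that fail more often than
    chance. Assumes failure_mask contains at least one failure pixel."""
    weights = {}
    total_area = unit_ids.size
    total_failures = failure_mask.sum()
    for u in np.unique(unit_ids):
        in_unit = unit_ids == u
        fail_share = failure_mask[in_unit].sum() / total_failures
        area_share = in_unit.sum() / total_area
        weights[u] = fail_share / area_share
    return weights

# Applying the calibrated weights to zone B, where no inventory exists:
# susceptibility_B = np.vectorize(weights.get)(unit_ids_B)
```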
NASA Technical Reports Server (NTRS)
Campbell, B. H.
1974-01-01
A methodology developed for the balanced design of spacecraft subsystems, interrelating cost, performance, safety, and schedule considerations, was refined. The methodology consists of a two-step process: the first step selects all hardware designs that satisfy the given performance and safety requirements; the second step estimates the cost and schedule required to design, build, and operate each spacecraft design. Using this methodology to develop a systems cost/performance model allows the user of such a model to establish specific designs and the related costs and schedule. The user is able to determine the sensitivity of design, costs, and schedules to changes in requirements. The resulting systems cost/performance model is described and implemented as a digital computer program.
Investigating Encrypted Material
NASA Astrophysics Data System (ADS)
McGrath, Niall; Gladyshev, Pavel; Kechadi, Tahar; Carthy, Joe
When encrypted material is discovered during a digital investigation and the investigator cannot decrypt the material, then s/he is faced with the problem of how to determine the evidential value of the material. This research proposes a methodology for extracting probative value from the encrypted file of a hybrid cryptosystem. The methodology also incorporates a technique for locating the original plaintext file. Since child pornography (KP) images and terrorist-related information (TI) are transmitted in encrypted format, the digital investigator must ask the question Cui bono? - who benefits, or who is the recipient? By doing this, the scope of the digital investigation can be extended to reveal the intended recipient.
Digital Libraries: Situating Use in Changing Information Infrastructure.
ERIC Educational Resources Information Center
Bishop, Ann Peterson; Neumann, Laura J.; Star, Susan Leigh; Merkel, Cecelia; Ignacio, Emily; Sandusky, Robert J.
2000-01-01
Reviews empirical studies about how digital libraries evolve for use in scientific and technical work based on the Digital Libraries Initiative (DLI) at the University of Illinois. Discusses how users meet infrastructure and document disaggregation; describes use of the DLI testbed of full text journal articles; and explains research methodology.…
Creating and Sharing: Teens' Information Practices in Digital Communities
ERIC Educational Resources Information Center
Harlan, Mary Ann; Bruce, Christine; Lupton, Mandy
2014-01-01
Introduction: In a connected world youth are participating in digital content creating communities. This paper introduces a description of teens' information practices in digital content creating and sharing communities. Method: The research design was a constructivist grounded theory methodology. Seventeen interviews with eleven teens were…
Knowledge Organisation Systems in North American Digital Library Collections
ERIC Educational Resources Information Center
Shiri, Ali; Chase-Kruszewski, Sarah
2009-01-01
Purpose: The purpose of this paper is to report an investigation into the types of knowledge organisation systems (KOSs) utilised in North American digital library collections. Design/methodology/approach: The paper identifies, analyses and deep scans online North American hosted digital libraries. It reviews the literature related to the…
1977-08-24
exceeded a million rubles. POLAND: SOME METHODOLOGICAL REMARKS RELATING TO THE FORECASTING MODEL OF COMPUTER DEVELOPMENT (Warsaw, INFORMATYKA, in Polish)... PROCESSING SYSTEMS (Warsaw, INFORMATYKA, in Polish, Vol 11, No 10, Oct 76, pp 19-20; SEKULA, ZOFIA, Wroclaw). [Abstract] The author presents critical remarks... TO ODRA 1300 SYSTEM (Warsaw, INFORMATYKA, in Polish, Vol 11, No 9, Sep 76, pp 1-4; BZDULA, CZESLAW, Research and Development Center of MERA-ELWRO Digital
NASA Astrophysics Data System (ADS)
Molina-Viedma, Ángel J.; López-Alba, Elías; Felipe-Sesé, Luis; Díaz, Francisco A.
2017-10-01
In recent years, many efforts have been made to exploit full-field measurement optical techniques for modal identification. Three-dimensional digital image correlation using high-speed cameras has been extensively employed for this purpose. Modal identification algorithms are applied to process the frequency response functions (FRF), which relate the displacement response of the structure to the excitation force. However, one of the most common tests for modal analysis involves base motion excitation of a structural element instead of force excitation. In this case, the relationship between response and excitation is typically based on displacements, which are known as transmissibility functions. In this study, a methodology for experimental modal analysis using high-speed 3D digital image correlation and base motion excitation tests is proposed. In particular, a cantilever beam was excited from its base with a random signal, using a clamped edge joint. Full-field transmissibility functions were obtained along the beam and converted into FRF for proper identification, using a single degree-of-freedom theoretical conversion. Subsequently, modal identification was performed using a circle-fit approach. The proposed methodology facilitates the management of the typically large number of data points involved in the DIC measurement during modal identification. Moreover, it was possible to determine the natural frequencies, damping ratios and full-field mode shapes without requiring any additional tests. Finally, the results were experimentally validated by comparing them with those obtained by employing traditional accelerometers, analytical models and finite element method analyses. The comparison was performed by using the quantitative indicator modal assurance criterion. The results showed a high level of correspondence, consolidating the proposed experimental methodology.
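A minimal sketch of the transmissibility-estimation step, assuming displacement time series are available for the base and for one measured point (e.g. one DIC data point); the SDOF transmissibility-to-FRF conversion and the circle-fit identification described above are not shown:

```python
import numpy as np
from scipy import signal

def transmissibility(base_disp, point_disp, fs, nperseg=1024):
    """H1-type estimate of the transmissibility between the base motion
    (input) and the displacement of one measured point (output):
    cross-spectral density of the two signals over the auto-spectral
    density of the base excitation."""
    f, Pxy = signal.csd(base_disp, point_disp, fs=fs, nperseg=nperseg)
    _, Pxx = signal.welch(base_disp, fs=fs, nperseg=nperseg)
    return f, Pxy / Pxx
```

Repeating this for every DIC point yields the full-field transmissibility functions the paper works with, one complex-valued curve per measurement point.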
Evaluation of color grading impact in restoration process of archive films
NASA Astrophysics Data System (ADS)
Fliegel, Karel; Vítek, Stanislav; Páta, Petr; Janout, Petr; Myslík, Jiří; Pecák, Josef; Jícha, Marek
2016-09-01
Color grading of archive films is a very particular task in the process of their restoration. The ultimate goal of color grading here is to achieve the same look of the movie as intended at the time of its first presentation. The role of the expert restorer, the expert group and a digital colorist in this complicated process is to find the optimal settings of the digital color grading system so that the resulting image look is as close as possible to the estimate of the original reference release print adjusted by the expert group of cinematographers. A methodology for subjective assessment of perceived differences between the outcomes of color grading is introduced, and results of a subjective study are presented. Techniques for objective assessment of perceived differences are discussed, and their performance is evaluated using ground truth obtained from the subjective experiment. In particular, a solution based on a calibrated digital single-lens reflex camera and subsequent analysis of image features captured from the projection screen is described. The system, based on our previous work, is further developed so that it can be used for the analysis of projected images. It allows assessing color differences in these images and predicting their impact on the perceived difference in image look.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guildenbecher, Daniel Robert; Munz, Elise Dahnke; Farias, Paul Abraham
2015-12-01
Digital in-line holography and plenoptic photography are two techniques for single-shot, volumetric measurement of 3D particle fields. Here we present a preliminary comparison of the two methods by applying plenoptic imaging to experimental configurations that have been previously investigated with digital in-line holography. These experiments include the tracking of secondary droplets from the impact of a water drop on a thin film of water and tracking of pellets from a shotgun. Both plenoptic imaging and digital in-line holography successfully quantify the 3D nature of these particle fields. This includes measurement of the 3D particle position, individual particle sizes, and three-component velocity vectors. For the initial processing methods presented here, both techniques give out-of-plane positional accuracy of approximately 1-2 particle diameters. For a fixed image sensor, digital holography achieves higher effective in-plane spatial resolutions. However, collimated and coherent illumination makes holography susceptible to image distortion through index of refraction gradients, as demonstrated in the shotgun experiments. On the other hand, plenoptic imaging allows for a simpler experimental configuration. Furthermore, due to the use of diffuse, white-light illumination, plenoptic imaging is less susceptible to image distortion in the shotgun experiments. Additional work is needed to better quantify sources of uncertainty, particularly in the plenoptic experiments, as well as develop data processing methodologies optimized for the plenoptic measurement.
Lee, Kee Hyuck; Yoo, Sooyoung; Shin, HoGyun; Baek, Rong-Min; Chung, Chin Youb; Hwang, Hee
2013-01-01
It is reported that digital dashboard systems in hospitals provide a user interface (UI) that can centrally manage and retrieve various patient-related information in a single screen, support the decision-making of medical professionals on a real-time basis by integrating scattered medical information systems and core work flows, enhance the competence and decision-making ability of medical professionals, and reduce the probability of misdiagnosis. However, the hospital digital dashboard systems reported to date have some limitations when medical professionals use them for the general care of inpatients, because they were limited to the work processes of certain departments or were developed to improve specific disease-related indicators. Seoul National University Bundang Hospital developed a new concept of EMR system to overcome such limitations. The system allows medical professionals to easily access all information on inpatients and to effectively retrieve important information from any part of the hospital by displaying inpatient information in the form of a digital dashboard. In this study, we introduce the structure, development methodology, and usage of this new system.
Giansanti, Daniele; Morelli, Sandra; Maccioni, Giovanni; Guerriero, Lorenzo; Bedini, Remo; Pepe, Gennaro; Colombo, Cesare; Borghi, Gabriella; Macellari, Velio
2009-01-01
Due to major advances in information technology, telemedicine applications are ready for widespread use. Nonetheless, to allow their diffusion in National Health Care Systems (NHCSs), specific methodologies of health technology assessment (HTA) should be used to assess standardization, overall quality, interoperability, and legal, economic, and cost-benefit aspects. One of the limits to the diffusion of digital tele-echocardiography (T-E) applications in the NHCS is the lack of a specific methodology for HTA. In the present study, a solution offering a structured HTA of T-E products was designed. The methodology also assured the definition of standardized quality levels for the application. The first level represents the minimum level of acceptance; the other levels are accessory levels useful for a more accurate assessment of the product. The methodology proved useful in rationalizing the process of standardization and received a high degree of acceptance from the subjects involved in the study.
Patterns of Inclusion: Fostering Digital Citizenship through Hybrid Education
ERIC Educational Resources Information Center
Pedersen, Alex Young; Nørgaard, Rikke Toft; Köppe, Christian
2018-01-01
Reconsidering the concept of digital citizenship and the essential component of education, the authors propose that the concept of Hybrid Education may serve both as a guideline for the utilization of digital technologies in education and as a methodology for fostering new forms of participation, inclusion and engagement in society. Following T.…
Digital Libraries and Repositories in India: An Evaluative Study
ERIC Educational Resources Information Center
Mittal, Rekha; Mahesh, G.
2008-01-01
Purpose: The purpose of this research is to identify and evaluate the collections within digital libraries and repositories in India available in the public domain. Design/methodology/approach: The digital libraries and repositories were identified through a study of the literature, as well as internet searching and browsing. The resulting digital…
Digital Literacy and New Technological Perspectives
ERIC Educational Resources Information Center
Feola, Elvia Ilaria
2016-01-01
This paper aims to reflect on the implications and challenges that experts in the field have to deal with when evaluating performance in the use of digital technologies in teaching. The argument stems from a contextual and social assessment, and then proceeds to an application and methodological connotation of digital literacy…
IoT-Forensics Meets Privacy: Towards Cooperative Digital Investigations
Lopez, Javier
2018-01-01
IoT-Forensics is a novel paradigm for the acquisition of electronic evidence whose operation is conditioned by the peculiarities of the Internet of Things (IoT) context. As a branch of computer forensics, this discipline respects the most basic forensic principles of preservation, traceability, documentation, and authorization. The digital witness approach also promotes such principles in the context of the IoT while allowing personal devices to cooperate in digital investigations by voluntarily providing electronic evidence to the authorities. However, this solution is highly dependent on the willingness of citizens to collaborate and they may be reluctant to do so if the sensitive information within their personal devices is not sufficiently protected when shared with the investigators. In this paper, we provide the digital witness approach with a methodology that enables citizens to share their data with some privacy guarantees. We apply the PRoFIT methodology, originally defined for IoT-Forensics environments, to the digital witness approach in order to unleash its full potential. Finally, we show the feasibility of a PRoFIT-compliant digital witness with two use cases. PMID:29414864
IoT-Forensics Meets Privacy: Towards Cooperative Digital Investigations.
Nieto, Ana; Rios, Ruben; Lopez, Javier
2018-02-07
IoT-Forensics is a novel paradigm for the acquisition of electronic evidence whose operation is conditioned by the peculiarities of the Internet of Things (IoT) context. As a branch of computer forensics, this discipline respects the most basic forensic principles of preservation, traceability, documentation, and authorization. The digital witness approach also promotes such principles in the context of the IoT while allowing personal devices to cooperate in digital investigations by voluntarily providing electronic evidence to the authorities. However, this solution is highly dependent on the willingness of citizens to collaborate and they may be reluctant to do so if the sensitive information within their personal devices is not sufficiently protected when shared with the investigators. In this paper, we provide the digital witness approach with a methodology that enables citizens to share their data with some privacy guarantees. We apply the PRoFIT methodology, originally defined for IoT-Forensics environments, to the digital witness approach in order to unleash its full potential. Finally, we show the feasibility of a PRoFIT-compliant digital witness with two use cases.
Stoker, Jason M.; Tyler, Dean J.; Turnipseed, D. Phil; Van Wilson, K.; Oimoen, Michael J.
2009-01-01
Hurricane Katrina was one of the largest natural disasters in U.S. history. Due to the sheer size of the affected areas, an unprecedented regional analysis at very high resolution and accuracy was needed to properly quantify and understand the effects of the hurricane and the storm tide. Many disparate sources of lidar data were acquired and processed for varying environmental reasons by pre- and post-Katrina projects. The datasets were in several formats and projections and were processed to varying phases of completion, and as a result the task of producing a seamless digital elevation dataset required a high level of coordination, research, and revision. To create a seamless digital elevation dataset, many technical issues had to be resolved before producing the desired 1/9-arc-second (3-meter) grid needed as the map base for projecting the Katrina peak storm tide throughout the affected coastal region. This report presents the methodology that was developed to construct seamless digital elevation datasets from multipurpose, multi-use, and disparate lidar datasets, and describes an easily accessible Web application for viewing the maximum storm tide caused by Hurricane Katrina in southeastern Louisiana, Mississippi, and Alabama.
Chain of evidence generation for contrast enhancement in digital image forensics
NASA Astrophysics Data System (ADS)
Battiato, Sebastiano; Messina, Giuseppe; Strano, Daniela
2010-01-01
The quality of the images obtained by digital cameras has improved greatly since the early days of digital photography. Unfortunately, it is not unusual in image forensics to find wrongly exposed pictures. This is mainly due to obsolete techniques or old technologies, but also to backlight conditions. To bring out invisible details, a stretching of the image contrast is obviously required. The forensics rules for producing evidence require complete documentation of the processing steps, enabling the replication of the entire process. The automation of enhancement techniques is thus quite difficult and needs to be carefully documented. This work presents an automatic procedure to find contrast enhancement settings, allowing both image correction and automatic script generation. The technique is based on a preprocessing step which extracts the features of the image and selects correction parameters. The parameters are then saved through JavaScript code that is used in the second step of the approach to correct the image. The generated script is Adobe Photoshop compliant (Photoshop being widely used in image forensics analysis), thus permitting the replication of the enhancement steps. Experiments on a dataset of images are also reported, showing the effectiveness of the proposed methodology.
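A toy version of the two-step idea, with the parameter selection reduced to histogram clipping and the audit trail written as JSON rather than the Photoshop-compliant JavaScript the paper generates:

```python
import json
import numpy as np

def contrast_params(gray, clip=0.01):
    """Select black and white points by clipping a small fraction of the
    darkest and brightest pixels; gamma is left at 1.0 in this sketch."""
    lo, hi = np.quantile(gray, [clip, 1.0 - clip])
    return {"black_point": float(lo), "white_point": float(hi), "gamma": 1.0}

def save_audit_trail(params, path="enhancement_params.json"):
    """Persist the chosen settings so the exact enhancement can be replayed
    later -- the chain-of-evidence requirement. The paper emits them as a
    Photoshop-compliant script instead of JSON."""
    with open(path, "w") as fh:
        json.dump(params, fh, indent=2)
```

Whatever the serialization format, the point is the same: every automatically chosen parameter is recorded before the image is touched, so the enhancement is reproducible in court.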
Reconstructing the past: methods and techniques for the digital restoration of fossils
2016-01-01
During fossilization, the remains of extinct organisms are subjected to taphonomic and diagenetic processes. As a result, fossils show a variety of preservational artefacts, which can range from small breaks and cracks, disarticulation and fragmentation, to the loss and deformation of skeletal structures and other hard parts. Such artefacts can present a considerable problem, as the preserved morphology of fossils often forms the basis for palaeontological research. Phylogenetic and taxonomic studies, inferences on appearance, ecology and behaviour and functional analyses of fossil organisms strongly rely on morphological information. As a consequence, the restoration of fossil morphology is often a necessary prerequisite for further analyses. Facilitated by recent computational advances, virtual reconstruction and restoration techniques offer versatile tools to restore the original morphology of fossils. Different methodological steps and approaches, as well as software are outlined and reviewed here, and advantages and disadvantages are discussed. Although the complexity of the restorative processes can introduce a degree of interpretation, digitally restored fossils can provide useful morphological information and can be used to obtain functional estimates. Additionally, the digital nature of the restored models can open up possibilities for education and outreach and further research. PMID:27853548
Digital methods for the history of psychology: Introduction and resources.
Fox Lee, Shayna
2016-02-01
At the York University Digital History of Psychology Laboratory, we have been working on projects that explore what digital methodologies have to offer historical research in our field. This piece provides perspective on the history and theory of digital history, as well as introductory resources for those who are curious about incorporating these methods into their own work.
Applying the SERENITY Methodology to the Domain of Trusted Electronic Archiving
NASA Astrophysics Data System (ADS)
Porekar, Jan; Klobučar, Tomaž; Šaljič, Svetlana; Gabrijelčič, Dušan
We present the application of the SERENITY methodology to the domain of long-term trusted electronic archiving, sometimes also referred to as trusted digital notary services. We address the SERENITY approach from the point of view of a company providing security solutions in the mentioned domain and adopt the role of a solution developer. In this chapter we show a complete vertical slice through the trusted archiving domain, providing: (i) the relevant S&D properties, (ii) the S&D classes and S&D patterns on both the organizational and technical levels, and (iii) a description of how S&D patterns are integrated into a trusted long-term archiving service using the SERENITY Run-Time Framework (SRF). At the end of the chapter we put in perspective what a solution developer can learn from the process of capturing security knowledge according to the SERENITY methodology, and we discuss how existing implementations of archiving services can benefit from the SERENITY approach in the future.
Fuzzy Logic-Based Audio Pattern Recognition
NASA Astrophysics Data System (ADS)
Malcangi, M.
2008-11-01
Audio and audio-pattern recognition is becoming one of the most important technologies for automatically controlling embedded systems. Fuzzy logic may be the most important enabling methodology due to its ability to rapidly and economically model such applications. An audio and audio-pattern recognition engine based on fuzzy logic has been developed for use in very low-cost and deeply embedded systems to automate human-to-machine and machine-to-machine interaction. This engine consists of simple digital signal-processing algorithms for feature extraction and normalization, and a set of pattern-recognition rules manually tuned or automatically tuned by a self-learning process.
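A hedged sketch of the rule style such an engine might use; the feature names, membership breakpoints, and rules here are invented for illustration, not taken from the paper:

```python
def trimf(x, a, b, c):
    """Triangular membership function rising from a to b and falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def classify_frame(energy, zero_crossings):
    """Two toy rules over normalized frame features, combined with min for
    AND -- the usual Mamdani-style inference a low-cost embedded engine
    can afford to evaluate per frame."""
    loud = trimf(energy, 0.4, 0.8, 1.0)
    noisy = trimf(zero_crossings, 0.5, 0.8, 1.0)
    quiet = trimf(energy, 0.0, 0.1, 0.4)
    speech_score = min(loud, 1.0 - noisy)  # rule 1: loud AND not noisy -> speech
    silence_score = quiet                  # rule 2: quiet -> silence
    return {"speech": speech_score, "silence": silence_score}
```

Because each rule is a handful of comparisons and a min/max, this kind of inference fits comfortably on the low-cost processors the abstract targets.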
LMI design method for networked-based PID control
NASA Astrophysics Data System (ADS)
Souza, Fernando de Oliveira; Mozelli, Leonardo Amaral; de Oliveira, Maurício Carvalho; Palhares, Reinaldo Martinez
2016-10-01
In this paper, we propose a methodology for the design of networked PID controllers for second-order delayed processes using linear matrix inequalities. The proposed procedure takes into account time-varying delay on the plant, time-varying delays induced by the network, and packet dropouts. The design is carried out entirely using a continuous-time model of the closed-loop system, where time-varying delays are used to represent the sampling and holding occurring in a discrete-time digital PID controller.
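The paper's delay-dependent LMIs are more elaborate than anything reproducible from the abstract; the snippet below only shows the basic mechanics of posing and solving a Lyapunov-type LMI with cvxpy, on an invented second-order plant, not the paper's conditions:

```python
import cvxpy as cp
import numpy as np

# Feasibility of the most basic continuous-time stability LMI,
# A^T P + P A < 0 with P > 0. The paper's design adds Lyapunov-Krasovskii
# terms for the time-varying plant and network delays and packet dropouts.
A = np.array([[0.0, 1.0], [-2.0, -3.0]])   # example second-order plant
n = A.shape[0]
P = cp.Variable((n, n), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(n),
               A.T @ P + P @ A << -eps * np.eye(n)]
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()
print(prob.status, "P =", None if P.value is None else np.round(P.value, 3))
```

A feasible P certifies stability; in LMI-based controller design the controller gains enter the matrix inequality as additional decision variables solved for simultaneously.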
1987-06-01
evaluation and chip layout planning for VLSI digital systems. A high-level applicative (functional) language, implemented at UCLA, allows combining of... operating system. The complexity of VLSI requires the application of CAD tools at all levels of the design process. In order to be effective, these tools must be adaptive to the specific design. In this project we studied a design method based on the use of applicative languages.
ERIC Educational Resources Information Center
Kimura, Tadamasa
2010-01-01
The objective of this dissertation is to explore the socio-cultural contextualization of the digital divide in Japanese society. I undertake this task by developing a theoretical and methodological framework based on the notion of "culture as models," while explicating the cultural dimensions of the digital divide and the dynamics of…
Enhancing a Core Journal Collection for Digital Libraries
ERIC Educational Resources Information Center
Kovacevic, Ana; Devedzic, Vladan; Pocajt, Viktor
2010-01-01
Purpose: This paper aims to address the problem of enhancing the selection of titles offered by a digital library, by analysing the differences in these titles when they are cited by local authors in their publications and when they are listed in the digital library offer. Design/methodology/approach: Text mining techniques were used to identify…
Early Learnings from the National Library of New Zealand's National Digital Heritage Archive Project
ERIC Educational Resources Information Center
Knight, Steve
2010-01-01
Purpose: The purpose of this paper is to provide a brief description of the digital preservation programme at the National Library of New Zealand. Design/methodology/approach: Following a description of the legislative and strategic context for digital preservation in New Zealand, details are provided of the system for the National Digital…
ERIC Educational Resources Information Center
Alhajri, Salman
2016-01-01
Purpose: This paper investigates the effectiveness of teaching methods used in graphic design pedagogy in both analogue and digital education systems. Methodology and approach: The paper is based on a theoretical study using a qualitative, case study approach. A comparison between the digital teaching methods and traditional teaching methods was…
Responding to the Call: Arts Methodologies Informing 21st Century Literacies
ERIC Educational Resources Information Center
Huber, Adrienne; Dinham, Judith; Chalk, Beryl
2015-01-01
With the advent of digital technologies, a new adventure began. How the world works has changed, and we cannot go back. Digitally savvy children born in the digital age (i.e., DigiKids) are interacting with and responding to rich, curatable multimodal communications as part of their daily-lived experience. For DigiKids, traditional text-based…
ERIC Educational Resources Information Center
White, Andy
2005-01-01
Purpose: This paper aims to use two case studies of digital archives designed by library and information professionals and historians to highlight the twin issues of academic authenticity and accuracy of digital representations. Design/methodology/approach: Using secondary literature, the author established a hypothesis about the way in which…
Real time flight simulation methodology
NASA Technical Reports Server (NTRS)
Parrish, E. A.; Cook, G.; Mcvey, E. S.
1977-01-01
Substitutional methods for digitization, input signal-dependent integrator approximations, and digital autopilot design were developed. The software framework of a simulator design package is described. Included are subroutines for iterative designs of simulation models and a rudimentary graphics package.
Insertion of GaAs MMICs into EW systems
NASA Astrophysics Data System (ADS)
Schineller, E. R.; Pospishil, A.; Grzyb, J.
1989-09-01
Development activities on a microwave/mm-wave monolithic IC (MIMIC) program are described, as well as the methodology for inserting these GaAs IC chips into several EW systems. The generic EW chip set developed on the MIMIC program consists of 23 broadband chip types, including amplifiers, oscillators, mixers, switches, variable attenuators, power dividers, and power combiners. These chips are being designed for fabrication using the multifunction self-aligned gate process. The benefits from GaAs IC insertion are quantified by a comparison of hardware units fabricated with existing MIC and digital ECL technology and the same units manufactured with monolithic technology. It is found that major improvements in cost, reliability, size, weight, and performance can be realized. Examples illustrating the methodology for technology insertion are presented.
Learning by Peers: An Alternative Learning Model for Digital Inclusion of Elderly People
NASA Astrophysics Data System (ADS)
de Sales, Márcia Barros; Silveira, Ricardo Azambuja; de Sales, André Barros; de Cássia Guarezi, Rita
This paper presents a model of digital inclusion for elderly people using a peer-learning methodology. The model's goal was to value and promote the potential capabilities of elderly people by preparing some of them to instruct other elderly people in dealing with computers and in using several software tools and internet services. The project involved 66 volunteering elderly people, of whom 19 acted as multipliers and the others as students. The process was observed through the empirical technique of interaction workshops. This technique was chosen because it demands direct participation of the people involved in real interaction. We worked with peer learning to facilitate the communication between elderly learners and elderly multipliers, owing to the similarity in language, rhythm and life history, and because they felt more secure developing the activities with people in their age group. This multiplying model can be used in centers, organizations and other entities that work with elderly people for their digital inclusion.
A microprocessor application to a strapdown laser gyro navigator
NASA Technical Reports Server (NTRS)
Giardina, C.; Luxford, E.
1980-01-01
The replacement of analog circuit control loops for laser gyros (path length control, cross axis temperature compensation loops, dither servo, and current regulators) with digital filters residing in microcomputers is addressed. In addition to the control loops, a discussion is given of applying the microprocessor hardware to compensation for coning and sculling motion, where simple algorithms are processed at high speeds to compensate component output data (digital pulses) for linear and angular vibration motions. Highlights are given of the methodology and system approaches used in replacing the differential equations describing the analog system with the mechanized difference equations of the microprocessor. Standard one-for-one frequency-domain techniques are employed in replacing analog transfer functions by their transform counterparts; a sketch of such a substitution is given below. Direct digital design techniques are also discussed along with their associated benefits. Time and memory loading analyses are also summarized, as well as signal and microprocessor architecture. Trade-offs in algorithm, mechanization, time/memory loading, accuracy, and microprocessor architecture are also given.
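A minimal sketch of such a one-for-one substitution using SciPy's bilinear (Tustin) discretization, on an invented first-order analog loop filter; the pole and sample period are illustrative values, not the gyro system's:

```python
from scipy import signal

# One-for-one frequency-domain replacement of an analog loop filter by its
# discrete counterpart: a first-order lag 1/(s + a) discretized with the
# bilinear (Tustin) transform at the loop's sample period.
a = 50.0          # rad/s, illustrative analog pole
T = 1e-3          # s, illustrative sample period
num, den = [1.0], [1.0, a]
numd, dend, _ = signal.cont2discrete((num, den), dt=T, method="bilinear")
print("H(z) numerator:", numd.ravel(), "denominator:", dend)
```

The resulting H(z) coefficients are exactly what a microprocessor implements as a difference equation in place of the analog circuit.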
User Requirements Analysis For Digital Library Application Using Quality Function Deployment.
NASA Astrophysics Data System (ADS)
Wulandari, Lily; Sularto, Lana; Yusnitasari, Tristyanti; Ikasari, Diana
2017-03-01
This study attempts to build a Smart Digital Library to be used by the wider community wherever they are. The system is built in the form of a Smart Digital Library portal which uses a semantic similarity method to search journals, articles or books by title or author name. This method is also used to automatically determine recommended books for visitors of the Smart Digital Library based on previous readers' testimonials. The steps taken in the development of the Smart Digital Library system are the analysis, design, testing, and implementation phases. The analysis phase uses WebQual to prepare the instruments distributed to respondents, and the data obtained from the respondents are processed using Quality Function Deployment (QFD). The analysis phase has the purpose of identifying consumer needs and technical requirements. The analysis was performed on the web digital libraries of Gunadarma University, the Bogor Institute of Agriculture, the University of Indonesia, and others. The questionnaire was distributed to 200 respondents. The research methodology begins with the collection of user requirements and their analysis using QFD. Application design is funded by the government through the Featured Universities Research program of the Directorate General of Higher Education (DIKTI). The conclusions identify the consumer requirements of the digital library application: the consumer requirements comprise 13 elements, and the engineering characteristics comprise 25 elements. The digital library application is therefore designed according to these findings, eliminating features that the QFD House of Quality shows are not needed.
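A sketch of the core QFD House-of-Quality arithmetic, with invented placeholder weights and relationship strengths rather than the study's survey data:

```python
import numpy as np

# Technical importance of each engineering characteristic = sum over
# consumer requirements of (requirement weight x relationship strength).
weights = np.array([5, 3, 4])              # 3 of the study's 13 requirements
relations = np.array([[9, 3, 0, 1],        # 9/3/1/0 = strong/medium/weak/none
                      [3, 9, 1, 0],
                      [0, 1, 9, 3]])       # rows: requirements, cols: 4 characteristics
importance = weights @ relations
ranking = importance.argsort()[::-1]
print("technical importance:", importance, "priority order:", ranking)
```

Characteristics with low importance scores are the candidates for elimination, which is how the House of Quality prunes unneeded features.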
Simulation of digital mammography images
NASA Astrophysics Data System (ADS)
Workman, Adam
2005-04-01
A number of different technologies are available for digital mammography. However, it is not clear how differences in the physical performance of the different imaging technologies affect clinical performance. Randomised controlled trials provide a means of gaining information on clinical performance; however, they do not provide a direct comparison of the different digital imaging technologies. This work describes a method of simulating the performance of different digital mammography systems. The method involves modifying the imaging performance parameters of images from a small field of view (SFDM), high-resolution digital imaging system used for spot imaging. Under normal operating conditions this system produces images with a higher signal-to-noise ratio (SNR) over a wide spatial frequency range than current full-field digital mammography (FFDM) systems. The SFDM images can be "degraded" by computer processing to simulate the characteristics of a FFDM system. Initial work characterised the physical performance (MTF, NPS) of the SFDM detector and developed a model and method for simulating the signal transfer and noise properties of a FFDM system. The SNR properties of the simulated FFDM images were found to be very similar to those measured from an actual FFDM system, verifying the methodology used. The application of this technique to clinical images from the small field system will allow the clinical performance of different FFDM systems to be simulated and directly compared using the same clinical image datasets.
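A simplified sketch of the degradation step described above: a high-SNR image is filtered by the ratio of target to source MTF in the frequency domain, then noise is added. The Gaussian MTF models, pixel pitch and noise level are assumptions; the actual method matches measured MTF and NPS.

import numpy as np

def degrade(img, sigma_src, sigma_tgt, noise_std, px=0.05):
    ny, nx = img.shape
    fy = np.fft.fftfreq(ny, d=px)
    fx = np.fft.fftfreq(nx, d=px)
    f = np.hypot(*np.meshgrid(fy, fx, indexing="ij"))
    mtf_src = np.exp(-(f * sigma_src) ** 2)     # assumed SFDM MTF model
    mtf_tgt = np.exp(-(f * sigma_tgt) ** 2)     # assumed FFDM MTF model
    filt = mtf_tgt / np.maximum(mtf_src, 1e-6)  # transfer-function ratio
    blurred = np.fft.ifft2(np.fft.fft2(img) * filt).real
    return blurred + np.random.normal(0.0, noise_std, img.shape)

img = np.random.poisson(1000.0, (256, 256)).astype(float)  # stand-in image
sim = degrade(img, sigma_src=0.02, sigma_tgt=0.08, noise_std=15.0)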
Digital Historic Urban Landscape Methodology for Heritage Impact Assessment of Singapore
NASA Astrophysics Data System (ADS)
Widodo, J.; Wong, Y. C.; Ismail, F.
2017-08-01
Using the case study of Singapore's existing heritage websites, this research probes the circumstances of the emerging technology and practice of consuming heritage architecture on a digital platform. Despite the diverse objectives, technology is assumed to deliver greater interpretation through new and high technology that emphasises experience and provides visual fidelity. However, its success is limited, as technology alone is insufficient to present the past from multiple perspectives. Currently, existing projects provide linear narratives developed through a top-down approach that treats end-users as individual entities and limits heritage to a consumable product. Through this research, we hope to work toward a better experience of digital heritage architecture in which interpretation is an evolving, participatory and contributory `process' that allows public participation, together with effective presentation, cultural learning and embodiment, to enhance end-users' interpretation of digital heritage architecture. Additionally, this research seeks to establish an inventory in the form of a digital platform that adapts the Historic Urban Landscape (HUL) approach to the Singapore context to deepen public understanding of architectural as well as cultural heritage through an intercultural and intergenerational dialogue. Through HUL, this research hopes to better shape conservation strategies and urban planning.
Tsai, Jason Sheng-Hong; Du, Yan-Yi; Huang, Pei-Hsiang; Guo, Shu-Mei; Shieh, Leang-San; Chen, Yuhua
2011-07-01
In this paper, a digital redesign methodology for the iterative learning-based decentralized adaptive tracker is proposed to improve the dynamic performance of sampled-data linear large-scale control systems consisting of N interconnected multi-input multi-output subsystems, so that the system output will follow any trajectory which may not initially be presented by the analytic reference model. To overcome the interference between subsystems and simplify the controller design, the proposed model reference decentralized adaptive control scheme first constructs a decoupled, well-designed reference model. Then, according to this model, the paper develops a digital decentralized adaptive tracker based on optimal analog control and the prediction-based digital redesign technique for the sampled-data large-scale coupled system. To enhance the tracking performance of the digital tracker at specified sampling instants, we apply iterative learning control (ILC) to train the control input via continual learning. As a result, the proposed iterative learning-based decentralized adaptive tracker not only has a robust closed-loop decoupling property but also possesses good tracking performance at both transient and steady state. In addition, evolutionary programming is applied to search for a good learning gain to speed up the learning process of the ILC. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
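A minimal P-type iterative learning control sketch on a scalar first-order sampled-data plant illustrates the ILC idea of reusing the previous trial's control plus a gain times the tracking error; the plant, gain and horizon are assumptions, whereas the paper treats MIMO large-scale systems with an adaptive tracker and evolutionary-tuned learning gains.

import numpy as np

a, b = 0.9, 0.5                      # assumed discrete plant x+ = a*x + b*u
N, trials, gamma = 50, 30, 0.8       # horizon, ILC iterations, learning gain
ref = np.sin(np.linspace(0, np.pi, N))

u = np.zeros(N)
for k in range(trials):
    x, y = 0.0, np.zeros(N)
    for t in range(N):               # run one trial of the plant
        x = a * x + b * u[t]
        y[t] = x
    e = ref - y
    u = u + gamma * e                # ILC update: u_{k+1} = u_k + gamma*e_k
print("final RMS tracking error:", np.sqrt(np.mean(e ** 2)))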
New technology and regional studies in human ecology: A Papua New Guinea example
NASA Technical Reports Server (NTRS)
Morren, George E. B., Jr.
1991-01-01
Two key issues in using technologies such as digital image processing and geographic information systems are a conceptually and methodologically valid research design and the exploitation of varied sources of data. With this realized, the new technologies offer anthropologists the opportunity to test hypotheses about spatial and temporal variations in the features of interest within a regionally coherent mosaic of social groups and landscapes. Current research on the Mountain OK of Papua New Guinea is described with reference to these issues.
2016-01-01
user, the model will use surplus inventory in one category to fill shortfalls in other categories. The TFBL model also has the capability to allow...the total force. As shown in Table 2.1, we used five-character AFSCs (four digits plus the suffix) to break out pilots in the current force...
Digital material laboratory: Considerations on high-porous volcanic rock
NASA Astrophysics Data System (ADS)
Saenger, Erik H.; Stöckhert, Ferdinand; Duda, Mandy; Fischer, Laura; Osorno, Maria; Steeb, Holger
2017-04-01
Digital material methodology combines modern microscopic imaging with advanced numerical simulation of the physical properties of materials. One goal is to complement physical laboratory investigations for a deeper understanding of the relevant physical processes. Large-scale numerical modeling of elastic wave propagation directly from the microstructure of the porous material is integral to this technology. A parallelized finite-difference-based Stokes solver is suitable for calculating effective hydraulic parameters for low- and high-porosity materials. Reticulite is formed in high Hawaiian fire-fountaining events. Hawaiian fire-fountaining eruptions produce columns or fountains of lava, which can last for a few hours to days. Reticulite was originally thought to have formed from further-expanded hot scoria foam. However, some researchers believe reticulite forms from magma that vesiculated instantly, expanding rapidly and uniformly to produce the polyhedral vesicle walls; these walls then ruptured and cooled rapidly. The (open) honeycomb network of bubbles is held together by glassy threads and forms a structure with a porosity higher than 80%. The fragile rock sample is difficult to characterize with classical experimental methods, and we show how to determine porosity, effective elastic properties and Darcy permeability using digital material methodology. A technical challenge will be imaging, with the CT technique, the thin skin between the glassy threads that is visible in the microscopy image. A numerical challenge will be the determination of effective material properties and viscous fluid effects on wave propagation in such a highly porous material.
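A minimal sketch of one digital-material-laboratory step, estimating porosity from a segmented micro-CT volume as the void-voxel fraction; the synthetic volume and threshold below are stand-ins for a real reticulite scan.

import numpy as np

rng = np.random.default_rng(0)
volume = rng.random((128, 128, 128))      # stand-in grey-value CT volume
threshold = 0.15                          # assumed solid/void threshold
solid = volume < threshold                # True where glassy threads sit

porosity = 1.0 - solid.mean()             # void fraction of the volume
print(f"estimated porosity: {porosity:.2%}")   # reticulite exceeds 80%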
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-22
... comment the submission of additional information concerning the methodological changes for the digital... additional information concerning the methodological changes suggested in the comments by Mr. Shumate for the...-loss. The Commission is requesting a detailed description of the methodological changes that would be...
Promoting Culturally Respectful Cancer Education Through Digital Storytelling
Cueva, Melany; Kuhnley, Regina; Lanier, Anne; Dignan, Mark; Revels, Laura; Schoenberg, Nancy E.; Cueva, Katie
2016-01-01
Cancer is the leading cause of mortality among Alaska Native people. Over half of Alaska Native people live in rural communities where specially trained community members called Community Health Aides/Practitioners (CHA/Ps) provide health care. In response to CHA/Ps’ expressed desire to learn more about cancer, four 5-day cancer education and digital storytelling courses were provided in 2014. Throughout each course, participants explored cancer information, reflected on their personal experiences, and envisioned how they might apply their knowledge within their communities. Each course participant also created a personal and authentic digital story, a methodology increasingly embraced by Indigenous communities as a way to combine storytelling traditions with modern technology to promote both individual and community health. Opportunities to learn of CHA/Ps’ experiences with cancer and digital storytelling included a 3-page end-of-course written evaluation, a weekly story-showing log kept for 4 weeks post-course, a group teleconference held 1–2 weeks post-course, and a survey administered 6 months post-course. Participants described digital storytelling as a culturally respectful way to support cancer awareness and education. Participants described the process of creating digital stories as supporting knowledge acquisition, encouraging personal reflection, and sparking a desire to engage in cancer risk reduction activities for themselves and with their families and patients. As a result of creating a personalized digital story, CHA/Ps reported feeling differently about cancer, noting an increase in cancer knowledge and comfort to talk about cancer with clients and family. Indigenous digital stories have potential for broad use as a culturally appropriate health messaging tool. PMID:27429956
Wehde, M. E.
1995-01-01
The common method of digital image comparison by subtraction imposes various constraints on the image contents. Precise registration of images is required to assure proper evaluation of surface locations. The attribute being measured and the calibration and scaling of the sensor are also important to the validity and interpretability of the subtraction result. Influences of sensor gains and offsets complicate the subtraction process. The presence of any uniform systematic transformation component in one of two images to be compared distorts the subtraction results and requires analyst intervention to interpret or remove it. A new technique has been developed to overcome these constraints. Images to be compared are first transformed using the cumulative relative frequency as a transfer function. The transformed images represent the contextual relationship of each surface location with respect to all others within the image. The process of differentiating between the transformed images results in a percentile rank ordered difference. This process produces consistent terrain-change information even when the above requirements necessary for subtraction are relaxed. This technique may be valuable to an appropriately designed hierarchical terrain-monitoring methodology because it does not require human participation in the process.
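A minimal sketch of the described comparison: each image is transformed by its cumulative relative frequency (each pixel becomes its percentile rank within the image) before differencing, so that uniform gain and offset changes largely cancel; the two images are synthetic stand-ins for the two acquisition dates.

import numpy as np
from scipy.stats import rankdata

def percentile_rank(img):
    # cumulative relative frequency as a transfer function
    ranks = rankdata(img.ravel(), method="average")
    return (ranks / img.size).reshape(img.shape)

rng = np.random.default_rng(1)
date1 = rng.normal(100, 20, (64, 64))
date2 = date1 * 1.3 + 15 + rng.normal(0, 2, (64, 64))  # gain/offset change

diff = percentile_rank(date2) - percentile_rank(date1)
# Uniform systematic changes mostly cancel; residuals flag terrain change.
print("max |percentile-rank difference|:", np.abs(diff).max())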
Young, Jacy L; Green, Christopher D
2013-11-01
In this article, we present the results of an exploratory digital analysis of the contents of the two journals founded in the late 19th century by American psychologist G. Stanley Hall. Using the methods of the increasingly popular digital humanities, some key attributes of the American Journal of Psychology (AJP) and the Pedagogical Seminary (PS) are identified. Our analysis reaffirms some of Hall's explicit aims for the two periodicals, while also revealing a number of other features of the journals, as well as of the people who published within their pages, the methodologies they employed, and the institutions at which they worked. Notably, despite Hall's intent that his psychological journal be strictly an outlet for scientific research, the journal, like its sister pedagogically focused publication, included an array of methodologically diverse research. The multiplicity of research styles that characterize the content of Hall's journals in their initial years is, in part, a consequence of individual researchers at times crossing methodological lines and producing a diverse body of research. Along with such variety within each periodical, it is evident that the line between content appropriate to one periodical rather than the other was fluid rather than absolute. The full results of this digitally informed analysis of Hall's two journals suggest a number of novel avenues for future research and demonstrate the utility of digital methods as applied to the history of psychology. (PsycINFO Database Record (c) 2013 APA, all rights reserved).
Digital Learning Resources and Ubiquitous Technologies in Education
ERIC Educational Resources Information Center
Camilleri, Mark Anthony; Camilleri, Adriana Caterina
2017-01-01
This research explores the educators' attitudes and perceptions about their utilisation of digital learning technologies. The methodology integrates measures from "the pace of technological innovativeness" and the "technology acceptance model" to understand the rationale for further ICT investment in compulsory education. A…
Integrated Survey Procedures for the Virtual Reading and Fruition of Historical Buildings
NASA Astrophysics Data System (ADS)
Scandurra, S.; Pulcrano, M.; Cirillo, V.; Campi, M.; di Luggo, A.; Zerlenga, O.
2018-05-01
This paper presents developments in research related to the integration of digital survey methodologies, with reference to image-based and range-based technologies. Starting from the processing of point clouds, the data were used both for the geometric interpretation of the space and for the production of three-dimensional models that describe its constitutive and morphological relationships. The subject of the study was the church of San Carlo all'Arena in Naples (Italy), for which an HBIM model was produced that is semantically consistent with the real building. Starting from the acquired data, a visualization system was created for the virtual exploration of the building.
ERIC Educational Resources Information Center
Valdivia, Andrea
2017-01-01
This article accounts for an experience of digital storytelling workshops with indigenous adolescents in Chile, and proposes a theoretical and methodological approach to analyze digital creations with a dialogic and ethnographic point of view. Based on this, it discusses the possibilities of digital media production as a strategy for the…
Effects of image processing on the detective quantum efficiency
NASA Astrophysics Data System (ADS)
Park, Hye-Suk; Kim, Hee-Joung; Cho, Hyo-Min; Lee, Chang-Lae; Lee, Seung-Wan; Choi, Yu-Na
2010-04-01
Digital radiography has gained popularity in many areas of clinical practice. This transition brings interest in advancing the methodologies for image quality characterization. However, as the methodologies for such characterizations have not been standardized, the results of these studies cannot be directly compared. The primary objective of this study was to standardize methodologies for image quality characterization. The secondary objective was to evaluate the factors affecting the modulation transfer function (MTF), noise power spectrum (NPS), and detective quantum efficiency (DQE) according to the image processing algorithm. Image performance parameters such as MTF, NPS, and DQE were evaluated using the radiographic techniques defined by the international electro-technical commission (IEC 62220-1) RQA5 standard. Computed radiography (CR) images of the hand in posterior-anterior (PA) projection for measuring the signal-to-noise ratio (SNR), slit images for measuring MTF, and uniform white images for measuring NPS were obtained, and various Multi-Scale Image Contrast Amplification (MUSICA) parameters were applied to each of the acquired images. The results show that all of the modified images considerably influenced the evaluation of SNR, MTF, NPS, and DQE. Images modified by post-processing had a higher DQE than the MUSICA=0 image. This suggests that MUSICA values, as a post-processing step, affect the image when image quality is evaluated. In conclusion, the control parameters of image processing should be accounted for when characterizing image quality in a consistent way. The results of this study can serve as a baseline for evaluating imaging systems and their imaging characteristics by measuring MTF, NPS, and DQE.
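A minimal sketch of the commonly used DQE estimator underlying studies of this kind, DQE(f) = MTF(f)^2 / (q * NNPS(f)), where NNPS is the NPS normalized by the squared mean signal and q is the photon fluence; the numerical values below are assumptions, and the full IEC 62220-1 procedure involves additional steps.

import numpy as np

f = np.linspace(0.05, 3.0, 60)       # spatial frequency (cycles/mm)
mtf = np.exp(-0.5 * f)               # assumed measured presampling MTF
nps = 12.0 * np.exp(-0.3 * f)        # assumed NPS, (pixel value)^2 mm^2
mean_signal = 1000.0                 # assumed large-area mean pixel value
q = 2.6e5                            # assumed RQA5 fluence, photons/mm^2

nnps = nps / mean_signal ** 2        # normalized noise power spectrum
dqe = mtf ** 2 / (q * nnps)
print(f"DQE at {f[0]:.2f} cycles/mm: {dqe[0]:.2f}")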
Relationships between palaeogeography and opal occurrence in Australia: A data-mining approach
NASA Astrophysics Data System (ADS)
Landgrebe, T. C. W.; Merdith, A.; Dutkiewicz, A.; Müller, R. D.
2013-07-01
Age-coded multi-layered geological datasets are becoming increasingly prevalent with the surge in open-access geodata, yet there are few methodologies for extracting geological information and knowledge from these data. We present a novel methodology, based on the open-source GPlates software in which age-coded digital palaeogeographic maps are used to “data-mine” spatio-temporal patterns related to the occurrence of Australian opal. Our aim is to test the concept that only a particular sequence of depositional/erosional environments may lead to conditions suitable for the formation of gem quality sedimentary opal. Time-varying geographic environment properties are extracted from a digital palaeogeographic dataset of the eastern Australian Great Artesian Basin (GAB) at 1036 opal localities. We obtain a total of 52 independent ordinal sequences sampling 19 time slices from the Early Cretaceous to the present-day. We find that 95% of the known opal deposits are tied to only 27 sequences all comprising fluvial and shallow marine depositional sequences followed by a prolonged phase of erosion. We then map the total area of the GAB that matches these 27 opal-specific sequences, resulting in an opal-prospective region of only about 10% of the total area of the basin. The key patterns underlying this association involve only a small number of key environmental transitions. We demonstrate that these key associations are generally absent at arbitrary locations in the basin. This new methodology allows for the simplification of a complex time-varying geological dataset into a single map view, enabling straightforward application for opal exploration and for future co-assessment with other datasets/geological criteria. This approach may help unravel the poorly understood opal formation process using an empirical spatio-temporal data-mining methodology and readily available datasets to aid hypothesis testing.
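A minimal sketch (with an assumed environment encoding) of the sequence test described above: a prospective locality must show fluvial and/or shallow-marine deposition followed by a prolonged erosional phase across the time slices.

def opal_prospective(env_sequence, min_erosion_steps=3):
    """env_sequence: oldest-to-youngest environment codes per time slice."""
    depositional = {"fluvial", "shallow_marine"}
    # the last depositional slice must be followed by sustained erosion
    dep_idx = [i for i, e in enumerate(env_sequence) if e in depositional]
    if not dep_idx:
        return False
    tail = env_sequence[dep_idx[-1] + 1:]
    return len(tail) >= min_erosion_steps and all(e == "erosion" for e in tail)

seq = ["shallow_marine", "fluvial", "erosion", "erosion", "erosion", "erosion"]
print(opal_prospective(seq))   # True for this hypothetical locality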
a Semi-Automated Point Cloud Processing Methodology for 3d Cultural Heritage Documentation
NASA Astrophysics Data System (ADS)
Kıvılcım, C. Ö.; Duran, Z.
2016-06-01
The preliminary phase in any architectural heritage project is to obtain metric measurements and documentation of the building and its individual elements. Conventional measurement techniques, however, require tremendous resources and lengthy project completion times for architectural surveys and 3D model production. Over the past two decades, the widespread use of laser scanning and digital photogrammetry has significantly altered the heritage documentation process. Furthermore, advances in these technologies have enabled robust data collection and reduced user workload for generating various levels of products, from single buildings to expansive cityscapes. More recently, the use of procedural modelling methods and BIM-related applications for historic building documentation has become an active area of research; however, fully automated systems for cultural heritage documentation remain an open problem. In this paper, we present a semi-automated methodology for 3D façade modelling of cultural heritage assets, based on parametric and procedural modelling techniques and using airborne and terrestrial laser scanning data. We present the contribution of our methodology, which we implemented in an open-source software environment, using the example project of a 16th-century early classical era Ottoman structure, Sinan the Architect's Şehzade Mosque in Istanbul, Turkey.
2017-12-01
satisfactory performance. We do not use statistical models, and we do not create patterns that require supervised learning. Our methodology is intended for use in personal digital image...
Dichotic and dichoptic digit perception in normal adults.
Lawfield, Angela; McFarland, Dennis J; Cacace, Anthony T
2011-06-01
Verbally based dichotic-listening experiments and reproduction-mediated response-selection strategies have been used for over four decades to study perceptual/cognitive aspects of auditory information processing and make inferences about hemispheric asymmetries and language lateralization in the brain. Test procedures using dichotic digits have also been used to assess for disorders of auditory processing. However, with this application, limitations exist and paradigms need to be developed to improve specificity of the diagnosis. Use of matched tasks in multiple sensory modalities is a logical approach to address this issue. Herein, we use dichotic listening and dichoptic viewing of visually presented digits for making this comparison. To evaluate methodological issues involved in using matched tasks of dichotic listening and dichoptic viewing in normal adults. A multivariate assessment of the effects of modality (auditory vs. visual), digit-span length (1-3 pairs), response selection (recognition vs. reproduction), and ear/visual hemifield of presentation (left vs. right) on dichotic and dichoptic digit perception. Thirty adults (12 males, 18 females) ranging in age from 18 to 30 yr with normal hearing sensitivity and normal or corrected-to-normal visual acuity. A computerized, custom-designed program was used for all data collection and analysis. A four-way repeated measures analysis of variance (ANOVA) evaluated the effects of modality, digit-span length, response selection, and ear/visual field of presentation. The ANOVA revealed that performances on dichotic listening and dichoptic viewing tasks were dependent on complex interactions between modality, digit-span length, response selection, and ear/visual hemifield of presentation. Correlation analysis suggested a common effect on overall accuracy of performance but isolated only an auditory factor for a laterality index. The variables used in this experiment affected performances in the auditory modality to a greater extent than in the visual modality. The right-ear advantage observed in the dichotic-digits task was most evident when reproduction mediated response selection was used in conjunction with three-digit pairs. This effect implies that factors such as "speech related output mechanisms" and digit-span length (working memory) contribute to laterality effects in dichotic listening performance with traditional paradigms. Thus, the use of multiple-digit pairs to avoid ceiling effects and the application of verbal reproduction as a means of response selection may accentuate the role of nonperceptual factors in performance. Ideally, tests of perceptual abilities should be relatively free of such effects. American Academy of Audiology.
A review on color normalization and color deconvolution methods in histopathology.
Onder, Devrim; Zengin, Selen; Sarioglu, Sulen
2014-01-01
Histopathologists benefit from a wide range of colored dyes that provide useful information about lesions and tissue composition. Despite its advantages, the staining process introduces quite complex variations in stain concentrations and correlations, tissue fixation types, and fixation times. Together with improvements in computing power and the development of novel image analysis methods, these imperfections have led to the emergence of several color normalization algorithms. This article is a review of the currently available digital color normalization methods for bright-field histopathology. We describe the proposed color normalization methodologies in detail, together with the lesion and tissue types used in the corresponding experiments. We also present the quantitative validation approaches for each of the proposed methodologies where available.
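One family of methods such reviews cover is Ruifrok-Johnson color deconvolution; the following sketch uses scikit-image's built-in H&E-DAB stain matrix (rgb2hed) on a synthetic tile standing in for a stained slide.

import numpy as np
from skimage.color import rgb2hed

rgb = np.random.rand(64, 64, 3)       # stand-in for an H&E-stained tile
hed = rgb2hed(rgb)                    # unmix RGB into stain densities

hematoxylin = hed[..., 0]             # nuclei-associated stain channel
eosin = hed[..., 1]                   # cytoplasm-associated stain channel
print(hematoxylin.mean(), eosin.mean())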
Development of Boolean calculus and its applications. [digital systems design
NASA Technical Reports Server (NTRS)
Tapia, M. A.
1980-01-01
The development of Boolean calculus for application to digital system design methodologies that would reduce system complexity, size, cost and power requirements while improving speed is discussed. Synthesis procedures for logic circuits are examined, particularly for asynchronous circuits using clock-triggered flip-flops.
Increasing the UAV data value by an OBIA methodology
NASA Astrophysics Data System (ADS)
García-Pedrero, Angel; Lillo-Saavedra, Mario; Rodriguez-Esparragon, Dionisio; Rodriguez-Gonzalez, Alejandro; Gonzalo-Martin, Consuelo
2017-10-01
Recently, there has been a noteworthy increase in the use of images acquired by unmanned aerial vehicles (UAVs) in different remote sensing applications. Sensors aboard UAVs have lower operational costs and complexity than other remote sensing platforms, quicker turnaround times, and higher spatial resolution. Concerning this last aspect, particular attention has to be paid to the limitations of classical pixel-based algorithms when they are applied to high-resolution images. The objective of this study is to investigate the capability of an OBIA methodology developed for the automatic generation of a digital terrain model of an agricultural area from a Digital Elevation Model (DEM) and multispectral images registered by a Parrot Sequoia multispectral sensor aboard an eBee SQ agricultural drone. The proposed methodology uses a superpixel approach to obtain the context and elevation information used for merging superpixels while eliminating objects such as trees, in order to generate a Digital Terrain Model (DTM) of the analyzed area. The results obtained show the potential of the approach in terms of accuracy when compared with a DTM generated by manually eliminating objects.
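A minimal sketch (not the authors' exact pipeline) of the OBIA idea: segment the scene into superpixels, attach elevation to each, and suppress elevated objects such as trees before a DTM is interpolated; the imagery, DEM and height rule below are assumptions.

import numpy as np
from skimage.segmentation import slic

rng = np.random.default_rng(2)
image = rng.random((200, 200, 3))          # stand-in multispectral composite
dem = rng.random((200, 200)) * 2.0         # stand-in DEM (m above datum)

labels = slic(image, n_segments=400, compactness=10, start_label=1)

ground = dem.copy()
for lab in np.unique(labels):
    mask = labels == lab
    # assumed rule: superpixels well above the median surface are objects
    if dem[mask].mean() > np.median(dem) + 0.8:
        ground[mask] = np.nan              # removed; to be interpolated

print("object pixels removed:", int(np.isnan(ground).sum()))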
ERIC Educational Resources Information Center
Reins, Kevin
2007-01-01
The purpose of this study was to investigate effective uses of digital ink technology in an elementary mathematics methods course. A survey methodology was used in the study to examine the participants' perceptions toward this technology for teaching and learning. All of the items on the survey produced response means between 5.0 and 6.0, with a…
On Drift Effects in Velocity and Displacement of Greek Uncorrected Digital Strong Motion Data
NASA Astrophysics Data System (ADS)
Skarlatoudis, A.; Margaris, B.
2005-12-01
Fifty years after the first installation of analog accelerographs, digital instruments recording strong motion came into operation. Their advantages compared to the analog ones are obvious and have been described in detail in several works. Nevertheless, it has been pointed out that velocity and displacement values derived from several accelerograms recorded in various strong earthquakes worldwide (e.g., 1999 Chi-Chi, Taiwan; Hector Mine; 2002 Denali) by digital instruments are plagued by drifts when only a simple baseline correction derived from the pre-event portion of the record is removed. In Greece, a significant number of accelerographic networks and arrays have been deployed covering the whole country, and digital accelerographs now constitute a significant part of the National Strong Motion network. Detailed analyses of the data processing of accelerograms recorded by digital instruments showed that the same drifts exist in the Greek strong motion database. In this work, a methodology proposed and described in various articles (Boore, 2001; 2003; 2005) for removing the aforementioned drifts from the accelerograms is applied. We also take a careful look at the nature of the drifts in order to understand the noise characteristics relative to the signal. The intrinsic behaviour of the signal-to-noise ratio is crucial for the adequacy of baseline corrections applied to uncorrected digital accelerograms. Velocities and displacements of the uncorrected and corrected accelerograms are compared, and the drift effects in the Fourier and response spectra are presented.
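A minimal sketch of the simple pre-event baseline correction discussed above: the pre-event mean is subtracted from the acceleration trace, which is then integrated to velocity and displacement; residual drift in the final displacement is the symptom studied here. The record, sampling rate and window are synthetic stand-ins.

import numpy as np
from scipy.integrate import cumulative_trapezoid

dt = 0.005                                   # assumed 200 samples/s
t = np.arange(0, 60, dt)
rng = np.random.default_rng(3)
acc = rng.normal(0.002, 0.001, t.size)       # stand-in record with offset
pre_event = t < 5.0                          # assumed pre-event window

acc_corr = acc - acc[pre_event].mean()       # remove pre-event baseline
vel = cumulative_trapezoid(acc_corr, t, initial=0.0)
disp = cumulative_trapezoid(vel, t, initial=0.0)
print("final displacement (drift indicator):", disp[-1])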
Teaching Multimedia Data Protection through an International Online Competition
ERIC Educational Resources Information Center
Battisti, F.; Boato, G.; Carli, M.; Neri, A.
2011-01-01
Low-cost personal computers, wireless access technologies, the Internet, and computer-equipped classrooms allow the design of novel and complementary methodologies for teaching digital information security in electrical engineering curricula. The challenges of the current digital information era require experts who are effectively able to…
Don't Break the Memory Line: Social Memory, Digital Storytelling and Local Communities
NASA Astrophysics Data System (ADS)
Ferri, Paolo; Mangiatordi, Andrea; Pozzali, Andrea
In this paper we present and analyze some of the main results of the empirical research carried out within the scope of the Socrates Grundtvig project "Memory Line", which aimed at developing instruments and methodologies to help overcome the intergenerational divide. The project trained groups of elderly and young citizens, resident in the project partner countries, to collect records (stories, songs, poems, experiences, etc.) and to save them in digital form, mainly using the methodology of digital storytelling. Focus groups and interviews with people involved in the intergenerational ateliers were carried out in order to gather first-hand evidence directly from the voices of people who were actively involved in the project, and to enable ongoing monitoring and self-evaluation of the project itself.
[Digital learning object for diagnostic reasoning in nursing applied to the integumentary system].
da Costa, Cecília Passos Vaz; Luz, Maria Helena Barros Araújo
2015-12-01
To describe the creation of a digital learning object for diagnostic reasoning in nursing applied to the integumentary system at a public university in Piauí. A methodological study of technological production based on the pedagogical framework of problem-based learning. The methodology for creating the learning object followed the stages of analysis, design, development, implementation and evaluation recommended for contextualized instructional design. Bloom's revised taxonomy was used to list the educational goals. The four modules of the developed learning object were inserted into the Moodle educational platform. The theoretical assumptions allowed the design of an important online resource that promotes effective learning in nursing education. This study should add value to nursing teaching practices through the use of digital learning objects for teaching diagnostic reasoning applied to the skin and skin appendages.
Validating a new methodology for optical probe design and image registration in fNIRS studies
Wijeakumar, Sobanawartiny; Spencer, John P.; Bohache, Kevin; Boas, David A.; Magnotta, Vincent A.
2015-01-01
Functional near-infrared spectroscopy (fNIRS) is an imaging technique that relies on the principle of shining near-infrared light through tissue to detect changes in hemodynamic activation. An important methodological issue encountered is the creation of optimized probe geometry for fNIRS recordings. Here, across three experiments, we describe and validate a processing pipeline designed to create an optimized, yet scalable probe geometry based on selected regions of interest (ROIs) from the functional magnetic resonance imaging (fMRI) literature. In experiment 1, we created a probe geometry optimized to record changes in activation from target ROIs important for visual working memory. Positions of the sources and detectors of the probe geometry on an adult head were digitized using a motion sensor and projected onto a generic adult atlas and a segmented head obtained from the subject's MRI scan. In experiment 2, the same probe geometry was scaled down to fit a child's head and later digitized and projected onto the generic adult atlas and a segmented volume obtained from the child's MRI scan. Using visualization tools and by quantifying the amount of intersection between target ROIs and channels, we show that out of 21 ROIs, 17 and 19 ROIs intersected with fNIRS channels from the adult and child probe geometries, respectively. Further, both the adult atlas and adult subject-specific MRI approaches yielded similar results and can be used interchangeably. However, results suggest that segmented heads obtained from MRI scans be used for registering children's data. Finally, in experiment 3, we further validated our processing pipeline by creating a different probe geometry designed to record from target ROIs involved in language and motor processing. PMID:25705757
Rieken, Johannes; Garcia-Sanchez, Efraín; Trujillo, Mónica Pérez; Bear, Daniel
2015-06-01
We developed a teaching-led research project to empirically ground methodological reflection about digital ethnography. Drawing on Cordelois' collective ethnographic observation approach, fifteen emerging professionals (from a private general-education university and a police academy in Bogotá) collaborated in a methods seminar on digital ethnography. They worked in cross-institutional research teams, each carrying SenseCams for 3 days. Students had a dual role as both participants and observers during self-confrontation interviews. The research design enabled the emerging professionals to introspect about what it is to be a member of their institution. The SenseCam provided an additional opportunity for observation, as it elicited different reactions in the two institutions. The fact that SenseCams produce sequential accounts of activity, together with the situated nature of those accounts, made apparent the autonomy of university students to study and solve daily issues (e.g., transport, security, commitments), while students in the police academy were more focused on responding to unforeseen activities (e.g., police services, unexpected requests). Finally, our research highlights the relevance of the social dimension of introspection for digital ethnography. How digital data that capture an individual perspective are negotiated in a group becomes a key methodological question.
How do young and senior cytopathologists interact with digital cytology?
Giovagnoli, Maria Rosaria; Giarnieri, Enrico; Carico, Elisabetta; Giansanti, Daniele
2010-01-01
Today, thanks to technological advances in information technology, the scenario for the utilization of digital cytology has radically changed. New competitive systems, such as client-server architectures, are now available in digital cytology, and their application in telemedicine should be investigated. A new interactive tool designed for the final destination user (the cytopathologist) has been proposed. Taking into account the different expertise of the subjects of the study, the investigation focused both on senior cytopathologists and on younger student pathologists. The methodology was tested on 10 students of a Master in cytopathology and on 3 senior cytopathologists. The study showed that the use of digital cytology applications is effective and feasible for telediagnosis. In particular, the study on younger and senior investigators showed that, although they interacted with the novel virtual-slide technology in different ways, all of them reached the objective of a "correct diagnosis". In consideration of the effectiveness of digital cytology, this investigation also revealed other indirect and tangible cost-benefit and quantitative advantages, in particular for the learning methodologies of the Master's students themselves and for the biomedical personnel involved in diagnosis.
Digital storytelling: an innovative tool for practice, education, and research.
Lal, Shalini; Donnelly, Catherine; Shin, Jennifer
2015-01-01
Digital storytelling is a method of using storytelling, group work, and modern technology to facilitate the creation of 2-3 minute multi-media video clips to convey personal or community stories. Digital storytelling is being used within the health care field; however, there has been limited documentation of its application within occupational therapy. This paper introduces digital storytelling and proposes how it can be applied in occupational therapy clinical practice, education, and research. The ethical and methodological challenges in relation to using the method are also discussed.
Irdis: A Digital Scene Storage And Processing System For Hardware-In-The-Loop Missile Testing
NASA Astrophysics Data System (ADS)
Sedlar, Michael F.; Griffith, Jerry A.
1988-07-01
This paper describes the implementation of a Seeker Evaluation and Test Simulation (SETS) Facility at Eglin Air Force Base. This facility will be used to evaluate imaging infrared (IIR) guided weapon systems by performing various types of laboratory tests. One such test is termed Hardware-in-the-Loop (HIL) simulation (Figure 1), in which the actual flight of a weapon system is simulated as closely as possible in the laboratory. As shown in the figure, there are four major elements in the HIL test environment: the weapon/sensor combination, an aerodynamic simulator, an imagery controller, and an infrared imagery system. The paper concentrates on the approaches and methodologies used in the imagery controller and infrared imaging system elements for generating scene information. For procurement purposes, these two elements have been combined into an Infrared Digital Injection System (IRDIS), which provides scene storage, processing, and an output interface to drive a radiometric display device or to directly inject digital video into the weapon system (bypassing the sensor). The paper describes in detail how standard and custom image processing functions have been combined with off-the-shelf mass storage and computing devices to produce a system that provides high sample rates (greater than 90 Hz), a large terrain database, high weapon rates of change, and multiple independent targets. A photo-based approach has been used to maximize terrain and target fidelity, thus providing a rich and complex scene for weapon/tracker evaluation.
Evaluating Usability in a Distance Digital Systems Laboratory Class
ERIC Educational Resources Information Center
Kostaras, N.; Xenos, M.; Skodras, A. N.
2011-01-01
This paper presents the usability evaluation of a digital systems laboratory class offered to distance-learning students. It details the way in which students can participate remotely in such a laboratory, the methodology employed in the usability assessment of the laboratory infrastructure (hardware and software), and also outlines the main…
From Digital Administration to Organisational Learning
ERIC Educational Resources Information Center
Elkjaer, Bente
2005-01-01
Purpose: To explore whether deliberate organisational change of a public sector organisation (a local municipality) would create an avenue for organisational learning. Design/methodology/approach: A case study was set up to study the means by which the organisational change towards a digital administration was to come about. The organisational…
Going with the Affective Flows of Digital School Absence Text Messages
ERIC Educational Resources Information Center
Bodén, Linnea
2017-01-01
Focusing on digital text messages containing information about students' absences and sent to parents by schools, the paper investigates the way school absenteeism is produced within affective assemblages. The paper unfolds a theoretical and methodological approach of "going with" the text messages, in entanglements of affective flows.…
The Impact of Digital Mobile Devices in Higher Education
ERIC Educational Resources Information Center
Sevillano-García, M.ª Luisa; Vázquez-Cano, Esteban
2015-01-01
This research examined the acceptance, incidence, and use of digital mobile devices (tablets and smartphones) among university students in the European Higher Education Area (EHEA). The research was contextualized in a sample of 419 students from three Spanish public universities. Through a quantitative methodology, we identified the factors and…
76 FR 29773 - Call for Participation in Pillbox Patient-Safety Initiative
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-23
... digital images and descriptive information for solid oral dosage form medications. This project seeks to... Participation, NLM seeks to evaluate the photography methodology and procedures it has developed for creating... available via a publicly accessible resource ( http://pillbox.nlm.nih.gov ) digital images and descriptive...
Do "Digital Certificates" Hold the Key to Colleges' On-Line Activities?
ERIC Educational Resources Information Center
Olsen, Florence
1999-01-01
Examines the increasing use of "digital certificates" to validate computer user identity in various applications on college and university campuses, including letting students register for courses, monitoring access to Internet2, and monitoring access to databases and electronic journals. The methodology has been developed by the…
Using PBL to Deliver Course in Digital Electronics
ERIC Educational Resources Information Center
Mantri, Archana; Dutt, Sunil; Gupta, J. P; Chitkara, Madhu
2009-01-01
Problem Based Learning (PBL) has proven to be a highly successful pedagogical model in many educational fields, although it is comparatively uncommon in technical education. It goes beyond the typical teaching methodology by promoting student interaction. This paper presents a PBL trial applied to an undergraduate Digital Electronics course in the…
Cho, Kyoung Won; Kim, Seong Min; Chae, Young Moon; Song, Yong Uk
2017-01-01
This research used queueing theory to analyze changes in outpatients' waiting times before and after the introduction of Electronic Medical Record (EMR) systems. We focused on the exact drawing of two fundamental parameters for queueing analysis, arrival rate (λ) and service rate (µ), from digital data to apply queueing theory to the analysis of outpatients' waiting times. We used outpatients' reception times and consultation finish times to calculate the arrival and service rates, respectively. Using queueing theory, we could calculate waiting time excluding distorted values from the digital data and distortion factors, such as arrival before the hospital open time, which occurs frequently in the initial stage of a queueing system. We analyzed changes in outpatients' waiting times before and after the introduction of EMR using the methodology proposed in this paper, and found that the outpatients' waiting time decreases after the introduction of EMR. More specifically, the outpatients' waiting times in the target public hospitals have decreased by rates in the range between 44% and 78%. It is possible to analyze waiting times while minimizing input errors and limitations influencing consultation procedures if we use digital data and apply the queueing theory. Our results verify that the introduction of EMR contributes to the improvement of patient services by decreasing outpatients' waiting time, or by increasing efficiency. It is also expected that our methodology or its expansion could contribute to the improvement of hospital service by assisting the identification and resolution of bottlenecks in the outpatient consultation process.
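The abstract does not state which queueing model was fitted, but an M/M/1 sketch shows how a waiting time follows from the arrival rate λ (from reception times) and service rate µ (from consultation finish times); the rates below are hypothetical.

def mm1_waiting_time(arrival_rate, service_rate):
    """Mean time in queue for M/M/1: W_q = lambda / (mu * (mu - lambda))."""
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable: lambda must be below mu")
    return arrival_rate / (service_rate * (service_rate - arrival_rate))

lam = 10.0   # patients per hour, from reception times (hypothetical)
mu = 12.0    # patients per hour, from consultation finish times (hypothetical)
print(f"expected wait: {60 * mm1_waiting_time(lam, mu):.1f} minutes")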
Recipe for Success: Digital Viewables
NASA Technical Reports Server (NTRS)
LaPha, Steven; Gaydos, Frank
2014-01-01
The Engineering Services Contract (ESC) and Information Management Communication Support (IMCS) contract at Kennedy Space Center (KSC) provide services to NASA with respect to flight and ground systems design and development. These groups provide the necessary tools, aid, and best-practice methodologies required for efficient, optimized design and process development. The team is responsible for configuring and implementing systems and software, along with training, documentation, and administering standards. The team supports over 200 engineers and design specialists in the use of Windchill, Creo Parametric, NX, AutoCAD, and a variety of other design and analysis tools.
Christ, Roxie; Guevar, Julien; Poyade, Matthieu; Rea, Paul M
2018-01-01
Neuroanatomy can be challenging to both teach and learn within the undergraduate veterinary medicine and surgery curriculum. Traditional techniques have been used for many years, but there has now been a progression towards alternative digital models and interactive 3D models to engage the learner. However, digital innovations in the curriculum have typically involved the medical rather than the veterinary curriculum. Therefore, we aimed to create a simple workflow methodology to demonstrate the simplicity of creating a mobile augmented reality application of basic canine head anatomy. Using canine CT and MRI scans and widely available software programs, we demonstrate how to create an interactive model of head anatomy. This was applied to augmented reality on a popular Android mobile device to demonstrate the user-friendly interface. Here we present the processes, challenges and resolutions involved in the creation of a highly accurate, data-based anatomical model that could potentially be used in the veterinary curriculum. This proof-of-concept study provides an excellent framework for the creation of augmented reality training products for veterinary education. The lack of similar resources within this field provides the ideal platform to extend this into other areas of veterinary education and beyond.
Evaluation of digital real-time PCR assay as a molecular diagnostic tool for single-cell analysis.
Chang, Chia-Hao; Mau-Hsu, Daxen; Chen, Ke-Cheng; Wei, Cheng-Wey; Chiu, Chiung-Ying; Young, Tai-Horng
2018-02-21
In a single-cell study, isolating and identifying single cells are essential, but these processes often require a large investment of time or money. The aim of this study was to isolate and analyse single cells using a novel platform, the PanelChip™ Analysis System, which includes a 2,500-microwell chip and a digital real-time polymerase chain reaction (dqPCR) assay, in comparison with a standard real-time PCR (qPCR) assay. Through serial dilution of a standard of known concentration, namely pUC19, the accuracy and sensitivity of the two methodologies were compared. The two systems were tested on the expression levels of the genetic markers vimentin, E-cadherin, N-cadherin and GAPDH in A549 lung carcinoma cells at two known concentrations. Furthermore, the influence of heparin, a known PCR inhibitor commonly found in blood samples, was evaluated in both methodologies. Finally, mathematical models were proposed, the single-cell separation method was verified, and gene expression levels during epithelial-mesenchymal transition in single cells under TGFβ1 treatment were measured. We conclude that dqPCR performed using PanelChip™ is superior to standard qPCR in terms of sensitivity, precision, and heparin tolerance. The dqPCR assay is a potential tool for clinical diagnosis and single-cell applications.
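Digital PCR quantification conventionally rests on Poisson statistics; the following sketch shows the standard estimate of copies per microliter from the positive-well count (standard dPCR statistics, not necessarily the PanelChip vendor's exact formula); the well volume and counts are hypothetical.

import math

def dpcr_copies_per_ul(positive, total, well_volume_ul):
    # mean copies per well from the fraction of positive wells
    lam = -math.log(1.0 - positive / total)
    return lam / well_volume_ul

# hypothetical run on a 2,500-well chip with assumed 1.3 nL wells
print(f"{dpcr_copies_per_ul(1200, 2500, 1.3e-3):.1f} copies/uL")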
Additively Manufactured IN718 Components with Wirelessly Powered and Interrogated Embedded Sensing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Attridge, Paul; Bajekal, Sanjay; Klecka, Michael
A methodology is described for embedding commercial-off-the-shelf sensors together with wireless communication and power circuit elements in direct-laser-metal-sintered additively manufactured components. Physics-based models of the additive manufacturing processes and sensor/wireless performance models guided the design and embedment processes. A combination of cold spray deposition and laser engineered net shaping was used to fashion the transmitter/receiving elements and embed the sensors, thereby providing environmental protection and component robustness/survivability for harsh conditions. By design, this complement of analog and digital sensors was wirelessly powered and interrogated using a health and utilization monitoring system, enabling real-time, in situ prognostics and diagnostics.
NASA Astrophysics Data System (ADS)
Grubert, Emily; Siders, Anne
2016-09-01
Digitally-aided reviews of large bodies of text-based information, such as academic literature, are growing in capability but are not yet common in environmental fields. Environmental sciences and studies can benefit from application of digital tools to create comprehensive, replicable, interdisciplinary reviews that provide rapid, up-to-date, and policy-relevant reports of existing work. This work reviews the potential for applications of computational text mining and analysis tools originating in the humanities to environmental science and policy questions. Two process-oriented case studies of digitally-aided environmental literature reviews and meta-analyses illustrate potential benefits and limitations. A medium-sized, medium-resolution review (∼8000 journal abstracts and titles) focuses on topic modeling as a rapid way to identify thematic changes over time. A small, high-resolution review (∼300 full text journal articles) combines collocation and network analysis with manual coding to synthesize and question empirical field work. We note that even small digitally-aided analyses are close to the upper limit of what can be done manually. Established computational methods developed in humanities disciplines and refined by humanities and social science scholars to interrogate large bodies of textual data are applicable and useful in environmental sciences but have not yet been widely applied. Two case studies provide evidence that digital tools can enhance insight. Two major conclusions emerge. First, digital tools enable scholars to engage large literatures rapidly and, in some cases, more comprehensively than is possible manually. Digital tools can confirm manually identified patterns or identify additional patterns visible only at a large scale. Second, digital tools allow for more replicable and transparent conclusions to be drawn from literature reviews and meta-analyses. The methodological subfields of digital humanities and computational social sciences will likely continue to create innovative tools for analyzing large bodies of text, providing opportunities for interdisciplinary collaboration with the environmental fields.
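A minimal sketch of the medium-resolution approach described above, topic modeling over abstracts with latent Dirichlet allocation; the four-document corpus and its themes are hypothetical stand-ins for the roughly 8000 abstracts and titles.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

abstracts = [
    "groundwater contamination from industrial discharge and local policy",
    "public perception of energy development and environmental regulation",
    "methane emissions measured near extraction sites over a decade",
    "community responses to drilling and changing land use patterns",
]

vec = CountVectorizer(stop_words="english")
dtm = vec.fit_transform(abstracts)                 # document-term matrix
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(dtm)

terms = vec.get_feature_names_out()
for i, topic in enumerate(lda.components_):
    top = [terms[j] for j in topic.argsort()[-4:][::-1]]
    print(f"topic {i}: {', '.join(top)}")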
Selecting a software development methodology. [of digital flight control systems
NASA Technical Reports Server (NTRS)
Jones, R. E.
1981-01-01
State-of-the-art analytical techniques for the development and verification of digital flight control software are studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of each chosen technique in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically the cost of implementing and applying them as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. For techniques that cannot be quantitatively assessed, qualitative judgements are expressed about their effectiveness and cost, and the reasons why quantitative assessments are not possible are documented.
A methodology for modeling barrier island storm-impact scenarios
Mickey, Rangley C.; Long, Joseph W.; Plant, Nathaniel G.; Thompson, David M.; Dalyander, P. Soupy
2017-02-16
The U.S. Geological Survey developed a methodology for deriving a representative set of storm scenarios from historical wave-buoy and tide-gauge data for a region at the Chandeleur Islands, Louisiana. The total water level was calculated for a 10-year period and analyzed against existing topographic data to identify when storm-induced wave action would affect island morphology. These events were categorized on the basis of total-water-level threshold and duration to create a set of storm scenarios that were simulated, using a high-fidelity, process-based morphologic evolution model, on an idealized digital elevation model of the Chandeleur Islands. The simulated morphological changes resulting from these scenarios provide a range of impacts that can help coastal managers determine the resiliency of proposed or existing coastal structures and identify vulnerable areas within those structures.
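A minimal sketch of the threshold-plus-duration screening step described above: flag events where the total water level (TWL) stays above a threshold for a minimum duration. The hourly series, threshold, and duration below are assumed values for illustration, not the study's calibrated ones:

```python
import numpy as np

def storm_events(twl, threshold, min_hours):
    """Return (start, end) index pairs where twl >= threshold
    for at least min_hours consecutive hourly samples."""
    above = np.asarray(twl) >= threshold
    events, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_hours:
                events.append((start, i - 1))
            start = None
    if start is not None and len(above) - start >= min_hours:
        events.append((start, len(above) - 1))
    return events

rng = np.random.default_rng(0)
twl = 0.8 + 0.6 * np.sin(np.arange(240) / 12.0) + rng.normal(0, 0.2, 240)
print(storm_events(twl, threshold=1.2, min_hours=6))
```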
A method to perform a fast fourier transform with primitive image transformations.
Sheridan, Phil
2007-05-01
The Fourier transform is one of the most important transformations in image processing. A major part of this influence comes from the ability to implement it efficiently on a digital computer. This paper describes a new methodology to perform a fast Fourier transform (FFT). The methodology emerges from considerations of the natural physical constraints imposed by image capture devices (camera/eye). The novel aspects of the specific FFT method described are that: (1) the bit-wise reversal re-grouping operation of the conventional FFT is replaced by lossless image rotation and scaling, and (2) the usual arithmetic operations of complex multiplication are replaced with integer addition. The significance of the FFT presented in this paper is demonstrated by extending a discrete and finite image algebra, named Spiral Honeycomb Image Algebra (SHIA), to a continuous version, named SHIAC.
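For context, the re-grouping operation the paper replaces is the conventional radix-2 FFT's bit-reversal permutation; a textbook sketch of that step (this illustrates what is being replaced, not the SHIA-based method itself):

```python
import numpy as np

def bit_reverse_permute(x):
    """Reorder a length-2**m array into bit-reversed index order."""
    n = len(x)
    m = n.bit_length() - 1
    idx = [int(format(i, f"0{m}b")[::-1], 2) for i in range(n)]
    return np.asarray(x)[idx]

x = np.arange(8)
print(bit_reverse_permute(x))   # [0 4 2 6 1 5 3 7]
```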
Creation of 3D Multi-Body Orthodontic Models by Using Independent Imaging Sensors
Barone, Sandro; Paoli, Alessandro; Razionale, Armando Viviano
2013-01-01
In the field of dental health care, plaster models combined with 2D radiographs are widely used in clinical practice for orthodontic diagnoses. However, complex malocclusions can be better analyzed by exploiting 3D digital dental models, which allow virtual simulations and treatment planning processes. In this paper, dental data captured by independent imaging sensors are fused to create multi-body orthodontic models composed of teeth, oral soft tissues and alveolar bone structures. The methodology is based on integrating Cone-Beam Computed Tomography (CBCT) and surface structured light scanning. The optical scanner is used to reconstruct tooth crowns and soft tissues (visible surfaces) through the digitalization of both patients' mouth impressions and plaster casts. These data are also used to guide the segmentation of internal dental tissues by processing CBCT data sets. The 3D individual dental tissues obtained by the optical scanner and the CBCT sensor are fused within multi-body orthodontic models without human supervision to identify target anatomical structures. The final multi-body models represent valuable virtual platforms for clinical diagnosis and treatment planning. PMID:23385416
Determination and evaluation of acceptable force limits in single-digit tasks.
Nussbaum, Maury A; Johnson, Hope
2002-01-01
Acceptable limits derived from psychophysical methodologies have been proposed, measured, and employed in a range of applications. There is little existing work, however, on such limits for single-digit exertions and relatively limited evidence on several fundamental issues related to data collection and processing of a sequence of self-regulated exertion levels. An experimental study was conducted using 14 male and 10 female participants (age range 18-31 years) from whom maximal voluntary exertions and maximal acceptable limits (MALs) were obtained using the index finger and thumb. Moderate to high levels of consistency were found for both measures between sessions separated by one day. Single MAL values, determined from a time series of exertions, were equivalent across three divergent processing methods and between values obtained from 5- and 25-min samples. A critical interpretation of these and earlier results supports continued use of acceptable limits but also suggests that they should be used with some caution and not equated with safe limits. This research can be applied toward future development of exertion limits based on perceived acceptability.
An open repository of earthquake-triggered ground-failure inventories
Schmitt, Robert G.; Tanyas, Hakan; Nowicki Jessee, M. Anna; Zhu, Jing; Biegel, Katherine M.; Allstadt, Kate E.; Jibson, Randall W.; Thompson, Eric M.; van Westen, Cees J.; Sato, Hiroshi P.; Wald, David J.; Godt, Jonathan W.; Gorum, Tolga; Xu, Chong; Rathje, Ellen M.; Knudsen, Keith L.
2017-12-20
Earthquake-triggered ground failure, such as landsliding and liquefaction, can contribute significantly to losses, but our current ability to accurately include them in earthquake-hazard analyses is limited. The development of robust and widely applicable models requires access to numerous inventories of ground failures triggered by earthquakes that span a broad range of terrains, shaking characteristics, and climates. We present an openly accessible, centralized earthquake-triggered ground-failure inventory repository in the form of a ScienceBase Community to provide open access to these data with the goal of accelerating research progress. The ScienceBase Community hosts digital inventories created by both U.S. Geological Survey (USGS) and non-USGS authors. We present the original digital inventory files (when available) as well as an integrated database with uniform attributes. We also summarize the mapping methodology and level of completeness as reported by the original author(s) for each inventory. This document describes the steps taken to collect, process, and compile the inventories and the process for adding additional ground-failure inventories to the ScienceBase Community in the future.
Enhanced optical alignment of a digital micro mirror device through Bayesian adaptive exploration
NASA Astrophysics Data System (ADS)
Wynne, Kevin B.; Knuth, Kevin H.; Petruccelli, Jonathan
2017-12-01
As the use of Digital Micro Mirror Devices (DMDs) becomes more prevalent in optics research, the ability to precisely locate the Fourier "footprint" of an image beam at the Fourier plane becomes a pressing need. In this approach, Bayesian adaptive exploration techniques were employed to characterize the size and position of the beam on a DMD located at the Fourier plane. The approach couples a Bayesian inference engine with an inquiry engine to implement the search. The inquiry engine explores the DMD by engaging mirrors and recording light intensity values, selecting each probe to maximize the expected information gain. Using the data collected from this exploration, the Bayesian inference engine updates the posterior probability describing the beam's characteristics. The process is iterated until the beam is located to within the desired precision. This methodology not only locates the center and radius of the beam with remarkable precision but also accomplishes the task in far less time than a brute-force search. The approach has applications to system alignment for both Fourier processing and coded aperture design.
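A toy one-dimensional analogue of the exploration loop, under simplifying assumptions (known beam radius, noiseless binary detector): maintain a grid posterior over the beam center and probe the mirror position with the largest expected information gain. This is a sketch of the idea, not the authors' implementation:

```python
import numpy as np

grid = np.linspace(0.0, 1.0, 201)       # candidate beam-center positions
post = np.ones_like(grid) / grid.size   # flat prior over the center
radius = 0.07                            # assumed known beam radius
true_center = 0.63                       # unknown to the algorithm

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

for step in range(8):
    H = entropy(post)
    best_x, best_gain = None, -np.inf
    for x in grid[::5]:                  # coarse set of candidate probes
        lik_hit = (np.abs(x - grid) <= radius).astype(float)
        p_hit = np.sum(post * lik_hit)
        gain = H                          # expected entropy reduction
        for p_out, lik in ((p_hit, lik_hit), (1 - p_hit, 1 - lik_hit)):
            if p_out > 1e-12:
                gain -= p_out * entropy(post * lik / p_out)
        if gain > best_gain:
            best_gain, best_x = gain, x
    hit = abs(best_x - true_center) <= radius     # simulated detector
    lik = (np.abs(best_x - grid) <= radius).astype(float)
    post *= lik if hit else 1.0 - lik
    post /= post.sum()

print("estimated center:", grid[np.argmax(post)])
```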
Agarwal, Smisha; Lefevre, Amnesty E
2017-01-01
Background Despite the rapid proliferation of health interventions that employ digital tools, the evidence on the effectiveness of such approaches remains insufficient and of variable quality. To address existing gaps in the comprehensiveness and quality of reporting on the effectiveness of digital health programs, the mHealth Technical Evidence Review Group (mTERG), convened by the World Health Organization, proposed the mHealth Evidence Reporting and Assessment (mERA) checklist. Objective We present an overview of the mERA checklist and encourage researchers working in the digital health space to use it for reporting their research. Methods The development of the mERA checklist consisted of convening an expert group to recommend an appropriate approach, convening a global expert review panel for checklist development, and pilot-testing the checklist. Results The mERA checklist consists of 16 core mHealth items that define what the mHealth intervention is (content), where it is being implemented (context), and how it was implemented (technical features). Additionally, a 29-item methodology checklist guides authors on reporting critical aspects of the research methodology employed in the study. We recommend that the core mERA checklist be used in conjunction with an appropriate study-design-specific checklist. Conclusions The mERA checklist aims to assist authors in reporting on digital health research, guide reviewers and policymakers in synthesizing evidence, and guide journal editors in assessing the completeness of reporting on digital health studies. An increase in transparent and rigorous reporting can help identify gaps in the conduct of research and in understanding the effects of digital health interventions as a field of inquiry. PMID:28986340
NASA Astrophysics Data System (ADS)
Lehene, T. R.; Samoilă, V.; Soporan, V. F.; Pădurețu, S.; Vescan, M. M.
2018-06-01
The paper presents a methodology for analysing engineering training systems at the casting-manufacture stage through critical engineering thinking. Its use [4, 5] requires the development of procedures capable of responding to the problems engineering training faces in acquiring the necessary tools and procedures. The analysis took the following aspects into consideration: the motivation for using the proposed procedure, considerations on engineering behavior, the design of reasoning adapted to the analysis of engineering training systems, the determination of correlations in the processes of obtaining cast products, the definition and calibration of the digital experiment, and the definition and analysis of the factors influencing the last solidification area (the nature of the alloy, the shape of the mold and the casting geometry).
NASA Astrophysics Data System (ADS)
Halim, N. Z. A.; Sulaiman, S. A.; Talib, K.; Ng, E. G.
2018-02-01
This paper explains the process carried out in identifying the relevant features of the National Digital Cadastral Database (NDCDB) for spatial analysis. The research was initially part of a larger exercise to identify the significance of the NDCDB from the legal, technical, role and land-based analysis perspectives. The research methodology of applying the Delphi technique is discussed in detail in this paper. A heterogeneous panel of 14 experts was created to determine the importance of the NDCDB from the technical-relevance standpoint. Three statements describing the relevant features of the NDCDB for spatial analysis were established after three rounds of consensus building. They highlight the NDCDB's characteristics, such as its spatial accuracy, functions, and criteria as a facilitating tool for spatial analysis. By recognising the relevant features of the NDCDB for spatial analysis, practical applications of the NDCDB for various analyses and purposes can be widely implemented.
Forensic detection of noise addition in digital images
NASA Astrophysics Data System (ADS)
Cao, Gang; Zhao, Yao; Ni, Rongrong; Ou, Bo; Wang, Yongbin
2014-03-01
We propose a technique to detect the global addition of noise to a digital image. As an anti-forensics tool, noise addition is typically used to disguise the visual traces of image tampering or to remove the statistical artifacts left behind by other operations. As such, the blind detection of noise addition has become imperative as well as beneficial for authenticating image content and recovering the image processing history, which is the goal of general forensics techniques. Specifically, special image blocks, including constant and strip ones, are used to construct the features for identifying noise-addition manipulation. The influence of noising on the blockwise pixel value distribution is formulated and analyzed formally. A methodology of detectability recognition followed by binary decision is proposed to ensure the applicability and reliability of noising detection. Extensive experimental results demonstrate the efficacy of the proposed noising detector.
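One simple way to see the block-based intuition: globally added noise destroys the exactly constant blocks that natural or decompressed images tend to contain. The block size and the synthetic test image below are assumptions for illustration, not the paper's calibrated detector:

```python
import numpy as np

def constant_block_fraction(img, b=8):
    """Fraction of non-overlapping bxb blocks with zero intensity range."""
    h, w = img.shape[0] // b * b, img.shape[1] // b * b
    blocks = img[:h, :w].reshape(h // b, b, w // b, b)
    ranges = blocks.max(axis=(1, 3)) - blocks.min(axis=(1, 3))
    return np.mean(ranges == 0)

rng = np.random.default_rng(1)
# Synthetic image of flat vertical bands, then the same image plus noise
clean = np.tile(np.repeat(np.arange(0, 256, 32, dtype=np.uint8), 32), (64, 1))
noisy = np.clip(clean + rng.normal(0, 3, clean.shape), 0, 255).astype(np.uint8)
print(constant_block_fraction(clean), constant_block_fraction(noisy))
```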
NASA Astrophysics Data System (ADS)
Themistocleous, K.; Agapiou, A.; Hadjimitsis, D.
2016-10-01
The documentation of architectural cultural heritage sites has traditionally been expensive and labor-intensive. New innovative technologies, such as Unmanned Aerial Vehicles (UAVs), provide an affordable, reliable and straightforward method of capturing cultural heritage sites, thereby providing a more efficient and sustainable approach to the documentation of cultural heritage structures. In this study, hundreds of images of the Panagia Chryseleousa church in Foinikaria, Cyprus were taken using a UAV with an attached high-resolution camera. The images were processed to generate an accurate digital 3D model using Structure from Motion (SfM) techniques. A Building Information Model (BIM) was then used to generate drawings of the church. The methodology described in the paper provides an accurate, simple and cost-effective method of documenting cultural heritage sites and generating digital 3D models using novel techniques and innovative methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lafata, K; Ren, L; Cai, J
2016-06-15
Purpose: To develop a methodology based on digitally-reconstructed-fluoroscopy (DRF) to quantitatively assess target localization accuracy of lung SBRT, and to evaluate it using both a dynamic digital phantom and a patient dataset. Methods: For each treatment field, a 10-phase DRF is generated based on the planning 4DCT. Each frame is pre-processed with a morphological top-hat filter, and corresponding beam apertures are projected to each detector plane. A template-matching algorithm based on cross-correlation is used to detect the tumor location in each frame. Tumor motion relative to the beam aperture is extracted in the superior-inferior direction based on each frame's impulse response to the template, and the mean tumor position (MTP) is calculated as the average tumor displacement. The DRF template coordinates are then transferred to the corresponding MV-cine dataset, which is retrospectively filtered as above. The treatment MTP is calculated within each field's projection space, relative to the DRF-defined template. The field's localization error is defined as the difference between the DRF-derived MTP (planning) and the MV-cine-derived MTP (delivery). A dynamic digital phantom was used to assess the algorithm's ability to detect intra-fractional changes in patient alignment, by simulating different spatial variations in the MV-cine and calculating the corresponding change in MTP. Inter- and intra-fractional variation, IGRT accuracy, and filtering effects were investigated on a patient dataset. Results: Phantom results demonstrated high accuracy in detecting both translational and rotational variation. The lowest localization error of the patient dataset was achieved at each fraction's first field (mean=0.38mm), with Fx3 demonstrating a particularly strong correlation between intra-fractional motion-caused localization error and treatment progress. Filtering significantly improved tracking visibility in both the DRF and MV-cine images. Conclusion: We have developed and evaluated a methodology to quantify lung SBRT target localization accuracy based on digitally-reconstructed-fluoroscopy. Our approach may be useful in reducing treatment margins to optimize lung SBRT outcomes. R01-184173.
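A minimal sketch of the cross-correlation template-matching step used to locate the target in each frame, with a synthetic frame and template standing in for the fluoroscopy data (normalized cross-correlation over all offsets; the clinical pipeline is more elaborate):

```python
import numpy as np

def match_template(frame, template):
    """Return (row, col) of the best normalized cross-correlation match."""
    th, tw = template.shape
    t = (template - template.mean()) / (template.std() + 1e-12)
    best, pos = -np.inf, (0, 0)
    for r in range(frame.shape[0] - th + 1):
        for c in range(frame.shape[1] - tw + 1):
            patch = frame[r:r + th, c:c + tw]
            p = (patch - patch.mean()) / (patch.std() + 1e-12)
            if np.mean(p * t) > best:
                best, pos = np.mean(p * t), (r, c)
    return pos

rng = np.random.default_rng(2)
frame = rng.normal(0, 1, (40, 40))
template = frame[22:30, 10:18].copy()          # "target" patch
print(match_template(frame, template))          # -> (22, 10)
```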
Bertolaccini, Luca; Rizzardi, Giovanna; Filice, Mary Jo; Terzi, Alberto
2011-05-01
Until now, the only way to report air leaks (ALs) has been with an analogue score in an inherently subjective manner. The Six Sigma quality improvement methodology is a data-driven approach applicable to evaluating the quality of the quantification of repetitive procedures. We applied the Six Sigma concept to improve the process of AL evaluation. A digital device for AL measurement (Drentech PALM, Redax S.r.l., Mirandola (MO), Italy) was applied to 49 consecutive patients who underwent pulmonary intervention, compared with a similar population with classical chest drainage. Data recorded were postoperative AL, chest-tube removal days, number of chest roentgenograms, hospital length of stay, device setup time, average time rating AL and patient satisfaction. Bivariable comparisons were made using the Mann-Whitney test, the χ² test and Fisher's exact test. Analysis of quality was conducted using the Six Sigma methodology. There were no significant differences regarding AL (p=0.075). Although not statistically significant, there was a reduction in postoperative chest X-rays (four vs five) and in hospital length of stay (6.5 vs 7.1 days), and a marginally significant difference was found in chest-tube removal days (p=0.056). There were significant differences regarding device setup time (p=0.001), average time rating AL (p=0.001), inter-observer variability (p=0.001) and patient satisfaction (p=0.002). Six Sigma analyses revealed accurate assessment of AL. Continuous digital measurement of AL reduces the variability of the AL score, gives more assurance for tube removal, and reports AL without the risk of observer error. Efficiency and effectiveness improved with the use of a digital device. We noted that the AL curves actually depict the sealing of ALs. The clinical importance of AL curves requires further study. Copyright © 2010 European Association for Cardio-Thoracic Surgery. Published by Elsevier B.V. All rights reserved.
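For readers unfamiliar with the bivariable testing used here, a sketch of a Mann-Whitney U comparison between the two groups; the numbers are made-up placeholders, not the study's measurements:

```python
from scipy.stats import mannwhitneyu

digital_los = [6, 7, 6, 5, 8, 6, 7, 6]      # hypothetical length of stay (days)
classical_los = [7, 8, 7, 6, 9, 7, 8, 7]

stat, p = mannwhitneyu(digital_los, classical_los, alternative="two-sided")
print(f"U = {stat}, p = {p:.3f}")
```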
Digital-Visual-Sensory-Design Anthropology: Ethnography, Imagination and Intervention
ERIC Educational Resources Information Center
Pink, Sarah
2014-01-01
In this article I outline how a digital-visual-sensory approach to anthropological ethnography might participate in the making of relationship between design and anthropology. While design anthropology is itself coming of age, the potential of its relationship with applied visual anthropology methodology and theory has not been considered in the…
Becoming a Networked Public: Digital Ethnography, Youth and Global Research Collectives
ERIC Educational Resources Information Center
Gallagher, Kathleen; Wessels, Anne; Ntelioglou, Burcu Yaman
2013-01-01
The following article describes a research context that has privileged both virtual and placed-based ethnographic fieldwork, using a hybrid methodology of live and digital communications across school sites in Toronto, Canada; Lucknow, India; Taipei, Taiwan; and Boston, USA. The multi-site ethnographic study is concerned with questions of school…
Digital Methodologies of Education Governance: Pearson plc and the Remediation of Methods
ERIC Educational Resources Information Center
Williamson, Ben
2016-01-01
This article analyses the rise of software systems in education governance, focusing on digital methods in the collection, calculation and circulation of educational data. It examines how software-mediated methods intervene in the ways educational institutions and actors are seen, known and acted upon through an analysis of the methodological…
ERIC Educational Resources Information Center
Mangen, Anne
2010-01-01
This article presents some theoretical-methodological reflections on the current state of the art of research on information and communication technology (ICT) in early childhood education. The implementation of ICT in preschool has triggered considerable research activity on the educational potential of digital technologies. Numerous projects and…
DOT National Transportation Integrated Search
2011-05-01
This report describes an assessment of digital elevation models (DEMs) derived from LiDAR data for a subset of the Ports of Los Angeles and Long Beach. A methodology based on Monte Carlo simulation was applied to investigate the accuracy of DEMs ...
ERIC Educational Resources Information Center
Barber, Wendy; Taylor, Stacey; Buchanan, Sylvia
2014-01-01
The purpose of this paper is to examine a specific online pedagogical tool, "Digital Moments" that can be an effective strategy for building online communities in a knowledge building environment. While the paper will examine the specific techniques and teaching methodologies that enabled the authors to create authentic online learning…
"Baby-Cam" and Researching with Infants: Viewer, Image and (Not) Knowing
ERIC Educational Resources Information Center
Elwick, Sheena
2015-01-01
This article offers a methodological reflection on how "baby-cam" enhanced ethically reflective attitudes in a large-scale research project that set out to research with infants in Australian early childhood education and care settings. By juxtaposing digital images produced by two different digital-camera technologies and drawing on…
Lab at Home: Hardware Kits for a Digital Design Lab
ERIC Educational Resources Information Center
Oliver, J. P.; Haim, F.
2009-01-01
An innovative laboratory methodology for an introductory digital design course is presented. Instead of having traditional lab experiences, where students have to come to school classrooms, a "lab at home" concept is proposed. Students perform real experiments in their own homes, using hardware kits specially developed for this purpose. They…
Integrating Metrics across the Marketing Curriculum: The Digital and Social Media Opportunity
ERIC Educational Resources Information Center
Spiller, Lisa; Tuten, Tracy
2015-01-01
Modern digital and social media formats have revolutionized marketing measurement, producing an abundance of data, meaningful metrics, new tools, and methodologies. This increased emphasis on metrics in the marketing industry signifies the need for increased quantitative and critical thinking content in our marketing coursework if we are to…
Peculiarities of the Digital Divide in Sub-Saharan Africa
ERIC Educational Resources Information Center
Mutula, Stephen M.
2005-01-01
Purpose: Seeks to argue that the peculiarities of sub-Saharan Africa, in terms of its socio-cultural diversity, low economic development, linguistic factors, HIV/AIDS pandemic, gender discrimination, low ICT awareness and so on, demand a new model of addressing the digital divide. Design/methodology/approach: Paper largely based on literature…
A Political Multi-Layered Approach to Researching Children's Digital Literacy Practices
ERIC Educational Resources Information Center
Koutsogiannis, Dimitris
2007-01-01
This paper attempts to present a theoretical framework for researching the out-of-school digital literacy practices of Greek adolescents. The broader aim, however, is to discuss the theoretical and methodological issues concerning research designs to investigate literacy practices in the globalisation era. Based on data representing local and…
The Importance of Theoretical Frameworks and Mathematical Constructs in Designing Digital Tools
ERIC Educational Resources Information Center
Trinter, Christine
2016-01-01
The increase in availability of educational technologies over the past few decades has not only led to new practice in teaching mathematics but also to new perspectives in research, methodologies, and theoretical frameworks within mathematics education. Hence, the amalgamation of theoretical and pragmatic considerations in digital tool design…
NASA Astrophysics Data System (ADS)
Prosdocimi, Massimo; Calligaro, Simone; Sofia, Giulia; Tarolli, Paolo
2015-04-01
Throughout the world, agricultural landscapes are of great importance, especially for supplying food and a livelihood. Among land degradation phenomena, erosion processes caused by water are those that may most affect the benefits provided by agricultural lands and endanger the people who work and live there. In particular, erosion processes that affect the banks of agricultural channels may cause bank failure and represent, in this way, a severe threat to floodplain inhabitants and agricultural crops. Similarly, rills and gullies are critical soil erosion processes as well, because they bear upon the productivity of a farm and represent a cost that growers have to deal with. To quantitatively estimate soil losses due to bank erosion and rill processes, area-based measurements of surface changes are necessary but sometimes difficult to realize: surface changes due to short-term events have to be represented at fine resolution, and their monitoring may demand too much money and time. The main objective of this work is to show the effectiveness of a user-friendly and low-cost technique, which may even rely on smartphones, for the post-event analysis of (i) bank erosion affecting agricultural channels and (ii) rill processes occurring on an agricultural plot. Two case studies were selected, located in the Veneto floodplain (northeast Italy) and the Marche countryside (central Italy), respectively. The work is based on high-resolution topographic data obtained by the emerging, low-cost photogrammetric method named Structure-from-Motion (SfM). Extensive photosets of the case studies were obtained using both standalone reflex digital cameras and smartphone built-in cameras. Digital Terrain Models (DTMs) derived from SfM proved effective for quantitatively estimating erosion volumes and, in the bank erosion case, deposited material as well. SfM applied to pictures taken by smartphones is useful for analyzing topography and Earth surface processes at very low cost. This methodology should be of great help to farmers and technicians who work at Land Reclamation Consortia or Civil Protection agencies in carrying out suitable post-event field surveys in support of flood-risk and soil management.
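A sketch of the quantitative step this methodology enables: difference two SfM-derived DTMs and sum cell volumes to estimate erosion and deposition. Grids, cell size, and level of detection are synthetic placeholders:

```python
import numpy as np

def dod_volumes(dtm_pre, dtm_post, cell_area_m2, lod=0.02):
    """DEM-of-difference volumes; changes below the lod (m) are ignored."""
    dz = dtm_post - dtm_pre
    dz[np.abs(dz) < lod] = 0.0
    erosion = -dz[dz < 0].sum() * cell_area_m2     # m^3 removed
    deposition = dz[dz > 0].sum() * cell_area_m2   # m^3 added
    return erosion, deposition

rng = np.random.default_rng(3)
pre = np.zeros((50, 50))
post = pre + rng.normal(0, 0.005, pre.shape)       # survey noise
post[20:30, 10:35] -= 0.12                         # simulated rill incision
print(dod_volumes(pre, post, cell_area_m2=0.01))   # ~0.3 m^3 eroded
```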
Digital Architecture – Results From a Gap Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oxstrand, Johanna Helene; Thomas, Kenneth David; Fitzgerald, Kirk
The digital architecture is defined as a collection of IT capabilities needed to support and integrate a wide spectrum of real-time digital capabilities for nuclear power plant performance improvements. The digital architecture can be thought of as an integration of the separate I&C and information systems already in place in NPPs, brought together for the purpose of creating new levels of automation in NPP work activities. In some cases, it might be an extension of the current communication systems to provide digital communications where they are currently analog only. This collection of IT capabilities must in turn be based on a set of user requirements that must be supported for the interconnected technologies to operate in an integrated manner. These requirements, simply put, are a statement of what sorts of digital work functions will be exercised in a fully implemented, seamless digital environment and how much they will be used. The goal of the digital architecture research is to develop a methodology for mapping nuclear power plant operational and support activities into the digital architecture, which includes the development of a consensus model for an advanced information and control architecture. The consensus model should be developed at a level of detail that is useful to the industry: not so detailed that it specifies particular protocols, and not so vague that it provides only a high-level description of technology. The next step towards the model development is to determine the current state of digital architecture at typical NPPs. To investigate the current state, the researchers conducted a gap analysis to determine to what extent NPPs can support the future digital technology environment with their existing I&C and IT structure, and where gaps exist with respect to the full deployment of technology over time. The methodology, results, and conclusions from the gap analysis are described in this report.
Mapping coastal morphodynamics with geospatial techniques, Cape Henry, Virginia, USA
NASA Astrophysics Data System (ADS)
Allen, Thomas R.; Oertel, George F.; Gares, Paul A.
2012-01-01
The advent and proliferation of digital terrain technologies have spawned concomitant advances in coastal geomorphology. Airborne topographic Light Detection and Ranging (LiDAR) has stimulated a renaissance in coastal mapping, and field-based mapping techniques have benefitted from improvements in real-time kinematic (RTK) Global Positioning System (GPS). Varied methodologies for mapping suggest a need to match geospatial products to geomorphic forms and processes, a task that should consider product and process ontologies from each perspective. Towards such synthesis, coastal morphodynamics on a cuspate foreland are reconstructed using spatial analysis. Sequential beach ridge and swale topography are mapped using photogrammetric spot heights and airborne LiDAR data and integrated with digital bathymetry and large-scale vector shoreline data. Isobaths from bathymetric charts were digitized to determine slope and toe depth of the modern shoreface and a reconstructed three-dimensional antecedent shoreface. Triangulated irregular networks were created for the subaerial cape and subaqueous shoreface models of the cape beach ridges and sets for volumetric analyses. Results provide estimates of relative age and progradation rate and corroborate other paleogeologic sea-level rise data from the region. Swale height elevations and other measurements quantifiable in these data provide several parameters suitable for studying coastal geomorphic evolution. Mapped paleoshorelines and volumes suggest the Virginia Beach coastal compartment is related to embryonic spit development from a late Holocene shoreline located some 5 km east of the current beach.
NASA Astrophysics Data System (ADS)
Palestini, C.; Basso, A.
2017-11-01
In recent years, increased international investment in hardware and software that supports photomodeling algorithms or laser-scanner data management has significantly reduced the cost of operations supporting Augmented Reality and Virtual Reality, which are designed to generate real-time explorable digital environments integrated with virtual stereoscopic headsets. The research analyzes transversal methodologies related to the acquisition of these technologies in order to intervene directly in the adoption of current VR tools within a specific workflow, in light of issues related to the intensive use of such devices. It outlines a quick overview of a possible "virtual migration" phenomenon, assuming an eventual integration with new high-speed internet systems capable of triggering a massive colonization of cyberspace that, paradoxically, would also affect everyday life and, more generally, human spatial perception. The contribution aims to analyze the application systems used for low-cost 3D photogrammetry by means of a precise pipeline, clarifying how a 3D model is generated, automatically retopologized, textured by color painting or photo-cloning techniques, and optimized for parametric insertion into virtual exploration platforms. The workflow analysis follows case studies related to photomodeling, digital retopology and the "virtual 3D transfer" of some small archaeological artifacts and of an architectural compartment corresponding to the pronaos of Aurum, a building designed in the 1940s by Michelucci. All operations are conducted with cheap or free-licence software that today offers almost the same performance as its paid counterparts, progressively improving in data-processing speed and management.
NASA Astrophysics Data System (ADS)
Detrick, R. S.; Clark, D.; Gaylord, A.; Goldsmith, R.; Helly, J.; Lemmond, P.; Lerner, S.; Maffei, A.; Miller, S. P.; Norton, C.; Walden, B.
2005-12-01
The Scripps Institution of Oceanography (SIO) and the Woods Hole Oceanographic Institution (WHOI) have joined forces with the San Diego Supercomputer Center to build a testbed for multi-institutional archiving of shipboard and deep submergence vehicle data. Support has been provided by the Digital Archiving and Preservation program funded by NSF/CISE and the Library of Congress. In addition to the more than 92,000 objects stored in the SIOExplorer Digital Library, the testbed will provide access to data, photographs, video images and documents from WHOI ships, Alvin submersible and Jason ROV dives, and deep-towed vehicle surveys. An interactive digital library interface will allow combinations of distributed collections to be browsed, metadata inspected, and objects displayed or selected for download. The digital library architecture, and the search and display tools of the SIOExplorer project, are being combined with WHOI tools, such as the Alvin Framegrabber and the Jason Virtual Control Van, that have been designed using WHOI's GeoBrowser to handle the vast volumes of digital video and camera data generated by Alvin, Jason and other deep submergence vehicles. Notions of scalability will be tested, as data volumes range from 3 CDs per cruise to 200 DVDs per cruise. Much of the scalability of this proposal comes from an ability to attach digital library data and metadata acquisition processes to diverse sensor systems. We are able to run an entire digital library from a laptop computer as well as from supercomputer-center-size resources. It can be used, in the field, laboratory or classroom, covering data from acquisition-to-archive using a single coherent methodology. The design is an open architecture, supporting applications through well-defined external interfaces maintained as an open-source effort for community inclusion and enhancement.
Common modeling system for digital simulation
NASA Technical Reports Server (NTRS)
Painter, Rick
1994-01-01
The Joint Modeling and Simulation System (J-MASS) is a tri-service investigation into a common framework for the development of digital models. The basis for the success of this framework is an X-window-based, open-systems architecture and an object-based/oriented methodology with a standard-interface approach to digital model construction, configuration, execution, and post-processing. For years, Department of Defense (DOD) agencies have produced various weapon systems/technologies and, typically, digital representations of those systems/technologies. These digital representations (models) have also been developed for other reasons, such as studies and analysis or Cost Effectiveness Analysis (COEA) tradeoffs. Unfortunately, there have been no Modeling and Simulation (M&S) standards, guidelines, or efforts towards commonality in DOD M&S. The typical scenario is that an organization hires a contractor to build hardware, and in doing so a digital model may be constructed. Until recently, this model was not even obtained by the organization. Even if it was procured, it was on a unique platform, in a unique language, with unique interfaces, and, as a result, unique maintenance was required. Additionally, the constructors of the model expended more effort in writing the 'infrastructure' of the model/simulation (e.g., user interface, database/database management system, data journaling/archiving, graphical presentations, environment characteristics, other components in the simulation, etc.) than in producing the model of the desired system. Other side effects include duplication of effort, varying assumptions, lack of credibility/validation, and decentralization in policy and execution. J-MASS provides the infrastructure, standards, toolset, and architecture to permit M&S developers and analysts to concentrate on their area of interest.
NASA Technical Reports Server (NTRS)
Montgomery, O. L.
1977-01-01
Procedures developed for digitizing the transportation arteries, airports, and dock facilities of Alabama and placing them in a computerized format compatible with the Alabama Resource Information System are described. The time required to digitize by each of the following methods was evaluated: (a) manual, (b) the Telereadex 29 film reading and digitizing system, and (c) digitizing tablets. A method for digitizing and storing information from the UTM grid-cell base that was compatible with the system was developed and tested. The highways, navigable waterways, railroads, airports, and docks in the study area were digitized and the data stored. The manual method of digitizing was shown to be best for small amounts of data, while graphic input from digitizing tablets would be the best approach for entering the large amounts of data required for an entire state.
The influence of biological sex, sexuality and gender role on interpersonal distance.
Uzzell, David; Horne, Nathalie
2006-09-01
This research reports on a conceptually and methodologically innovative study, which sought to measure the influence of gender on interpersonal distance. In so doing, we argue for an important distinction to be made between biological sex, gender role, and sexuality. To date, however, progress in the study of interpersonal distance (IPD) has been inhibited by poor operational definitions and inadequate measurement methodologies. For our own investigation, we innovated on methodology by devising the digital video-recording IPD method (DiVRID) that records interpersonal spatial relationships using high quality digital video equipment. The findings highlighted not only the validity of our innovative method of investigation, but also that a more sophisticated conceptualization of the impact of gender on IPD is warranted than can be accounted for by biological sex differences. In this study, we found that gender role accounts for more of the variation in IPD than the conventionally reported gender variable, sex.
Library and Information Resources and Users of Digital Resources in the Humanities
ERIC Educational Resources Information Center
Warwick, Claire; Terras, Melissa; Galina, Isabel; Huntington, Paul; Pappa, Nikoleta
2008-01-01
Purpose: The purpose of this article is to discuss the results of the Log Analysis of Internet Resources in the Arts and Humanities (LAIRAH) study. It aims to concentrate upon the use and importance of information resources, physical research centres and digital finding aids in scholarly research. Design/methodology/approach: Results are presented…
ERIC Educational Resources Information Center
Li, Lan; Worch, Eric; Zhou, YuChun; Aguiton, Rhonda
2015-01-01
While teachers' conservative attitude toward technology has been identified as a barrier to effective technology integration in classrooms, it is often optimistically assumed that this issue will resolve when the digital generation enters the teaching profession (Morris, 2012). Using a mixed methodology approach, this study aimed to examine the…
Long-Term Preservation of Digital Information in China: Some Problems and Solutions
ERIC Educational Resources Information Center
Liu, Jiazhen; Du, Peng
2009-01-01
Purpose: The purpose of this paper to describe the research work on the long-term preservation of Chinese digital information funded by National Natural Science Foundation of China (NSFC) since 2001. Design/methodology/approach: The paper provides an overview, in text and figures, of ways in which e-documents originating in China, in now obsolete…
NASA Technical Reports Server (NTRS)
Mackall, D. A.; Ishmael, S. D.; Regenie, V. A.
1983-01-01
Qualification considerations for assuring the safety of a life-critical digital flight control system include four major areas: systems interactions, verification, validation, and configuration control. The AFTI/F-16 design, development, and qualification illustrate these considerations. In this paper, qualification concepts, procedures, and methodologies are discussed and illustrated through specific examples.
Data Manipulation in an XML-Based Digital Image Library
ERIC Educational Resources Information Center
Chang, Naicheng
2005-01-01
Purpose: To help to clarify the role of XML tools and standards in supporting transition and migration towards a fully XML-based environment for managing access to information. Design/methodology/approach: The Ching Digital Image Library, built on a three-tier architecture, is used as a source of examples to illustrate a number of methods of data…
ERIC Educational Resources Information Center
Mohsenzadeh, Faranak; Isfandyari-Moghaddam, Alireza
2011-01-01
Purpose: The present research aims to identify the difficulties and obstacles for developing digital libraries in the seven regional branches of Islamic Azad University (IAU), Iran, and to study the status of librarians' skills and education programmes at these institutions. Design/methodology/approach: The 40 individuals working in the regional…
Status of the Preservation of Digital Resources in China: Results of a Survey
ERIC Educational Resources Information Center
Jiazhen, Liu; Daoling, Yang
2007-01-01
Purpose: To obtain first-hand data on the main challenges in preserving digital resources in libraries, archives and information centres in China. Design/methodology/approach: The data in this paper have been acquired by e-mail questionnaire. The conclusions are based on feedback from 57 respondents, distributed in 14 provinces in China, who work…
Reorienting Self-Directed Learning for the Creative Digital Era
ERIC Educational Resources Information Center
Karakas, Fahri; Manisaligil, Alperen
2012-01-01
Purpose: The purpose of this paper is to identify the new role that human resource developers play in the globally connected workplace. Towards that end, this paper explores the changing landscape of self-directed learning (SDL) within the digital ecosystem based on the concept of World 2.0. Design/methodology/approach: This paper reviews and…
Barratt, Monica J; Potter, Gary R; Wouters, Marije; Wilkins, Chris; Werse, Bernd; Perälä, Jussi; Pedersen, Michael Mulbjerg; Nguyen, Holly; Malm, Aili; Lenton, Simon; Korf, Dirk; Klein, Axel; Heyde, Julie; Hakkarainen, Pekka; Frank, Vibeke Asmussen; Decorte, Tom; Bouchard, Martin; Blok, Thomas
2015-03-01
Internet-mediated research methods are increasingly used to access hidden populations. The International Cannabis Cultivation Questionnaire (ICCQ) is an online survey designed to facilitate international comparisons into the relatively under-researched but increasingly significant phenomenon of domestic cannabis cultivation. The Global Cannabis Cultivation Research Consortium has used the ICCQ to survey over 6000 cannabis cultivators across 11 countries. In this paper, we describe and reflect upon our methodological approach, focusing on the digital and traditional recruitment methods used to access this hidden population and the challenges of working across multiple countries, cultures and languages. Descriptive statistics show eligibility and completion rates and recruitment source by country of residence. Over three quarters of eligible respondents who were presented with the survey were included in the final sample of n=6528. English-speaking countries expended more effort to recruit participants than non-English-speaking countries. The most effective recruitment modes were cannabis websites/groups (33%), Facebook (14%) and news articles (11%). While respondents recruited through news articles were older, growing-practice variables were strikingly similar across these main recruitment modes. Through this process, we learnt that there are trade-offs between hosting separate surveys in each country and using one integrated database. We also found that although perceived anonymity is routinely assumed to be a benefit of digital research methodologies, there are significant limits to research-participant anonymity in the current era of mass digital surveillance, especially when the target group is particularly concerned about evading law enforcement. Finally, we list a number of specific recommendations for future researchers using Internet-mediated approaches to research hidden populations. Copyright © 2014 Elsevier B.V. All rights reserved.
Generation of 2D Land Cover Maps for Urban Areas Using Decision Tree Classification
NASA Astrophysics Data System (ADS)
Höhle, J.
2014-09-01
A 2D land cover map can be generated automatically and efficiently from high-resolution multispectral aerial images. First, a digital surface model is produced, and each cell of the elevation model is supplemented with attributes. A decision tree classification is then applied to extract map objects such as buildings, roads, grassland, trees, hedges, and walls from this "intelligent" point cloud. The decision tree is derived from training areas whose borders are digitized on top of a false-colour orthoimage. The produced 2D land cover map with six classes is subsequently refined using image analysis techniques. The proposed methodology is described step by step. The classification, assessment, and refinement are carried out with the open-source software "R"; the dense and accurate digital surface model is generated with the "Match-T DSM" program of the Trimble Company. A practical example of 2D land cover map generation is carried out using images of a multispectral medium-format aerial camera covering an urban area in Switzerland. The assessment of the produced land cover map is based on class-wise stratified sampling, where reference values of samples are determined by means of stereo-observations of false-colour stereopairs. The stratified statistical assessment of the produced six-class land cover map, based on 91 points per class, reveals a high thematic accuracy for the classes "building" (99%, 95% CI: 95%-100%) and "road and parking lot" (90%, 95% CI: 83%-95%). Other accuracy measures (overall accuracy, kappa value) and their 95% confidence intervals are derived as well. The proposed methodology has a high potential for automation and fast processing and may be applied to other scenes and sensors.
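A minimal sketch of the decision-tree classification step, assuming synthetic per-cell attributes (height above terrain, a vegetation index) in place of the full attribute set derived from the "intelligent" point cloud:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(4)
n = 300
height = rng.uniform(0, 15, n)            # height above terrain (m)
ndvi = rng.uniform(-0.2, 0.9, n)          # vegetation-index stand-in
# Noise-free labeling rule standing in for digitized training areas
y = np.where(ndvi > 0.4, np.where(height > 2, "tree", "grass"),
             np.where(height > 2, "building", "road"))
X = np.column_stack([height, ndvi])

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(clf.predict([[8.0, 0.1], [0.5, 0.7]]))   # -> ['building' 'grass']
```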
ERIC Educational Resources Information Center
Sandieson, Robert W.; Kirkpatrick, Lori C.; Sandieson, Rachel M.; Zimmerman, Walter
2010-01-01
Digital technologies enable the storage of vast amounts of information, accessible with remarkable ease. However, along with this facility comes the challenge to find pertinent information from the volumes of nonrelevant information. The present article describes the pearl-harvesting methodological framework for information retrieval. Pearl…
Digital Socrates: A System for Disseminating and Evaluating Best Practices in Education
ERIC Educational Resources Information Center
McEachron, D. L.; Bach, C.; Sualp, M.
2012-01-01
Purpose: The purpose of this paper is to examine existing learning innovation systems and propose a systematic methodology of delivering educational innovations in the right amount, in the right place and at the right time. Design/methodology/approach: Higher education is not effectively incorporating new discoveries in cognitive science and human…
Technological Leverage in Higher Education: An Evolving Pedagogy
ERIC Educational Resources Information Center
Pillai, K. Rajasekharan; Prakash, Ashish Viswanath
2017-01-01
Purpose: The purpose of the study is to analyse the perception of students toward a computer-based exam on a custom-made digital device and their willingness to adopt the same for high-stake summative assessment. Design/methodology/approach: This study followed an analytical methodology using survey design. A modified version of students'…
Lewis, George K; Lewis, George K; Olbricht, William
2008-01-01
This paper explains the circuitry and signal processing used to perform electrical impedance spectroscopy on piezoelectric materials and ultrasound transducers. Here, we measure and compare the impedance spectra of 2-5 MHz piezoelectrics, but the methodology applies to 700 kHz-20 MHz ultrasonic devices as well. Using a 12 ns wide, 5 volt pulsing circuit as an impulse, we determine the electrical impedance curves experimentally using Ohm's law and the fast Fourier transform (FFT), and compare the results with mathematical models. The method allows rapid impedance measurement over a range of frequencies using a narrow input pulse, a digital oscilloscope and FFT techniques. The technique compares well to current methodologies such as network and impedance analyzers while providing additional versatility in the electrical impedance measurement. The technique is theoretically simple, easy to implement, and can be completed with ordinary laboratory instrumentation at minimal cost. PMID:19081773
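The heart of the method is the spectral division Z(f) = V(f)/I(f). A sketch with a purely resistive load, so the expected answer is known in advance; the sampling rate, pulse, and 75 Ω load are assumptions for illustration:

```python
import numpy as np

fs = 1e9                                  # assumed 1 GS/s digitizer
t = np.arange(4096) / fs
v = np.where(t < 12e-9, 5.0, 0.0)         # 12 ns, 5 V excitation pulse
R = 75.0                                  # hypothetical resistive load
i = v / R                                 # Ohm's law in the time domain

V, I = np.fft.rfft(v), np.fft.rfft(i)
f = np.fft.rfftfreq(t.size, 1 / fs)
lo = slice(1, 6)                          # a few low-frequency bins
print(f[lo] / 1e6)                        # frequencies in MHz
print(np.abs(V[lo] / I[lo]))              # ~75 ohm, flat for a resistor
```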
NASA Technical Reports Server (NTRS)
Halyo, Nesim
1987-01-01
A combined stochastic feedforward and feedback control design methodology was developed. The objective of the feedforward control law is to track the commanded trajectory, whereas the feedback control law tries to maintain the plant state near the desired trajectory in the presence of disturbances and uncertainties about the plant. The feedforward control law design is formulated as a stochastic optimization problem and is embedded into the stochastic output feedback problem where the plant contains unstable and uncontrollable modes. An algorithm to compute the optimal feedforward is developed. In this approach, the use of error integral feedback, dynamic compensation, control rate command structures are an integral part of the methodology. An incremental implementation is recommended. Results on the eigenvalues of the implemented versus designed control laws are presented. The stochastic feedforward/feedback control methodology is used to design a digital automatic landing system for the ATOPS Research Vehicle, a Boeing 737-100 aircraft. The system control modes include localizer and glideslope capture and track, and flare to touchdown. Results of a detailed nonlinear simulation of the digital control laws, actuator systems, and aircraft aerodynamics are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Livescu, Veronica; Bronkhorst, Curt Allan; Vander Wiel, Scott Alan
Many challenges exist with regard to understanding and representing the complex physical processes involved in ductile damage and failure of polycrystalline metallic materials. Currently, the ability to accurately predict the macroscale ductile damage and failure response of metallic materials is lacking. Research at Los Alamos National Laboratory (LANL) is aimed at building a coupled experimental and computational methodology that supports the development of predictive damage capabilities by capturing real distributions of microstructural features from real material and implementing them as digitally generated microstructures in damage model development, and by distilling structure-property information to link microstructural details to damage evolution under a multitude of loading states.
Statechart-based design controllers for FPGA partial reconfiguration
NASA Astrophysics Data System (ADS)
Łabiak, Grzegorz; Wegrzyn, Marek; Rosado Muñoz, Alfredo
2015-09-01
Statechart diagrams and the UML technique can be a vital part of early conceptual modeling. At present, hardware design methodologies offer little support for the reconfiguration features of reprogrammable devices. The authors try to bridge the gap between an imprecise UML model and a formal HDL description. The key concept in the authors' proposal is to describe the behavior of the digital controller by statechart diagrams and to map parts of that behavior into reprogrammable logic by means of groups of states that form sequential automata. The whole process is illustrated by an example with experimental results.
NASA Astrophysics Data System (ADS)
Marconi, S.; Orfanelli, S.; Karagounis, M.; Hemperek, T.; Christiansen, J.; Placidi, P.
2017-02-01
A dedicated power analysis methodology, based on modern digital design tools and integrated with the VEPIX53 simulation framework developed within the RD53 collaboration, is being used to guide vital choices in the design and optimization of the next-generation ATLAS and CMS pixel chips and their critical serial powering circuit (shunt-LDO). Power consumption is studied at different stages of the design flow under different operating conditions. Significant effort is put into extensive investigation of dynamic power variations in relation to the decoupling seen by the powering network. Shunt-LDO simulations are also reported to demonstrate reliability at the system level.
NASA Technical Reports Server (NTRS)
Noll, Thomas E.; Perry, Boyd, III; Tiffany, Sherwood H.; Cole, Stanley R.; Buttrill, Carey S.; Adams, William M., Jr.; Houck, Jacob A.; Srinathkumar, S.; Mukhopadhyay, Vivek; Pototzky, Anthony S.
1989-01-01
The status of the joint NASA/Rockwell Active Flexible Wing Wind-Tunnel Test Program is described. The objectives are to develop and validate the analysis, design, and test methodologies required to apply multifunction active control technology for improving aircraft performance and stability. Major tasks include designing digital multi-input/multi-output flutter-suppression and rolling-maneuver-load alleviation concepts for a flexible full-span wind-tunnel model, obtaining an experimental data base for the basic model and each control concept and providing comparisons between experimental and analytical results to validate the methodologies. The opportunity is provided to improve real-time simulation techniques and to gain practical experience with digital control law implementation procedures.
Flexible and unique representations of two-digit decimals.
Zhang, Li; Chen, Min; Lin, Chongde; Szűcs, Denes
2014-09-01
We examined the representation of two-digit decimals through studying distance and compatibility effects in magnitude comparison tasks in four experiments. Using number pairs with different leftmost digits, we found both the second digit distance effect and compatibility effect with two-digit integers but only the second digit distance effect with two-digit pure decimals. This suggests that both integers and pure decimals are processed in a compositional manner. In contrast, neither the second digit distance effect nor the compatibility effect was observed in two-digit mixed decimals, thereby showing no evidence for compositional processing of two-digit mixed decimals. However, when the relevance of the rightmost digit processing was increased by adding some decimals pairs with the same leftmost digits, both pure and mixed decimals produced the compatibility effect. Overall, results suggest that the processing of decimals is flexible and depends on the relevance of unique digit positions. This processing mode is different from integer analysis in that two-digit mixed decimals demonstrate parallel compositional processing only when the rightmost digit is relevant. Findings suggest that people probably do not represent decimals by simply ignoring the decimal point and converting them to natural numbers. Copyright © 2014 Elsevier B.V. All rights reserved.
Processing LiDAR Data to Predict Natural Hazards
NASA Technical Reports Server (NTRS)
Fairweather, Ian; Crabtree, Robert; Hager, Stacey
2008-01-01
ELF-Base and ELF-Hazards (wherein 'ELF' signifies 'Extract LiDAR Features' and 'LiDAR' signifies 'light detection and ranging') are developmental software modules for processing remote-sensing LiDAR data to identify past natural hazards (principally, landslides) and predict future ones. ELF-Base processes raw LiDAR data, including LiDAR intensity data that are often ignored in other software, to create digital terrain models (DTMs) and digital feature models (DFMs) with sub-meter accuracy. ELF-Hazards fuses raw LiDAR data, data from multispectral and hyperspectral optical images, and DTMs and DFMs generated by ELF-Base to generate hazard risk maps. Advanced algorithms in these software modules include line-enhancement and edge-detection algorithms, surface-characterization algorithms, and algorithms that implement innovative data-fusion techniques. The line-extraction and edge-detection algorithms enable users to locate such features as faults and landslide headwall scarps. Also implemented in this software are improved methodologies for identification and mapping of past landslide events by use of (1) accurate, ELF-derived surface characterizations and (2) three LiDAR/optical-data-fusion techniques: post-classification data fusion, maximum-likelihood estimation modeling, and hierarchical within-class discrimination. This software is expected to enable faster, more accurate forecasting of natural hazards than has previously been possible.
Medeiros, Stephen; Hagen, Scott; Weishampel, John; ...
2015-03-25
Digital elevation models (DEMs) derived from airborne lidar are traditionally unreliable in coastal salt marshes due to the inability of the laser to penetrate the dense grasses and reach the underlying soil. To that end, we present a novel processing methodology that uses ASTER Band 2 (visible red), an interferometric SAR (IfSAR) digital surface model, and lidar-derived canopy height to classify biomass density using both a three-class scheme (high, medium and low) and a two-class scheme (high and low). Elevation adjustments associated with these classes using both median and quartile approaches were applied to adjust lidar-derived elevation values closer to true bare earth elevation. The performance of the method was tested on 229 elevation points in the lower Apalachicola River Marsh. The two-class quartile-based adjusted DEM produced the best results, reducing the RMS error in elevation from 0.65 m to 0.40 m, a 38% improvement. The raw mean errors for the lidar DEM and the adjusted DEM were 0.61 ± 0.24 m and 0.32 ± 0.24 m, respectively, thereby reducing the high bias by approximately 49%.
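The adjustment itself is a per-class bias subtraction; below is a minimal numpy sketch in which the class labels and correction values are hypothetical stand-ins for the medians/quartiles derived in the study.

import numpy as np

# Hypothetical per-class elevation corrections (m), e.g. derived from the
# first quartile of lidar error within each biomass-density class.
corrections = {"high": 0.45, "low": 0.20}  # assumed values, for illustration

def adjust_dem(lidar_dem, density_class, corrections):
    """Subtract a class-specific bias from a lidar DEM.

    lidar_dem     : 2-D array of lidar-derived elevations (m)
    density_class : 2-D array of strings labeling each cell's biomass class
    """
    adjusted = lidar_dem.copy()
    for cls, corr in corrections.items():
        adjusted[density_class == cls] -= corr
    return adjusted

def rms_error(dem, survey_points):
    """RMS error of DEM values against surveyed bare-earth elevations."""
    rows, cols, z_true = survey_points  # cell indices and true elevations
    return float(np.sqrt(np.mean((dem[rows, cols] - z_true) ** 2)))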
The impact of the condenser on cytogenetic image quality in digital microscope system.
Ren, Liqiang; Li, Zheng; Li, Yuhua; Zheng, Bin; Li, Shibo; Chen, Xiaodong; Liu, Hong
2013-01-01
Optimizing operational parameters of the digital microscope system is an important technique to acquire high quality cytogenetic images and facilitate the process of karyotyping so that the efficiency and accuracy of diagnosis can be improved. This study investigated the impact of the condenser on cytogenetic image quality and system working performance using a prototype digital microscope image scanning system. Both theoretical analysis and experimental validations, through objectively evaluating a resolution test chart and subjectively observing large numbers of specimens, were conducted. The results show that optimal image quality and a large depth of field (DOF) are simultaneously obtained when the numerical aperture of the condenser is set to 60%-70% of that of the corresponding objective. Under this condition, more analyzable chromosomes and diagnostic information are obtained. As a result, the system shows higher working stability and fewer restrictions on the implementation of algorithms such as autofocusing, especially when the system is designed to achieve high throughput continuous image scanning. Although the above quantitative results were obtained using a specific prototype system under the experimental conditions reported in this paper, the presented evaluation methodologies can provide valuable guidelines for optimizing operational parameters in cytogenetic imaging using high throughput continuous scanning microscopes in clinical practice.
Supervised Coursework as a Way of Improving Motivation in the Learning of Digital Electronics
ERIC Educational Resources Information Center
Rengel, R.; Martin, M. J.; Vasallo, B. G.
2012-01-01
This paper presents a series of activities and educational strategies related to the teaching of digital electronics in computer engineering. The main objective of these methodologies was to develop a final tutored coursework to be carried out by the students in small teams. This coursework was conceived as consisting of advanced problems or small…
Replicas in Cultural Heritage: 3d Printing and the Museum Experience
NASA Astrophysics Data System (ADS)
Ballarin, M.; Balletti, C.; Vernier, P.
2018-05-01
3D printing has recently seen massive diffusion across several applications, not least in the field of Cultural Heritage. Because they are used for different purposes, such as study, analysis, conservation or access in museum exhibitions, 3D printed replicas need to undergo a process of validation, also in terms of metrical precision and accuracy. The Laboratory of Photogrammetry of Iuav University of Venice has started several collaborations with Italian museum institutions, first for the digital acquisition and then for the physical reproduction of objects of historical and artistic interest. The aim of the research is to analyse the metric characteristics of the printed model in relation to the original data, and to optimize the process that leads from the survey to the physical representation of an object. The object can be acquired through different methodologies with different precisions (multi-image photogrammetry, TOF laser scanners, triangulation-based laser scanners), and acquisition always involves a long processing phase. It should not be forgotten that the digital data have to undergo a series of simplifications which, on the one hand, eliminate the noise introduced by the acquisition process but, on the other, can lead to discrepancies between the physical copy and the original geometry. In this paper we show the results obtained on a small archaeological find that was acquired and reproduced for a museum exhibition intended for blind and partially sighted people.
Model based design introduction: modeling game controllers to microprocessor architectures
NASA Astrophysics Data System (ADS)
Jungwirth, Patrick; Badawy, Abdel-Hameed
2017-04-01
We present an introduction to model based design. Model based design is a visual representation, generally a block diagram, used to model and incrementally develop a complex system. Model based design is a commonly used design methodology for digital signal processing, control systems, and embedded systems. Model based design's philosophy is to solve a problem a step at a time. The approach can be compared to a series of steps converging to a solution. A block diagram simulation tool allows a design to be simulated with real world measurement data. For example, if an analog control system is being upgraded to a digital control system, the analog sensor input signals can be recorded. The digital control algorithm can then be simulated with the real world sensor data, and the output from the simulated digital control system can be compared to that of the old analog control system. Model based design can be compared to Agile software development. The Agile software development goal is to develop working software in incremental steps, with progress measured in completed and tested code units. In model based design, progress is measured in completed and tested blocks. We present a concept for a video game controller and then use model based design to iterate the design toward a working system. We also describe a model based design effort to develop an OS Friendly Microprocessor Architecture based on RISC-V.
High-accuracy 3-D modeling of cultural heritage: the digitizing of Donatello's "Maddalena".
Guidi, Gabriele; Beraldin, J Angelo; Atzeni, Carlo
2004-03-01
Three-dimensional digital modeling of Heritage works of art through optical scanners has been demonstrated in recent years with results of exceptional interest. However, the routine application of three-dimensional (3-D) modeling to Heritage conservation still requires the systematic investigation of a number of technical problems. In this paper, the acquisition process of the 3-D digital model of the Maddalena by Donatello, a wooden statue representing one of the major masterpieces of the Italian Renaissance, which was swept away by the Florence flood of 1966 and subsequently restored, is described. The paper reports all the steps of the acquisition procedure, from project planning to the solution of the various problems due to range-camera calibration and to materials that are not optically cooperative. Since the scientific focus is centered on the overall dimensional accuracy of the 3-D model, a methodology for its quality control is described. This control has demonstrated how, in some situations, ICP-based alignment can lead to incorrect results. To circumvent this difficulty we propose an alignment technique based on the fusion of ICP with close-range digital photogrammetry and a non-invasive procedure in order to generate a final accurate model. In the end, detailed results are presented, demonstrating the improvement of the final model and how the proposed sensor fusion ensures a pre-specified level of accuracy.
Agarwal, Smisha; Lefevre, Amnesty E; Labrique, Alain B
2017-10-06
Despite the rapid proliferation of health interventions that employ digital tools, the evidence on the effectiveness of such approaches remains insufficient and of variable quality. To address gaps in the comprehensiveness and quality of reporting on the effectiveness of digital health programs, the mHealth Technical Evidence Review Group (mTERG), convened by the World Health Organization, proposed the mHealth Evidence Reporting and Assessment (mERA) checklist. We present an overview of the mERA checklist and encourage researchers working in the digital health space to use it for reporting their research. The development of the mERA checklist consisted of convening an expert group to recommend an appropriate approach, convening a global expert review panel for checklist development, and pilot-testing the checklist. The mERA checklist consists of 16 core mHealth items that define what the mHealth intervention is (content), where it is being implemented (context), and how it was implemented (technical features). Additionally, a 29-item methodology checklist guides authors on reporting critical aspects of the research methodology employed in the study. We recommend that the core mERA checklist be used in conjunction with an appropriate study-design-specific checklist. The mERA checklist aims to assist authors in reporting on digital health research, guide reviewers and policymakers in synthesizing evidence, and guide journal editors in assessing the completeness of reporting on digital health studies. An increase in transparent and rigorous reporting can help identify gaps in the conduct of research and improve understanding of the effects of digital health interventions as a field of inquiry. ©Smisha Agarwal, Amnesty E Lefevre, Alain B Labrique. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 06.10.2017.
An iPad and Android-based Application for Digitally Recording Geologic Field Data
NASA Astrophysics Data System (ADS)
Malinconico, L. L.; Sunderlin, D.; Liew, C.; Ho, A. S.; Bekele, K. A.
2011-12-01
Field experience is a significant component in most geology courses, especially sed/strat and structural geology. Increasingly, the spatial presentation, analysis and interpretation of geologic data is done using digital methodologies (GIS, Google Earth, stereonet and spreadsheet programs). However, students and professionals continue to collect field data manually on paper maps and in the traditional "orange field notebooks". Upon returning from the field, data are then manually transferred into digital formats for processing, mapping and interpretation. The transfer process is both cumbersome and prone to transcription error. In conjunction with the computer science department, we are in the process of developing an application (App) for iOS (the iPad) and Android platforms that can be used to digitally record data measured in the field. This is not a mapping program, but rather a way of bypassing the field book step to acquire digital data directly that can then be used in various analysis and display programs. Currently, the application allows the user to select from five different structural data situations: contact, bedding, fault, joints and "other". The user can define a folder for the collection and separation of data for each project. Observations are stored as individual records of field measurements in each folder. The exact information gathered depends on the nature of the observation, but common to all pages is the ability to log date, time, and lat/long directly from the tablet. Information like strike and dip is entered using scroll wheels, and formation names are also entered using scroll wheels that access easy-to-modify lists of the area's stratigraphic units. This ensures uniformity in the creation of the digital records from day to day and across field teams. Pictures linked to each record can also be taken with the tablet's camera. Once field collection is complete, the data (including images) can be easily exported to a .csv file that can be opened in Excel and prepared for use in other programs. We will be field-testing the App in the fall of 2011 with weekly exercises and a week-long mapping project in Wyoming. We will then share the beta version of the software (at the meeting) with professional geologists and students in geology programs at other academic institutions to truly test the stability of the software and to solicit suggestions for improvements and additions.
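The record structure and export path described above lend themselves to a simple sketch. The following Python is purely illustrative: the App is not written in Python, and every field name and helper here is a hypothetical stand-in for the unpublished schema.

import csv
from dataclasses import dataclass, asdict

# Hypothetical record layout mirroring the fields described in the abstract.
@dataclass
class FieldObservation:
    date: str            # logged from the tablet clock
    time: str
    lat: float           # from the tablet GPS
    lon: float
    obs_type: str        # contact, bedding, fault, joints, or other
    strike: int          # entered via scroll wheel, degrees
    dip: int
    formation: str       # chosen from a project-specific stratigraphic list
    photo: str           # filename of a linked camera image, if any
    notes: str = ""

def export_csv(records, path):
    """Write one row per observation, mirroring the App's .csv export."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(asdict(records[0]).keys()))
        writer.writeheader()
        for rec in records:
            writer.writerow(asdict(rec))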
A Digital Ecosystem for the Collaborative Production of Open Textbooks: The LATIn Methodology
ERIC Educational Resources Information Center
Silveira, Ismar Frango; Ochôa, Xavier; Cuadros-Vargas, Alex; Pérez Casas, Alén; Casali, Ana; Ortega, Andre; Sprock, Antonio Silva; Alves, Carlos Henrique; Collazos Ordoñez, Cesar Alberto; Deco, Claudia; Cuadros-Vargas, Ernesto; Knihs, Everton; Parra, Gonzalo; Muñoz-Arteaga, Jaime; Gomes dos Santos, Jéssica; Broisin, Julien; Omar, Nizam; Motz, Regina; Rodés, Virginia; Bieliukas, Yosly Hernández C.
2013-01-01
Access to books in higher education is an issue to be addressed, especially in the context of underdeveloped countries, such as those in Latin America. More than just financial issues, cultural aspects and need for adaptation must be considered. The present conceptual paper proposes a methodology framework that would support collaborative open…
Toward a general ontology for digital forensic disciplines.
Karie, Nickson M; Venter, Hein S
2014-09-01
Ontologies are widely used in different disciplines as a technique for representing and reasoning about domain knowledge. However, despite the widespread ontology-related research activities and applications in different disciplines, the development of ontologies and ontology research activities is still wanting in digital forensics. This paper therefore presents the case for establishing an ontology for digital forensic disciplines. Such an ontology would enable better categorization of the digital forensic disciplines, as well as assist in the development of methodologies and specifications that can offer direction in different areas of digital forensics. This includes such areas as professional specialization, certifications, development of digital forensic tools, curricula, and educational materials. In addition, the ontology presented in this paper can be used, for example, to better organize the digital forensic domain knowledge and explicitly describe the discipline's semantics in a common way. Finally, this paper is meant to spark discussions and further research on an internationally agreed ontological distinction of the digital forensic disciplines. Digital forensic disciplines ontology is a novel approach toward organizing the digital forensic domain knowledge and constitutes the main contribution of this paper. © 2014 American Academy of Forensic Sciences.
Methodological pluralism in the teaching of Astronomy
NASA Astrophysics Data System (ADS)
de Macedo, Josué Antunes; Voelzke, Marcos Rincon
2015-04-01
This paper discusses the feasibility of a teaching strategy called methodological pluralism, which consists of the use of various methodological resources in order to provide meaningful learning. It is part of a doctoral thesis, which aims to investigate the contributions of traditional resources combined with digital technologies to creating autonomy for future teachers of Natural Sciences and Mathematics in relation to themes in Astronomy. An extension course was offered at the "Federal Institution of Education, Science and Technology" in the North of Minas Gerais (FINMG), Campus Januaria, for thirty-two students of licentiate courses in Physics, Mathematics and Biological Sciences, involving themes of Astronomy, in order to contribute to improving the training of future teachers. A mixed methodology with a pre-experimental design, combined with content analysis, was used. The results indicate that students' prior knowledge of Astronomy was low; that there were indications of meaningful learning of concepts related to Astronomy; and that it is feasible to use digital technological resources articulated with traditional materials in the teaching of Astronomy. This research sought to contribute to initial teacher training, especially in relation to Astronomy teaching, proposing new alternatives to promote the teaching of this area of knowledge and extending the methodological options of future teachers.
NASA Technical Reports Server (NTRS)
Parada, N. D. J. (Principal Investigator); Moreira, M. A.
1983-01-01
Using digitally processed MSS/LANDSAT data as an auxiliary variable, a methodology to estimate wheat (Triticum aestivum L.) area by means of sampling techniques was developed. To perform this research, aerial photographs covering 720 sq km in the Cruz Alta test site in the NW of Rio Grande do Sul State were visually analyzed. LANDSAT digital data were analyzed using non-supervised and supervised classification algorithms; as post-processing, the classification was submitted to spatial filtering. To estimate wheat area, the regression estimation method was applied, and different sample sizes and various sampling units (10, 20, 30, 40 and 60 sq km) were tested. Based on the four decision criteria established for this research, it was concluded that: (1) as the size of the sampling unit decreased, the percentage of sampled area required to obtain similar estimation performance also decreased; (2) the lowest percentage of area sampled for wheat estimation with relatively high precision and accuracy through regression estimation was 90% using 10 sq km as the sampling unit; and (3) wheat area estimation by direct expansion (using only aerial photographs) was less precise and accurate than that obtained by means of regression estimation.
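For reference, the regression estimator named in the abstract has the standard sampling-theory form (generic notation, not necessarily the paper's):

\hat{\bar{Y}}_{reg} = \bar{y} + b\,(\bar{X} - \bar{x}), \qquad b = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n}(x_i - \bar{x})^2}

where y_i is the wheat area measured on the aerial photographs in sampling unit i, x_i the LANDSAT-classified wheat area in the same unit, and \bar{X} the mean classified area over the whole region. The precision of this estimator improves with the correlation between photo-measured and LANDSAT-classified areas, which is one reason classification post-processing such as spatial filtering matters.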
Rockfall hazard analysis using LiDAR and spatial modeling
NASA Astrophysics Data System (ADS)
Lan, Hengxing; Martin, C. Derek; Zhou, Chenghu; Lim, Chang Ho
2010-05-01
Rockfalls have been significant geohazards along the Canadian Class 1 Railways (CN Rail and CP Rail) since their construction in the late 1800s. These rockfalls cause damage to infrastructure, interruption of business, and environmental impacts, and their occurrence varies both spatially and temporally. The proactive management of these rockfall hazards requires enabling technologies. This paper discusses a hazard assessment strategy for rockfalls along a section of a Canadian railway using LiDAR and spatial modeling. LiDAR provides accurate topographical information of the source area of rockfalls and along their paths. Spatial modeling was conducted using Rockfall Analyst, a three dimensional extension to GIS, to determine the characteristics of the rockfalls in terms of travel distance, velocity and energy. Historical rockfall records were used to calibrate the physical characteristics of the rockfall processes. The results based on a high-resolution digital elevation model from a LiDAR dataset were compared with those based on a coarse digital elevation model. A comprehensive methodology for rockfall hazard assessment is proposed which takes into account the characteristics of source areas, the physical processes of rockfalls and the spatial attribution of their frequency and energy.
Worlddem - a Novel Global Foundation Layer
NASA Astrophysics Data System (ADS)
Riegler, G.; Hennig, S. D.; Weber, M.
2015-03-01
Airbus Defence and Space's WorldDEM™ provides a global Digital Elevation Model of unprecedented quality, accuracy, and coverage. The product will feature a vertical accuracy of 2m (relative) and better than 6m (absolute) in a 12m x 12m raster. The accuracy will surpass that of any global satellite-based elevation model available. WorldDEM is a game-changing disruptive technology and will define a new standard in global elevation models. The German radar satellites TerraSAR-X and TanDEM-X form a high-precision radar interferometer in space and acquire the data basis for the WorldDEM. This mission is performed jointly with the German Aerospace Center (DLR). Airbus DS refines the Digital Surface Model (e.g. editing of acquisition and processing artefacts and of water surfaces) or generates a Digital Terrain Model. Three product levels are offered: WorldDEMcore (output of the processing, no editing applied), WorldDEM™ (guarantees a void-free terrain description and hydrological consistency) and WorldDEM DTM (represents bare Earth elevation). Precise elevation data is the initial foundation of any accurate geospatial product, particularly when the integration of multi-source imagery and data is performed based upon it. Fused data provides improved reliability, increased confidence and reduced ambiguity. This paper will present the current status of product development activities, including the methodologies and tools used to generate the products, such as terrain and water-body editing and DTM generation. In addition, studies on verification and validation of the WorldDEM products will be presented.
The Relationship between Digit Span and Cognitive Processing Across Ability Groups.
ERIC Educational Resources Information Center
Schofield, Neville J.; Ashman, Adrian F.
1986-01-01
The relationship between forward and backward digit span and basic cognitive processes was examined. Subjects were administered measures of sequential processing, simultaneous processing, and planning. Correlational analyses indicated the serial processing character of forward digit span, and the relationship between backward digit span and…
Evaluation Of The Diagnostic Performance Of A Multimedia Medical Communications System.
NASA Astrophysics Data System (ADS)
Robertson, John G.; Coristine, Marjorie; Goldberg, Morris; Beeton, Carolyn; Belanger, Garry; Tombaugh, Jo W.; Hickey, Nancy M.; Millward, Steven F.; Davis, Michael; Whittingham, David
1989-05-01
The central concern of radiologists when evaluating a Picture Archiving and Communication System (PACS) is the diagnostic performance of digital images compared with the original analog versions of the same images. Considerable work has been done comparing the ROC curves of various types of digital systems to the corresponding analog systems for the detection of specific phantoms or diseases. Although such studies may inform radiologists that, for a specific lesion, a digital system performs as well as the analog system, they tell radiologists very little about the impact of a digital system on diagnostic performance in the general practice of radiology. We describe in this paper an alternative method for evaluating the diagnostic performance of a digital system and a preliminary experiment we conducted to test the methodology.
Optimal CCD readout by digital correlated double sampling
NASA Astrophysics Data System (ADS)
Alessandri, C.; Abusleme, A.; Guzman, D.; Passalacqua, I.; Alvarez-Fontecilla, E.; Guarini, M.
2016-01-01
Digital correlated double sampling (DCDS), a readout technique for charge-coupled devices (CCD), is gaining popularity in astronomical applications. By using an oversampling ADC and a digital filter, a DCDS system can achieve a better performance than traditional analogue readout techniques at the expense of a more complex system analysis. Several attempts to analyse and optimize a DCDS system have been reported, but most of the work presented in the literature has been experimental. Some approximate analytical tools have been presented for independent parameters of the system, but the overall performance and trade-offs have not been yet modelled. Furthermore, there is disagreement among experimental results that cannot be explained by the analytical tools available. In this work, a theoretical analysis of a generic DCDS readout system is presented, including key aspects such as the signal conditioning stage, the ADC resolution, the sampling frequency and the digital filter implementation. By using a time-domain noise model, the effect of the digital filter is properly modelled as a discrete-time process, thus avoiding the imprecision of continuous-time approximations that have been used so far. As a result, an accurate, closed-form expression for the signal-to-noise ratio at the output of the readout system is reached. This expression can be easily optimized in order to meet a set of specifications for a given CCD, thus providing a systematic design methodology for an optimal readout system. Simulated results are presented to validate the theory, obtained with both time- and frequency-domain noise generation models for completeness.
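As a rough illustration of the technique (not the paper's model), digital CDS amounts to averaging N oversampled points of the reset level and N points of the video level and differencing them. A minimal numpy sketch, with all names hypothetical and a plain boxcar standing in for the digital filter:

import numpy as np

def dcds_estimate(samples, n_avg):
    """Estimate one pixel value from an oversampled CCD output waveform.

    samples : 1-D array holding n_avg reset-level samples followed by
              n_avg signal-level samples from the oversampling ADC
    n_avg   : number of ADC samples averaged per level (a boxcar filter;
              weighted digital filters are also possible)
    """
    reset = samples[:n_avg].mean()
    signal = samples[n_avg:2 * n_avg].mean()
    return signal - reset  # correlated reset noise cancels in the difference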
ERIC Educational Resources Information Center
Krishnamurthy, M.
2008-01-01
Purpose: The purpose of this paper is to describe the open access and open source movement in the digital library world. Design/methodology/approach: A review of key developments in the open access and open source movement is provided. Findings: Open source software and open access to research findings are of great use to scholars in developing…
ERIC Educational Resources Information Center
Selby, Les; Russell, David
2005-01-01
Purpose: To report on the progress of Digital Media U, a tailor-made portal, learning environment and management system. Design/methodology/approach: Discusses the design of the learning content domains, acquisition of the content and the systems for managing the curriculum in the future, including the application of a new model of accreditation.…
The path of least resistance: is there a better route?
Loree, Ann; Maihack, Marcia; Powell, Marge
2003-01-01
In May 2000, the radiology department at Stanford University Medical Center embarked on a five-year journey toward complete digitization. While the end goal was known, there was much less certainty about the steps involved along the way. Stanford worked with a team from GE Medical Systems to implement Six Sigma process improvement methodologies and related change management techniques. The methodical and evidence-based framework of Six Sigma significantly organized the process of "going digital" by breaking it into manageable projects with clear objectives. Stanford identified five key areas where improvement could be made: MR outpatient throughput, CT inpatient throughput, CT outpatient throughput, report turnaround time, and Lucile Packard Children's Hospital CR/Ortho throughput and digitization. The CT project is presented in this article. Although labor intensive, collecting radiology data manually is often the best way to obtain the level of detail required, unless there is a robust RIS in place with solid data integrity. To gather the necessary information without unduly impacting staff and workflow at Stanford, the consultants working onsite handled the actual observation and recording of data. Some of the changes introduced through Six Sigma may appear, at least on the surface, to be common sense. It is only by presenting clear evidence in terms of data, however, that the improvements can actually be implemented and accepted. By converting all appointments to 30 minutes and expanding hours of operation, Stanford was able to boost diagnostic imaging productivity, volume and revenue. With the ability to scan over lunch breaks and rest periods, potential appointment capacity increased by 140 CT scans per month. Overall, the CT project increased potential for outpatient appointment capacity by nearly 75% and projected over $1.5 million in additional annual gross revenue. The complex process of moving toward a digital radiology department at Stanford demonstrates that healthcare cannot be healed by technology alone. The ability to optimize patient services revolves around a combination of leading edge technology, dedicated and well-trained staff, and careful examination of processes and productivity.
The effects of gray scale image processing on digital mammography interpretation performance.
Cole, Elodia B; Pisano, Etta D; Zeng, Donglin; Muller, Keith; Aylward, Stephen R; Park, Sungwook; Kuzmiak, Cherie; Koomen, Marcia; Pavic, Dag; Walsh, Ruth; Baker, Jay; Gimenez, Edgardo I; Freimanis, Rita
2005-05-01
To determine the effects of three image-processing algorithms on the diagnostic accuracy of digital mammography in comparison with conventional screen-film mammography. A total of 201 cases, consisting of nonprocessed soft-copy versions of digital mammograms acquired from GE, Fischer, and Trex digital mammography systems (1997-1999) and conventional screen-film mammograms of the same patients, were interpreted by nine radiologists. The raw digital data were processed with each of three different image-processing algorithms, creating three presentations (the manufacturer's default, applied and laser printed to film by each of the manufacturers; MUSICA; and PLAHE) that were presented in soft-copy display. There were three radiologists per presentation. The area under the receiver operating characteristic curve for GE digital mass cases was worse than that for screen-film for all digital presentations. The area under the receiver operating characteristic curve for Trex digital mass cases was better, but only with images processed with the manufacturer's default algorithm. Sensitivity for GE digital mass cases was worse than for screen film for all digital presentations. Specificity for Fischer digital calcification cases was worse than for screen film for images processed with the default and PLAHE algorithms. Specificity for Trex digital calcification cases was worse than for screen film for images processed with MUSICA. Specific image-processing algorithms may be necessary for optimal presentation for interpretation based on machine and lesion type.
Diatom Valve Three-Dimensional Representation: A New Imaging Method Based on Combined Microscopies
Ferrara, Maria Antonietta; De Tommasi, Edoardo; Coppola, Giuseppe; De Stefano, Luca; Rea, Ilaria; Dardano, Principia
2016-01-01
The frustule of diatoms, unicellular microalgae, shows very interesting photonic features, generally related to its complicated and quasi-periodic micro- and nano-structure. In order to simulate light propagation inside and through this natural structure, it is important to develop three-dimensional (3D) models of synthetic replicas with high spatial resolution. In this paper, we present a new method that generates images of microscopic diatoms with high definition by merging scanning electron microscopy and digital holography microscopy or atomic force microscopy data. Starting from two digital images, both acquired separately with standard characterization procedures, a high spatial resolution (Δz = λ/20, Δx = Δy ≅ 100 nm, at least) 3D model of the object has been generated. The two sets of data have then been processed by matrix formalism, using an original mathematical algorithm implemented in commercially available software. The developed methodology could also be of broad interest in the design and fabrication of micro-opto-electro-mechanical systems. PMID:27690008
Photometric and polarimetric mapping of water turbidity and water depth
NASA Technical Reports Server (NTRS)
Halajian, J.; Hallock, H.
1973-01-01
A Digital Photometric Mapper (DPM) was used in the Fall of 1971 in an airborne survey of New York and Boston area waters to acquire photometric, spectral and polarimetric data. The object of this study is to analyze these data with quantitative computer processing techniques to assess the potential of the DPM in the measurement and regional mapping of water turbidity and depth. These techniques have been developed and an operational potential has been demonstrated. More emphasis is placed at this time on the methodology of data acquisition, analysis and display than on the quantity of data. The results illustrate the type, quantity and format of information that could be generated operationally with the DPM-type sensor characterized by high photometric stability and fast, accurate digital output. The prototype, single-channel DPM is suggested as a unique research tool for a number of new applications. For the operational mapping of water turbidity and depth, the merits of a multichannel DPM coupled with a laser system are stressed.
3D Microstructures for Materials and Damage Models
Livescu, Veronica; Bronkhorst, Curt Allan; Vander Wiel, Scott Alan
2017-02-01
Many challenges exist with regard to understanding and representing complex physical processes involved with ductile damage and failure in polycrystalline metallic materials. Currently, the ability to accurately predict the macroscale ductile damage and failure response of metallic materials is lacking. Research at Los Alamos National Laboratory (LANL) is aimed at building a coupled experimental and computational methodology that supports the development of predictive damage capabilities by: capturing real distributions of microstructural features from real material and implementing them as digitally generated microstructures in damage model development; and distilling structure-property information to link microstructural details to damage evolution under a multitude of loading states.
Designing of a self-adaptive digital filter using genetic algorithm
NASA Astrophysics Data System (ADS)
Geng, Xuemei; Li, Hongguang; Xu, Chi
2018-04-01
This paper presents a novel methodology, based on a genetic algorithm, for applying a non-linear model to a closed-loop Sigma-Delta modulator; the approach offers an opportunity to simplify the process of tuning parameters and further improve noise performance. The proposed methodology makes it possible to quickly and efficiently design high-performance, high-order, closed-loop modulators that are robust to sensor fabrication tolerances. In simulations of the proposed Sigma-Delta modulator, SNR > 122 dB and a noise floor under -170 dB are obtained in the frequency range of 5-150 Hz. Further simulation confirms the robustness of the proposed Sigma-Delta modulator.
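No algorithmic detail is given in the abstract; as a generic illustration of genetic-algorithm parameter tuning of the kind described, the sketch below assumes a user-supplied fitness function (e.g., simulated SNR as a function of loop-filter coefficients) and box bounds on the parameters. All settings are illustrative, not the authors' values.

import random

def genetic_tune(fitness, bounds, pop_size=40, generations=100, mutation=0.1):
    """Maximize fitness(params) over box-bounded real parameters."""
    pop = [[random.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[:pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = [random.choice(pair) for pair in zip(a, b)]  # uniform crossover
            for i, (lo, hi) in enumerate(bounds):                # random mutation
                if random.random() < mutation:
                    child[i] = random.uniform(lo, hi)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# Hypothetical usage: best = genetic_tune(simulated_snr, bounds=[(0, 1)] * 5),
# where simulated_snr runs a modulator simulation and returns its SNR.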
Multiscale image processing and antiscatter grids in digital radiography.
Lo, Winnie Y; Hornof, William J; Zwingenberger, Allison L; Robertson, Ian D
2009-01-01
Scatter radiation is a source of noise and results in decreased signal-to-noise ratio and thus decreased image quality in digital radiography. We determined subjectively whether a digitally processed image made without a grid would be of similar quality to an image made with a grid but without image processing. Additionally the effects of exposure dose and of a using a grid with digital radiography on overall image quality were studied. Thoracic and abdominal radiographs of five dogs of various sizes were made. Four acquisition techniques were included (1) with a grid, standard exposure dose, digital image processing; (2) without a grid, standard exposure dose, digital image processing; (3) without a grid, half the exposure dose, digital image processing; and (4) with a grid, standard exposure dose, no digital image processing (to mimic a film-screen radiograph). Full-size radiographs as well as magnified images of specific anatomic regions were generated. Nine reviewers rated the overall image quality subjectively using a five-point scale. All digitally processed radiographs had higher overall scores than nondigitally processed radiographs regardless of patient size, exposure dose, or use of a grid. The images made at half the exposure dose had a slightly lower quality than those made at full dose, but this was only statistically significant in magnified images. Using a grid with digital image processing led to a slight but statistically significant increase in overall quality when compared with digitally processed images made without a grid but whether this increase in quality is clinically significant is unknown.
Mehl, Matthias R.; Robbins, Megan L.; Deters, Fenne große
2012-01-01
This article introduces a novel, observational ambulatory monitoring method called the Electronically Activated Recorder or EAR. The EAR is a digital audio recorder that runs on a handheld computer and periodically and unobtrusively records snippets of ambient sounds from participants’ momentary environments. In tracking moment-to-moment ambient sounds, it yields acoustic logs of people’s days as they naturally unfold. In sampling only a fraction of the time, it protects participants’ privacy and makes large observational studies feasible. As a naturalistic observation method, it provides an observer’s account of daily life and is optimized for the objective assessment of audible aspects of social environments, behaviors, and interactions (e.g., habitual preferences for social settings, idiosyncratic interaction styles, and subtle emotional expressions). The article discusses the EAR method conceptually and methodologically, reviews prior research with it, and identifies three concrete ways in which it can enrich psychosomatic research. Specifically, it can (a) calibrate psychosocial effects on health against frequencies of real-world behavior, (b) provide ecological, observational measures of health-related social processes that are independent of self-report, and (c) help with the assessment of subtle and habitual social behaviors that evade self-report but have important health implications. An important avenue for future research lies in merging traditional, self-report based ambulatory monitoring methods with observational approaches such as the EAR to allow for the simultaneous yet methodologically independent assessment of inner, experiential (e.g., loneliness) and outer, observable aspects (e.g., social isolation) of real-world social processes to reveal their unique effects on health. PMID:22582338
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-22
... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Developing Software Life Cycle Processes for Digital... Software Life Cycle Processes for Digital Computer Software used in Safety Systems of Nuclear Power Plants... clarifications, the enhanced consensus practices for developing software life-cycle processes for digital...
Symbol processing in the left angular gyrus: evidence from passive perception of digits.
Price, Gavin R; Ansari, Daniel
2011-08-01
Arabic digits are one of the most ubiquitous symbol sets in the world. While there have been many investigations into the neural processing of the semantic information digits represent (e.g. through numerical comparison tasks), little is known about the neural mechanisms which support the processing of digits as visual symbols. To characterise the component neurocognitive mechanisms which underlie numerical cognition, it is essential to understand the processing of digits as a visual category, independent of numerical magnitude processing. The 'Triple Code Model' (Dehaene, 1992; Dehaene and Cohen, 1995) posits an asemantic visual code for processing Arabic digits in the ventral visual stream, yet there is currently little empirical evidence in support of this code. This outstanding question was addressed in the current functional Magnetic Resonance (fMRI) study by contrasting brain responses during the passive viewing of digits versus letters and novel symbols at short (50 ms) and long (500 ms) presentation times. The results of this study reveal increased activation for familiar symbols (digits and letters) relative to unfamiliar symbols (scrambled digits and letters) at long presentation durations in the left dorsal Angular gyrus (dAG). Furthermore, increased activation for Arabic digits was observed in the left ventral Angular gyrus (vAG) in comparison to letters, scrambled digits and scrambled letters at long presentation durations, but no digit specific activation in any region at short presentation durations. These results suggest an absence of a digit specific 'Visual Number Form Area' (VNFA) in the ventral visual cortex, and provide evidence for the role of the left ventral AG during the processing of digits in the absence of any explicit processing demands. We conclude that Arabic digit processing depends specifically on the left AG rather than a ventral visual stream VNFA. Copyright © 2011 Elsevier Inc. All rights reserved.
Pisano, E D; Cole, E B; Major, S; Zong, S; Hemminger, B M; Muller, K E; Johnston, R E; Walsh, R; Conant, E; Fajardo, L L; Feig, S A; Nishikawa, R M; Yaffe, M J; Williams, M B; Aylward, S R
2000-09-01
To determine the preferences of radiologists among eight different image processing algorithms applied to digital mammograms obtained for screening and diagnostic imaging tasks. Twenty-eight images representing histologically proved masses or calcifications were obtained by using three clinically available digital mammographic units. Images were processed and printed on film by using manual intensity windowing, histogram-based intensity windowing, mixture model intensity windowing, peripheral equalization, multiscale image contrast amplification (MUSICA), contrast-limited adaptive histogram equalization, Trex processing, and unsharp masking. Twelve radiologists compared the processed digital images with screen-film mammograms obtained in the same patients for breast cancer screening and breast lesion diagnosis. For the screening task, screen-film mammograms were preferred to all digital presentations, but the acceptability of images processed with the Trex and MUSICA algorithms was not significantly different. All printed digital images were preferred to screen-film radiographs in the diagnosis of masses; mammograms processed with unsharp masking were significantly preferred. For the diagnosis of calcifications, no processed digital mammogram was preferred to screen-film mammograms. When digital mammograms were preferred to screen-film mammograms, radiologists selected different digital processing algorithms for each of three mammographic reading tasks and for different lesion types. Soft-copy display will eventually allow radiologists to select among these options more easily.
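Among the algorithms compared, unsharp masking is the simplest to state: a blurred copy of the image is subtracted to isolate high frequencies, which are scaled and added back. A generic sketch (not the study's implementation), assuming scipy is available; sigma and amount are illustrative:

import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image, sigma=2.0, amount=1.0):
    """Sharpen by adding back the high-pass residual of a Gaussian blur."""
    blurred = gaussian_filter(image.astype(float), sigma=sigma)
    return image + amount * (image - blurred)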
ERIC Educational Resources Information Center
Dalbello, Marija
2008-01-01
This study examines the influence of culture on digital libraries of the first wave. The local cultures of innovation of five European national libraries (Biblioteca nacional de Portugal, Bibliotheque nationale de France, Die Deutsche Bibliothek, the National Library of Scotland, and the British Library) are reconstructed in case histories from…
The Graphical Representation of the Digital Astronaut Physiology Backbone
NASA Technical Reports Server (NTRS)
Briers, Demarcus
2010-01-01
This report summarizes my internship project with the NASA Digital Astronaut Project to analyze the Digital Astronaut (DA) physiology backbone model. The Digital Astronaut Project (DAP) applies integrated physiology models to support space biomedical operations, and to assist NASA researchers in closing knowledge gaps related to human physiologic responses to space flight. The DA physiology backbone is a set of integrated physiological equations and functions that model the interacting systems of the human body. The current release of the model is HumMod (Human Model) version 1.5 and was developed over forty years at the University of Mississippi Medical Center (UMMC). The physiology equations and functions are scripted in an XML schema specifically designed for physiology modeling by Dr. Thomas G. Coleman at UMMC. Currently it is difficult to examine the physiology backbone without being knowledgeable of the XML schema. While investigating and documenting the tags and algorithms used in the XML schema, I proposed a standard methodology for a graphical representation. This standard methodology may be used to transcribe graphical representations from the DA physiology backbone. In turn, the graphical representations can allow examination of the physiological functions and equations without the need to be familiar with the computer programming languages or markup languages used by DA modeling software.
NASA Astrophysics Data System (ADS)
Bosca, Ryan J.; Jackson, Edward F.
2016-01-01
Assessing and mitigating the various sources of bias and variance associated with image quantification algorithms is essential to the use of such algorithms in clinical research and practice. Assessment is usually accomplished with grid-based digital reference objects (DRO) or, more recently, digital anthropomorphic phantoms based on normal human anatomy. Publicly available digital anthropomorphic phantoms can provide a basis for generating realistic model-based DROs that incorporate the heterogeneity commonly found in pathology. Using a publicly available vascular input function (VIF) and digital anthropomorphic phantom of a normal human brain, a methodology was developed to generate a DRO based on the general kinetic model (GKM) that represented realistic and heterogeneously enhancing pathology. GKM parameters were estimated from a deidentified clinical dynamic contrast-enhanced (DCE) MRI exam. This clinical imaging volume was co-registered with a discrete tissue model, and model parameters estimated from clinical images were used to synthesize a DCE-MRI exam that consisted of normal brain tissues and a heterogeneously enhancing brain tumor. An example application of spatial smoothing was used to illustrate potential applications in assessing quantitative imaging algorithms. A voxel-wise Bland-Altman analysis demonstrated negligible differences between the parameters estimated with and without spatial smoothing (using a small radius Gaussian kernel). In this work, we reported an extensible methodology for generating model-based anthropomorphic DROs containing normal and pathological tissue that can be used to assess quantitative imaging algorithms.
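The GKM referred to above is commonly written as a convolution of the vascular input function with an exponential residue. A minimal numpy sketch of synthesizing one tissue curve follows; the parameter values and the VIF below are placeholders, not those used in the paper.

import numpy as np

def gkm_tissue_curve(t, cp, ktrans, kep):
    """Tissue contrast concentration from the general kinetic model:
    Ct(t) = Ktrans * integral of Cp(tau) * exp(-kep * (t - tau)) dtau."""
    dt = t[1] - t[0]
    kernel = np.exp(-kep * t)
    # discrete convolution approximates the GKM integral
    return ktrans * np.convolve(cp, kernel)[:len(t)] * dt

t = np.arange(0, 300, 1.0)                # seconds
cp = (t / 10.0) * np.exp(-t / 60.0)       # placeholder vascular input function
ct = gkm_tissue_curve(t, cp, ktrans=0.1 / 60, kep=0.5 / 60)  # per-second rates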
Direct Bio-printing with Heterogeneous Topology Design.
Ahsan, Amm Nazmul; Xie, Ruinan; Khoda, Bashir
2017-01-01
Bio-additive manufacturing is a promising tool to fabricate porous scaffold structures for expediting tissue regeneration processes. Unlike most traditional bulk-material objects, the microstructures of tissues and organs are mostly highly anisotropic, heterogeneous, and porous in nature. However, modelling the internal heterogeneity of tissue/organ structures in a traditional CAD environment is difficult and oftentimes inaccurate. Besides, the de facto STL conversion of bio-models introduces loss of information and piles up more errors in each subsequent step (build orientation, slicing, tool-path planning) of the bio-printing process plan. We propose a topology-based scaffold design methodology to accurately represent the heterogeneous internal architecture of tissues/organs. An image analysis technique is used that digitizes the topology information contained in medical images of tissues/organs. A weighted topology reconstruction algorithm is implemented to represent the heterogeneity with parametric functions. The parametric functions are then used to map the spatial material distribution. The generated information is directly transferred to the 3D bio-printer, and a heterogeneous porous tissue scaffold structure is manufactured without an STL file. The proposed methodology is implemented to verify the effectiveness of the approach, and the designed example structure is bio-fabricated with a deposition-based bio-additive manufacturing system.
NASA Technical Reports Server (NTRS)
Norris, Jeffrey; Fox, Jason; Rabe, Kenneth; Shu, I-Hsiang; Powell, Mark
2007-01-01
The Plug-in Image Component Widget (PICWidget) is a software component for building digital imaging applications. The component is part of a methodology described in GIS Methodology for Planning Planetary-Rover Operations (NPO-41812), which appears elsewhere in this issue of NASA Tech Briefs. Planetary rover missions return a large number and wide variety of image data products that vary in complexity in many ways. Supported by a powerful, flexible image-data-processing pipeline, the PICWidget can process and render many types of imagery, including (but not limited to) thumbnail, subframed, downsampled, stereoscopic, and mosaic images; images coregistered with orbital data; and synthetic red/green/blue images. The PICWidget is capable of efficiently rendering images from data representing many more pixels than are available at a computer workstation where the images are to be displayed. The PICWidget is implemented as an Eclipse plug-in using the Standard Widget Toolkit, which provides a straightforward interface for re-use of the PICWidget in any number of application programs built upon the Eclipse application framework. Because the PICWidget is tile-based and performs aggressive tile caching, it has the flexibility to perform faster or slower, depending on whether more or less memory is available.
Sequential or parallel decomposed processing of two-digit numbers? Evidence from eye-tracking.
Moeller, Korbinian; Fischer, Martin H; Nuerk, Hans-Christoph; Willmes, Klaus
2009-02-01
While reaction time data have shown that decomposed processing of two-digit numbers occurs, there is little evidence about how decomposed processing functions. Poltrock and Schwartz (1984) argued that multi-digit numbers are compared in a sequential digit-by-digit fashion starting at the leftmost digit pair. In contrast, Nuerk and Willmes (2005) favoured parallel processing of the digits constituting a number. These models (i.e., sequential decomposition, parallel decomposition) make different predictions regarding the fixation pattern in a two-digit number magnitude comparison task and can therefore be differentiated by eye fixation data. We tested these models by evaluating participants' eye fixation behaviour while selecting the larger of two numbers. The stimulus set consisted of within-decade comparisons (e.g., 53_57) and between-decade comparisons (e.g., 42_57). The between-decade comparisons were further divided into compatible and incompatible trials (cf. Nuerk, Weger, & Willmes, 2001) and trials with different decade and unit distances. The observed fixation pattern implies that the comparison of two-digit numbers is not executed by sequentially comparing decade and unit digits as proposed by Poltrock and Schwartz (1984) but rather in a decomposed but parallel fashion. Moreover, the present fixation data provide first evidence that digit processing in multi-digit numbers is not a pure bottom-up effect, but is also influenced by top-down factors. Finally, implications for multi-digit number processing beyond the range of two-digit numbers are discussed.
Meyer-Delpho, C; Schubert, H-J
2015-09-01
The added value of information and communications technologies (ICT) should be demonstrated precisely in those areas of care in which intersectoral and interdisciplinary cooperation is particularly important. In the context of accompanying research on a care concept for palliative patients, the potential of a digital documentation process was analysed in comparison with the conventional paper-based workflow. Data were collected using a multi-methodological approach and processed in 3 stages: (1) development and analysis of a palliative care process with a focus on all relevant documentation steps; (2) questionnaire design and comparative mapping of specific process times; (3) sampling, selection, and analysis of patient records and the insights into process iterations derivable from them. With the use of ICT, the treatment time per patient is reduced by up to 53%, yielding a reduction in costs and workload of up to 901 min. An increase of up to 213% in the number of patient contacts allows higher continuity of care. Although the 16% increase in documentation adherence improves the usability of information documented across teams, it partially increases the workload of individual actors. By using a digital health record, around 31% more patients could be treated with the same staffing ratio. The multi-stage analysis of the palliative care process showed that ICT has a decisive influence on the process dimension of intersectoral cooperation. Owing to favourable organisational conditions, the pioneering work of palliative care also provides important guidance for the successful use of ICT in the context of innovative forms of care. © Georg Thieme Verlag KG Stuttgart · New York.
Distinctive fingerprints of erosional regimes in terrestrial channel networks
NASA Astrophysics Data System (ADS)
Grau Galofre, A.; Jellinek, M.
2017-12-01
Satellite imagery and digital elevation maps capture the large scale morphology of channel networks attributed to long term erosional processes, such as fluvial, glacial, groundwater sapping and subglacial erosion. Characteristic morphologies associated with each of these styles of erosion have been studied in detail, but there exists a knowledge gap related to their parameterization and quantification. This knowledge gap prevents a rigorous analysis of the dominant processes that shaped a particular landscape, and a comparison across styles of erosion. To address this gap, we use previous morphological descriptions of glaciers, rivers, sapping valleys and tunnel valleys to identify and measure quantitative metrics diagnostic of these distinctive styles of erosion. From digital elevation models, we identify four geometric metrics: The minimum channel width, channel aspect ratio (longest length to channel width at the outlet), presence of undulating longitudinal profiles, and tributary junction angle. We also parameterize channel network complexity in terms of its stream order and fractal dimension. We then perform a statistical classification of the channel networks using a Principal Component Analysis on measurements of these six metrics on a dataset of 70 channelized systems. We show that rivers, glaciers, groundwater seepage and subglacial meltwater erode the landscape in rigorously distinguishable ways. Our methodology can more generally be applied to identify the contributions of different processes involved in carving a channel network. In particular, we are able to identify transitions from fluvial to glaciated landscapes or vice-versa.
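A sketch of the classification step, assuming the six metrics are assembled into a 70 x 6 matrix; the scikit-learn calls and the standardization step are illustrative, since the authors' exact pre-processing is not stated.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# metrics: minimum width, aspect ratio, undulation flag, junction angle,
# stream order, fractal dimension -- one row per channel network
metrics = np.random.rand(70, 6)  # placeholder data standing in for measurements

scaled = StandardScaler().fit_transform(metrics)
scores = PCA(n_components=2).fit_transform(scaled)
# networks carved by different processes should separate into clusters
# in the plane of the first two principal components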
A Nonlinear Digital Control Solution for a DC/DC Power Converter
NASA Technical Reports Server (NTRS)
Zhu, Minshao
2002-01-01
A digital Nonlinear Proportional-Integral-Derivative (NPID) control algorithm was proposed to control a 1-kW, PWM, DC/DC switching power converter. The NPID methodology is introduced and a practical hardware control solution is obtained. The design of the controller was completed using Matlab (trademark) Simulink, while hardware-in-the-loop testing was performed using both the dSPACE (trademark) rapid prototyping system and a stand-alone Texas Instruments (trademark) Digital Signal Processor (DSP)-based system. The final nonlinear digital control algorithm was implemented and tested using the ED408043-1 Westinghouse DC-DC switching power converter. The NPID test results are discussed and compared to the results of a standard Proportional-Integral (PI) controller.
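The abstract does not specify the nonlinear gain shaping used; a common choice in the NPID literature is the power-law error function fal(e) = |e|^alpha * sign(e). The sketch below is a minimal single-step update under that assumption, with all gains and the time step hypothetical.

```python
import numpy as np

def fal(e, alpha=0.7):
    """Power-law error nonlinearity: high gain for small errors, softer for large."""
    return np.sign(e) * np.abs(e) ** alpha

def npid_step(e, state, kp=2.0, ki=50.0, kd=1e-3, dt=1e-5):
    """One NPID update (gains hypothetical); returns the duty-cycle command."""
    state["i"] += ki * fal(e) * dt            # integral of the shaped error
    d = (e - state["e_prev"]) / dt            # raw derivative (unfiltered here)
    state["e_prev"] = e
    return kp * fal(e) + state["i"] + kd * d

state = {"i": 0.0, "e_prev": 0.0}
u = npid_step(0.5, state)  # e.g. 0.5 V error between reference and output voltage
```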
An Integrated Modeling and Simulation Methodology for Intelligent Systems Design and Testing
2002-08-01
…simulation and actual execution. KEYWORDS: Model Continuity, Modeling, Simulation, Experimental Frame, Real Time Systems, Intelligent Systems. …the methodology for a stand-alone real-time system, then scaled up to distributed real-time systems; for both systems, step-wise simulation… MODEL CONTINUITY: Intelligent real-time systems monitor, respond to, or control an external environment. This environment is connected to the digital…
ERIC Educational Resources Information Center
Toh, Yancy; So, Hyo-Jeong; Seow, Peter; Chen, Wenli; Looi, Chee-Kit
2013-01-01
This paper shares the theoretical and methodological frameworks deployed in a 3-year study examining how Singapore primary school students leverage mobile technology for seamless learning. This notion of seamless learning refers to the integrated and synergistic effects of learning in both formal and informal settings, which is…
Automatic classification of atypical lymphoid B cells using digital blood image processing.
Alférez, S; Merino, A; Mujica, L E; Ruiz, M; Bigorra, L; Rodellar, J
2014-08-01
There are automated systems for digital peripheral blood (PB) cell analysis, but they operate most effectively on nonpathological blood samples. The objective of this work was to design a methodology to improve the automatic classification of abnormal lymphoid cells. We analyzed 340 digital images of individual lymphoid cells from PB films obtained with the CellaVision DM96: 150 chronic lymphocytic leukemia (CLL) cells, 100 hairy cell leukemia (HCL) cells, and 90 normal lymphocytes (N). We implemented the watershed transformation to segment the nucleus, the cytoplasm, and the peripheral cell region. We extracted 44 features and then applied Fuzzy C-Means (FCM) clustering in two steps for lymphocyte classification. The images were automatically clustered in three groups, one of them containing 98% of the HCL cells. The set of remaining cells was clustered again using FCM and texture features. The two new groups contained 83.3% of the N cells and 71.3% of the CLL cells, respectively. The approach has been able to automatically classify three types of lymphoid cells with high precision. The addition of more descriptors and other classification techniques will allow extending the classification to other classes of atypical lymphoid cells. © 2013 John Wiley & Sons Ltd.
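The two-step clustering can be sketched with a plain-NumPy Fuzzy C-Means; the 44-feature matrix below is synthetic stand-in data, and the second pass on texture features is only indicated in a comment. This is a schematic of the technique, not the authors' pipeline.

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, tol=1e-5, max_iter=300, seed=0):
    """Minimal Fuzzy C-Means: returns cluster centres and the membership matrix U."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(max_iter):
        Um = U ** m
        centres = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centres[None], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))          # standard FCM membership update
        U_new = inv / inv.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            return centres, U_new
        U = U_new
    return centres, U

features = np.random.default_rng(1).random((340, 44))  # stand-in 44-feature matrix
centres, U = fuzzy_c_means(features, c=3)              # step 1: three groups
labels = U.argmax(axis=1)
# step 2 would re-cluster the non-HCL group using texture features only
```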
The Impact of the Condenser on Cytogenetic Image Quality in Digital Microscope System
Ren, Liqiang; Li, Zheng; Li, Yuhua; Zheng, Bin; Li, Shibo; Chen, Xiaodong; Liu, Hong
2013-01-01
Background: Optimizing the operational parameters of a digital microscope system is an important technique for acquiring high-quality cytogenetic images and facilitating the process of karyotyping, so that the efficiency and accuracy of diagnosis can be improved. Objective: This study investigated the impact of the condenser on cytogenetic image quality and system working performance using a prototype digital microscope image scanning system. Methods: Both theoretical analysis and experimental validation were conducted, through objective evaluation of a resolution test chart and subjective observation of large numbers of specimens. Results: The results show that optimal image quality and a large depth of field (DOF) are obtained simultaneously when the numerical aperture of the condenser is set to 60%–70% of that of the corresponding objective. Under this condition, more analyzable chromosomes and more diagnostic information are obtained. As a result, the system shows higher working stability and fewer restrictions on the implementation of algorithms such as autofocusing, especially when the system is designed for high-throughput continuous image scanning. Conclusions: Although the above quantitative results were obtained using a specific prototype system under the experimental conditions reported in this paper, the presented evaluation methodologies can provide valuable guidelines for optimizing operational parameters in cytogenetic imaging using high-throughput continuous scanning microscopes in clinical practice. PMID:23676284
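For intuition only, the toy calculation below applies textbook approximations, an Abbe-type two-point resolution d = 1.22*lambda/(NA_obj + NA_cond) and, as a rough partial-coherence assumption, DOF ~ lambda/NA_eff^2 with NA_eff = (NA_obj + NA_cond)/2, to show the trade-off the paper measures. The objective NA and wavelength are hypothetical, and these formulas are not taken from the paper.

```python
wavelength_um = 0.55               # green illumination (assumed)
na_obj = 0.75                      # hypothetical 40x objective
for frac in (0.5, 0.6, 0.7, 1.0):  # condenser NA as a fraction of objective NA
    na_cond = frac * na_obj
    d = 1.22 * wavelength_um / (na_obj + na_cond)   # two-point resolution, um
    na_eff = 0.5 * (na_obj + na_cond)               # rough partial-coherence NA
    dof = wavelength_um / na_eff**2                 # wave-optical DOF term, um
    print(f"frac={frac:.1f}  resolution={d:.2f} um  DOF={dof:.2f} um")
# stopping the condenser down (frac 1.0 -> 0.6) costs a little resolution
# but buys a noticeably larger depth of field
```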
Plancoulaine, Benoit; Laurinaviciene, Aida; Herlin, Paulette; Besusparis, Justinas; Meskauskas, Raimundas; Baltrusaityte, Indra; Iqbal, Yasir; Laurinavicius, Arvydas
2015-10-19
Digital image analysis (DIA) enables higher accuracy, reproducibility, and capacity to enumerate cell populations by immunohistochemistry; however, the most unique benefits may be obtained by evaluating the spatial distribution and intra-tissue variance of markers. The proliferative activity of breast cancer tissue, estimated by the Ki67 labeling index (Ki67 LI), is a prognostic and predictive biomarker requiring robust measurement methodologies. We performed DIA on whole-slide images (WSI) of 302 surgically removed Ki67-stained breast cancer specimens; the tumor classifier algorithm was used to automatically detect tumor tissue but was not trained to distinguish between invasive and non-invasive carcinoma cells. The WSI DIA-generated data were subsampled by hexagonal tiling (HexT). Distribution and texture parameters were compared to conventional WSI DIA and pathology report data. Factor analysis of the data set, including total numbers of tumor cells, the Ki67 LI and Ki67 distribution, and texture indicators, extracted 4 factors, identified as entropy, proliferation, bimodality, and cellularity. The factor scores were further utilized in cluster analysis, outlining subcategories of heterogeneous tumors with predominant entropy, bimodality, or both at different levels of proliferative activity. The methodology also allowed the visualization of Ki67 LI heterogeneity in tumors and the automated detection and quantitative evaluation of Ki67 hotspots, based on the upper quintile of the HexT data, conceptualized as the "Pareto hotspot". We conclude that systematic subsampling of DIA-generated data into HexT enables comprehensive Ki67 LI analysis that reflects aspects of intra-tumor heterogeneity and may serve as a methodology to improve digital immunohistochemistry in general.
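A minimal sketch of the subsampling idea: cell centroids are binned into hexagonal tiles (axial coordinates with cube rounding), a per-tile Ki67 LI is computed, and tiles in the upper quintile are flagged as the hotspot. The coordinates, tile size, and positivity calls are synthetic stand-ins; this is not the authors' pipeline.

```python
import numpy as np

def hex_tile(x, y, size):
    """Map points to axial (q, r) ids of pointy-top hexagonal tiles (cube rounding)."""
    q = (np.sqrt(3) / 3 * x - y / 3) / size
    r = (2 / 3 * y) / size
    xg, zg, yg = np.rint(q), np.rint(r), np.rint(-q - r)
    dx, dy, dz = np.abs(xg - q), np.abs(yg + q + r), np.abs(zg - r)
    fix_x = (dx > dy) & (dx > dz)
    xg = np.where(fix_x, -yg - zg, xg)          # repair the worst-rounded axis
    zg = np.where(~fix_x & (dz >= dy), -xg - yg, zg)
    return np.stack([xg, zg], axis=1).astype(int)

rng = np.random.default_rng(5)
xy = rng.uniform(0, 1000, (20000, 2))           # hypothetical cell centroids (um)
ki67_pos = rng.random(20000) < 0.2              # stand-in positivity calls
ids = hex_tile(xy[:, 0], xy[:, 1], size=50.0)
keys, inv = np.unique(ids, axis=0, return_inverse=True)
li = np.bincount(inv, ki67_pos.astype(float)) / np.bincount(inv)  # per-tile Ki67 LI
hotspots = keys[li >= np.quantile(li, 0.8)]     # upper quintile: the "Pareto hotspot"
```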
Simulation of a complete X-ray digital radiographic system for industrial applications.
Nazemi, E; Rokrok, B; Movafeghi, A; Choopan Dastjerdi, M H
2018-05-19
Simulating X-ray images is of great importance in industry and medicine. Such simulation permits optimization of the parameters that affect image quality without the limitations of an experimental procedure. This study presents a novel methodology for simulating a complete industrial X-ray digital radiographic system, composed of an X-ray tube and a computed radiography (CR) image plate, using the Monte Carlo N-Particle eXtended (MCNPX) code. An industrial X-ray tube with a maximum voltage of 300 kV and a current of 5 mA was simulated. A 3-layer uniform plate comprising a polymer overcoat layer, a phosphor layer and a polycarbonate backing layer was also defined and simulated as the CR imaging plate. To model image formation in the image plate, the absorbed dose was first calculated in each pixel inside the phosphor layer of the CR imaging plate using the mesh tally in the MCNPX code, and then converted to a gray value using a mathematical relationship determined in a separate procedure. To validate the simulation results, an experimental setup was designed; the images of two step wedges made of aluminum and steel were captured experimentally and compared with the simulations. The results show that the simulated images are in good agreement with the experimental ones, demonstrating the ability of the proposed methodology to simulate an industrial X-ray imaging system. Copyright © 2018 Elsevier Ltd. All rights reserved.
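The paper determines the dose-to-gray relationship in a separate procedure it does not detail; the sketch below simply fits a linear calibration to hypothetical dose/gray pairs and applies it to a simulated dose map, as one plausible form such a mapping could take.

```python
import numpy as np

# hypothetical calibration pairs: mesh-tally dose vs measured CR gray value
dose = np.array([0.1, 0.2, 0.5, 1.0, 2.0, 5.0])       # arbitrary dose units
gray = np.array([310.0, 600.0, 1450.0, 2800.0, 5400.0, 12500.0])
a, b = np.polyfit(dose, gray, 1)                      # assume a linear response

dose_map = np.random.default_rng(0).random((64, 64))  # stand-in simulated dose image
gray_map = np.clip(a * dose_map + b, 0, 2**16 - 1)    # synthetic 16-bit CR image
```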
NASA Astrophysics Data System (ADS)
Roberts, Simon J.
2014-01-01
The Faculty of Engineering at The University of Nottingham, UK, has developed interdisciplinary, hands-on workshops for primary schools that introduce space technology, its relevance to everyday life and the importance of science, technology, engineering and maths. The workshop activities for 7-11 year olds highlight the roles that space and satellite technology play in observing and monitoring the Earth's biosphere as well as being vital to communications in the modern digital world. The programme also provides links to 'how science works', the environment and citizenship and uses pixel art through the medium of digital photography to demonstrate the importance of maths in a novel and unconventional manner. The interactive programme of activities provides learners with an opportunity to meet 'real' scientists and engineers, with one of the key messages from the day being that anyone can become involved in science and engineering whatever their ability or subject of interest. The methodology introduces the role of scientists and engineers using space technology themes, but it could easily be adapted for use with any inspirational topic. Analysis of learners' perceptions of science, technology, engineering and maths before and after participating in ENGage showed very positive and significant changes in their attitudes to these subjects and an increase in the number of children thinking they would be interested in, and capable of, pursuing a career in science and engineering. This paper provides an overview of the activities, the methodology, the evaluation process and results.
Digital particle image thermometry/velocimetry: a review
NASA Astrophysics Data System (ADS)
Dabiri, Dana
2009-02-01
Digital particle image thermometry/velocimetry (DPIT/V) is a relatively new methodology that allows for measurements of simultaneous temperature and velocity within a two-dimensional domain, using thermochromic liquid crystal tracer particles as the temperature and velocity sensors. Extensive research has been carried out over recent years that has allowed the methodology and its implementation to grow and evolve. While there have been several reviews on the topic of liquid crystal thermometry (Moffat in Exp Therm Fluid Sci 3:14-32, 1990; Baughn in Int J Heat Fluid Flow 16:365-375, 1995; Roberts and East in J Spacecr Rockets 33:761-768, 1996; Wozniak et al. in Appl Sci Res 56:145-156, 1996; Behle et al. in Appl Sci Res 56:113-143, 1996; Stasiek in Heat Mass Transf 33:27-39, 1997; Stasiek and Kowalewski in Opto Electron Rev 10:1-10, 2002; Stasiek et al. in Opt Laser Technol 38:243-256, 2006; Smith et al. in Exp Fluids 30:190-201, 2001; Kowalewski et al. in Springer handbook of experimental fluid mechanics, 1st edn. Springer, Berlin, pp 487-561, 2007), the focus of the present review is to provide a relevant discussion of liquid crystals pertinent to DPIT/V. This includes a background on liquid crystals and color theory, a discussion of experimental setup parameters, a description of the methodology’s most recent advances and processing methods affecting temperature measurements, and finally an explanation of its various implementations and applications.
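At the core of liquid-crystal thermometry is a calibration from observed hue to temperature. The sketch below fits a polynomial hue-temperature curve to hypothetical calibration points and applies it per pixel; real DPIT/V calibrations must also handle illumination and viewing-angle effects, so treat this purely as a schematic.

```python
import colorsys
import numpy as np

# hypothetical calibration: mean TLC hue recorded at known plate temperatures
temps_C = np.array([25.0, 26.0, 27.0, 28.0, 29.0])
hues = np.array([0.02, 0.10, 0.25, 0.45, 0.60])   # red -> blue with temperature
coef = np.polyfit(hues, temps_C, 3)               # cubic hue-temperature curve

def pixel_temperature(r, g, b):
    """Map one RGB tracer-particle pixel (components in [0, 1]) to temperature."""
    h, _s, _v = colorsys.rgb_to_hsv(r, g, b)
    return np.polyval(coef, h)

print(pixel_temperature(0.2, 0.8, 0.3))           # e.g. a greenish particle
```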
Applied digital signal processing systems for vortex flowmeter with digital signal processing.
Xu, Ke-Jun; Zhu, Zhi-Hai; Zhou, Yang; Wang, Xiao-Fen; Liu, San-Shan; Huang, Yun-Zhi; Chen, Zhi-Yuan
2009-02-01
Spectral analysis is combined with digital filtering to process the vortex sensor signal, reducing the effect of low-frequency disturbances from pipe vibrations and increasing the turndown ratio. Using a digital signal processing chip, two kinds of digital signal processing systems were developed to implement these algorithms: one an integrative system, the other a separated system. A limiting amplifier is designed into the input analog conditioning circuit to accommodate large amplitude variations of the sensor signal. Several technical measures are taken to improve the accuracy of the output pulse, speed up the response time of the meter, and reduce fluctuation of the output signal. The experimental results demonstrate the validity of the digital signal processing systems.
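The filter-plus-spectrum idea can be sketched in a few lines: high-pass filtering rejects the low-frequency vibration, and the surviving spectral peak gives the vortex shedding frequency (which is proportional to flow rate). All frequencies and noise levels below are invented for illustration.

```python
import numpy as np
from numpy.fft import rfft, rfftfreq
from scipy.signal import butter, filtfilt

fs = 2000.0
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(3)
# hypothetical sensor signal: 120 Hz shedding + strong 15 Hz pipe vibration + noise
x = (np.sin(2 * np.pi * 120 * t) + 2.0 * np.sin(2 * np.pi * 15 * t)
     + 0.3 * rng.standard_normal(t.size))
b, a = butter(4, 40 / (fs / 2), btype="highpass")  # reject low-frequency vibration
y = filtfilt(b, a, x)
spec = np.abs(rfft(y))
f_shed = rfftfreq(y.size, 1 / fs)[spec.argmax()]   # spectral peak -> shedding freq
print(f_shed)                                      # ~120 Hz
```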
Hybrid acousto-optic and digital equalization for microwave digital radio channels
NASA Astrophysics Data System (ADS)
Anderson, C. S.; Vanderlugt, A.
1990-11-01
Digital radio transmission systems use complex modulation schemes that require powerful signal-processing techniques to correct channel distortions and to minimize BERs. This paper proposes combining the computational power of acousto-optic processing with the accuracy of digital processing to produce a hybrid channel equalizer that exceeds the performance of digital equalization alone. Analysis shows that a hybrid equalizer for 256-level quadrature amplitude modulation (QAM) performs better than a digital equalizer for 64-level QAM.
Dotan, Dror; Friedmann, Naama; Dehaene, Stanislas
2014-10-01
Can the meaning of two-digit Arabic numbers be accessed independently of their verbal-phonological representations? To answer this question, we explored the number processing of ZN, an aphasic patient with a syntactic deficit in digit-to-verbal transcoding, who could hardly read aloud two-digit numbers, but could read them as single digits ("four, two"). Neuropsychological examination showed that ZN's deficit was neither in the digit input nor in the phonological output processes, as he could copy and repeat two-digit numbers. His deficit thus lay in a central process that converts digits to abstract number words and sends this information to phonological retrieval processes. Crucially, in spite of this deficit in number transcoding, ZN's two-digit comprehension was spared in several ways: (1) he could calculate two-digit additions; (2) he showed good performance in a two-digit comparison task, and a continuous distance effect; and (3) his performance in a task of mapping numbers to positions on an unmarked number line showed a logarithmic (nonlinear) factor, indicating that he represented two-digit Arabic numbers as holistic two-digit quantities. Thus, at least these aspects of number comprehension can be performed without converting the two-digit number from digits to verbal representation.
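The logarithmic number-line signature can be checked by comparing linear and logarithmic fits to position estimates. The sketch below does this on synthetic responses constructed to be log-shaped; the patient's actual data are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(4)
numbers = np.array([2, 5, 12, 23, 34, 48, 57, 69, 81, 96])
# hypothetical position estimates on a 0-100 line, built with a log shape + noise
positions = 100 * np.log(numbers) / np.log(100) + rng.normal(0, 3, numbers.size)

lin_r2 = np.corrcoef(numbers, positions)[0, 1] ** 2
log_r2 = np.corrcoef(np.log(numbers), positions)[0, 1] ** 2
print(lin_r2, log_r2)  # log_r2 > lin_r2 suggests a compressive magnitude mapping
```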
Integration of tablet technologies in the e-laboratory of cytology: a health technology assessment.
Giansanti, Daniele; Pochini, Marco; Giovagnoli, Maria Rosaria
2014-10-01
Although tablet systems are becoming a powerful technology, particularly useful in every application of medical imaging, to date no one has investigated the acceptance and performance of this technology in digital cytology. The specific aims of the work were (1) to design a health technology assessment (HTA) tool to assess, in terms of performance and acceptance, the introduction of tablet technologies (wearable, portable, and non-portable) into the e-laboratories of cytology and (2) to test the tool in a first significant application of digital cytology. The proposed HTA tool operates on a domain of five dimensions of investigation comprising the basic information of the digital cytology product, the perceived subjective quality of images, the assessment of virtual navigation on the e-slide, the assessment of the information and communication technology features, and the diagnostic power. Six e-slides from studies of cervicovaginal cytology, digitized by means of an Aperio ( www.aperio.com ) scanner and uploaded onto the www.digitalslide.it Web site, were used to test the methodology on three different network connections. Three experts of cytology successfully tested the methodology on seven tablets found suitable for the study in their own standard configuration. Specific indexes furnished by the tool indicated both a high degree of performance and subjective acceptance of the investigated technology. The HTA tool thus could be useful for investigating new tablet technologies in digital cytology and furnishing stakeholders with useful information that may help them make decisions involving the healthcare system. From a global point of view, the study demonstrates the feasibility of using tablet technology in digital cytology.
Metamaterial bricks and quantization of meta-surfaces
Memoli, Gianluca; Caleap, Mihai; Asakawa, Michihiro; Sahoo, Deepak R.; Drinkwater, Bruce W.; Subramanian, Sriram
2017-01-01
Controlling acoustic fields is crucial in diverse applications such as loudspeaker design, ultrasound imaging and therapy or acoustic particle manipulation. The current approaches use fixed lenses or expensive phased arrays. Here, using a process of analogue-to-digital conversion and wavelet decomposition, we develop the notion of quantal meta-surfaces. The quanta here are small, pre-manufactured three-dimensional units—which we call metamaterial bricks—each encoding a specific phase delay. These bricks can be assembled into meta-surfaces to generate any diffraction-limited acoustic field. We apply this methodology to show experimental examples of acoustic focusing, steering and, after stacking single meta-surfaces into layers, the more complex field of an acoustic tractor beam. We demonstrate experimentally single-sided air-borne acoustic levitation using meta-layers at various bit-rates: from a 4-bit uniform to 3-bit non-uniform quantization in phase. This powerful methodology dramatically simplifies the design of acoustic devices and provides a key step towards realizing spatial sound modulators. PMID:28240283
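To make the quantization idea concrete, the sketch below computes the phase profile needed to focus a 40 kHz airborne beam and rounds it to 16 discrete brick types (4-bit uniform quantization). The frequency, brick pitch, aperture and focal length are illustrative choices, not values taken from the paper.

```python
import numpy as np

c, f = 343.0, 40_000.0               # speed of sound in air, ultrasound frequency
lam = c / f                          # ~8.6 mm wavelength
pitch, n, focal = lam / 2, 16, 0.05  # brick pitch, bricks across, focal length (assumed)
x = (np.arange(n) - (n - 1) / 2) * pitch
# extra path from each brick to the focus, expressed as a required phase delay
phase = 2 * np.pi / lam * (np.sqrt(x**2 + focal**2) - focal)
levels = 2 ** 4                      # 4-bit uniform quantization: 16 brick types
brick_id = np.round((phase % (2 * np.pi)) / (2 * np.pi) * levels).astype(int) % levels
print(brick_id)                      # which pre-made brick to place at each site
```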
Digital Game-Based Learning: A Didactic Experience in the Pre-Degree Nursing Career.
Solís de Ovando, A; Rodríguez, A; Hullin, C
2018-01-01
Nowadays we are faced with a society immersed in globalization and native technology, which poses a great challenge for university teaching staff. Gamification, as a didactic teaching-learning methodology, gathers the characteristics that allow motivating students and achieving active and meaningful learning. The objective of this work is to present the experience of the nursing degree programme, which combines game-based learning with the use of digital applications.
Digital electronic engine control fault detection and accommodation flight evaluation
NASA Technical Reports Server (NTRS)
Baer-Ruedhart, J. L.
1984-01-01
The capabilities and performance of various fault detection and accommodation (FDA) schemes in existing and projected engine control systems were investigated. Flight tests of the digital electronic engine control (DEEC) in an F-15 aircraft show discrepancies between flight results and predictions based on simulation and altitude testing. The FDA methodology and logic in the DEEC system are described, together with the results of the flight failures that have occurred to date.
NASA Technical Reports Server (NTRS)
Magana, Mario E.
1989-01-01
The digital position controller implemented in the control computer of the 3-axis attitude motion simulator is mathematically reconstructed and documented, since the information supplied with the executable code of this controller was insufficient to support substantial modifications. Methodologies were also developed to introduce changes in the controller that do not require rewriting the software. Finally, recommendations are made on possible improvements to control system performance.
Cao, Ruofan; Naivar, Mark A; Wilder, Mark; Houston, Jessica P
2014-01-01
Fluorescence lifetime measurements provide information about the fluorescence relaxation, or intensity decay, of organic fluorophores, fluorescent proteins, and other inorganic molecules that fluoresce. The fluorescence lifetime is emerging in flow cytometry and is helpful in a variety of multiparametric, single cell measurements because it is not impacted by nonlinearity that can occur with fluorescence intensity measurements. Yet time-resolved cytometry systems rely on major hardware modifications, making the methodology difficult to reproduce. The motivation of this work is, by taking advantage of the dynamic nature of flow cytometry sample detection and applying digital signal processing methods, to measure fluorescence lifetimes using an unmodified flow cytometer. We collect a new lifetime-dependent parameter, referred to herein as the fluorescence-pulse-delay (FPD), and prove it is a valid representation of the average fluorescence lifetime. To verify this, we generated cytometric pulses in simulation, with light-emitting diode (LED) pulsation, and with true fluorescence measurements of cells and microspheres. Each pulse is digitized and used in algorithms to extract an average fluorescence lifetime inherent in the signal. A range of fluorescence lifetimes is measurable with this approach, including standard organic fluorophore lifetimes (∼1 to 22 ns) as well as small, simulated shifts (0.1 ns) under standard conditions (reported herein). This contribution demonstrates how digital data acquisition and signal processing can reveal time-dependent information, foreshadowing the exploitation of full waveform analysis for quantification of similar photo-physical events within single cells. © 2014 The Authors. Published by Wiley Periodicals, Inc. PMID:25274073
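The fluorescence-pulse-delay idea can be illustrated with a known property of convolution: a transit-time pulse convolved with a normalized exponential decay of lifetime tau has its centroid shifted by approximately tau. The simulation below shows this; the digitizer rate, pulse width and lifetime are arbitrary choices, and the paper's actual extraction algorithms are not reproduced.

```python
import numpy as np

fs = 1e9                                    # hypothetical 1 GS/s digitizer
t = np.arange(0, 200e-9, 1 / fs)
excite = np.exp(-0.5 * ((t - 50e-9) / 10e-9) ** 2)  # Gaussian transit-time pulse
tau = 4e-9                                  # fluorescence lifetime to recover
irf = np.exp(-t / tau)
irf /= irf.sum()                            # normalized exponential decay
fluor = np.convolve(excite, irf)[: t.size]  # detected fluorescence pulse

def centroid(s):
    return (t * s).sum() / s.sum()

fpd = centroid(fluor) - centroid(excite)    # fluorescence-pulse-delay
print(fpd)                                  # ~4e-9 s, i.e. ~tau
```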
Haenssgen, Marco J
2015-01-01
The increasing availability of online maps, satellite imagery, and digital technology can ease common constraints of survey sampling in low- and middle-income countries. However, existing approaches require specialised software and user skills, professional GPS equipment, and/or commercial data sources; they tend to neglect spatial sampling considerations when using satellite maps; and they continue to face implementation challenges analogous to conventional survey implementation methods. This paper presents an alternative way of utilising satellite maps and digital aids that aims to address these challenges. The case studies of two rural household surveys in Rajasthan (India) and Gansu (China) compare conventional survey sampling and implementation techniques with the use of online map services such as Google, Bing, and HERE maps. Modern yet basic digital technology can be integrated into the processes of preparing, implementing, and monitoring a rural household survey. Satellite-aided systematic random sampling enhanced the spatial representativeness of the village samples and entailed savings of approximately £4000 compared to conventional household listing, while reducing the duration of the main survey by at least 25%. This low-cost/low-tech satellite-aided survey sampling approach can be useful for student researchers and resource-constrained research projects operating in low- and middle-income contexts with high survey implementation costs. While achieving transparent and efficient survey implementation at low costs, researchers aiming to adopt a similar process should be aware of the locational, technical, and logistical requirements as well as the methodological challenges of this strategy.
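A compressed sketch of satellite-aided systematic random sampling: dwellings digitized from an online map are ordered spatially, and every k-th one is selected after a random start. The coordinates are synthetic and the row-ordering heuristic is an assumption, not the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(7)
dwellings = rng.uniform(size=(500, 2))    # hypothetical digitized rooftop coordinates
n_sample = 40
# order dwellings by coarse row, then by x, to spread the sample spatially
rows = (dwellings[:, 1] * 10).astype(int)
order = np.lexsort((dwellings[:, 0], rows))
k = len(dwellings) // n_sample            # sampling interval
start = rng.integers(k)                   # random start within the first interval
sample = dwellings[order][start::k][:n_sample]
```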
NASA Astrophysics Data System (ADS)
Bakuła, K.; Ostrowski, W.; Szender, M.; Plutecki, W.; Salach, A.; Górski, K.
2016-06-01
This paper presents the possibilities of using an unmanned aerial system for evaluating the condition of levees. The unmanned aerial system is equipped with two types of sensor: an ultra-light laser scanner integrated with a GNSS receiver and an INS system, and a digital camera that acquires data with stereoscopic coverage. The sensors were mounted on the multirotor unmanned platform Hawk Moth, constructed by the MSP company. LiDAR data and images of levees several hundred metres in length were acquired during testing of the platform. Flights were performed in several variants. Control points measured with the GNSS technique served as reference data. The obtained results are presented in this paper; the methodology for processing the acquired LiDAR data, which increases accuracy when the navigation systems deliver low accuracy as a result of systematic errors, is also discussed. The Iterative Closest Point (ICP) algorithm, as well as measurements of control points, were used to georeference the LiDAR data. Final accuracy on the order of centimetres was obtained for generation of the digital terrain model. The final products of the proposed UAV data processing are digital elevation models, an orthophotomap and colour point clouds. The authors conclude that such a platform offers wide possibilities for low-budget flights to deliver data that may compete with typical direct surveying measurements performed during monitoring of such objects. The biggest advantage, however, is the density and continuity of the data, which allows for detection of changes in the objects being monitored.
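For reference, a minimal point-to-point ICP of the kind used to co-register point clouds: it alternates nearest-neighbour matching with a closed-form (SVD/Kabsch) rigid update. This is a generic textbook sketch, not the authors' implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(src, dst, iters=30):
    """Rigidly align src (N,3) to dst (M,3); returns rotation R and translation t."""
    R, t = np.eye(3), np.zeros(3)
    cur = src.copy()
    tree = cKDTree(dst)
    for _ in range(iters):
        _, idx = tree.query(cur)                 # nearest-neighbour correspondences
        q = dst[idx]
        mu_p, mu_q = cur.mean(0), q.mean(0)
        H = (cur - mu_p).T @ (q - mu_q)          # cross-covariance
        U, _, Vt = np.linalg.svd(H)
        Ri = Vt.T @ U.T
        if np.linalg.det(Ri) < 0:                # avoid reflections
            Vt[-1] *= -1
            Ri = Vt.T @ U.T
        ti = mu_q - Ri @ mu_p
        cur = cur @ Ri.T + ti
        R, t = Ri @ R, Ri @ t + ti               # accumulate the transform
    return R, t
```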
NASA Astrophysics Data System (ADS)
Jantzen, Connie; Slagle, Rick
1997-05-01
The distinction between exposure time and sample rate is often the first point raised in any discussion of high-speed imaging. Many high-speed events require exposure times considerably shorter than those that can be achieved solely by the sample rate of the camera, where exposure time equals 1/sample rate. Gating, a method of achieving short exposure times in digital cameras, is often difficult to achieve for exposure times shorter than 100 microseconds. This paper discusses the advantages and limitations of using the short-duration light pulse of a near-infrared laser with high-speed digital imaging systems. By closely matching the output wavelength of the pulsed laser to the peak near-infrared response of current sensors, high-speed image capture can be accomplished at very low (visible) light levels of illumination. By virtue of the short-duration light pulse, adjustable to as short as two microseconds, image capture of very high-speed events can be achieved at relatively low sample rates of less than 100 pictures per second, without image blur. For our initial investigations, we chose a ballistic subject. The results of early experimentation revealed the limitations of applying traditional ballistic imaging methods when using a pulsed infrared light source with a digital imaging system. These early disappointing results clarified the need to further identify the unique system characteristics of the digital imager and pulsed-infrared combination. It was also necessary to investigate how the infrared reflectance and transmittance of common materials affect the imaging process. This experimental work yielded a surprising, successful methodology which will prove useful in imaging ballistic and weapons tests, as well as forensics, flow visualizations, spray pattern analyses, and nocturnal animal behavioral studies.
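The exposure-versus-sample-rate point reduces to simple arithmetic: motion blur is roughly speed times exposure time. With the paper's 2 microsecond pulse and a hypothetical projectile speed, the blur drops from metres to millimetres.

```python
v = 900.0  # hypothetical projectile speed, m/s
for name, t_exp in [("2 us laser pulse", 2e-6),
                    ("100 us gated exposure", 100e-6),
                    ("1/sample-rate at 100 pps", 1 / 100)]:
    print(f"{name}: blur = {v * t_exp * 1e3:.1f} mm")
# 1.8 mm, 90 mm, and 9000 mm respectively: only the pulsed source freezes the bullet
```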
Applications and challenges of digital pathology and whole slide imaging.
Higgins, C
2015-07-01
Virtual microscopy is a method for digitizing images of tissue on glass slides and using a computer to view, navigate, change magnification, focus and mark areas of interest. Virtual microscope systems (also called digital pathology or whole slide imaging systems) offer several advantages for biological scientists who use slides as part of their general, pharmaceutical, biotechnology or clinical research. The systems usually are based on one of two methodologies: area scanning or line scanning. Virtual microscope systems enable automatic sample detection, virtual-Z acquisition and creation of focal maps. Virtual slides are layered with multiple resolutions at each location, including the highest resolution needed to allow more detailed review of specific regions of interest. Scans may be acquired at 2, 10, 20, 40, 60 and 100 × or a combination of magnifications to highlight important detail. Digital microscopy starts when a slide collection is put into an automated or manual scanning system. The original slides are archived, then a server allows users to review multilayer digital images of the captured slides either by a closed network or by the internet. One challenge for adopting the technology is the lack of a universally accepted file format for virtual slides. Additional challenges include maintaining focus in an uneven sample, detecting specimens accurately, maximizing color fidelity with optimal brightness and contrast, optimizing resolution and keeping the images artifact-free. There are several manufacturers in the field and each has not only its own approach to these issues, but also its own image analysis software, which provides many options for users to enhance the speed, quality and accuracy of their process through virtual microscopy. Virtual microscope systems are widely used and are trusted to provide high quality solutions for teleconsultation, education, quality control, archiving, veterinary medicine, research and other fields.
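Virtual slides are stored as multi-resolution pyramids, which is why a viewer can jump between magnifications without rescanning. A minimal reading sketch with the OpenSlide library is shown below, under the assumption that the slide is in a format OpenSlide reads (e.g., Aperio .svs); the file name and coordinates are hypothetical.

```python
import openslide

slide = openslide.OpenSlide("specimen.svs")       # hypothetical virtual-slide file
print(slide.level_count, slide.level_dimensions)  # the stored resolution pyramid
# read a 1024x1024 region of interest at full resolution (level 0)
region = slide.read_region((30_000, 12_000), 0, (1024, 1024)).convert("RGB")
thumb = slide.get_thumbnail((512, 512))           # low-resolution overview image
```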
Retinal imaging analysis based on vessel detection.
Jamal, Arshad; Hazim Alkawaz, Mohammed; Rehman, Amjad; Saba, Tanzila
2017-07-01
With advances in digital imaging and computing power, computationally intelligent technologies are in high demand for use in ophthalmology care and treatment. In the current research, Retina Image Analysis (RIA) was developed for optometrists at the Eye Care Center in Management and Science University. This research aims to analyze the retina through vessel detection. The RIA assists in the analysis of retinal images, and specialists are served with various options like saving, processing and analyzing retinal images through its advanced interface layout. Additionally, RIA assists in the selection of vessel segments, processing these vessels by calculating their diameter, standard deviation and length, and displaying the detected vessels on the retina. The Agile Unified Process was adopted as the methodology in developing this research. To conclude, Retina Image Analysis might help optometrists gain a better understanding when analyzing a patient's retina. The Retina Image Analysis procedure was developed using MATLAB (R2011b). Promising results are attained that are comparable with the state of the art. © 2017 Wiley Periodicals, Inc.
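The abstract does not say which vessel detector RIA uses; as a stand-in, the sketch below applies the common Frangi vesselness filter to the green channel, thresholds it, and skeletonizes the mask to get a crude vessel-length estimate. The file name and parameters are hypothetical, and the paper's MATLAB implementation is not reproduced.

```python
from skimage import filters, io, morphology

img = io.imread("fundus.png")                  # hypothetical retinal image
green = img[..., 1] / 255.0                    # green channel has best vessel contrast
vesselness = filters.frangi(1.0 - green)       # enhance dark tubular structures
mask = vesselness > filters.threshold_otsu(vesselness)
mask = morphology.remove_small_objects(mask, 64)
skel = morphology.skeletonize(mask)
length_px = skel.sum()                         # crude total vessel length in pixels
```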
Digital templating for THA: a simple computer-assisted application for complex hip arthritis cases.
Hafez, Mahmoud A; Ragheb, Gad; Hamed, Adel; Ali, Amr; Karim, Said
2016-10-01
Total hip arthroplasty (THA) is the standard procedure for end-stage arthritis of the hip. Its technical success relies on preoperative planning of the surgical procedure and a virtual setup of the operative performance. Digital hip templating is one methodology for preoperative THA planning; it requires a digital preoperative radiograph and a computer with special software. This is a prospective study involving 23 patients (25 hips) who were candidates for complex THA surgery (unilateral or bilateral). Digital templating was done by radiographic assessment using radiographic magnification correction, leg-length discrepancy and correction measurements, acetabular and femoral component templating, and neck resection measurement. The overall accuracy of templating the stem implant's exact size was 81%; this increased to 94% when sizing within one size was considered. Digital templating has proven to be an effective, reliable and essential technique for preoperative planning and accurate prediction of THA sizing and alignment.
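Magnification correction is simple ratio arithmetic once a calibration object of known size appears on the radiograph; all numbers below are hypothetical.

```python
marker_true_mm = 25.0      # calibration ball diameter placed at joint level (assumed)
marker_px = 180.0          # its measured diameter on the digital radiograph, pixels
mm_per_px = marker_true_mm / marker_px

femoral_head_px = 340.0    # measured on the same radiograph
femoral_head_mm = femoral_head_px * mm_per_px  # magnification-corrected size
print(f"{femoral_head_mm:.1f} mm")
```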
2014-01-01
Background: Digital image analysis has the potential to address issues surrounding traditional histological techniques including a lack of objectivity and high variability, through the application of quantitative analysis. A key initial step in image analysis is the identification of regions of interest. A widely applied methodology is that of segmentation. This paper proposes the application of image analysis techniques to segment skin tissue with varying degrees of histopathological damage. The segmentation of human tissue is challenging as a consequence of the complexity of the tissue structures and inconsistencies in tissue preparation, hence there is a need for a new robust method with the capability to handle the additional challenges materialising from histopathological damage. Methods: A new algorithm has been developed which combines enhanced colour information, created following a transformation to the L*a*b* colourspace, with general image intensity information. A colour normalisation step is included to enhance the algorithm’s robustness to variations in the lighting and staining of the input images. The resulting optimised image is subjected to thresholding and the segmentation is fine-tuned using a combination of morphological processing and object classification rules. The segmentation algorithm was tested on 40 digital images of haematoxylin & eosin (H&E) stained skin biopsies. Accuracy, sensitivity and specificity of the algorithmic procedure were assessed through the comparison of the proposed methodology against manual methods. Results: Experimental results show the proposed fully automated methodology segments the epidermis with a mean specificity of 97.7%, a mean sensitivity of 89.4% and a mean accuracy of 96.5%. When a simple user interaction step is included, the specificity increases to 98.0%, the sensitivity to 91.0% and the accuracy to 96.8%. The algorithm segments effectively for different severities of tissue damage. Conclusions: Epidermal segmentation is a crucial first step in a range of applications including melanoma detection and the assessment of histopathological damage in skin. The proposed methodology is able to segment the epidermis with different levels of histological damage. The basic method framework could be applied to segmentation of other epithelial tissues. PMID:24521154
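The paper's algorithm combines L*a*b* colour with intensity, colour normalisation, thresholding, morphology and classification rules; as a much-reduced stand-in, the sketch below thresholds the haematoxylin channel of an H&E image (the epidermis is nuclei-dense) and tidies the mask morphologically. The file name and structuring-element sizes are hypothetical.

```python
from skimage import color, filters, io, morphology

rgb = io.imread("biopsy.png")[..., :3]          # hypothetical H&E-stained image
hed = color.rgb2hed(rgb)                        # unmix haematoxylin/eosin/DAB stains
h = hed[..., 0]                                 # haematoxylin density channel
mask = h > filters.threshold_otsu(h)            # global threshold
mask = morphology.binary_closing(mask, morphology.disk(5))
mask = morphology.remove_small_objects(mask, min_size=500)
```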
NASA Astrophysics Data System (ADS)
Balletti, C.; Guerra, F.; Scocca, V.; Gottardi, C.
2015-02-01
Highly accurate documentation and 3D reconstructions are fundamental for analyses and further interpretations in archaeology. In recent years the integrated digital survey (ground-based survey methods and UAV photogrammetry) has confirmed its central role in the documentation and comprehension of excavation contexts, thanks to instrumental and methodological developments concerning on-site data acquisition. The specific aim of the project reported in this paper, realized by the Laboratory of Photogrammetry of the IUAV University of Venice, is to check different acquisition systems and test their effectiveness, considering each methodology individually or in integrated form. This research builds on the recognition that the integration of different survey methodologies can in fact increase the representative efficacy of the final representations, which are based on a wider and verified set of georeferenced metric data. In particular, integrating methods reduces or neutralizes the issues involved in surveying composite and complex objects, since the most appropriate tools and techniques can be chosen for the characteristics of each part of an archaeological site (i.e. urban structures, architectural monuments, small findings). This paper describes the experience in several sites of the municipality of Sepino (Molise, Italy), where the 3D digital acquisition of cities and of the structure of monuments, sometimes hard to reach, was realized using active and passive techniques (range-based and image-based methods). This acquisition was planned in order to obtain not only the basic support for interpretative analysis, but also models of the actual state of conservation of the site, on which reconstructive hypotheses can be based. Laser scanning data were merged with Structure from Motion clouds in the same reference system, given by a topographical and GPS survey. These 3D models are not only the final results of the metric survey, but also the starting point for the whole reconstruction of the city and its urban context from the research point of view. This reconstruction process will also concern some areas that have not yet been excavated, where the application of procedural modelling can offer important support to the reconstructive hypotheses.
Karakülah, Gökhan; Dicle, Oğuz; Koşaner, Ozgün; Suner, Aslı; Birant, Çağdaş Can; Berber, Tolga; Canbek, Sezin
2014-01-01
The lack of laboratory tests for the diagnosis of most congenital anomalies renders the physical examination of the case crucial for diagnosing the anomaly, and cases in the diagnostic phase are mostly evaluated in the light of the literature. In this respect, for accurate diagnosis, it is of great importance to support the decision maker by presenting the literature knowledge about a particular case. Here, we demonstrate a methodology for automatically scanning and determining the phenotypic features in case reports related to congenital anomalies in the literature, using text and natural language processing methods, and we create the framework of an information source for a potential diagnostic decision support system for congenital anomalies.
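The simplest form of such phenotype extraction is dictionary matching against a controlled vocabulary. The sketch below uses a tiny hand-made lexicon (a stand-in for a resource such as the Human Phenotype Ontology) and plain regular expressions; it falls far short of the full NLP pipeline the paper implies.

```python
import re

# toy lexicon standing in for an HPO-like phenotype vocabulary (hypothetical)
lexicon = {"cleft palate", "polydactyly", "micrognathia", "low-set ears"}

report = "Examination revealed micrognathia, low-set ears and bilateral polydactyly."
text = report.lower()
found = {term for term in lexicon
         if re.search(r"\b" + re.escape(term) + r"\b", text)}
print(found)  # phenotypic features detected in the case report
```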
Superimposition of 3D digital models: A case report.
José Viñas, María; Pie de Hierro, Verónica; M Ustrell-Torrent, Josep
2018-06-01
Superimposition of digital models may be performed to assess tooth movement in three dimensions. Detailed analysis of changes in tooth position after treatment may be achieved by this method. This article describes the method of superimposing digital models using a clinical case, and emphasizes the difficult procedure of superimposing 3D models in the lower arch. A methodology for superimposing mandibular models acquired with a structured-light 3D scanner is discussed. Superimposition of digital models is useful for analysing tooth movement in the three planes of space, and presents advantages over cephalogram superimposition. It seems feasible to superimpose digital models of the lower arch in non-growing patients by using a coordinate system based on the palatal rugae and occlusion. The described method aims to advance the difficult procedure of superimposing digital models in the mandibular arch, but further research is nonetheless required in this field. Copyright © 2018 CEO. Published by Elsevier Masson SAS. All rights reserved.
Christodoulides, Nicolaos J.; McRae, Michael P.; Abram, Timothy J.; Simmons, Glennon W.; McDevitt, John T.
2017-01-01
The lack of standard tools and methodologies and the absence of a streamlined multimarker approval process have hindered the translation rate of new biomarkers into clinical practice for a variety of diseases afflicting humankind. Advanced novel technologies with superior analytical performance and reduced reagent costs, like the programmable bio-nano-chip system featured in this article, have potential to change the delivery of healthcare. This universal platform system has the capacity to digitize biology, resulting in a sensor modality with a capacity to learn. With well-planned device design, development, and distribution plans, there is an opportunity to translate benchtop discoveries in the genomics, proteomics, metabolomics, and glycomics fields by transforming the information content of key biomarkers into actionable signatures that can empower physicians and patients for a better management of healthcare. While the process is complicated and will take some time, showcased here are three application areas for this flexible platform that combines biomarker content with minimally invasive or non-invasive sampling, such as brush biopsy for oral cancer risk assessment; serum, plasma, and small volumes of blood for the assessment of cardiac risk and wellness; and oral fluid sampling for drugs of abuse testing at the point of need. PMID:28589118
Analysis methods for Thematic Mapper data of urban regions
NASA Technical Reports Server (NTRS)
Wang, S. C.
1984-01-01
Studies have indicated the difficulty of deriving a detailed land-use/land-cover classification for heterogeneous metropolitan areas with Landsat MSS and TM data. The major methodological issues of digital analysis which possibly have affected the results of classification are examined. In response to these methodological issues, a multichannel hierarchical clustering algorithm has been developed and tested for a more complete analysis of the data for urban areas.
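For flavour, here is a generic multichannel agglomerative clustering sketch with SciPy; the six-band pixel matrix is random stand-in data, and Ward linkage cut into eight clusters is an arbitrary choice, not the paper's algorithm.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(0)
pixels = rng.random((500, 6))     # hypothetical 6-band TM reflectances, one row/pixel
Z = linkage(pixels, method="ward")                 # agglomerative tree on all channels
labels = fcluster(Z, t=8, criterion="maxclust")    # cut into 8 land-cover clusters
print(np.bincount(labels)[1:])                     # cluster sizes
```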
Aquino, Arturo; Gegundez-Arias, Manuel Emilio; Marin, Diego
2010-11-01
Optic disc (OD) detection is an important step in developing systems for automated diagnosis of various serious ophthalmic pathologies. This paper presents a new template-based methodology for segmenting the OD from digital retinal images. The methodology uses morphological and edge detection techniques followed by the Circular Hough Transform to obtain a circular OD boundary approximation. It requires a pixel located within the OD as initial information; for this purpose, a location methodology based on a voting-type algorithm is also proposed. The algorithms were evaluated on the 1200 images of the publicly available MESSIDOR database. The location procedure succeeded in 99% of cases, with an average computational time of 1.67 s and a standard deviation of 0.14 s. The segmentation algorithm yielded an average overlap between the automated segmentations and the true OD regions of 86%, with an average computational time of 5.69 s and a standard deviation of 0.54 s. A discussion of the advantages and disadvantages of the models more generally used for OD segmentation is also presented.
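The circular-boundary step maps naturally onto scikit-image's Hough utilities. A minimal sketch follows; the file name and radius range are hypothetical, and the paper's morphological pre-processing and voting-based localization are omitted.

```python
import numpy as np
from skimage import color, feature, io, transform

gray = color.rgb2gray(io.imread("fundus.png"))   # hypothetical retinal image
edges = feature.canny(gray, sigma=2)             # edge map for the Hough transform
radii = np.arange(40, 90, 2)                     # plausible OD radii in pixels (assumed)
hough = transform.hough_circle(edges, radii)
_, cx, cy, r = transform.hough_circle_peaks(hough, radii, total_num_peaks=1)
print(cx[0], cy[0], r[0])                        # circular OD boundary approximation
```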
Pilot production system cost/benefit analysis: Digital document storage project
NASA Technical Reports Server (NTRS)
1989-01-01
The Digital Document Storage (DDS)/Pilot Production System (PPS) will provide cost effective electronic document storage, retrieval, hard copy reproduction, and remote access for users of NASA Technical Reports. The DDS/PPS will result in major benefits, such as improved document reproduction quality within a shorter time frame than is currently possible. In addition, the DDS/PPS will provide an important strategic value through the construction of a digital document archive. It is highly recommended that NASA proceed with the DDS Prototype System and a rapid prototyping development methodology in order to validate recent working assumptions upon which the success of the DDS/PPS is dependent.
Measurements methodology for evaluation of Digital TV operation in VHF high-band
NASA Astrophysics Data System (ADS)
Pudwell Chaves de Almeida, M.; Vladimir Gonzalez Castellanos, P.; Alfredo Cal Braz, J.; Pereira David, R.; Saboia Lima de Souza, R.; Pereira da Soledade, A.; Rodrigues Nascimento Junior, J.; Ferreira Lima, F.
2016-07-01
This paper describes the experimental setup of field measurements carried out to evaluate the operation of ISDB-TB (Integrated Services Digital Broadcasting, Terrestrial, Brazilian version) digital TV in the VHF high-band. Measurements were performed in urban and suburban areas of a medium-sized Brazilian city. Besides direct measurements of received power and environmental noise, a procedure involving the injection of additive Gaussian noise was employed to reach the signal-to-noise ratio threshold at each measurement site. The analysis includes results of static reception measurements evaluating the received field strength and the signal-to-noise ratio thresholds for correct signal decoding.
Collusion-resistant multimedia fingerprinting: a unified framework
NASA Astrophysics Data System (ADS)
Wu, Min; Trappe, Wade; Wang, Z. Jane; Liu, K. J. Ray
2004-06-01
Digital fingerprints are unique labels inserted into different copies of the same content before distribution. Each digital fingerprint is assigned to an intended recipient, and can be used to trace the culprits who use their content for unintended purposes. Attacks mounted by multiple users, known as collusion attacks, provide a cost-effective method for attenuating the identifying fingerprint of each colluder, so collusion poses a real challenge to protecting digital media data and enforcing usage policies. This paper examines a few major design methodologies for collusion-resistant fingerprinting of multimedia, and presents a unified framework that helps highlight the common issues and the uniqueness of different fingerprinting techniques.
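The canonical threat is the averaging attack against spread-spectrum fingerprints. The toy simulation below embeds Gaussian fingerprints, averages five colluders' copies, and runs a correlation detector with the host signal known (non-blind detection). Sizes and the detector are illustrative only, not the paper's framework.

```python
import numpy as np

rng = np.random.default_rng(0)
N, users, K = 10_000, 50, 5
W = rng.standard_normal((users, N))           # one Gaussian fingerprint per user
host = 10.0 * rng.standard_normal(N)          # host media signal

colluders = rng.choice(users, K, replace=False)
pirated = host + W[colluders].mean(axis=0)    # averaging collusion attack

corr = W @ (pirated - host) / N               # correlation detector (host known)
accused = np.argsort(corr)[-K:]               # top-K correlations
print(sorted(accused), sorted(colluders))     # averaging attenuates but rarely hides
```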
Das, Arpita; Bhattacharya, Mahua
2011-01-01
In the present work, the authors have developed a treatment planning system implementing genetic-based neuro-fuzzy approaches for accurate analysis of the shape and margin of tumor masses appearing in digital mammograms of the breast. A complicated structure invites the problems of overlearning and misclassification; in the proposed methodology, a genetic algorithm (GA) is therefore used to search for effective input feature vectors, combined with an adaptive neuro-fuzzy model for the final classification of the different boundaries of tumor masses. The study involves 200 digitized mammograms from the MIAS and other databases and has shown an 86% correct classification rate.
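The GA side of such a system can be sketched as a bit-mask search over features. In the sketch below the fitness is a crude class-separation proxy standing in for the neuro-fuzzy classifier's accuracy, and all data, population sizes and rates are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
X = rng.random((200, 20))                 # toy mammographic feature matrix
y = rng.integers(0, 2, 200)               # toy mass-boundary labels

def fitness(mask):
    """Class-mean separation over the selected features (classifier stand-in)."""
    if not mask.any():
        return 0.0
    return np.abs(X[y == 1][:, mask].mean(0) - X[y == 0][:, mask].mean(0)).mean()

pop = rng.integers(0, 2, (30, 20), dtype=bool)     # population of feature masks
for _ in range(50):
    fit = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(fit)[-10:]]           # truncation selection
    kids = []
    while len(kids) < 20:
        a, b = parents[rng.integers(10, size=2)]
        cut = rng.integers(1, 19)
        child = np.concatenate([a[:cut], b[cut:]]) # one-point crossover
        child ^= rng.random(20) < 0.02             # bit-flip mutation
        kids.append(child)
    pop = np.vstack([parents, kids])
best = pop[np.argmax([fitness(m) for m in pop])]   # selected feature subset
```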
On the Development of Arabic Three-Digit Number Processing in Primary School Children
ERIC Educational Resources Information Center
Mann, Anne; Moeller, Korbinian; Pixner, Silvia; Kaufmann, Liane; Nuerk, Hans-Christoph
2012-01-01
The development of two-digit number processing in children, and in particular the influence of place-value understanding, has recently received increasing research interest. However, place-value influences leading to decomposed processing have not yet been investigated for multi-digit numbers beyond the two-digit number range in children.…
Fontaine, Guillaume; Lavallée, Andréane; Maheu-Cadotte, Marc-André; Bouix-Picasso, Julien; Bourbonnais, Anne
2018-01-30
The optimisation of health science communication (HSC) between researchers and the public is crucial. In the last decade, the rise of the digital and social media ecosystem allowed for the disintermediation of HSC. Disintermediation refers to the public's direct access to information from researchers about health science-related topics through the digital and social media ecosystem, a process that would otherwise require a human mediator, such as a journalist. Therefore, the primary aim of this scoping review is to describe the nature and the extent of the literature regarding HSC strategies involving disintermediation used by researchers with the public in the digital and social media ecosystem. The secondary aim is to describe the HSC strategies used by researchers, and the communication channels associated with these strategies. We will conduct a scoping review based on the Joanna Briggs Institute's methodology and perform a systematic search of six bibliographical databases (CINAHL, EMBASE, IBSS, PubMed, Sociological Abstracts and Web of Science), four trial registries and relevant sources of grey literature. Relevant journals and reference lists of included records will be hand-searched. Data will be managed using the EndNote software and the Rayyan web application. Two review team members will perform independently the screening process as well as the full-text assessment of included records. Descriptive data will be synthesised in a tabular format. Data regarding the nature and the extent of the literature, the HSC strategies and the associated communication channels will be presented narratively. This review does not require institutional review board approval as we will use only collected and published data. Results will allow the mapping of the literature about HSC between researchers and the public in the digital and social media ecosystem, and will be published in a peer-reviewed journal. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
BPSK Demodulation Using Digital Signal Processing
NASA Technical Reports Server (NTRS)
Garcia, Thomas R.
1996-01-01
A digital communications signal is a sinusoidal waveform that is modified by a binary (digital) information signal. The sinusoidal waveform is called the carrier. The carrier may be modified in amplitude, frequency, phase, or a combination of these. In this project a binary phase shift keyed (BPSK) signal is the communication signal. In a BPSK signal the phase of the carrier is set to one of two states, 180 degrees apart, by a binary (i.e., 1 or 0) information signal. A digital signal is a sampled version of a "real world" time continuous signal. The digital signal is generated by sampling the continuous signal at discrete points in time. The rate at which the signal is sampled is called the sampling rate (f(s)). The device that performs this operation is called an analog-to-digital (A/D) converter or a digitizer. The digital signal is composed of the sequence of individual values of the sampled BPSK signal. Digital signal processing (DSP) is the modification of the digital signal by mathematical operations. A device that performs this processing is called a digital signal processor. After processing, the digital signal may then be converted back to an analog signal using a digital-to-analog (D/A) converter. The goal of this project is to develop a system that will recover the digital information from a BPSK signal using DSP techniques. The project is broken down into the following steps: (1) Development of the algorithms required to demodulate the BPSK signal; (2) Simulation of the system; and (3) Implementation of a BPSK receiver using digital signal processing hardware.
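Under the simplifying assumption of a perfectly recovered carrier and symbol clock (a real receiver would add carrier recovery, e.g., a Costas loop, and timing recovery), coherent BPSK demodulation reduces to mix, low-pass filter, and slice. All rates below are hypothetical.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs, fc, rb = 48_000, 6_000, 1_000        # sample, carrier, bit rates (hypothetical)
bits = np.random.default_rng(2).integers(0, 2, 32)
sym = np.repeat(2 * bits - 1, fs // rb)  # NRZ symbols, 48 samples per bit
t = np.arange(sym.size) / fs
rx = sym * np.cos(2 * np.pi * fc * t)    # BPSK: carrier phase 0 or 180 degrees

mixed = rx * np.cos(2 * np.pi * fc * t)  # mix with the local carrier
b, a = butter(4, 2 * rb / (fs / 2))      # low-pass removes the 2*fc component
base = filtfilt(b, a, mixed)

mid = np.arange(bits.size) * (fs // rb) + (fs // rb) // 2
rec = (base[mid] > 0).astype(int)        # sample mid-bit and slice
print((rec == bits).all())               # True: the bit stream is recovered
```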
Deriving Accessible Science Books for the Blind Students of Physics
NASA Astrophysics Data System (ADS)
Kouroupetroglou, Georgios; Kacorri, Hernisa
2010-01-01
We present a novel integrated methodology for the development and production of accessible physics and science books from the elementary up to the tertiary educational level. This language-independent approach adopts the Design-for-All principles, the available international standards for alternative formats, and the Universal Design for Learning (UDL) Guidelines. Moreover, it supports both static (embossed and refreshable tactile) and dynamic (based on synthetic speech and other sounds) accessibility. It can produce Tactile Books (embossed Braille and tactile graphics), Digital Talking Books (or Digital Audio Books), Large Print Books, as well as Acoustic-Tactile Books for blind and visually impaired students and for the print-disabled. This methodology has been successfully applied in the case of blind students of the Physics, Mathematics and Informatics Departments at the University of Athens.
NASA Astrophysics Data System (ADS)
Shultheis, C. F.
1985-02-01
This technical report describes an analysis of the performance allocations for a satellite link, focusing specifically on a single-hop 7 to 8 GHz link of the Defense Satellite Communications System (DSCS). The analysis is performed for three primary reasons: (1) to reevaluate link power margin requirements for DSCS links based on digital signalling; (2) to analyze the implications of satellite availability and error rate allocations contained in proposed MIL-STD-188-323, system design and engineering standards for long haul digital transmission system performance; and (3) to standardize a methodology for determination of rain-related propagation constraints. The aforementioned methodology is then used to calculate the link margin requirements of typical DSCS binary/quaternary phase shift keying (BPSK/QPSK) links at 7 to 8 GHz for several different Earth terminal locations.
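The flavour of such a margin calculation can be shown in a few lines of link-budget arithmetic. Every figure below (EIRP, G/T, data rate, required Eb/N0, rain allocation) is a hypothetical placeholder, not a DSCS allocation from the report.

```python
import numpy as np

f_hz, d_m = 7.5e9, 38_000e3          # 7-8 GHz band, GEO slant range (assumed)
fspl_db = 20 * np.log10(4 * np.pi * d_m * f_hz / 3e8)   # ~201.5 dB free-space loss

eirp_dbw, g_over_t = 40.0, 20.0      # hypothetical satellite EIRP and terminal G/T
k_dbw = -228.6                       # Boltzmann's constant, dBW/Hz/K
cn0_clear_db = eirp_dbw - fspl_db + g_over_t - k_dbw    # clear-sky C/N0, dBHz

rb_dbhz = 10 * np.log10(1e6)         # 1 Mb/s digital link
ebn0_req_db = 9.6                    # uncoded BPSK/QPSK at 1e-5 BER (textbook value)
rain_db = 4.0                        # assumed rain-attenuation allocation
margin_db = cn0_clear_db - rb_dbhz - ebn0_req_db - rain_db
print(f"link margin = {margin_db:.1f} dB")
```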
The Virtual Hospital: experiences in creating and sustaining a digital library.
D'Alessandro, M P; Galvin, J R; Erkonen, W E; Choi, T A; Lacey, D L; Colbert, S I
1998-01-01
A university and its faculty encompass a wealth of content, which is often freely supplied to commercial publishers who profit from it. Emerging digital library technology holds promise for the creation of digital libraries and digital presses that allow faculty and universities to bypass commercial publishers, retain control of their content, and distribute it directly to users, allowing the university and faculty to better serve their constituencies. The purpose of this paper is to show how this can be done. A methodology for overcoming the technical, social, political, and economic barriers involved in creating, distributing and organizing a digital library was developed, implemented, and refined over seven years. Over the seven years, 120 textbooks and booklets were placed in the Virtual Hospital digital library, from 159 authors in twenty-nine departments and four colleges at The University of Iowa. The digital library received extensive use by individuals around the world. A new paradigm for academic publishing was created, involving a university- and faculty-owned peer-reviewed digital press implemented using digital library technology. The concept has been embraced by The University of Iowa, which has pledged to sustain the digital press in order to allow The University of Iowa to fulfill its mission of creating, organizing, and disseminating information better. PMID:9803300
A methodology to event reconstruction from trace images.
Milliet, Quentin; Delémont, Olivier; Sapin, Eric; Margot, Pierre
2015-03-01
The widespread use of digital imaging devices for surveillance (CCTV) and entertainment (e.g., mobile phones, compact cameras) has increased the number of images recorded and opportunities to consider the images as traces or documentation of criminal activity. The forensic science literature focuses almost exclusively on technical issues and evidence assessment [1]. Earlier steps in the investigation phase have been neglected and must be considered. This article is the first comprehensive description of a methodology to event reconstruction using images. This formal methodology was conceptualised from practical experiences and applied to different contexts and case studies to test and refine it. Based on this practical analysis, we propose a systematic approach that includes a preliminary analysis followed by four main steps. These steps form a sequence for which the results from each step rely on the previous step. However, the methodology is not linear, but it is a cyclic, iterative progression for obtaining knowledge about an event. The preliminary analysis is a pre-evaluation phase, wherein potential relevance of images is assessed. In the first step, images are detected and collected as pertinent trace material; the second step involves organising and assessing their quality and informative potential. The third step includes reconstruction using clues about space, time and actions. Finally, in the fourth step, the images are evaluated and selected as evidence. These steps are described and illustrated using practical examples. The paper outlines how images elicit information about persons, objects, space, time and actions throughout the investigation process to reconstruct an event step by step. We emphasise the hypothetico-deductive reasoning framework, which demonstrates the contribution of images to generating, refining or eliminating propositions or hypotheses. This methodology provides a sound basis for extending image use as evidence and, more generally, as clues in investigation and crime reconstruction processes. Copyright © 2015 Forensic Science Society. Published by Elsevier Ireland Ltd. All rights reserved.
The scheme machine: A case study in progress in design derivation at system levels
NASA Technical Reports Server (NTRS)
Johnson, Steven D.
1995-01-01
The Scheme Machine is one of several design projects of the Digital Design Derivation group at Indiana University. It differs from the other projects in its focus on issues of system design and its connection to surrounding research in programming language semantics, compiler construction, and programming methodology underway at Indiana and elsewhere. The genesis of the project dates to the early 1980's, when digital design derivation research branched from the surrounding research effort in programming languages. Both branches have continued to develop in parallel, with this particular project serving as a bridge. However, by 1990 there remained little real interaction between the branches and recently we have undertaken to reintegrate them. On the software side, researchers have refined a mathematically rigorous (but not mechanized) treatment starting with the fully abstract semantic definition of Scheme and resulting in an efficient implementation consisting of a compiler and virtual machine model, the latter typically realized with a general purpose microprocessor. The derivation includes a number of sophisticated factorizations and representations and is also a deep example of the underlying engineering methodology. The hardware research has created a mechanized algebra supporting the tedious and massive transformations often seen at lower levels of design. This work has progressed to the point that large scale devices, such as processors, can be derived from first-order finite state machine specifications. This is roughly where the language oriented research stops; thus, together, the two efforts establish a thread from the highest levels of abstract specification to detailed digital implementation. The Scheme Machine project challenges hardware derivation research in several ways, although the individual components of the system are of a similar scale to those we have worked with before. The machine has a custom dual-ported memory to support garbage collection. It consists of four tightly coupled processes--processor, collector, allocator, memory--with a very non-trivial synchronization relationship. Finally, there are deep issues of representation for the run-time objects of a symbolic processing language. The research centers on verification through integrated formal reasoning systems, but is also involved with modeling and prototyping environments. Since the derivation algebra is based on an executable modeling language, there is opportunity to incorporate design animation in the design process. We are looking for ways to move smoothly and incrementally from executable specifications into hardware realization. For example, we can run the garbage collector specification, a Scheme program, directly against the physical memory prototype, and similarly, the instruction processor model against the heap implementation.
ERIC Educational Resources Information Center
Morse, Emile L.; Schmidt, Heidi; Butter, Karen; Rider, Cynthia; Hickey, Thomas B.; O'Neill, Edward T.; Toves, Jenny; Green, Marlan; Soy, Sue; Gunn, Stan; Galloway, Patricia
2002-01-01
Includes four articles that discuss evaluation methods for information management systems under the Defense Advanced Research Projects Agency; building digital libraries at the University of California San Francisco's Tobacco Control Archives; IFLA's Functional Requirements for Bibliographic Records; and designing the Texas email repository model…
An object-based approach for tree species extraction from digital orthophoto maps
NASA Astrophysics Data System (ADS)
Jamil, Akhtar; Bayram, Bulent
2018-05-01
Tree segmentation is an active and ongoing research area in photogrammetry and remote sensing, made more challenging by both intra-class and inter-class similarities among tree species. In this study, we exploited various statistical features for the extraction of hazelnut trees from 1:5000-scale digital orthophoto maps. Initially, non-vegetation areas were eliminated using the traditional normalized difference vegetation index (NDVI), followed by mean shift segmentation to transform the pixels into meaningful homogeneous objects. To eliminate false positives, morphological opening and closing were applied to the candidate objects, and a number of heuristics, such as shadow removal and bounding-box aspect ratios, were derived to eliminate unwanted objects before the classification stage. Finally, a knowledge-based decision tree was constructed to distinguish hazelnut trees from the remaining objects, which include man-made structures and other types of vegetation. We evaluated the proposed methodology on 10 sample orthophoto maps from Giresun province in Turkey, taking manually digitized hazelnut tree boundaries as reference data for accuracy assessment. Both the manually digitized and the segmented tree borders were converted into binary images and the differences were calculated. The proposed methodology achieved an overall accuracy of more than 85% on all sample images.
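A minimal sketch of the masking and clean-up stages described above, assuming a co-registered red and near-infrared band and using scikit-image; the thresholds, structuring-element size, and aspect-ratio heuristic are illustrative placeholders, not the study's tuned values.

```python
import numpy as np
from skimage import measure, morphology

def candidate_tree_mask(red, nir, ndvi_thresh=0.3, min_area=50):
    """NDVI threshold, morphological opening/closing, then simple
    per-object heuristics (area, bounding-box aspect ratio)."""
    ndvi = (nir - red) / (nir + red + 1e-9)
    mask = ndvi > ndvi_thresh                      # drop non-vegetation
    selem = morphology.disk(2)
    mask = morphology.binary_opening(mask, selem)  # remove speckle
    mask = morphology.binary_closing(mask, selem)  # fill small gaps
    labels = measure.label(mask)
    keep = np.zeros_like(mask)
    for region in measure.regionprops(labels):
        h = region.bbox[2] - region.bbox[0]
        w = region.bbox[3] - region.bbox[1]
        aspect = max(h, w) / max(1, min(h, w))
        if region.area >= min_area and aspect < 3.0:
            keep[labels == region.label] = True
    return keep
```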
Cherepy, Nerine Jane; Payne, Stephen Anthony; Drury, Owen B.; Sturm, Benjamin W.
2016-02-09
According to one embodiment, a scintillator radiation detector system includes a scintillator, and a processing device for processing pulse traces corresponding to light pulses from the scintillator, where the processing device is configured to: process each pulse trace over at least two temporal windows and to use pulse digitization to improve energy resolution of the system. According to another embodiment, a scintillator radiation detector system includes a processing device configured to: fit digitized scintillation waveforms to an algorithm, perform a direct integration of fit parameters, process multiple integration windows for each digitized scintillation waveform to determine a correction factor, and apply the correction factor to each digitized scintillation waveform.
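The following toy sketch illustrates the general two-window idea described in the abstract: integrate a digitized pulse over a prompt window and over the full window, then apply a shape-dependent correction. The window bounds and the linear correction form are our assumptions for illustration, not the patented algorithm itself.

```python
import numpy as np

def corrected_energy(trace, prompt=(0, 40), total=(0, 400), a=0.8, b=0.4):
    """Toy two-window pulse processing for a digitized scintillation
    pulse. Window bounds and coefficients are illustrative stand-ins
    for values that would come from detector calibration."""
    e_prompt = float(np.sum(trace[prompt[0]:prompt[1]]))
    e_total = float(np.sum(trace[total[0]:total[1]]))
    ratio = e_prompt / e_total            # pulse-shape metric
    return e_total * (a + b * ratio)      # shape-corrected energy
```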
Rodriguez-Donate, Carlos; Morales-Velazquez, Luis; Osornio-Rios, Roque Alfredo; Herrera-Ruiz, Gilberto; de Jesus Romero-Troncoso, Rene
2010-01-01
Intelligent robotics demands the integration of smart sensors that allow the controller to efficiently measure physical quantities. Industrial manipulator robots require a constant monitoring of several parameters such as motion dynamics, inclination, and vibration. This work presents a novel smart sensor to estimate motion dynamics, inclination, and vibration parameters on industrial manipulator robot links based on two primary sensors: an encoder and a triaxial accelerometer. The proposed smart sensor implements a new methodology based on an oversampling technique, averaging decimation filters, FIR filters, finite differences and linear interpolation to estimate the interest parameters, which are computed online utilizing digital hardware signal processing based on field programmable gate arrays (FPGA). PMID:22319345
Interactive Therapeutic Multi-sensory Environment for Cerebral Palsy People
NASA Astrophysics Data System (ADS)
Mauri, Cesar; Solanas, Agusti; Granollers, Toni; Bagés, Joan; García, Mabel
The Interactive Therapeutic Sensory Environment (ITSE) research project offers new opportunities for stimulation, interaction and interactive creation for people with moderate and severe mental and physical disabilities. Mainly based on computer vision techniques, the ITSE project allows users’ gestures to be captured and transformed into images, sounds and vibrations. Currently, at the APPC, we are working on a prototype capable of generating sounds from the users’ motion and of digitally processing the users’ vocal sounds. Tests with impaired users show that ITSE promotes participation, engagement and play. In this paper, we briefly describe the ITSE system, the experimental methodology, the preliminary results and some future goals.
NASA Astrophysics Data System (ADS)
Malinconico, L. L., Jr.; Sunderlin, D.; Liew, C. W.
2015-12-01
Over the course of the last three years we have designed, developed and refined two Apps for the iPad. GeoFieldBook and StratLogger allow for the real-time display of spatial (structural) and temporal (stratigraphic) field data as well as very easy in-field navigation. These tools have dramatically advanced and simplified how we collect and analyze data in the field. The Apps are not geologic mapping programs, but rather a way of bypassing the analog field book step to acquire digital data directly that can then be used in various analysis programs (GIS, Google Earth, Stereonet, spreadsheet and drawing programs). We now complete all of our fieldwork digitally. GeoFieldBook can be used to collect structural and other field observations. Each record includes location/date/time information, orientation measurements, formation names, text observations and photos taken with the tablet camera. Records are customizable, so users can add fields of their own choosing. Data are displayed on an image base in real time with oriented structural symbols. The image base is also used for in-field navigation. In StratLogger, the user records bed thickness, lithofacies, biofacies, and contact data in preset and modifiable fields. Each bed/unit record may also be photographed and geo-referenced. As each record is collected, a column diagram of the stratigraphic sequence is built in real time, complete with lithology color, lithology texture, and fossil symbols. The recorded data from any measured stratigraphic sequence can be exported as both the live-drawn column image and as a .csv formatted file for use in spreadsheet or other applications. Common to both Apps is the ability to export the data (via .csv files), photographs and maps or stratigraphic columns (images). Since the data are digital they are easily imported into various processing programs (for example for stereoplot analysis). Requiring that all maps, stratigraphic columns and cross-sections be produced digitally continues our integration of digital technologies throughout the curriculum. Initial evaluation suggests that students using the Apps more quickly progress towards synthesis and interpretation of the data as well as a deeper understanding of complex 4D field relationships.
Levitt, Harry
2007-01-01
This article provides the author's perspective on the development of digital hearing aids and how digital signal processing approaches have led to changes in hearing aid design. Major landmarks in the evolution of digital technology are identified, and their impact on the development of digital hearing aids is discussed. Differences between analog and digital approaches to signal processing in hearing aids are identified. PMID:17301334
The technique for 3D printing patient-specific models for auricular reconstruction.
Flores, Roberto L; Liss, Hannah; Raffaelli, Samuel; Humayun, Aiza; Khouri, Kimberly S; Coelho, Paulo G; Witek, Lukasz
2017-06-01
Currently, surgeons approach autogenous microtia repair by creating a two-dimensional (2D) tracing of the unaffected ear to approximate a three-dimensional (3D) construct, a difficult process. To address these shortcomings, this study introduces the fabrication of a patient-specific, sterilizable 3D printed auricular model for autogenous auricular reconstruction. A high-resolution 3D digital photograph was captured of the patient's unaffected ear and surrounding anatomic structures. The photographs were exported and uploaded into Amira for transformation into a digital (.stl) model, which was imported into Blender, an open source software platform for digital modification of data. The unaffected auricle was digitally isolated and inverted to render a model for the contralateral side. The scapha, triangular fossa, and cymba were deepened to accentuate their contours. Extra relief was added to the helical root to further distinguish this structure. The ear was then digitally deconstructed and separated into its individual auricular components for reconstruction. The completed ear and its individual components were 3D printed using polylactic acid (PLA) filament and sterilized following manufacturer specifications. The sterilized models were brought to the operating room to be utilized by the surgeon. The models allowed for more accurate anatomic measurements compared to 2D tracings, which reduced the degree of estimation required by surgeons. Approximately 20 g of PLA filament were utilized for the construction of these models, yielding a total material cost of approximately $1. Using the methodology detailed in this report, as well as departmentally available resources (3D digital photography and 3D printing), a sterilizable, patient-specific, and inexpensive 3D auricular model was fabricated to be used intraoperatively. This technique of printing customized-to-patient models for surgeons to use as 'guides' shows great promise. Copyright © 2017 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.
Digital-image processing and image analysis of glacier ice
Fitzpatrick, Joan J.
2013-01-01
This document provides a methodology for extracting grain statistics from 8-bit color and grayscale images of thin sections of glacier ice—a subset of physical properties measurements typically performed on ice cores. This type of analysis is most commonly used to characterize the evolution of ice-crystal size, shape, and intercrystalline spatial relations within a large body of ice sampled by deep ice-coring projects from which paleoclimate records will be developed. However, such information is equally useful for investigating the stress state and physical responses of ice to stresses within a glacier. The methods of analysis presented here go hand-in-hand with the analysis of ice fabrics (aggregate crystal orientations) and, when combined with fabric analysis, provide a powerful method for investigating the dynamic recrystallization and deformation behaviors of bodies of ice in motion. The procedures described in this document compose a step-by-step handbook for a specific image acquisition and data reduction system built in support of U.S. Geological Survey ice analysis projects, but the general methodology can be used with any combination of image processing and analysis software. The specific approaches in this document use the FoveaPro 4 plug-in toolset to Adobe Photoshop CS5 Extended, but the analysis can be carried out equally well, though somewhat less conveniently, with software such as the image processing toolbox in MATLAB, Image-Pro Plus, or ImageJ.
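A minimal grain-extraction pass of the kind described, written against scikit-image rather than the FoveaPro/Photoshop toolchain the handbook uses; the threshold choice and minimum grain size are illustrative assumptions.

```python
import numpy as np
from skimage import filters, measure, morphology, segmentation

def grain_stats(gray):
    """Generic grain statistics for an 8-bit thin-section image: Otsu
    threshold, clean-up, connected-component labelling, then per-grain
    size and shape measures."""
    binary = gray > filters.threshold_otsu(gray)
    binary = morphology.remove_small_objects(binary, min_size=64)
    binary = segmentation.clear_border(binary)  # drop grains cut by the frame
    labels = measure.label(binary)
    out = []
    for r in measure.regionprops(labels):
        eq_diam = np.sqrt(4.0 * r.area / np.pi)  # circle-equivalent diameter
        out.append((r.label, r.area, eq_diam, r.eccentricity, r.orientation))
    return out
```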
Digital Signal Processing and Control for the Study of Gene Networks
NASA Astrophysics Data System (ADS)
Shin, Yong-Jun
2016-04-01
Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircrafts. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks.
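As a hedged illustration of this viewpoint, the sketch below treats one gene's expression as a first-order discrete-time system and closes the loop with a proportional controller; the model, gains, and reference level are invented for illustration and are not taken from the article.

```python
import numpy as np

# Gene expression as a first-order discrete-time system
# x[k+1] = a*x[k] + b*u[k], driven toward a reference level r by a
# proportional controller u[k] = Kp*(r - x[k]).
a, b = 0.9, 0.5            # assumed decay and input gains
r, Kp = 10.0, 1.2          # target expression level and controller gain
x, log = 0.0, []
for k in range(50):
    u = Kp * (r - x)       # feedback input (e.g., inducer dose)
    x = a * x + b * u      # one sampling period of the dynamics
    log.append(x)
print(round(log[-1], 2))   # settles near r, with the usual P-control offset
```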
Dotan, Dror; Friedmann, Naama
2018-04-01
We propose a detailed cognitive model of multi-digit number reading. The model postulates separate processes for visual analysis of the digit string and for oral production of the verbal number. Within visual analysis, separate sub-processes encode the digit identities and the digit order, and additional sub-processes encode the number's decimal structure: its length, the positions of 0, and the way it is parsed into triplets (e.g., 314987 → 314,987). Verbal production consists of a process that generates the verbal structure of the number, and another process that retrieves the phonological forms of each number word. The verbal number structure is first encoded in a tree-like structure, similarly to syntactic trees of sentences, and then linearized to a sequence of number-word specifiers. This model is based on an investigation of the number processing abilities of seven individuals with different selective deficits in number reading. We report participants with impairment in specific sub-processes of the visual analysis of digit strings - in encoding the digit order, in encoding the number length, or in parsing the digit string to triplets. Other participants were impaired in verbal production, making errors in the number structure (shifts of digits to another decimal position, e.g., 3,040 → 30,004). Their selective deficits yielded several dissociations: first, we found a double dissociation between visual analysis deficits and verbal production deficits. Second, several dissociations were found within visual analysis: a double dissociation between errors in digit order and errors in the number length; a dissociation between order/length errors and errors in parsing the digit string into triplets; and a dissociation between the processing of different digits - impaired order encoding of the digits 2-9, without errors in the 0 position. Third, within verbal production, a dissociation was found between digit shifts and substitutions of number words. A selective deficit in any of the processes described by the model would cause difficulties in number reading, which we propose to term "dysnumeria". Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Vaiopoulos, Aristidis D.; Georgopoulos, Andreas; Lozios, Stylianos G.
2012-10-01
Digital 3D modeling is a relatively new field of interest that continues to gain ground. The methodologies, the accuracy, and the time and effort required to produce a high-quality 3D model have changed drastically over the last few years. Whereas in the early days of digital 3D modeling, 3D models were only accessible to computer experts in animation working many hours in expensive, sophisticated software, today 3D modeling has become reasonably fast and convenient. Moreover, with online 3D modeling software such as 123D Catch, nearly everyone can produce 3D models with minimal effort and at no cost. The only requirement is a set of panoramic overlapping images of the (still) objects the user wishes to model. This approach, however, has limitations in the accuracy of the resulting model. One objective of the study is to examine these limitations by assessing the accuracy of this 3D modeling methodology against a Terrestrial Laser Scanner (TLS). The scope of this study is therefore to present and compare 3D models produced with two different methods: 1) the traditional TLS method, using a Leica ScanStation 2, and 2) panoramic overlapping images obtained with a DSLR camera and processed with the free 123D Catch software. The main objective is to evaluate the advantages and disadvantages of the two 3D model producing methodologies. The area represented by the 3D models features multi-scale folding in a cipollino marble formation. The most interesting part, and the most challenging to capture accurately, is an outcrop that includes vertically oriented micro-folds. These micro-folds have dimensions of a few centimeters, while a relatively strong relief is evident between them (perhaps due to differing material composition). The area of interest is located on Mt. Hymittos, Greece.
Instruments and Methodologies for the Underwater Tridimensional Digitization and Data Musealization
NASA Astrophysics Data System (ADS)
Repola, L.; Memmolo, R.; Signoretti, D.
2015-04-01
In the research started within the SINAPSIS project of the Università degli Studi Suor Orsola Benincasa, an underwater stereoscopic scanning system aimed at surveying submerged archaeological sites, integrable with standard systems for geomorphological detection of the coast, has been developed. The project involves the construction of hardware, consisting of an aluminum frame supporting a pair of GoPro Hero Black Edition cameras, and of software for the production of point clouds and the initial processing of data. The software has features for calibrating the stereoscopic vision system, for reducing the noise and distortion of underwater captured images, for searching for corresponding points of stereoscopic images using stereo-matching algorithms (dense and sparse), and for point cloud generation and filtering. Only after various calibration and survey tests, carried out during the excavations envisaged in the project, was mastery of the methods for efficient data acquisition achieved. The current development of the system has allowed the generation of portions of digital models of real submerged scenes. A semi-automatic procedure for the global registration of partial models is under development as a useful aid for the study and musealization of sites.
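A compressed sketch of the dense-matching stage using OpenCV, assuming the image pair has already been undistorted and rectified with the calibration described above; the file names, matcher parameters, and saved reprojection matrix Q are placeholders, not the project's actual settings.

```python
import cv2
import numpy as np

# Load a rectified grayscale pair (placeholder file names).
left = cv2.imread("left_rect.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right_rect.png", cv2.IMREAD_GRAYSCALE)

# Semi-global dense matcher; parameters are illustrative defaults.
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128,
                                blockSize=5, P1=8 * 5 * 5, P2=32 * 5 * 5)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0

# Q is the 4x4 reprojection matrix from calibration (cv2.stereoRectify).
Q = np.load("Q.npy")
points = cv2.reprojectImageTo3D(disparity, Q)  # per-pixel XYZ
cloud = points[disparity > 0]                  # keep matched pixels only
```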
Radar echo processing with partitioned de-ramp
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dubbert, Dale F.; Tise, Bertice L.
2013-03-19
The spurious-free dynamic range of a wideband radar system is increased by apportioning de-ramp processing across analog and digital processing domains. A chirp rate offset is applied between the received waveform and the reference waveform that is used for downconversion to the intermediate frequency (IF) range. The chirp rate offset results in a residual chirp in the IF signal prior to digitization. After digitization, the residual IF chirp is removed with digital signal processing.
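A numerical sketch of the partitioned idea for an idealized zero-delay return: mix against a reference chirp with a deliberate rate offset (the analog stage), then strip the known residual chirp after digitization (the digital stage). All rates and durations are illustrative, not values from the patent.

```python
import numpy as np

fs, T = 2e9, 10e-6                      # ADC rate and pulse length (assumed)
t = np.arange(int(fs * T)) / fs
k_rx, k_ref = 1.00e12, 0.98e12          # received vs. reference chirp rates, Hz/s

rx = np.exp(1j * np.pi * k_rx * t**2)   # idealized zero-delay received chirp

# "Analog" stage: mixing against a reference with a deliberate rate offset
# leaves a residual chirp of rate (k_rx - k_ref) in the IF signal.
if_sig = rx * np.exp(-1j * np.pi * k_ref * t**2)

# "Digital" stage: remove the known residual chirp after digitization,
# then FFT to form the range profile.
deramped = if_sig * np.exp(-1j * np.pi * (k_rx - k_ref) * t**2)
profile = np.abs(np.fft.fftshift(np.fft.fft(deramped)))
print(int(np.argmax(profile)))          # energy collapses to a single bin
```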
The application of digital signal processing techniques to a teleoperator radar system
NASA Technical Reports Server (NTRS)
Pujol, A.
1982-01-01
A digital signal processing system was studied for the determination of the spectral frequency distribution of echo signals from a teleoperator radar system. The system consisted of a sample and hold circuit, an analog to digital converter, a digital filter, and a Fast Fourier Transform. The system is interfaced to a 16 bit microprocessor. The microprocessor is programmed to control the complete digital signal processing. The digital filtering and Fast Fourier Transform functions are implemented by a S2815 digital filter/utility peripheral chip and a S2814A Fast Fourier Transform chip. The S2815 initially simulates a low-pass Butterworth filter, with later expansion to complete filter-circuit synthesis (bandpass and highpass).
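The sketch below reproduces the same processing chain in software, with scipy standing in for the S2815 (Butterworth low-pass) and numpy's FFT standing in for the S2814A; the sampling rate, cutoff, and mock echo are assumptions for illustration.

```python
import numpy as np
from scipy.signal import butter, lfilter

fs = 10_000                                   # assumed sampling rate, Hz
t = np.arange(0, 0.1, 1 / fs)
echo = np.sin(2 * np.pi * 400 * t) + 0.5 * np.random.randn(t.size)  # mock echo

# Butterworth low-pass stage (software stand-in for the S2815)...
b, a = butter(4, 1_000, btype="low", fs=fs)
filtered = lfilter(b, a, echo)

# ...followed by the FFT stage (stand-in for the S2814A).
spectrum = np.abs(np.fft.rfft(filtered))
freqs = np.fft.rfftfreq(filtered.size, 1 / fs)
print(freqs[np.argmax(spectrum)])             # peak near the 400 Hz echo line
```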
The place-value of a digit in multi-digit numbers is processed automatically.
Kallai, Arava Y; Tzelgov, Joseph
2012-09-01
The automatic processing of the place-value of digits in a multi-digit number was investigated in 4 experiments. Experiment 1 and two control experiments employed a numerical comparison task in which the place-value of a non-zero digit was varied in a string composed of zeros. Experiment 2 employed a physical comparison task in which strings of digits varied in their physical sizes. In both types of tasks, the place-value of the non-zero digit in the string was irrelevant to the task performed. Interference of the place-value information was found in both tasks. When the non-zero digit occupied a lower place-value, it was recognized more slowly as a larger digit or as written in a larger font size. We concluded that place-value in a multi-digit number is processed automatically. These results support the notion of a decomposed representation of multi-digit numbers in memory. PsycINFO Database Record (c) 2012 APA, all rights reserved.
The Need for (Digital) Story: First Graders Using Digital Tools to Tell Stories
ERIC Educational Resources Information Center
Solomon, Marva Jeanine
2010-01-01
The purpose of this study was to explore the process and product of African American first graders as they participated in digital storytelling. Of interest was the role digital tools played in the creation process. Eight participants took part in 18 study sessions during which they composed, recorded, and then shared their digital texts with…
Rossini, Gabriele; Parrini, Simone; Castroflorio, Tommaso; Deregibus, Andrea; Debernardi, Cesare L
2016-02-01
Our objective was to assess the accuracy, validity, and reliability of measurements obtained from virtual dental study models compared with those obtained from plaster models. PubMed, PubMed Central, National Library of Medicine Medline, Embase, Cochrane Central Register of Controlled Clinical trials, Web of Knowledge, Scopus, Google Scholar, and LILACs were searched from January 2000 to November 2014. A grading system described by the Swedish Council on Technology Assessment in Health Care and the Cochrane tool for risk of bias assessment were used to rate the methodologic quality of the articles. Thirty-five relevant articles were selected. The methodologic quality was high. No significant differences were observed for most of the studies in all the measured parameters, with the exception of the American Board of Orthodontics Objective Grading System. Digital models are as reliable as traditional plaster models, with high accuracy, reliability, and reproducibility. Landmark identification, rather than the measuring device or the software, appears to be the greatest limitation. Furthermore, with their advantages in terms of cost, time, and space required, digital models could be considered the new gold standard in current practice. Copyright © 2016 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.
DiNardo, Thomas P.; Jackson, R. Alan
1984-01-01
An analysis of land use change for an area in Boulder County, Colorado, was conducted using digital cartographic data. The authors selected data in the Geographic Information Retrieval and Analysis System (GIRAS) format which is digitized from the 1:250,000-scale land use and land cover map series. The Map Overlay and Statistical System (MOSS) was used as an analytical tool for the study. The authors describe the methodology used in converting the GIRAS file into a MOSS format and the activities associated with the conversion.
Characterization of the faulted behavior of digital computers and fault tolerant systems
NASA Technical Reports Server (NTRS)
Bavuso, Salvatore J.; Miner, Paul S.
1989-01-01
A development status evaluation is presented for efforts conducted at NASA-Langley since 1977, toward the characterization of the latent fault in digital fault-tolerant systems. Attention is given to the practical, high speed, generalized gate-level logic system simulator developed, as well as to the validation methodology used for the simulator, on the basis of faultable software and hardware simulations employing a prototype MIL-STD-1750A processor. After validation, latency tests will be performed.
High-density digital recording
NASA Technical Reports Server (NTRS)
Kalil, F. (Editor); Buschman, A. (Editor)
1985-01-01
The problems associated with high-density digital recording (HDDR) are discussed. The problems, solutions, and insights of five independent users of HDDR systems are provided as guidance for other users. Various pulse code modulation coding techniques are reviewed. An introduction to error detection and correction, head optimization theory, and perpendicular recording is provided. Competitive tape recorder manufacturers apply all of the above theories and techniques and present their offerings. The methodology used by the HDDR Users Subcommittee of THIC to evaluate parallel HDDR systems is presented.
NASA Astrophysics Data System (ADS)
Garagnani, S.; Manferdini, A. M.
2013-02-01
Since their introduction, modeling tools aimed at architectural design have evolved into today's "digital multi-purpose drawing boards" based on enhanced parametric elements able to originate whole buildings within virtual environments. Semantic splitting and element topology are features that allow objects to be "intelligent" (i.e., self-aware of what kind of element they are and with whom they can interact), representing in this way the basics of Building Information Modeling (BIM), a coordinated, consistent and always up-to-date workflow intended to reach higher quality, reliability and cost reductions across the design process. Even if BIM was originally intended for new architectures, its capacity to store inter-related semantic information can be successfully applied to existing buildings as well, especially if they deserve particular care, such as Cultural Heritage sites. BIM engines can easily manage simple parametric geometries, collapsing them to standard primitives connected through hierarchical relationships; however, when components are generated from existing morphologies, for example by acquiring point clouds with digital photogrammetry or laser scanning equipment, complex abstractions have to be introduced while remodeling elements by hand, since automatic feature extraction in available software is still not effective. In order to introduce a methodology destined to process point cloud data in a BIM environment with high accuracy, this paper describes some experiences on monumental sites documentation, generated through a plug-in written for Autodesk Revit and codenamed GreenSpider after its capability to lay out points in space as if they were nodes of an ideal cobweb.
Sund, T; Olsen, J B
2006-09-01
To investigate whether sliding window adaptive histogram equalization (SWAHE) of digital mammograms improves the detection of simulated calcifications, as compared to images normalized by global histogram equalization (GHE). Direct digital mammograms were obtained from mammary tissue phantoms superimposed with different frames. Each frame was divided into forty squares by a wire mesh, and contained granular calcifications randomly positioned in about 50% of the squares. Three radiologists read the mammograms on a display monitor. They classified their confidence in the presence of microcalcifications in each square on a scale of 1 to 5. Images processed with GHE were first read and used as a reference. In a later session, the same images processed with SWAHE were read. The results were compared using ROC methodology. When the total areas Az were compared, the results were completely equivocal. When comparing the high-specificity partial ROC area Az,0.2 below false-positive fraction (FPF) 0.20, two of the three observers performed best with the images processed with SWAHE. The difference was not statistically significant. When the reader's confidence threshold in malignancy is set at a high level, increasing the contrast of mammograms with SWAHE may enhance the visibility of microcalcifications without adversely affecting the false-positive rate. When the reader's confidence threshold is set at a low level, the effect of SWAHE is an increase of false positives. Further investigation is needed to confirm the validity of the conclusions.
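For readers who want to experiment, the sketch below contrasts global equalization with a tiled adaptive equalization using scikit-image; note that equalize_adapthist implements CLAHE, which is used here only as a readily available stand-in for SWAHE, with placeholder tile size and clip limit.

```python
from skimage import exposure

def enhance(mammogram):
    """Global histogram equalization (the GHE reference) versus tiled
    adaptive equalization. skimage's equalize_adapthist is CLAHE, used
    here as an accessible relative of SWAHE."""
    ghe = exposure.equalize_hist(mammogram)
    adaptive = exposure.equalize_adapthist(mammogram, kernel_size=64,
                                           clip_limit=0.02)
    return ghe, adaptive
```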
NASA Astrophysics Data System (ADS)
Heathfield, D.; Walker, I. J.; Grilliot, M. J.
2016-12-01
The recent emergence of terrestrial laser scanning (TLS) and unmanned aerial systems (UAS) as mapping platforms in geomorphology research has allowed for expedited acquisition of high spatial and temporal resolution, three-dimensional topographic datasets. TLS provides dense 3D `point cloud' datasets that require careful acquisition strategies and appreciable post-processing to produce accurate digital elevation models (DEMs). UAS provide overlapping nadir and oblique imagery that can be analysed using Structure from Motion (SfM) photogrammetry software to provide accurate, high-resolution orthophoto mosaics and accurate digital surface models (DSMs). Both methods yield centimeter to decimeter scale accuracy, depending on various hardware and field acquisition considerations (e.g., camera resolution, flight height, on-site GNSS control, etc.). Combined, the UAS-SfM workflow provides a comparable and more affordable solution to the more expensive TLS or aerial LiDAR methods. This paper compares and contrasts SfM and TLS survey methodologies and related workflow costs and benefits as used to quantify and examine seasonal beach-dune erosion and recovery processes at a site (Calvert Island) on British Columbia's central coast in western Canada. Seasonal SfM- and TLS-derived DEMs were used to quantify spatial patterns of surface elevation change, geomorphic responses, and related significant sediment volume changes. Cluster maps of positive (depositional) and negative (erosional) change are analysed to detect and interpret the geomorphic and sediment budget responses following an erosive water level event during winter 2016 season (Oct. 2015 - Apr. 2016). Vantage cameras also provided qualitative data on the frequency and magnitude of environmental drivers (e.g., tide, wave, wind forcing) of erosion and deposition events during the observation period. In addition, we evaluate the costs, time expenditures, and accuracy considerations for both SfM and TLS methodologies.
Real time flight simulation methodology
NASA Technical Reports Server (NTRS)
Parrish, E. A.; Cook, G.; Mcvey, E. S.
1976-01-01
An example sensitivity study is presented to demonstrate how a digital autopilot designer could make a decision on the minimum sampling rate for computer specification. It consists of comparing the simulated step response of an existing analog autopilot and its associated aircraft dynamics to the digital version operating at various sampling frequencies, and specifying a sampling frequency that results in an acceptable change in relative stability. In general, the zero-order hold introduces phase lag which will increase overshoot and settling time. It should be noted that this solution is for substituting a digital autopilot for a continuous autopilot. A complete redesign could yield results which more closely resemble the continuous case or which conform better to the original design goals.
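A quick way to quantify the quoted phase-lag effect is to treat the zero-order hold as an effective T/2 delay and charge it against the loop's phase margin; the crossover frequency and analog phase margin below are assumed values, not figures from the study.

```python
import numpy as np

w_c = 8.0          # loop crossover frequency, rad/s (assumed)
pm_analog = 55.0   # phase margin of the analog design, degrees (assumed)

for fs in (5.0, 10.0, 20.0, 50.0):             # candidate sampling rates, Hz
    lag = np.degrees(w_c * (1.0 / fs) / 2.0)   # ZOH ~ e^{-sT/2} at crossover
    print(f"fs={fs:5.1f} Hz  ZOH lag={lag:5.2f} deg  "
          f"PM ~ {pm_analog - lag:5.2f} deg")
```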
Self-tuning control of attitude and momentum management for the Space Station
NASA Technical Reports Server (NTRS)
Shieh, L. S.; Sunkel, J. W.; Yuan, Z. Z.; Zhao, X. M.
1992-01-01
This paper presents a hybrid state-space self-tuning design methodology using dual-rate sampling for suboptimal digital adaptive control of attitude and momentum management for the Space Station. This new hybrid adaptive control scheme combines an on-line recursive estimation algorithm for indirectly identifying the parameters of a continuous-time system from the available fast-rate sampled data of the inputs and states and a controller synthesis algorithm for indirectly finding the slow-rate suboptimal digital controller from the designed optimal analog controller. The proposed method enables the development of digitally implementable control algorithms for the robust control of Space Station Freedom with unknown environmental disturbances and slowly time-varying dynamics.
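The on-line estimation component can be illustrated with a generic recursive least squares update; this is the standard textbook form with a forgetting factor, not the paper's dual-rate hybrid formulation.

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=0.99):
    """One recursive-least-squares step with forgetting factor lam:
    theta - parameter estimate, P - covariance, phi - regressor,
    y - new measurement. Returns the updated (theta, P)."""
    K = P @ phi / (lam + phi @ P @ phi)      # gain vector
    theta = theta + K * (y - phi @ theta)    # correct by prediction error
    P = (P - np.outer(K, phi @ P)) / lam     # covariance update
    return theta, P

# Toy identification of y[k] = a*y[k-1] + b*u[k-1] from streaming data.
rng = np.random.default_rng(0)
theta, P = np.zeros(2), 1e3 * np.eye(2)
y_prev, u_prev = 0.0, 1.0
for k in range(200):
    y_k = 0.8 * y_prev + 0.3 * u_prev + 0.01 * rng.standard_normal()
    theta, P = rls_update(theta, P, np.array([y_prev, u_prev]), y_k)
    y_prev, u_prev = y_k, np.sin(0.1 * k)    # persistently exciting input
print(theta)                                 # approaches [0.8, 0.3]
```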
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamada, Yuki; Grippo, Mark A.
2015-01-01
A monitoring plan that incorporates regional datasets and integrates cost-effective data collection methods is necessary to sustain the long-term environmental monitoring of utility-scale solar energy development in expansive, environmentally sensitive desert environments. Using very high spatial resolution (VHSR; 15 cm) multispectral imagery collected in November 2012 and January 2014, an image processing routine was developed to characterize ephemeral streams, vegetation, and land surface in the southwestern United States where increased utility-scale solar development is anticipated. In addition to knowledge about desert landscapes, the methodology integrates existing spectral indices and transformations (e.g., the visible atmospherically resistant index and principal components); a newly developed index, the erosion resistance index (ERI); and digital terrain and surface models, all of which were derived from a common VHSR image. The methodology identified fine-scale ephemeral streams with greater detail than the National Hydrography Dataset and accurately estimated vegetation distribution and fractional cover of various surface types. The ERI classified surface types that have a range of erosive potentials. The remote-sensing methodology could ultimately reduce uncertainty and monitoring costs for all stakeholders by providing a cost-effective monitoring approach that accurately characterizes the land resources at potential development sites.
NASA Astrophysics Data System (ADS)
Ryan, Jonathan C.; Hubbard, Alun; Box, Jason E.; Brough, Stephen; Cameron, Karen; Cook, Joseph M.; Cooper, Matthew; Doyle, Samuel H.; Edwards, Arwyn; Holt, Tom; Irvine-Fynn, Tristram; Jones, Christine; Pitcher, Lincoln H.; Rennermalm, Asa K.; Smith, Laurence C.; Stibal, Marek; Snooke, Neal
2017-05-01
Measurements of albedo are a prerequisite for modelling surface melt across the Earth's cryosphere, yet available satellite products are limited in spatial and/or temporal resolution. Here, we present a practical methodology to obtain centimetre resolution albedo products with accuracies of 5% using consumer-grade digital camera and unmanned aerial vehicle (UAV) technologies. Our method comprises a workflow for processing, correcting and calibrating raw digital images using a white reference target, and upward and downward shortwave radiation measurements from broadband silicon pyranometers. We demonstrate the method with a set of UAV sorties over the western, K-sector of the Greenland Ice Sheet. The resulting albedo product, UAV10A1, covers 280 km2, at a resolution of 20 cm per pixel and has a root-mean-square difference of 3.7% compared to MOD10A1 and 4.9% compared to ground-based broadband pyranometer measurements. By continuously measuring downward solar irradiance, the technique overcomes previous limitations due to variable illumination conditions during and between surveys over glaciated terrain. The current miniaturization of multispectral sensors and incorporation of upward facing radiation sensors on UAV packages means that this technique will likely become increasingly attractive in field studies and used in a wide range of applications for high temporal and spatial resolution surface mapping of debris, dust, cryoconite and bioalbedo and for directly constraining surface energy balance models.
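A simplified version of the radiometric step might look like the following, assuming a linear sensor response, a white reference of known reflectance, and pyranometer readings taken at calibration and survey time; the functional form is our assumption, not the paper's published workflow.

```python
import numpy as np

def pixel_albedo(image_dn, white_dn, rho_ref, e_down_survey, e_down_cal):
    """Convert raw digital numbers to reflectance via a white reference of
    known reflectance rho_ref, then correct for the change in downward
    irradiance between the calibration shot and the survey frame.
    Assumes a linear sensor response (illustrative assumption)."""
    reflectance = rho_ref * image_dn.astype(float) / float(white_dn)
    return reflectance * (e_down_cal / e_down_survey)

# Toy usage: an 8-bit frame, a 95%-reflectance white target reading 200 DN,
# and pyranometer values (W/m^2) at calibration and survey time.
frame = np.random.randint(0, 256, size=(100, 100))
albedo = pixel_albedo(frame, white_dn=200, rho_ref=0.95,
                      e_down_survey=620.0, e_down_cal=600.0)
```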
Delakis, Ioannis; Wise, Robert; Morris, Lauren; Kulama, Eugenia
2015-11-01
The purpose of this work was to evaluate the contrast-detail performance of full field digital mammography (FFDM) systems using ideal (Hotelling) observer Signal-to-Noise Ratio (SNR) methodology and to ascertain whether it can be considered an alternative to the conventional, automated analysis of CDMAM phantom images. Five FFDM units currently used in the national breast screening programme were evaluated, which differed with respect to age, detector, Automatic Exposure Control (AEC) and target/filter combination. Contrast-detail performance was analysed using CDMAM and ideal observer SNR methodology. The ideal observer SNR was calculated for input signals originating from gold discs of varying thicknesses and diameters, and then used to estimate the threshold gold thickness for each diameter as per CDMAM analysis. The variability of both methods and the dependence of CDMAM analysis on phantom manufacturing discrepancies were also investigated. Results from both CDMAM and ideal observer methodologies were informative differentiators of FFDM systems' contrast-detail performance, displaying comparable patterns with respect to the FFDM systems' type and age. CDMAM results suggested higher threshold gold thickness values compared with the ideal observer methodology, especially for small-diameter details, which can be attributed to the behaviour of the CDMAM phantom used in this study. In addition, ideal observer methodology results showed lower variability than CDMAM results. The ideal observer SNR methodology can provide a useful metric of the FFDM systems' contrast-detail characteristics and could be considered a surrogate for conventional, automated analysis of CDMAM images. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Measuring systems of hard to get objects: problems with analysis of measurement results
NASA Astrophysics Data System (ADS)
Gilewska, Grazyna
2005-02-01
The problem of limited access to the metrological parameters of objects appears in many measurements, especially for biological objects, whose parameters are very often determined by indirect investigation. A random component predominates in the formation of measurement results when access to the measured object is very limited. Every measuring process is subject to conditions that limit how it can be improved (e.g., increasing the number of measurement repetitions to decrease the random limiting error): these may be time or financial constraints or, in the case of biological objects, a small sample volume, the influence of the measuring tool and observers on the object, or fatigue effects, e.g., in a patient. Taking these difficulties into consideration, the author worked out, and checked in practical application, methods for the reduction of outlying observations and novel methods for eliminating measured data with excess variance, in order to decrease the standard deviation of the mean with a limited amount of data at an accepted confidence level. The elaborated methods were verified on measurements of knee-joint space width obtained from radiographs. Measurements were carried out indirectly on digital images of the radiographs. The results confirmed the validity of the elaborated methodology and measurement procedures. Such a methodology is especially important when standard approaches do not bring the expected effects.
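One generic member of this family of data-reduction methods is iterative k-sigma rejection of outlying observations, sketched below; the rule and the toy knee-joint data are illustrative stand-ins for the author's specific procedures.

```python
import numpy as np

def trim_outliers(x, k=2.0, max_iter=10):
    """Iterative k-sigma rejection: drop observations more than k sample
    standard deviations from the mean until none remain or max_iter hits."""
    x = np.asarray(x, dtype=float)
    for _ in range(max_iter):
        m, s = x.mean(), x.std(ddof=1)
        keep = np.abs(x - m) <= k * s
        if keep.all():
            break
        x = x[keep]
    return x

widths = np.array([4.1, 4.3, 4.2, 4.0, 5.9, 4.2, 4.1])  # toy joint widths, mm
print(trim_outliers(widths).mean())   # 5.9 is rejected; mean drops to ~4.15
```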
Fully printable, strain-engineered electronic wrap for customizable soft electronics.
Byun, Junghwan; Lee, Byeongmoon; Oh, Eunho; Kim, Hyunjong; Kim, Sangwoo; Lee, Seunghwan; Hong, Yongtaek
2017-03-24
Rapid growth of stretchable electronics stimulates broad uses in multidisciplinary fields as well as industrial applications. However, existing technologies are unsuitable for implementing versatile applications involving adaptable system design and functions in a cost/time-effective way because of vacuum-conditioned, lithographically-predefined processes. Here, we present a methodology for a fully printable, strain-engineered electronic wrap as a universal strategy which makes it more feasible to implement various stretchable electronic systems with customizable layouts and functions. The key aspects involve inkjet-printed rigid island (PRI)-based stretchable platform technology and corresponding printing-based automated electronic functionalization methodology, the combination of which provides fully printed, customized layouts of stretchable electronic systems with simplified process. Specifically, well-controlled contact line pinning effect of printed polymer solution enables the formation of PRIs with tunable thickness; and surface strain analysis on those PRIs leads to the optimized stability and device-to-island fill factor of strain-engineered electronic wraps. Moreover, core techniques of image-based automated pinpointing, surface-mountable device based electronic functionalizing, and one-step interconnection networking of PRIs enable customized circuit design and adaptable functionalities. To exhibit the universality of our approach, multiple types of practical applications ranging from self-computable digital logics to display and sensor system are demonstrated on skin in a customized form.
A decomposition approach to the design of a multiferroic memory bit
NASA Astrophysics Data System (ADS)
Acevedo, Ruben; Liang, Cheng-Yen; Carman, Gregory P.; Sepulveda, Abdon E.
2017-06-01
The objective of this paper is to present a methodology for the design of a memory bit that minimizes the energy required to write data at the bit level. When a ferromagnetic nickel nano-dot is strained by means of a piezoelectric substrate, its magnetization vector rotates between two stable states, defined as the 1 and 0 of a digital memory. The memory bit geometry, actuation mechanism and voltage control law were used as design variables. The approach was to decompose the overall design process into simpler sub-problems whose structure can be exploited for a more efficient solution. This method minimizes the number of fully dynamic coupled finite element analyses required to converge to a near optimal design, thus decreasing the computational time for the design process. An in-plane sample design problem is presented to illustrate the advantages and flexibility of the procedure.
Blurriness in Live Forensics: An Introduction
NASA Astrophysics Data System (ADS)
Savoldi, Antonio; Gubian, Paolo
The Live Forensics discipline aims at answering basic questions related to a digital crime, which usually involves a computer-based system. The investigation should be carried out with the goal of establishing which processes were running, when they were started and by whom, what specific activities those processes were performing, and the state of active network connections. Moreover, a set of tools needs to be launched on the running system, thereby altering the system’s memory as a consequence of Locard’s exchange principle [2]. All the live forensics methodologies proposed so far share a basic, albeit important, weakness: the inability to quantify the perturbation, or blurriness, of the system’s memory of the investigated computer. This is precisely the goal of this paper: to provide a set of guidelines which can be effectively used for measuring the uncertainty of the volatile memory collected from a live system under investigation.
Study of optical techniques for the Ames unitary wind tunnel: Digital image processing, part 6
NASA Technical Reports Server (NTRS)
Lee, George
1993-01-01
A survey of digital image processing techniques and processing systems for aerodynamic images has been conducted. These images covered many types of flows and were generated by many types of flow diagnostics. These include laser vapor screens, infrared cameras, laser holographic interferometry, Schlieren, and luminescent paints. Some general digital image processing systems, imaging networks, optical sensors, and image computing chips were briefly reviewed. Possible digital imaging network systems for the Ames Unitary Wind Tunnel were explored.
Digital modeling of end-mill cutting tools for FEM applications from the active cutting contour
NASA Astrophysics Data System (ADS)
Salguero, Jorge; Marcos, M.; Batista, M.; Gómez, A.; Mayuet, P.; Bienvenido, R.
2012-04-01
A technique now common in research on machining by material removal is simulation using the Finite Element Method (FEM). Nevertheless, although it is widely used in processes that admit approximation to orthogonal cutting, such as shaping, it is scarcely used in more complex processes, such as milling. This is due principally to the complex geometry of the cutting tools in these processes and to the need to carry out the studies in an oblique cutting configuration. This paper presents a methodology for the geometrical characterization of commercial end-mill cutting tools by extracting the cutting tool contour using optical metrology, and for using this geometry to model the active cutting zone with 3D CAD software. The model is easily exportable to different CAD formats, such as IGES or STEP, and importable into FEM software, where it is possible to study the in-service behavior of the tools.
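The contour-extraction step can be prototyped in a few lines with OpenCV, assuming a backlit silhouette image of the tool; the file names and Otsu thresholding are placeholders for the optical-metrology pipeline the paper describes.

```python
import cv2

# Threshold a backlit (silhouette) image of the end mill, then trace the
# outer cutting contour (placeholder file name; assumes OpenCV 4).
img = cv2.imread("endmill_silhouette.png", cv2.IMREAD_GRAYSCALE)
_, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_NONE)
profile = max(contours, key=cv2.contourArea).squeeze()  # (N, 2) pixel path

# Export as an x,y polyline that a CAD package could import and sweep.
with open("cutting_contour.csv", "w") as f:
    for x, y in profile:
        f.write(f"{x},{y}\n")
```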
Towards Knowledge Management for Smart Manufacturing.
Feng, Shaw C; Bernstein, William Z; Hedberg, Thomas; Feeney, Allison Barnard
2017-09-01
The need for capturing knowledge in the digital form in design, process planning, production, and inspection has increasingly become an issue in manufacturing industries as the variety and complexity of product lifecycle applications increase. Both knowledge and data need to be well managed for quality assurance, lifecycle-impact assessment, and design improvement. Some technical barriers exist today that inhibit industry from fully utilizing design, planning, processing, and inspection knowledge. The primary barrier is a lack of a well-accepted mechanism that enables users to integrate data and knowledge. This paper prescribes knowledge management to address a lack of mechanisms for integrating, sharing, and updating domain-specific knowledge in smart manufacturing. Aspects of the knowledge constructs include conceptual design, detailed design, process planning, material property, production, and inspection. The main contribution of this paper is to provide a methodology on what knowledge manufacturing organizations access, update, and archive in the context of smart manufacturing. The case study in this paper provides some example knowledge objects to enable smart manufacturing.
Interactive, Online, Adsorption Lab to Support Discovery of the Scientific Process
NASA Astrophysics Data System (ADS)
Carroll, K. C.; Ulery, A. L.; Chamberlin, B.; Dettmer, A.
2014-12-01
Science students require more than methods practice in lab activities; they must gain an understanding of the application of the scientific process through lab work. Large classes, time constraints, and funding may limit student access to science labs, denying students the types of experiential learning needed to motivate and develop new scientists. Interactive, discovery-based computer simulations and virtual labs provide an alternative, low-risk opportunity for learners to engage in lab processes and activities. Students can conduct experiments, collect data, draw conclusions, and even abort a session. We have developed an online virtual lab through which students can interactively develop as scientists as they learn about scientific concepts, lab equipment, and proper lab techniques. Our first lab topic is adsorption of chemicals to soil, but the methodology is transferable to other topics. In addition to learning the specific procedures involved in each lab, the online activities prompt exploration and practice in key scientific and mathematical concepts, such as unit conversion, significant digits, assessing risks, evaluating bias, and assessing the quantity and quality of data. These labs are not designed to replace traditional lab instruction, but to supplement instruction on challenging or particularly time-consuming concepts. To complement classroom instruction, students can engage in a lab experience outside the lab and over a shorter time period than is often required for real-world adsorption studies. More importantly, students can reflect, discuss, review, and even fail at their lab experience as part of the process of seeing why natural processes and scientific approaches work the way they do. Our Media Productions team has completed a series of online digital labs available at virtuallabs.nmsu.edu and scienceofsoil.com, and these virtual labs are being integrated into coursework to evaluate changes in student learning.
Fundamentals of in Situ Digital Camera Methodology for Water Quality Monitoring of Coast and Ocean
Goddijn-Murphy, Lonneke; Dailloux, Damien; White, Martin; Bowers, Dave
2009-01-01
Conventional digital cameras, the Nikon Coolpix885® and the SeaLife ECOshot®, were used as in situ optical instruments for water quality monitoring. Measured response spectra showed that these digital cameras are basically three-band radiometers. The response values in the red, green and blue bands, quantified by the RGB values of digital images of the water surface, were comparable to measurements of irradiance levels at red, green and cyan/blue wavelengths of water-leaving light. Different systems were deployed to capture upwelling light from below the surface while eliminating direct surface reflection. Relationships between RGB ratios of water-surface images and water quality parameters were found to be consistent with previous measurements using more traditional narrow-band radiometers. This paper focuses on the method used to acquire digital images, derive RGB values, and relate the measurements to water quality parameters. Field measurements were obtained in Galway Bay, Ireland, and in the Southern Rockall Trough in the North Atlantic, where both yellow substance and chlorophyll concentrations were successfully assessed using the digital camera method. PMID:22346729
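As a rough illustration of the camera-as-radiometer idea above (not the authors' code), the sketch below averages the R, G, B values over a patch of water-surface image and maps a band ratio to a water-quality parameter through a linear calibration; the function names and the coefficients a and b are placeholders that would come from regression against in-situ samples.

```python
import numpy as np

def band_means(image):
    """Mean R, G, B values of a water-surface image region (H x W x 3 array)."""
    return image.reshape(-1, 3).mean(axis=0)

def estimate_parameter(image, a=1.0, b=0.0):
    """Map a green/blue band ratio to a water-quality parameter via a linear
    calibration y = a * (G/B) + b; a and b are illustrative placeholders."""
    r, g, blue = band_means(image)
    return a * (g / blue) + b
```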
NASA Technical Reports Server (NTRS)
Patterson, G.
1973-01-01
The data processing procedures and the computer programs were developed to predict structural responses using the Impulse Transfer Function (ITF) method. There are three major steps in the process: (1) analog-to-digital (A-D) conversion of the test data to produce Phase I digital tapes, (2) processing of the Phase I digital tapes to extract ITFs and store them in a permanent data bank, and (3) prediction of structural responses to a set of applied loads. The analog-to-digital conversion is performed by a standard package, which is described later in terms of the contents of the resulting Phase I digital tape. Two separate computer programs have been developed to perform the digital processing.
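A minimal sketch of step (3), assuming the ITFs are stored in the data bank as discrete impulse-response sequences sampled at the same rate as the load histories: the predicted response at a location is the superposition of each applied load convolved with its ITF. This illustrates the superposition idea only, not the original programs.

```python
import numpy as np

def predict_response(itfs, loads):
    """Superpose the response at one location: each applied load time history
    is convolved with its impulse transfer function and the results summed.
    itfs and loads are lists of equal length containing 1-D arrays."""
    n = max(len(h) + len(f) - 1 for h, f in zip(itfs, loads))
    response = np.zeros(n)
    for h, f in zip(itfs, loads):
        y = np.convolve(h, f)
        response[:len(y)] += y
    return response
```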
Placement-aware decomposition of a digital standard cells library for double patterning lithography
NASA Astrophysics Data System (ADS)
Wassal, Amr G.; Sharaf, Heba; Hammouda, Sherif
2012-11-01
To continue scaling the circuit features down, Double Patterning (DP) technology is needed in 22nm technologies and lower. DP requires decomposing the layout features into two masks for pitch relaxation, such that the spacing between any two features on each mask is greater than the minimum allowed mask spacing. The relaxed pitches of each mask are then processed on two separate exposure steps. In many cases, post-layout decomposition fails to decompose the layout into two masks due to the presence of conflicts. Post-layout decomposition of a standard cells block can result in native conflicts inside the cells (internal conflict), or native conflicts on the boundary between two cells (boundary conflict). Resolving native conflicts requires a redesign and/or multiple iterations for the placement and routing phases to get a clean decomposition. Therefore, DP compliance must be considered in earlier phases, before getting the final placed cell block. The main focus of this paper is generating a library of decomposed standard cells to be used in a DP-aware placer. This library should contain all possible decompositions for each standard cell, i.e., these decompositions consider all possible combinations of boundary conditions. However, the large number of combinations of boundary conditions for each standard cell will significantly increase the processing time and effort required to obtain all possible decompositions. Therefore, an efficient methodology is required to reduce this large number of combinations. In this paper, three different reduction methodologies are proposed to reduce the number of different combinations processed to get the decomposed library. Experimental results show a significant reduction in the number of combinations and decompositions needed for the library processing. To generate and verify the proposed flow and methodologies, a prototype for a placement-aware DP-ready cell-library is developed with an optimized number of cell views.
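The core feasibility question above — can a cell's features be split into two masks so that no two same-mask features violate the minimum spacing — amounts to two-coloring a conflict graph, where an odd cycle signals a native conflict. A minimal sketch of that check (ours, not the paper's tool):

```python
from collections import deque

def decompose_two_masks(features, conflicts):
    """Two-color a conflict graph: nodes are layout features, edges connect
    features closer than the minimum same-mask spacing. Returns a feature ->
    mask (0/1) map, or None if an odd cycle makes the decomposition
    unresolvable (a 'native conflict')."""
    adj = {f: [] for f in features}
    for a, b in conflicts:
        adj[a].append(b)
        adj[b].append(a)
    mask = {}
    for start in features:
        if start in mask:
            continue
        mask[start] = 0
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in mask:
                    mask[v] = 1 - mask[u]
                    queue.append(v)
                elif mask[v] == mask[u]:
                    return None          # odd cycle: native conflict
    return mask
```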
Digital image processing for photo-reconnaissance applications
NASA Technical Reports Server (NTRS)
Billingsley, F. C.
1972-01-01
Digital image-processing techniques developed for processing pictures from NASA space vehicles are analyzed in terms of enhancement, quantitative restoration, and information extraction. Digital filtering and the action of a high-frequency filter in the real and Fourier domains are discussed, along with color and brightness.
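As an illustration of the Fourier-domain side of that discussion (a generic sketch, not the original NASA code), the following removes low spatial frequencies so only edges and fine detail remain:

```python
import numpy as np

def highpass(image, cutoff):
    """Zero out spatial frequencies below `cutoff` (cycles per image) in the
    Fourier domain and transform back -- the frequency-domain counterpart of
    an edge-enhancing high-frequency filter."""
    F = np.fft.fftshift(np.fft.fft2(image))
    h, w = image.shape
    y, x = np.ogrid[:h, :w]
    radius = np.hypot(y - h / 2, x - w / 2)
    F[radius < cutoff] = 0
    return np.real(np.fft.ifft2(np.fft.ifftshift(F)))
```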
Stewardship of very large digital data archives
NASA Technical Reports Server (NTRS)
Savage, Patric
1991-01-01
An archive is a permanent store. There are relatively few very large digital data archives in existence. Most business records are expired within five or ten years. Many kinds of business records that do have long lives are embedded in databases that are continually updated and re-issued cyclically. Also, many permanent business records are actually archived as microfilm, fiche, or optical disk images - their digital version being an operational convenience rather than an archive. The problems foreseen in stewarding the very large digital data archives that will accumulate during the mission of the Earth Observing System (EOS) are addressed, focusing on the function of shepherding archived digital data into an endless future. Stewardship entails storing and protecting the archive and providing meaningful service to the community of users. The steward will (1) provide against loss due to physical phenomena; (2) assure that data is not lost due to storage technology obsolescence; and (3) maintain data in a current formatting methodology.
To zoom or not to zoom: do we have enough pixels?
NASA Astrophysics Data System (ADS)
Youngworth, Richard N.; Herman, Eric
2015-09-01
Common lexicon in imaging systems includes the frequently used term digital zoom. Of course this term is somewhat of a misnomer, as there is no actual zooming in such systems. Instead, digital zoom describes the zoom effect that comes from rewriting or reprinting an image - perhaps more accurately described as cropping and enlarging an image (a pixel remapping) for viewing. If done properly, users of the overall hybrid digital-optical system do not know which methodology was employed. Hence the essential question, pondered and manipulated since the advent of mature digital image science, really becomes "do we have enough pixels to avoid optical zoom?" This paper discusses known imaging factors for hybrid digital-optical systems, most notably resolution considerations. The paper is fundamentally about communication, and thereby includes information useful to the greater consumer, technical, and business community, all of whom have an interest in understanding the key technical details that have driven the amazing technology and development of zoom systems.
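The cropping-and-enlarging pixel remap that the paper calls digital zoom can be sketched in a few lines (an illustration of the concept, not any camera's actual pipeline). Nearest-neighbour resampling is used for brevity; real pipelines use bilinear or bicubic kernels, but the information limit is the same: no detail beyond the sensor's sampling is recovered.

```python
import numpy as np

def digital_zoom(image, factor):
    """'Digital zoom': crop the central 1/factor of the frame and remap
    (enlarge) it back to the original pixel count with nearest-neighbour
    resampling. Only existing pixels are re-used."""
    h, w = image.shape[:2]
    ch, cw = int(h / factor), int(w / factor)
    y0, x0 = (h - ch) // 2, (w - cw) // 2
    crop = image[y0:y0 + ch, x0:x0 + cw]
    rows = (np.arange(h) * ch / h).astype(int)
    cols = (np.arange(w) * cw / w).astype(int)
    return crop[rows][:, cols]
```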
Italian University Students and Digital Technologies: Some Results from a Field Research
NASA Astrophysics Data System (ADS)
Ferri, Paolo; Cavalli, Nicola; Costa, Elisabetta; Mangiatordi, Andrea; Mizzella, Stefano; Pozzali, Andrea; Scenini, Francesca
Developments in information and communication technologies have raised the issue of how an intergenerational digital divide can take place between "digital natives" and "digital immigrants". This can in turn have important consequences for the organization of educational systems. In this paper we present the results of a study performed during 2008 on how university students in Italy make use of digital technologies. The methodology was based on a mix of quantitative and qualitative approaches. A survey was conducted on a sample of 1186 students of the University of Milan-Bicocca, based on a questionnaire administered through the University's intranet. A series of focus groups and in-depth interviews with students, parents, and new media experts was also performed. The results are consistent with the presence of a strong intergenerational divide. The implications of the results for the future organization of educational systems are discussed in the paper.
Quality assurance and quality control in mammography: a review of available guidance worldwide.
Reis, Cláudia; Pascoal, Ana; Sakellaris, Taxiarchis; Koutalonis, Manthos
2013-10-01
This review surveys available guidance for quality assurance (QA) in mammography and discusses its contribution to harmonising practices worldwide. A literature search was performed on different sources to identify guidance documents for QA in mammography available worldwide from international bodies, healthcare providers, and professional/scientific associations. The guidance documents identified were reviewed, and a selection was compared for type of guidance (clinical/technical), technology, and proposed QA methodologies, focusing on dose and image quality (IQ) performance assessment. Fourteen protocols (targeted at conventional and digital mammography) were reviewed. All included recommendations for testing the acquisition, processing and display systems associated with mammographic equipment. All guidance reviewed highlighted the importance of dose assessment and of testing the Automatic Exposure Control (AEC) system. Recommended tests for assessment of IQ showed variations in the proposed methodologies. Recommended testing focused on assessment of low-contrast detection, spatial resolution and noise. QC of image display is recommended following the American Association of Physicists in Medicine guidelines. The existing QA guidance for mammography is derived from key documents (American College of Radiology and European Union guidelines) and proposes similar tests despite variations in detail and methodology. Studies reporting QA data should provide detail on experimental technique to allow robust data comparison. Countries aiming to implement a mammography QA program may select/prioritise the tests depending on available technology and resources. • An effective QA program should be practical to implement in a clinical setting. • QA should address the various stages of the imaging chain: acquisition, processing and display. • AEC system QC testing is simple to implement and provides information on equipment performance.
NASA Technical Reports Server (NTRS)
Padula, Santo, II
2009-01-01
The ability to sufficiently measure orbiter window defects to allow for window recertification has been an ongoing challenge for the orbiter vehicle program. The recent Columbia accident has forced even tighter constraints on the criteria that must be met in order to recertify windows for flight. As a result, new techniques are being investigated to improve the reliability, accuracy, and resolution of the defect detection process. The methodology devised in this work, which is based on the utilization of a vertical scanning interferometric (VSI) tool, shows great promise for meeting the ever-increasing requirements for defect detection. This methodology has the potential to resolve the true defect depth 10-100 times more finely than the currently employed micrometer-based methodology. An added benefit is that it also produces a digital elevation map of the defect, thereby providing information about the defect morphology which can be used to ascertain the type of debris that induced the damage. However, in order to successfully implement such a tool, a greater understanding of its resolution capability and measurement repeatability must be obtained. This work focused on assessing the variability of the VSI-based measurement methodology and revealed that the VSI measurement tool was more repeatable and more precise than the current micrometer-based approach, even in situations where operator variation could affect the measurement. The analysis also showed that the VSI technique was relatively insensitive to the hardware and software settings employed, making the technique extremely robust and desirable.
"Scratch"ing below the Surface: Mathematics through an Alternative Digital Lens?
ERIC Educational Resources Information Center
Calder, Nigel; Taylor, Merilyn
2010-01-01
A key element in the examination of how students process mathematics through digital technologies is considering the ways that digital pedagogical media might influence the learning process. How might students' understanding emerge through engagement in a digital-learning environment? Interactive software that has cross-curricula implications and…
An Interactive Graphics Program for Investigating Digital Signal Processing.
ERIC Educational Resources Information Center
Miller, Billy K.; And Others
1983-01-01
Describes development of an interactive computer graphics program for use in teaching digital signal processing. The program allows students to interactively configure digital systems on a monitor display and observe their system's performance by means of digital plots on the system's outputs. A sample program run is included. (JN)
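A modern equivalent of the exercise described above - configure a digital system, then observe its performance from a plot of its response - might look like the hypothetical snippet below (using scipy/matplotlib, which the 1983 program obviously did not):

```python
import numpy as np
from scipy import signal
import matplotlib.pyplot as plt

# A small FIR system a student might configure: a 5-tap moving average,
# followed by a plot of its magnitude response.
b = np.ones(5) / 5                       # 5-tap moving-average coefficients
w, H = signal.freqz(b, worN=512)         # frequency response, 0..pi rad/sample
plt.plot(w / np.pi, 20 * np.log10(np.abs(H) + 1e-12))
plt.xlabel('Normalized frequency (x pi rad/sample)')
plt.ylabel('Magnitude (dB)')
plt.title('Moving-average filter response')
plt.show()
```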
León, María Cosio; Nieto-Hipólito, Juan Ivan; Garibaldi-Beltrán, Julián; Amaya-Parra, Guillermo; Luque-Morales, Priscy; Magaña-Espinoza, Pedro; Aguilar-Velazco, José
2016-06-01
Wellness is a term often used to describe optimal health as a "dynamic balance of physical, emotional, social, spiritual, and intellectual health," while healthcare refers to the care offered to patients to improve their health. We use both terms, as well as the Business Model Canvas (BMC) methodology, to design a digital ecosystem model for healthcare and wellness called DE4HW; the model considers the economic, technological, and legal asymmetries present in e-services beyond geographical regions. The BMC methodology was embedded into the global project strategy called IBOT (Initiate, Build, Operate and Transfer), a methodology for establishing a functional, integrated national telemedicine and virtual education network, from which we adopted the rationale of its phases. The results of this work illustrate the design of the DE4HW model within the first phase of IBOT, enriched with the BMC, which enables us to define actors, their interactions, rules, and protocols in order to build DE4HW, while the IBOT strategy manages the project goal up to the transfer phase, where an integral healthcare and wellness service platform is turned over to stakeholders.
Kloefkorn, Heidi E.; Pettengill, Travis R.; Turner, Sara M. F.; Streeter, Kristi A.; Gonzalez-Rothi, Elisa J.; Fuller, David D.; Allen, Kyle D.
2016-01-01
While rodent gait analysis can quantify the behavioral consequences of disease, significant methodological differences exist between analysis platforms and little validation has been performed to understand or mitigate these sources of variance. By providing the algorithms used to quantify gait, open-source gait analysis software can be validated and used to explore methodological differences. Our group is introducing, for the first time, a fully-automated, open-source method for the characterization of rodent spatiotemporal gait patterns, termed Automated Gait Analysis Through Hues and Areas (AGATHA). This study describes how AGATHA identifies gait events, validates AGATHA relative to manual digitization methods, and utilizes AGATHA to detect gait compensations in orthopaedic and spinal cord injury models. To validate AGATHA against manual digitization, results from videos of rodent gait, recorded at 1000 frames per second (fps), were compared. To assess one common source of variance (the effects of video frame rate), these 1000 fps videos were re-sampled to mimic several lower fps and compared again. While spatial variables were indistinguishable between AGATHA and manual digitization, low video frame rates resulted in temporal errors for both methods. At frame rates over 125 fps, AGATHA achieved a comparable accuracy and precision to manual digitization for all gait variables. Moreover, AGATHA detected unique gait changes in each injury model. These data demonstrate AGATHA is an accurate and precise platform for the analysis of rodent spatiotemporal gait patterns. PMID:27554674
Processing, mosaicking and management of the Monterey Bay digital sidescan-sonar images
Chavez, P.S.; Isbrecht, J.; Galanis, P.; Gabel, G.L.; Sides, S.C.; Soltesz, D.L.; Ross, Stephanie L.; Velasco, M.G.
2002-01-01
Sidescan-sonar imaging systems with digital capabilities have been available for approximately 20 years. In this paper we present several of the digital image processing techniques developed by the U.S. Geological Survey (USGS) and used to apply intensity/radiometric and geometric corrections to, as well as enhance and digitally mosaic, sidescan-sonar images of the Monterey Bay region. New software run by a WWW server was designed and implemented to allow very large image data sets, such as the digital mosaic, to be viewed interactively, including the ability to roam throughout the digital mosaic at the web site in either compressed or full 1-m resolution. The processing is separated into two stages: preprocessing and information extraction. In the preprocessing stage, sensor-specific algorithms are applied to correct for both geometric and intensity/radiometric distortions introduced by the sensor. This is followed by digital mosaicking of the track-line strips into quadrangle format, which can be used as input to either visual or digital image analysis and interpretation. An automatic seam removal procedure was used in combination with an interactive digital feathering/stenciling procedure to help minimize tone- or seam-matching problems between image strips from adjacent track-lines. The sidescan-sonar image processing package is part of the USGS Mini Image Processing System (MIPS) and has been designed to process data collected by any 'generic' digital sidescan-sonar imaging system. The USGS MIPS software, developed over the last 20 years as a public domain package, is available on the WWW at: http://terraweb.wr.usgs.gov/trs/software.html.
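A minimal sketch of the feathering idea used to hide seams between adjacent track-line strips (a simplified stand-in for the USGS MIPS procedure, assuming two georegistered strips of equal height share an overlap of known width):

```python
import numpy as np

def feather_blend(strip_a, strip_b, overlap):
    """Join two georegistered image strips whose last/first `overlap` columns
    cover the same ground: weights ramp linearly across the overlap so the
    seam fades out instead of appearing as an abrupt tone change."""
    w = np.linspace(1.0, 0.0, overlap)               # weight for strip_a
    blend = w * strip_a[:, -overlap:] + (1 - w) * strip_b[:, :overlap]
    return np.hstack([strip_a[:, :-overlap], blend, strip_b[:, overlap:]])
```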
NASA Astrophysics Data System (ADS)
Donnay, Karsten
2015-03-01
The past several years have seen a rapidly growing interest in the use of advanced quantitative methodologies and formalisms adapted from the natural sciences to study a broad range of social phenomena. The research field of computational social science [1,2], for example, uses digital artifacts of human online activity to cast a new light on social dynamics. Similarly, the studies reviewed by D'Orsogna and Perc showcase a diverse set of advanced quantitative techniques to study the dynamics of crime. Methods used range from partial differential equations and self-exciting point processes to agent-based models, evolutionary game theory and network science [3].
NASA Technical Reports Server (NTRS)
Noll, Thomas E.
1990-01-01
The paper describes recent accomplishments and current research projects along four main thrusts in aeroservoelasticity at NASA Langley. One activity focuses on enhancing the modeling and analysis procedures to accurately predict aeroservoelastic interactions. Improvements to the minimum-state method of approximating unsteady aerodynamics are shown to provide precise low-order models for design and simulation tasks. Recent extensions in aerodynamic correction-factor methodology are also described. With respect to analysis procedures, the paper reviews novel enhancements to matched filter theory and random process theory for predicting the critical gust profile and the associated time-correlated gust loads for structural design considerations. Two research projects leading towards improved design capability are also summarized: (1) an integrated structure/control design capability and (2) procedures for obtaining low-order robust digital control laws for aeroelastic applications.
Acoustics based assessment of respiratory diseases using GMM classification.
Mayorga, P; Druzgalski, C; Morelos, R L; Gonzalez, O H; Vidales, J
2010-01-01
The focus of this paper is to present a method utilizing lung sounds for a quantitative assessment of patient health as it relates to respiratory disorders. To accomplish this, applicable traditional techniques from the speech processing domain were used to evaluate lung sounds obtained with a digital stethoscope. Traditional methods for the evaluation of asthma involve auscultation and spirometry, but the utilization of more sensitive electronic stethoscopes, which are currently available, and the application of quantitative signal analysis methods offer opportunities for improved diagnosis. In particular, we propose an acoustic evaluation methodology based on Gaussian Mixture Models (GMM), which should assist in the broader analysis, identification, and diagnosis of asthma based on frequency-domain analysis of wheezing and crackles.
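A sketch of a GMM classification scheme of this kind using scikit-learn, assuming frame-level spectral features (e.g., MFCCs) have already been extracted from the stethoscope recordings; the class labels and component count are illustrative, not the paper's configuration:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def train_gmms(features_by_class, n_components=8):
    """Fit one GMM per class (e.g. 'normal', 'wheeze', 'crackle') on
    frame-level spectral feature matrices of shape (n_frames, n_features)."""
    return {label: GaussianMixture(n_components).fit(X)
            for label, X in features_by_class.items()}

def classify(gmms, frames):
    """Assign a recording to the class whose GMM gives the highest average
    log-likelihood over the recording's feature frames."""
    return max(gmms, key=lambda label: gmms[label].score(frames))
```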
A unified approach for development of Urdu Corpus for OCR and demographic purpose
NASA Astrophysics Data System (ADS)
Choudhary, Prakash; Nain, Neeta; Ahmed, Mushtaq
2015-02-01
This paper presents a methodology for the development of an Urdu handwritten text image corpus and the application of corpus linguistics to OCR and information retrieval from handwritten documents. Compared to other language scripts, Urdu script is somewhat complicated for data entry: entering a single character requires a combination of multiple key strokes. Here, a mixed approach is proposed and demonstrated for building an Urdu corpus for OCR and demographic data collection. The demographic part of the database could be used to train a system to fetch data automatically, which would help simplify the existing manual data-processing tasks involved in data collection for input forms such as passports, ration cards, voting cards, AADHAR, driving licences, Indian Railway reservations, census data, etc. This would increase the participation of the Urdu language community in understanding and benefiting from Government schemes. To make the database available and applicable across a broad area of corpus linguistics, we propose a methodology for data collection, mark-up, digital transcription, and XML metadata information for benchmarking.
Intelligent approach to prognostic enhancements of diagnostic systems
NASA Astrophysics Data System (ADS)
Vachtsevanos, George; Wang, Peng; Khiripet, Noppadon; Thakker, Ash; Galie, Thomas R.
2001-07-01
This paper introduces a novel methodology to prognostics based on a dynamic wavelet neural network construct and notions from the virtual sensor area. This research has been motivated and supported by the U.S. Navy's active interest in integrating advanced diagnostic and prognostic algorithms in existing Naval digital control and monitoring systems. A rudimentary diagnostic platform is assumed to be available providing timely information about incipient or impending failure conditions. We focus on the development of a prognostic algorithm capable of predicting accurately and reliably the remaining useful lifetime of a failing machine or component. The prognostic module consists of a virtual sensor and a dynamic wavelet neural network as the predictor. The virtual sensor employs process data to map real measurements into difficult to monitor fault quantities. The prognosticator uses a dynamic wavelet neural network as a nonlinear predictor. Means to manage uncertainty and performance metrics are suggested for comparison purposes. An interface to an available shipboard Integrated Condition Assessment System is described and applications to shipboard equipment are discussed. Typical results from pump failures are presented to illustrate the effectiveness of the methodology.
How Digital Image Processing Became Really Easy
NASA Astrophysics Data System (ADS)
Cannon, Michael
1988-02-01
In the early and mid-1970s, digital image processing was the subject of intense university and corporate research. The research lay along two lines: (1) developing mathematical techniques for improving the appearance of, or analyzing the contents of, images represented in digital form, and (2) creating cost-effective hardware to carry out these techniques. The research has been very effective, as evidenced by the continued decline of image processing as a research topic and the rapid increase in the number of commercial companies marketing digital image processing software and hardware.
Estimating Coastal Digital Elevation Model (DEM) Uncertainty
NASA Astrophysics Data System (ADS)
Amante, C.; Mesick, S.
2017-12-01
Integrated bathymetric-topographic digital elevation models (DEMs) are representations of the Earth's solid surface and are fundamental to the modeling of coastal processes, including tsunami, storm surge, and sea-level rise inundation. Deviations in elevation values from the actual seabed or land surface constitute errors in DEMs, which originate from numerous sources, including: (i) the source elevation measurements (e.g., multibeam sonar, lidar), (ii) the interpolative gridding technique (e.g., spline, kriging) used to estimate elevations in areas unconstrained by source measurements, and (iii) the datum transformation used to convert bathymetric and topographic data to common vertical reference systems. The magnitude and spatial distribution of the errors from these sources are typically unknown, and the lack of knowledge regarding these errors represents the vertical uncertainty in the DEM. The National Oceanic and Atmospheric Administration (NOAA) National Centers for Environmental Information (NCEI) has developed DEMs for more than 200 coastal communities. This study presents a methodology developed at NOAA NCEI to derive accompanying uncertainty surfaces that estimate DEM errors at the individual cell-level. The development of high-resolution (1/9th arc-second), integrated bathymetric-topographic DEMs along the southwest coast of Florida serves as the case study for deriving uncertainty surfaces. The estimated uncertainty can then be propagated into the modeling of coastal processes that utilize DEMs. Incorporating the uncertainty produces more reliable modeling results, and in turn, better-informed coastal management decisions.
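One simple way to picture the interpolation component of that uncertainty (a toy model, not NCEI's calibrated method) is to let per-cell uncertainty grow with distance from the nearest source measurement:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def interpolation_uncertainty(measured_mask, cell_size, a=0.1, b=0.05):
    """Toy per-cell uncertainty surface: cells constrained by source
    soundings keep a base measurement error `a` (m); elsewhere uncertainty
    grows with distance to the nearest measurement at a rate `b` (m of error
    per m of distance). Coefficients are illustrative placeholders."""
    dist = distance_transform_edt(~measured_mask) * cell_size
    return a + b * dist
```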
Barone, Sandro; Neri, Paolo; Paoli, Alessandro; Razionale, Armando Viviano
2018-01-01
Orthodontic treatments are usually performed using fixed brackets or removable oral appliances, which are traditionally made from alginate impressions and wax registrations. Among removable devices, eruption guidance appliances are used for early orthodontic treatments in order to intercept and prevent malocclusion problems. Commercially available eruption guidance appliances, however, are symmetric devices produced using a few standard sizes. For this reason, they are not able to meet all the specific patient's needs since the actual dental anatomies present various geometries and asymmetric conditions. In this article, a computer-aided design-based methodology for the design and manufacturing of a patient-specific eruption guidance appliances is presented. The proposed approach is based on the digitalization of several steps of the overall process: from the digital reconstruction of patients' anatomies to the manufacturing of customized appliances. A finite element model has been developed to evaluate the temporomandibular joint disks stress level caused by using symmetric eruption guidance appliances with different teeth misalignment conditions. The developed model can then be used to guide the design of a patient-specific appliance with the aim at reducing the patient discomfort. At this purpose, two different customization levels are proposed in order to face both arches and single tooth misalignment issues. A low-cost manufacturing process, based on an additive manufacturing technique, is finally presented and discussed.
Quantitative PCR and Digital PCR for Detection of Ascaris lumbricoides Eggs in Reclaimed Water
Santísima-Trinidad, Ana Belén; Bornay-Llinares, Fernando Jorge; Martín González, Marcos; Pascual Valero, José Antonio; Ros Muñoz, Margarita
2017-01-01
The reuse of reclaimed water from wastewater depuration is a widespread and necessary practice in many areas around the world and must be accompanied by adequate and continuous quality control. Ascaris lumbricoides is one of the soil-transmitted helminths (STH) posing a risk to humans due to its high infectivity, and an important determinant of transmission is the inadequacy of water supplies and sanitation. The World Health Organization (WHO) recommends a limit equal to or lower than one parasitic helminth egg per liter for the reuse of reclaimed water for unrestricted irrigation. We present two new protocols for DNA extraction from large volumes of reclaimed water. Quantitative PCR (qPCR) and digital PCR (dPCR) were able to detect low numbers of A. lumbricoides eggs. Using the first extraction protocol, which processes 500 mL of reclaimed water, qPCR can detect DNA concentrations as low as one A. lumbricoides egg equivalent, while dPCR can detect DNA concentrations as low as five A. lumbricoides egg equivalents. Using the second protocol, which processes 10 L of reclaimed water, qPCR was able to detect DNA concentrations equivalent to 20 A. lumbricoides eggs. These results indicate the importance of developing new methodologies to detect helminth eggs with higher sensitivity and precision, avoiding possible human infection risks. PMID:28377928
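For context on the dPCR quantification step, the standard Poisson correction converts the fraction of positive partitions into a copy estimate; the sketch below is the generic formula, not necessarily the exact workup used in the paper:

```python
import math

def dpcr_copies(positive, partitions, volume_l):
    """Standard digital-PCR Poisson estimate: with k of n partitions
    positive, the mean copies per partition is lambda = -ln(1 - k/n);
    scaling by the partition count and the processed sample volume gives
    copies per litre."""
    lam = -math.log(1.0 - positive / partitions)
    return lam * partitions / volume_l
```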
E-inclusion Process and Societal Digital Skill Development
ERIC Educational Resources Information Center
Vitolina, Ieva
2015-01-01
Nowadays, the focus shifts from information and communication technology access to skills and knowledge. Moreover, lack of digital skills is an obstacle in the process of learning new digital competences using technologies and e-learning. The objective of this research is to investigate how to facilitate students to use the acquired digital skills…
The AAPM/RSNA physics tutorial for residents: digital fluoroscopy.
Pooley, R A; McKinney, J M; Miller, D A
2001-01-01
A digital fluoroscopy system is most commonly configured as a conventional fluoroscopy system (tube, table, image intensifier, video system) in which the analog video signal is converted to and stored as digital data. Other methods of acquiring the digital data (eg, digital or charge-coupled device video and flat-panel detectors) will become more prevalent in the future. Fundamental concepts related to digital imaging in general include binary numbers, pixels, and gray levels. Digital image data allow the convenient use of several image processing techniques including last image hold, gray-scale processing, temporal frame averaging, and edge enhancement. Real-time subtraction of digital fluoroscopic images after injection of contrast material has led to widespread use of digital subtraction angiography (DSA). Additional image processing techniques used with DSA include road mapping, image fade, mask pixel shift, frame summation, and vessel size measurement. Peripheral angiography performed with an automatic moving table allows imaging of the peripheral vasculature with a single contrast material injection.
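Two of the listed techniques are easy to sketch in recursive form (an illustration of the concepts, not a vendor implementation): temporal frame averaging as a running exponential average, with the same buffer serving as the last-image-hold frame when the beam stops.

```python
import numpy as np

class FluoroProcessor:
    """Temporal frame averaging (a running exponential average trading lag
    for noise reduction) plus last-image-hold (the averaged frame persists
    after exposure ends)."""
    def __init__(self, alpha=0.25):
        self.alpha = alpha          # weight of the newest frame
        self.held = None            # last-image-hold buffer

    def new_frame(self, frame):
        if self.held is None:
            self.held = frame.astype(float)
        else:
            self.held = self.alpha * frame + (1 - self.alpha) * self.held
        return self.held

    def last_image_hold(self):
        return self.held            # displayed when the beam is off
```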
ERPs and oscillations during encoding predict retrieval of digit memory in superior mnemonists.
Pan, Yafeng; Li, Xianchun; Chen, Xi; Ku, Yixuan; Dong, Yujie; Dou, Zheng; He, Lin; Hu, Yi; Li, Weidong; Zhou, Xiaolin
2017-10-01
Previous studies have consistently demonstrated that superior mnemonists (SMs) outperform normal individuals in domain-specific memory tasks. However, the neural correlates of memory-related processes remain unclear. In the current EEG study, SMs and control participants performed a digit memory task during which their brain activity was recorded. Chinese SMs used a digit-image mnemonic for encoding digits, in which they associated 2-digit groups with images immediately after the presentation of each even-position digit in sequences. Behaviorally, SMs' memory of digit sequences was better than the controls'. During encoding in the study phase, SMs showed an increased right central P2 (150-250ms post onset) and a larger right posterior high-alpha (10-14Hz, 500-1720ms) oscillation on digits at even-positions compared with digits at odd-positions. Both P2 and high-alpha oscillations in the study phase co-varied with performance in the recall phase, but only in SMs, indicating that neural dynamics during encoding could predict successful retrieval of digit memory in SMs. Our findings suggest that representation of a digit sequence in SMs using mnemonics may recruit both the early-stage attention allocation process and the sustained information preservation process. This study provides evidence for the role of dynamic and efficient neural encoding processes in mnemonists. Copyright © 2017. Published by Elsevier Inc.
NASA Astrophysics Data System (ADS)
Pesci, Arianna; Fabris, Massimo; Conforti, Dario; Loddo, Fabiana; Baldi, Paolo; Anzidei, Marco
2007-05-01
This work deals with the integration of different surveying methodologies for the definition of very accurate Digital Terrain Models (DTMs) and/or Digital Surface Models (DSMs): in particular, aerial digital photogrammetry and terrestrial laser scanning were used to survey the Vesuvio volcano, allowing total coverage of the internal cone and its surroundings (the whole surveyed area was about 3 km × 3 km). The possibility of reaching very high precision, especially with the laser scanner data set, allowed a detailed description of the morphology of the volcano. Comparisons of models obtained in repeated surveys yield a detailed map of residuals, providing a data set that can be used for detailed studies of the morphological evolution. Moreover, the reflectivity information, highly correlated with material properties, allows for the measurement and quantification of some morphological variations in areas where structural discontinuities and displacements are present.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keller, Todd M.; Benjamin, Jacob S.; Wright, Virginia L.
This paper will describe a practical methodology for understanding the cyber risk of a digital asset. This research attempts to gain a greater understanding of the cyber risk posed by a hardware-based computer asset by considering it as a sum of its hardware and software based sub-components.
DESDynI Quad First Stage Processor - A Four Channel Digitizer and Digital Beam Forming Processor
NASA Technical Reports Server (NTRS)
Chuang, Chung-Lun; Shaffer, Scott; Smythe, Robert; Niamsuwan, Noppasin; Li, Samuel; Liao, Eric; Lim, Chester; Morfopolous, Arin; Veilleux, Louise
2013-01-01
The proposed Deformation, Eco-Systems, and Dynamics of Ice Radar (DESDynI-R) L-band SAR instrument employs multiple digital channels to optimize resolution while keeping a large swath on a single pass. High-speed digitization with very fine synchronization and digital beam forming are necessary in order to facilitate this new technique. The Quad First Stage Processor (qFSP) was developed to achieve both the processing performance and the digitizing fidelity needed to accomplish this sweeping SAR technique. The qFSP utilizes high-precision, high-speed analog-to-digital converters (ADCs), each with a finely adjustable clock distribution network, to digitize the channels at the fidelity necessary to allow for digital beam forming. The Xilinx Virtex-5 FX130T part handles the processing to digitally calibrate each channel as well as to filter and beam-form the receive signals. Demonstrating the digital processing required for digital beam forming and digital calibration is instrumental to the viability of the proposed DESDynI instrument. The qFSP development brings this implementation to Technology Readiness Level (TRL) 6. This paper details the design and development of the prototype qFSP as well as preliminary results from hardware tests.
Kidney transplantation process in Brazil represented in business process modeling notation.
Peres Penteado, A; Molina Cohrs, F; Diniz Hummel, A; Erbs, J; Maciel, R F; Feijó Ortolani, C L; de Aguiar Roza, B; Torres Pisa, I
2015-05-01
Kidney transplantation is considered to be the best treatment for people with chronic kidney failure, because it improves patients' quality of life and increases their length of survival compared with patients undergoing dialysis. The kidney transplantation process in Brazil is defined through laws, decrees, ordinances, and resolutions, but there is no visual representation of this process. The aim of this study was to analyze official documents in order to construct a representation of the kidney transplantation process in Brazil using business process modeling notation (BPMN). The methodology was based on an exploratory observational study, document analysis, and the construction of process diagrams with BPMN. Two rounds of validation by specialists were conducted. The result is a BPMN representation of the kidney transplantation process in Brazil. We analyzed 2 digital documents, which resulted in 2 processes with a total of 45 activities and events, 6 organizations involved, and 6 different stages of the process. The constructed representation makes it easier to understand the business rules of kidney transplantation and can be used by the healthcare professionals involved in the various activities within this process. Construction of a representation in language appropriate for the Brazilian lay public is underway. Copyright © 2015 Elsevier Inc. All rights reserved.
Five task clusters that enable efficient and effective digitization of biological collections
Nelson, Gil; Paul, Deborah; Riccardi, Gregory; Mast, Austin R.
2012-01-01
This paper describes and illustrates five major clusters of related tasks (herein referred to as task clusters) that are common to efficient and effective practices in the digitization of biological specimen data and media. Examples of these clusters come from the observation of diverse digitization processes. The staff of iDigBio (the U.S. National Science Foundation's National Resource for Advancing Digitization of Biological Collections) visited active biological and paleontological collections digitization programs for the purpose of documenting and assessing current digitization practices and tools. These observations identified five task clusters that comprise the digitization process leading up to data publication: (1) pre-digitization curation and staging, (2) specimen image capture, (3) specimen image processing, (4) electronic data capture, and (5) georeferencing locality descriptions. While not all institutions are completing each of these task clusters for each specimen, these clusters describe a composite picture of the digitization of biological and paleontological specimens across the programs that were observed. We describe these clusters and the three workflow patterns that dominate their implementation, and offer a set of workflow recommendations for digitization programs. PMID:22859876
NASA Astrophysics Data System (ADS)
Pásztor, László; Dobos, Endre; Szabó, József; Bakacsi, Zsófia; Laborczi, Annamária
2013-04-01
There is ample evidence that the demand for soil-related information is significant worldwide and still increasing. Soil maps were typically used for a long time to satisfy these demands. With the spread of GI technology, spatial soil information systems (SSIS) and digital soil mapping (DSM) took over the role of traditional soil maps. Due to the relatively high costs of data collection, new conventional soil surveys and inventories are becoming less and less frequent, a fact which increases the value of legacy soil information and of the systems serving its digitally processed versions. The existing data contain a wealth of information that can be exploited with the proper methodology. Not only the degree of current needs for soil information has changed but also their nature. Traditionally the agricultural functions of soils were focused on, which was also reflected in the methodology of data collection and mapping. Recently the multifunctionality of soils has been gaining more and more ground; consequently, information related to the additional functions of soils is becoming equally important. The new types of information requirements, however, cannot generally be fulfilled with new data collections, at least not on the level of traditional soil surveys. Soil monitoring systems have been established for the collection of recent information on the various elements of the DPSIR (Driving Forces-Pressures-State-Impacts-Responses) framework, but the primary goal of these systems has not necessarily been mapping. This is definitely the case for the two currently operating Hungarian soil monitoring systems. In Hungary, soil data requirements are presently fulfilled with the available datasets, either by their direct usage or after certain specific and generally fortuitous thematic and/or spatial inference. Due to the increasingly frequent discrepancies between the available and the expected data, there may be notable imperfections in the accuracy and reliability of the delivered products. Since, as in the great majority of the world, large-scale, comprehensive new surveys cannot be expected in the near future, the actually available legacy data should be relied on. With a recently started project we aim to significantly extend the ways in which countrywide soil information requirements can be satisfied. In the frame of our project we plan to carry out spatial and thematic data mining of the significant amount of soil-related information available in the form of legacy soil data as well as digital databases and spatial soil information systems. In the course of the analyses we will rely on auxiliary spatial data themes related to environmental elements. Based on the established relationships we will convert and integrate the specific data sets for the regionalization of the various derived soil parameters. With the aid of GIS and geostatistical tools we will carry out the spatial extension of certain pedological variables featuring the state (including degradation), processes, or functions of soils. We plan to compile digital soil maps which optimally fulfil national and international demands in terms of thematic, spatial, and temporal accuracy. The targeted spatial resolution of the proposed countrywide, digital, thematic soil property and function maps is at least 1:50,000 (approx. 50-100 meter raster).
Our ultimate objective is the definitive solution of regionalizing the information collected in the framework of two recent, national, systematic soil data collections (not designed for mapping purposes) on the current state of soils, in order to produce countrywide maps for the spatial inventory of certain soil properties, processes, and functions with sufficient accuracy and reliability.
LANDSAT information for state planning
NASA Technical Reports Server (NTRS)
Faust, N. L.; Spann, G. W.
1977-01-01
The transfer of remote sensing technology for the digital processing of LANDSAT data to state and local agencies in Georgia and other southeastern states is discussed. The project consists of a series of workshops, seminars, and demonstration efforts, and the transfer of NASA-developed hardware concepts and computer software to state agencies. Throughout the multi-year effort, digital processing techniques, including classification algorithms, have been emphasized. Software for LANDSAT data rectification and processing has been developed and/or transferred. A hardware system is available at EES (engineering experiment station) to allow user-interactive processing of LANDSAT data. Seminars and workshops emphasize the digital approach to LANDSAT data utilization and the system improvements scheduled for LANDSATs C and D. Results of the project indicate a substantially increased awareness of the utility of digital LANDSAT processing techniques among the agencies contacted throughout the southeast. In Georgia, several agencies have jointly funded a program to map the entire state using digitally processed LANDSAT data.
NASA Astrophysics Data System (ADS)
Banfi, F.
2017-08-01
The Architecture, Engineering and Construction (AEC) industry is facing a major re-engineering of management procedures for new constructions, and recent studies show a significant increase in the benefits obtained through the use of Building Information Modelling (BIM) methodologies. This innovative approach requires new developments in information and communication technologies (ICT) in order to improve cooperation and interoperability among different actors and scientific disciplines. Accordingly, BIM can be described as a new tool capable of collecting and analysing a great quantity of information (big data) and improving the management of a building during its life cycle (LC). The main aim of this research, in addition to reducing production times and physical and financial resources (economic impact), is to demonstrate how technological development can support a complex generative process with new digital tools (modelling impact). This paper reviews recent BIMs of different historical Italian buildings, such as the Basilica of Collemaggio in L'Aquila, Masegra Castle in Sondrio, the Basilica of Saint Ambrose in Milan and the Visconti Bridge in Lecco, and carries out a methodological analysis to optimize output information and results by combining different data and modelling techniques into a single hub (cloud service) through the use of new Grades of Generation (GoG) and Information (GoI) (management impact). Finally, this study shows the need to orient GoG and GoI towards different types of analysis, which require a high Grade of Accuracy (GoA) and an Automatic Verification System (AVS) at the same time.
Is complex signal processing for bone conduction hearing aids useful?
Kompis, Martin; Kurz, Anja; Pfiffner, Flurin; Senn, Pascal; Arnold, Andreas; Caversaccio, Marco
2014-05-01
To establish whether complex signal processing is beneficial for users of bone anchored hearing aids. Review and analysis of two studies from our own group, each comparing a speech processor with basic digital signal processing (either Baha Divino or Baha Intenso) and a processor with complex digital signal processing (either Baha BP100 or Baha BP110 power). The main differences between basic and complex signal processing are the number of audiologist accessible frequency channels and the availability and complexity of the directional multi-microphone noise reduction and loudness compression systems. Both studies show a small, statistically non-significant improvement of speech understanding in quiet with the complex digital signal processing. The average improvement for speech in noise is +0.9 dB, if speech and noise are emitted both from the front of the listener. If noise is emitted from the rear and speech from the front of the listener, the advantage of the devices with complex digital signal processing as opposed to those with basic signal processing increases, on average, to +3.2 dB (range +2.3 … +5.1 dB, p ≤ 0.0032). Complex digital signal processing does indeed improve speech understanding, especially in noise coming from the rear. This finding has been supported by another study, which has been published recently by a different research group. When compared to basic digital signal processing, complex digital signal processing can increase speech understanding of users of bone anchored hearing aids. The benefit is most significant for speech understanding in noise.
Richardson, Jonathan; McDonald, Joe
2016-10-01
The move to a digital health service may improve some components of health systems: information, communication and documentation of care. This article gives a brief definition and history of what is meant by an electronic health record (EHR). There is some evidence of benefits in a number of areas, including legibility, accuracy and the secondary use of information, but there is a need for further research, which may need to use different methodologies to analyse the impact an EHR has on patients, professionals and providers.
Shen, Qinhua; Kirschbaum, Miko U F; Hedley, Mike J; Camps Arbestain, Marta
2016-01-01
This study aimed to develop and test an unbiased and rapid methodology to estimate the length of external arbuscular mycorrhizal fungal (AMF) hyphae in soil. The traditional visual gridline intersection (VGI) method, which consists of a direct visual examination of the intersections of hyphae with gridlines on a microscope eyepiece after aqueous extraction, membrane filtration, and staining (e.g., with trypan blue), was refined. For this, (i) images of the stained hyphae were taken using a digital photomicrography technique to avoid the use of the microscope, in what is referred to as the "digital gridline intersection" (DGI) method; and (ii) the images taken in (i) were processed and the hyphal length measured using ImageJ software, referred to as the "photomicrography-ImageJ processing" (PIP) method. The DGI and PIP methods were tested using known grade lengths of possum fur. They were then applied to measure the hyphal lengths in soils with contrasting phosphorus (P) fertility status. Linear regressions were obtained between the known lengths (Lknown) of possum fur and the values determined using either the DGI (LDGI) method (LDGI = 0.37 + 0.97 × Lknown, r2 = 0.86) or the PIP (LPIP) method (LPIP = 0.33 + 1.01 × Lknown, r2 = 0.98). There were no significant (P > 0.05) differences between the LDGI and LPIP values. While both methods provided accurate estimation (slope of regression being 1.0), the PIP method was more precise, as reflected by a higher value of r2 and lower coefficients of variation. The average hyphal lengths (6.5-19.4 m g-1) obtained with these methods were in the range of those typically reported in the literature (3-30 m g-1). Roots growing in P-deficient soil developed 2.5 times as many hyphae as roots growing in P-rich soil (17.4 vs 7.2 m g-1). These tests confirmed that the use of digital photomicrography in conjunction with either the gridline intersection principle or image processing is a suitable method for measuring AMF hyphal lengths in soils for comparative investigations.
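The DGI principle - count crossings of hyphae with a virtual grid and convert counts to length - can be sketched as below. The binary mask is assumed to come from thresholding the stained image, and the 11/14 conversion factor is Tennant's classic line-intercept constant, used here as an assumption since the abstract does not give the authors' exact conversion:

```python
import numpy as np

def hyphal_length_dgi(binary, grid_px, px_per_mm):
    """Line-intercept estimate in the spirit of the DGI method: count
    crossings of a thresholded hyphae mask with a virtual grid of spacing
    `grid_px` pixels, then convert counts to length (mm) with Tennant's
    11/14 factor (an assumption; the paper's conversion may differ)."""
    crossings = 0
    for r in range(0, binary.shape[0], grid_px):      # horizontal gridlines
        crossings += np.count_nonzero(np.diff(binary[r, :].astype(int)) == 1)
    for c in range(0, binary.shape[1], grid_px):      # vertical gridlines
        crossings += np.count_nonzero(np.diff(binary[:, c].astype(int)) == 1)
    return (11 / 14) * crossings * (grid_px / px_per_mm)
```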
Development of Coriolis mass flowmeter with digital drive and signal processing technology.
Hou, Qi-Li; Xu, Ke-Jun; Fang, Min; Liu, Cui; Xiong, Wen-Jun
2013-09-01
Coriolis mass flowmeters (CMF) often suffer from two-phase flow, which may cause flowtube stalling. To solve this problem, a digital drive method and a digital signal processing method for CMF are studied and implemented in this paper. A positive-negative step signal is used to initiate the flowtube oscillation without knowing the natural frequency of the flowtube. A digital zero-crossing detection method based on Lagrange interpolation is adopted to calculate the frequency and phase difference of the sensor output signals in order to synthesize the digital drive signal. The digital drive approach is implemented by a multiplying digital-to-analog converter (MDAC) and a direct digital synthesizer (DDS). A digital Coriolis mass flow transmitter is developed with a digital signal processor (DSP) to control the digital drive and perform the signal processing. Water flow calibrations and gas-liquid two-phase flow experiments were conducted to examine the performance of the transmitter. The experimental results show that the transmitter shortens the start-up time and can maintain the oscillation of the flowtube under two-phase flow conditions. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
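A first-order version of the zero-crossing detector described above (linear interpolation is the simplest Lagrange case; the paper's implementation details may differ):

```python
import numpy as np

def rising_zero_crossings(x, fs):
    """Locate rising zero crossings with sub-sample precision by linearly
    interpolating between the bracketing samples."""
    i = np.where((x[:-1] < 0) & (x[1:] >= 0))[0]
    frac = -x[i] / (x[i + 1] - x[i])        # fractional sample offset
    return (i + frac) / fs                  # crossing times in seconds

def freq_and_phase(sig_a, sig_b, fs):
    """Drive frequency from the mean period of one sensor signal, and the
    A-to-B phase difference (radians) from the delay between their first
    crossings -- the quantities needed to synthesize the drive signal and
    compute mass flow."""
    ta = rising_zero_crossings(sig_a, fs)
    tb = rising_zero_crossings(sig_b, fs)
    f = 1.0 / np.mean(np.diff(ta))
    return f, 2 * np.pi * f * (tb[0] - ta[0])
```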
Digital signal processing in microwave radiometers
NASA Technical Reports Server (NTRS)
Lawrence, R. W.; Stanley, W. D.; Harrington, R. F.
1980-01-01
A microprocessor-based digital signal processing unit has been proposed to replace the analog sections of a microwave radiometer. A brief introduction to the radiometer system involved is given, and problems encountered in the use of digital techniques in radiometer design are discussed. An analysis of the digital signal processor as part of the radiometer is then presented.
ERIC Educational Resources Information Center
Onaral, Banu; And Others
This report describes the development of a Drexel University electrical and computer engineering course on digital filter design that used interactive computing and graphics, and was one of three courses in a senior-level sequence on digital signal processing (DSP). Interactive and digital analysis/design routines and the interconnection of these…
Keyes, S D; Gillard, F; Soper, N; Mavrogordato, M N; Sinclair, I; Roose, T
2016-06-14
The mechanical impedance of soils inhibits the growth of plant roots, often being the most significant physical limitation to root system development. Non-invasive imaging techniques have recently been used to investigate the development of root system architecture over time, but the relationship with soil deformation is usually neglected. Correlative mapping approaches parameterised using 2D and 3D image data have recently gained prominence for quantifying physical deformation in composite materials including fibre-reinforced polymers and trabecular bone. Digital Image Correlation (DIC) and Digital Volume Correlation (DVC) are computational techniques which use the inherent material texture of surfaces and volumes, captured using imaging techniques, to map full-field deformation components in samples during physical loading. Here we develop an experimental assay and methodology for four-dimensional, in vivo X-ray Computed Tomography (XCT) and apply a Digital Volume Correlation (DVC) approach to the data to quantify deformation. The method is validated for a field-derived soil under conditions of uniaxial compression, and a calibration study is used to quantify thresholds of displacement and strain measurement. The validated and calibrated approach is then demonstrated for an in vivo test case in which an extending maize root in field-derived soil was imaged hourly using XCT over a growth period of 19h. This allowed full-field soil deformation data and 3D root tip dynamics to be quantified in parallel for the first time. This fusion of methods paves the way for comparative studies of contrasting soils and plant genotypes, improving our understanding of the fundamental mechanical processes which influence root system development. Copyright © 2016 Elsevier Ltd. All rights reserved.
D Reconstruction of AN Underwater Archaelogical Site: Comparison Between Low Cost Cameras
NASA Astrophysics Data System (ADS)
Capra, A.; Dubbini, M.; Bertacchini, E.; Castagnetti, C.; Mancini, F.
2015-04-01
The 3D reconstruction with a metric content of a submerged area, where objects and structures of archaeological interest are found, could play an important role in the research and study activities and even in the digitization of the cultural heritage. The reconstruction of 3D object, of interest for archaeologists, constitutes a starting point in the classification and description of object in digital format and for successive fruition by user after delivering through several media. The starting point is a metric evaluation of the site obtained with photogrammetric surveying and appropriate 3D restitution. The authors have been applying the underwater photogrammetric technique since several years using underwater digital cameras and, in this paper, digital low cost cameras (off-the-shelf). Results of tests made on submerged objects with three cameras are presented: Canon Power Shot G12, Intova Sport HD e GoPro HERO 2. The experimentation had the goal to evaluate the precision in self-calibration procedures, essential for multimedia underwater photogrammetry, and to analyze the quality of 3D restitution. Precisions obtained in the calibration and orientation procedures was assessed by using three cameras, and an homogeneous set control points. Data were processed with Agisoft Photoscan. Successively, 3D models were created and the comparison of the models derived from the use of different cameras was performed. Different potentialities of the used cameras are reported in the discussion section. The 3D restitution of objects and structures was integrated with sea bottom floor morphology in order to achieve a comprehensive description of the site. A possible methodology of survey and representation of submerged objects is therefore illustrated, considering an automatic and a semi-automatic approach.
DMD: a digital light processing application to projection displays
NASA Astrophysics Data System (ADS)
Feather, Gary A.
1989-01-01
Summary Revolutionary technologies achieve rapid product and subsequent business diffusion only when the in- ventors focus on technology application, maturation, and proliferation. A revolutionary technology is emerg- ing with micro-electromechanical systems (MEMS). MEMS are being developed by leveraging mature semi- conductor processing coupled with mechanical systems into complete, integrated, useful systems. The digital micromirror device (DMD), a Texas Instruments invented MEMS, has focused on its application to projec- tion displays. The DMD has demonstrated its application as a digital light processor, processing and produc- ing compelling computer and video projection displays. This tutorial discusses requirements in the projection display market and the potential solutions offered by this digital light processing system. The seminar in- cludes an evaluation of the market, system needs, design, fabrication, application, and performance results of a system using digital light processing solutions.
Seamless lesion insertion in digital mammography: methodology and reader study
NASA Astrophysics Data System (ADS)
Pezeshk, Aria; Petrick, Nicholas; Sahiner, Berkman
2016-03-01
Collection of large repositories of clinical images containing verified cancer locations is costly and time consuming due to difficulties associated with both the accumulation of data and establishment of the ground truth. This problem poses a significant challenge to the development of machine learning algorithms that require large amounts of data to properly train and avoid overfitting. In this paper we expand the methods in our previous publications by making several modifications that significantly increase the speed of our insertion algorithms, thereby allowing them to be used for inserting lesions that are much larger in size. These algorithms have been incorporated into an image composition tool that we have made publicly available. This tool allows users to modify or supplement existing datasets by seamlessly inserting a real breast mass or micro-calcification cluster extracted from a source digital mammogram into a different location on another mammogram. We demonstrate examples of the performance of this tool on clinical cases taken from the University of South Florida Digital Database for Screening Mammography (DDSM). Finally, we report the results of a reader study evaluating the realism of inserted lesions compared to clinical lesions. Analysis of the radiologist scores in the study using receiver operating characteristic (ROC) methodology indicates that inserted lesions cannot be reliably distinguished from clinical lesions.
Barisoni, Laura; Troost, Jonathan P; Nast, Cynthia; Bagnasco, Serena; Avila-Casado, Carmen; Hodgin, Jeffrey; Palmer, Matthew; Rosenberg, Avi; Gasim, Adil; Liensziewski, Chrysta; Merlino, Lino; Chien, Hui-Ping; Chang, Anthony; Meehan, Shane M; Gaut, Joseph; Song, Peter; Holzman, Lawrence; Gibson, Debbie; Kretzler, Matthias; Gillespie, Brenda W; Hewitt, Stephen M
2016-07-01
The multicenter Nephrotic Syndrome Study Network (NEPTUNE) digital pathology scoring system employs a novel and comprehensive methodology to document pathologic features from whole-slide images, immunofluorescence and ultrastructural digital images. To estimate inter- and intra-reader concordance of this descriptor-based approach, data from 12 pathologists (eight NEPTUNE and four non-NEPTUNE) with experience from training to 30 years were collected. A descriptor reference manual was generated and a webinar-based protocol for consensus/cross-training implemented. Intra-reader concordance for 51 glomerular descriptors was evaluated on jpeg images by seven NEPTUNE pathologists scoring 131 glomeruli three times (Tests I, II, and III), each test following a consensus webinar review. Inter-reader concordance of glomerular descriptors was evaluated in 315 glomeruli by all pathologists; interstitial fibrosis and tubular atrophy (244 cases, whole-slide images) and four ultrastructural podocyte descriptors (178 cases, jpeg images) were evaluated once by six and five pathologists, respectively. Cohen's kappa for inter-reader concordance for 48/51 glomerular descriptors with sufficient observations was moderate (0.40
Mapping soil texture classes and optimization of the result by accuracy assessment
NASA Astrophysics Data System (ADS)
Laborczi, Annamária; Takács, Katalin; Bakacsi, Zsófia; Szabó, József; Pásztor, László
2014-05-01
There are increasing demands nowadays on spatial soil information in order to support environmental related and land use management decisions. The GlobalSoilMap.net (GSM) project aims to make a new digital soil map of the world using state-of-the-art and emerging technologies for soil mapping and predicting soil properties at fine resolution. Sand, silt and clay are among the mandatory GSM soil properties. Furthermore, soil texture class information is input data of significant agro-meteorological and hydrological models. Our present work aims to compare and evaluate different digital soil mapping methods and variables for producing the most accurate spatial prediction of texture classes in Hungary. In addition to the Hungarian Soil Information and Monitoring System as our basic data, digital elevation model and its derived components, geological database, and physical property maps of the Digital Kreybig Soil Information System have been applied as auxiliary elements. Two approaches have been applied for the mapping process. At first the sand, silt and clay rasters have been computed independently using regression kriging (RK). From these rasters, according to the USDA categories, we have compiled the texture class map. Different combinations of reference and training soil data and auxiliary covariables have resulted several different maps. However, these results consequentially include the uncertainty factor of the three kriged rasters. Therefore we have suited data mining methods as the other approach of digital soil mapping. By working out of classification trees and random forests we have got directly the texture class maps. In this way the various results can be compared to the RK maps. The performance of the different methods and data has been examined by testing the accuracy of the geostatistically computed and the directly classified results. We have used the GSM methodology to assess the most predictive and accurate way for getting the best among the several result maps. Acknowledgement: Our work was supported by the Hungarian National Scientific Research Foundation (OTKA, Grant No. K105167).
Digital pulse processing in Mössbauer spectroscopy
NASA Astrophysics Data System (ADS)
Veiga, A.; Grunfeld, C. M.
2014-04-01
In this work we present some advances towards full digitization of the detection subsystem of a Mössbauer transmission spectrometer. We show how, using adequate instrumentation, preamplifier output of a proportional counter can be digitized with no deterioration in spectrum quality, avoiding the need of a shaping amplifier. A pipelined architecture is proposed for a digital processor, which constitutes a versatile platform for the development of pulse processing techniques. Requirements for minimization of the analog processing are considered and experimental results are presented.
Product Lifecycle Management and the Quest for Sustainable Space Explorations
NASA Technical Reports Server (NTRS)
Caruso, Pamela W.; Dumbacher, Daniel L.
2010-01-01
Product Lifecycle Management (PLM) is an outcome of lean thinking to eliminate waste and increase productivity. PLM is inextricably tied to the systems engineering business philosophy, coupled with a methodology by which personnel, processes and practices, and information technology combine to form an architecture platform for product design, development, manufacturing, operations, and decommissioning. In this model, which is being implemented by the Engineering Directorate at the National Aeronautics and Space Administration's (NASA's) Marshall Space Flight Center, total lifecycle costs are important variables for critical decision-making. With the ultimate goal to deliver quality products that meet or exceed requirements on time and within budget, PLM is a powerful concept to shape everything from engineering trade studies and testing goals, to integrated vehicle operations and retirement scenarios. This paper will demonstrate how the Engineering Directorate is implementing PLM as part of an overall strategy to deliver safe, reliable, and affordable space exploration solutions. It has been 30 years since the United States fielded the Space Shuttle. The next generation space transportation system requires a paradigm shift such that digital tools and knowledge management, which are central elements of PLM, are used consistently to maximum effect. The outcome is a better use of scarce resources, along with more focus on stakeholder and customer requirements, as a new portfolio of enabling tools becomes second nature to the workforce. This paper will use the design and manufacturing processes, which have transitioned to digital-based activities, to show how PLM supports the comprehensive systems engineering and integration function. It also will go through a launch countdown scenario where an anomaly is detected to show how the virtual vehicle created from paperless processes will help solve technical challenges and improve the likelihood of launching on schedule, with less hands-on labor needed for processing and troubleshooting.
Detailed Characterization of Nearshore Processes During NCEX
NASA Astrophysics Data System (ADS)
Holland, K.; Kaihatu, J. M.; Plant, N.
2004-12-01
Recent technology advances have allowed the coupling of remote sensing methods with advanced wave and circulation models to yield detailed characterizations of nearshore processes. This methodology was demonstrated as part of the Nearshore Canyon EXperiment (NCEX) in La Jolla, CA during Fall 2003. An array of high-resolution, color digital cameras was installed to monitor an alongshore distance of nearly 2 km out to depths of 25 m. This digital imagery was analyzed over the three-month period through an automated process to produce hourly estimates of wave period, wave direction, breaker height, shoreline position, sandbar location, and bathymetry at numerous locations during daylight hours. Interesting wave propagation patterns in the vicinity of the canyons were observed. In addition, directional wave spectra and swash / surf flow velocities were estimated using more computationally intensive methods. These measurements were used to provide forcing and boundary conditions for the Delft3D wave and circulation model, giving additional estimates of nearshore processes such as dissipation and rip currents. An optimal approach for coupling these remotely sensed observations to the numerical model was selected to yield accurate, but also timely characterizations. This involved assimilation of directional spectral estimates near the offshore boundary to mimic forcing conditions achieved under traditional approaches involving nested domains. Measurements of breaker heights and flow speeds were also used to adaptively tune model parameters to provide enhanced accuracy. Comparisons of model predictions and video observations show significant correlation. As compared to nesting within larger-scale and coarser resolution models, the advantages of providing boundary conditions data using remote sensing is much improved resolution and fidelity. For example, rip current development was both modeled and observed. These results indicate that this approach to data-model coupling is tenable and may be useful in near-real-time characterizations required by many applied scenarios.
Color imaging technologies in the prepress industry
NASA Astrophysics Data System (ADS)
Silverman, Lee
1992-05-01
Over much of the last half century, electronic technologies have played an increasing role in the prepress production of film and plates prepared for printing presses. The last decade has seen an explosion of technologies capable of supplementing this production. The most outstanding technology infusing this growth has been the microcomputer, but other component technologies have also diversified the capacity for high-quality scanning of photographs. In addition, some fundamental software and affordable laser recorder technologies have provided new approaches to the merging of typographic and halftoned photographic data onto film. The next decade will evolve the methods and the technologies to achieve superior text and image communication on mass distribution media used in the printed page or instead of the printed page. This paper focuses on three domains of electronic prepress classified as the input, transformation, and output phases of the production process. The evolution of the component technologies in each of these three phases is described. The unique attributes in each are defined and then follows a discussion of the pertinent technologies which overlap all three domains. Unique to input is sensor technology and analogue to digital conversion. Unique to the transformation phase is the display on monitor for soft proofing and interactive processing. The display requires special technologies for digital frame storage and high-speed, gamma- compensated, digital to analogue conversion. Unique to output is the need for halftoning and binary recording device linearization or calibration. Specialized direct digital color technologies now allow color quality proofing without the need for writing intermediate separation films, but ultimately these technologies will be supplanted by direct printing technologies. First, dry film processing, then direct plate writing, and finally direct application of ink or toner onto paper at the 20 - 30 thousand impressions per hour now achieved by offset printing. In summary, a review of technological evolution guides industry methodologies that will define a transformation of workflow in graphic arts during the next decade. Prepress production will integrate component technologies with microcomputers in order to optimize the production cycle from graphic design to printed piece. These changes will drastically alter the business structures and tools used to put type and photographs on paper in the volumes expected from printing presses.
Digital image transformation and rectification of spacecraft and radar images
Wu, S.S.C.
1985-01-01
Digital image transformation and rectification can be described in three categories: (1) digital rectification of spacecraft pictures on workable stereoplotters; (2) digital correction of radar image geometry; and (3) digital reconstruction of shaded relief maps and perspective views including stereograms. Digital rectification can make high-oblique pictures workable on stereoplotters that would otherwise not accommodate such extreme tilt angles. It also enables panoramic line-scan geometry to be used to compile contour maps with photogrammetric plotters. Rectifications were digitally processed on both Viking Orbiter and Lander pictures of Mars as well as radar images taken by various radar systems. By merging digital terrain data with image data, perspective and three-dimensional views of Olympus Mons and Tithonium Chasma, also of Mars, are reconstructed through digital image processing. ?? 1985.
Silicon CMOS optical receiver circuits with integrated thin-film compound semiconductor detectors
NASA Astrophysics Data System (ADS)
Brooke, Martin A.; Lee, Myunghee; Jokerst, Nan Marie; Camperi-Ginestet, C.
1995-04-01
While many circuit designers have tackled the problem of CMOS digital communications receiver design, few have considered the problem of circuitry suitable for an all CMOS digital IC fabrication process. Faced with a high speed receiver design the circuit designer will soon conclude that a high speed analog-oriented fabrication process provides superior performance advantages to a digital CMOS process. However, for applications where there are overwhelming reasons to integrate the receivers on the same IC as large amounts of conventional digital circuitry, the low yield and high cost of the exotic analog-oriented fabrication is no longer an option. The issues that result from a requirement to use a digital CMOS IC process cut across all aspects of receiver design, and result in significant differences in circuit design philosophy and topology. Digital ICs are primarily designed to yield small, fast CMOS devices for digital logic gates, thus no effort is put into providing accurate or high speed resistances, or capacitors. This lack of any reliable resistance or capacitance has a significant impact on receiver design. Since resistance optimization is not a prerogative of the digital IC process engineer, the wisest option is thus to not use these elements, opting instead for active circuitry to replace the functions normally ascribed to resistance and capacitance. Depending on the application receiver noise may be a dominant design constraint. The noise performance of CMOS amplifiers is different than bipolar or GaAs MESFET circuits, shot noise is generally insignificant when compared to channel thermal noise. As a result the optimal input stage topology is significantly different for the different technologies. It is found that, at speeds of operation approaching the limits of the digital CMOS process, open loop designs have noise-power-gain-bandwidth tradeoff performance superior to feedback designs. Furthermore, the lack of good resisters and capacitors complicates the use of feedback circuits. Thus feedback is generally not used in the front-end of our digital process CMOS receivers.
Unified Digital Image Display And Processing System
NASA Astrophysics Data System (ADS)
Horii, Steven C.; Maguire, Gerald Q.; Noz, Marilyn E.; Schimpf, James H.
1981-11-01
Our institution like many others, is faced with a proliferation of medical imaging techniques. Many of these methods give rise to digital images (e.g. digital radiography, computerized tomography (CT) , nuclear medicine and ultrasound). We feel that a unified, digital system approach to image management (storage, transmission and retrieval), image processing and image display will help in integrating these new modalities into the present diagnostic radiology operations. Future techniques are likely to employ digital images, so such a system could readily be expanded to include other image sources. We presently have the core of such a system. We can both view and process digital nuclear medicine (conventional gamma camera) images, positron emission tomography (PET) and CT images on a single system. Images from our recently installed digital radiographic unit can be added. Our paper describes our present system, explains the rationale for its configuration, and describes the directions in which it will expand.
Hydrographic Basins Analysis Using Digital Terrain Modelling
NASA Astrophysics Data System (ADS)
Mihaela, Pişleagă; -Minda Codruţa, Bădăluţă; Gabriel, Eleş; Daniela, Popescu
2017-10-01
The paper, emphasis the link between digital terrain modelling and studies of hydrographic basins, concerning the hydrological processes analysis. Given the evolution of computing techniques but also of the software digital terrain modelling made its presence felt increasingly, and established itself as a basic concept in many areas, due to many advantages. At present, most digital terrain modelling is derived from three alternative sources such as ground surveys, photogrammetric data capture or from digitized cartographic sources. A wide range of features may be extracted from digital terrain models, such as surface, specific points and landmarks, linear features but also areal futures like drainage basins, hills or hydrological basins. The paper highlights how the use appropriate software for the preparation of a digital terrain model, a model which is subsequently used to study hydrographic basins according to various geomorphological parameters. As a final goal, it shows the link between digital terrain modelling and hydrographic basins study that can be used to optimize the correlation between digital model terrain and hydrological processes in order to obtain results as close to the real field processes.
The evaluation of a novel haptic-enabled virtual reality approach for computer-aided cephalometry.
Medellín-Castillo, H I; Govea-Valladares, E H; Pérez-Guerrero, C N; Gil-Valladares, J; Lim, Theodore; Ritchie, James M
2016-07-01
In oral and maxillofacial surgery, conventional radiographic cephalometry is one of the standard auxiliary tools for diagnosis and surgical planning. While contemporary computer-assisted cephalometric systems and methodologies support cephalometric analysis, they tend neither to be practical nor intuitive for practitioners. This is particularly the case for 3D methods since the associated landmarking process is difficult and time consuming. In addition to this, there are no 3D cephalometry norms or standards defined; therefore new landmark selection methods are required which will help facilitate their establishment. This paper presents and evaluates a novel haptic-enabled landmarking approach to overcome some of the difficulties and disadvantages of the current landmarking processes used in 2D and 3D cephalometry. In order to evaluate this new system's feasibility and performance, 21 dental surgeons (comprising 7 Novices, 7 Semi-experts and 7 Experts) performed a range of case studies using a haptic-enabled 2D, 2½D and 3D digital cephalometric analyses. The results compared the 2D, 2½D and 3D cephalometric values, errors and standard deviations for each case study and associated group of participants and revealed that 3D cephalometry significantly reduced landmarking errors and variability compared to 2D methods. Through enhancing the process by providing a sense of touch, the haptic-enabled 3D digital cephalometric approach was found to be feasible and more intuitive than its counterparts as well effective at reducing errors, the variability of the measurements taken and associated task completion times. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Optimization of image processing algorithms on mobile platforms
NASA Astrophysics Data System (ADS)
Poudel, Pramod; Shirvaikar, Mukul
2011-03-01
This work presents a technique to optimize popular image processing algorithms on mobile platforms such as cell phones, net-books and personal digital assistants (PDAs). The increasing demand for video applications like context-aware computing on mobile embedded systems requires the use of computationally intensive image processing algorithms. The system engineer has a mandate to optimize them so as to meet real-time deadlines. A methodology to take advantage of the asymmetric dual-core processor, which includes an ARM and a DSP core supported by shared memory, is presented with implementation details. The target platform chosen is the popular OMAP 3530 processor for embedded media systems. It has an asymmetric dual-core architecture with an ARM Cortex-A8 and a TMS320C64x Digital Signal Processor (DSP). The development platform was the BeagleBoard with 256 MB of NAND RAM and 256 MB SDRAM memory. The basic image correlation algorithm is chosen for benchmarking as it finds widespread application for various template matching tasks such as face-recognition. The basic algorithm prototypes conform to OpenCV, a popular computer vision library. OpenCV algorithms can be easily ported to the ARM core which runs a popular operating system such as Linux or Windows CE. However, the DSP is architecturally more efficient at handling DFT algorithms. The algorithms are tested on a variety of images and performance results are presented measuring the speedup obtained due to dual-core implementation. A major advantage of this approach is that it allows the ARM processor to perform important real-time tasks, while the DSP addresses performance-hungry algorithms.
Gratton, David G; Kwon, So Ran; Blanchette, Derek; Aquilino, Steven A
2016-01-01
The aim of this study was to evaluate the effect of digital tooth preparation imaging and evaluation technology on dental students' technical abilities, self-evaluation skills, and the assessment of their simulated clinical work. A total of 80 second-year students at one U.S. dental school were assigned to one of three groups: control (n=40), E4D Compare (n=20), and Sirona prepCheck (n=20). Students in the control group were taught by traditional teaching methodologies, and the technology-assisted groups received both traditional training and supplementary feedback from the corresponding digital system. Three outcomes were measured: faculty technical score, self-evaluation score, and E4D Compare scores at 0.30 mm tolerance. Correlations were determined between the groups' scores from visual assessment and self-evaluation and between the visual assessment and digital scores. The results showed that the visual assessment and self-evaluation scores did not differ among groups (p>0.05). Overall, correlations between visual and digital assessment scores were modest though statistically significant (5% level of significance). These results suggest that the use of digital tooth preparation evaluation technology did not impact the students' prosthodontic technical and self-evaluation skills. Visual scores given by faculty and digital assessment scores correlated moderately in only two instances.
The digital storytelling process: A comparative analysis from various experts
NASA Astrophysics Data System (ADS)
Hussain, Hashiroh; Shiratuddin, Norshuhada
2016-08-01
Digital Storytelling (DST) is a method of delivering information to the audience. It combines narrative and digital media content infused with the multimedia elements. In order for the educators (i.e the designers) to create a compelling digital story, there are sets of processes introduced by experts. Nevertheless, the experts suggest varieties of processes to guide them; of which some are redundant. The main aim of this study is to propose a single guide process for the creation of DST. A comparative analysis is employed where ten DST models from various experts are analysed. The process can also be implemented in other multimedia materials that used the concept of DST.
Paim, Crislaine Pires Padilha; Goldmeier, Silvia
2017-01-10
Existing research suggests that digital games can be used effectively for educational purposes at any level of training. Perioperative nursing educators can use games to complement curricula, in guidance and staff development programs, to foster team collaboration, and to give support to critical thinking in nursing practice because it is a complex environment. To describe the process of developing an educational game to set up surgical instruments on the Mayo stand or back table as a resource to assist the instructor in surgical instrumentation training for students and nursing health professionals in continued education. The study was characterized by applied research in production technology. It included the phases of analysis and design, development, and evaluation. The objectives of the educational game were developed through Bloom's taxonomy. Parallel to the physical development of the educational game, a proposed model for the use of digital elements in educational game activities was applied to develop the game content. The development of the game called "Playing with Tweezers" was carried out in 3 phases and was evaluated by 15 participants, comprising students and professional experts in various areas of knowledge such as nursing, information technology, and education. An environment was created with an initial screen, menu buttons containing the rules of the game, and virtual tour modes for learning and assessment. The "digital" nursing student needs engagement, stimulation, reality, and entertainment, not just readings. "Playing with Tweezers" is an example of educational gaming as an innovative teaching strategy in nursing that encourages the strategy of involving the use of educational games to support theoretical or practical classroom teaching. Thus, the teacher does not work with only 1 type of teaching methodology, but with a combination of different methodologies. In addition, we cannot forget that skill training in an educational game does not replace curricular practice, but helps. ©Crislaine Pires Padilha Paim, Silvia Goldmeier. Originally published in JMIR Serious Games (http://games.jmir.org), 10.01.2017.
RTDS implementation of an improved sliding mode based inverter controller for PV system.
Islam, Gazi; Muyeen, S M; Al-Durra, Ahmed; Hasanien, Hany M
2016-05-01
This paper proposes a novel approach for testing dynamics and control aspects of a large scale photovoltaic (PV) system in real time along with resolving design hindrances of controller parameters using Real Time Digital Simulator (RTDS). In general, the harmonic profile of a fast controller has wide distribution due to the large bandwidth of the controller. The major contribution of this paper is that the proposed control strategy gives an improved voltage harmonic profile and distribute it more around the switching frequency along with fast transient response; filter design, thus, becomes easier. The implementation of a control strategy with high bandwidth in small time steps of Real Time Digital Simulator (RTDS) is not straight forward. This paper shows a good methodology for the practitioners to implement such control scheme in RTDS. As a part of the industrial process, the controller parameters are optimized using particle swarm optimization (PSO) technique to improve the low voltage ride through (LVRT) performance under network disturbance. The response surface methodology (RSM) is well adapted to build analytical models for recovery time (Rt), maximum percentage overshoot (MPOS), settling time (Ts), and steady state error (Ess) of the voltage profile immediate after inverter under disturbance. A systematic approach of controller parameter optimization is detailed. The transient performance of the PSO based optimization method applied to the proposed sliding mode controlled PV inverter is compared with the results from genetic algorithm (GA) based optimization technique. The reported real time implementation challenges and controller optimization procedure are applicable to other control applications in the field of renewable and distributed generation systems. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
How well does multiple OCR error correction generalize?
NASA Astrophysics Data System (ADS)
Lund, William B.; Ringger, Eric K.; Walker, Daniel D.
2013-12-01
As the digitization of historical documents, such as newspapers, becomes more common, the need of the archive patron for accurate digital text from those documents increases. Building on our earlier work, the contributions of this paper are: 1. in demonstrating the applicability of novel methods for correcting optical character recognition (OCR) on disparate data sets, including a new synthetic training set, 2. enhancing the correction algorithm with novel features, and 3. assessing the data requirements of the correction learning method. First, we correct errors using conditional random fields (CRF) trained on synthetic training data sets in order to demonstrate the applicability of the methodology to unrelated test sets. Second, we show the strength of lexical features from the training sets on two unrelated test sets, yielding a relative reduction in word error rate on the test sets of 6.52%. New features capture the recurrence of hypothesis tokens and yield an additional relative reduction in WER of 2.30%. Further, we show that only 2.0% of the full training corpus of over 500,000 feature cases is needed to achieve correction results comparable to those using the entire training corpus, effectively reducing both the complexity of the training process and the learned correction model.
Experimental flights using a small unmanned aircraft system for mapping emergent sandbars
Kinzel, Paul J.; Bauer, Mark A.; Feller, Mark R.; Holmquist-Johnson, Christopher; Preston, Todd
2015-01-01
The US Geological Survey and Parallel Inc. conducted experimental flights with the Tarantula Hawk (T-Hawk) unmanned aircraft system (UAS ) at the Dyer and Cottonwood Ranch properties located along reaches of the Platte River near Overton, Nebraska, in July 2013. We equipped the T-Hawk UAS platform with a consumer-grade digital camera to collect imagery of emergent sandbars in the reaches and used photogrammetric software and surveyed control points to generate orthophotographs and digital elevation models (DEMS ) of the reaches. To optimize the image alignment process, we retained and/or eliminated tie points based on their relative errors and spatial resolution, whereby minimizing the total error in the project. Additionally, we collected seven transects that traversed emergent sandbars concurrently with global positioning system location data to evaluate the accuracy of the UAS survey methodology. The root mean square errors for the elevation of emergent points along each transect across the DEMS ranged from 0.04 to 0.12 m. If adequate survey control is established, a UAS combined with photogrammetry software shows promise for accurate monitoring of emergent sandbar morphology and river management activities in short (1–2 km) river reaches.
NASA Astrophysics Data System (ADS)
Azevedo, Isabel; Richardson, Martin; Bernardo, Luis Miguel
2012-03-01
The speed at which our world is changing is reflected in the shifting way artistic images are created and produced. Holography can be used as a medium to express the perception of space with light and colour and to make the material and the immaterial experiments with optical and digital holography. This paper intends to be a reflection on the final product of that process surrounding a debate of ideas for new experimental methodologies applied to holographic images. Holography is a time-based medium and the irretrievable linear flow of time is responsible for a drama, unique to traditional cinematography. If the viewers move to left or right, they see glimpses of the next scene or the previous one perceived a second ago. This interaction of synthetic space arises questions such as: can we see, in "reality", two forms in the same space? Trying to answer this question, a series of works has been created. These concepts are embryonic to a series of digital art holograms and lenticulars technique's titled "Across Light: Through Colour". They required some technical research and comparison between effects from different camera types, using Canon IS3 and Sony HDR CX105.
Fuzzy set methods for object recognition in space applications
NASA Technical Reports Server (NTRS)
Keller, James M.
1991-01-01
Progress on the following tasks is reported: (1) fuzzy set-based decision making methodologies; (2) feature calculation; (3) clustering for curve and surface fitting; and (4) acquisition of images. The general structure for networks based on fuzzy set connectives which are being used for information fusion and decision making in space applications is described. The structure and training techniques for such networks consisting of generalized means and gamma-operators are described. The use of other hybrid operators in multicriteria decision making is currently being examined. Numerous classical features on image regions such as gray level statistics, edge and curve primitives, texture measures from cooccurrance matrix, and size and shape parameters were implemented. Several fractal geometric features which may have a considerable impact on characterizing cluttered background, such as clouds, dense star patterns, or some planetary surfaces, were used. A new approach to a fuzzy C-shell algorithm is addressed. NASA personnel are in the process of acquiring suitable simulation data and hopefully videotaped actual shuttle imagery. Photographs have been digitized to use in the algorithms. Also, a model of the shuttle was assembled and a mechanism to orient this model in 3-D to digitize for experiments on pose estimation is being constructed.
A digital-receiver for the MurchisonWidefield Array
NASA Astrophysics Data System (ADS)
Prabu, Thiagaraj; Srivani, K. S.; Roshi, D. Anish; Kamini, P. A.; Madhavi, S.; Emrich, David; Crosse, Brian; Williams, Andrew J.; Waterson, Mark; Deshpande, Avinash A.; Shankar, N. Udaya; Subrahmanyan, Ravi; Briggs, Frank H.; Goeke, Robert F.; Tingay, Steven J.; Johnston-Hollitt, Melanie; R, Gopalakrishna M.; Morgan, Edward H.; Pathikulangara, Joseph; Bunton, John D.; Hampson, Grant; Williams, Christopher; Ord, Stephen M.; Wayth, Randall B.; Kumar, Deepak; Morales, Miguel F.; deSouza, Ludi; Kratzenberg, Eric; Pallot, D.; McWhirter, Russell; Hazelton, Bryna J.; Arcus, Wayne; Barnes, David G.; Bernardi, Gianni; Booler, T.; Bowman, Judd D.; Cappallo, Roger J.; Corey, Brian E.; Greenhill, Lincoln J.; Herne, David; Hewitt, Jacqueline N.; Kaplan, David L.; Kasper, Justin C.; Kincaid, Barton B.; Koenig, Ronald; Lonsdale, Colin J.; Lynch, Mervyn J.; Mitchell, Daniel A.; Oberoi, Divya; Remillard, Ronald A.; Rogers, Alan E.; Salah, Joseph E.; Sault, Robert J.; Stevens, Jamie B.; Tremblay, S.; Webster, Rachel L.; Whitney, Alan R.; Wyithe, Stuart B.
2015-03-01
An FPGA-based digital-receiver has been developed for a low-frequency imaging radio interferometer, the Murchison Widefield Array (MWA). The MWA, located at the Murchison Radio-astronomy Observatory (MRO) in Western Australia, consists of 128 dual-polarized aperture-array elements (tiles) operating between 80 and 300 MHz, with a total processed bandwidth of 30.72 MHz for each polarization. Radio-frequency signals from the tiles are amplified and band limited using analog signal conditioning units; sampled and channelized by digital-receivers. The signals from eight tiles are processed by a single digital-receiver, thus requiring 16 digital-receivers for the MWA. The main function of the digital-receivers is to digitize the broad-band signals from each tile, channelize them to form the sky-band, and transport it through optical fibers to a centrally located correlator for further processing. The digital-receiver firmware also implements functions to measure the signal power, perform power equalization across the band, detect interference-like events, and invoke diagnostic modes. The digital-receiver is controlled by high-level programs running on a single-board-computer. This paper presents the digital-receiver design, implementation, current status, and plans for future enhancements.
Development of a compact and cost effective multi-input digital signal processing system
NASA Astrophysics Data System (ADS)
Darvish-Molla, Sahar; Chin, Kenrick; Prestwich, William V.; Byun, Soo Hyun
2018-01-01
A prototype digital signal processing system (DSP) was developed using a microcontroller interfaced with a 12-bit sampling ADC, which offers a considerably inexpensive solution for processing multiple detectors with high throughput. After digitization of the incoming pulses, in order to maximize the output counting rate, a simple algorithm was employed for pulse height analysis. Moreover, an algorithm aiming at the real-time pulse pile-up deconvolution was implemented. The system was tested using a NaI(Tl) detector in comparison with a traditional analogue and commercial digital systems for a variety of count rates. The performance of the prototype system was consistently superior to the analogue and the commercial digital systems up to the input count rate of 61 kcps while was slightly inferior to the commercial digital system but still superior to the analogue system in the higher input rates. Considering overall cost, size and flexibility, this custom made multi-input digital signal processing system (MMI-DSP) was the best reliable choice for the purpose of the 2D microdosimetric data collection, or for any measurement in which simultaneous multi-data collection is required.
ACTS Satellite Telemammography Network Experiments
NASA Technical Reports Server (NTRS)
Kachmar, Brian A.; Kerczewski, Robert J.
2000-01-01
The Satellite Networks and Architectures Branch of NASA's Glenn Research Center has developed and demonstrated several advanced satellite communications technologies through the Advanced Communications Technology Satellite (ACTS) program. One of these technologies is the implementation of a Satellite Telemammography Network (STN) encompassing NASA Glenn, the Cleveland Clinic Foundation. the University of Virginia, and the Ashtabula County Medical Center. This paper will present a look at the STN from its beginnings to the impact it may have on future telemedicine applications. Results obtained using the experimental ACTS satellite demonstrate the feasibility of Satellite Telemammography. These results have improved teleradiology processes and mammography image manipulation, and enabled advances in remote screening methodologies. Future implementation of satellite telemammography using next generation commercial satellite networks will be explored. In addition, the technical aspects of the project will be discussed, in particular how the project has evolved from using NASA developed hardware and software to commercial off the shelf (COTS) products. Development of asymmetrical link technologies was an outcome of this work. Improvements in the display of digital mammographic images, better understanding of end-to-end system requirements, and advances in radiological image compression were achieved as a result of the research. Finally, rigorous clinical medical studies are required for new technologies such as digital satellite telemammography to gain acceptance in the medical establishment. These experiments produced data that were useful in two key medical studies that addressed the diagnostic accuracy of compressed satellite transmitted digital mammography images. The results of these studies will also be discussed.
Graduate Entrepreneurs: Intentions, Barriers and Solutions
ERIC Educational Resources Information Center
Smith, Kelly; Beasley, Martin
2011-01-01
Purpose: This paper aims to investigate the factors that influenced seven graduates in the creative and digital industries to start their own businesses in Barnsley, South Yorkshire, UK--an area with lack of employing establishments and locally registered businesses. Design/methodology/approach: Questionnaires and semi-structured interviews…
Stakeholder Perceptions of ICT Usage across Management Institutes
ERIC Educational Resources Information Center
Goyal, Ela; Purohit, Seema; Bhagat, Manju
2013-01-01
Information and communication technology (ICT) which includes radio, television and newer digital technology such as computers and the internet, are potentially powerful tools for extending educational opportunities, formal and non-formal, to one and all. It provides opportunities to deploy innovative teaching methodologies and interesting…
A comparison of methods for DPLL loop filter design
NASA Technical Reports Server (NTRS)
Aguirre, S.; Hurd, W. J.; Kumar, R.; Statman, J.
1986-01-01
Four design methodologies for loop filters for a class of digital phase-locked loops (DPLLs) are presented. The first design maps an optimum analog filter into the digital domain; the second approach designs a filter that minimizes in discrete time weighted combination of the variance of the phase error due to noise and the sum square of the deterministic phase error component; the third method uses Kalman filter estimation theory to design a filter composed of a least squares fading memory estimator and a predictor. The last design relies on classical theory, including rules for the design of compensators. Linear analysis is used throughout the article to compare different designs, and includes stability, steady state performance and transient behavior of the loops. Design methodology is not critical when the loop update rate can be made high relative to loop bandwidth, as the performance approaches that of continuous time. For low update rates, however, the miminization method is significantly superior to the other methods.
Coal resource assessments using coal availability and recoverability methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rohrbacher, T.J.
1997-12-01
The U.S. Geological Survey (USGS), in conjunction with state geological surveys and other federal agencies, has initiated a study and developed methodology to reassess the nation`s major coal resources. This study differs from previous coal resource assessments of the USGS, U.S. Bureau of Mines, and the Department of Energy`s Energy Information Administration, because this program: (1) Identifies and characterizes the coal beds and coal zones that will provide the bulk of the nation`s coal-derived energy during the first quarter of the twenty-first century; (2) organizes geologic, chemical, environmental, and geographic information in digital format and makes these data available tomore » the public through the Internet or other digital media, such as CD ROMs; (3) includes coal resource availability and coal recoverability analyses for selected areas; (4) provides economic assessments and coal recoverability analyses for selected areas; (5) provides methodology to perform socio-economic impact analysis related to coal mining in specific geographical areas as small as a county.« less
Resistance Curves in the Tensile and Compressive Longitudinal Failure of Composites
NASA Technical Reports Server (NTRS)
Camanho, Pedro P.; Catalanotti, Giuseppe; Davila, Carlos G.; Lopes, Claudio S.; Bessa, Miguel A.; Xavier, Jose C.
2010-01-01
This paper presents a new methodology to measure the crack resistance curves associated with fiber-dominated failure modes in polymer-matrix composites. These crack resistance curves not only characterize the fracture toughness of the material, but are also the basis for the identification of the parameters of the softening laws used in the analytical and numerical simulation of fracture in composite materials. The method proposed is based on the identification of the crack tip location by the use of Digital Image Correlation and the calculation of the J-integral directly from the test data using a simple expression derived for cross-ply composite laminates. It is shown that the results obtained using the proposed methodology yield crack resistance curves similar to those obtained using FEM-based methods in compact tension carbon-epoxy specimens. However, it is also shown that the Digital Image Correlation based technique can be used to extract crack resistance curves in compact compression tests for which FEM-based techniques are inadequate.
NASA Technical Reports Server (NTRS)
Heady, Joel; Pereira, J. Michael; Ruggeri, Charles R.; Bobula, George A.
2009-01-01
A test methodology currently employed for large engines was extended to quantify the ballistic containment capability of a small turboshaft engine compressor case. The approach involved impacting the inside of a compressor case with a compressor blade. A gas gun propelled the blade into the case at energy levels representative of failed compressor blades. The test target was a full compressor case. The aft flange was rigidly attached to a test stand and the forward flange was attached to a main frame to provide accurate boundary conditions. A window machined in the case allowed the projectile to pass through and impact the case wall from the inside with the orientation, direction and speed that would occur in a blade-out event. High-peed, digital-video cameras provided accurate velocity and orientation data. Calibrated cameras and digital image correlation software generated full field displacement and strain information at the back side of the impact point.
Effects of a cochlear implant simulation on immediate memory in normal-hearing adults
Burkholder, Rose A.; Pisoni, David B.; Svirsky, Mario A.
2012-01-01
This study assessed the effects of stimulus misidentification and memory processing errors on immediate memory span in 25 normal-hearing adults exposed to degraded auditory input simulating signals provided by a cochlear implant. The identification accuracy of degraded digits in isolation was measured before digit span testing. Forward and backward digit spans were shorter when digits were degraded than when they were normal. Participants’ normal digit spans and their accuracy in identifying isolated digits were used to predict digit spans in the degraded speech condition. The observed digit spans in degraded conditions did not differ significantly from predicted digit spans. This suggests that the decrease in memory span is related primarily to misidentification of digits rather than memory processing errors related to cognitive load. These findings provide complementary information to earlier research on auditory memory span of listeners exposed to degraded speech either experimentally or as a consequence of a hearing-impairment. PMID:16317807
Digital development of products with NX9 for academical areas
NASA Astrophysics Data System (ADS)
Goanta, A. M.
2015-11-01
International competitiveness forced the manufacturing enterprises to look for new ways to accelerate the development of digital products through innovation, global alliances and strategic partnerships. In an environment of global research and development of distributed geographically, all members of the joint teams made up of companies and universities need to access updated and accurate information about products created by any of the type employed, student, teacher. Current design processes involve more complex products consisting of elements of design created by multiple teams, disciplines and suppliers using independent CAD systems. Even when using a 3D CAD mature technology, many companies fail to significantly reduce losses in the process, improve product quality or product type to ensure successful innovations to market arouse interest. These challenges require a radical rethinking of the business model, which belongs to the field of design, which must be based on digital development of products based on integrated files. Through this work, the author has proposed to provide both synthesis and transformations brought news of the integrated NX [1, 2, 3] from Siemens PLM Software 9, following a news results detailed documentary study, and personal results obtained by applying the same version, the digital and integrated development of a product type device test beams. Based on educational license received for NX 9 was made a detailed study of the innovations made by this release, and the application of some of them went to graphical modelling and getting all the documentation of a test device bearing beams. Also, were synthesized in terms of methodology, the steps to take to obtain graphical documentation. The results consist of: 3D models of all parts and assembly 3D model of the three-dimensional constraints of all component parts and not least respectively all drawings and assembly drawing. The most important consequence of the paper is the obtaining of integrated files that can be subjected to further analysis type CAE / CAM / PDM software components by the same company. Additional advantages related files by the synthesis of integrated CAD / CAE / CAM / PDM.
Carvalho, Fabiola B; Gonçalves, Marcelo; Tanomaru-Filho, Mário
2007-04-01
The purpose of this study was to describe a new technique by using Adobe Photoshop CS (San Jose, CA) image-analysis software to evaluate the radiographic changes of chronic periapical lesions after root canal treatment by digital subtraction radiography. Thirteen upper anterior human teeth with pulp necrosis and radiographic image of chronic periapical lesion were endodontically treated and radiographed 0, 2, 4, and 6 months after root canal treatment by using a film holder. The radiographic films were automatically developed and digitized. The radiographic images taken 0, 2, 4, and 6 months after root canal therapy were submitted to digital subtraction in pairs (0 and 2 months, 2 and 4 months, and 4 and 6 months) choosing "image," "calculation," "subtract," and "new document" tools from Adobe Photoshop CS image-analysis software toolbar. The resulting images showed areas of periapical healing in all cases. According to this methodology, the healing or expansion of periapical lesions can be evaluated by means of digital subtraction radiography by using Adobe Photoshop CS software.
Status of emerging standards for removable computer storage media and related contributions of NIST
NASA Technical Reports Server (NTRS)
Podio, Fernando L.
1992-01-01
Standards for removable computer storage media are needed so that users may reliably interchange data both within and among various computer installations. Furthermore, media interchange standards support competition in industry and prevent sole-source lock-in. NIST participates in magnetic tape and optical disk standards development through Technical Committees X3B5, Digital Magnetic Tapes, X3B11, Optical Digital Data Disk, and the Joint Technical Commission on Data Permanence. NIST also participates in other relevant national and international standards committees for removable computer storage media. Industry standards for digital magnetic tapes require the use of Standard Reference Materials (SRM's) developed and maintained by NIST. In addition, NIST has been studying care and handling procedures required for digital magnetic tapes. NIST has developed a methodology for determining the life expectancy of optical disks. NIST is developing care and handling procedures for optical digital data disks and is involved in a program to investigate error reporting capabilities of optical disk drives. This presentation reflects the status of emerging magnetic tape and optical disk standards, as well as NIST's contributions in support of these standards.
NASA Astrophysics Data System (ADS)
Robbins, William L.; Conklin, James J.
1995-10-01
Medical images (angiography, CT, MRI, nuclear medicine, ultrasound, x ray) play an increasingly important role in the clinical development and regulatory review process for pharmaceuticals and medical devices. Since medical images are increasingly acquired and archived digitally, or are readily digitized from film, they can be visualized, processed and analyzed in a variety of ways using digital image processing and display technology. Moreover, with image-based data management and data visualization tools, medical images can be electronically organized and submitted to the U.S. Food and Drug Administration (FDA) for review. The collection, processing, analysis, archival, and submission of medical images in a digital format versus an analog (film-based) format presents both challenges and opportunities for the clinical and regulatory information management specialist. The medical imaging 'core laboratory' is an important resource for clinical trials and regulatory submissions involving medical imaging data. Use of digital imaging technology within a core laboratory can increase efficiency and decrease overall costs in the image data management and regulatory review process.
Exploring the Developmental Changes in Automatic Two-Digit Number Processing
ERIC Educational Resources Information Center
Chan, Winnie Wai Lan; Au, Terry K.; Tang, Joey
2011-01-01
Even when two-digit numbers are irrelevant to the task at hand, adults process them. Do children process numbers automatically, and if so, what kind of information is activated? In a novel dot-number Stroop task, children (Grades 1-5) and adults were shown two different two-digit numbers made up of dots. Participants were asked to select the…
Preliminary development of digital signal processing in microwave radiometers
NASA Technical Reports Server (NTRS)
Stanley, W. D.
1980-01-01
Topics covered include a number of closely related tasks: the development of several control loop and dynamic noise model computer programs for simulating microwave radiometer measurements; computer modeling of an existing stepped frequency radiometer in an effort to determine its optimum operational characteristics; investigation of the classical second order analog control loop to determine its ability to reduce the estimation error in a microwave radiometer; investigation of several digital signal processing unit designs; initiation of efforts to develop the hardware and software required for implementation of the digital signal processing unit; and investigation of the general characteristics and peculiarities of digital processing of noiselike microwave radiometer signals.
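As a rough illustration of the estimation problem the report addresses, the sketch below simulates a total-power measurement of a noiselike radiometer signal and the familiar reduction of estimation error with integration; it is a toy model, not the report's control-loop programs, and all parameters are assumed.

```python
# A minimal sketch: square-law detection and averaging of a noiselike
# signal, showing the radiometer-equation-style error ~ Tsys/sqrt(N).
import numpy as np

rng = np.random.default_rng(0)
t_sys = 300.0                      # system noise temperature, K (assumed)
n_samples = 100_000                # proxy for bandwidth x integration time

# Noiselike voltage whose power is proportional to t_sys
v = rng.normal(scale=np.sqrt(t_sys), size=n_samples)
estimate = np.mean(v**2)           # square-law detection + integration
sigma = t_sys * np.sqrt(2.0 / n_samples)  # expected estimation error
print(f"estimate = {estimate:.2f} K, expected sigma ~ {sigma:.2f} K")
```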
The documentation and reintegration of a lost past
NASA Astrophysics Data System (ADS)
Balletti, C.; Brussa, N.; Gottardi, C.; Guerra, F.
2014-05-01
The paper describes how new digital methodologies can be used within the field of Cultural Heritage, not only to document the actual state of an architecture but also to review the past transformations it has undergone, conserving and representing these histories as well. Over the last few years, the methodologies of acquisition and integrated representation for 3D patrimony documentation have developed and consolidated considerably: the possibilities of the digital realm can augment the understanding and the valorisation of a monument. The specific case offered in the present paper, the Scuola Vecchia della Misericordia in Venice, is a significant example. It raises not only the theme of the "no longer existing," regarding its façade, which has undergone evident modifications, but also the recontextualization of a number of decorative elements, such as the bas-reliefs which once marked the entrance and are today conserved in the Victoria and Albert Museum, London. The described experience shows how an integrated methodology, from high resolution laser scanning and photogrammetric survey to 3D modelling, can produce a reliable virtual diachronic reconstruction of a historical building from different sources. Geomatic tools combined with computer graphics provide a better understanding of building history through the use of historical documents, playing a paramount role in preserving and valorizing the cultural and environmental heritage.
NASA Astrophysics Data System (ADS)
Belfort, Benjamin; Weill, Sylvain; Lehmann, François
2017-04-01
A novel, non-invasive imaging technique that determines 2D maps of water content in unsaturated porous media is presented. This method directly relates digitally measured intensities to the water content of the porous medium. The method requires the classical image analysis steps, i.e., normalization, filtering, background subtraction, scaling and calibration. The main advantages of this approach are that no calibration experiment is needed and that no tracer or dye is injected into the flow tank. The procedure enables effective processing of a large number of photographs and thus produces 2D water content maps at high temporal resolution. A drainage/imbibition experiment in a 2D flow tank with inner dimensions of 40 cm x 14 cm x 6 cm (L x W x D) is carried out to validate the methodology. The accuracy of the proposed approach is assessed using numerical simulations with a state-of-the-art computational code that solves the Richards equation. Comparison of the cumulative mass leaving and entering the flow tank, and of the water content maps produced by the photographic measurement technique and the numerical simulations, demonstrates the efficiency and high accuracy of the proposed method for investigating vadose zone flow processes. Application examples in a larger flow tank with various boundary conditions are finally presented to illustrate the potential of the methodology.
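A minimal sketch of the image-processing chain named above (background subtraction, normalization against reference frames, scaling to water content) might look as follows; the linear intensity-to-saturation mapping, the residual and saturated water contents, and the use of dry/wet reference frames are assumptions for illustration, not the authors' exact procedure.

```python
# A minimal sketch, assuming a raw frame plus dark, fully drained and
# fully saturated reference frames of the same flow tank.
import numpy as np

def water_content_map(img, dark, dry, wet):
    """Map a greyscale photo (2D float array) to volumetric water content."""
    corrected = img - dark                       # background subtraction
    lo, hi = dry - dark, wet - dark              # per-pixel references
    denom = np.where(hi != lo, hi - lo, 1.0)     # guard against flat pixels
    s = np.clip((corrected - lo) / denom, 0.0, 1.0)   # effective saturation
    theta_r, theta_s = 0.05, 0.40                # residual/saturated theta (assumed)
    return theta_r + s * (theta_s - theta_r)
```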
Digital images in the map revision process
NASA Astrophysics Data System (ADS)
Newby, P. R. T.
Progress towards the adoption of digital (or softcopy) photogrammetric techniques for database and map revision is reviewed. Particular attention is given to the Ordnance Survey of Great Britain, the author's former employer, where digital processes are under investigation but have not yet been introduced for routine production. Developments which may lead to increasing automation of database update processes appear promising, but because of the cost and practical problems associated with managing as well as updating large digital databases, caution is advised when considering the transition to softcopy photogrammetry for revision tasks.
Digitalisierung - Management zwischen 0 und 1 (Digitization - Management between 0 and 1)
NASA Astrophysics Data System (ADS)
Friedrich, Stefan; Rachholz, Josef
2017-09-01
Digitization, as a process of expressing actions and values by the codes 0 and 1, has already become part of our lives. Digitization enables enterprises to improve production and sales and to increase production volume. However, no standard digitization strategy has yet been developed. Even in a digitized business process management system, the most important position remains with the human being. The improvement of software products, their availability, and the education system in the area of the introduction and use of information technology is thus a striking feature in the development of managing current (and other) processes.
Educational-research laboratory "electric circuits" on the base of digital technologies
NASA Astrophysics Data System (ADS)
Koroteyev, V. I.; Florentsev, V. V.; Florentseva, N. I.
2017-01-01
The problem of activating trainees' research activity in the educational-research laboratory "Electric Circuits" using innovative methodological solutions and digital technologies is considered. The main task is the creation of a unified experimental-research information-educational environment, "Electrical Engineering". The problems arising during the development and application of modern software and hardware, experimental and research stands, and digital control and measuring systems are presented. This paper presents the main stages of the development and creation of the educational-research laboratory "Electric Circuits" at the Department of Electrical Engineering of NRNU MEPhI. The authors also consider analogues of the described research complex offered by various educational institutions and companies. An analysis of their strengths and weaknesses, on which the advantages of the proposed solution are based, is also presented.
Determination of Local Densities in Accreted Ice Samples Using X-Rays and Digital Imaging
NASA Technical Reports Server (NTRS)
Broughton, Howard; Sims, James; Vargas, Mario
1996-01-01
At the NASA Lewis Research Center's Icing Research Tunnel, ice shapes similar to those that develop in in-flight icing conditions were formed on an airfoil. Under cold-room conditions these experimental samples were carefully removed from the airfoil, sliced into thin sections, and x-rayed. The resulting microradiographs were developed and the film digitized using a high resolution scanner to extract fine detail in the radiographs. A procedure was devised to calibrate the scanner and to maintain repeatability during the experiment. The techniques of image acquisition and analysis provide accurate local density measurements and reveal the internal characteristics of the accreted ice in greater detail. This paper discusses the methodology by which these samples were prepared, with emphasis on the digital imaging techniques.
Measuring Children’s Media Use in the Digital Age
Vandewater, Elizabeth A.; Lee, Sook-Jung
2009-01-01
In this new and rapidly changing era of digital technology, there is increasing consensus among media scholars that there is an urgent need to develop measurement approaches which more adequately capture media use. The overarching goal of this paper is to facilitate the development of measurement approaches appropriate for capturing children’s media use in the digital age. The paper outlines various approaches to measurement, focusing mainly on those which have figured prominently in major existing studies of children’s media use. We identify issues related to each technique, including advantages and disadvantages. We also include a review of existing empirical comparisons of various methodologies. The paper is intended to foster discussion of the best ways to further research and knowledge regarding the impact of media on children. PMID:19763246
Scott, Ian A; Sullivan, Clair; Staib, Andrew
2018-05-24
Objective In an era of rapid digitisation of Australian hospitals, practical guidance is needed on how to successfully implement electronic medical records (EMRs) as both a technical innovation and a major transformative change in clinical care. The aim of the present study was to develop a checklist that clearly and comprehensively defines the steps that best prepare hospitals for EMR implementation and digital transformation. Methods The checklist was developed using a formal methodological framework comprising: literature reviews of relevant issues; an interactive workshop involving a multidisciplinary group of digital leads from Queensland hospitals; a draft document based on literature and workshop proceedings; and review and feedback from senior clinical leads. Results The final checklist comprised 19 questions, 13 related to EMR implementation and six to digital transformation. Questions related to the former included organisational considerations (leadership, governance, change leaders, implementation plan), technical considerations (vendor choice, information technology and project management teams, system and hardware alignment with clinician workflows, interoperability with legacy systems) and training (user training, post-go-live contingency plans, roll-out sequence, staff support at point of care). Questions related to digital transformation included cultural considerations (clinically focused vision statement and communication strategy, readiness-for-change surveys), management of digital disruption syndromes and plans for further improvement in patient care (post-go-live optimisation of the digital system, quality and benefit evaluation, ongoing digital innovation). Conclusion This evidence-based, field-tested checklist provides guidance to hospitals planning EMR implementation and separates readiness for EMR from readiness for digital transformation. What is known about the topic? Many hospitals throughout Australia have implemented, or are planning to implement, hospital-wide electronic medical records (EMRs) with varying degrees of functionality. Few hospitals have implemented a complete end-to-end digital system with the ability to bring about major transformation in clinical care. Although the many challenges in implementing EMRs have been well documented, they have not been incorporated into an evidence-based, field-tested checklist that can practically assist hospitals in preparing for EMR implementation as both a technical innovation and a vehicle for major digital transformation of care. What does this paper add? This paper outlines a 19-question checklist that was developed using a formal methodological framework comprising a literature review of relevant issues; proceedings from an interactive workshop involving a multidisciplinary group of digital leads from hospitals throughout Queensland, including three hospitals undertaking EMR implementation and one hospital with a complete end-to-end EMR; and review of a draft checklist by senior clinical leads within a statewide digital healthcare improvement network. The checklist distinguishes between issues pertaining to EMR as a technical innovation and EMR as a vehicle for digital transformation of patient care. What are the implications for practitioners? Successful implementation of a hospital-wide EMR requires senior managers, clinical leads, information technology teams and project management teams to fully address key operational and strategic issues. Using an issues checklist may help prevent any one issue being inadvertently overlooked or underemphasised in the planning and implementation stages, and ensure the EMR is fully adopted and optimally used by clinician users in an ongoing digital transformation of care.
NASA Astrophysics Data System (ADS)
Chen, Liang-Chia; Lin, Grier C. I.
1997-12-01
A vision-driven automatic digitization process for free-form surface reconstruction has been developed for reverse engineering physical models, using a coordinate measuring machine (CMM) equipped with a touch-triggered probe and a CCD camera. The process integrates 3D stereo detection, data filtering, Delaunay triangulation, and adaptive surface digitization into a single surface-reconstruction process. By using this innovative approach, surface reconstruction can be implemented automatically and accurately. Least-squares B-spline surface models with controlled digitization accuracy can be generated for further application in product design and manufacturing processes. One industrial application indicates that this approach is feasible and that the processing time required in the reverse engineering process can be reduced by more than 85%.
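One stage of such a pipeline, Delaunay triangulation of digitized surface points, can be sketched in a few lines with SciPy; the synthetic random points below stand in for CMM probe data, and the stereo detection, filtering and B-spline fitting stages are omitted.

```python
# A minimal sketch: triangulate scattered (x, y) probe points so a
# surface mesh can be built over them. Point data are synthetic.
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(1)
pts = rng.uniform(0.0, 100.0, size=(200, 2))   # digitized surface points (mm)
tri = Delaunay(pts)
print(f"{len(pts)} points -> {len(tri.simplices)} triangles")
```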
Digital signal processor and processing method for GPS receivers
NASA Technical Reports Server (NTRS)
Thomas, Jr., Jess B. (Inventor)
1989-01-01
A digital signal processor and processing method therefor for use in receivers of the NAVSTAR/GLOBAL POSITIONING SYSTEM (GPS) employs a digital carrier down-converter, digital code correlator and digital tracking processor. The digital carrier down-converter and code correlator consist of an all-digital, minimum-bit implementation that utilizes digital chip and phase advancers, providing exceptional control and accuracy in feedback phase and in feedback delay. Roundoff and commensurability errors can be reduced to extremely small values (e.g., less than 100 nanochips and 100 nanocycles roundoff errors and 0.1 millichip and 1 millicycle commensurability errors). The digital tracking processor bases the fast feedback for phase and for group delay in the C/A, P1, and P2 channels on the L1 C/A carrier phase, thereby maintaining lock at lower signal-to-noise ratios, reducing errors in feedback delays, reducing the frequency of cycle slips and in some cases obviating the need for quadrature processing in the P channels. Simple and reliable methods are employed for data bit synchronization, data bit removal and cycle counting. Improved precision in averaged output delay values is provided by carrier-aided data-compression techniques. The signal processor employs purely digital operations in the sense that exactly the same carrier phase and group delay measurements are obtained, to the last decimal place, every time the same sampled data (i.e., exactly the same bits) are processed.
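The core of a digital carrier down-converter and code correlator can be illustrated as below; the sample rate, intermediate frequency and the random ±1 chip stream standing in for a C/A code are assumptions, and the patented chip/phase-advancer feedback structure is not modeled.

```python
# A minimal sketch: mix a received signal to baseband with a digital
# carrier replica (NCO), then correlate against the code replica.
import numpy as np

fs, f_if = 5.0e6, 1.25e6           # sample rate and IF, Hz (assumed)
n = 5000
t = np.arange(n) / fs
rng = np.random.default_rng(2)
code = rng.choice([-1.0, 1.0], size=n)          # stand-in PRN chip stream

received = code * np.cos(2 * np.pi * f_if * t)  # noiseless received signal
nco = np.exp(-2j * np.pi * f_if * t)            # digital carrier replica
baseband = received * nco                       # carrier down-conversion
prompt = np.vdot(code, baseband) / n            # prompt code correlation
print(f"prompt correlation magnitude = {abs(prompt):.3f}")  # ~0.5 here
```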
Marin, Diego; Gegundez-Arias, Manuel E; Suero, Angel; Bravo, Jose M
2015-02-01
Development of automatic retinal disease diagnosis systems based on retinal image computer analysis can provide remarkably quicker screening programs for early detection. Such systems are mainly focused on the detection of the earliest ophthalmic signs of illness and require previous identification of fundal landmark features such as optic disc (OD), fovea or blood vessels. A methodology for accurate center-position location and OD retinal region segmentation on digital fundus images is presented in this paper. The methodology performs a set of iterative opening-closing morphological operations on the original retinography intensity channel to produce a bright region-enhanced image. Taking blood vessel confluence at the OD into account, a 2-step automatic thresholding procedure is then applied to obtain a reduced region of interest, where the center and the OD pixel region are finally obtained by performing the circular Hough transform on a set of OD boundary candidates generated through the application of the Prewitt edge detector. The methodology was evaluated on 1200 and 1748 fundus images from the publicly available MESSIDOR and MESSIDOR-2 databases, acquired from diabetic patients and thus being clinical cases of interest within the framework of automated diagnosis of retinal diseases associated to diabetes mellitus. This methodology proved highly accurate in OD-center location: average Euclidean distance between the methodology-provided and actual OD-center position was 6.08, 9.22 and 9.72 pixels for retinas of 910, 1380 and 1455 pixels in size, respectively. On the other hand, OD segmentation evaluation was performed in terms of Jaccard and Dice coefficients, as well as the mean average distance between estimated and actual OD boundaries. Comparison with the results reported by other reviewed OD segmentation methodologies shows our proposal renders better overall performance. Its effectiveness and robustness make this proposed automated OD location and segmentation method a suitable tool to be integrated into a complete prescreening system for early diagnosis of retinal diseases. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
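The main stages of the pipeline (morphological bright-region enhancement, thresholding, edge detection, circular Hough transform) can be sketched with scikit-image as below; the structuring-element sizes, percentile threshold and radius range are illustrative, and the paper's iterative opening-closing schedule and vessel-confluence ROI step are simplified away.

```python
# A minimal sketch of OD localization, not the authors' implementation.
import numpy as np
from skimage.filters import prewitt
from skimage.morphology import closing, opening, disk
from skimage.transform import hough_circle, hough_circle_peaks

def locate_od(intensity):
    """intensity: 2D float array (retinography intensity channel)."""
    enhanced = closing(opening(intensity, disk(5)), disk(5))  # bright-region smoothing
    bright = enhanced > np.percentile(enhanced, 99)           # crude ROI threshold
    edges = prewitt(bright.astype(float)) > 0                 # OD boundary candidates
    radii = np.arange(30, 60, 5)                              # plausible OD radii (px)
    accum = hough_circle(edges, radii)
    _, cx, cy, r = hough_circle_peaks(accum, radii, total_num_peaks=1)
    return cx[0], cy[0], r[0]                                 # center column, row, radius
```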
A system for automatic evaluation of simulation software
NASA Technical Reports Server (NTRS)
Ryan, J. P.; Hodges, B. C.
1976-01-01
Within the field of computer software, simulation and verification are complementary processes. Simulation methods can be used to verify software by performing variable range analysis. More general verification procedures, such as those described in this paper, can be implicitly viewed as attempts at modeling the end-product software. From the standpoint of software requirements methodology, each component of the verification system has some element of simulation in it. Conversely, general verification procedures can be used to analyze simulation software. A dynamic analyzer is described which can be used to obtain properly scaled variables for an analog simulation, which is first digitally simulated. In a similar way, it is thought that the other system components, and indeed the whole system itself, have the potential of being used effectively in a simulation environment.
NASA Astrophysics Data System (ADS)
Guler Yigitoglu, Askin
In the context of long operation of nuclear power plants (NPPs) (i.e., 60-80 years, and beyond), investigation of the aging of passive systems, structures and components (SSCs) is important to assess safety margins and to decide on reactor life extension, as indicated within the U.S. Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) Program. In the traditional probabilistic risk assessment (PRA) methodology, evaluating the potential significance of aging of passive SSCs on plant risk is challenging. Although passive SSC failure rates can be added as initiating event frequencies or basic event failure rates in the traditional event-tree/fault-tree methodology, these failure rates are generally based on generic plant failure data, which means that aging effects on the true state of a specific plant are not reflected in a realistic manner. Dynamic PRA methodologies have gained attention recently due to their capability to account for the plant state and thus address the difficulties in the traditional PRA modeling of aging effects of passive components using physics-based models (and also in the modeling of digital instrumentation and control systems). Physics-based models can capture the impact of complex aging processes (e.g., fatigue, stress corrosion cracking, flow-accelerated corrosion, etc.) on SSCs and can be utilized to estimate passive SSC failure rates using realistic NPP data from reactor simulation, as well as considering the effects of surveillance and maintenance activities. The objectives of this dissertation are twofold: the development of a methodology for the incorporation of aging modeling of passive SSCs into a reactor simulation environment, to provide a framework for evaluating their risk contribution in both dynamic and traditional PRA; and the demonstration of the methodology through its application to pressurizer surge line pipe welds and steam generator tubes in commercial nuclear power plants. In the proposed methodology, a multi-state physics-based model is selected to represent the aging process. The model is modified via a sojourn-time approach to reflect the dependence of the transition rates on operational and maintenance history. Thermal-hydraulic parameters of the model are calculated via the reactor simulation environment, and uncertainties associated with both the parameters and the models are assessed via a two-loop Monte Carlo approach (Latin hypercube sampling) to propagate input probability distributions through the physical model. The effort documented in this thesis towards this overall objective consists of: i) defining a process for selecting critical passive components and related aging mechanisms, ii) aging model selection, iii) calculating the probability that aging would cause the component to fail, iv) uncertainty/sensitivity analyses, v) procedure development for modifying an existing PRA to accommodate consideration of passive component failures, and vi) including the calculated failure probability in the modified PRA. The proposed methodology is applied to pressurizer surge line pipe weld aging and steam generator tube degradation in pressurized water reactors.
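The two-loop (nested) Monte Carlo idea with Latin hypercube sampling can be sketched as follows: the outer loop draws epistemic parameter sets by LHS, the inner loop samples aleatory variability, and each outer draw yields a conditional failure probability. The toy limit-state model and all distributions below are invented for illustration and are not the dissertation's physics model.

```python
# A minimal sketch of two-loop Monte Carlo with LHS over epistemic
# parameters; the crack-growth "model" is a toy stand-in.
import numpy as np
from scipy.stats import qmc

rng = np.random.default_rng(3)
n_outer, n_inner = 100, 1000

sampler = qmc.LatinHypercube(d=2, seed=3)
u = sampler.random(n_outer)
# Epistemic parameters: growth rate in [1e-3, 5e-3], threshold in [1.0, 2.0]
params = qmc.scale(u, [1e-3, 1.0], [5e-3, 2.0])

p_fail = np.empty(n_outer)
for i, (rate, threshold) in enumerate(params):
    # Aleatory inner loop: lognormal exposure over a fixed service time
    crack = rate * rng.lognormal(mean=6.0, sigma=0.3, size=n_inner)
    p_fail[i] = np.mean(crack > threshold)     # conditional failure probability

print(f"mean failure probability = {p_fail.mean():.4f} "
      f"(5th-95th pct: {np.percentile(p_fail, 5):.4f}-{np.percentile(p_fail, 95):.4f})")
```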
Digital Learning Characteristics and Principles of Information Resources Knowledge Structuring
ERIC Educational Resources Information Center
Belichenko, Margarita; Davidovitch, Nitza; Kravchenko, Yuri
2017-01-01
Analysis of the principles of knowledge representation in information systems has led to the necessity of improving the structuring of knowledge. This is driven by the development of software components and the new possibilities of information technologies. The article combines methodological aspects of structuring knowledge and effective usage of information…
DELIVERing Library Resources to the Virtual Learning Environment
ERIC Educational Resources Information Center
Secker, Jane
2005-01-01
Purpose: Examines a project to integrate digital libraries and virtual learning environments (VLE) focusing on requirements for online reading list systems. Design/methodology/approach: Conducted a user needs analysis using interviews and focus groups and evaluated three reading or resource list management systems. Findings: Provides a technical…
Collaboration in Cultural Heritage Digitisation in East Asia
ERIC Educational Resources Information Center
Lee, Hyuk-Jin
2010-01-01
Purpose: The purpose of this paper is to review the current status of collaboration in cultural heritage preservation in East Asia, including digital projects, and to suggest practical improvements based on a cultural structuralism perspective. Design/methodology/approach: Through exploratory research, the paper addresses aspects for successful…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-29
... specific areas in a relatively fast and accurate way that may be used to estimate and update Section 8 Fair... survey methodologies to collect gross rent data for specific areas in a relatively fast and accurate way...
Programmable rate modem utilizing digital signal processing techniques
NASA Technical Reports Server (NTRS)
Naveh, Arad
1992-01-01
The need for a Programmable Rate Digital Satellite Modem capable of supporting both burst and continuous transmission modes with either Binary Phase Shift Keying (BPSK) or Quadrature Phase Shift Keying (QPSK) modulation is discussed. The preferred implementation technique is an all-digital one that utilizes as much digital signal processing (DSP) as possible. The design trade-offs in each portion of the modulator and demodulator subsystem are outlined.
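As an illustration of the all-digital approach, the sketch below implements a bare-bones QPSK modulator in NumPy: Gray-coded symbol mapping, rectangular pulse shaping and mixing onto a digital IF. The rates and IF are assumed, and pulse-shaping filters, burst preambles and the demodulator are omitted.

```python
# A minimal sketch of an all-digital QPSK modulator; all rates assumed.
import numpy as np

fs, f_if, sym_rate = 48_000, 6_000, 2_400      # Hz (assumed)
sps = fs // sym_rate                           # samples per symbol

rng = np.random.default_rng(4)
bits = rng.integers(0, 2, size=200)
# Gray-coded QPSK: (b0, b1) -> (I, Q) in {-1, +1}/sqrt(2)
i_sym = (1 - 2 * bits[0::2]) / np.sqrt(2)
q_sym = (1 - 2 * bits[1::2]) / np.sqrt(2)
symbols = (i_sym + 1j * q_sym).repeat(sps)     # rectangular pulse shaping

t = np.arange(symbols.size) / fs
passband = np.real(symbols * np.exp(2j * np.pi * f_if * t))  # digital IF mixing
print(f"{bits.size} bits -> {passband.size} samples at {fs} Hz")
```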
Data reduction complex analog-to-digital data processing requirements for onsite test facilities
NASA Technical Reports Server (NTRS)
Debbrecht, J. D.
1976-01-01
The analog-to-digital processing requirements of onsite test facilities are described. The source and medium of all input data to the Data Reduction Complex (DRC) and the destination and medium of all output products of the analog-to-digital processing are identified. Additionally, preliminary input and output data formats are presented along with the planned use of the output products.
NASA Astrophysics Data System (ADS)
Schweier, C.; Markus, M.; Steinle, E.
2004-04-01
Catastrophic events like strong earthquakes can cause great losses of life and economic value. An increase in the efficiency of reconnaissance techniques could help to reduce the loss of life, as many victims die after, and not during, the event. A basic prerequisite for improving the rescue teams' work is improved planning of the measures, which can only be done on the basis of reliable and detailed information about the actual situation in the affected regions. Therefore, a bundle of projects at Karlsruhe University aims at the development of a tool for fast information retrieval after strong earthquakes. The focus is on urban areas, where most of the losses occur. In this paper an approach for damage analysis of buildings is presented. It consists of an automatic methodology to model buildings in three dimensions, a comparison of pre- and post-event models to detect changes, and a subsequent classification of the changes into damage types. The process is based on information extraction from airborne laserscanning data, i.e., digital surface models (DSM) acquired by scanning an area with pulsed laser light. To date, no laserscanning-derived DSMs of areas that suffered earthquake damage are available to the authors. Therefore, it was necessary to simulate such data for the development of the damage detection methodology. In this paper two different methodologies used for simulating the data are presented. The first method is to create CAD models of undamaged buildings based on their construction plans and alter them artificially as if they had suffered serious damage. A laserscanning data set is then simulated from these models, which can be compared with real laserscanning data acquired of the buildings (in their intact state). The other approach is to use measurements of actually damaged buildings and simulate their intact state. The geometrical structure of these damaged buildings can be modelled from digital photography taken after the event, by evaluating the images with photogrammetric methods. The intact state of the buildings is simulated based on on-site investigations, and finally laserscanning data are simulated for both states.
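The central comparison step, differencing pre- and post-event digital surface models and flagging large height losses, can be sketched as below; the grids, the simulated collapse and the 2 m threshold are invented for illustration, and the paper's 3D building modelling and damage-type classification are far more elaborate.

```python
# A minimal sketch of DSM change detection on synthetic grids.
import numpy as np

rng = np.random.default_rng(5)
pre = 10.0 + rng.normal(0.0, 0.1, size=(100, 100))   # pre-event DSM (m)
post = pre.copy()
post[40:60, 40:60] -= 6.0                            # simulated pancake collapse

height_loss = pre - post
damaged = height_loss > 2.0                          # change-detection threshold (m)
print(f"flagged cells: {damaged.sum()} ({100 * damaged.mean():.1f}% of area)")
```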
NASA Technical Reports Server (NTRS)
Perez, Reinaldo J.
2011-01-01
Single event transients (SETs) in analog and digital electronics, caused by highly energetic nuclear particles in space, can disrupt, temporarily or sometimes permanently, the functionality and performance of electronics in space vehicles. This work first provides some insights into the modeling of SETs in electronic circuits that can be used in SPICE-like simulators. It then presents methodologies, one of which was developed by the author, for the assessment of SETs at different levels of integration in electronics, from the circuit level to the subsystem level.
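For SPICE-like simulation, an SET is commonly injected as a double-exponential current pulse; the sketch below shows that standard textbook form (not necessarily the author's own methodology), with the deposited charge and time constants chosen only for illustration.

```python
# A minimal sketch of the double-exponential SET current model.
import numpy as np

def set_current(t, q=0.5e-12, tau_rise=5e-12, tau_fall=200e-12):
    """Double-exponential SET current (A) for deposited charge q (C)."""
    norm = q / (tau_fall - tau_rise)       # makes the pulse integrate to q
    return norm * (np.exp(-t / tau_fall) - np.exp(-t / tau_rise))

t = np.linspace(0.0, 2e-9, 2001)
i = set_current(t)
dt = t[1] - t[0]
print(f"peak current = {i.max() * 1e3:.2f} mA, "
      f"collected charge = {i.sum() * dt * 1e12:.3f} pC")
```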
Current Barriers to Large-scale Interoperability of Traceability Technology in the Seafood Sector.
Hardt, Marah J; Flett, Keith; Howell, Colleen J
2017-08-01
Interoperability is a critical component of full-chain digital traceability, but is almost nonexistent in the seafood industry. Using both quantitative and qualitative methodology, this study explores the barriers impeding progress toward large-scale interoperability among digital traceability systems in the seafood sector from the perspectives of seafood companies, technology vendors, and supply chains as a whole. We highlight lessons from recent research and field work focused on implementing traceability across full supply chains and make some recommendations for next steps in terms of overcoming challenges and scaling current efforts. © 2017 Institute of Food Technologists®.
Digital-flutter-suppression-system investigations for the active flexible wing wind-tunnel model
NASA Technical Reports Server (NTRS)
Perry, Boyd, III; Mukhopadhyay, Vivek; Hoadley, Sherwood T.; Cole, Stanley R.; Buttrill, Carey S.; Houck, Jacob A.
1990-01-01
Active flutter suppression control laws were designed, implemented, and tested on an aeroelastically-scaled wind tunnel model in the NASA Langley Transonic Dynamics Tunnel. One of the control laws was successful in stabilizing the model while the dynamic pressure was increased to 24 percent greater than the measured open-loop flutter boundary. Other accomplishments included the design, implementation, and successful operation of a one-of-a-kind digital controller, the design and use of two simulation methods to support the project, and the development and successful use of a methodology for on-line controller performance evaluation.
Statistical EMC: A new dimension in electromagnetic compatibility of digital electronic systems
NASA Astrophysics Data System (ADS)
Tsaliovich, Anatoly
Electromagnetic compatibility compliance test results are used as a database for addressing three classes of electromagnetic-compatibility (EMC) related problems: statistical EMC profiles of digital electronic systems, the effect of equipment-under-test (EUT) parameters on electromagnetic emission characteristics, and EMC measurement specifics. Open area test site (OATS) and absorber-lined shielded room (AR) results are compared for the highest radiated emissions of the equipment under test. The suggested statistical evaluation methodology can be utilized to correlate the results of different EMC test techniques, characterize the EMC performance of electronic systems and components, and develop recommendations for optimal EMC design of electronic products.
3D measurement by digital photogrammetry
NASA Astrophysics Data System (ADS)
Schneider, Carl T.
1993-12-01
Photogrammetry is well known in geodetic surveys as aerial photogrammetry and in close-range applications as architectural photogrammetry. Photogrammetric methods and algorithms, combined with digital cameras and digital image processing methods, are now being introduced for industrial applications such as automation and quality control. This paper describes the photogrammetric and digital image processing algorithms and the calibration methods. These algorithms and methods are demonstrated with application examples: a digital photogrammetric workstation as a mobile multi-purpose 3D measuring tool, and a tube measuring system as an example of a single-purpose tool.
Matrix Analysis of the Digital Divide in eHealth Services Using Awareness, Want, and Adoption Gap
2012-01-01
Background The digital divide usually refers to access or usage, but some studies have identified two other divides: awareness and demand (want). Given that the hierarchical stages of a customer's innovation adoption process are interrelated, it is necessary and meaningful to analyze the digital divide in eHealth services through three main stages, namely awareness, want, and adoption. Objective By following the three main integrated stages of innovation diffusion theory, from the customer segment viewpoint, this study aimed to propose a new matrix analysis of the digital divide using the awareness, want, and adoption gap ratio (AWAG). I compared the digital divide among different groups. Furthermore, I conducted an empirical study on eHealth services to present the practicability of the proposed methodology. Methods Through a review and discussion of the literature, I proposed hypotheses and a new matrix analysis. To test the proposed method, 3074 Taiwanese respondents, aged 15 years and older, were surveyed by telephone. I used the stratified simple random sampling method, with sample size allocation proportioned by the population distribution of 23 cities and counties (strata). Results This study proposed the AWAG segment matrix to analyze the digital divide in eHealth services. First, awareness and want rates were divided into two levels at the middle point of 50%, and the 2-dimensional cross of the awareness and want segment matrix was then divided into four categories: the opened group, the desire-deficiency group, the perception-deficiency group, and the closed group. Second, according to the degrees of awareness and want, each category was further divided into four subcategories. I also defined four possible strategies, namely hold, improve, evaluate, and leave, for different regions in the proposed matrix. An empirical test on two recently promoted eHealth services, the digital medical service (DMS) and the digital home care service (DHCS), was conducted. Results showed that for both eHealth services, digital divides of awareness, want, and adoption existed across demographic variables, as well as between computer owners and nonowners, and between Internet users and nonusers. With respect to the analysis of the AWAG segment matrix for the DMS, most of the segments, except for people with a marital status of Other or without computers, were positioned in the opened group. With respect to the DHCS, segments were separately positioned in the opened, perception-deficiency, and closed groups. Conclusions Adoption does not closely follow people’s awareness or want, and a huge digital divide in adoption exists in the DMS and DHCS. Thus, a strategy to promote adoption should be used for most demographic segments. PMID:22329958
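The 2 x 2 awareness/want segmentation with the 50% midpoint split can be expressed directly in code; the group labels follow the paper, while the example segments and rates below are invented for illustration.

```python
# A minimal sketch of the awareness/want 2 x 2 classification.
def awag_group(awareness: float, want: float) -> str:
    """Classify a segment by awareness and want rates (0-100%)."""
    if awareness >= 50 and want >= 50:
        return "opened"
    if awareness >= 50:
        return "desire-deficiency"      # aware but not wanting
    if want >= 50:
        return "perception-deficiency"  # wanting but unaware
    return "closed"

for seg, (aw, wa) in {"urban 25-34": (72, 61), "rural 65+": (38, 22)}.items():
    print(f"{seg}: awareness {aw}%, want {wa}% -> {awag_group(aw, wa)} group")
```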
Wexler, Lisa; Gubrium, Aline; Griffin, Megan; DiFulvio, Gloria
2013-07-01
Using a positive youth development framework, this article describes how a 3-year digital storytelling project and the 566 digital stories produced from it in Northwest Alaska promote protective factors in the lives of Alaska Native youth and serve as digital "hope kits," a suicide prevention approach that emphasizes young people's reasons for living. Digital stories are short, participant-produced videos that combine photos, music, and voice. We present process data that indicate the ways that digital stories serve as a platform for youth to reflect on and represent their lives, important relationships and achievements. In so doing, youth use the digital storytelling process to identify and highlight encouraging aspects of their lives, and develop more certain and positive identity formations. These processes are correlated with positive youth health outcomes. In addition, the digital stories themselves serve as reminders of the young people's personal assets--their reasons for living--after the workshop ends. Young people in this project often showed their digital stories to those who were featured positively within as a way to strengthen these interpersonal relationships. Evaluation data from the project show that digital storytelling workshops and outputs are a promising positive youth development approach. The project and the qualitative data demonstrate the need for further studies focusing on outcomes related to suicide prevention.
Defining event reconstruction of digital crime scenes.
Carrier, Brian D; Spafford, Eugene H
2004-11-01
Event reconstruction plays a critical role in solving physical crimes by explaining why a piece of physical evidence has certain characteristics. With digital crimes, the current focus has been on the recognition and identification of digital evidence using an object's characteristics, but not on the identification of the events that caused the characteristics. This paper examines digital event reconstruction and proposes a process model and procedure that can be used for a digital crime scene. The model has been designed so that it can apply to physical crime scenes, can support the unique aspects of a digital crime scene, and can be implemented in software to automate part of the process. We also examine the differences between physical event reconstruction and digital event reconstruction.
Preservation of Earth Science Data History with Digital Content Repository Technology
NASA Astrophysics Data System (ADS)
Wei, Y.; Pan, J.; Shrestha, B.; Cook, R. B.
2011-12-01
An increasing need for derived and on-demand data products in Earth Science research makes digital content more difficult for providers to manage and preserve and for users to locate, understand, and consume. Specifically, this increasing need presents additional challenges in managing data processing history information and delivering such information to end users. For example, the North American Carbon Program (NACP) Multi-scale Synthesis and Terrestrial Model Intercomparison Project (MsTMIP) chose a modified SYNMAP land cover data set as one of the input drivers for participating terrestrial biospheric models. The global 1 km resolution SYNMAP data set was created by harmonizing three remote sensing-based land cover products: GLCC, GLC2000, and the MODIS land cover product. The original SYNMAP land cover data were aggregated into half and quarter degree resolution and then enhanced with more detailed grassland and cropland types. Currently, there is no effective mechanism to convey this data processing information to different modeling teams so they can determine whether a data product meets their needs; the process still relies heavily on offline human interaction. The NASA-sponsored ORNL DAAC has leveraged contemporary digital object repository technology to promote the representation, management, and delivery of data processing history and provenance information. Within a digital object repository, different data products are managed as objects, with metadata as attributes and content delivery and management services as dissemination methods. Derivation relationships among data products can be semantically referenced between digital objects. Within the repository, data users can easily track a derived data product back to its origin, explore metadata and documents about each intermediate data product, and discover the processing details involved in each derivation step. Coupled with the Drupal Web Content Management System, the digital repository interface was enhanced to provide an intuitive graphic representation of the data processing history. Each data product is also associated with a formal metadata record in the FGDC standard, and the main fields of the FGDC record are indexed for search and displayed as attributes of the data product. These features enable data users to better understand and consume a data product. The representation of data processing history in a digital repository can further promote long-term data preservation. Lineage information is a major factor in keeping digital data understandable and usable long into the future. Derivation references can be set up between digital objects not only within a single digital repository, but also across multiple distributed digital repositories. Along with emerging identification mechanisms, such as the Digital Object Identifier (DOI), a flexible distributed digital repository network can be set up to better preserve digital content. In this presentation, we describe how digital content repository technology can be used to manage, preserve, and deliver digital data processing history information in the Earth Science research domain, with selected data archived at the ORNL DAAC and the Model and Synthesis Thematic Data Center (MAST-DC) as testing targets.
Smart Sensor for Online Detection of Multiple-Combined Faults in VSD-Fed Induction Motors
Garcia-Ramirez, Armando G.; Osornio-Rios, Roque A.; Granados-Lieberman, David; Garcia-Perez, Arturo; Romero-Troncoso, Rene J.
2012-01-01
Induction motors fed through variable speed drives (VSD) are widely used in different industrial processes. Nowadays, industry demands the integration of smart sensors to improve fault detection in order to reduce cost, maintenance and power consumption. Induction motors can develop one or more faults at the same time, which can produce severe damage. Combined fault identification in induction motors is a demanding task that has rarely been considered, despite being a common situation, because it is difficult to identify two or more faults simultaneously. This work presents a smart sensor for online detection of simple and multiple-combined faults in induction motors fed through a VSD in a wide frequency range, covering low frequencies from 3 Hz and high frequencies up to 60 Hz, based on a primary sensor that is a commercially available current clamp or a Hall-effect sensor. The proposed smart sensor implements a methodology based on the fast Fourier transform (FFT), RMS calculation and artificial neural networks (ANN), which are processed online using digital hardware signal processing based on a field programmable gate array (FPGA).
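The feature-extraction stage of such a methodology (RMS plus FFT magnitudes around the supply frequency) can be sketched as below; the synthetic stator-current signal, sampling rate and fault-like sideband are assumptions, and the trained ANN and the FPGA implementation are not shown.

```python
# A minimal sketch: RMS and spectral features from a current record,
# which would feed an ANN classifier in the full methodology.
import numpy as np

fs = 1_500                                   # sampling rate, Hz (assumed)
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(6)
# Synthetic 60 Hz supply current with a broken-rotor-bar-like sideband
current = np.sin(2 * np.pi * 60 * t) + 0.05 * np.sin(2 * np.pi * 54 * t)
current += rng.normal(0.0, 0.01, size=t.size)

rms = np.sqrt(np.mean(current**2))
spectrum = np.abs(np.fft.rfft(current * np.hanning(current.size)))
freqs = np.fft.rfftfreq(current.size, d=1.0 / fs)
features = np.concatenate(([rms], spectrum[(freqs > 40) & (freqs < 80)]))
print(f"RMS = {rms:.3f}, {features.size - 1} spectral features around 60 Hz")
```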
NASA Astrophysics Data System (ADS)
Silva, Orildo L.; Bezerra, Francisco H. R.; Maia, Rubson P.; Cazarin, Caroline L.
2017-10-01
This paper analyzes different types of karst landforms and their relationships with fracture systems, sedimentary bedding, and fluvial processes. We mapped karst features in the Cretaceous carbonates of the Jandaíra Formation in the Potiguar Basin, Brazil. We used high-resolution digital elevation models acquired using LiDAR and aerial orthophotographs acquired using an unmanned aerial vehicle (UAV). We grouped and described karst evolution according to scale and degree of karstification. These degrees of karst evolution are coeval. Fractures are opened by dissolution, forming vertical fluid conduits, whereas coeval dissolution occurs along horizontal layers. This conduit system acts as pathways for water flow. The enlargement of conduits contributes to the collapse of blocks in sinkholes and expansion of caves during an intermediate degree of karstification. Propagation of dissolution can cause the coalescence of sinkholes and the capture of small streams. Fluvial processes dominate karst dissolution at an advanced degree of karstification. Comparisons with previously published ground-penetrating radar (GPR), borehole and seismic surveys in sedimentary basins indicate that these structures can be partially preserved during burial.
Laadan, Oren; Nieh, Jason; Phung, Dan
2012-10-02
Methods, media and systems for managing a distributed application running in a plurality of digital processing devices are provided. In some embodiments, a method includes running one or more processes associated with the distributed application in virtualized operating system environments on a plurality of digital processing devices, suspending the one or more processes, and saving network state information relating to network connections among the one or more processes. The method further includes storing process information relating to the one or more processes, recreating the network connections using the saved network state information, and restarting the one or more processes using the stored process information.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, Darren M.
Sandia National Laboratories has tested and evaluated the Geotech Smart24 data acquisition system with active Fortezza crypto card data signing and authentication. The test results included in this report were in response to static and tonal-dynamic input signals. Most test methodologies used were based on IEEE Standards 1057 for Digitizing Waveform Recorders and 1241 for Analog to Digital Converters; others were designed by Sandia specifically for infrasound application evaluation and for supplementary criteria not addressed in the IEEE standards. The objective of this work was to evaluate the overall technical performance of the Geotech Smart24 digitizer with a Fortezza PCMCIA crypto card actively implementing the signing of data packets. The results of this evaluation were compared to relevant specifications provided within the manufacturer's documentation notes. The tests performed were chosen to demonstrate different performance aspects of the digitizer under test. The performance aspects tested include determining noise floor, least significant bit (LSB), dynamic range, cross-talk, relative channel-to-channel timing, time-tag accuracy, analog bandwidth and calibrator performance.
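A core IEEE Std 1057 procedure used in such evaluations is the sine-fit test; the sketch below shows a three-parameter (known-frequency) least-squares fit whose residual yields a noise-floor estimate. The record length, rates and noise level are illustrative, not Sandia's test values.

```python
# A minimal sketch of an IEEE Std 1057 three-parameter sine fit.
import numpy as np

fs, f0, n = 100_000.0, 997.0, 10_000          # sample rate, tone, samples (assumed)
t = np.arange(n) / fs
rng = np.random.default_rng(7)
record = 0.9 * np.cos(2 * np.pi * f0 * t + 0.4) + 0.1 + rng.normal(0, 1e-3, n)

# Linear least squares for amplitude, phase and offset at known f0
basis = np.column_stack([np.cos(2 * np.pi * f0 * t),
                         np.sin(2 * np.pi * f0 * t),
                         np.ones(n)])
coef, *_ = np.linalg.lstsq(basis, record, rcond=None)
residual = record - basis @ coef               # noise + distortion estimate
amplitude = np.hypot(coef[0], coef[1])
print(f"amplitude = {amplitude:.4f}, offset = {coef[2]:.4f}, "
      f"residual rms = {residual.std():.2e}")
```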
Feasibility of a GNSS-Probe for Creating Digital Maps of High Accuracy and Integrity
NASA Astrophysics Data System (ADS)
Vartziotis, Dimitris; Poulis, Alkis; Minogiannis, Alexandros; Siozos, Panayiotis; Goudas, Iraklis; Samson, Jaron; Tossaint, Michel
The “ROADSCANNER” project addresses the need for increased-accuracy, increased-integrity Digital Maps (DM) utilizing the latest developments in GNSS, in order to provide the required datasets for novel applications such as navigation-based safety applications, Advanced Driver Assistance Systems (ADAS) and digital automotive simulations. The activity covered in the current paper is the feasibility study, preliminary tests, initial product design and development plan for an EGNOS-enabled vehicle probe. The vehicle probe will be used for generating high accuracy, high integrity, ADAS-compatible digital maps of roads, employing a multiple-pass methodology supported by sophisticated refinement algorithms. Furthermore, the vehicle probe will be equipped with pavement scanning and other data fusion equipment, in order to produce 3D road surface models compatible with the standards of road-tire simulation applications. The project was assigned to NIKI Ltd under the 1st Call for Ideas in the frame of the ESA - Greece Task Force.
Hallas, Gary; Monis, Paul
2015-01-01
The enumeration of bacteria using plate-based counts is a core technique used by food and water microbiology testing laboratories. However, manual counting of bacterial colonies is both time and labour intensive, can vary between operators and also requires manual entry of results into laboratory information management systems, which can be a source of data entry error. An alternative is to use automated digital colony counters, but there is a lack of peer-reviewed validation data to allow incorporation into standards. We compared the performance of digital counting technology (ProtoCOL3) against manual counting using criteria defined in internationally recognized standard methods. Digital colony counting provided a robust, standardized system suitable for adoption in a commercial testing environment. The digital technology has several advantages:
• Improved measurement of uncertainty by using a standard and consistent counting methodology with less operator error.
• Efficiency for labour and time (reduced cost).
• Elimination of manual entry of data onto LIMS.
• Faster result reporting to customers.
Advancing Models and Theories for Digital Behavior Change Interventions.
Hekler, Eric B; Michie, Susan; Pavel, Misha; Rivera, Daniel E; Collins, Linda M; Jimison, Holly B; Garnett, Claire; Parral, Skye; Spruijt-Metz, Donna
2016-11-01
To be suitable for informing digital behavior change interventions, theories and models of behavior change need to capture individual variation and changes over time. The aim of this paper is to provide recommendations for the development of models and theories that are informed by, and can inform, digital behavior change interventions, based on discussions by international experts, including behavioral, computer, and health scientists and engineers. The proposed framework stipulates the use of a state-space representation to define when, where, for whom, and in what state for that person an intervention will produce a targeted effect. The "state" is that of the individual based on multiple variables that define the "space" in which a mechanism of action may produce the effect. A state-space representation can be used to help guide theorizing and identify cross-disciplinary methodologic strategies for improving measurement, experimental design, and analysis that can feasibly match the complexity of real-world behavior change via digital behavior change interventions. Copyright © 2016 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.