Science.gov

Sample records for semi-automatic computer system

  1. A semi-automatic Parachute Separation System for Balloon Payloads

    NASA Astrophysics Data System (ADS)

    Farman, M. E.; Barsic, J. E.

    When operating stratospheric balloons with scientific payloads at the National Scientific Balloon Facility, the current practice for separating the payload from the parachute after descent requires the sending of manual commands over a UHF channel from the chase aircraft or the ground control site. While this procedure generally works well, there have been occasions when, due to shadowing of the receive antenna, unfavorable aircraft attitude or even lack of a chase aircraft, the command has not been received and the parachute has failed to separate. In these circumstances, the payload may be dragged, with the consequent danger of damage to expensive and sometimes irreplaceable scientific instrumentation. The NSBF has developed a system designed to automatically separate the parachute without the necessity for commanding after touchdown. The most important criterion for such a design is that it should be fail-safe; a free-fall of the payload would of course be a disaster. This design incorporates many safety features and underwent extensive evaluation and testing for several years before it was adopted operationally. It is currently used as a backup to the commanded release, activated only when a chase aircraft is not available, at night or in exceptionally poor visibility conditions. This paper describes the design, development, testing and operation of the system, which is known as the Semi-Automatic Parachute Release (SAPR).

  2. Semi-automatic microdrive system for positioning electrodes during electrophysiological recordings from rat brain

    NASA Astrophysics Data System (ADS)

    Dabrowski, Piotr; Kublik, Ewa; Mozaryn, Jakub

    2015-09-01

    Electrophysiological recording of neuronal action potentials from behaving animals requires portable, precise and reliable devices for positioning of multiple microelectrodes in the brain. We propose a semi-automatic microdrive system for independent positioning of up to 8 electrodes (or tetrodes) in a rat (or larger animals). The device is intended for chronic, long-term recording applications in freely moving animals. Our design is based on independent stepper motors with lead screws, offering single steps on the order of a micrometer, controlled semi-automatically from the computer. A microdrive prototype for one electrode was developed and tested. Because systematic test procedures dedicated to such applications are lacking, we propose evaluating the prototype in a manner similar to the ISO norm for industrial robots. To this end we designed and implemented magnetic linear and rotary encoders that provided information about electrode displacement and motor shaft movement. On the basis of these measurements we estimated the repeatability, accuracy and backlash of the drive. According to the given assumptions and preliminary tests, the device should provide greater accuracy than the hand-controlled manipulators available on the market. Automatic positioning will also shorten the course of the experiment and improve the acquisition of signals from multiple neuronal populations.
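
    As a hedged illustration of that evaluation step, the sketch below estimates accuracy, repeatability and backlash from repeated encoder readings, loosely following the pose-accuracy definitions of ISO 9283 for industrial robots; the function names and sample values are hypothetical, not measurements from the prototype.

      import numpy as np

      def positioning_stats(commanded, measured):
          # Accuracy: systematic offset of the mean reached position.
          # Repeatability: spread (3 sigma) of repeated moves to one target.
          measured = np.asarray(measured, dtype=float)
          return measured.mean() - commanded, 3.0 * measured.std(ddof=1)

      def backlash(forward_stops, reverse_stops):
          # Backlash: mean offset between approaches from opposite directions.
          return float(np.mean(reverse_stops) - np.mean(forward_stops))

      # Hypothetical encoder readings (micrometers) for a 100 um commanded move
      fwd = [100.8, 101.1, 100.6, 100.9]
      rev = [98.9, 99.2, 99.0, 99.1]
      acc, rep = positioning_stats(100.0, fwd)
      print(f"accuracy {acc:+.2f} um, repeatability {rep:.2f} um, "
            f"backlash {backlash(fwd, rev):.2f} um")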

  3. A semi-automatic parachute separation system for balloon payloads

    NASA Astrophysics Data System (ADS)

    Farman, M.

    At the National Scientific Balloon Facility (NSBF), when operating stratospheric balloons with scientific payloads, the current practice for separating the payload from the parachute after descent requires the sending of commands, over a UHF uplink, from the chase airplane or the ground control site. While this generally works well, there have been occasions when, due to shadowing of the receive antenna or unfavorable aircraft attitude, the command has not been received and the parachute has failed to separate. In these circumstances the payload may be dragged for long distances before being recovered, with consequent danger of damage to expensive and sometimes irreplaceable scientific instrumentation. The NSBF has therefore proposed a system which would automatically separate the parachute without the necessity for commanding after touchdown. Such a system is now under development. Mechanical automatic release systems have been tried in the past with only limited success. The current design uses an electronic system based on a tilt sensor which measures the angle that the suspension train subtends relative to the gravity vector. With the suspension vertical, there is minimum output from the sensor. When the payload touches down, the parachute tilts and, in any tilt direction, the sensor output increases until a predetermined threshold is reached. At this point, a threshold detector is activated which fires the pyrotechnic cutter to release the parachute. The threshold level is adjustable prior to the flight to enable the optimum tilt angle to be determined from flight experience. The system will not operate until armed by command. This command is sent during the descent when communication with the on-board systems is still normally reliable. A safety interlock is included to inhibit arming if the threshold is already high at the time the command is sent. While this is intended to be the primary system, the manual option would be retained as a backup. A market
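
    The arm/fire logic described above maps naturally onto a small state machine. The sketch below is a minimal illustration of that logic only (threshold crossing, arming command, and the interlock that refuses arming when the tilt is already high); the class and method names are invented for this example and do not reflect the NSBF flight hardware or software.

      class SemiAutoParachuteRelease:
          def __init__(self, threshold_deg):
              self.threshold_deg = threshold_deg  # adjustable prior to flight
              self.armed = False
              self.fired = False

          def arm(self, tilt_deg):
              # Safety interlock: refuse arming if the tilt is already high.
              if tilt_deg >= self.threshold_deg:
                  return False
              self.armed = True
              return True

          def update(self, tilt_deg):
              # Fire the pyrotechnic cutter once, on first threshold crossing.
              if self.armed and not self.fired and tilt_deg >= self.threshold_deg:
                  self.fired = True
                  print("cutter fired: parachute released")

      sapr = SemiAutoParachuteRelease(threshold_deg=35.0)
      assert sapr.arm(tilt_deg=2.0)  # armed during descent, suspension vertical
      sapr.update(tilt_deg=40.0)     # touchdown: suspension tilts past threshold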

  4. Computer vision techniques for semi-automatic reconstruction of ripped-up documents

    NASA Astrophysics Data System (ADS)

    De Smet, Patrick; De Bock, Johan; Corluy, Els

    2003-08-01

    This paper investigates the use of computer vision techniques to aid in the semi-automatic reconstruction of torn or ripped-up documents. First, we discuss a procedure for obtaining a digital database of a given set of paper fragments using a flatbed image scanner, a brightly coloured scanner background, and a region growing algorithm. The contour of each segmented piece of paper is then traced around using a chain code algorithm and the contours are annotated by calculating a set of feature vectors. Next, the contours of the fragments are matched against each other using the annotated feature information and a string matching algorithm. Finally, the matching results are used to reposition the paper fragments so that a jigsaw puzzle reconstruction of the document can be obtained. For each of the three major components, i.e., segmentation, matching, and global document reconstruction, we briefly discuss a set of prototype GUI tools for guiding and presenting the obtained results. We discuss the performance and the reconstruction results that can be obtained, and show that the proposed framework can offer an interesting set of tools to forensic investigators.
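
    For illustration, a minimal sketch of the contour-tracing step: the 8-connected Freeman chain code of an ordered boundary, a common way to encode a fragment outline before feature extraction and string matching (the paper does not publish its exact encoding, so the details here are assumptions).

      # 8-connected Freeman chain code: (drow, dcol) for direction codes 0..7
      MOVES = [(0, 1), (-1, 1), (-1, 0), (-1, -1),
               (0, -1), (1, -1), (1, 0), (1, 1)]

      def chain_code(contour):
          # Encode consecutive steps between (row, col) boundary points.
          return [MOVES.index((y1 - y0, x1 - x0))
                  for (y0, x0), (y1, x1) in zip(contour, contour[1:])]

      square = [(0, 0), (0, 1), (1, 1), (1, 0), (0, 0)]
      print(chain_code(square))  # [0, 6, 4, 2]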

  5. FishCam - A semi-automatic video-based monitoring system of fish migration

    NASA Astrophysics Data System (ADS)

    Kratzert, Frederik; Mader, Helmut

    2016-04-01

    One of the main objectives of the Water Framework Directive is to preserve and restore the continuum of river networks. Regarding vertebrate migration, fish passes are a widely used measure for overcoming anthropogenic barriers. The functionality of this measure needs to be verified by monitoring. In this study we propose a newly developed monitoring system, named FishCam, to observe fish migration, especially in fish passes, without contact and without imposing stress on the fish. To avoid time-consuming and costly field work for fish pass monitoring, this project aims to develop a semi-automatic monitoring system that enables continuous observation of fish migration. The system consists of a detection tunnel and a high-resolution camera and is mainly based on the technology of security cameras. If changes in the image, e.g. by migrating fish or drifting particles, are detected by a motion sensor, the camera system starts recording and continues until no further motion is detectable. An ongoing key challenge in this project is the development of robust software that counts, measures and classifies the passing fish. To achieve this goal, many different computer vision tasks and classification steps have to be combined. Moving objects have to be detected and separated from the static part of the image, objects have to be tracked throughout the entire video, and fish have to be separated from non-fish objects (e.g. foliage and woody debris, shadows and light reflections). Subsequently, the length of all detected fish needs to be determined and the fish should be classified by species. The classification into fish and non-fish objects is realized through ensembles of state-of-the-art classifiers applied to a single image per object. The choice of the best image for classification is implemented through a newly developed "fish benchmark" value, which compares the actual shape of the object with a schematic side-view fish model. To enable an automatization of the
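
    The motion-triggered recording described above can be sketched with a standard background-subtraction loop. The snippet below is an illustration, not the FishCam software; the file names and the pixel-count threshold are assumptions.

      import cv2

      cap = cv2.VideoCapture("tunnel_feed.mp4")   # placeholder video source
      subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
      writer, MOTION_PIXELS = None, 1500          # tuning threshold (assumed)

      while True:
          ok, frame = cap.read()
          if not ok:
              break
          moving = cv2.countNonZero(subtractor.apply(frame)) > MOTION_PIXELS
          if moving and writer is None:           # motion detected: start a clip
              h, w = frame.shape[:2]
              writer = cv2.VideoWriter("clip.avi",
                                       cv2.VideoWriter_fourcc(*"MJPG"), 25, (w, h))
          if writer is not None:
              writer.write(frame)
              if not moving:                      # motion ceased: close the clip
                  writer.release()
                  writer = None

      if writer is not None:
          writer.release()
      cap.release()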

  6. A semi-automatic computer-aided method for surgical template design.

    PubMed

    Chen, Xiaojun; Xu, Lu; Yang, Yue; Egger, Jan

    2016-01-01

    This paper presents a generalized integrated framework for semi-automatic surgical template design. Several algorithms were implemented, including mesh segmentation, offset surface generation, collision detection and ruled surface generation, and special software named TemDesigner was developed. With a simple user interface, a customized template can be semi-automatically designed according to the preoperative plan. Firstly, mesh segmentation with signed per-vertex scalars is utilized to partition the inner surface from the input surface mesh according to the indicated point loop. Then, the offset surface of the inner surface is obtained by contouring the distance field of the inner surface, and segmented to generate the outer surface. A ruled surface is employed to connect the inner and outer surfaces. Finally, drilling tubes are generated according to the preoperative plan through collision detection and merging. It has been applied to template design for various kinds of surgeries, including oral implantology, cervical pedicle screw insertion, iliosacral screw insertion and osteotomy, demonstrating the efficiency, functionality and generality of our method. PMID:26843434
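
    The offset-surface step (contouring the distance field of the inner surface) can be illustrated with a voxel-based sketch: compute a Euclidean distance field around the inner region and extract the iso-surface at the desired offset. This is an assumption-laden toy version, not TemDesigner's implementation.

      import numpy as np
      from scipy.ndimage import distance_transform_edt
      from skimage.measure import marching_cubes

      def offset_surface(inner_mask, offset_mm, spacing=(1.0, 1.0, 1.0)):
          # Distance (mm) from every outside voxel to the inner region...
          dist = distance_transform_edt(~inner_mask.astype(bool), sampling=spacing)
          # ...then the offset surface is an iso-contour of that distance field.
          verts, faces, _, _ = marching_cubes(dist, level=offset_mm, spacing=spacing)
          return verts, faces

      # Toy inner surface: a voxelized sphere of radius 10
      sphere = np.linalg.norm(np.indices((64, 64, 64)) - 32, axis=0) < 10
      verts, faces = offset_surface(sphere, offset_mm=3.0)
      print(verts.shape, faces.shape)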

  7. A semi-automatic computer-aided method for surgical template design

    PubMed Central

    Chen, Xiaojun; Xu, Lu; Yang, Yue; Egger, Jan

    2016-01-01

    This paper presents a generalized integrated framework for semi-automatic surgical template design. Several algorithms were implemented, including mesh segmentation, offset surface generation, collision detection and ruled surface generation, and special software named TemDesigner was developed. With a simple user interface, a customized template can be semi-automatically designed according to the preoperative plan. Firstly, mesh segmentation with signed per-vertex scalars is utilized to partition the inner surface from the input surface mesh according to the indicated point loop. Then, the offset surface of the inner surface is obtained by contouring the distance field of the inner surface, and segmented to generate the outer surface. A ruled surface is employed to connect the inner and outer surfaces. Finally, drilling tubes are generated according to the preoperative plan through collision detection and merging. It has been applied to template design for various kinds of surgeries, including oral implantology, cervical pedicle screw insertion, iliosacral screw insertion and osteotomy, demonstrating the efficiency, functionality and generality of our method. PMID:26843434

  8. A semi-automatic computer-aided method for surgical template design

    NASA Astrophysics Data System (ADS)

    Chen, Xiaojun; Xu, Lu; Yang, Yue; Egger, Jan

    2016-02-01

    This paper presents a generalized integrated framework for semi-automatic surgical template design. Several algorithms were implemented, including mesh segmentation, offset surface generation, collision detection and ruled surface generation, and special software named TemDesigner was developed. With a simple user interface, a customized template can be semi-automatically designed according to the preoperative plan. Firstly, mesh segmentation with signed per-vertex scalars is utilized to partition the inner surface from the input surface mesh according to the indicated point loop. Then, the offset surface of the inner surface is obtained by contouring the distance field of the inner surface, and segmented to generate the outer surface. A ruled surface is employed to connect the inner and outer surfaces. Finally, drilling tubes are generated according to the preoperative plan through collision detection and merging. It has been applied to template design for various kinds of surgeries, including oral implantology, cervical pedicle screw insertion, iliosacral screw insertion and osteotomy, demonstrating the efficiency, functionality and generality of our method.

  9. Semi-automatic system for UV images analysis of historical musical instruments

    NASA Astrophysics Data System (ADS)

    Dondi, Piercarlo; Invernizzi, Claudia; Licchelli, Maurizio; Lombardi, Luca; Malagodi, Marco; Rovetta, Tommaso

    2015-06-01

    The selection of representative areas to be analyzed is a common problem in the study of Cultural Heritage items. UV fluorescence photography is an extensively used technique to highlight specific surface features which cannot be observed in visible light (e.g. restored parts or areas treated with different materials), and it proves to be very effective in the study of historical musical instruments. In this work we propose a new semi-automatic solution for selecting areas with the same perceived color (a simple clue of similar materials) on UV photos, using a specifically designed interactive tool. The proposed method works in two steps: (i) the user selects a small rectangular area of the image; (ii) the program automatically highlights all the areas that have the same color as the selected input. The identification is based on analysis of the image in the HSV color model, the one closest to human perception. The achievable result is more accurate than a manual selection, because it can also detect points that users do not recognize as similar due to perceptual illusions. The application has been developed following usability guidelines, and the Human-Computer Interface has been improved after a series of tests performed by expert and non-expert users. All the experiments were performed on UV imagery of the Stradivari violin collection held by the "Museo del Violino" in Cremona.
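
    A hedged sketch of the two-step selection: average the HSV color inside the user's rectangle, then threshold the whole image around that color. Tolerances and file names are illustrative; the published tool's exact similarity rule is not reproduced here (among other simplifications, hue wrap-around at the red end is ignored).

      import cv2
      import numpy as np

      def select_same_color(image_bgr, rect, tol=(8, 40, 40)):
          hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
          x, y, w, h = rect                       # user-selected rectangle
          mean = hsv[y:y + h, x:x + w].reshape(-1, 3).mean(axis=0)
          lower = np.clip(mean - tol, 0, 255).astype(np.uint8)
          upper = np.clip(mean + tol, 0, 255).astype(np.uint8)
          return cv2.inRange(hsv, lower, upper)   # 255 where the color matches

      img = cv2.imread("uv_photo.png")            # placeholder file name
      mask = select_same_color(img, rect=(120, 80, 10, 10))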

  10. Robust semi-automatic segmentation of pulmonary subsolid nodules in chest computed tomography scans

    NASA Astrophysics Data System (ADS)

    Lassen, B. C.; Jacobs, C.; Kuhnigk, J.-M.; van Ginneken, B.; van Rikxoort, E. M.

    2015-02-01

    The malignancy of lung nodules is most often assessed by analyzing changes in the nodule diameter in follow-up scans. A recent study showed that comparing the volume or the mass of a nodule over time is much more significant than comparing the diameter. Since the survival rate is higher when the disease is still in an early stage, it is important to detect the growth rate as soon as possible. However, manual segmentation of a volume is time-consuming. While there are several well-evaluated methods for the segmentation of solid nodules, less work has been done on subsolid nodules, which actually show a higher malignancy rate than solid nodules. In this work we present a fast, semi-automatic method for the segmentation of subsolid nodules. As its minimal user interaction, the method expects a user-drawn stroke across the largest diameter of the nodule. First, threshold-based region growing is performed, based on intensity analysis of the nodule region and surrounding parenchyma. In the next step, the chest wall is removed by a combination of connected component analysis and convex hull calculation. Finally, attached vessels are detached by morphological operations. The method was evaluated on all nodules of the publicly available LIDC/IDRI database that were manually segmented and rated as non-solid or part-solid by four radiologists (Dataset 1) and three radiologists (Dataset 2). For these 59 nodules the Jaccard index for the agreement of the proposed method with the manual reference segmentations was 0.52/0.50 (Dataset 1/Dataset 2), compared to an inter-observer agreement of the manual segmentations of 0.54/0.58 (Dataset 1/Dataset 2). Furthermore, the inter-observer agreement using the proposed method (i.e. different input strokes) was analyzed and gave a Jaccard index of 0.74/0.74 (Dataset 1/Dataset 2). The presented method provides satisfactory segmentation results with minimal observer effort in minimal time and can reduce the inter-observer variability for segmentation of
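
    The evaluation metric used above is easy to make concrete. A minimal Jaccard index between two binary masks (segmentation vs. reference), as a plain NumPy sketch:

      import numpy as np

      def jaccard(a, b):
          # Jaccard index: |intersection| / |union| of two binary masks.
          a, b = a.astype(bool), b.astype(bool)
          union = np.logical_or(a, b).sum()
          return np.logical_and(a, b).sum() / union if union else 1.0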

  11. Semi-automatic transmission

    SciTech Connect

    Morscheck, T.J.; Davis, A.R.; Huggins, M.J.

    1987-06-30

    This patent describes a semi-automatic mechanical change gear transmission system of the type comprising: a mechanical change gear transmission of the type comprising a transmission housing, an input shaft rotatably supported in the housing and driven by an engine through a nonpositive coupling, an output shaft rotatably supported in the housing and a plurality of selectable ratio gears selectively engageable one at a time to a first transmission element by means of positive, nonsynchronized jaw clutch assemblies for providing a plurality of manually selectable drive ratios between the input and output shafts, each of the jaw clutch assemblies comprising a first jaw clutch member rotatably associated with the first transmission element and a second jaw clutch member rotatably associated with a second transmission element, each of the first jaw clutch members axially moveable relative to the first transmission element; manually operated means for engaging and disengaging the nonpositive coupling; manually operated shifting means for engaging selected ratio gears to and disengaging selected ratio gears from the first transmission element; selection sensing means for sensing the identity of the particular ratio gear selected for manual engagement or disengagement from the first transmission element and for providing a signal; first and second rotational speed sensors for sensing the rotational speed of the first and second transmission elements and providing signals; a power synchronizer assembly selectively actuable for selectively varying the rotational speed of the second transmission element and the second jaw clutch members rotatably associated therewith; and a central processing unit.

  12. Building a semi-automatic ontology learning and construction system for geosciences

    NASA Astrophysics Data System (ADS)

    Babaie, H. A.; Sunderraman, R.; Zhu, Y.

    2013-12-01

    We are developing an ontology learning and construction framework that allows continuous, semi-automatic knowledge extraction, verification, validation, and maintenance by a potentially very large group of collaborating domain experts in any geosciences field. The system brings geoscientists from the sidelines to the center stage of ontology building, allowing them to collaboratively construct and enrich new ontologies, and to merge, align, and integrate existing ontologies and tools. These constantly evolving ontologies can more effectively address the community's interests, purposes, tools, and change. The goal is to minimize the cost and time of building ontologies, and to maximize the quality, usability, and adoption of ontologies by the community. Our system will be a domain-independent ontology learning framework that applies natural language processing, allowing users to enter their ontology in a semi-structured form, and a combined Semantic Web and Social Web approach that allows direct participation by geoscientists who have no skills in the design and development of domain ontologies. A controlled natural language (CNL) interface and an integrated authoring and editing tool automatically convert syntactically correct CNL text into formal OWL constructs. The WebProtege-based system will allow a potentially large group of geoscientists, from multiple domains, to crowdsource and participate in the structuring of their knowledge model by sharing their knowledge through critiquing, testing, verifying, adopting, and updating of the concept models (ontologies). We will use cloud storage for all data and knowledge base components of the system, such as users, domain ontologies, discussion forums, and semantic wikis, which can be accessed and queried by geoscientists in each domain. We will use NoSQL databases such as MongoDB as a service in the cloud environment. MongoDB uses the lightweight JSON format, which makes it convenient and easy to build Web applications using

  13. HEPMath 1.4: A mathematica package for semi-automatic computations in high energy physics

    NASA Astrophysics Data System (ADS)

    Wiebusch, Martin

    2015-10-01

    This article introduces the Mathematica package HEPMath which provides a number of utilities and algorithms for High Energy Physics computations in Mathematica. Its functionality is similar to packages like FormCalc or FeynCalc, but it takes a more complete and extensible approach to implementing common High Energy Physics notations in the Mathematica language, in particular those related to tensors and index contractions. It also provides a more flexible method for the generation of numerical code which is based on new features for C code generation in Mathematica. In particular it can automatically generate Python extension modules which make the compiled functions callable from Python, thus eliminating the need to write any code in a low-level language like C or Fortran. It also contains seamless interfaces to LHAPDF, FeynArts, and LoopTools.

  14. [Digital storage and semi-automatic analysis of esophageal pressure signals. Evaluation of a commercialized system (PC Polygraf, Synectics)].

    PubMed

    Bruley des Varannes, S; Pujol, P; Salim, B; Cherbut, C; Cloarec, D; Galmiche, J P

    1989-11-01

    The aim of this work was to evaluate a new commercially available pressure recording system (PC Polygraf, Synectics) and to compare it with a classical method using perfused catheters. The PC Polygraf uses microtransducers and allows direct digitized storage and semi-automatic analysis of data. In the first part of this study, manometric assessment was conducted using only perfused catheters. The transducers were connected both to an analog recorder and to a PC Polygraf. Using the two methods of analysis, contraction amplitudes were strongly correlated (r = 0.99; p < 0.0001), whereas durations were significantly but loosely correlated (r = 0.51; p < 0.001). Resting LES pressure was significantly correlated (r = 0.87; p < 0.05). In the second part of this study, simultaneous recordings of esophageal pressure were conducted in 7 patients by placing the two tubes (microtransducers and perfused catheters) side by side with the sideholes at the same level. The characteristics of the waves were determined both by visual analysis of the analog tracing and by semi-automatic analysis of the digitized recording with the appropriate program. Mean amplitude was lower with the microtransducers than with the perfused catheters (60 vs 68 cm H2O; p < 0.05), but the duration of waves was not significantly different between the two systems. Values obtained for each of these parameters using both methods were significantly correlated (amplitude: r = 0.74; duration: r = 0.51). Localization of the sphincter and measurement of its basal tone were found to be difficult with microtransducers. These results show that the PC Polygraf allows a satisfactory analysis of esophageal pressure signals. However, only perfused catheters offer excellent reliability for complete studies of both the sphincter and peristalsis. PMID:2612832
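
    As a toy illustration of the agreement analysis, the sketch below computes a Pearson correlation between paired amplitude measurements from the two systems; the numbers are invented, not the study's data.

      import numpy as np
      from scipy.stats import pearsonr

      # Hypothetical paired wave amplitudes (cm H2O) of the same contractions,
      # measured by the analog system and by the PC Polygraf.
      analog = np.array([55.0, 72.0, 61.0, 80.0, 68.0, 74.0])
      digital = np.array([53.0, 70.0, 60.0, 81.0, 66.0, 73.0])

      r, p = pearsonr(analog, digital)
      print(f"r = {r:.2f}, p = {p:.4f}")  # agreement between the two systems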

  15. A semi-automatic measurement system based on digital image analysis for the application to the single fiber fragmentation test

    NASA Astrophysics Data System (ADS)

    Blobel, Swen; Thielsch, Karin; Ulbricht, Volker

    2013-04-01

    The computational prediction of the effective macroscopic material behavior of fiber-reinforced composites is a goal of research to exploit the potential of these materials. Besides the mechanical characteristics of the material components, extensive knowledge of the mechanical interaction between these components is necessary in order to set up suitable models of the local material structure. For example, an experimental investigation of the micromechanical damage behavior of simplified composite specimens can help in understanding the mechanisms that cause matrix and interface damage in the vicinity of a fiber fracture. To realize an appropriate experimental setup, a novel semi-automatic measurement system based on the analysis of digital images using photoelasticity and image correlation was developed. Applied to specimens with a birefringent matrix material, it is able to provide global and local information on the damage evolution and the stress and strain state at the same time. The image acquisition is accomplished using a long-distance microscopic optic with an effective resolution of two micrometers per pixel. While the system is moved along the domain of interest of the specimen, the acquired images are assembled online and used to interpret optically extracted information in combination with global force-displacement curves provided by the load frame. The illumination of the specimen with circularly polarized light and the projection of the transmitted light through different configurations of polarizers and quarter-wave plates enable the synchronous capture of four images at the quadrants of a four-megapixel image sensor. A fifth image is decoupled from the same optical path and projected onto a second camera chip, giving a non-polarized image of the same scene at the same time. The benefit of this optical setup is the opportunity to extract a wide range of information locally, without influencing the progress of the experiment. The four images

  16. A real-time semi-automatic video segmentation system based on mathematical morphology

    NASA Astrophysics Data System (ADS)

    Chang, Chia-Hao; Wu, Chi-Hao; Chen, Jun-Cheng; Kuo, Jin-Hau; Wu, Ja-Ling

    2005-07-01

    Mathematical morphology provides a systematic approach to analyzing the geometric characteristics of signals or images, and has been widely applied to tasks such as edge detection, object segmentation, and noise suppression. In this paper, a supervised morphology-based video segmentation system is proposed. To indicate where a semantic object resides, the user clicks near the boundary of the object in the first frame of a video to give its rough definition, shape and location. The proposed system automatically segments the first frame by first locating the search area and then classifying the units within it into object and non-object parts, finding the continuous contour by means of a multi-valued watershed algorithm using a hierarchical queue. An adaptive morphological operator based on edge strength, which is computed by a multi-scale morphological gradient algorithm, is proposed to reduce the error in the user's assistance so that the search area is created correctly. Once extended to video object segmentation, a fast video tracking technique is applied. Under the assumption of small motion, the object can be segmented in real time. Moreover, an accuracy evaluation mechanism is proposed to ensure the robustness of the segmentation.
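
    A hedged sketch of the core idea (an edge-strength map from morphological gradients, then a marker-based watershed seeded by user clicks), using SciPy and scikit-image as stand-ins for the paper's own implementation; the multi-valued watershed with a hierarchical queue is simplified to the library routine.

      import numpy as np
      from scipy import ndimage as ndi
      from skimage.segmentation import watershed

      def multiscale_morph_gradient(gray, scales=(1, 2, 3)):
          # Average of dilation-minus-erosion gradients at several scales.
          grads = [ndi.grey_dilation(gray, size=2 * s + 1) -
                   ndi.grey_erosion(gray, size=2 * s + 1) for s in scales]
          return np.mean(grads, axis=0)

      def segment(gray, object_seeds, background_seeds):
          # User clicks provide object / non-object markers for the watershed.
          markers = np.zeros(gray.shape, dtype=np.int32)
          markers[tuple(np.transpose(object_seeds))] = 1
          markers[tuple(np.transpose(background_seeds))] = 2
          labels = watershed(multiscale_morph_gradient(gray.astype(float)), markers)
          return labels == 1   # True inside the segmented object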

  17. MOLGENIS/connect: a system for semi-automatic integration of heterogeneous phenotype data with applications in biobanks

    PubMed Central

    Pang, Chao; van Enckevort, David; de Haan, Mark; Kelpin, Fleur; Jetten, Jonathan; Hendriksen, Dennis; de Boer, Tommy; Charbon, Bart; Winder, Erwin; van der Velde, K. Joeri; Doiron, Dany; Fortier, Isabel; Hillege, Hans

    2016-01-01

    Motivation: While the size and number of biobanks, patient registries and other data collections are increasing, biomedical researchers still often need to pool data for statistical power, a task that requires time-intensive retrospective integration. Results: To address this challenge, we developed MOLGENIS/connect, a semi-automatic system to find, match and pool data from different sources. The system shortlists relevant source attributes from thousands of candidates using ontology-based query expansion to overcome variations in terminology. Then it generates algorithms that transform source attributes to a common target DataSchema. These include unit conversion, categorical value matching and complex conversion patterns (e.g. calculation of BMI). In comparison to human experts, MOLGENIS/connect was able to auto-generate 27% of the algorithms perfectly, with an additional 46% needing only minor editing, representing a reduction in the human effort and expertise needed to pool data. Availability and Implementation: Source code, binaries and documentation are available as open-source under LGPLv3 from http://github.com/molgenis/molgenis and www.molgenis.org/connect. Contact: m.a.swertz@rug.nl Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153686
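
    A minimal sketch of what one generated mapping algorithm can look like (unit conversion plus a derived-variable pattern such as BMI); the attribute names, units and values are assumptions for illustration, not MOLGENIS/connect output.

      def to_kilograms(value, unit):
          # Unit conversion between differently coded source attributes.
          return value * {"kg": 1.0, "lb": 0.45359237}[unit]

      def bmi(weight, weight_unit, height_cm):
          # Complex conversion pattern: BMI from heterogeneous sources.
          height_m = height_cm / 100.0
          return to_kilograms(weight, weight_unit) / height_m ** 2

      print(round(bmi(154, "lb", 175), 1))  # 154 lb at 175 cm -> BMI 22.8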

  18. Semi-automatic laboratory goniospectrometer system for performing multi-angular reflectance and polarization measurements for natural surfaces.

    PubMed

    Sun, Z Q; Wu, Z F; Zhao, Y S

    2014-01-01

    In this paper, the design and operation of the Northeast Normal University Laboratory Goniospectrometer System for performing multi-angular reflectance and polarization measurements under controlled illumination conditions is described. A semi-automatic arm, carried on a rotating circular ring, enables the acquisition of a large number of measurements of the surface Bidirectional Reflectance Factor (BRF) over the full hemisphere. In addition, a set of polarizing optics enables linear polarization measurements over the spectrum from 350 nm to 2300 nm. Because of the stable measurement conditions in the laboratory, the BRF and the linear polarization have average uncertainties of 1% and less than 5%, respectively, depending on the sample properties. The polarimetric accuracy of the instrument is below 0.01 in absolute degree of linear polarization, as established by measuring a Spectralon panel. This paper also presents the reflectance and polarization of snow, soil, sand, and ice measured during 2010-2013 in order to illustrate the system's stability and accuracy. These measurement results are useful for understanding the scattering properties of natural surfaces on Earth. PMID:24517791
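
    The polarimetric figure quoted above (absolute degree of linear polarization) has a standard form that is easy to state in code. The sketch below assumes intensity measurements behind an analyzer at 0, 45, 90 and 135 degrees; it is a textbook Stokes-parameter calculation, not the instrument's processing chain.

      import numpy as np

      def degree_of_linear_polarization(i0, i45, i90, i135):
          I = i0 + i90          # total intensity
          Q = i0 - i90          # linear Stokes parameters
          U = i45 - i135
          return np.sqrt(Q**2 + U**2) / I

      # An ideal depolarizing white reference (e.g. a Spectralon panel) gives
      # nearly equal intensities at all analyzer angles, so DoLP is close to 0.
      print(degree_of_linear_polarization(1.00, 1.01, 0.99, 1.00))  # ~0.007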

  19. Semi-automatic laboratory goniospectrometer system for performing multi-angular reflectance and polarization measurements for natural surfaces

    NASA Astrophysics Data System (ADS)

    Sun, Z. Q.; Wu, Z. F.; Zhao, Y. S.

    2014-01-01

    In this paper, the design and operation of the Northeast Normal University Laboratory Goniospectrometer System for performing multi-angular reflectance and polarization measurements under controlled illumination conditions is described. A semi-automatic arm, carried on a rotating circular ring, enables the acquisition of a large number of measurements of the surface Bidirectional Reflectance Factor (BRF) over the full hemisphere. In addition, a set of polarizing optics enables linear polarization measurements over the spectrum from 350 nm to 2300 nm. Because of the stable measurement conditions in the laboratory, the BRF and the linear polarization have average uncertainties of 1% and less than 5%, respectively, depending on the sample properties. The polarimetric accuracy of the instrument is below 0.01 in absolute degree of linear polarization, as established by measuring a Spectralon panel. This paper also presents the reflectance and polarization of snow, soil, sand, and ice measured during 2010-2013 in order to illustrate the system's stability and accuracy. These measurement results are useful for understanding the scattering properties of natural surfaces on Earth.

  20. Graphical user interface (GUIDE) and semi-automatic system for the acquisition of anaglyphs

    NASA Astrophysics Data System (ADS)

    Canchola, Marco A.; Arízaga, Juan A.; Cortés, Obed; Tecpanecatl, Eduardo; Cantero, Jose M.

    2013-09-01

    Diverse educational experiences have shown that children are more receptive than adults to ideas related to science. That receptiveness, together with children's great curiosity, makes scientific outreach efforts aimed at children likely to succeed. Moreover, 3D digital images have gained importance in various areas, mainly entertainment, film and video games, but also in fields such as medical practice, where they play a crucial role in disease detection. This article presents a system for producing 3D images for educational purposes that allows students of various grade levels, from school to college, to gain hands-on experience with image processing, explaining the use of filters for stereoscopic images that give the brain the impression of depth. The system is based on two elements: hardware centered on an Arduino board, and software based on Matlab. The paper presents the design and construction of each of the elements, provides information on the images obtained, and finally describes how users can interact with the device.
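
    A common way to build the anaglyphs mentioned in the title is the red-cyan channel mix: the red channel from the left view, green and blue from the right. The sketch below shows that mix in Python rather than the system's Matlab code, with placeholder file names.

      import numpy as np
      from PIL import Image

      def red_cyan_anaglyph(left_path, right_path):
          left = np.asarray(Image.open(left_path).convert("RGB"))
          right = np.asarray(Image.open(right_path).convert("RGB"))
          out = right.copy()
          out[..., 0] = left[..., 0]   # red from the left view, cyan from the right
          return Image.fromarray(out)

      red_cyan_anaglyph("left.png", "right.png").save("anaglyph.png")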

  21. On-site semi-automatic calibration and registration of a projector-camera system using arbitrary objects with known geometry.

    PubMed

    Resch, Christoph; Naik, Hemal; Keitler, Peter; Benkhardt, Steven; Klinker, Gudrun

    2015-11-01

    In the Shader Lamps concept, a projector-camera system augments physical objects with projected virtual textures, provided that a precise intrinsic and extrinsic calibration of the system is available. Calibrating such systems has been an elaborate and lengthy task in the past and required a special calibration apparatus. Self-calibration methods, in turn, are able to estimate calibration parameters automatically with no effort. However, they inherently lack global scale and are fairly sensitive to input data. We propose a new semi-automatic calibration approach for projector-camera systems that - unlike existing auto-calibration approaches - additionally recovers the necessary global scale by projecting on an arbitrary object of known geometry. To this end our method combines surface registration with bundle adjustment optimization on points reconstructed from structured light projections to refine a solution that is computed from the decomposition of the fundamental matrix. In simulations on virtual data and experiments with real data we demonstrate that our approach estimates the global scale robustly and is furthermore able to improve incorrectly guessed intrinsic and extrinsic calibration parameters, thus outperforming comparable metric rectification algorithms. PMID:26439823
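
    The starting point named above (a solution computed from the decomposition of the fundamental matrix) can be sketched with OpenCV. The snippet assumes both views share the intrinsics K, which is a simplification of a real projector-camera pair; the recovered translation has unit norm, which is exactly the missing global scale that the known-geometry object supplies.

      import cv2

      def relative_pose(pts1, pts2, K):
          # pts1, pts2: Nx2 float32 arrays of corresponding image points.
          F, inliers = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC)
          E = K.T @ F @ K                    # essential matrix from F
          _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
          return R, t                        # t is a direction: scale unknown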

  22. Semi-automatic Segmentation for Prostate Interventions

    PubMed Central

    Mahdavi, S. Sara; Chng, Nick; Spadinger, Ingrid; Morris, William J.; Salcudean, Septimiu E.

    2011-01-01

    In this paper we report and characterize a semi-automatic prostate segmentation method for prostate brachytherapy. Based on anatomical evidence and requirements of the treatment procedure, a warped and tapered ellipsoid was found suitable as the a priori 3D shape of the prostate. By transforming the acquired endorectal transverse images of the prostate into ellipses, the shape fitting problem was cast into a convex problem which can be solved efficiently. The average whole gland error between volumes created from manual and semi-automatic contours from 21 patients was 6.63±0.9%. For use in brachytherapy treatment planning, the resulting contours were modified, if deemed necessary, by radiation oncologists prior to treatment. The average whole gland volume error between the volumes computed from semi-automatic contours and those computed from modified contours, from 40 patients, was 5.82±4.15%. The amount of bias in the physicians’ delineations when given an initial semi-automatic contour was measured by comparing the volume error between 10 prostate volumes computed from manual contours with those of modified contours. This error was found to be 7.25±0.39% for the whole gland. Automatic contouring reduced subjectivity, as evidenced by a decrease in segmentation inter- and intra-observer variability from 4.65% and 5.95% for manual segmentation to 3.04% and 3.48% for semi-automatic segmentation, respectively. We characterized the performance of the method relative to the reference obtained from manual segmentation by using a novel approach that divides the prostate region into nine sectors. We analyzed each sector independently as the requirements for segmentation accuracy depend on which region of the prostate is considered. The measured segmentation time is 14±1 seconds with an additional 32±14 seconds for initialization. By assuming 1–3 minutes for modification of the contours, if necessary, a total segmentation time of less than 4 minutes is required
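
    The shape prior above reduces each transverse contour to an ellipse fit, for which OpenCV's least-squares fit is a convenient stand-in (the paper's warped, tapered ellipsoid fitting is more involved). The contour points below are synthetic.

      import cv2
      import numpy as np

      theta = np.linspace(0, 2 * np.pi, 40, endpoint=False)
      contour = np.column_stack([120 + 50 * np.cos(theta),   # synthetic boundary
                                 100 + 35 * np.sin(theta)]).astype(np.float32)
      (cx, cy), (ax1, ax2), angle = cv2.fitEllipse(contour)
      print(f"center ({cx:.1f}, {cy:.1f}), axes ({ax1:.1f}, {ax2:.1f}), "
            f"angle {angle:.1f} deg")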

  23. Semi-automatic 3D segmentation of carotid lumen in contrast-enhanced computed tomography angiography images.

    PubMed

    Hemmati, Hamidreza; Kamli-Asl, Alireza; Talebpour, Alireza; Shirani, Shapour

    2015-12-01

    Atherosclerosis is one of the major causes of death in the world. It refers to the hardening and narrowing of the arteries by plaques. Carotid stenosis is a narrowing or constriction of the carotid artery lumen, usually caused by atherosclerosis, and can increase the risk of stroke. Contrast-enhanced Computed Tomography Angiography (CTA) is a minimally invasive method for imaging and quantification of carotid plaques. Manual segmentation of the carotid lumen in CTA images is a tedious and time-consuming procedure that is subject to observer variability. As a result, there is a strong and growing demand for computer-aided carotid segmentation procedures. In this study, a novel method is presented for carotid artery lumen segmentation in CTA data. First, mean shift smoothing is used to enhance the uniformity of the gray levels. Then, with the help of three seed points, the centerlines of the arteries are extracted by a 3D Hessian-based fast marching shortest-path algorithm. Finally, a 3D level set function is used for segmentation. Results on 14 CTA volumes show a Dice similarity of 85% and a mean absolute surface distance of 0.42 mm. Evaluation shows that the proposed method requires minimal user intervention, has low dependence on gray-level changes along the artery path, and is resistant to extreme changes in carotid diameter and to carotid branch locations. The proposed method has high accuracy and can be used in qualitative and quantitative evaluation. PMID:26429385

  24. High-Resolution, Semi-Automatic Fault Mapping Using Unmanned Aerial Vehicles and Computer Vision: Mapping from an Armchair

    NASA Astrophysics Data System (ADS)

    Micklethwaite, S.; Vasuki, Y.; Turner, D.; Kovesi, P.; Holden, E.; Lucieer, A.

    2012-12-01

    Our ability to characterise fractures depends upon the accuracy and precision of field techniques, as well as the quantity of data that can be collected. Unmanned Aerial Vehicles (UAVs; otherwise known as "drones") and photogrammetry provide exciting new opportunities for the accurate mapping of fracture networks over large surface areas. We use a highly stable, 8-rotor UAV platform (Oktokopter) with a digital SLR camera and the Structure-from-Motion computer vision technique to generate point clouds, wireframes, digital elevation models and orthorectified photo mosaics. Furthermore, new image analysis methods such as phase congruency are applied to the data to semi-automatically map fault networks. A case study is provided of intersecting fault networks and associated damage from Piccaninny Point in Tasmania, Australia. Outcrops >1 km in length can be surveyed in a single 5-10 minute flight, with pixel resolution ~1 cm. Centimetre-scale precision can be achieved when selected ground control points are measured using a total station. These techniques have the potential to provide rapid, ultra-high resolution mapping of fracture networks from many different lithologies, enabling us to more accurately assess the "fit" of observed data relative to model predictions over a wide range of boundary conditions. (Figure caption: high-resolution DEM of faulted outcrop (Piccaninny Point, Tasmania) generated using the Oktokopter UAV (inset) and photogrammetric techniques.)

  25. Investigating Helmet Promotion for Cyclists: Results from a Randomised Study with Observation of Behaviour, Using a Semi-Automatic Video System

    PubMed Central

    Constant, Aymery; Messiah, Antoine; Felonneau, Marie-Line; Lagarde, Emmanuel

    2012-01-01

    Introduction: Half of fatal injuries among bicyclists are head injuries. While helmet use is likely to provide protection, it often remains rare. We assessed the influence of strategies for the promotion of helmet use, with direct observation of behaviour by a semi-automatic video system. Methods: We performed a single-centre randomised controlled study with 4 balanced randomisation groups. Participants were non-helmet users, aged 18–75 years, recruited at a bicycle loan facility in the city of Bordeaux, France. After completing a questionnaire investigating their attitudes towards road safety and helmet use, participants were randomly assigned to three groups with the provision of "helmet only", "helmet and information" or "information only", and to a fourth control group. Bikes were labelled with a colour code designed to enable observation of helmet use by participants while cycling, using a 7-spot semi-automatic video system located in the city. A total of 1557 participants were included in the study. Results: Between October 15th 2009 and September 28th 2010, 2621 cyclists' movements, made by 587 participants, were captured by the video system. Participants seen at least once with a helmet amounted to 6.6% of all observed participants, with higher rates in the two groups that received a helmet at baseline. The likelihood of observed helmet use was significantly increased among participants of the "helmet only" group (OR = 7.73 [2.09–28.5]), and this impact faded within six months following the intervention. No effect of information delivery was found. Conclusion: Providing a helmet may be of value, but will not be sufficient to achieve high rates of helmet wearing among adult cyclists. Integrated and repeated prevention programmes will be needed, including free provision of helmets, but also information on the protective effect of helmets and strategies to increase peer and parental pressure. PMID:22355384
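
    For the effect size reported above, a hedged sketch of an odds-ratio calculation with a 95% confidence interval from a 2x2 table; the counts are hypothetical, chosen only to give an OR near the one reported, and are not the study's data.

      import math

      def odds_ratio_ci(a, b, c, d, z=1.96):
          # 2x2 table: a/b = helmet yes/no (intervention), c/d = control.
          or_ = (a * d) / (b * c)
          se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
          lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, 1))
          return or_, lo, hi

      print(odds_ratio_ci(24, 120, 3, 116))  # OR ~7.7 with a wide CI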

  26. A method for semi-automatic segmentation and evaluation of intracranial aneurysms in bone-subtraction computed tomography angiography (BSCTA) images

    NASA Astrophysics Data System (ADS)

    Krämer, Susanne; Ditt, Hendrik; Biermann, Christina; Lell, Michael; Keller, Jörg

    2009-02-01

    The rupture of an intracranial aneurysm has dramatic consequences for the patient. Hence early detection of unruptured aneurysms is of paramount importance. Bone-subtraction computed tomography angiography (BSCTA) has proven to be a powerful tool for the detection of aneurysms, in particular those located close to the skull base. Most aneurysms, though, are chance findings in BSCTA scans performed for other reasons. It is therefore highly desirable to have techniques operating on standard BSCTA scans which assist radiologists and surgeons in the evaluation of intracranial aneurysms. In this paper we present a semi-automatic method for the segmentation and assessment of intracranial aneurysms. The only user interaction required is the placement of a marker inside the vascular malformation. Termination ensues automatically as soon as the segmentation reaches the vessels which feed the aneurysm. The algorithm is derived from an adaptive region-growing which employs a growth gradient as its criterion for termination. Based on this segmentation, values of high clinical and prognostic significance, such as the volume, minimum and maximum diameters, and surface of the aneurysm, are calculated automatically. The segmentation itself as well as the calculated diameters are visualised. Further segmentation of the adjoining vessels provides the means for visualisation of the topographical situation of vascular structures associated with the aneurysm. A stereolithographic mesh (STL) can be derived from the surface of the segmented volume. The STL, together with parameters like the resiliency of vascular wall tissue, provides for an accurate wall model of the aneurysm and its associated vascular structures. Consequently the haemodynamic situation in the aneurysm itself and close to it can be assessed by flow modelling. Significant haemodynamic values such as the pressure on the vascular wall, wall shear stress or pathlines of the blood flow can be computed. Additionally a dynamic flow model can be
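
    The termination rule (stop when the growing region reaches the feeding vessels) can be caricatured as adaptive region growing that watches its own growth rate. The sketch below is an illustrative reading of that idea, not the paper's algorithm; all parameters are invented.

      import numpy as np
      from scipy import ndimage as ndi

      def grow_until_spurt(volume, seed, steps=40, spurt=3.0):
          # Grow from a user-placed marker while relaxing the intensity
          # threshold; stop on a sudden jump in gained voxels (the "growth
          # gradient"), taken here as a sign of leakage into feeding vessels.
          region = np.zeros(volume.shape, dtype=bool)
          region[seed] = True
          prev_gain = None
          for level in np.linspace(volume[seed], volume.min(), steps):
              grown = ndi.binary_dilation(region) & (volume >= level)
              gain = grown.sum() - region.sum()
              if prev_gain and gain > spurt * prev_gain:
                  break
              region, prev_gain = grown, max(int(gain), 1)
          return region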

  27. Semi-automatic approach for music classification

    NASA Astrophysics Data System (ADS)

    Zhang, Tong

    2003-11-01

    Audio categorization is essential when managing a music database, whether a professional library or a personal collection. However, fully automatic categorization of music into proper classes for browsing and searching is not yet supported by today's technology. Also, the issue of music classification is subjective to some extent, as each user may have his own criteria for categorizing music. In this paper, we propose the idea of semi-automatic music classification. With this approach, a music browsing system is set up which contains a set of tools for separating music into a number of broad types (e.g. male solo, female solo, string instruments performance, etc.) using existing music analysis methods. Using the results of the automatic process, the user may further cluster the music pieces in the database into finer classes and/or adjust misclassifications manually according to his own preferences and definitions. Such a system may greatly improve the efficiency of music browsing and retrieval, while at the same time guaranteeing accuracy and user satisfaction with the results. Since this semi-automatic system has two parts, i.e. the automatic part and the manual part, they are described separately in the paper, with detailed descriptions and examples of each step.

  28. Quantification of coronary artery plaque using 64-slice dual-source CT: comparison of semi-automatic and automatic computer-aided analysis based on intravascular ultrasonography as the gold standard.

    PubMed

    Kim, Young Jun; Jin, Gong Yong; Kim, Eun Young; Han, Young Min; Chae, Jei Keon; Lee, Sang Rok; Kwon, Keun Sang

    2013-12-01

    We evaluated the feasibility of automatic computer-aided analysis (CAA) compared with semi-automatic CAA for differentiating lipid-rich from fibrous plaques based on coronary CT angiography (CCTA) imaging. Seventy-four coronary plaques in 57 patients were evaluated by CCTA using 64-slice dual-source CT. Quantitative analysis of coronary artery plaques was performed by measuring the relative volumes (low, medium, and calcified) of plaque components using automatic CAA and by measuring mean CT density using semi-automatic CAA. We compared the two plaque measurement methods for lipid-rich and fibrous plaques using Pearson's correlation. Intravascular ultrasonography was used as the gold standard for the assessment of plaques. Mean CT density of plaques tended to increase in the order of lipid [36 ± 19 Hounsfield units (HU)], fibrous (106 ± 34 HU), and calcified plaques (882 ± 296 HU). The mean relative volumes of 'low' components measured by automatic CAA were 13.8 ± 4.6, 7.9 ± 6.7, and 3.5 ± 3.0 % for lipid, fibrous, and calcified plaques, respectively (r = -0.348, P = 0.022). The mean relative volumes of 'medium' components on automatic CAA were 12.9 ± 4.1, 15.7 ± 9.6, and 5.6 ± 4.8 % for lipid, fibrous, and calcified plaques, respectively (r = -0.385, P = 0.011). The mean relative volumes of the low and medium components within plaques correlated significantly with the plaque type. Plaque analysis using automatic CAA has the potential to differentiate lipid from fibrous plaques based on measurement of the relative volume percentages of the low and medium components. PMID:24293043
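
    To make the "relative volume" idea concrete, here is a toy sketch that bins plaque voxels by CT density and reports component percentages. The HU cut-offs are illustrative only; the paper reports mean densities per plaque type, not these bin edges.

      import numpy as np

      def component_fractions(plaque_hu):
          hu = np.asarray(plaque_hu, dtype=float)
          low = (hu < 60).mean() * 100                     # lipid-like voxels
          medium = ((hu >= 60) & (hu < 150)).mean() * 100  # fibrous-like voxels
          calcified = (hu >= 150).mean() * 100
          return low, medium, calcified

      rng = np.random.default_rng(0)
      fibrous_like = rng.normal(106, 34, size=5000)        # synthetic plaque HU
      print(component_fractions(fibrous_like))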

  29. Anomalous ECG downloads from semi-automatic external defibrillators.

    PubMed

    Calle, P A; Vanhaute, O; Ranhoff, J F; Buylaert, W A

    1998-08-01

    The coincidental print-out by two different Laerdal systems (subsequently called 'system A' and 'system B') of the same medical control module (MCM) for a Laerdal Heartstart 2000 semi-automatic external defibrillator (SAED) led to the discovery of three deficiencies in the information storage and printing processes. First, we noted that the impedance reported via system A was consistently higher. Second, we found 'mysterious' ECG samples attached to the reports from system B, but not from system A. A third problem was the unpredictable (in)ability of system B to print out the information from the MCMs. Further investigations, with help from the company, suggested that the above-mentioned problems were caused by incompatibilities between the software in the different pieces of equipment used (i.e. SAED devices, MCMs, printing systems and a computer program to store the information in a database). These observations demonstrate the need for strict medical supervision of all aspects of a SAED project, and for feedback from clinicians to manufacturers. PMID:9863574

  30. A Semi-Automatic Variability Search

    NASA Astrophysics Data System (ADS)

    Maciejewski, G.; Niedzielski, A.

    Technical features of the Semi-Automatic Variability Search (SAVS) operating at the Astronomical Observatory of the Nicolaus Copernicus University and the results of the first year of observations are presented. The user-friendly software developed for reduction of acquired CCD images and detection of new variable stars is also described.

  31. Semi-automatic object geometry estimation for image personalization

    NASA Astrophysics Data System (ADS)

    Ding, Hengzhou; Bala, Raja; Fan, Zhigang; Eschbach, Reiner; Bouman, Charles A.; Allebach, Jan P.

    2010-01-01

    Digital printing brings about a host of benefits, one of which is the ability to create short runs of variable, customized content. One form of customization that is receiving much attention lately is in photofinishing applications, whereby personalized calendars, greeting cards, and photo books are created by inserting text strings into images. It is particularly interesting to estimate the underlying geometry of the surface and incorporate the text into the image content in an intelligent and natural way. Current solutions either allow fixed text insertion schemes into preprocessed images, or provide manual text insertion tools that are time-consuming and aimed only at the high-end graphic designer. It would thus be desirable to provide some level of automation in the image personalization process. We propose a semi-automatic image personalization workflow which includes two scenarios: text insertion and text replacement. In both scenarios, the underlying surfaces are assumed to be planar. A 3-D pinhole camera model is used for rendering text, whose parameters are estimated by analyzing existing structures in the image. Techniques in image processing and computer vision such as the Hough transform, the bilateral filter, and connected component analysis are combined, along with necessary user inputs. In particular, the semi-automatic workflow is implemented as an image personalization tool, which is presented in our companion paper [1]. Experimental results including personalized images for both scenarios are shown, which demonstrate the effectiveness of our algorithms.

  32. Semi-Automatic Digital Landform Mapping

    NASA Astrophysics Data System (ADS)

    Schneider, Martin; Klein, Reinhard

    In this paper a framework for landform mapping on digital aerial photos and elevation models is presented. The developed mapping tools are integrated into a real-time terrain visualization engine in order to improve the visual recovery and identification of objects. Moreover, semi-automatic image segmentation techniques are built into the mapping tools to make object specification faster and easier without reducing accuracy. Thus, the high-level cognitive task of object identification is left to the user, whereas the segmentation algorithm performs the low-level task of capturing the fine details of the object boundary. In addition, the user is able to supply additional photos of regions of interest and to match them with the textured DEM. The matched photos not only drastically increase the visual information content of the data set but also contribute to the mapping process. Using this additional information, precise landform mapping becomes possible even on steep slopes, although these are insufficiently represented in aerial imagery. As proof of concept we mapped several geomorphological structures in a high alpine valley.

  33. Semi-automatic knee cartilage segmentation

    NASA Astrophysics Data System (ADS)

    Dam, Erik B.; Folkesson, Jenny; Pettersen, Paola C.; Christiansen, Claus

    2006-03-01

    Osteoarthritis (OA) is a very common age-related cause of pain and reduced range of motion. A central effect of OA is the wearing down of the articular cartilage that otherwise ensures smooth joint motion. Quantification of the cartilage breakdown is central to monitoring disease progression, and therefore cartilage segmentation is required. Recent advances allow automatic cartilage segmentation with high accuracy in most cases. However, the automatic methods still fail in some problematic cases. For clinical studies, even if a few failing cases will be averaged out in the overall results, this reduces the mean accuracy and precision and thereby necessitates larger/longer studies. Since the severe OA cases are often the most problematic for the automatic methods, there is even a risk that the quantification will introduce a bias into the results. Therefore, interactive inspection and correction of these problematic cases is desirable. For diagnosis of individuals, this is even more crucial, since the diagnosis will otherwise simply fail. We introduce and evaluate a semi-automatic cartilage segmentation method combining an automatic pre-segmentation with an interactive step that allows inspection and correction. The automatic step consists of voxel classification based on supervised learning. The interactive step combines a watershed transformation of the original scan with the posterior probability map from the classification step at sub-voxel precision. We evaluate the method on the task of segmenting the tibial cartilage sheet from low-field magnetic resonance imaging (MRI) of knees. The evaluation shows that the combined method allows accurate and highly reproducible correction of the segmentation of even the worst cases in approximately ten minutes of interaction.

  34. Performance testing of a semi-automatic card punch system, using direct STR profiling of DNA from blood samples on FTA™ cards.

    PubMed

    Ogden, Samantha J; Horton, Jeffrey K; Stubbs, Simon L; Tatnell, Peter J

    2015-01-01

    The 1.2 mm Electric Coring Tool (e-Core™) was developed to increase the throughput of FTA™ sample collection cards used in forensic workflows and is similar to a 1.2 mm Harris manual micro-punch for sampling dried blood spots. Direct short tandem repeat (STR) DNA profiling was used to compare samples taken by the e-Core tool with those taken by the manual micro-punch. The performance of the e-Core device was evaluated using a commercially available PowerPlex™ 18D STR System. In addition, an analysis was performed that investigated the potential carryover of DNA via the e-Core punch from one FTA disc to another. This contamination study was carried out using Applied Biosystems AmpFlSTR™ Identifiler™ Direct PCR Amplification kits. The e-Core instrument does not contaminate FTA discs when a cleaning punch is used following the excision of discs containing samples, and it generates STR profiles that are comparable to those generated by the manual micro-punch. PMID:25407399

  15. A semi-automatic annotation tool for cooking video

    NASA Astrophysics Data System (ADS)

    Bianco, Simone; Ciocca, Gianluigi; Napoletano, Paolo; Schettini, Raimondo; Margherita, Roberto; Marini, Gianluca; Gianforme, Giorgio; Pantaleo, Giuseppe

    2013-03-01

    In order to create a cooking assistant application to guide users in the preparation of dishes relevant to their profile diets and food preferences, it is necessary to accurately annotate the video recipes, identifying and tracking the foods handled by the cook. These videos present particular annotation challenges such as frequent occlusions, food appearance changes, etc. Manually annotating the videos is a time-consuming, tedious and error-prone task. Fully automatic tools that integrate computer vision algorithms to extract and identify the elements of interest are not error free, and false positive and false negative detections need to be corrected in a post-processing stage. We present an interactive, semi-automatic tool for the annotation of cooking videos that integrates computer vision techniques under the supervision of the user. The annotation accuracy is increased with respect to completely automatic tools and the human effort is reduced with respect to completely manual ones. The performance and usability of the proposed tool are evaluated on the basis of the time and effort required to annotate the same video sequences.

  16. Semi-automatic customization of internal fracture fixation plates.

    PubMed

    Musuvathy, Suraj; Azernikov, Sergei; Fang, Tong

    2011-01-01

    A new method for the customization of fixation plates for repairing bone fractures is proposed. Digital models of plates are typically available as CAD models that contain smooth analytic geometry representations, including NURBS. With existing pre-operative planning solutions, these models are converted to polygonal meshes and adapted manually to the patient's bone geometry by the user. Based on the deformed model, physical bending is then performed by the surgeon in the operating room. With the proposed approach, CAD models are semi-automatically adapted using NURBS to generate customized plates that conform to the desired region of the patient's bone surface. This enables an efficient and accurate approach that is also computationally suitable for interactive planning applications. Moreover, the patient-specific customized plates can then be produced directly from the adapted CAD models with a standard CNC machine before surgery. This may dramatically reduce time spent in the OR, improve the precision of the procedure and, as a result, improve the patient's outcome. PMID:22254380

  17. Semi-automatic recognition of marine debris on beaches

    PubMed Central

    Ge, Zhenpeng; Shi, Huahong; Mei, Xuefei; Dai, Zhijun; Li, Daoji

    2016-01-01

    An increasing amount of anthropogenic marine debris is pervading the earth’s environmental systems, resulting in an enormous threat to living organisms. Additionally, the large amount of marine debris around the world has been investigated mostly through tedious manual methods. Therefore, we propose the use of a new technique, light detection and ranging (LIDAR), for the semi-automatic recognition of marine debris on a beach because of its substantially more efficient role in comparison with other more laborious methods. Our results revealed that LIDAR should be used for the classification of marine debris into plastic, paper, cloth and metal. Additionally, we reconstructed a 3-dimensional model of different types of debris on a beach with a high validity of debris revivification using LIDAR-based individual separation. These findings demonstrate that the availability of this new technique enables detailed observations to be made of debris on a large beach that was previously not possible. It is strongly suggested that LIDAR could be implemented as an appropriate monitoring tool for marine debris by global researchers and governments. PMID:27156433

  18. Semi-automatic development of Payload Operations Control Center software

    NASA Technical Reports Server (NTRS)

    Ballin, Sidney

    1988-01-01

    This report summarizes the current status of CTA's investigation of methods and tools for automating the software development process in NASA Goddard Space Flight Center, Code 500. The emphasis in this effort has been on methods and tools in support of software reuse. The most recent phase of the effort has been a domain analysis of Payload Operations Control Center (POCC) software. This report summarizes the results of the domain analysis, and proposes an approach to semi-automatic development of POCC Application Processor (AP) software based on these results. The domain analysis enabled us to abstract, from specific systems, the typical components of a POCC AP. We were also able to identify patterns in the way one AP might be different from another. These two perspectives (aspects that tend to change from AP to AP, and aspects that tend to remain the same) suggest an overall approach to the reuse of POCC AP software. We found that different parts of an AP require different development technologies. We propose a hybrid approach that combines constructive and generative technologies. Constructive methods emphasize the assembly of pre-defined reusable components. Generative methods provide for automated generation of software from specifications in a very-high-level language (VHLL).

  19. Semi-automatic recognition of marine debris on beaches

    NASA Astrophysics Data System (ADS)

    Ge, Zhenpeng; Shi, Huahong; Mei, Xuefei; Dai, Zhijun; Li, Daoji

    2016-05-01

    An increasing amount of anthropogenic marine debris is pervading the earth’s environmental systems, resulting in an enormous threat to living organisms. Additionally, the large amount of marine debris around the world has been investigated mostly through tedious manual methods. Therefore, we propose the use of a new technique, light detection and ranging (LIDAR), for the semi-automatic recognition of marine debris on a beach because of its substantially more efficient role in comparison with other more laborious methods. Our results revealed that LIDAR should be used for the classification of marine debris into plastic, paper, cloth and metal. Additionally, we reconstructed a 3-dimensional model of different types of debris on a beach with a high validity of debris revivification using LIDAR-based individual separation. These findings demonstrate that the availability of this new technique enables detailed observations to be made of debris on a large beach that was previously not possible. It is strongly suggested that LIDAR could be implemented as an appropriate monitoring tool for marine debris by global researchers and governments.

  20. Semi-automatic recognition of marine debris on beaches.

    PubMed

    Ge, Zhenpeng; Shi, Huahong; Mei, Xuefei; Dai, Zhijun; Li, Daoji

    2016-01-01

    An increasing amount of anthropogenic marine debris is pervading the earth's environmental systems, resulting in an enormous threat to living organisms. Additionally, the large amount of marine debris around the world has been investigated mostly through tedious manual methods. Therefore, we propose the use of a new technique, light detection and ranging (LIDAR), for the semi-automatic recognition of marine debris on a beach because of its substantially more efficient role in comparison with other more laborious methods. Our results revealed that LIDAR should be used for the classification of marine debris into plastic, paper, cloth and metal. Additionally, we reconstructed a 3-dimensional model of different types of debris on a beach with a high validity of debris revivification using LIDAR-based individual separation. These findings demonstrate that the availability of this new technique enables detailed observations to be made of debris on a large beach that was previously not possible. It is strongly suggested that LIDAR could be implemented as an appropriate monitoring tool for marine debris by global researchers and governments. PMID:27156433

  1. A dorsolateral prefrontal cortex semi-automatic segmenter

    NASA Astrophysics Data System (ADS)

    Al-Hakim, Ramsey; Fallon, James; Nain, Delphine; Melonakos, John; Tannenbaum, Allen

    2006-03-01

    Structural, functional, and clinical studies in schizophrenia have, for several decades, consistently implicated dysfunction of the prefrontal cortex in the etiology of the disease. Functional and structural imaging studies, combined with clinical, psychometric, and genetic analyses in schizophrenia, have confirmed the key roles played by the prefrontal cortex and closely linked "prefrontal system" structures such as the striatum, amygdala, mediodorsal thalamus, substantia nigra-ventral tegmental area, and anterior cingulate cortices. The nodal structure of the prefrontal system circuit is the dorsal lateral prefrontal cortex (DLPFC), or Brodmann area 46, which also appears to be the most commonly studied and cited brain area with respect to schizophrenia. 1, 2, 3, 4 In 1986, Weinberger et al. tied cerebral blood flow in the DLPFC to schizophrenia.1 In 2001, Perlstein et al. demonstrated that DLPFC activation is essential for working memory tasks commonly deficient in schizophrenia. 2 More recently, groups have linked morphological changes due to gene deletion and increased DLPFC glutamate concentration to schizophrenia. 3, 4 Despite the experimental and clinical focus on the DLPFC in structural and functional imaging, the variability of the location of this area, differences in opinion on exactly what constitutes the DLPFC, and inherent difficulties in segmenting this highly convoluted cortical region have contributed to a lack of widely used standards for manual or semi-automated segmentation programs. Given these implications, we developed a semi-automatic tool to segment the DLPFC from brain MRI scans in a reproducible way to conduct further morphological and statistical studies. The segmenter is based on expert neuroanatomist rules (Fallon-Kindermann rules), inspired by cytoarchitectonic data and reconstructions presented by Rajkowska and Goldman-Rakic. 5 It is semi-automated to provide essential user interactivity. We present our results and provide details on

  2. Semi-automatic procedure to extract Couinaud liver segments from multislice CT data

    NASA Astrophysics Data System (ADS)

    Varma, Jay; Durgan, Jacob; Subramanyan, Krishna

    2003-05-01

    Liver resection and transplantation surgeries require careful planning and accurate knowledge of the vascular and gross anatomy of the liver. This study aims to create a semi-automatic method for segmenting the liver, along with its entire venous vessel tree, from multi-detector computed tomograms. Using fast marching and region-growth techniques along with morphological operations, we have developed a software package which can isolate the liver and the hepatic venous network from a user-selected seed point. The user is then presented with a volumetric analysis of the liver and a 3-dimensional surface rendering. Software tools then allow the user to analyze the lobes of the liver based upon venous anatomy, as defined by Couinaud. The software package also has utilities for data management, key image specification, commenting, and reporting. Seven patients were scanned with contrast on the Mx8000 CT scanner (Philips Medical Systems); the data were analyzed using our method and compared with results found using a manual method. The results show that the semi-automated method requires less time than manual methods, with results that are consistent and similar. Also, display of the venous network along with the entire liver in three dimensions is a unique feature of this software.
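    As a rough sketch of the seed-driven isolation step, the snippet below approximates the fast-marching/region-growth combination with a simple tolerance-based flood fill from one user-selected seed; the CT volume, seed, tolerance and voxel size are all illustrative assumptions, not the package's actual parameters.

```python
# Minimal sketch: grow a liver mask from a user-selected seed point.
import numpy as np
from skimage.segmentation import flood

rng = np.random.default_rng(10)
ct = rng.normal(60, 10, size=(40, 128, 128))    # synthetic HU values
ct[10:30, 30:90, 30:90] += 40                   # brighter "liver" block

seed = (20, 60, 60)                             # user-selected point inside the liver
liver = flood(ct, seed, tolerance=25)           # grow while HU stays close to the seed

# Volumetric analysis: mask voxels times a hypothetical voxel volume.
voxel_ml = 0.7 * 0.7 * 5.0 / 1000.0
liver_volume_ml = liver.sum() * voxel_ml
```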

  3. A semi-automatic measuring machine

    NASA Technical Reports Server (NTRS)

    Strand, K. A.

    1971-01-01

    A machine designed to locate automatically a star image on a photographic plate from approximate coordinates on a punched card and to measure and record its position to a micron is described. The main frame of the machine is of granite, as are the x and y coordinate carriages which move on air bearings against granite ways. The system is capable of making measurements continuously over a 10 x 10 inch range by means of a Ferranti moire fringe system, with a least count of one micron.

  4. Linking IT-Based Semi-Automatic Marking of Student Mathematics Responses and Meaningful Feedback to Pedagogical Objectives

    ERIC Educational Resources Information Center

    Wong, Khoon Yoong; Oh, Kwang-Shin; Ng, Qiu Ting Yvonne; Cheong, Jim Siew Kuan

    2012-01-01

    The purposes of an online system to auto-mark students' responses to mathematics test items are to expedite the marking process, to enhance consistency in marking and to alleviate teacher assessment workload. We propose that a semi-automatic marking and customizable feedback system better serves pedagogical objectives than a fully automatic one.…

  5. Semi-Automatic Determination of Rockfall Trajectories

    PubMed Central

    Volkwein, Axel; Klette, Johannes

    2014-01-01

    Determining rockfall trajectories in the field is essential for calibrating and validating rockfall simulation software. This contribution presents an in situ device and a complementary Local Positioning System (LPS) that allow the determination of parts of the trajectory. An assembly of sensors (herein called the rockfall sensor) is installed in the falling block, recording the 3D accelerations and rotational velocities. The LPS automatically calculates the position of the block along the slope over time based on Wi-Fi signals emitted from the rockfall sensor. The velocity of the block over time is determined through post-processing. The setup of the rockfall sensor is presented, followed by proposed calibration and validation procedures. The performance of the LPS is evaluated by means of different experiments. The results allow for a quality analysis of both the obtained field data and the usability of the rockfall sensor for future applications in the field. PMID:25268916

  6. User Interaction in Semi-Automatic Segmentation of Organs at Risk: a Case Study in Radiotherapy.

    PubMed

    Ramkumar, Anjana; Dolz, Jose; Kirisli, Hortense A; Adebahr, Sonja; Schimek-Jasch, Tanja; Nestle, Ursula; Massoptier, Laurent; Varga, Edit; Stappers, Pieter Jan; Niessen, Wiro J; Song, Yu

    2016-04-01

    Accurate segmentation of organs at risk is an important step in radiotherapy planning. Because manual segmentation is a tedious procedure prone to inter- and intra-observer variability, there is growing interest in automated segmentation methods. However, automatic methods frequently fail to provide satisfactory results, and post-processing corrections are often needed. Semi-automatic segmentation methods are designed to overcome these problems by combining physicians' expertise and computers' potential. This study evaluates two semi-automatic segmentation methods with different types of user interaction, named "strokes" and "contour", to provide insights into the role and impact of human-computer interaction. Two physicians participated in the experiment. In total, 42 case studies were carried out on five different types of organs at risk. For each case study, both the human-computer interaction process and the quality of the segmentation results were measured subjectively and objectively. Furthermore, different measures of the process and the results were correlated. A total of 36 quantifiable and ten non-quantifiable correlations were identified for each type of interaction. Among those pairs of measures, 20 for the contour method and 22 for the strokes method were strongly or moderately correlated, either directly or inversely. Based on those correlated measures, it is concluded that: (1) in the design of semi-automatic segmentation methods, user interactions need to be less cognitively challenging; (2) based on the observed workflows and preferences of physicians, there is a need for flexibility in the interface design; (3) the correlated measures provide insights that can be used in improving user interaction design. PMID:26553109

  7. Semi-automatic coding with ICPC: the Thesaurus, the algorithm and the Dutch subtitles.

    PubMed

    Gebel, R S

    1997-01-01

    In the ICPC Thesaurus Project, which ran from 1990 to 1992, the Dutch translation of the English version of the ICPC-components 1 and 7 was made available for automated coding by structuring and improving the thesaurus and by developing an algorithm for selecting possible ICPC-codes from a set of medical terms given as input to the program. The thesaurus and algorithm are available to the developers of GP information systems and are at present incorporated in all Dutch GP-systems. This paper brings you up to date with the semi-automatic coding system and the so called Dutch subtitles, an extension to the ICPC. PMID:10179584

  8. Semi Automatic Ontology Instantiation in the domain of Risk Management

    NASA Astrophysics Data System (ADS)

    Makki, Jawad; Alquier, Anne-Marie; Prince, Violaine

    One of the challenging tasks in the context of Ontological Engineering is to automatically or semi-automatically support the process of Ontology Learning and Ontology Population from semi-structured documents (texts). In this paper we describe a Semi-Automatic Ontology Instantiation method from natural language text, in the domain of Risk Management. The method is composed of three steps: 1) annotation with part-of-speech tags; 2) extraction of semantic relation instances; 3) ontology instantiation. It is based on combined NLP techniques, with human intervention between steps 2 and 3 for control and validation. Since it relies heavily on linguistic knowledge, it is not domain dependent, which is a good feature for portability between the different fields of risk management application. The proposed methodology uses the ontology of the PRIMA1 project (supported by the European community) as a Generic Domain Ontology and populates it via an available corpus. A first validation of the approach is done through an experiment with Chemical Fact Sheets from the Environmental Protection Agency2.

  9. Semi-automatic determination of lead in whole blood.

    PubMed

    Delves, H T; Vinter, P

    1966-09-01

    The procedure developed by Browett and Moss (1964) for the semi-automatic determination of the lead content of urine has been adapted for the determination of lead in blood. Determinations are normally carried out in duplicate on 2.0 ml. samples of whole blood and the minimum sample size is 0.5 ml. The organic substances present in blood are destroyed by a manual wet-oxidation procedure and the lead is determined colorimetrically as lead dithizonate using a Technicon AutoAnalyzer. The lower limit of detection, expressed as three times the standard deviation of the blank value, is 5 μg. Pb/100 ml. blood. The standard deviation of the method in the upper range of normal blood lead level of 30 μg. Pb/100 ml. blood (Moncrieff, Koumides, Clayton, Patrick, Renwick, and Roberts, 1964), is ± 3 μg. Pb/100 ml. blood. Ten samples per hour may be estimated in duplicate. PMID:5919367

  10. A semi-automatic multi-view depth estimation method

    NASA Astrophysics Data System (ADS)

    Wildeboer, Meindert Onno; Fukushima, Norishige; Yendo, Tomohiro; Panahpour Tehrani, Mehrdad; Fujii, Toshiaki; Tanimoto, Masayuki

    2010-07-01

    In this paper, we propose a semi-automatic depth estimation algorithm whereby the user defines object depth boundaries and disparity initialization. Automatic depth estimation methods generally have difficulty obtaining good depth results around object edges and in areas with low texture. The goal of our method is to improve the depth in these areas and reduce view synthesis artifacts in Depth Image Based Rendering. Good view synthesis quality is very important in applications such as 3DTV and Free-viewpoint Television (FTV). In our proposed method, initial disparity values for smooth areas can be input through a so-called manual disparity map, and depth boundaries are defined by a manually created edge map which can be supplied for one or multiple frames. For evaluation we used MPEG multi-view videos, and we demonstrate that our algorithm can significantly improve the depth maps and reduce view synthesis artifacts.

  11. Semi-automatic inspecting instrument for watch escape wheel based on machine vision

    NASA Astrophysics Data System (ADS)

    Wang, Zhong; Wang, Zhen-wei; Zhang, Jin; Cai, Zhen-xing; Liu, Xin-bo

    2011-12-01

    The escape wheel, a typical precision micro-machined part, is one of the most precise components in a mechanical watch. This paper introduces a new machine-vision instrument for the semi-automatic inspection of watch escape wheels. The instrument uses a high-resolution CCD sensor and an independently designed lens as the imaging system. It images an area 7 mm in diameter in a single exposure, resolves features at the micrometer level, and cooperates with a two-dimensional moving stage to measure work pieces placed in an array continuously and automatically. The following aspects are highlighted: the measuring principle and process, the basic components of the array-type measuring workbench, the positioning process, and the mechanisms for adjusting verticality, parallelism and other precision parameters. Together with a novel escape-wheel preparation tool, the instrument forms an array-type semi-automatic measuring mode. At present, the instrument is running successfully in an industrial setting.

  12. Semi-automatic parcellation of the corpus striatum

    NASA Astrophysics Data System (ADS)

    Al-Hakim, Ramsey; Nain, Delphine; Levitt, James; Shenton, Martha; Tannenbaum, Allen

    2007-03-01

    The striatum is the input component of the basal ganglia from the cerebral cortex. It includes the caudate, putamen, and nucleus accumbens. Thus, the striatum is an important component in limbic frontal-subcortical circuitry and is believed to be relevant both for reward-guided behaviors and for the expression of psychosis. The dorsal striatum is composed of the caudate and putamen, both of which are further subdivided into pre- and post-commissural components. The ventral striatum (VS) is primarily composed of the nucleus accumbens. The striatum can be functionally divided into three broad regions: 1) a limbic, 2) a cognitive, and 3) a sensorimotor region. The approximate corresponding anatomic subregions for these three functional regions are: 1) the VS; 2) the pre/post-commissural caudate and the pre-commissural putamen; and 3) the post-commissural putamen. We believe assessing these subregions separately, in disorders with limbic and cognitive impairment such as schizophrenia, may yield more informative group differences in comparison with normal controls than prior parcellation strategies of the striatum, such as assessing the caudate and putamen. The manual parcellation of the striatum into these subregions is currently defined using certain landmark points and geometric rules. Since identification of these areas is important to clinical research, a reliable and fast parcellation technique is required. Currently, only full manual parcellation using editing software is available; however, this technique is extremely time intensive. Previous work has shown successful incorporation of heuristic rules into a semi-automatic platform.1 We present here a semi-automatic algorithm which implements the rules currently used for manual parcellation of the striatum, but requires minimal user input and significantly reduces the time required for parcellation.

  13. A semi-automatic method for positioning a femoral bone reconstruction for strict view generation.

    PubMed

    Milano, Federico; Ritacco, Lucas; Gomez, Adrian; Gonzalez Bernaldo de Quiros, Fernan; Risk, Marcelo

    2010-01-01

    In this paper we present a semi-automatic method for femoral bone positioning after 3D image reconstruction from Computed Tomography images. This serves as grounding for the definition of strict axial, longitudinal and anterior-posterior views, overcoming the problem of patient positioning biases in 2D femoral bone measuring methods. After the bone reconstruction is aligned to a standard reference frame, new tomographic slices can be generated, on which unbiased measures may be taken. This could allow not only accurate inter-patient comparisons but also intra-patient comparisons, i.e., comparisons of images of the same patient taken at different times. This method could enable medical doctors to diagnose and follow up several bone deformities more easily. PMID:21096490

  14. A semi-automatic method for peak and valley detection in free-breathing respiratory waveforms

    SciTech Connect

    Lu Wei; Nystrom, Michelle M.; Parikh, Parag J.; Fooshee, David R.; Hubenschmidt, James P.; Bradley, Jeffrey D.; Low, Daniel A.

    2006-10-15

    The existing commercial software often inadequately determines respiratory peaks for patients in respiration-correlated computed tomography. A semi-automatic method was developed for peak and valley detection in free-breathing respiratory waveforms. First, the waveform is separated into breath cycles by identifying intercepts of a moving average curve with the inspiration and expiration branches of the waveform. Peaks and valleys are then defined, respectively, as the maximum and minimum between pairs of alternating inspiration and expiration intercepts. Finally, automatic corrections and manual user interventions are employed. On average for each of the 20 patients, 99% of 307 peaks and valleys were automatically detected in 2.8 s. This method was robust for bellows waveforms with large variations.
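    A minimal sketch of the intercept scheme described above, assuming a 1-D free-breathing trace sampled at 30 Hz; the moving-average window and the synthetic waveform are illustrative, and the automatic-correction and manual-intervention stages are omitted.

```python
# Minimal sketch: moving-average intercepts split the trace into breath
# cycles; peaks/valleys are extrema between alternating intercepts.
import numpy as np

def detect_peaks_valleys(signal, window=90):
    kernel = np.ones(window) / window
    baseline = np.convolve(signal, kernel, mode="same")
    above = signal > baseline
    # Indices where the trace crosses the moving-average curve.
    crossings = np.flatnonzero(np.diff(above.astype(int)) != 0)
    peaks, valleys = [], []
    for lo, hi in zip(crossings[:-1], crossings[1:]):
        segment = signal[lo:hi + 1]
        if above[lo + 1]:                   # inspiration branch -> peak
            peaks.append(lo + int(np.argmax(segment)))
        else:                               # expiration branch -> valley
            valleys.append(lo + int(np.argmin(segment)))
    return np.array(peaks), np.array(valleys)

# Synthetic 0.25 Hz breathing trace, 60 s at 30 Hz, with mild noise.
t = np.linspace(0, 60, 1800)
trace = np.sin(2 * np.pi * 0.25 * t) + 0.1 * np.random.default_rng(1).normal(size=t.size)
peaks, valleys = detect_peaks_valleys(trace)
```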

  15. Semi-automatic conversion of BioProp semantic annotation to PASBio annotation

    PubMed Central

    Tsai, Richard Tzong-Han; Dai, Hong-Jie; Huang, Chi-Hsin; Hsu, Wen-Lian

    2008-01-01

    Background Semantic role labeling (SRL) is an important text analysis technique. In SRL, sentences are represented by one or more predicate-argument structures (PAS). Each PAS is composed of a predicate (verb) and several arguments (noun phrases, adverbial phrases, etc.) with different semantic roles, including main arguments (agent or patient) as well as adjunct arguments (time, manner, or location). PropBank is the most widely used PAS corpus and annotation format in the newswire domain. In the biomedical field, however, more detailed and restrictive PAS annotation formats such as PASBio are popular. Unfortunately, due to the lack of an annotated PASBio corpus, no publicly available machine-learning (ML) based SRL systems based on PASBio have been developed. In previous work, we constructed a biomedical corpus based on the PropBank standard called BioProp, on which we developed an ML-based SRL system, BIOSMILE. In this paper, we aim to build a system to convert BIOSMILE's BioProp annotation output to PASBio annotation. Our system consists of BIOSMILE in combination with a BioProp-PASBio rule-based converter, and an additional semi-automatic rule generator. Results Our first experiment evaluated our rule-based converter's performance independently of BIOSMILE's performance. The converter achieved an F-score of 85.29%. The second experiment evaluated the combined system (BIOSMILE + rule-based converter). The system achieved an F-score of 69.08% for PASBio's 29 verbs. Conclusion Our approach allows PAS conversion between BioProp and PASBio annotation using BIOSMILE alongside our newly developed semi-automatic rule generator and rule-based converter. Our system can match the performance of other state-of-the-art domain-specific ML-based SRL systems and can be easily customized for PASBio application development. PMID:19091017

  16. Semi-Automatic Road/Pavement Modeling using Mobile Laser Scanning

    NASA Astrophysics Data System (ADS)

    Hervieu, A.; Soheilian, B.

    2013-10-01

    Scene analysis, in urban environments, deals with street modeling and understanding. A street mainly consists of roadways, pavements (i.e., walking areas), facades, and still and moving obstacles. In this paper, we investigate the surface modeling of roadways and pavements using LIDAR data acquired by a mobile laser scanning (MLS) system. First, road border detection is considered. A system recognizing curbs and curb ramps while reconstructing the missing information in case of occlusion is presented. A user interface scheme is also described, providing an effective tool for semi-automatic processing of large amounts of data. Then, based upon road edge information, a process that reconstructs the surfaces of roads and pavements has been developed, providing centimetric precision while reconstructing missing information. The system hence provides important knowledge of the street that may open perspectives in various domains such as path planning or road maintenance.

  17. Semi-automatic determination of lead in whole blood

    PubMed Central

    Delves, H. T.; Vinter, P.

    1966-01-01

    The procedure developed by Browett and Moss (1964) for the semi-automatic determination of the lead content of urine has been adapted for the determination of lead in blood. Determinations are normally carried out in duplicate on 2.0 ml. samples of whole blood and the minimum sample size is 0.5 ml. The organic substances present in blood are destroyed by a manual wet-oxidation procedure and the lead is determined colorimetrically as lead dithizonate using a Technicon AutoAnalyzer. The lower limit of detection, expressed as three times the standard deviation of the blank value, is 5 μg. Pb/100 ml. blood. The standard deviation of the method in the upper range of normal blood lead level of 30 μg. Pb/100 ml. blood (Moncrieff, Koumides, Clayton, Patrick, Renwick, and Roberts, 1964), is ± 3 μg. Pb/100 ml. blood. Ten samples per hour may be estimated in duplicate. Images PMID:5919367

  18. A New Approach for the Semi-Automatic Texture Generation of the Buildings Facades, from Terrestrial Laser Scanner Data

    NASA Astrophysics Data System (ADS)

    Oniga, E.

    2012-07-01

    The result of terrestrial laser scanning is an impressive number of spatial points, each characterized by its X, Y and Z co-ordinates, by the value of the laser reflectance, and by its real color, expressed as RGB (Red, Green, Blue) values. The color code for each LIDAR point is taken from the georeferenced digital images acquired with a high-resolution panoramic camera incorporated in the scanner system. In this article I propose a new algorithm for semi-automatic texture generation, using the color information, i.e. the RGB values of every point acquired by terrestrial laser scanning, and the 3D surfaces defining the building facades, generated with the Leica Cyclone software. In the first step, the operator defines the limiting value, i.e. the minimum distance between a point and the closest surface. The second step consists in calculating the distances, i.e. the perpendiculars drawn from each point to the closest surface. In the third step, the points whose 3D coordinates are known are associated with the closest surface, depending on the limiting value. The fourth step consists in computing the Voronoi diagram for the points that belong to a surface. The final step automatically associates the RGB value of the color code with the corresponding polygon of the Voronoi diagram. The advantage of this algorithm is that a photorealistic 3D model of the building can be obtained in a semi-automatic manner.
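    The five steps can be condensed into a short script for a single planar facade patch. The sketch below assumes the facade has already been rotated into the XY plane, so the point-to-surface distance reduces to |z|; all arrays are stand-ins for real scan data and the limiting value is arbitrary.

```python
# Condensed sketch of steps 1-5 for one planar facade patch.
import numpy as np
from scipy.spatial import Voronoi

rng = np.random.default_rng(2)
points = rng.uniform(0, 10, size=(200, 3))     # scanned points near the facade
rgb = rng.integers(0, 256, size=(200, 3))      # per-point color from the camera

# Steps 1-3: keep only points within the operator-chosen limiting distance
# of the surface (here the plane z = 0, so the distance is simply |z|).
limit = 1.0
near = np.abs(points[:, 2]) < limit
facade_pts, facade_rgb = points[near, :2], rgb[near]

# Step 4: Voronoi diagram of the associated points in the facade plane.
vor = Voronoi(facade_pts)

# Step 5: each Voronoi cell inherits the RGB value of its generating point;
# together the colored cells form the texture of this facade.
cell_colors = {region_idx: facade_rgb[i]
               for i, region_idx in enumerate(vor.point_region)}
```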

  19. Clinical evaluation of semi-automatic landmark-based lesion tracking software for CT-scans

    PubMed Central

    2014-01-01

    Background To evaluate a semi-automatic landmark-based lesion tracking software enabling navigation between RECIST lesions in baseline and follow-up CT-scans. Methods The software automatically detects 44 stable anatomical landmarks in each thoraco/abdominal/pelvic CT-scan, sets up a patient-specific coordinate-system and cross-links the coordinate-systems of consecutive CT-scans. Accuracy of the software was evaluated on 96 RECIST lesions (target- and non-target lesions) in baseline and follow-up CT-scans of 32 oncologic patients (64 CT-scans). Patients had to present at least one thoracic, one abdominal and one pelvic RECIST lesion. Three radiologists determined in consensus the deviation between the lesion centre and the software's navigation result. Results The initial mean runtime of the system to synchronize baseline and follow-up examinations was 19.4 ± 1.2 seconds, with subsequent navigation to corresponding RECIST lesions taking place in real time. The mean vector length of the deviations between the lesion centre and the semi-automatic navigation result was 10.2 ± 5.1 mm, without a substantial systematic error in any direction. The mean deviation was 5.4 ± 4.0 mm in the cranio-caudal dimension, 5.2 ± 3.9 mm in the lateral dimension and 5.3 ± 4.0 mm in the ventro-dorsal dimension. Conclusion The investigated software accurately and reliably navigates between lesions in consecutive CT-scans in real time, potentially accelerating and facilitating cancer staging. PMID:25609496

  20. A semi-automatic web based tool for the selection of research projects reviewers.

    PubMed

    Pupella, Valeria; Monteverde, Maria Eugenia; Lombardo, Claudio; Belardelli, Filippo; Giacomini, Mauro

    2014-01-01

    The correct evaluation of research proposals remains problematic today, and in many cases grants and fellowships are subjected to this type of assessment. A web-based semi-automatic tool to help in the selection of reviewers was developed. The core of the proposed system is the matching of the MeSH descriptors of the publications submitted by the reviewers (for their accreditation) with the descriptors linked to the selected research keywords. Moreover, a citation-related index was calculated and used to discard unsuitable reviewers. This tool was used as a support in a web site for the evaluation of candidates applying for a fellowship in the oncology field. PMID:25160328
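    The descriptor-matching core lends itself to a compact sketch: score each reviewer by the overlap between the MeSH descriptors of their submitted publications and those linked to the project keywords, after discarding reviewers below a citation cutoff. The descriptor sets, names and threshold below are toy assumptions, not the tool's actual data.

```python
# Toy sketch: rank reviewers by MeSH descriptor overlap, after filtering
# out those whose citation-related index falls below a cutoff.
def rank_reviewers(project_mesh, reviewers, min_citation_index=1.0):
    ranked = []
    for name, (reviewer_mesh, citation_index) in reviewers.items():
        if citation_index < min_citation_index:
            continue  # discard reviewers judged not suitable
        overlap = len(project_mesh & reviewer_mesh)
        ranked.append((overlap, name))
    return sorted(ranked, reverse=True)

project = {"Melanoma", "Immunotherapy", "Clinical Trials as Topic"}
reviewers = {
    "rev_a": ({"Melanoma", "Immunotherapy"}, 3.2),
    "rev_b": ({"Neoplasms", "Immunotherapy"}, 0.4),   # filtered out by the cutoff
}
print(rank_reviewers(project, reviewers))
```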

  1. Semi-automatic Road Extraction from SAR images using EKF and PF

    NASA Astrophysics Data System (ADS)

    Zhao, J. Q.; Yang, J.; Li, P. X.; Lu, J. M.

    2015-06-01

    Recently, the use of linear features for processing remote sensing images has shown its importance in applications. As a typical linear target, roads are a focus of remote sensing image interpretation. Since extracting roads by manual processing is expensive and time consuming, research on automatic and semi-automatic extraction has become increasingly popular. Such interest is motivated by the requirements of civilian and military applications, such as road maps, traffic monitoring, navigation, and topographic mapping. How to extract roads accurately and efficiently from SAR images is a key problem. In this paper, after analyzing the characteristics of roads, semi-automatic road extraction based on Extended Kalman Filtering (EKF) and Particle Filtering (PF) is presented. The two methods share the same algorithmic flow, an iterative approach based on prediction and update: in the prediction stage, the prior probability density function is obtained from the previous state and the prediction model; in the update stage, the prior probability density function is combined with the new measurement to obtain the posterior probability density function, which is the optimal estimate of the road system state. Both EKF and PF repeat these steps until the extraction task is finished. The effectiveness of the proposed method is demonstrated in experiments on L-band UAVSAR data over Howland. Comparative experiments also show that matching the method to the complexity of the road improves accuracy and efficiency: EKF performs better on roads with moderate noise, and PF performs better on roads with high noise.
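    The shared predict/update loop is easiest to see in its particle-filter form. The sketch below tracks a road-axis point with placeholder motion and measurement models; it illustrates the generic iteration, not the authors' specific road models.

```python
# Generic predict/update loop in particle-filter form.
import numpy as np

rng = np.random.default_rng(3)
n = 500
particles = rng.normal([0.0, 0.0], 1.0, size=(n, 2))   # road-point hypotheses
weights = np.full(n, 1.0 / n)

def step(particles, weights, measurement, motion=(1.0, 0.2), meas_sigma=0.5):
    # Prediction: propagate each particle through the motion model (prior pdf).
    particles = particles + motion + rng.normal(0, 0.3, size=particles.shape)
    # Update: reweight by the measurement likelihood (posterior pdf).
    d2 = np.sum((particles - measurement) ** 2, axis=1)
    weights = weights * np.exp(-0.5 * d2 / meas_sigma**2)
    weights /= weights.sum()
    # Resample to avoid weight degeneracy.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

for k in range(10):
    z = np.array([k + 1.0, 0.2 * (k + 1)])             # synthetic road detection
    particles, weights = step(particles, weights, z)

estimate = particles.mean(axis=0)                      # estimated road state
```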

  2. An integrated radiation physics computer code system.

    NASA Technical Reports Server (NTRS)

    Steyn, J. J.; Harris, D. W.

    1972-01-01

    An integrated computer code system for the semi-automatic and rapid analysis of experimental and analytic problems in gamma photon and fast neutron radiation physics is presented. Such problems as the design of optimum radiation shields and radioisotope power source configurations may be studied. The system codes allow for the unfolding of complex neutron and gamma photon experimental spectra. Monte Carlo and analytic techniques are used for the theoretical prediction of radiation transport. The system includes a multichannel pulse-height analyzer scintillation and semiconductor spectrometer coupled to an on-line digital computer with appropriate peripheral equipment. The system is geometry generalized as well as self-contained with respect to material nuclear cross sections and the determination of the spectrometer response functions. Input data may be either analytic or experimental.

  3. Semi-automatic system for ultrasonic measurement of texture

    DOEpatents

    Thompson, R. Bruce; Wormley, Samuel J.

    1991-09-17

    A means and method for ultrasonic measurement of texture non-destructively and efficiently. Texture characteristics are derived by transmitting ultrasound energy into the material, measuring the time it takes to be received by ultrasound receiving means, and calculating velocity of the ultrasound energy from the timed measurements. Textured characteristics can then be derived from the velocity calculations. One or more sets of ultrasound transmitters and receivers are utilized to derive velocity measurements in different angular orientations through the material and in different ultrasound modes. An ultrasound transmitter is utilized to direct ultrasound energy to the material and one or more ultrasound receivers are utilized to receive the same. The receivers are at a predetermined fixed distance from the transmitter. A control means is utilized to control transmission of the ultrasound, and a processing means derives timing, calculation of velocity and derivation of texture characteristics.

  4. Semi-automatic system for ultrasonic measurement of texture

    DOEpatents

    Thompson, R.B.; Wormley, S.J.

    1991-09-17

    A means and method are disclosed for ultrasonic measurement of texture nondestructively and efficiently. Texture characteristics are derived by transmitting ultrasound energy into the material, measuring the time it takes to be received by ultrasound receiving means, and calculating velocity of the ultrasound energy from the timed measurements. Textured characteristics can then be derived from the velocity calculations. One or more sets of ultrasound transmitters and receivers are utilized to derive velocity measurements in different angular orientations through the material and in different ultrasound modes. An ultrasound transmitter is utilized to direct ultrasound energy to the material and one or more ultrasound receivers are utilized to receive the same. The receivers are at a predetermined fixed distance from the transmitter. A control means is utilized to control transmission of the ultrasound, and a processing means derives timing, calculation of velocity and derivation of texture characteristics. 5 figures.

  5. Semi-automatic mapping of cultural heritage from airborne laser scanning using deep learning

    NASA Astrophysics Data System (ADS)

    Due Trier, Øivind; Salberg, Arnt-Børre; Holger Pilø, Lars; Tonning, Christer; Marius Johansen, Hans; Aarsten, Dagrun

    2016-04-01

    This paper proposes to use deep learning to improve semi-automatic mapping of cultural heritage from airborne laser scanning (ALS) data. Automatic detection methods, based on traditional pattern recognition, have been applied in a number of cultural heritage mapping projects in Norway for the past five years. Automatic detection of pits and heaps has been combined with visual interpretation of the ALS data for the mapping of deer hunting systems, iron production sites, grave mounds and charcoal kilns. However, the performance of the automatic detection methods varies substantially between ALS datasets. For the mapping of deer hunting systems on flat gravel and sand sediment deposits, the automatic detection results were almost perfect. However, some false detections appeared in the terrain outside of the sediment deposits. These could be explained by other pit-like landscape features, like parts of river courses, spaces between boulders, and modern terrain modifications. However, these were easy to spot during visual interpretation, and the number of missed individual pitfall traps was still low. For the mapping of grave mounds, the automatic method produced a large number of false detections, reducing the usefulness of the semi-automatic approach. The mound structure is a very common natural terrain feature, and the grave mounds are less distinct in shape than the pitfall traps. Still, applying automatic mound detection to an entire municipality led to the discovery of a new Iron Age grave field with more than 15 individual mounds. Automatic mound detection also proved useful for a detailed re-mapping of Norway's largest Iron Age graveyard, which contains almost 1000 individual graves. Combined pit and mound detection has been applied to the mapping of more than 1000 charcoal kilns that were used by an ironworks 350-200 years ago. The majority of charcoal kilns were indirectly detected as either pits on the circumference, a central mound, or both

  6. Automatic ultrasound image enhancement for 2D semi-automatic breast-lesion segmentation

    NASA Astrophysics Data System (ADS)

    Lu, Kongkuo; Hall, Christopher S.

    2014-03-01

    Breast cancer is the fastest growing cancer, accounting for 29% of new cases in 2012, and the second leading cause of cancer death among women in the United States and worldwide. Ultrasound (US) has been used as an indispensable tool for breast cancer detection/diagnosis and treatment. In computer-aided assistance, lesion segmentation is a preliminary but vital step, but the task is quite challenging in US images, due to imaging artifacts that complicate detection and measurement of the suspect lesions. The lesions usually present with poor boundary features and vary significantly in size, shape, and intensity distribution between cases. Automatic methods are highly application dependent, while manual tracing methods are extremely time consuming and show a great deal of intra- and inter-observer variability. Semi-automatic approaches are designed to balance the advantages and drawbacks of the automatic and manual methods. However, considerable user interaction might be necessary to ensure reasonable segmentation for a wide range of lesions. This work proposes an automatic enhancement approach to improve the boundary searching ability of the live wire method, reducing the necessary user interaction while maintaining segmentation performance. Based on the results of segmenting 50 2D breast lesions in US images, less user interaction is required to achieve the desired accuracy, i.e. < 80%, when auto-enhancement is applied for live-wire segmentation.

  7. Semi-automatic delineation using weighted CT-MRI registered images for radiotherapy of nasopharyngeal cancer

    SciTech Connect

    Fitton, I.; Cornelissen, S. A. P.; Duppen, J. C.; Rasch, C. R. N.; Herk, M. van; Steenbakkers, R. J. H. M.; Peeters, S. T. H.; Hoebers, F. J. P.; Kaanders, J. H. A. M.; Nowak, P. J. C. M.

    2011-08-15

    Purpose: To develop a delineation tool that refines physician-drawn contours of the gross tumor volume (GTV) in nasopharynx cancer, using combined pixel value information from x-ray computed tomography (CT) and magnetic resonance imaging (MRI) during delineation. Methods: Operator-guided delineation assisted by a so-called "snake" algorithm was applied on weighted CT-MRI registered images. The physician delineates a rough tumor contour that is continuously adjusted by the snake algorithm using the underlying image characteristics. The algorithm was evaluated on five nasopharyngeal cancer patients. Different linear weightings of CT and MRI were tested as input for the snake algorithm and compared in terms of contrast and tumor-to-noise ratio (TNR). The semi-automatic delineation was compared with manual contouring by seven experienced radiation oncologists. Results: A good compromise between TNR and contrast was obtained by weighting CT twice as strongly as MRI. The new algorithm did not notably reduce interobserver variability; it did, however, reduce the average delineation time by 6 min per case. Conclusions: The authors developed a user-driven tool for delineation and correction based on a snake algorithm and weighted registered CT and MR images. The algorithm adds morphological information from CT during the delineation on MRI and accelerates the delineation task.
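    The reported 2:1 weighting feeding a standard snake can be sketched as follows; the images are synthetic stand-ins and the snake parameters are illustrative, not the authors' settings.

```python
# Sketch: 2:1 CT-to-MRI weighted image driving a standard snake.
import numpy as np
from skimage.filters import gaussian
from skimage.segmentation import active_contour

rng = np.random.default_rng(4)
ct = rng.normal(size=(128, 128))    # registered CT slice (stand-in)
mri = rng.normal(size=(128, 128))   # registered MRI slice (stand-in)

# CT weighted twice as strongly as MRI, the compromise reported above.
combined = (2.0 * ct + 1.0 * mri) / 3.0
combined = gaussian(combined, sigma=3)

# Rough physician-drawn contour, here a circle around the image centre.
theta = np.linspace(0, 2 * np.pi, 200)
init = np.column_stack([64 + 30 * np.sin(theta), 64 + 30 * np.cos(theta)])

# The snake continuously adjusts the rough contour to the image content.
refined = active_contour(combined, init, alpha=0.015, beta=10.0, gamma=0.001)
```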

  8. Applicability of semi-automatic segmentation for volumetric analysis of brain lesions.

    PubMed

    Heinonen, T; Dastidar, P; Eskola, H; Frey, H; Ryymin, P; Laasonen, E

    1998-01-01

    This project involves the development of a fast semi-automatic segmentation procedure to make an accurate volumetric estimation of brain lesions. The method has been applied in the segmentation of demyelination plaques in Multiple Sclerosis (MS) and of right cerebral hemispheric infarctions in patients with neglect. The developed segmentation method includes several image processing techniques, such as image enhancement, amplitude segmentation, and region growing. The entire program operates on a PC-based computer and uses graphical user interfaces. Twenty-three patients with MS and 43 patients with right cerebral hemisphere infarctions were studied on a 0.5 T MRI unit. The MS plaques and cerebral infarctions were thereafter segmented. The volumetric accuracy of the program was demonstrated by segmenting Magnetic Resonance (MR) images of fluid-filled syringes. The relative error of the total volume measurement based on the MR images of syringes was 1.5%. A repeatability test was also carried out as an inter- and intra-observer study in which the MS plaques of six randomly selected patients were segmented. These tests indicated 7% variability in the inter-observer study and 4% variability in the intra-observer study. The average time used to segment and calculate the total plaque volumes for one patient was 10 min. This simple segmentation method can be utilized in the quantitation of anatomical structures, such as air cells in the sinonasal and temporal bone area, as well as in different pathological conditions, such as brain tumours, intracerebral haematomas and bony destructions. PMID:9680601
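    The amplitude-segmentation and region-growing stages can be sketched as an intensity threshold followed by keeping the connected component that contains a user-selected seed; the intensity window, seed and voxel size below are hypothetical inputs, not the program's actual parameters.

```python
# Sketch: amplitude segmentation + seed-based region growing on one slice.
import numpy as np
from skimage.measure import label

rng = np.random.default_rng(5)
slice_img = rng.integers(0, 255, size=(128, 128))

# Amplitude segmentation: keep pixels inside the user-chosen intensity window.
low, high = 180, 255
candidate = (slice_img >= low) & (slice_img <= high)

# Region growing, approximated here as the connected component that
# contains the user-selected seed point.
components = label(candidate, connectivity=2)
seed = (64, 64)
if candidate[seed]:
    lesion = components == components[seed]
else:
    lesion = np.zeros_like(candidate)

# Lesion volume follows by summing mask pixels times the voxel volume.
voxel_volume_mm3 = 0.5 * 0.5 * 5.0
volume_mm3 = lesion.sum() * voxel_volume_mm3
```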

  9. Scalable Semi-Automatic Annotation for Multi-Camera Person Tracking.

    PubMed

    Niño-Castañeda, Jorge; Frías-Velázquez, Andrés; Bo, Nyan Bo; Slembrouck, Maarten; Guan, Junzhi; Debard, Glen; Vanrumste, Bart; Tuytelaars, Tinne; Philips, Wilfried

    2016-05-01

    This paper proposes a generic methodology for the semi-automatic generation of reliable position annotations for evaluating multi-camera people-trackers on large video data sets. Most of the annotation data are automatically computed, by estimating a consensus tracking result from multiple existing trackers and people detectors and classifying it as either reliable or not. A small subset of the data, composed of tracks with insufficient reliability, is verified by a human using a simple binary decision task, a process faster than marking the correct person position. The proposed framework is generic and can handle additional trackers. We present results on a data set of ~6 h captured by 4 cameras, featuring a person in a holiday flat, performing activities such as walking, cooking, eating, cleaning, and watching TV. When aiming for a tracking accuracy of 60 cm, 80% of all video frames are automatically annotated. The annotations for the remaining 20% of the frames were added after human verification of an automatically selected subset of data. This involved ~2.4 h of manual labor. According to a subsequent comprehensive visual inspection to judge the annotation procedure, we found 99% of the automatically annotated frames to be correct. We provide guidelines on how to apply the proposed methodology to new data sets. We also provide an exploratory study for the multi-target case, applied on the existing and new benchmark video sequences. PMID:27458637

  10. Semi-automatic crop inventory from sequential ERTS-1 imagery

    NASA Technical Reports Server (NTRS)

    Johnson, C. W.; Coleman, V. B.

    1973-01-01

    The detection of a newly introduced crop into the Imperial (California) Valley by sequential ERTS-1 imagery is proving that individual crop types can be identified by remote sensing techniques. Initial results have provided an extremely useful product for water agencies. A system for the identification of field conditions enables the production of a statistical summary within two to three days of receipt of the ERTS-1 imagery. The summary indicates the total acreage of producing crops and irrigated planted crops currently demanding water and further indicates freshly plowed fields that will be demanding water in the near future. Relating the field conditions to the crop calendar of the region by means of computer techniques will provide specific crop identification for the 8000 plus fields.

  11. Breast Contrast Enhanced MR Imaging: Semi-Automatic Detection of Vascular Map and Predominant Feeding Vessel

    PubMed Central

    Petrillo, Antonella; Fusco, Roberta; Filice, Salvatore; Granata, Vincenza; Catalano, Orlando; Vallone, Paolo; Di Bonito, Maurizio; D’Aiuto, Massimiliano; Rinaldo, Massimo; Capasso, Immacolata; Sansone, Mario

    2016-01-01

    Purpose To obtain a breast vascular map and to assess the correlation between the predominant feeding vessel and tumor location with a semi-automatic method compared to conventional radiologic reading. Methods 148 malignant and 75 benign breast lesions were included. All patients underwent bilateral MR imaging. Written informed consent was obtained from the patients before MRI. The local ethics committee granted approval for this study. Semi-automatic breast vascular map and predominant vessel detection was performed on MRI for each patient. Semi-automatic detection (depending on a grey-level threshold manually chosen by the radiologist) was compared with the results of two expert radiologists; inter-observer variability and reliability of the semi-automatic approach were assessed. Results Anatomic analysis of breast lesions revealed that 20% of patients had masses in the internal half, 50% in the external half and 30% in the subareolar/central area. As regards the 44 tumors in the internal half, based on radiologic consensus, 40 demonstrated a predominant feeding vessel (61% were supplied by internal thoracic vessels, 14% by lateral thoracic vessels, 16% by both thoracic vessels and 9% had no predominant feeding vessel—p<0.01); based on semi-automatic detection, 38 tumors demonstrated a predominant feeding vessel (66% were supplied by internal thoracic vessels, 11% by lateral thoracic vessels, 9% by both thoracic vessels and 14% had no predominant feeding vessel—p<0.01). As regards the 111 tumors in the external half, based on radiologic consensus, 91 demonstrated a predominant feeding vessel (25% were supplied by internal thoracic vessels, 39% by lateral thoracic vessels, 18% by both thoracic vessels and 18% had no predominant feeding vessel—p<0.01); based on semi-automatic detection, 94 demonstrated a predominant feeding vessel (27% were supplied by internal thoracic vessels, 45% by lateral thoracic vessels, 4% by both thoracic vessels and 24% had no predominant feeding vessel—p<0.01). An

  12. Semi-automatic stereotactic coordinate identification algorithm for routine localization of Deep Brain Stimulation electrodes.

    PubMed

    Hebb, Adam O; Miller, Kai J

    2010-03-15

    Deep Brain Stimulation (DBS) is a routine therapy for movement disorders and has several emerging indications. We present a novel protocol to define the stereotactic coordinates of metallic DBS implants that may be routinely employed for validating therapeutic anatomical targets. Patients were referred for troubleshooting or new DBS implantation. A volumetric MRI of the brain, obtained prior to or during this protocol, was formatted to the Anterior Commissure-Posterior Commissure (AC-PC) coordinate system. Patients underwent a CT scan of the brain in an extended Hounsfield unit (EHU) mode. A semi-automatic detection algorithm based on a Normalized Mutual Information (NMI) co-registration method was implemented to measure the AC-PC coordinates of each DBS contact. This algorithm was validated against manual DBS contact identification. Fifty MRI-CT image pairs were available in 39 patients with a total of 336 DBS electrodes. The median and mean Euclidean distance errors for automatic identification of electrode locations were 0.20 mm and 0.22 mm, respectively. This is an accurate method for localizing active DBS contacts within the sub-cortical region. As the investigational indications of DBS expand, this method may be used for verification of final implant coordinates, which is critical for understanding clinical benefit and comparing efficacy between subjects. PMID:20036691
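    The co-registration step can be sketched with SimpleITK, using Mattes mutual information as a stand-in for the NMI metric named in the abstract; the volumes, transform type and optimizer settings are illustrative assumptions, not the authors' protocol.

```python
# Sketch: rigid CT-to-MRI registration, then mapping a contact into MRI space.
import numpy as np
import SimpleITK as sitk

rng = np.random.default_rng(6)
vol = sitk.GetImageFromArray(rng.normal(size=(32, 64, 64)).astype(np.float32))
fixed = sitk.SmoothingRecursiveGaussian(vol, 2.0)                # stand-in MRI
moving = sitk.Resample(fixed, sitk.TranslationTransform(3, (2.0, 1.0, 0.0)))  # stand-in CT

reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=32)
reg.SetOptimizerAsRegularStepGradientDescent(learningRate=1.0, minStep=1e-4,
                                             numberOfIterations=100)
reg.SetInterpolator(sitk.sitkLinear)
reg.SetInitialTransform(sitk.TranslationTransform(3), inPlace=False)
tx = reg.Execute(fixed, moving)

# Map a DBS contact found in CT space into the MRI's AC-PC-aligned space.
contact_ct_mm = (10.0, -5.0, 3.0)                                # hypothetical contact
contact_mri_mm = tx.GetInverse().TransformPoint(contact_ct_mm)
```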

  13. SEMI-AUTOMATIC SEGMENTATION OF BRAIN SUBCORTICAL STRUCTURES FROM HIGH-FIELD MRI

    PubMed Central

    Kim, Jinyoung; Lenglet, Christophe; Sapiro, Guillermo; Harel, Noam

    2015-01-01

    Volumetric segmentation of subcortical structures such as the basal ganglia and thalamus is necessary for non-invasive diagnosis and neurosurgery planning. This is a challenging problem due in part to limited boundary information between structures, similar intensity profiles across the different structures, and low contrast data. This paper presents a semi-automatic segmentation system exploiting the superior image quality of ultra-high field (7 Tesla) MRI. The proposed approach handles and exploits multiple structural MRI modalities. It uniquely combines T1-weighted (T1W), T2-weighted (T2W), diffusion, and susceptibility-weighted (SWI) MRI and introduces a dedicated new edge indicator function. In addition to this, we employ prior shape and configuration knowledge of the subcortical structures in order to guide the evolution of geometric active surfaces. Neighboring structures are segmented iteratively, constraining over-segmentation at their borders with a non-overlapping penalty. Extensive experiments with data acquired on a 7T MRI scanner demonstrate the feasibility and power of the approach for the segmentation of basal ganglia components critical for neurosurgery applications such as deep brain stimulation. PMID:25192576
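    A dedicated edge indicator of the standard geodesic active-contour form g = 1/(1 + |∇I|²) can be extended to several modalities by taking the strongest edge response across them, so an edge in any modality halts the surface. The max-fusion rule below is an assumption for illustration, not the paper's actual indicator.

```python
# Sketch: multi-modal edge indicator for geometric active surfaces.
import numpy as np
from skimage.filters import sobel

def edge_indicator(*modalities):
    """g = 1 / (1 + grad^2): small near edges in any modality."""
    grad = np.max([sobel(m) for m in modalities], axis=0)
    return 1.0 / (1.0 + grad ** 2)

rng = np.random.default_rng(7)
t1w, t2w, swi = (rng.normal(size=(64, 64)) for _ in range(3))  # stand-in slices
g = edge_indicator(t1w, t2w, swi)
```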

  14. Subjective Evaluation of a Semi-Automatic Optical See-Through Head-Mounted Display Calibration Technique.

    PubMed

    Moser, Kenneth; Itoh, Yuta; Oshima, Kohei; Swan, J Edward; Klinker, Gudrun; Sandor, Christian

    2015-04-01

    With the growing availability of optical see-through (OST) head-mounted displays (HMDs) there is a present need for robust, uncomplicated, and automatic calibration methods suited for non-expert users. This work presents the results of a user study which both objectively and subjectively examines registration accuracy produced by three OST HMD calibration methods: (1) SPAAM, (2) Degraded SPAAM, and (3) Recycled INDICA, a recently developed semi-automatic calibration method. Accuracy metrics used for evaluation include subject-provided quality values and the error between perceived and absolute registration coordinates. Our results show all three calibration methods produce very accurate registration in the horizontal direction but cause subjects to perceive the distance of virtual objects as closer than intended. Surprisingly, the semi-automatic calibration method produced more accurate registration vertically and in perceived object distance overall. User-assessed quality values were also the highest for Recycled INDICA, particularly when objects were shown at distance. The results of this study confirm that Recycled INDICA is capable of producing equal or superior on-screen registration compared to common OST HMD calibration methods. We also identify a potential hazard in using reprojection error as a quantitative analysis technique to predict registration accuracy. We conclude with a discussion of the further need to examine INDICA calibration in binocular HMD systems, and the present possibility of creating a closed-loop continuous calibration method for OST Augmented Reality. PMID:26357099

  15. Monitoring targeted therapy using dual-energy CT: semi-automatic RECIST plus supplementary functional information by quantifying iodine uptake of melanoma metastases

    PubMed Central

    Sedlmair, M.; Schlemmer, H.P.; Hassel, J.C.; Ganten, M.

    2013-01-01

    Abstract Aim: Supplementary functional information can contribute to assessing response in targeted therapies. The aim of this study was to evaluate semi-automatic RECIST plus iodine uptake (IU) determination in melanoma metastases under BRAF inhibitor (vemurafenib) therapy using dual-energy computed tomography (DECT). Methods: Nine patients with stage IV melanoma treated with a BRAF inhibitor were included. Contrast-enhanced DECT was performed before and twice after treatment onset. Changes in tumor size were assessed according to RECIST. Quantification of IU (absolute value for total IU (mg) and volume-normalized IU (mg/ml)) was based on semi-automatic tumor volume segmentation. The decrease compared with baseline was calculated. Results: The mean change of the RECIST diameter sum per patient was −47% at the first follow-up (FU) and −56% at the second FU (P < 0.01). The mean normalized IU per patient was −21% at the first FU (P < 0.2) and −45% at the second FU (P < 0.01). Total IU per patient, combining both normalized IU and volume, showed the most pronounced decrease: −89% at the first FU and −90% at the second FU (P < 0.01). Conclusion: Semi-automatic RECIST plus IU quantification in DECT enables objective, easy and fast parameterization of tumor size and contrast medium uptake, thus providing two complementary pieces of information for response monitoring applicable in daily routine. PMID:23876444

  16. A semi-automatic model for sinkhole identification in a karst area of Zhijin County, China

    NASA Astrophysics Data System (ADS)

    Chen, Hao; Oguchi, Takashi; Wu, Pan

    2015-12-01

    The objective of this study is to investigate the use of DEMs derived from ASTER and SRTM remote sensing images and topographic maps to detect and quantify natural sinkholes in a karst area in Zhijin County, southwest China. Two methodologies were implemented. The first is a semi-automatic approach which identifies depressions stepwise using DEMs: 1) DEM acquisition; 2) sink fill; 3) sink depth calculation using the difference between the original and sink-free DEMs; and 4) elimination of spurious sinkholes using threshold values of morphometric parameters, including TPI (topographic position index), together with geology and land use. The second is the traditional visual interpretation of depressions based on integrated analysis of high-resolution aerial photographs and topographic maps. The threshold values of depression area, shape, depth and TPI appropriate for distinguishing true depressions were obtained from the maximum overall accuracy generated by comparing the depression maps produced by the semi-automatic model and by visual interpretation. The results show that the best performance of the semi-automatic model for meso-scale karst depression delineation was achieved using the DEM from the topographic maps with the thresholds area > ~60 m², ellipticity > ~0.2 and TPI ≤ 0. With these realistic thresholds, the accuracy of the semi-automatic model ranges from 0.78 to 0.95 for DEM resolutions from 3 to 75 m.
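
    Steps 2-3 of the semi-automatic model (sink fill and depth differencing) can be prototyped with morphological reconstruction. A minimal sketch, assuming scikit-image as a stand-in for the authors' GIS tooling:

```python
import numpy as np
from skimage.morphology import reconstruction

def sink_depth(dem):
    """Depth of closed depressions: fill sinks, subtract the original DEM."""
    seed = dem.copy()
    seed[1:-1, 1:-1] = dem.max()          # flood from the borders inward
    filled = reconstruction(seed, dem, method='erosion')
    return filled - dem                   # > 0 inside depressions

dem = np.full((50, 50), 100.0)
dem[20:30, 20:30] = 95.0                  # a synthetic 5 m deep sinkhole
depth = sink_depth(dem)
print(depth.max())                        # ~5.0

# Spurious sinks would then be screened with thresholds on area,
# ellipticity and TPI, as described in the abstract.
```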

  17. Semi-Automatic Post-Processing for Improved Usability of E-Lecture Podcasts

    ERIC Educational Resources Information Center

    Hurst, Wolfgang; Welte, Martina

    2009-01-01

    Purpose: Playing back recorded lectures on handheld devices offers interesting perspectives for learning, but suffers from small screen sizes. The purpose of this paper is to propose several semi-automatic post-processing steps in order to improve usability by providing a better readability and additional navigation functionality.…

  18. Semi-automatic breast ultrasound image segmentation based on mean shift and graph cuts.

    PubMed

    Zhou, Zhuhuang; Wu, Weiwei; Wu, Shuicai; Tsui, Po-Hsiang; Lin, Chung-Chih; Zhang, Ling; Wang, Tianfu

    2014-10-01

    Computerized tumor segmentation on breast ultrasound (BUS) images remains a challenging task. In this paper, we proposed a new method for semi-automatic tumor segmentation on BUS images using Gaussian filtering, histogram equalization, mean shift, and graph cuts. The only interaction required was to select two diagonal points to determine a region of interest (ROI) on the input image. The ROI image was shrunken by a factor of 2 using bicubic interpolation to reduce computation time. The shrunken image was smoothed by a Gaussian filter and then contrast-enhanced by histogram equalization. Next, the enhanced image was filtered by pyramid mean shift to improve homogeneity. The object and background seeds for graph cuts were automatically generated from the filtered image. Using these seeds, the filtered image was then segmented by graph cuts into a binary image containing the object and background. Finally, the binary image was expanded by a factor of 2 using bicubic interpolation, and the expanded image was processed by morphological opening and closing to refine the tumor contour. The method was implemented with OpenCV 2.4.3 and Visual Studio 2010 and tested on 38 BUS images with benign tumors and 31 BUS images with malignant tumors from different ultrasound scanners. Experimental results showed that our method had a true positive (TP) rate of 91.7%, a false positive (FP) rate of 11.9%, and a similarity (SI) rate of 85.6%. The mean run time on an Intel Core 2.66 GHz CPU with 4 GB RAM was 0.49 ± 0.36 s. The experimental results indicate that the proposed method may be useful in BUS image segmentation. PMID:24759696
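
    A hedged sketch of the described pipeline using OpenCV's Python bindings; cv2.grabCut stands in for the paper's automatically seeded graph cuts, and the central-rectangle seeding and filter parameters are assumptions:

```python
import cv2
import numpy as np

def segment_roi(gray_roi):
    """Shrink -> smooth -> equalize -> mean shift -> graph cut -> expand -> refine.

    gray_roi: 8-bit single-channel ROI image.
    """
    small = cv2.resize(gray_roi, None, fx=0.5, fy=0.5, interpolation=cv2.INTER_CUBIC)
    small = cv2.GaussianBlur(small, (5, 5), 0)
    small = cv2.equalizeHist(small)

    # pyrMeanShiftFiltering needs a 3-channel 8-bit image.
    bgr = cv2.cvtColor(small, cv2.COLOR_GRAY2BGR)
    bgr = cv2.pyrMeanShiftFiltering(bgr, sp=10, sr=20)

    # grabCut as a stand-in for the paper's seeded graph cuts; we assume
    # the tumor lies within the central half of the ROI.
    h, w = small.shape
    mask = np.zeros((h, w), np.uint8)
    bgd, fgd = np.zeros((1, 65), np.float64), np.zeros((1, 65), np.float64)
    cv2.grabCut(bgr, mask, (w // 4, h // 4, w // 2, h // 2),
                bgd, fgd, 5, cv2.GC_INIT_WITH_RECT)
    binary = np.where((mask == 1) | (mask == 3), 255, 0).astype(np.uint8)

    # Expand back to the ROI size and refine the contour morphologically.
    binary = cv2.resize(binary, (gray_roi.shape[1], gray_roi.shape[0]),
                        interpolation=cv2.INTER_CUBIC)
    binary = ((binary > 127) * 255).astype(np.uint8)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
    binary = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)
    return cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)
```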

  19. Development and evaluation of a semi-automatic technique for determining the bilateral symmetry plane of the facial skeleton.

    PubMed

    Willing, Ryan T; Roumeliotis, Grayson; Jenkyn, Thomas R; Yazdani, Arjang

    2013-12-01

    During reconstructive surgery of the face, one side may be used as a template for the other, exploiting assumed bilateral facial symmetry. The best method to calculate this plane, however, is debated. A new semi-automatic technique for calculating the symmetry plane of the facial skeleton is presented here that uses surface models reconstructed from computed tomography image data in conjunction with principal component analysis and an iterative closest point alignment method. This new technique was found to provide more accurate symmetry planes than traditional methods when applied to a set of 7 human craniofacial skeleton specimens, and showed little vulnerability to missing model data, usually deviating less than 1.5° and 2 mm from the intact model symmetry plane when 30 mm radius voids were present. This new technique will be used for subsequent studies measuring symmetry of the facial skeleton for different patient populations. PMID:23891670
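
    The PCA step of such a method can be illustrated as follows: an initial symmetry plane from the principal axes of the vertex cloud, plus the reflection used to set up the ICP refinement described in the abstract. Taking the first principal axis as the left-right direction is an assumption of this sketch, not necessarily the authors' choice.

```python
import numpy as np

def initial_symmetry_plane(vertices):
    """Initial symmetry-plane estimate (centroid, unit normal) via PCA.

    Assumption: for a roughly bilaterally symmetric surface, the axis of
    largest spread approximates the left-right direction; the paper's ICP
    refinement would follow this step.
    """
    centroid = vertices.mean(axis=0)
    _, _, vt = np.linalg.svd(vertices - centroid, full_matrices=False)
    normal = vt[0] / np.linalg.norm(vt[0])
    return centroid, normal

def reflect(points, centroid, normal):
    """Mirror points across the plane; ICP on (points, mirrored) refines it."""
    d = (points - centroid) @ normal
    return points - 2.0 * np.outer(d, normal)

pts = np.random.randn(1000, 3) * [30, 20, 15]   # synthetic vertex cloud
c, n = initial_symmetry_plane(pts)
mirrored = reflect(pts, c, n)
```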

  20. Semi-automatic construction of the Chinese-English MeSH using Web-based term translation method.

    PubMed

    Lu, Wen-Hsiang; Lin, Shih-Jui; Chan, Yi-Che; Chen, Kuan-Hsi

    2005-01-01

    Due to the language barrier, non-English users are unable to retrieve the most up-to-date medical information from authoritative U.S. medical websites such as PubMed and MedlinePlus. A few cross-language medical information retrieval (CLMIR) systems have utilized MeSH (Medical Subject Headings) with multilingual thesauri to bridge the gap. Unfortunately, MeSH has not yet been translated into traditional Chinese. We proposed a semi-automatic approach to constructing a Chinese-English MeSH based on Web-based term translation. The system provides knowledge engineers with candidate terms mined from anchor texts and search-result pages. The results are encouraging: more than 19,000 Chinese-English MeSH entries have been compiled to date. This thesaurus will be used in Chinese-English CLMIR in the future. PMID:16779085

  1. Semi-automatic classification of textures in thoracic CT scans

    NASA Astrophysics Data System (ADS)

    Kockelkorn, Thessa T. J. P.; de Jong, Pim A.; Schaefer-Prokop, Cornelia M.; Wittenberg, Rianne; Tiehuis, Audrey M.; Gietema, Hester A.; Grutters, Jan C.; Viergever, Max A.; van Ginneken, Bram

    2016-08-01

    The textural patterns in the lung parenchyma, as visible on computed tomography (CT) scans, are essential to make a correct diagnosis in interstitial lung disease. We developed one automatic and two interactive protocols for classification of normal and seven types of abnormal lung textures. Lungs were segmented and subdivided into volumes of interest (VOIs) with homogeneous texture using a clustering approach. In the automatic protocol, VOIs were classified automatically by an extra-trees classifier that was trained using annotations of VOIs from other CT scans. In the interactive protocols, an observer iteratively trained an extra-trees classifier to distinguish the different textures, by correcting mistakes the classifier makes in a slice-by-slice manner. The difference between the two interactive methods was whether or not training data from previously annotated scans was used in classification of the first slice. The protocols were compared in terms of the percentages of VOIs that observers needed to relabel. Validation experiments were carried out using software that simulated observer behavior. In the automatic classification protocol, observers needed to relabel on average 58% of the VOIs. During interactive annotation without the use of previous training data, the average percentage of relabeled VOIs decreased from 64% for the first slice to 13% for the second half of the scan. Overall, 21% of the VOIs were relabeled. When previous training data was available, the average overall percentage of VOIs requiring relabeling was 20%, decreasing from 56% in the first slice to 13% in the second half of the scan.
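
    The interactive protocol can be simulated in a few lines with scikit-learn's ExtraTreesClassifier: predict each slice, count corrections against a simulated observer, retrain. All names and the toy data below are illustrative, not the authors' code.

```python
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier

def interactive_protocol(slices, initial_X=None, initial_y=None):
    """Per-slice loop: predict VOI labels, count observer corrections, retrain.

    `y_true` plays the role of the simulated observer's corrected labels.
    Returns the fraction of VOIs that needed relabeling.
    """
    X_train = [] if initial_X is None else list(initial_X)
    y_train = [] if initial_y is None else list(initial_y)
    clf = ExtraTreesClassifier(n_estimators=100)
    relabeled = total = 0
    for X_slice, y_true in slices:
        if X_train:
            clf.fit(np.vstack(X_train), np.concatenate(y_train))
            pred = clf.predict(X_slice)
        else:
            pred = np.zeros(len(X_slice))   # no model yet: everything relabeled
        relabeled += int(np.sum(pred != y_true))
        total += len(y_true)
        X_train.append(X_slice)             # corrected labels join the training set
        y_train.append(y_true)
    return relabeled / total

# Toy data: 10 "slices" of 20 VOIs with 5 texture features each.
rng = np.random.default_rng(0)
slices = [(rng.normal(size=(20, 5)), rng.integers(0, 2, 20)) for _ in range(10)]
print(interactive_protocol(slices))
```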

  2. Semi-Automatic Determination of Citation Relevancy: User Evaluation.

    ERIC Educational Resources Information Center

    Huffman, G. David

    1990-01-01

    Discussion of online bibliographic database searches focuses on a software system, SORT-AID/SABRE, that ranks retrieved citations in terms of relevance. Results of a comprehensive user evaluation of the relevance ranking procedure to determine its effectiveness are presented, and implications for future work are suggested. (10 references) (LRW)

  4. Introducing a semi-automatic method to simulate large numbers of forensic fingermarks for research on fingerprint identification.

    PubMed

    Rodriguez, Crystal M; de Jongh, Arent; Meuwly, Didier

    2012-03-01

    Statistical research on fingerprint identification and the testing of automated fingerprint identification system (AFIS) performance require large numbers of forensic fingermarks. These fingermarks are rarely available. This study presents a semi-automatic method to create, in large quantities, simulated fingermarks that model the minutiae features or images of forensic fingermarks. The method takes into account several aspects contributing to the variability of forensic fingermarks, such as the number of minutiae, the finger region, and the elastic deformation of the skin. To investigate their applicability, fingermarks with 5-12 minutiae originating from different finger regions were simulated for six fingers. An AFIS matching algorithm was used to obtain similarity scores for comparisons between the minutiae configurations of fingerprints and the minutiae configurations of simulated and forensic fingermarks. The results showed similar scores for both types of fingermarks, suggesting that the simulated fingermarks are good substitutes for forensic fingermarks. PMID:22103733

  5. [Implanted automatic defibrillator after ventricular fibrillation treated with semi-automatic defibrillation].

    PubMed

    Ould-Ahmed, M; Bordier, E; Leenhardt, A; Frank, R; Michel, A

    1998-01-01

    We report two cases of out-of-hospital ventricular fibrillation treated without delay, with basic life support performed by a witness, followed by successful defibrillation by paramedics using a semi-automatic defibrillator. In the subsequent month, a cardioverter-defibrillator was implanted in each patient. A ventricular tachycardia occurring 10 months later in one patient, and a ventricular fibrillation 9 months later in the other, were successfully reversed by the implanted defibrillator. These two cases illustrate the value of the "survival chain" concept (undelayed alert, basic life support by a witness, early defibrillation by paramedics with a semi-automatic defibrillator, advanced life support by a physician) as well as the benefit of the implanted cardioverter-defibrillator. PMID:9750683

  6. Semi-automatic version of the potentiometric titration method for characterization of uranium compounds.

    PubMed

    Cristiano, Bárbara F G; Delgado, José Ubiratan; da Silva, José Wanderley S; de Barros, Pedro D; de Araújo, Radier M S; Dias, Fábio C; Lopes, Ricardo T

    2012-09-01

    The potentiometric titration method was used for characterization of uranium compounds to be applied in intercomparison programs. The method is applied with traceability assured using a potassium dichromate primary standard. A semi-automatic version was developed to reduce the analysis time and the operator variation. The standard uncertainty in determining the total concentration of uranium was around 0.01%, which is suitable for uranium characterization and compatible with those obtained by manual techniques. PMID:22406220

  7. Semi-Automatic Extraction Algorithm for Images of the Ciliary Muscle

    PubMed Central

    Kao, Chiu-Yen; Richdale, Kathryn; Sinnott, Loraine T.; Ernst, Lauren E.; Bailey, Melissa D.

    2011-01-01

    Purpose To develop and evaluate a semi-automatic algorithm for segmentation and morphological assessment of the dimensions of the ciliary muscle in Visante™ Anterior Segment Optical Coherence Tomography images. Methods Geometric distortions in Visante images analyzed as binary files were assessed by imaging an optical flat and human donor tissue. The appropriate pixel/mm conversion factor to use for air (n = 1) was estimated by imaging calibration spheres. A semi-automatic algorithm was developed to extract the dimensions of the ciliary muscle from Visante images. Measurements were also made manually using Visante software calipers. Interclass correlation coefficients (ICC) and Bland-Altman analyses were used to compare the methods. A multilevel model was fitted to estimate the variance of algorithm measurements that was due to within- and between-examiner differences in scleral spur selection versus biological variability. Results The optical flat and the human donor tissue were imaged and appeared without geometric distortions in binary file format. Bland-Altman analyses revealed that caliper measurements tended to underestimate ciliary muscle thickness at 3 mm posterior to the scleral spur in subjects with the thickest ciliary muscles (t = 3.6, p < 0.001). The percent variance due to within- or between-examiner differences in scleral spur selection was small (6%) when compared to the variance due to biological differences across subjects (80%). Using the mean of measurements from three images achieved an estimated ICC of 0.85. Conclusions The semi-automatic algorithm successfully segmented the ciliary muscle for further measurement. Using the algorithm to follow the scleral curvature to locate more posterior measurements is critical to avoid underestimating thickness measurements. This semi-automatic algorithm will allow for repeatable, efficient, and masked ciliary muscle measurements in large datasets. PMID:21169877

  8. Semi-automatic handling of meteorological ground measurements using WeatherProg: prospects and practical implications

    NASA Astrophysics Data System (ADS)

    Langella, Giuliano; Basile, Angelo; Bonfante, Antonello; De Mascellis, Roberto; Manna, Piero; Terribile, Fabio

    2016-04-01

    WeatherProg is a computer program for the semi-automatic handling of data measured at ground stations within a climatic network. The program performs a set of tasks ranging from gathering raw point-based sensor measurements to the production of digital climatic maps. Originally the program was developed as the baseline asynchronous engine for weather records management within the SOILCONSWEB Project (LIFE08 ENV/IT/000408), in which daily and hourly data were used to run water balance in the soil-plant-atmosphere continuum or pest simulation models. WeatherProg can be configured to automatically perform the following main operations: 1) data retrieval; 2) data decoding and ingestion into a database (e.g. SQL based); 3) data checking to recognize missing and anomalous values (using a set of differently combined checks including logical, climatological, spatial, temporal and persistence checks); 4) infilling of data flagged as missing or anomalous (deterministic or statistical methods); 5) spatial interpolation based on alternative/comparative methods such as inverse distance weighting, iterative regression kriging, and weighted least squares regression (based on physiography), using an approach similar to PRISM; and 6) data ingestion into a geodatabase (e.g. PostgreSQL+PostGIS or rasdaman). There is an increasing demand for digital climatic maps both for research and development (there is a gap between the majority of scientific modelling approaches, which require digital climate maps, and the available gauged measurements) and for practical applications (e.g. the need to improve the management of weather records, which in turn improves the support provided to farmers). The demand is particularly burdensome considering the requirement to handle climatic data at the daily (e.g. in soil hydrological modelling) or even hourly time step (e.g. risk modelling in phytopathology). The key advantage of WeatherProg is the ability to perform all the required operations and
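
    Of the interpolation methods listed in step 5, inverse distance weighting is the simplest to sketch; this minimal NumPy version (function names and test values invented for illustration) interpolates station values to grid points:

```python
import numpy as np

def idw(xy_stations, values, xy_grid, power=2.0):
    """Inverse-distance-weighted interpolation of station values to grid points."""
    d = np.linalg.norm(xy_grid[:, None, :] - xy_stations[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)                 # guard against zero distance
    w = 1.0 / d ** power
    return (w * values).sum(axis=1) / w.sum(axis=1)

stations = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 8.0]])
temps = np.array([12.0, 14.0, 9.0])          # e.g. daily mean temperature
grid = np.array([[5.0, 2.0], [1.0, 1.0]])
print(idw(stations, temps, grid))
```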

  9. Computer systems

    NASA Technical Reports Server (NTRS)

    Olsen, Lola

    1992-01-01

    In addition to the discussions, Ocean Climate Data Workshop hosts gave participants an opportunity to hear about, see, and test for themselves some of the latest computer tools now available to those studying climate change and the oceans. Six speakers described computer systems and their functions. The introductory talks were followed by demonstrations to small groups of participants and some opportunities for participants to get hands-on experience. After this familiarization period, attendees were invited to return during the course of the Workshop for one-on-one discussions and further hands-on experience with these systems. Brief summaries or abstracts of the introductory presentations are included.

  10. Development and initial evaluation of a semi-automatic approach to assess perivascular spaces on conventional magnetic resonance images

    PubMed Central

    Wang, Xin; Valdés Hernández, Maria del C.; Doubal, Fergus; Chappell, Francesca M.; Piper, Rory J.; Deary, Ian J.; Wardlaw, Joanna M.

    2016-01-01

    Purpose Perivascular spaces (PVS) are associated with ageing, cerebral small vessel disease, inflammation and increased blood-brain barrier permeability. Most studies to date use visual rating scales to assess PVS, but these are prone to observer variation. Methods We developed a semi-automatic computational method that extracts PVS on bilateral ovoid basal ganglia (BG) regions on intensity-normalised T2-weighted magnetic resonance images. It uses Analyze™ 10.0 and was applied to 100 mild stroke patients' datasets. We used linear regression to test the association between BGPVS count, volume and visual rating scores, and between BGPVS count and volume on the one hand and white matter hyperintensity (WMH) rating scores (periventricular: PVH; deep: DWMH), WMH volume, atrophy rating scores and brain volume on the other. Results In the 100 patients, WMH ranged from 0.4 to 119 ml, and total brain tissue volume from 0.65 to 1.45 l. BGPVS volume increased with BGPVS count (67.27, 95%CI [57.93 to 76.60], p < 0.001). BGPVS count was positively associated with WMH visual rating (PVH: 2.20, 95%CI [1.22 to 3.18], p < 0.001; DWMH: 1.92, 95%CI [0.99 to 2.85], p < 0.001), WMH volume (0.065, 95%CI [0.034 to 0.096], p < 0.001), and whole brain atrophy visual rating (1.01, 95%CI [0.49 to 1.53], p < 0.001). BGPVS count increased as brain volume (as % of ICV) decreased (−0.33, 95%CI [−0.53 to −0.13], p = 0.002). Comparison with existing method BGPVS count and volume increased with the overall increase of BGPVS visual scores (2.11, 95%CI [1.36 to 2.86] for count and 0.022, 95%CI [0.012 to 0.031] for volume, p < 0.001). Distributions for PVS count and visual scores were also similar. Conclusions This semi-automatic method is applicable to clinical protocols and offers quantitative surrogates for PVS load. It shows good agreement with a visual rating scale and confirmed that BGPVS are associated with WMH and atrophy measurements. PMID:26416614
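
    The reported associations (slope, 95% CI, p-value) come from simple linear regressions; a sketch of one such test on synthetic stand-in data, using SciPy (the data-generating values are invented):

```python
import numpy as np
from scipy import stats

# Synthetic stand-in for per-patient measures: BGPVS count vs WMH volume.
rng = np.random.default_rng(0)
wmh_volume = rng.uniform(0.4, 119.0, size=100)            # ml
bgpvs_count = 10 + 0.065 * wmh_volume + rng.normal(0, 2, 100)

res = stats.linregress(wmh_volume, bgpvs_count)
t_crit = stats.t.ppf(0.975, df=98)                        # n - 2
ci = (res.slope - t_crit * res.stderr, res.slope + t_crit * res.stderr)
print(f"slope {res.slope:.3f}, 95% CI [{ci[0]:.3f}, {ci[1]:.3f}], p = {res.pvalue:.2g}")
```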

  11. Semi-automatic matching of OCT and IVUS images for image fusion

    NASA Astrophysics Data System (ADS)

    Pauly, Olivier; Unal, Gozde; Slabaugh, Greg; Carlier, Stephane; Fang, Tong

    2008-03-01

    Medical imaging is essential in the diagnosis of atherosclerosis. In this paper, we propose the semi-automatic matching of two promising and complementary intravascular imaging techniques, Intravascular Ultrasound (IVUS) and Optical Coherence Tomography (OCT), with the ultimate goal of producing hybrid images with increased diagnostic value for assessing arterial health. If no ECG gating has been performed on the IVUS and OCT pullbacks, there is typically an anatomical shuffle (displacement in time and space) in the image sequences due to catheter motion in the artery during the cardiac cycle, and thus it is not possible to perform a 3D registration. Therefore, the goal of our work is to semi-automatically detect the corresponding images in both modalities as a preprocessing step for the fusion. Our method is based on the characterization of the lumen shape by a set of Gabor jet features. We also introduce different correction terms based on the approximate position of the slice in the artery. We then train support vector machines on these features to recognize the correspondences. Experimental results demonstrate the usefulness of our approach, which achieves up to 95% matching accuracy for our data.
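
    A hedged sketch of the feature-plus-classifier idea: Gabor filter responses summarized per patch and fed to a support vector machine. True Gabor jets are sampled at landmark points; reducing them to global magnitude means here is a simplification, and the toy data is invented.

```python
import numpy as np
from skimage.filters import gabor
from sklearn.svm import SVC

def gabor_features(image, frequencies=(0.1, 0.2), n_theta=4):
    """Mean Gabor magnitude per (frequency, orientation) as a patch descriptor."""
    feats = []
    for f in frequencies:
        for theta in np.linspace(0, np.pi, n_theta, endpoint=False):
            real, imag = gabor(image, frequency=f, theta=theta)
            feats.append(np.hypot(real, imag).mean())
    return np.array(feats)

# Toy stand-in for lumen-shape patches from the two modalities; the SVM
# learns to separate the two synthetic "classes" of frames.
rng = np.random.default_rng(0)
patches = [rng.random((32, 32)) + (i % 2) * 0.5 for i in range(20)]
X = np.array([gabor_features(p) for p in patches])
y = np.array([i % 2 for i in range(20)])
clf = SVC().fit(X, y)
print(clf.score(X, y))
```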

  12. A semi-automatic method for developing an anthropomorphic numerical model of dielectric anatomy by MRI.

    PubMed

    Mazzurana, M; Sandrini, L; Vaccari, A; Malacarne, C; Cristoforetti, L; Pontalti, R

    2003-10-01

    Complex permittivity values have a dominant role in the overall consideration of interaction between radiofrequency electromagnetic fields and living matter, and in related applications such as electromagnetic dosimetry. There are still some concerns about the accuracy of published data and about their variability due to the heterogeneous nature of biological tissues. The aim of this study is to provide an alternative semi-automatic method by which numerical dielectric human models for dosimetric studies can be obtained. Magnetic resonance imaging (MRI) tomography was used to acquire images. A new technique was employed to correct nonuniformities in the images, and frequency-dependent transfer functions were used to correlate image intensity with complex permittivity. The proposed method provides frequency-dependent models in which permittivity and conductivity vary with continuity, even within the same tissue, reflecting the intrinsic realistic spatial dispersion of such parameters. The human model is tested with an FDTD (finite difference time domain) algorithm at different frequencies; the results of layer-averaged and whole-body-averaged SAR (specific absorption rate) are compared with published work, and reasonable agreement has been found. Due to the short time needed to obtain a whole body model, this semi-automatic method may be suitable for efficient study of various conditions that can determine large differences in the SAR distribution, such as body shape, posture, fat-to-muscle ratio, height and weight. PMID:14579858

  13. Semi-automatic segmentation for 3D motion analysis of the tongue with dynamic MRI.

    PubMed

    Lee, Junghoon; Woo, Jonghye; Xing, Fangxu; Murano, Emi Z; Stone, Maureen; Prince, Jerry L

    2014-12-01

    Dynamic MRI has been widely used to track the motion of the tongue and measure its internal deformation during speech and swallowing. Accurate segmentation of the tongue is a prerequisite step to define the target boundary and constrain the tracking to tissue points within the tongue. Segmentation of 2D slices or 3D volumes is challenging because of the large number of slices and time frames involved in the segmentation, as well as the incorporation of numerous local deformations that occur throughout the tongue during motion. In this paper, we propose a semi-automatic approach to segment 3D dynamic MRI of the tongue. The algorithm steps include seeding a few slices at one time frame, propagating seeds to the same slices at different time frames using deformable registration, and random walker segmentation based on these seed positions. This method was validated on the tongue of five normal subjects carrying out the same speech task with multi-slice 2D dynamic cine-MR images obtained at three orthogonal orientations and 26 time frames. The resulting semi-automatic segmentations of a total of 130 volumes showed an average dice similarity coefficient (DSC) score of 0.92 with less segmented volume variability between time frames than in manual segmentations. PMID:25155697
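
    The random walker step is available off the shelf in scikit-image; a minimal 2D sketch with hand-placed seeds (in the paper, seeds come from deformable registration across time frames, not manual placement on every frame):

```python
import numpy as np
from skimage.segmentation import random_walker

# Synthetic "slice": a bright disk on a dark, noisy background.
yy, xx = np.mgrid[0:100, 0:100]
image = ((xx - 50) ** 2 + (yy - 50) ** 2 < 30 ** 2).astype(float)
image += 0.3 * np.random.rand(100, 100)

# Seeds: 1 = object, 2 = background, 0 = unlabeled.
seeds = np.zeros_like(image, dtype=int)
seeds[50, 50] = 1
seeds[5, 5] = 2

labels = random_walker(image, seeds, beta=130)
print((labels == 1).sum())     # area assigned to the object
```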

  14. Tectonic lineament mapping of the Thaumasia Plateau, Mars: Comparing results from photointerpretation and a semi-automatic approach

    NASA Astrophysics Data System (ADS)

    Vaz, David A.; Di Achille, Gaetano; Barata, Maria Teresa; Alves, Eduardo Ivo

    2012-11-01

    Photointerpretation is the technique generally used to map and analyze tectonic features present on the Martian surface. In this study we compare, qualitatively and quantitatively, two tectonic maps based on the interpretation of satellite imagery and a map derived semi-automatically. The comparison of the two photointerpreted datasets allowed us to infer some of the factors that can influence the process of lineament mapping on Mars. Comparing the manually mapped datasets with the semi-automatically mapped features allowed us to evaluate the accuracy of the semi-automatic mapping procedure, as well as to identify the main limitations of the semi-automatic approach to mapping tectonic structures from MOLA altimetry. Significant differences were found between the two photointerpretations. The qualitative and quantitative comparisons showed how mapping criteria, illumination conditions and scale of analysis can locally influence the interpretations. The semi-automatic mapping procedure proved to be mainly dependent on data quality; nevertheless the methodology, when applied to MOLA data, is able to produce meaningful results at a regional scale.

  15. A semi-automatic framework of measuring pulmonary arterial metrics at anatomic airway locations using CT imaging

    NASA Astrophysics Data System (ADS)

    Jin, Dakai; Guo, Junfeng; Dougherty, Timothy M.; Iyer, Krishna S.; Hoffman, Eric A.; Saha, Punam K.

    2016-03-01

    Pulmonary vascular dysfunction has been implicated in smoking-related susceptibility to emphysema. With the growing interest in characterizing arterial morphology for early evaluation of the vascular role in pulmonary diseases, there is an increasing need for a standardized framework for arterial morphological assessment at airway segmental levels. In this paper, we present an effective and robust semi-automatic framework to segment pulmonary arteries at different anatomic airway branches and measure their cross-sectional area (CSA). The method starts with user-specified endpoints of a target arterial segment, entered through a custom-built graphical user interface. It then automatically detects the centerline joining the endpoints, determines the local structure orientation, and computes the CSA along the centerline after filtering out adjacent pulmonary structures such as veins or airway walls. Several new techniques are presented, including a collision-impact-based cost function for centerline detection, radial-sample-line-based CSA computation, and outlier analysis of radial distances to subtract adjacent neighboring structures from the CSA measurement. The method was applied to repeat-scan pulmonary multirow detector CT (MDCT) images from ten healthy subjects (age 21-48 years, mean 28.5 years; 7 female) at functional residual capacity (FRC). The reproducibility of the computed arterial CSA from four airway segmental regions in the middle and lower lobes was analyzed. The overall repeat-scan intra-class correlation (ICC) of the computed CSA from all four airway regions in the ten subjects was 96%, with the maximum ICC found in the LB10 and RB4 regions.

  16. Mitochondrial complex I and cell death: a semi-automatic shotgun model

    PubMed Central

    Gonzalez-Halphen, D; Ghelli, A; Iommarini, L; Carelli, V; Esposti, M D

    2011-01-01

    Mitochondrial dysfunction often leads to cell death and disease. We can now draw correlations between the dysfunction of one of the most important mitochondrial enzymes, NADH:ubiquinone reductase or complex I, and its structural organization thanks to recent advances in the X-ray structure of its bacterial homologs. The new structural information on bacterial complex I provides essential clues to finally understand how complex I may work. However, the same information remains difficult to interpret for many scientists working on mitochondrial complex I from different angles, especially in the field of cell death. Here, we present a novel way of interpreting the bacterial structural information in accessible terms. On the basis of an analogy to semi-automatic shotguns, we propose a novel functional model that incorporates recent structural information with previous evidence derived from studies on mitochondrial diseases, as well as functional bioenergetics. PMID:22030538

  17. Semi-automatic detection of linear archaeological traces from orthorectified aerial images

    NASA Astrophysics Data System (ADS)

    Figorito, Benedetto; Tarantino, Eufemia

    2014-02-01

    This paper presents a semi-automatic approach for detecting archaeological traces in aerial images. The method developed was based on the multiphase active contour model (ACM). The image was segmented into three competing regions to improve the visibility of buried remains showing in the image as crop marks (e.g. centuriations, agricultural allocations, ancient roads). An initial determination of relevant traces can be quickly carried out by the operator by sketching straight lines close to the traces. Subsequently, tuning parameters (e.g. eccentricity, orientation, minimum area and distance from the input line) are used to remove non-target objects and parameterize the detected traces. The algorithm and graphical user interface for this method were developed in a MATLAB environment and tested on high-resolution orthorectified aerial images. Finally, a qualitative analysis of the method was performed by comparing the extracted traces with ancient traces verified by archaeologists.
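
    The removal of non-target objects by thresholds on eccentricity, area, and the like can be sketched with scikit-image region properties; the thresholds below are illustrative, and the orientation and distance-to-line filters of the paper are omitted:

```python
import numpy as np
from skimage.measure import label, regionprops

def filter_traces(binary, min_area=50, min_eccentricity=0.95):
    """Keep only elongated segments, discarding blob-like false positives."""
    out = np.zeros_like(binary, dtype=bool)
    for region in regionprops(label(binary)):
        if region.area >= min_area and region.eccentricity >= min_eccentricity:
            out[tuple(region.coords.T)] = True
    return out

binary = np.zeros((100, 100), dtype=bool)
binary[40, 10:90] = True            # a linear trace (kept)
binary[70:75, 70:75] = True         # a compact blob (rejected)
print(filter_traces(binary).sum())  # 80 pixels survive
```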

  18. Semi-automatic detection of Gd-DTPA-saline filled capsules for colonic transit time assessment in MRI

    NASA Astrophysics Data System (ADS)

    Harrer, Christian; Kirchhoff, Sonja; Keil, Andreas; Kirchhoff, Chlodwig; Mussack, Thomas; Lienemann, Andreas; Reiser, Maximilian; Navab, Nassir

    2008-03-01

    Functional gastrointestinal disorders result in a significant number of consultations in primary care facilities. Chronic constipation and diarrhea are regarded as two of the most common diseases, affecting between 2% and 27% of the population in western countries [1-3]. Defecatory disorders are most commonly due to dysfunction of the pelvic floor or the anal sphincter. Although an exact differentiation of these pathologies is essential for adequate therapy, diagnosis is still only based on a clinical evaluation [1]. Regarding quantification of constipation, only the ingestion of radio-opaque markers or radioactive isotopes and the consecutive assessment of colonic transit time using X-ray or scintigraphy, respectively, has been feasible in clinical settings [4-8]. However, these approaches have several drawbacks, such as involving rather inconvenient, time-consuming examinations and exposing the patient to ionizing radiation. Therefore, conventional assessment of colonic transit time has not been widely used. Most recently, a new technique for the assessment of colonic transit time using MRI and MR-contrast media filled capsules has been introduced [9]. However, due to the numerous examination dates per patient and the corresponding datasets with many images, the evaluation of the image data is relatively time-consuming. The aim of our study was to develop a computer tool to facilitate the detection of the capsules in MRI datasets and thus to shorten the evaluation time. We present a semi-automatic tool which provides an intensity- [10], size-, and shape-based [11,12] detection of ingested Gd-DTPA-saline filled capsules. After an automatic pre-classification, radiologists may easily correct the results using the application-specific user interface, therefore decreasing the evaluation time significantly.

  19. Semi-automatic attenuation of cochlear implant artifacts for the evaluation of late auditory evoked potentials.

    PubMed

    Viola, Filipa Campos; De Vos, Maarten; Hine, Jemma; Sandmann, Pascale; Bleeck, Stefan; Eyles, Julie; Debener, Stefan

    2012-02-01

    Electrical artifacts caused by the cochlear implant (CI) contaminate electroencephalographic (EEG) recordings from implanted individuals and corrupt auditory evoked potentials (AEPs). Independent component analysis (ICA) is efficient in attenuating the electrical CI artifact, and AEPs can be successfully reconstructed. However, the manual selection of CI artifact-related independent components (ICs) obtained with ICA is unsatisfactory, since it relies on expert choices and is time consuming. We developed a new procedure to evaluate temporal and topographical properties of ICs and semi-automatically select those components representing electrical CI artifact. The CI Artifact Correction (CIAC) algorithm was tested on EEG data from two different studies. The first consists of published datasets from 18 CI users listening to environmental sounds. Compared to the manual IC selection performed by an expert, the sensitivity of CIAC was 91.7% and the specificity 92.3%. After CIAC-based attenuation of CI artifacts, a high correlation between age and N1-P2 peak-to-peak amplitude was observed in the AEPs, replicating previously reported findings and further confirming the algorithm's validity. In the second study, AEPs in response to pure tone and white noise stimuli from 12 CI users who had also participated in the other study were evaluated. CI artifacts were attenuated based on the IC selection performed semi-automatically by CIAC and manually by one expert. Again, a correlation between N1 amplitude and age was found. Moreover, a high test-retest reliability for AEP N1 amplitudes and latencies suggested that CIAC-based attenuation reliably preserves plausible individual response characteristics. We conclude that CIAC enables the objective and efficient attenuation of the CI artifact in EEG recordings, as it provided a reasonable reconstruction of individual AEPs. The systematic pattern of individual differences in N1 amplitudes and latencies observed with different stimuli at
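
    A much-simplified sketch of the temporal side of such IC selection: decompose with FastICA and flag components whose post-onset power dwarfs baseline power. The ratio criterion, window length, and toy data are assumptions; CIAC additionally evaluates component topographies, which are omitted here.

```python
import numpy as np
from sklearn.decomposition import FastICA

def flag_artifact_ics(eeg, onsets, win=50, ratio_thresh=5.0):
    """Flag ICs whose post-onset power dwarfs pre-onset (baseline) power.

    eeg: array (n_channels, n_samples); onsets: sample indices >= win.
    """
    ica = FastICA(n_components=eeg.shape[0], random_state=0)
    sources = ica.fit_transform(eeg.T).T          # (n_components, n_samples)
    flagged = []
    for i, s in enumerate(sources):
        post = np.concatenate([s[t:t + win] for t in onsets])
        base = np.concatenate([s[t - win:t] for t in onsets])
        if np.var(post) / (np.var(base) + 1e-12) > ratio_thresh:
            flagged.append(i)
    return flagged

# Toy mixture: one source carries an onset-locked artifact burst.
rng = np.random.default_rng(0)
sources = rng.normal(size=(4, 2000)) * 0.1
for t in (500, 1500):
    sources[0, t:t + 50] += np.hanning(50) * 10.0
eeg = rng.normal(size=(4, 4)) @ sources
print(flag_artifact_ics(eeg, onsets=[500, 1500]))
```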

  20. (Semi) automatic extraction from airborne laser scan data of roads and paths in forested areas

    NASA Astrophysics Data System (ADS)

    Vletter, Willem F.

    2014-08-01

    The potential of airborne laser scanning as a tool for visualising micro-topography has been known for some decades. Indeed, in the archaeological field many new features have been detected or reconfirmed. However, manually mapping the enormous number of features is time-consuming and costly, so there is a need for automation. In this paper, four workflows for visualisation and (semi-)automatic extraction of (historical) roads and paths are compared. The openness concept proved preferable to the break-line concept for visualisation. Regarding extraction, the software plug-in Feature Analyst showed the best results. Openness and Feature Analyst also stood out when costs and processing time were considered. Therefore, we suggest the workflow that combines openness for visualisation with Feature Analyst for extraction. The results of this study contribute to the development of automatic extraction techniques in general. In this regard, software packages like eCognition look promising for improving extraction methods.

  1. A methodology for the semi-automatic digital image analysis of fragmental impactites

    NASA Astrophysics Data System (ADS)

    Chanou, A.; Osinski, G. R.; Grieve, R. A. F.

    2014-04-01

    A semi-automated digital image analysis method is developed for the comparative textural study of impact melt-bearing breccias. This method uses the freeware ImageJ developed by the National Institutes of Health (NIH). Digital image analysis is performed on scans of hand samples (10-15 cm across), based on macroscopic interpretations of the rock components. All image processing and segmentation are done semi-automatically, with the least possible manual intervention. The areal fraction of components is estimated, and modal abundances can be deduced where the physical optical properties (e.g., contrast, color) of the samples allow it. Other parameters that can be measured include, for example, clast size, clast-preferred orientations, average box-counting dimension or fragment shape complexity, and nearest neighbor distances (NnD). This semi-automated method allows the analysis of a larger number of samples in a relatively short time. Textures, granulometry, and shape descriptors are of considerable importance in rock characterization. The methodology is used to determine the variations in physical characteristics of some examples of fragmental impactites.

  2. Semi-Automatic Building Models and FAÇADE Texture Mapping from Mobile Phone Images

    NASA Astrophysics Data System (ADS)

    Jeong, J.; Kim, T.

    2016-06-01

    Research on 3D urban modelling has been actively carried out for a long time, and the need for it has grown rapidly in recent years due to improved geo-web services and the popularity of smart devices. Current 3D urban models, such as those provided by Google Earth, are built from aerial photos, but they have some limitations: models are difficult to update immediately when buildings change, many buildings lack a 3D model and texture, and large resources are needed for maintenance and updating. To resolve these limitations, we propose a method for semi-automatic building modelling and façade texture mapping from mobile phone images and analyze the modelling results against actual measurements. Our method consists of a camera geometry estimation step, an image matching step, and a façade mapping step. Models generated with this method were compared with actual measurements of real buildings by comparing the ratios of model edge lengths to measured edge lengths; the results showed an average length-ratio error of 5.8%. With this method, we could generate a simple building model with fine façade textures without expensive dedicated tools and datasets.

  3. Semi-Automatic Segmentation Software for Quantitative Clinical Brain Glioblastoma Evaluation

    PubMed Central

    Zhu, Y; Young, G; Xue, Z; Huang, R; You, H; Setayesh, K; Hatabu, H; Cao, F; Wong, S.T.

    2012-01-01

    Rationale and Objectives Quantitative measurement provides essential information about disease progression and treatment response in patients with Glioblastoma multiforme (GBM). The goal of this paper is to present and validate a software pipeline for semi-automatic GBM segmentation, called AFINITI (Assisted Follow-up in NeuroImaging of Therapeutic Intervention), using clinical data from GBM patients. Materials and Methods Our software adopts current state-of-the-art tumor segmentation algorithms and combines them into one clinically usable pipeline. The advantages of both traditional voxel-based and deformable shape-based segmentation are embedded in the software pipeline. The former provides an automatic tumor segmentation scheme based on T1- and T2-weighted MR brain data, and the latter refines the segmentation results with minimal manual input. Results Twenty-six clinical MR brain images of GBM patients were processed and compared with manual results. The results can be visualized using the embedded graphical user interface (GUI). Conclusion Validation using clinical GBM data showed high correlation between the AFINITI results and manual annotation. Compared to voxel-wise segmentation, AFINITI yielded more accurate results in segmenting the enhancing GBM from multimodality MRI data. The proposed pipeline could be used as additional information to interpret MR brain images in neuroradiology. PMID:22591720

  4. A semi-automatic method for extracting thin line structures in images as rooted tree network

    SciTech Connect

    Brazzini, Jacopo; Dillard, Scott; Soille, Pierre

    2010-01-01

    This paper addresses the problem of semi-automatic extraction of line networks in digital images, e.g., road or hydrographic networks in satellite images, or blood vessels in medical images. For that purpose, we improve a generic method, derived from morphological and hydrological concepts, consisting of minimum cost path estimation and flow simulation. While this approach fully exploits the local contrast and shape of the network, as well as its arborescent nature, we further incorporate local directional information about the structures in the image. Namely, an appropriate anisotropic metric is designed by using both the characteristic features of the target network and the eigen-decomposition of the gradient structure tensor of the image. The geodesic propagation from a given seed under this metric is then combined with hydrological operators for overland flow simulation to extract the line network. The algorithm is demonstrated for the extraction of blood vessels in a retina image and of a river network in a satellite image.
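
    Minimum cost path estimation is readily sketched with scikit-image; the isotropic per-pixel cost below is a simplification of the paper's tensor-based anisotropic metric, and the synthetic image is invented:

```python
import numpy as np
from skimage.graph import route_through_array

# Cost image: cheap along a (dark) curvilinear structure, expensive elsewhere.
image = np.full((100, 100), 200.0)
rr = np.arange(100)
image[rr // 2 + 25, rr] = 10.0               # a faint line from (25,0) to (74,99)

cost = image - image.min() + 1.0             # strictly positive costs
path, total_cost = route_through_array(cost, start=(25, 0), end=(74, 99),
                                       fully_connected=True)
print(len(path), total_cost)                 # the path hugs the line structure
```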

  5. A simple semi-automatic approach for land cover classification from multispectral remote sensing imagery.

    PubMed

    Jiang, Dong; Huang, Yaohuan; Zhuang, Dafang; Zhu, Yunqiang; Xu, Xinliang; Ren, Hongyan

    2012-01-01

    Land cover data represent a fundamental data source for various types of scientific research. The classification of land cover based on satellite data is a challenging task, and an efficient classification method is needed. In this study, an automatic scheme is proposed for the classification of land use using multispectral remote sensing images based on change detection and a semi-supervised classifier. The satellite image can be automatically classified using only the prior land cover map and existing images; therefore human involvement is reduced to a minimum, ensuring the operability of the method. The method was tested in the Qingpu District of Shanghai, China. Using Environment Satellite 1 (HJ-1) images from 2009 with 30 m spatial resolution, the areas were classified into five main types of land cover based on previous land cover data and spectral features. The results agreed well with validation land cover maps, with a Kappa value of 0.79 and statistical area biases below 6%. This study proposed a simple semi-automatic approach for land cover classification using prior maps, with satisfactory accuracy, which integrates the accuracy of visual interpretation with the performance of automatic classification methods. The method can conveniently be used for land cover mapping in areas lacking ground reference information or for identifying regions of rapid land cover change (such as rapid urbanization). PMID:23049886
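
    The semi-supervised step can be illustrated with scikit-learn's LabelSpreading, treating pixels whose prior-map label survives change detection as labeled and the rest as unknown; the data below is synthetic, and the classifier choice is an assumption rather than the authors' exact method.

```python
import numpy as np
from sklearn.semi_supervised import LabelSpreading

# Pixels as spectral feature vectors; -1 marks "possibly changed" pixels
# whose prior-map label is withheld.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4))                     # 4 spectral bands
y = rng.integers(0, 5, size=500)                  # 5 land cover classes
X += y[:, None]                                   # make classes separable
y_partial = y.copy()
y_partial[rng.random(500) < 0.6] = -1             # 60% unlabeled

model = LabelSpreading(kernel='knn', n_neighbors=10)
model.fit(X, y_partial)
print((model.transduction_ == y).mean())          # agreement with ground truth
```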

  6. Conceptual design of semi-automatic wheelbarrow to overcome ergonomics problems among palm oil plantation workers

    NASA Astrophysics Data System (ADS)

    Nawik, N. S. M.; Deros, B. M.; Rahman, M. N. A.; Sukadarin, E. H.; Nordin, N.; Tamrin, S. B. M.; Bakar, S. A.; Norzan, M. L.

    2015-12-01

    Ergonomics problems are among the main issues faced by palm oil plantation workers, especially during harvesting and collecting of fresh fruit bunches (FFB). The intensive manual handling and labor activities involved have been associated with a high prevalence of musculoskeletal disorders (MSDs) among palm oil plantation workers. New and safe machine and equipment technology for palm oil plantations is therefore very important in helping workers reduce risks and injuries while working. The aim of this research is to improve the design of a wheelbarrow suitable for workers and small oil palm plantations. The wheelbarrow design was drawn using CATIA ergonomic features, and the ergonomics assessment was performed by comparison with the existing wheelbarrow design. A conceptual design was developed based on the problems reported by workers. From the analysis of these problems, a concept design of an ergonomic semi-automatic wheelbarrow, safe and suitable for palm oil plantation workers, was produced.

  8. Semi-automatic medical image segmentation with adaptive local statistics in Conditional Random Fields framework.

    PubMed

    Hu, Yu-Chi J; Grossberg, Michael D; Mageras, Gikas S

    2008-01-01

    Planning radiotherapy and surgical procedures usually requires onerous manual segmentation of anatomical structures from medical images. In this paper we present a semi-automatic and accurate segmentation method that dramatically reduces the time and effort required of expert users. This is accomplished by giving the user an intuitive graphical interface to indicate samples of target and non-target tissue by loosely drawing a few brush strokes on the image. We use these brush strokes to provide the statistical input for a Conditional Random Field (CRF) based segmentation. Since we extract purely statistical information from the user input, we eliminate the need for assumptions on boundary contrast previously used by many other methods. A new feature of our method is that the statistics on one image can be reused on related images without registration. To demonstrate this, we show that boundary statistics provided on a few 2D slices of volumetric medical data can be propagated through the entire 3D stack of images without using the geometric correspondence between images. In addition, the image segmentation from the CRF can be formulated as a minimum s-t graph cut problem, which has a solution that is both globally optimal and fast. The combination of fast segmentation and minimal, reusable user input makes this a powerful technique for the segmentation of medical images. PMID:19163362

  9. A semi-automatic non-destructive method to quantify grapevine downy mildew sporulation.

    PubMed

    Peressotti, Elisa; Duchêne, Eric; Merdinoglu, Didier; Mestre, Pere

    2011-02-01

    The availability of fast, reliable and non-destructive methods for the analysis of pathogen development contributes to a better understanding of plant-pathogen interactions. This is particularly true for the genetic analysis of quantitative resistance to plant pathogens, where the availability of a method allowing a precise quantification of pathogen development allows the reliable detection of different genomic regions involved in the resistance. Grapevine downy mildew, caused by the biotrophic Oomycete Plasmopara viticola, is one of the most important diseases affecting viticulture. Here we report the development of a simple image analysis-based semi-automatic method for the quantification of grapevine downy mildew sporulation, requiring just a compact digital camera and the open source software ImageJ. We confirm the suitability of the method for the analysis of the interaction between grapevine and downy mildew by performing QTL analysis of resistance to downy mildew as well as analysis of the kinetics of downy mildew infection. The non-destructive nature of the method will enable comparison between the phenotypic and molecular data obtained from the very same sample, resulting in a more accurate description of the interaction, while its simplicity makes it easily adaptable to other plant-pathogen interactions, in particular those involving downy mildews. PMID:21167874

  10. Integration of semi-automatic detection and sediment connectivity assessment for the characterization of sediment source areas in mountain catchments

    NASA Astrophysics Data System (ADS)

    Crema, Stefano; Bossi, Giulia; Marchi, Lorenzo; Cavalli, Marco

    2016-04-01

    Identifying areas that are directly delivering sediment to the channel network or to a catchment outlet is of great importance for a sound sediment dynamic characterization and for assessing sediment budget. We present an integration of remote sensing analysis techniques to characterize the effective sediment contributing area that is the sub-portion of the catchment in which sediment is effectively routed towards the catchment outlet. A semi-automatic mapping of active sediment source areas is carried out via image analysis techniques. To this purpose, satellite multispectral images and aerial orthophotos are considered for the analysis. Several algorithms for features extraction are applied and the maps obtained are compared with an expert-based sediment source mapping derived from photointerpretation and field surveys. The image-based analysis is additionally integrated with a topography-driven filtering procedure. Thanks to the availability of High-Resolution, LiDAR-derived Digital Terrain Models, it is possible to work at a fine scale and to compute morphometric parameters (e.g., slope, roughness, curvature) suitable for refining the image analysis. In particular, information on local topography was integrated with the image-based analysis to discriminate between rocky outcrops and sediment sources, thus improving the overall consistency of the procedure. The sediment source areas are then combined with the output of a connectivity assessment. A topography-based index of sediment connectivity is computed for the analyzed areas in order to better estimate the effective sediment contributing area and to obtain a ranking of the source areas in the studied catchments. The study methods have been applied in catchments of the Eastern Italian Alps where a detailed census of sediment source areas is available. The comparison of the results of image analysis with expert-based sediment sources mapping shows a satisfactory agreement between the two approaches

  11. Compact semi-automatic incident sampler for personal monitoring of volatile organic compounds in occupational air.

    PubMed

    Solbu, Kasper; Hersson, Merete; Thorud, Syvert; Lundanes, Elsa; Nilsen, Terje; Synnes, Ole; Ellingsen, Dag; Molander, Paal

    2010-05-01

    Suddenly occurring and time-limited chemical exposures caused by unintended incidents may pose a threat to many workers at various work sites, and monitoring of exposure during such occasional incidents is challenging. In this study a compact, low-weight, personal semi-automatic pumped unit for sampling organic vapor-phase compounds from occupational air during sporadic and suddenly occurring incidents has been developed, providing simple activation by the worker potentially subjected to the suddenly occurring exposure when a trained occupational hygienist is not available. The sampler encompasses a tube (glass or stainless steel) containing an adsorbent material in combination with a small membrane pump, where the adsorbent is capped at both ends by gas-tight solenoid valves. The sampler is operated by a conventional 9 V battery, which tolerates long storage times (at least one year), and is activated by pulling a pin, followed by automatic operation and subsequent closing of the valves prior to shipping to a laboratory. The adjustable sampling air flow rate and the sampling time are pre-programmed, with a standard setting of 200 mL min⁻¹ and 30 min, respectively. The average airflow in the time interval 25-30 min compared to the average airflow in the interval 2-7 min was 92-95% (n = 6), while the flow rate between-assay precisions (RSD) for six different samplers on three days each were in the range 0.5-3.7%. Incident sampler recoveries of VOCs from a generated VOC atmosphere relative to a validated standard method were between 95 and 102% (±4-5%). The valves that seal the sampler adsorbent during storage have been shown to prevent an external VOC atmosphere (500 mg m⁻³) from entering the adsorbent tube, and the sampler adsorbent is storable for at least one month owing to the absence of contaminant ingress from internal parts. The sampler was also suitable for trapping semi-volatile organophosphates. PMID:21491688

  12. Localization accuracy from automatic and semi-automatic rigid registration of locally-advanced lung cancer targets during image-guided radiation therapy

    SciTech Connect

    Robertson, Scott P.; Weiss, Elisabeth; Hugo, Geoffrey D.

    2012-01-15

    Purpose: To evaluate localization accuracy resulting from rigid registration of locally-advanced lung cancer targets using fully automatic and semi-automatic protocols for image-guided radiation therapy. Methods: Seventeen lung cancer patients, fourteen also presenting with involved lymph nodes, received computed tomography (CT) scans once per week throughout treatment under active breathing control. A physician contoured both lung and lymph node targets for all weekly scans. Various automatic and semi-automatic rigid registration techniques were then performed for both individual and simultaneous alignments of the primary gross tumor volume (GTV_P) and involved lymph nodes (GTV_LN) to simulate the localization process in image-guided radiation therapy. Techniques included "standard" (direct registration of weekly images to a planning CT), "seeded" (manual prealignment of targets to guide standard registration), "transitive-based" (alignment of pretreatment and planning CTs through one or more intermediate images), and "rereferenced" (designation of a new reference image for registration). Localization error (LE) was assessed as the residual centroid and border distances between targets from planning and weekly CTs after registration. Results: Initial bony alignment resulted in centroid LE of 7.3 ± 5.4 mm and 5.4 ± 3.4 mm for the GTV_P and GTV_LN, respectively. Compared to bony alignment, transitive-based and seeded registrations significantly reduced GTV_P centroid LE to 4.7 ± 3.7 mm (p = 0.011) and 4.3 ± 2.5 mm (p < 1 × 10^-3), respectively, but the smallest GTV_P LE of 2.4 ± 2.1 mm was provided by rereferenced registration (p < 1 × 10^-6). Standard registration significantly reduced GTV_LN centroid LE to 3.2 ± 2.5 mm (p < 1 × 10^-3) compared to bony alignment, with little additional gain offered by the other registration techniques. For simultaneous target alignment, centroid LE as low
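
    As a hedged illustration of the reported metric, the centroid localization error between a planning-CT target mask and a registered weekly-CT mask can be computed along these lines (mask arrays and voxel spacing are assumed inputs):

        import numpy as np

        def centroid_le(mask_plan, mask_week, spacing=(1.0, 1.0, 1.0)):
            """Residual centroid distance in mm between two binary target masks;
            `spacing` is the voxel size in mm along each axis."""
            c_plan = np.array(np.nonzero(mask_plan)).mean(axis=1) * np.array(spacing)
            c_week = np.array(np.nonzero(mask_week)).mean(axis=1) * np.array(spacing)
            return np.linalg.norm(c_plan - c_week)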

  13. Semi-automatic registration of digital histopathology images to in-vivo MR images in molded and unmolded prostates

    PubMed Central

    Starobinets, Olga; Guo, Richard; Simko, Jeffry P.; Kuchinsky, Kyle; Kurhanewicz, John; Carroll, Peter R.; Greene, Kirsten L.; Noworolski, Susan M.

    2013-01-01

    Purpose To evaluate a semi-automatic software-based method of registering in vivo prostate magnetic resonance (MR) images to digital histopathology images using two approaches: 1) in which the prostates were molded to simulate distortion due to the endorectal imaging coil prior to fixation, and 2) in which the prostates were not molded. Materials and Methods T2-weighted MR images and digitized whole-mount histopathology images were acquired for twenty-six patients with biopsy-confirmed prostate cancer who underwent radical prostatectomy. Ten excised prostates were molded prior to fixation. A semi-automatic method was used to align MR images to histopathology. Percent overlap between MR and histopathology images, as well as distances between corresponding anatomical landmarks were calculated and used to evaluate the registration technique for molded and unmolded cases. Results The software successfully morphed histology-based prostate images into corresponding MR images. Percent overlap improved from 80.4±5.8% prior to morphing to 99.7±0.62% post morphing. Molded prostates had a smaller distance between landmarks (1.91±0.75 mm) versus unmolded (2.34±0.68 mm), p < 0.08. Conclusion Molding a prostate prior to fixation provided a better alignment of internal structures within the prostate, but this did not reach statistical significance. Software-based morphing allowed for nearly complete overlap between the pathology slides and the MR images. PMID:24136783
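
    The two evaluation measures can be sketched as follows; the exact overlap definition is not stated in the abstract, so intersection-over-union is assumed here purely for illustration:

        import numpy as np

        def percent_overlap(mr_mask, hist_mask):
            """Overlap between the MR prostate mask and the morphed histology mask,
            taken here as the fraction of the union covered by both (assumption)."""
            inter = np.logical_and(mr_mask, hist_mask).sum()
            union = np.logical_or(mr_mask, hist_mask).sum()
            return 100.0 * inter / union

        def mean_landmark_distance(pts_mr, pts_hist):
            """Mean distance in mm between corresponding anatomical landmarks."""
            diffs = np.asarray(pts_mr, float) - np.asarray(pts_hist, float)
            return np.linalg.norm(diffs, axis=1).mean()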

  14. A semi-automatic 2D-to-3D video conversion with adaptive key-frame selection

    NASA Astrophysics Data System (ADS)

    Ju, Kuanyu; Xiong, Hongkai

    2014-11-01

    To compensate for the deficit of 3D content, 2D-to-3D video conversion has recently attracted increasing attention from both industrial and academic communities. Semi-automatic 2D-to-3D conversion, which estimates the depth of non-key-frames from key-frames, is more desirable owing to its balance between labor cost and 3D effects. The location of key-frames plays a role in the quality of depth propagation. This paper proposes a semi-automatic 2D-to-3D scheme with adaptive key-frame selection to preserve temporal continuity more reliably and to reduce the depth propagation errors caused by occlusion. Potential key-frames are localized in terms of clustered color variation and motion intensity. The length of the key-frame interval is also taken into account to keep the accumulated propagation errors under control and to guarantee minimal user interaction. Once the key-frame depth maps are aligned with user interaction, the non-key-frame depth maps are automatically propagated by shifted bilateral filtering. Considering that the depth of objects may change due to object motion or camera zoom, a bi-directional depth propagation scheme is adopted in which a non-key-frame is interpolated from two adjacent key-frames. The experimental results show that the proposed scheme performs better than existing 2D-to-3D schemes with fixed key-frame intervals.
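
    A rough sketch of the key-frame selection idea, scoring frames by colour-histogram change plus motion intensity and enforcing interval bounds (the weights and thresholds are illustrative assumptions, not the paper's values):

        import numpy as np

        def keyframe_scores(frames, bins=8):
            """frames: list of HxWx3 uint8 arrays; returns one score per frame."""
            scores = [0.0]
            for prev, cur in zip(frames[:-1], frames[1:]):
                h1, _ = np.histogramdd(prev.reshape(-1, 3), bins=bins, range=[(0, 256)] * 3)
                h2, _ = np.histogramdd(cur.reshape(-1, 3), bins=bins, range=[(0, 256)] * 3)
                color_var = np.abs(h1 - h2).sum() / prev.size          # histogram change
                motion = np.abs(cur.astype(float) - prev.astype(float)).mean()
                scores.append(color_var + 0.1 * motion)                # weight illustrative
            return np.array(scores)

        def pick_keyframes(scores, min_gap=10, max_gap=60):
            """Greedy selection keeping key-frame intervals within [min_gap, max_gap]."""
            thr = np.percentile(scores, 90)
            keys, last = [0], 0
            for i in range(1, len(scores)):
                if i - last >= max_gap or (i - last >= min_gap and scores[i] > thr):
                    keys.append(i)
                    last = i
            return keys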

  15. Contour propagation in MRI-guided radiotherapy treatment of cervical cancer: the accuracy of rigid, non-rigid and semi-automatic registrations

    NASA Astrophysics Data System (ADS)

    van der Put, R. W.; Kerkhof, E. M.; Raaymakers, B. W.; Jürgenliemk-Schulz, I. M.; Lagendijk, J. J. W.

    2009-12-01

    External beam radiation treatment for patients with cervical cancer is hindered by the relatively large motion of the target volume. A hybrid MRI-accelerator system makes it possible to acquire online MR images during treatment in order to correct for motion and deformation. To fully benefit from such a system, online delineation of the target volumes is necessary. The aim of this study is to investigate the accuracy of rigid, non-rigid and semi-automatic registrations of MR images for interfractional contour propagation in patients with cervical cancer. Registration using mutual information was performed on both bony anatomy and soft tissue. A B-spline transform was used for the non-rigid method. Semi-automatic registration was implemented with a point set registration algorithm on a small set of manual landmarks. Online registration was simulated by application of each method to four weekly MRI scans for each of 33 cervical cancer patients. Evaluation was performed by distance analysis with respect to manual delineations. The results show that soft-tissue registration significantly (P < 0.001) improves the accuracy of contour propagation compared to registration based on bony anatomy. A combination of user-assisted and non-rigid registration provides the best results with a median error of 3.2 mm (1.4-9.9 mm) compared to 5.9 mm (1.7-19.7 mm) with bone registration (P < 0.001) and 3.4 mm (1.3-19.1 mm) with non-rigid registration (P = 0.01). In a clinical setting, the benefit may be further increased when outliers can be removed by visual inspection of the online images. We conclude that for external beam radiation treatment of cervical cancer, online MR imaging will allow target localization based on soft tissue visualization, which provides a significantly higher accuracy than localization based on bony anatomy. The use of limited user input to guide the registration increases overall accuracy. Additional non-rigid registration further reduces the propagation
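
    The pipeline the study describes (rigid mutual-information alignment followed by a B-spline refinement) can be approximated with off-the-shelf tools; this SimpleITK sketch assumes two MR volumes in hypothetical files and is not the authors' implementation:

        import SimpleITK as sitk

        fixed = sitk.ReadImage("week_n.nii", sitk.sitkFloat32)      # hypothetical paths
        moving = sitk.ReadImage("planning.nii", sitk.sitkFloat32)

        # rigid pre-alignment driven by mutual information
        initial = sitk.CenteredTransformInitializer(
            fixed, moving, sitk.Euler3DTransform(),
            sitk.CenteredTransformInitializerFilter.GEOMETRY)
        rigid_reg = sitk.ImageRegistrationMethod()
        rigid_reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
        rigid_reg.SetOptimizerAsRegularStepGradientDescent(
            learningRate=2.0, minStep=1e-4, numberOfIterations=200)
        rigid_reg.SetInitialTransform(initial, inPlace=False)
        rigid_reg.SetInterpolator(sitk.sitkLinear)
        rigid = rigid_reg.Execute(fixed, moving)

        # non-rigid refinement with a B-spline transform
        bspline = sitk.BSplineTransformInitializer(fixed, [8] * fixed.GetDimension())
        def_reg = sitk.ImageRegistrationMethod()
        def_reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
        def_reg.SetOptimizerAsLBFGSB()
        def_reg.SetMovingInitialTransform(rigid)
        def_reg.SetInitialTransform(bspline, inPlace=False)
        def_reg.SetInterpolator(sitk.sitkLinear)
        deformable = def_reg.Execute(fixed, moving)   # use to propagate contours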

  16. Computer Center: CIBE Systems.

    ERIC Educational Resources Information Center

    Crovello, Theodore J.

    1982-01-01

    Differentiates between computer systems and Computers in Biological Education (CIBE) systems (computer systems intended for use in biological education). Describes several stand-alone CIBE systems: single-user microcomputer; single-user microcomputer/video-disc; multiuser microcomputers; multiuser maxicomputer; and local and long distance computer…

  17. Semi-automatic central-chest lymph-node definition from 3D MDCT images

    NASA Astrophysics Data System (ADS)

    Lu, Kongkuo; Higgins, William E.

    2010-03-01

    Central-chest lymph nodes play a vital role in lung-cancer staging. The three-dimensional (3D) definition of lymph nodes from multidetector computed-tomography (MDCT) images, however, remains an open problem. This is because of the limitations in the MDCT imaging of soft-tissue structures and the complicated phenomena that influence the appearance of a lymph node in an MDCT image. In the past, we have made significant efforts toward developing (1) live-wire-based segmentation methods for defining 2D and 3D chest structures and (2) a computer-based system for automatic definition and interactive visualization of the Mountain central-chest lymph-node stations. Based on these works, we propose new single-click and single-section live-wire methods for segmenting central-chest lymph nodes. The single-click live wire only requires the user to select an object pixel on one 2D MDCT section and is designed for typical lymph nodes. The single-section live wire requires the user to process one selected 2D section using standard 2D live wire, but it is more robust. We applied these methods to the segmentation of 20 lymph nodes from two human MDCT chest scans (10 per scan) drawn from our ground-truth database. The single-click live wire segmented 75% of the selected nodes successfully and reproducibly, while the success rate for the single-section live wire was 85%. We are able to segment the remaining nodes using our previously derived (but more interaction-intensive) 2D live-wire method incorporated in our lymph-node analysis system. Both proposed methods are reliable and applicable to a wide range of pulmonary lymph nodes.
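
    The live-wire idea underlying both methods (an optimal path over an edge-strength cost image) can be sketched with scikit-image; this is a generic two-click live wire, not the authors' single-click variant:

        import numpy as np
        from skimage.filters import sobel
        from skimage.graph import route_through_array

        def live_wire(section, seed, target):
            """Least-cost path between two user-selected pixels of one 2D MDCT
            section; strong edges are cheap to travel along, so the path hugs
            the node boundary."""
            edges = sobel(section.astype(float))
            costs = 1.0 / (edges / edges.max() + 1e-3)
            path, _ = route_through_array(costs, seed, target, fully_connected=True)
            return np.array(path)        # (row, col) pixels of the boundary segment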

  18. Conversation analysis at work: detection of conflict in competitive discussions through semi-automatic turn-organization analysis.

    PubMed

    Pesarin, Anna; Cristani, Marco; Murino, Vittorio; Vinciarelli, Alessandro

    2012-10-01

    This study proposes a semi-automatic approach aimed at detecting conflict in conversations. The approach is based on statistical techniques capable of identifying turn-organization regularities associated with conflict. The only manual step of the process is the segmentation of the conversations into turns (time intervals during which only one person talks) and overlapping speech segments (time intervals during which several persons talk at the same time). The rest of the process takes place automatically and the results show that conflictual exchanges can be detected with Precision and Recall around 70% (the experiments have been performed over 6 h of political debates). The approach brings two main benefits: the first is the possibility of analyzing potentially large amounts of conversational data with a limited effort, the second is that the model parameters provide indications on what turn-regularities are most likely to account for the presence of conflict. PMID:22009168

  19. Quantitative evaluation of six graph based semi-automatic liver tumor segmentation techniques using multiple sets of reference segmentation

    NASA Astrophysics Data System (ADS)

    Su, Zihua; Deng, Xiang; Chefd'hotel, Christophe; Grady, Leo; Fei, Jun; Zheng, Dong; Chen, Ning; Xu, Xiaodong

    2011-03-01

    Graph-based semi-automatic tumor segmentation techniques have demonstrated great potential for efficiently measuring tumor size from CT images. Comprehensive and quantitative validation is essential to ensure the efficacy of graph-based tumor segmentation techniques in clinical applications. In this paper, we present a quantitative validation study of six graph-based 3D semi-automatic tumor segmentation techniques using multiple sets of expert segmentation. The six segmentation techniques are the Random Walk (RW), Watershed-based Random Walk (WRW), LazySnapping (LS), GraphCut (GHC), GrabCut (GBC), and GrowCut (GWC) algorithms. The validation was conducted using clinical CT data of 29 liver tumors and four sets of expert segmentation. The performance of the six algorithms was evaluated in terms of accuracy and reproducibility. The accuracy was quantified using the Normalized Probabilistic Rand Index (NPRI), which takes into account the variation across multiple expert segmentations. The reproducibility was evaluated by the change of the NPRI across 10 different sets of user initializations. Our results from the accuracy test demonstrated that RW (0.63) showed the highest NPRI value, compared to WRW (0.61), GWC (0.60), GHC (0.58), LS (0.57) and GBC (0.27). The results from the reproducibility test indicated that GBC is more sensitive to user initialization than the other five algorithms. Compared to previous tumor segmentation validation studies using one set of reference segmentation, our evaluation methods use multiple sets of expert segmentation to address the inter- and intra-rater variability issues in ground truth annotation, and provide a quantitative assessment for comparing different segmentation algorithms.
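
    Of the six techniques, the best-scoring Random Walk is available off the shelf; a self-contained toy example with synthetic data and user seeds (the parameters are scikit-image defaults, not the study's settings):

        import numpy as np
        from skimage.segmentation import random_walker

        # synthetic CT-like slice: a bright blob ("tumor") on a noisy background
        img = np.zeros((128, 128))
        img[40:80, 50:90] = 1.0
        img += 0.3 * np.random.default_rng(0).standard_normal(img.shape)

        labels = np.zeros(img.shape, dtype=np.uint8)   # 0 = unlabeled
        labels[60, 70] = 1                             # user seed inside the lesion
        labels[5, 5] = labels[120, 120] = 2            # background seeds

        seg = random_walker(img, labels, beta=130, mode='bf')
        tumor_mask = seg == 1                          # reproducibility depends on seeds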

  20. Comparison Of Semi-Automatic And Automatic Slick Detection Algorithms For Jiyeh Power Station Oil Spill, Lebanon

    NASA Astrophysics Data System (ADS)

    Osmanoglu, B.; Ozkan, C.; Sunar, F.

    2013-10-01

    After air strikes on July 14 and 15, 2006, the Jiyeh Power Station started leaking oil into the eastern Mediterranean Sea. The power station is located about 30 km south of Beirut, and the slick covered about 170 km of coastline, threatening the neighboring countries Turkey and Cyprus. Due to the ongoing conflict between Israel and Lebanon, cleaning efforts could not start immediately, resulting in 12,000 to 15,000 tons of fuel oil leaking into the sea. In this paper we compare results from automatic and semi-automatic slick detection algorithms. The automatic detection method combines the probabilities calculated for each pixel from each image to obtain a joint probability, minimizing the adverse effects of the atmosphere on oil spill detection. The method can readily utilize X-, C- and L-band data where available. Furthermore, wind and wave speed observations can be used for a more accurate analysis. For this study, we utilize Envisat ASAR ScanSAR data. A probability map is generated based on the radar backscatter, the effect of wind and a dampening value. The semi-automatic algorithm is based on supervised classification. As the classifier, an Artificial Neural Network Multilayer Perceptron (ANN MLP) is used, since it is more flexible and efficient than a conventional maximum likelihood classifier for multisource and multi-temporal data. The Levenberg-Marquardt (LM) learning algorithm is chosen for training the ANN MLP. Training and test data for supervised classification are composed from the textural information extracted from the SAR images. The approach is semi-automatic because tuning the classifier parameters and composing the training data require human interaction. We point out the similarities and differences between the two methods and their results, underlining their advantages and disadvantages. Due to the lack of ground truth data, we compare the obtained results to each other, as well as to other published oil slick area assessments.
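
    The supervised branch can be approximated with scikit-learn; note that scikit-learn's MLP offers lbfgs/adam rather than Levenberg-Marquardt training, so this is only an analogue of the paper's ANN MLP (LM) setup, with hypothetical feature files:

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        # X: per-pixel textural features from the SAR images, y: 1 = slick, 0 = sea
        X = np.load("texture_features.npy")   # assumed array, shape (n_pixels, n_feat)
        y = np.load("labels.npy")             # assumed operator-digitized labels

        clf = make_pipeline(
            StandardScaler(),
            MLPClassifier(hidden_layer_sizes=(32, 16), solver="lbfgs",
                          max_iter=500, random_state=0))
        clf.fit(X, y)
        slick_probability = clf.predict_proba(X)[:, 1]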

  1. Semi-automatic reduced order models from expert-defined transients

    NASA Astrophysics Data System (ADS)

    Class, Andreas; Prill, Dennis

    2013-11-01

    Boiling water reactors (BWRs) not only show growing power oscillations at high-power low-flow conditions but also amplitude-limited oscillations with temporal flow reversal. Methodologies applicable in the non-linear regime allow insight into the physical mechanisms behind BWR dynamics. The proposed methodology exploits relevant simulation data computed from an expert choice of transient. Proper orthogonal modes are extracted and serve as Ansatz functions within a spectral approach, yielding a reduced order model (ROM). The steps required to achieve reliable and numerically stable ROMs are discussed, i.e. mean value handling, inner product choice, variational formulation of derivatives and boundary conditions. Two strongly non-linear systems are analyzed: the tubular reactor, including Arrhenius reaction and heat losses, which responds sensitively to transient boundary conditions; and a simple natural convection loop, considered due to its dynamical similarities to BWRs, which exhibits bifurcations resulting in limit cycles. The presented POD-ROM methodology reproduces the dynamics with a small number of spectral modes and reaches appreciable accuracy. Funded by AREVA GmbH.
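
    The core POD step (mean removal, then extraction of proper orthogonal modes from simulation snapshots) reduces to a singular value decomposition; a minimal sketch, with the snapshot layout assumed as one transient state per column:

        import numpy as np

        def pod_modes(snapshots, n_modes):
            """snapshots: (n_dof, n_times) matrix of expert-chosen transient data.
            Returns the leading spatial modes and the subtracted mean field."""
            mean = snapshots.mean(axis=1, keepdims=True)        # mean value handling
            U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
            captured = np.cumsum(s**2) / np.sum(s**2)
            print(f"{n_modes} modes capture {captured[n_modes - 1]:.4f} of the energy")
            return U[:, :n_modes], mean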

  2. Quality Metrics of Semi Automatic DTM from Large Format Digital Camera

    NASA Astrophysics Data System (ADS)

    Narendran, J.; Srinivas, P.; Udayalakshmi, M.; Muralikrishnan, S.

    2014-11-01

    High-resolution digital images from the Ultracam-D Large Format Digital Camera (LFDC) were used for near-automatic DTM generation. In the past, manual methods for DTM generation were used, which are time consuming and labour intensive. In this study, LFDC imagery was used in synergy with an accurate position and orientation system and processes such as image matching algorithms, distributed processing and filtering techniques for near-automatic DTM generation. Traditionally, DTM accuracy is reported using check points collected from the field, which are limited in number, time consuming and costly to acquire. This paper discusses the reliability of the near-automatic DTM generated from Ultracam-D for an operational project covering an area of nearly 600 sq. km, using 21,000 check points captured stereoscopically by experienced operators. The reliability of the DTM for three study areas with different morphology is presented using a large number of stereo check points and parameters related to the statistical distribution of residuals, such as skewness, kurtosis, standard deviation and linear error at 90% confidence. The residuals obtained for the three areas follow a normal distribution, in agreement with the majority of standards on positional accuracy. Quality metrics in terms of reliability were computed for the generated DTMs, and the tables and graphs show the potential of Ultracam-D for a semi-automatic DTM generation process over different terrain types.
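
    The reported reliability statistics can be reproduced from the check-point residuals along these lines (LE90 is taken here directly as the 90th percentile of absolute residuals, one common convention):

        import numpy as np
        from scipy.stats import kurtosis, skew

        def dtm_quality(dtm_heights, check_heights):
            """Distribution statistics of DTM-minus-check-point residuals."""
            r = np.asarray(dtm_heights, float) - np.asarray(check_heights, float)
            return {"mean": r.mean(),
                    "std": r.std(ddof=1),
                    "skewness": skew(r),
                    "kurtosis": kurtosis(r),
                    "LE90": np.percentile(np.abs(r), 90)}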

  3. RFA-cut: Semi-automatic segmentation of radiofrequency ablation zones with and without needles via optimal s-t-cuts.

    PubMed

    Egger, Jan; Busse, Harald; Brandmaier, Philipp; Seider, Daniel; Gawlitza, Matthias; Strocka, Steffen; Voglreiter, Philip; Dokter, Mark; Hofmann, Michael; Kainz, Bernhard; Chen, Xiaojun; Hann, Alexander; Boechat, Pedro; Yu, Wei; Freisleben, Bernd; Alhonnoro, Tuomas; Pollari, Mika; Moche, Michael; Schmalstieg, Dieter

    2015-08-01

    In this contribution, we present a semi-automatic segmentation algorithm for radiofrequency ablation (RFA) zones via optimal s-t-cuts. Our interactive graph-based approach builds upon a polyhedron to construct the graph and was specifically designed for computed tomography (CT) acquisitions from patients that had RFA treatments of Hepatocellular Carcinomas (HCC). For evaluation, we used twelve post-interventional CT datasets from the clinical routine, and as evaluation metric we utilized the Dice Similarity Coefficient (DSC), which is commonly accepted for judging computer-aided medical segmentation tasks. Compared with purely manual slice-by-slice expert segmentations from interventional radiologists, we were able to achieve a DSC of about eighty percent, which is sufficient for our clinical needs. Moreover, our approach was able to handle images containing (DSC = 75.9%) and not containing (DSC = 78.1%) the RFA needles still in place. Additionally, we found no statistically significant difference (p < 0.423) between the segmentation results of the two subgroups under a Mann-Whitney test. Finally, to the best of our knowledge, this is the first time a segmentation approach for CT scans including the RFA needles has been reported, and we show why another state-of-the-art segmentation method fails for these cases. Intraoperative scans including an RFA probe are very critical in clinical practice and need very careful segmentation and inspection to avoid under-treatment, which may result in tumor recurrence (up to 40%). If the decision can be made during the intervention, an additional ablation can be performed without removing the entire needle. This decreases the patient stress and the associated risks and costs of a separate intervention at a later date. Ultimately, the segmented ablation zone containing the RFA needle can be used for a precise ablation simulation, as the real needle position is known. PMID:26736783
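
    The underlying optimal s-t-cut can be illustrated on a toy 2D pixel graph with NetworkX; capacities from intensity similarity and infinite-capacity terminal links stand in for the paper's polyhedron-based graph construction:

        import numpy as np
        import networkx as nx

        def st_cut_segment(img, fg_seeds, bg_seeds, sigma=0.1):
            """img: 2D float array; fg_seeds/bg_seeds: lists of (row, col) tuples."""
            h, w = img.shape
            G = nx.DiGraph()
            for y in range(h):
                for x in range(w):
                    for dy, dx in ((0, 1), (1, 0)):          # 4-connected grid
                        yy, xx = y + dy, x + dx
                        if yy < h and xx < w:
                            cap = float(np.exp(-(img[y, x] - img[yy, xx]) ** 2
                                               / (2 * sigma ** 2)))
                            G.add_edge((y, x), (yy, xx), capacity=cap)
                            G.add_edge((yy, xx), (y, x), capacity=cap)
            for p in fg_seeds:
                G.add_edge('s', p, capacity=float('inf'))
            for p in bg_seeds:
                G.add_edge(p, 't', capacity=float('inf'))
            _, (source_side, _) = nx.minimum_cut(G, 's', 't')
            mask = np.zeros((h, w), dtype=bool)
            for node in source_side:
                if node != 's':
                    mask[node] = True
            return mask              # pixels on the source side = ablation zone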

  4. 10 CFR Appendix J1 to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Automatic and Semi-Automatic Clothes...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Waivers and Field Testing for Non-conventional Clothes Washers: manufacturers of nonconventional clothes washers… Title 10 (Energy), Part 430, Subpart B, Appendix J1: Uniform Test Method for Measuring the Energy Consumption of Automatic and Semi-Automatic Clothes Washers.

  5. 10 CFR Appendix J1 to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Automatic and Semi-Automatic Clothes...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    Waivers and Field Testing for Non-conventional Clothes Washers: manufacturers of nonconventional clothes washers… Title 10 (Energy), Part 430, Subpart B, Appendix J1: Uniform Test Method for Measuring the Energy Consumption of Automatic and Semi-Automatic Clothes Washers.

  6. Computer security in DOE distributed computing systems

    SciTech Connect

    Hunteman, W.J.

    1990-01-01

    The modernization of DOE facilities amid limited funding is creating pressure on DOE facilities to find innovative approaches to their daily activities. Distributed computing systems are becoming cost-effective solutions to improved productivity. This paper defines and describes typical distributed computing systems in the DOE. The special computer security problems present in distributed computing systems are identified and compared with traditional computer systems. The existing DOE computer security policy supports only basic networks and traditional computer systems and does not address distributed computing systems. A review of the existing policy requirements is followed by an analysis of the policy as it applies to distributed computing systems. Suggested changes in the DOE computer security policy are identified and discussed. The long lead time in updating DOE policy will require guidelines for applying the existing policy to distributed systems. Some possible interim approaches are identified and discussed. 2 refs.

  7. Computer Vision Assisted Virtual Reality Calibration

    NASA Technical Reports Server (NTRS)

    Kim, W.

    1999-01-01

    A computer vision assisted semi-automatic virtual reality (VR) calibration technology has been developed that can accurately match a virtual environment of graphically simulated three-dimensional (3-D) models to the video images of the real task environment.

  8. Roads Centre-Axis Extraction in Airborne SAR Images: AN Approach Based on Active Contour Model with the Use of Semi-Automatic Seeding

    NASA Astrophysics Data System (ADS)

    Lotte, R. G.; Sant'Anna, S. J. S.; Almeida, C. M.

    2013-05-01

    Research works dealing with computational methods for road extraction have increased considerably in the last two decades. This procedure is usually performed on optical or microwave (radar) sensor imagery. Radar images offer advantages compared to optical ones, for they allow the acquisition of scenes regardless of atmospheric and illumination conditions, besides the possibility of surveying regions where the terrain is hidden by the vegetation canopy, among others. Cartographic mapping based on these images is often accomplished manually, requiring considerable time and effort from the human interpreter. Maps for detecting new roads or updating the existing road network are among the most important cartographic products to date. There are currently many studies involving the extraction of roads by means of automatic or semi-automatic approaches. Each of them presents different solutions for different problems, leaving this task an open scientific issue. One of the preliminary steps for road extraction can be the seeding of points belonging to roads, which can be done using different methods with diverse levels of automation. The identified seed points are interpolated to form the initial road network, and are hence used as an input for the extraction method proper. The present work introduces an innovative hybrid method for the extraction of road centre-axes in a synthetic aperture radar (SAR) airborne image. Initially, candidate points are seeded fully automatically using Self-Organizing Maps (SOM), followed by a pruning process based on specific metrics. The centre-axes are then detected by an open-curve active contour model (snakes). The obtained results were evaluated for quality with respect to completeness, correctness and redundancy.
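
    The snake stage can be sketched with scikit-image's active contour; this open-curve version with pinned endpoints is a stand-in for the paper's own model, and the seed points are assumed to come from the SOM stage:

        import numpy as np
        from skimage.filters import gaussian
        from skimage.segmentation import active_contour

        def refine_road_axis(sar_img, seed_points, alpha=0.01, beta=0.1):
            """Evolve interpolated seed points towards the road centre-axis of a
            smoothed SAR amplitude image; alpha/beta control elasticity/rigidity."""
            smoothed = gaussian(sar_img.astype(float), sigma=2)
            snake = active_contour(smoothed,
                                   np.asarray(seed_points, dtype=float),
                                   alpha=alpha, beta=beta,
                                   boundary_condition='fixed')  # open curve, fixed ends
            return snake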

  9. ALMA correlator computer systems

    NASA Astrophysics Data System (ADS)

    Pisano, Jim; Amestica, Rodrigo; Perez, Jesus

    2004-09-01

    We present a design for the computer systems which control, configure, and monitor the Atacama Large Millimeter Array (ALMA) correlator and process its output. Two distinct computer systems implement this functionality: a rack-mounted PC controls and monitors the correlator, and a cluster of 17 PCs process the correlator output into raw spectral results. The correlator computer systems interface to other ALMA computers via gigabit Ethernet networks utilizing CORBA and raw socket connections. ALMA Common Software provides the software infrastructure for this distributed computer environment. The control computer interfaces to the correlator via multiple CAN busses and the data processing computer cluster interfaces to the correlator via sixteen dedicated high speed data ports. An independent array-wide hardware timing bus connects to the computer systems and the correlator hardware ensuring synchronous behavior and imposing hard deadlines on the control and data processor computers. An aggregate correlator output of 1 gigabyte per second with 16 millisecond periods and computational data rates of approximately 1 billion floating point operations per second define other hard deadlines for the data processing computer cluster.

  10. Computer controlled antenna system

    NASA Technical Reports Server (NTRS)

    Raumann, N. A.

    1972-01-01

    The application of small computers using digital techniques for operating the servo and control system of large antennas is discussed. The advantages of the system are described. The techniques were evaluated with a forty-foot antenna and the Sigma V computer. Programs have been completed which drive the antenna directly without the need for a servo amplifier, antenna position programmer or a scan generator.

  11. Semi-automatic mapping of fault rocks on a Digital Outcrop Model, Gole Larghe Fault Zone (Southern Alps, Italy)

    NASA Astrophysics Data System (ADS)

    Vho, Alice; Bistacchi, Andrea

    2015-04-01

    A quantitative analysis of fault-rock distribution is of paramount importance for studies of fault zone architecture, fault and earthquake mechanics, and fluid circulation along faults at depth. Here we present a semi-automatic workflow for fault-rock mapping on a Digital Outcrop Model (DOM). This workflow has been developed on a real case study: the strike-slip Gole Larghe Fault Zone (GLFZ), a fault zone exhumed from ca. 10 km depth, hosted in granitoid rocks of the Adamello batholith (Italian Southern Alps). Individual seismogenic slip surfaces generally show green cataclasites (cemented by the precipitation of epidote and K-feldspar from hydrothermal fluids) and more or less well preserved pseudotachylytes (black when well preserved, greenish to white when altered). First, a digital model of the outcrop is reconstructed with photogrammetric techniques, using a large number of high-resolution digital photographs processed with the VisualSFM software. By using high-resolution photographs the DOM can reach a much higher resolution than with LIDAR surveys, up to 0.2 mm/pixel. Then, image processing is performed to map the fault-rock distribution with the ImageJ-Fiji package. Green cataclasites and epidote/K-feldspar veins can be separated from the host rock (tonalite) quite easily using spectral analysis. In particular, band ratio and principal component analysis have been tested successfully. The mapping of black pseudotachylyte veins is trickier because the differences between the pseudotachylyte and biotite spectral signatures are not appreciable. For this reason we have tested different morphological processing tools aimed at identifying (and subtracting) the tiny biotite grains. We propose a solution based on binary images involving a combination of size and circularity thresholds. Comparing the results with manually segmented images, we noticed that major problems occur only when pseudotachylyte veins are very thin and discontinuous. After
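
    The spectral part of the workflow (band ratios plus a principal-component transform) is straightforward to sketch in NumPy; the ratio choice and threshold below are illustrative, not the values used on the GLFZ imagery:

        import numpy as np

        def spectral_fault_rock_map(rgb):
            """rgb: HxWx3 float array from the orthorectified outcrop imagery."""
            r = rgb[..., 0] + 1e-6
            g = rgb[..., 1]
            green_ratio = g / r                 # epidote-rich cataclasite is greenish
            X = rgb.reshape(-1, 3).astype(float)
            X -= X.mean(axis=0)
            _, _, Vt = np.linalg.svd(X, full_matrices=False)
            pcs = (X @ Vt.T).reshape(rgb.shape)  # principal components, PC1 first
            cataclasite = green_ratio > 1.15     # threshold illustrative
            return cataclasite, pcs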

  12. Semi-automatic registration of 3D orthodontics models from photographs

    NASA Astrophysics Data System (ADS)

    Destrez, Raphaël; Treuillet, Sylvie; Lucas, Yves; Albouy-Kissi, Benjamin

    2013-03-01

    In orthodontics, a common practice used to diagnose and plan treatment is the dental cast. After digitization by a CT scan or a laser scanner, the obtained 3D surface models can feed orthodontic numerical tools for computer-aided diagnosis and treatment planning. One of the critical pre-processing steps is the 3D registration of the dental arches to obtain the occlusion of these numerical models. For this task, we propose a vision-based method to automatically compute the registration from photos of the patient's mouth. From a set of singular points matched between two photos and the dental 3D models, the rigid transformation that brings the mandible into contact with the maxilla may be computed by minimizing the reprojection errors. In a previous study, we established the feasibility of this visual registration approach with a manual selection of singular points. This paper addresses the issue of automatic point detection. Based on a priori knowledge, histogram thresholding and edge detection are used to extract specific points in the 2D images. Concurrently, curvature information is used to detect the corresponding 3D points. To improve the quality of the final registration, we also introduce a combined optimization of the projection matrix with the 2D/3D point positions. These new developments are evaluated on real data by considering the reprojection errors and the deviation angles after registration with respect to a manual reference occlusion realized by a specialist.

  13. Semi-automatic tool for segmentation and volumetric analysis of medical images.

    PubMed

    Heinonen, T; Dastidar, P; Kauppinen, P; Malmivuo, J; Eskola, H

    1998-05-01

    Segmentation software developed for medical image processing and running on Windows is described. The software applies basic image processing techniques through a graphical user interface. For particular applications, such as brain lesion segmentation, the software enables the combination of different segmentation techniques to improve its efficiency. The program has been applied to magnetic resonance imaging, computed tomography and optical images of cryosections. The software can be utilised in numerous applications, including pre-processing for three-dimensional presentations, volumetric analysis and construction of volume conductor models. PMID:9747567

  14. A Semi-Automatic Alignment Method for Math Educational Standards Using the MP (Materialization Pattern) Model

    ERIC Educational Resources Information Center

    Choi, Namyoun

    2010-01-01

    Educational standards alignment, which matches similar or equivalent concepts of educational standards, is a necessary task for educational resource discovery and retrieval. Automated or semi-automated alignment systems for educational standards have been recently available. However, existing systems frequently result in inconsistency in…

  15. Semi-automatic extraction of sectional view from point clouds - The case of Ottmarsheim's abbey-church

    NASA Astrophysics Data System (ADS)

    Landes, T.; Bidino, S.; Guild, R.

    2014-06-01

    Today, elevations or sectional views of buildings are often produced from terrestrial laser scanning. However, due to the amount of data to process, and because customers usually require 2D maps, the 3D point cloud is often degraded into 2D slices. In a sectional view, not only the portions of the object intersected by the cutting plane but also the edges and contours of other parts of the object visible behind the cutting plane are represented. To avoid tedious manual drawing, the aim of this work is to propose a semi-automatic approach for creating sectional views by point cloud processing. The extraction of sectional views requires, as a first step, the segmentation of the point cloud into planar and non-planar entities. Since arches, vaults and columns are found in cultural heritage buildings, the position and the direction of the sectional view must be taken into account before contour extraction. Indeed, the edges of surfaces of revolution depend on the chosen view. The developed extraction approach is detailed based on point clouds acquired inside and outside churches. The resulting sectional view has been evaluated qualitatively and quantitatively by comparing it with a reference sectional view made by hand. A mean deviation of 3 cm between both sections proves that the proposed approach is promising. Regarding the processing time, despite a few manual corrections, it has saved 40% of the time required for manual drawing.

  16. Semi-automatic segmentation of vertebral bodies in volumetric MR images using a statistical shape+pose model

    NASA Astrophysics Data System (ADS)

    Suzani, Amin; Rasoulian, Abtin; Fels, Sidney; Rohling, Robert N.; Abolmaesumi, Purang

    2014-03-01

    Segmentation of vertebral structures in magnetic resonance (MR) images is challenging because of poor contrast between bone surfaces and surrounding soft tissue. This paper describes a semi-automatic method for segmenting vertebral bodies in multi-slice MR images. In order to achieve a fast and reliable segmentation, the method takes advantage of the correlation between shape and pose of different vertebrae in the same patient by using a statistical multi-vertebrae anatomical shape+pose model. Given a set of MR images of the spine, we initially reduce the intensity inhomogeneity in the images using an intensity-correction algorithm. A 3D anisotropic diffusion filter then smooths the images. Afterwards, we extract edges from a relatively small region of the pre-processed image with simple user interaction. Subsequently, an iterative Expectation Maximization technique is used to register the statistical multi-vertebrae anatomical model to the extracted edge points, yielding the segmentation of the lumbar vertebral bodies. We evaluate our method in terms of speed and accuracy by applying it to volumetric MR images of the spine acquired from nine patients. Quantitative and visual results demonstrate that the method is promising for segmentation of vertebral bodies in volumetric MR images.
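
    The anisotropic diffusion step can be illustrated with a classic Perona-Malik scheme (shown in 2D for brevity; the paper applies a 3D filter, and the parameters here are generic defaults rather than the authors' values):

        import numpy as np

        def anisotropic_diffusion(img, n_iter=20, kappa=30.0, lam=0.2):
            """Edge-preserving smoothing: homogeneous tissue is averaged out while
            strong bone/soft-tissue boundaries are kept for edge extraction."""
            u = img.astype(float).copy()
            for _ in range(n_iter):
                dn = np.roll(u, -1, axis=0) - u
                ds = np.roll(u, 1, axis=0) - u
                de = np.roll(u, -1, axis=1) - u
                dw = np.roll(u, 1, axis=1) - u
                # conduction coefficient g = exp(-(|grad|/kappa)^2) per direction
                u += lam * sum(d * np.exp(-(d / kappa) ** 2) for d in (dn, ds, de, dw))
            return u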

  17. Towards Semi-Automatic Artifact Rejection for the Improvement of Alzheimer’s Disease Screening from EEG Signals

    PubMed Central

    Solé-Casals, Jordi; Vialatte, François-Benoît

    2015-01-01

    A large number of studies have analyzed measurable changes that Alzheimer’s disease causes on electroencephalography (EEG). Despite being easily reproducible, those markers have limited sensitivity, which reduces the interest of EEG as a screening tool for this pathology. This is in large part due to the poor signal-to-noise ratio of EEG signals: EEG recordings are usually corrupted by spurious extra-cerebral artifacts, which are responsible for a substantial degradation of signal quality. We investigate the possibility of automatically cleaning a database of EEG recordings taken from patients suffering from Alzheimer’s disease and healthy age-matched controls. We present an investigation of commonly used markers of EEG artifacts: kurtosis, sample entropy, zero-crossing rate and fractal dimension. We investigate the reliability of the markers by comparison with human labeling of sources. Our results show significant differences for the sample entropy marker. We present a strategy for semi-automatic cleaning based on blind source separation, which may improve the specificity of Alzheimer screening using EEG signals. PMID:26213933
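
    Two of the four markers are one-liners; a hedged sketch for a single source or channel (sample entropy and fractal dimension are omitted for brevity, and no claim is made about the study's exact estimators):

        import numpy as np
        from scipy.stats import kurtosis

        def artifact_markers(x):
            """x: 1D array holding one EEG source; returns simple artifact markers."""
            x = np.asarray(x, dtype=float)
            zcr = np.mean(np.abs(np.diff(np.signbit(x).astype(int))))  # zero-crossing rate
            return {"kurtosis": kurtosis(x), "zero_crossing_rate": zcr}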

  18. From Laser Scanning to Finite Element Analysis of Complex Buildings by Using a Semi-Automatic Procedure

    PubMed Central

    Castellazzi, Giovanni; D’Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro

    2015-01-01

    In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects into three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys of buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected to the generation of finite element models of these complex structures by constructing a finely discretized geometry in a reduced amount of time, ready to be used for structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made of voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation. PMID:26225978
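
    The voxelization idea at the heart of the procedure can be sketched in a few lines; the voxel size here is an arbitrary assumption:

        import numpy as np

        def voxelize(points, voxel=0.25):
            """points: (N, 3) array of laser-scan coordinates in metres. Occupied
            voxels of the returned grid become solid elements of the FE model."""
            pts = np.asarray(points, dtype=float)
            idx = np.floor((pts - pts.min(axis=0)) / voxel).astype(int)
            grid = np.zeros(idx.max(axis=0) + 1, dtype=bool)
            grid[tuple(idx.T)] = True
            return grid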

  19. Semi-automatic ground truth generation using unsupervised clustering and limited manual labeling: Application to handwritten character recognition

    PubMed Central

    Vajda, Szilárd; Rangoni, Yves; Cecotti, Hubert

    2015-01-01

    For training supervised classifiers to recognize different patterns, large data collections with accurate labels are necessary. In this paper, we propose a generic, semi-automatic labeling technique for large handwritten character collections. In order to speed up the creation of a large-scale ground truth, the method combines unsupervised clustering and minimal expert knowledge. To exploit the potential discriminant complementarities across features, each character is projected into five different feature spaces. After clustering the images in each feature space, the human expert labels the cluster centers. Each data point inherits the label of its cluster’s center. A majority (or unanimity) vote decides the label of each character image. The amount of human involvement (labeling) is strictly controlled by the number of clusters produced by the chosen clustering approach. To test the efficiency of the proposed approach, we have compared and evaluated three state-of-the-art clustering methods (k-means, self-organizing maps, and growing neural gas) on the MNIST digit data set and a Lampung Indonesian character data set, respectively. Considering a k-nn classifier, we show that manually labeling only 1.3% (MNIST) and 3.2% (Lampung) of the training data provides the same range of performance as a completely labeled data set would. PMID:25870463
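
    The labeling scheme for one feature space can be sketched with scikit-learn's k-means (the paper uses five feature spaces plus a vote, and also evaluates SOM and growing neural gas; the expert_label callback is a hypothetical stand-in for the human step):

        import numpy as np
        from sklearn.cluster import KMeans

        def semi_automatic_labels(X, expert_label, n_clusters=200):
            """X: (n_samples, n_features) character images in one feature space.
            Only the n_clusters centre images are labeled by the human expert;
            every sample inherits the label of its cluster centre."""
            km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X)
            centre_ids = [int(np.argmin(np.linalg.norm(X - c, axis=1)))
                          for c in km.cluster_centers_]
            centre_labels = expert_label(centre_ids)   # human-in-the-loop step
            return np.asarray(centre_labels)[km.labels_]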

  20. Semi-Automatic Modelling of Building Façades with Shape Grammars Using Historic Building Information Modelling

    NASA Astrophysics Data System (ADS)

    Dore, C.; Murphy, M.

    2013-02-01

    This paper outlines a new approach for generating digital heritage models from laser scan or photogrammetric data using Historic Building Information Modelling (HBIM). HBIM is a plug-in for Building Information Modelling (BIM) software that uses parametric library objects and procedural modelling techniques to automate the modelling stage. The HBIM process involves a reverse engineering solution whereby parametric interactive objects representing architectural elements are mapped onto laser scan or photogrammetric survey data. A library of parametric architectural objects has been designed from historic manuscripts and architectural pattern books. These parametric objects were built using an embedded programming language within the ArchiCAD BIM software called Geometric Description Language (GDL). Procedural modelling techniques have been implemented with the same language to create a parametric building façade which automatically combines library objects based on architectural rules and proportions. Different configurations of the façade are controlled by user parameter adjustment. The automatically positioned elements of the façade can be subsequently refined using graphical editing while overlaying the model with orthographic imagery. Along with this semi-automatic method for generating façade models, manual plotting of library objects can also be used to generate a BIM model from survey data. After the 3D model has been completed conservation documents such as plans, sections, elevations and 3D views can be automatically generated for conservation projects.

  2. Semi-automatic identification of punching areas for tissue microarray building: the tubular breast cancer pilot study

    PubMed Central

    2010-01-01

    Background Tissue MicroArray technology aims to perform immunohistochemical staining on hundreds of different tissue samples simultaneously. It allows faster analysis, considerably reducing staining costs. A time-consuming phase of the methodology is the selection of tissue areas within paraffin blocks: no utilities have been developed for the identification of the areas to be punched from the donor block and assembled in the recipient block. Results The presented work supports, in the specific case of a primary subtype of breast cancer (tubular breast cancer), the semi-automatic discrimination and localization of normal and pathological regions within the tissues. The diagnosis is performed by analysing specific morphological features of the sample, such as the absence of a double layer of cells around the lumen and the decay of a regular glands-and-lobules structure. These features are analysed using an algorithm which extracts morphological parameters from images and compares them to experimentally validated threshold values. Results are satisfactory, since in most cases the automatic diagnosis matches the response of the pathologists. In particular, on a total of 1296 sub-images showing normal and pathological areas of breast specimens, algorithm accuracy, sensitivity and specificity are 89%, 84% and 94%, respectively. Conclusions The proposed work is a first attempt to demonstrate that automation in the Tissue MicroArray field is feasible and can represent an important tool for scientists to cope with this high-throughput technique. PMID:21087464
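
    The three reported figures follow directly from confusion-matrix counts over the 1296 sub-images; a trivial helper for the record:

        def diagnostic_metrics(tp, tn, fp, fn):
            """Accuracy, sensitivity and specificity from confusion-matrix counts."""
            return {"accuracy": (tp + tn) / (tp + tn + fp + fn),
                    "sensitivity": tp / (tp + fn),
                    "specificity": tn / (tn + fp)}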

  3. Comparison of Semi Automatic DTM from Image Matching with DTM from LIDAR

    NASA Astrophysics Data System (ADS)

    Rahmayudi, Aji; Rizaldy, Aldino

    2016-06-01

    Nowadays DTM LIDAR is used extensively for generating contour lines in topographic maps. This method is far superior to traditional stereomodel compilation from aerial images, which consumes substantial human-operator resources and is very time consuming. With the improvement of computer vision and digital image processing, it is possible to generate a point cloud DSM from aerial images using image matching algorithms. It is also possible to classify the point cloud DSM to a DTM using the same technique as LIDAR classification, producing a DTM which is comparable to DTM LIDAR. This research studies the accuracy difference between both DTMs and the resulting DTM under several different conditions, including urban and forest areas as well as flat and mountainous terrain, together with the time required for mass production of topographic maps. From the statistical data, both methods are able to produce topographic maps at 1:5,000 scale.

  4. Attitude computation system

    NASA Technical Reports Server (NTRS)

    Werking, R. D.

    1973-01-01

    An attitude computation facility for the control of unmanned satellite missions is reported. The system's major components include: the ability to transfer the attitude data from the control center to the attitude computer at a rate of 2400 bps; an attitude computation center which houses communications, closed circuit TV, graphics devices and a data evaluation area; and the use of interactive graphics devices to schedule jobs and to control program flow.

  5. A Semi-Automatic Coronary Artery Segmentation Framework Using Mechanical Simulation.

    PubMed

    Cai, Ken; Yang, Rongqian; Li, Lihua; Ou, Shanxing; Chen, Yuke; Dou, Jianhong

    2015-10-01

    CVD (cardiovascular disease) is one of the biggest threats to human beings nowadays. An early and quantitative diagnosis of CVD is important in extending lifespan and improving people's quality of life. Early detection of coronary artery stenosis can help prevent CVD. To diagnose the degree of stenosis, the inner diameter of the coronary artery needs to be measured. To achieve such measurement, the coronary artery is segmented using a method based on morphology and the continuity between computed tomography image slices. A centerline extraction method based on mechanical simulation is proposed. This method derives a basic framework of the coronary artery by treating the pixels of the artery image as mass points. Such mass points carry tensile forces, with which the outer pixels can be drawn towards the center. Subsequently, the centerline of the coronary artery is outlined using a local line-fitting method. Finally, the nearest-point method is adopted to measure the inner diameter. Experimental results showed that the proposed methods can precisely extract the centerline of the coronary artery and accurately measure its inner diameter, thereby providing a basis for quantitative diagnosis of coronary artery stenosis. PMID:26310950

  6. A Semi-Automatic Approach to Construct Vietnamese Ontology from Online Text

    ERIC Educational Resources Information Center

    Nguyen, Bao-An; Yang, Don-Lin

    2012-01-01

    An ontology is an effective formal representation of knowledge used commonly in artificial intelligence, semantic web, software engineering, and information retrieval. In open and distance learning, ontologies are used as knowledge bases for e-learning supplements, educational recommenders, and question answering systems that support students with…

  7. Semi-automatic registration of multi-source satellite imagery with varying geometric resolutions

    NASA Astrophysics Data System (ADS)

    Al-Ruzouq, Rami

    Image registration concerns the problem of how to combine data and information from multiple sensors in order to achieve improved accuracy and better inferences about the environment than could be attained through the use of a single sensor. Registration of imagery from multiple sources is essential for a variety of applications in remote sensing, medical diagnosis, computer vision, and pattern recognition. In general, an image registration methodology must deal with four issues. First, a decision has to be made regarding the choice of primitives for the registration procedure. The second issue concerns establishing the registration transformation function that mathematically relates the images to be registered. Then, a similarity measure should be devised to ensure the correspondence of conjugate primitives. Finally, a matching strategy has to be designed and implemented as a controlling framework that utilizes the primitives, the similarity measure, and the transformation function to solve the registration problem. The Modified Iterated Hough Transform (MIHT) is used as the matching strategy for automatically deriving an estimate of the parameters involved in the transformation function as well as the correspondence between conjugate primitives. The MIHT procedure follows an optimal sequence for parameter estimation. This sequence takes into account the contribution of linear features with different orientations at various locations within the imagery towards the estimation of the transformation parameters in question. Accurate co-registration of multi-sensor datasets captured at different times is a prerequisite step for a reliable change detection procedure. Once the registration problem has been solved, the suggested methodology proceeds by detecting changes between the registered images. Derived edges from the registered images are used as the basis for change detection. Edges are utilized because they are invariant regardless of possible radiometric differences.

  8. A conceptual study of automatic and semi-automatic quality assurance techniques for ground image processing

    NASA Technical Reports Server (NTRS)

    1983-01-01

    This report summarizes the results of a study conducted by Engineering and Economics Research (EER), Inc. under NASA Contract Number NAS5-27513. The study involved the development of preliminary concepts for automatic and semiautomatic quality assurance (QA) techniques for ground image processing. A distinction is made between quality assessment and the more comprehensive quality assurance which includes decision making and system feedback control in response to quality assessment.

  9. Semi-automatic building extraction in informal settlements from high-resolution satellite imagery

    NASA Astrophysics Data System (ADS)

    Mayunga, Selassie David

    The extraction of man-made features from digital remotely sensed images is considered an important step underpinning management of human settlements in any country. Man-made features, and buildings in particular, are required for a variety of applications such as urban planning, creation of geographical information system (GIS) databases and urban city models. Traditional man-made feature extraction methods are very expensive in terms of equipment, labour intensive, need well-trained personnel and cannot cope with changing environments, particularly in dense urban settlement areas. This research presents an approach for extracting buildings in dense informal settlement areas using high-resolution satellite imagery. The proposed system uses a novel strategy of extracting a building by measuring a single point at the approximate centre of the building. The fine measurement of the building outline is then effected using a modified snake model. The original snake model on which this framework is based incorporates an external constraint energy term tailored to preserving the convergence properties of the snake model; its application to unstructured objects would negatively affect their actual shapes. The external constraint energy term was therefore removed from the original snake model formulation, giving the model the ability to cope with the high variability of building shapes in informal settlement areas. The proposed building extraction system was tested on two areas presenting different situations. The first area was Tungi in Dar Es Salaam, Tanzania, where three sites were tested. This area is characterized by informal settlements, which are illegally established within the city boundaries. The second area was Oromocto in New Brunswick, Canada, where two sites were tested. The Oromocto area is mostly flat and the buildings are constructed using similar materials. Qualitative and quantitative measures were employed to evaluate the accuracy of the results as well as the performance

  10. Terrain-driven unstructured mesh development through semi-automatic vertical feature extraction

    NASA Astrophysics Data System (ADS)

    Bilskie, Matthew V.; Coggin, David; Hagen, Scott C.; Medeiros, Stephen C.

    2015-12-01

    validation techniques are necessary for state-of-the-art flood inundation models. In addition, the semi-automated, unstructured mesh generation process presented herein increases the overall accuracy of simulated storm surge across the floodplain without reliance on hand digitization or sacrificing computational cost.

  11. YODA++: A proposal for a semi-automatic space mission control

    NASA Astrophysics Data System (ADS)

    Casolino, M.; de Pascale, M. P.; Nagni, M.; Picozza, P.

    YODA++ is a proposal for a semi-automated data handling and analysis system for the PAMELA space experiment. The core routines have been developed to process a stream of raw data downlinked from the Resurs DK1 satellite (housing PAMELA) to the ground station in Moscow. Raw data consist of scientific data complemented by housekeeping information. Housekeeping information will be analyzed within a short time from download (1 h) in order to monitor the status of the experiment and to support mission acquisition planning. A prototype for data visualization will run on an Apache Tomcat web application server, providing an off-line analysis tool accessed through a browser and part of the code for system maintenance. Data retrieval development is in the production phase, while a GUI for human-friendly monitoring is at a preliminary stage, as is a JavaServerPages/JavaServerFaces (JSP/JSF) web application facility. On a longer timescale (1-3 h from download) scientific data are analyzed. The data storage core will be a mix of CERN's ROOT file structure and MySQL as a relational database. YODA++ is currently being used in the on-ground integration and testing of PAMELA.

  12. Characterising spectral, spatial and morphometric properties of landslides for semi-automatic detection using object-oriented methods

    NASA Astrophysics Data System (ADS)

    Martha, Tapas R.; Kerle, Norman; Jetten, Victor; van Westen, Cees J.; Kumar, K. Vinod

    2010-03-01

    Recognition and classification of landslides is a critical requirement in pre- and post-disaster hazard analysis. This has primarily been done through field mapping or manual image interpretation. However, image interpretation can also be done semi-automatically by creating a routine in object-based classification using the spectral, spatial and morphometric properties of landslides, and by incorporating expert knowledge. This is a difficult task, since a fresh landslide has spectral properties that are nearly identical to those of other natural objects, such as river sand and rocky outcrops, and landslides also do not have unique shapes. This paper investigates the use of a combination of spectral, shape and contextual information to detect landslides. The algorithm is tested with 5.8 m multispectral data from Resourcesat-1 and a 10 m digital terrain model generated from 2.5 m Cartosat-1 imagery for an area in the rugged Himalayas in India. It uses objects derived from the segmentation of a multispectral image as classifying units for object-oriented analysis. Spectral information together with shape and morphometric characteristics was used initially to separate landslides from false positives. Objects recognised as landslides were subsequently classified based on material type and movement as debris slides, debris flows and rock slides, using adjacency and morphometric criteria. They were further classified for their failure mechanism using terrain curvature. The procedure was developed for a training catchment and then applied without further modification to an independent catchment. A total of five landslide types were detected by this method, with 76.4% recognition and 69.1% classification accuracies. This method detects landslides relatively quickly, and hence has the potential to aid risk analysis, disaster management and decision-making processes in the aftermath of an earthquake or an extreme rainfall event.

  13. Semi-automatic algorithm for construction of the left ventricular area variation curve over a complete cardiac cycle

    PubMed Central

    2010-01-01

    Background Two-dimensional echocardiography (2D-echo) allows the evaluation of cardiac structures and their movements. A wide range of clinical diagnoses are based on the performance of the left ventricle. The evaluation of myocardial function is typically performed by manual segmentation of the ventricular cavity in a series of dynamic images. This process is laborious and operator dependent. The automatic segmentation of the left ventricle in 4-chamber long-axis images during diastole is troublesome, because of the opening of the mitral valve. Methods This work presents a method for segmentation of the left ventricle in dynamic 2D-echo 4-chamber long-axis images over the complete cardiac cycle. The proposed algorithm is based on classic image processing techniques, including time-averaging and wavelet-based denoising, edge enhancement filtering, morphological operations, homotopy modification, and watershed segmentation. The proposed method is semi-automatic, requiring a single user intervention for identification of the position of the mitral valve in the first temporal frame of the video sequence. Image segmentation is performed on a set of dynamic 2D-echo images collected from an examination covering two consecutive cardiac cycles. Results The proposed method is demonstrated and evaluated on twelve healthy volunteers. The results are quantitatively evaluated using four different metrics, in a comparison with contours manually segmented by a specialist, and with four alternative methods from the literature. The method's intra- and inter-operator variabilities are also evaluated. Conclusions The proposed method allows the automatic construction of the area variation curve of the left ventricle corresponding to a complete cardiac cycle. This may potentially be used for the identification of several clinical parameters, including the area variation fraction. This parameter could potentially be used for evaluating the global systolic function of the left ventricle
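
    The pipeline named above (time-averaging, wavelet denoising, edge enhancement, marker-based homotopy modification, watershed) can be sketched in a few lines of Python with scikit-image. This is a minimal illustration under stated assumptions, not the authors' implementation: the frame window, the border-as-background marker, and the user-supplied mitral valve click are all illustrative.

    ```python
    # Hedged sketch of a marker-controlled watershed pipeline for one frame of
    # a 2D-echo sequence. Assumes scikit-image; parameters are illustrative.
    import numpy as np
    from skimage.restoration import denoise_wavelet
    from skimage.filters import sobel
    from skimage.segmentation import watershed

    def segment_frame(frames, t, mitral_xy):
        """frames: T x H x W array; mitral_xy: user click (x, y) inside the cavity."""
        lo, hi = max(t - 1, 0), min(t + 2, len(frames))
        img = frames[lo:hi].mean(axis=0)                # time-averaging
        img = denoise_wavelet(img, rescale_sigma=True)  # wavelet-based denoising

        relief = sobel(img)                         # edge-enhanced relief for watershed

        markers = np.zeros(img.shape, dtype=int)    # homotopy modification via markers
        x, y = mitral_xy
        markers[y, x] = 1                           # seed inside the ventricular cavity
        markers[0, :] = 2                           # image border taken as background
        return watershed(relief, markers) == 1      # binary left-ventricle mask
    ```

    Applied frame by frame, summing the mask pixels (times the pixel area) would give the area variation curve over the cardiac cycle.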

  14. Computational Systems Biology

    SciTech Connect

    McDermott, Jason E.; Samudrala, Ram; Bumgarner, Roger E.; Montogomery, Kristina; Ireton, Renee

    2009-05-01

    Computational systems biology is the term that we use to describe computational methods to identify, infer, model, and store relationships between the molecules, pathways, and cells (“systems”) involved in a living organism. Based on this definition, the field of computational systems biology has been in existence for some time. However, the recent confluence of high-throughput methodology for biological data gathering, genome-scale sequencing and computational processing power has driven a reinvention and expansion of this field. The expansions include not only modeling of small metabolic (Ishii, 2004; Ekins, 2006; Lafaye, 2005) and signaling (Stevenson-Paulik, 2006; Lafaye, 2005) systems but also modeling of the relationships between biological components in very large systems, including whole cells and organisms (Ideker, 2001; Pe'er, 2001; Pilpel, 2001; Ideker, 2002; Kelley, 2003; Shannon, 2003; Ideker, 2004; Schadt, 2003; Schadt, 2006; McDermott, 2002; McDermott, 2005). Generally these models provide a general overview of one or more aspects of these systems and leave the determination of details to experimentalists focused on smaller subsystems. The promise of such approaches is that they will elucidate patterns, relationships and general features that are not evident from examining specific components or subsystems. These predictions are either interesting in and of themselves (for example, the identification of an evolutionary pattern), or are interesting and valuable to researchers working on a particular problem (for example, highlighting a previously unknown functional pathway). Two events have occurred to bring the field of computational systems biology to the forefront. One is the advent of high-throughput methods that have generated large amounts of information about particular systems in the form of genetic studies, gene expression analyses (both protein and

  15. DRYNUT COMPUTER EXPERT SYSTEMS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The objective of Drynut is to minimize the climatic and environmental impacts and economic risks in dryland peanut production while maximizing the economic return and improving peanut quality. Drynut uses new concepts, tools and management systems with fingertip computer-based technology to modify the i...

  16. Semi-automatic measures of activity in selected south polar regions of Mars using morphological image analysis

    NASA Astrophysics Data System (ADS)

    Aye, Klaus-Michael; Portyankina, Ganna; Pommerol, Antoine; Thomas, Nicolas

    We present results of these semi-automatically determined seasonal fan count evolutions for the Inca City, Ithaca and Manhattan ROIs, compare these evolutionary patterns with each other, and compare them with surface reflectance evolutions from both HiRISE and CRISM for the same locations. References: Aye, K.-M., et al. (2010), LPSC 2010, 2707; Hansen, C., et al. (2010), Icarus, 205(1), 283-295; Kieffer, H.H. (2007), JGR, 112; Portyankina, G., et al. (2010), Icarus, 205(1), 311-320; Thomas, N., et al. (2009), Vol. 4, EPSC2009-478.

  17. Semi-automatic methodologies for landslide features extraction: new opportunities but also challenges from high resolution topography

    NASA Astrophysics Data System (ADS)

    Tarolli, P.; Sofia, G.; Dalla Fontana, G.

    2009-12-01

    In recent years new remotely sensed technologies, such as airborne and terrestrial laser scanners, have improved the detail and quality of topographic data, with notable advantages over traditional survey techniques (Tarolli et al., 2009). A new generation of high resolution (≤3 m) Digital Terrain Models (DTMs) is now available for different areas and widely used by researchers, offering new opportunities for the scientific community. These data call for the development of a new generation of methodologies for objective extraction of geomorphic features, such as channel heads, channel networks and landslide scars. A high resolution DTM, for example, is able to detect in detail the divergence/convergence areas related to unchannelized/channelized processes, compared with a coarse DTM. In the last few years different studies have used landform curvature as a useful measure for the interpretation of dominant landform processes (Tarolli and Dalla Fontana, 2009). Curvature has been used to analyze landslide morphology and distribution, and to objectively extract the channel network. In this work, we test the performance of some of these new methodologies for geomorphic feature extraction, in order to provide a semi-automatic method to recognize landslide scars in complex mountainous terrain. The analysis has been carried out using a very high resolution DTM (0.5 m), and different sizes of the moving window for the landform curvature calculation. Statistical dispersion measures (standard deviation, interquartile range, mean and median absolute deviation), and probability plots (quantile-quantile plot) were adopted to objectively define the thresholds of curvature for landslide feature extraction. The study was conducted on a study area located in the Eastern Italian Alps, where recent accurate field surveys by DGPS on landslide scars, and a high quality set of airborne laser scanner elevation data are available. The results indicate that curvature maps derived by
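
    As a rough illustration of how the dispersion statistics mentioned above can turn a curvature map into an objective threshold, the sketch below uses the Laplacian of a Gaussian-smoothed DTM as a curvature proxy and a quartile-based cut-off. The moving-window scale, the factor k, and the Laplacian proxy are assumptions for illustration, not the authors' exact procedure.

    ```python
    # Hedged sketch: curvature proxy plus quantile/MAD-based thresholding.
    import numpy as np
    from scipy import ndimage

    def curvature_threshold(dtm, cell=0.5, sigma_m=2.5, k=1.5):
        sigma_px = sigma_m / cell                     # window scale in pixels
        curv = ndimage.gaussian_laplace(dtm, sigma=sigma_px)  # curvature proxy

        q1, q3 = np.percentile(curv, [25, 75])        # dispersion measures
        iqr = q3 - q1
        mad = np.median(np.abs(curv - np.median(curv)))

        thr = q3 + k * iqr               # quantile-based threshold
                                         # (alternative: np.median(curv) + k * mad)
        scars = curv > thr               # candidate landslide-scar cells
        return curv, thr, scars
    ```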

  18. A semi-automatic method to determine electrode positions and labels from gel artifacts in EEG/fMRI-studies.

    PubMed

    de Munck, Jan C; van Houdt, Petra J; Verdaasdonk, Ruud M; Ossenblok, Pauly P W

    2012-01-01

    The analysis of simultaneous EEG and fMRI data is generally based on the extraction of regressors of interest from the EEG, which are correlated to the fMRI data in a general linear model setting. In more advanced approaches, the spatial information of EEG is also exploited by assuming underlying dipole models. In this study, we present a semi-automatic and efficient method to determine electrode positions from electrode gel artifacts, facilitating the integration of EEG and fMRI in future EEG/fMRI data models. In order to visualize all electrode artifacts simultaneously in a single view, a surface rendering of the structural MRI is made using a skin triangular mesh model as reference surface, which is expanded to a "pancake view". Then the electrodes are determined with a simple mouse click for each electrode. Using the geometry of the skin surface and its transformation to the pancake view, the 3D coordinates of the electrodes are reconstructed in the MRI coordinate frame. The electrode labels are attached to the electrode positions by fitting a template grid of the electrode cap in which the labels are known. The correspondence problem between template and sample electrodes is solved by minimizing a cost function over rotations, shifts and scalings of the template grid. The crucial step here is to use the solution of the so-called "Hungarian algorithm" as a cost function, which makes it possible to identify the electrode artifacts in arbitrary order. The template electrode grid has to be constructed only once for each cap configuration. In our implementation of this method, the whole procedure can be performed within 15 min, including import of MRI, surface reconstruction and transformation, electrode identification and fitting to template. The method is robust in the sense that an electrode template created for one subject can be used without identification errors for another subject for whom the same EEG cap was used. Furthermore, the method appears to be
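
    The crucial step above, using the optimal assignment cost as the objective, can be sketched with SciPy's implementation of the Hungarian algorithm. The coarse 2D rotation/scale grid below is a simplification of the rotations, shifts and scalings actually searched; the point sets and grid values are illustrative.

    ```python
    # Hedged sketch: Hungarian-algorithm cost minimised over a transform grid.
    import numpy as np
    from scipy.optimize import linear_sum_assignment
    from scipy.spatial.distance import cdist

    def assignment_cost(template, sample):
        """Cost of the best one-to-one matching between two 2D point sets."""
        cost = cdist(template, sample)                # pairwise distances
        rows, cols = linear_sum_assignment(cost)      # Hungarian algorithm
        return cost[rows, cols].sum(), cols

    def fit_labels(template, sample):
        best = (np.inf, None)
        for angle in np.linspace(0, 2 * np.pi, 72, endpoint=False):
            c, s = np.cos(angle), np.sin(angle)
            R = np.array([[c, -s], [s, c]])
            for scale in (0.9, 1.0, 1.1):
                total, cols = assignment_cost(scale * template @ R.T, sample)
                if total < best[0]:
                    best = (total, cols)           # cols[i]: sample matched to label i
        return best
    ```

    Because the assignment itself is optimal at every trial transform, the electrode artifacts can be identified in arbitrary order, which is exactly the property the method relies on.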

  19. Computer Vision Systems

    NASA Astrophysics Data System (ADS)

    Gunasekaran, Sundaram

    Food quality is of paramount consideration for all consumers, and its importance is perhaps second only to food safety. By some definitions, food safety is also incorporated into the broad categorization of food quality. Hence, the need for careful and accurate evaluation of food quality is at the forefront of research and development both in academia and in industry. Among the many available methods for food quality evaluation, computer vision has proven to be the most powerful, especially for nondestructively extracting and quantifying many features that have direct relevance to food quality assessment and control. Furthermore, computer vision systems serve to rapidly evaluate the most readily observable food quality attributes - the external characteristics such as color, shape, size, surface texture etc. In addition, it is now possible, using advanced computer vision technologies, to “see” inside a food product and/or package to examine important quality attributes ordinarily unavailable to human evaluators. With rapid advances in electronic hardware and other associated imaging technologies, the cost-effectiveness and speed of computer vision systems have greatly improved and many practical systems are already in place in the food industry.

  20. Computer Algebra System

    Energy Science and Technology Software Center (ESTSC)

    1992-05-04

    DOE-MACSYMA (Project MAC's SYmbolic MAnipulation system) is a large computer programming system written in LISP. With DOE-MACSYMA the user can differentiate, integrate, take limits, solve systems of linear or polynomial equations, factor polynomials, expand functions in Laurent or Taylor series, solve differential equations (using direct or transform methods), compute Poisson series, plot curves, and manipulate matrices and tensors. A language similar to ALGOL-60 permits users to write their own programs for transforming symbolic expressions. Franz Lisp OPUS 38 provides the environment for the Encore, Celerity, and DEC VAX11 UNIX, SUN(OPUS) versions under UNIX and the Alliant version under Concentrix. Kyoto Common Lisp (KCL) provides the environment for the SUN(KCL), Convex, and IBM PC under UNIX and Data General under AOS/VS.

  1. Computational Systems Chemical Biology

    PubMed Central

    Oprea, Tudor I.; May, Elebeoba E.; Leitão, Andrei; Tropsha, Alexander

    2013-01-01

    There is a critical need for improving the level of chemistry awareness in systems biology. The data and information related to modulation of genes and proteins by small molecules continue to accumulate at the same time as simulation tools in systems biology and whole-body physiologically-based pharmacokinetics (PBPK) continue to evolve. We called this emerging area at the interface between chemical biology and systems biology "systems chemical biology" (SCB) (Oprea et al., 2007). The overarching goal of computational SCB is to develop tools for integrated chemical-biological data acquisition, filtering and processing, by taking into account relevant information related to interactions between proteins and small molecules, possible metabolic transformations of small molecules, as well as associated information related to genes, networks, small molecules and, where applicable, mutants and variants of those proteins. There is as yet an unmet need to develop an integrated in silico pharmacology / systems biology continuum that embeds drug-target-clinical outcome (DTCO) triplets, a capability that is vital to the future of chemical biology, pharmacology and systems biology. Through the development of the SCB approach, scientists will be able to start addressing, in an integrated simulation environment, questions that make the best use of our ever-growing chemical and biological data repositories at the system-wide level. This chapter reviews some of the major research concepts and describes key components that constitute the emerging area of computational systems chemical biology. PMID:20838980

  2. Semi-automatic cone beam CT segmentation of in vivo pre-clinical subcutaneous tumours provides an efficient non-invasive alternative for tumour volume measurements

    PubMed Central

    Brodin, N P; Tang, J; Skalina, K; Quinn, TJ; Basu, I; Guha, C

    2015-01-01

    Objective: To evaluate the feasibility and accuracy of using cone beam CT (CBCT) scans obtained in radiation studies using the small-animal radiation research platform to perform semi-automatic tumour segmentation of pre-clinical tumour volumes. Methods: Volume measurements were evaluated for different anatomical tumour sites, the flank, thigh and dorsum of the hind foot, for a variety of tumour cell lines. The estimated tumour volumes from CBCT and manual calliper measurements using different volume equations were compared with the “gold standard”, measured by weighing the tumours following euthanasia and tumour resection. The correlation between tumour volumes estimated with the different methods, compared with the gold standard, was estimated by the Spearman's rank correlation coefficient, root-mean-square deviation and the coefficient of determination. Results: The semi-automatic CBCT volume segmentation performed favourably compared with manual calliper measures for flank tumours ≤2 cm3 and thigh tumours ≤1 cm3. For tumours >2 cm3 or foot tumours, the CBCT method was not able to accurately segment the tumour volumes and manual calliper measures were superior. Conclusion: We demonstrated that tumour volumes of flank and thigh tumours, obtained as a part of radiation studies using image-guided small-animal irradiators, can be estimated more efficiently and accurately using semi-automatic segmentation from CBCT scans. Advances in knowledge: This is the first study evaluating tumour volume assessment of pre-clinical subcutaneous tumours in different anatomical sites using on-board CBCT imaging. We also compared the accuracy of the CBCT method to manual calliper measures, using various volume calculation equations. PMID:25823502
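
    For reference, the three agreement metrics named above can be computed as in this short sketch; the input arrays stand in for CBCT or calliper volume estimates versus the resected gold standard and are otherwise arbitrary.

    ```python
    # Hedged sketch of the evaluation metrics: Spearman's rho, RMSD and R^2.
    import numpy as np
    from scipy.stats import spearmanr

    def agreement(est, gold):
        est, gold = np.asarray(est, float), np.asarray(gold, float)
        rho, p = spearmanr(est, gold)                 # rank correlation
        rmsd = np.sqrt(np.mean((est - gold) ** 2))    # root-mean-square deviation
        r2 = 1.0 - np.sum((gold - est) ** 2) / np.sum((gold - gold.mean()) ** 2)
        return rho, p, rmsd, r2                       # r2: coefficient of determination
    ```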

  3. Computer memory management system

    DOEpatents

    Kirk, III, Whitson John

    2002-01-01

    A computer memory management system utilizing a memory structure system of "intelligent" pointers in which information related to the use status of the memory structure is designed into the pointer. Through this pointer system, the present invention provides essentially automatic memory management (often referred to as garbage collection) by allowing relationships between objects to have definite memory management behavior through a coding protocol which describes when relationships should be maintained and when they should be broken. In one aspect, the present invention allows automatic breaking of strong links to facilitate object garbage collection, coupled with relationship adjectives which define deletion of associated objects. In another aspect, the present invention includes simple-to-use infinite undo/redo functionality, in that it has the capability, through a simple function call, to undo all of the changes made to a data model since the previous `valid state` was noted.

  4. Semi-automatic characterization of fractured rock masses using 3D point clouds: discontinuity orientation, spacing and SMR geomechanical classification

    NASA Astrophysics Data System (ADS)

    Riquelme, Adrian; Tomas, Roberto; Abellan, Antonio; Cano, Miguel; Jaboyedoff, Michel

    2015-04-01

    Investigation of fractured rock masses for different geological applications (e.g. fractured reservoir exploitation, rock slope instability, rock engineering) requires a deep geometric understanding of the discontinuity sets affecting rock exposures. Recent advances in 3D data acquisition using photogrammetric and/or LiDAR techniques currently allow a quick and accurate characterization of rock mass discontinuities. This contribution presents a methodology for: (a) use of 3D point clouds for the identification and analysis of planar surfaces outcropping in a rocky slope; (b) calculation of the spacing between different discontinuity sets; (c) semi-automatic calculation of the parameters that play a key role in the Slope Mass Rating geomechanical classification. As for part (a) (discontinuity orientation), our proposal identifies and defines the algebraic equations of the different discontinuity sets of the rock slope surface by applying an analysis based on a neighbouring-points coplanarity test. Additionally, the procedure finds principal orientations by Kernel Density Estimation and identifies clusters (Riquelme et al., 2014). As a result of this analysis, each point is classified with a discontinuity set and with an outcrop plane (cluster). Regarding part (b) (discontinuity spacing), our proposal utilises the previously classified point cloud to investigate how different outcropping planes are linked in space. Discontinuity spacing is calculated for each pair of linked clusters within the same discontinuity set, and the spacing values are then analysed to compute their statistics. Finally, as for part (c), the previous results are used to calculate parameters F1, F2 and F3 of the Slope Mass Rating geomechanical classification. This analysis is carried out for each discontinuity set using its respective orientation extracted in part (a). The open access tool SMRTool (Riquelme et al., 2014) is then used to calculate F1 to F3 correction
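
    A minimal sketch of the neighbouring-points coplanarity test of part (a): principal component analysis of each point's k-neighbourhood yields a normal vector and a planarity score, and a kernel density estimate over the normals of the most planar points then highlights the principal orientations. The k value, the planarity cut-off and the use of SciPy's gaussian_kde are assumptions; the published procedure (Riquelme et al., 2014) differs in detail.

    ```python
    # Hedged sketch: per-point normals and planarity via neighbourhood PCA.
    import numpy as np
    from scipy.spatial import cKDTree
    from scipy.stats import gaussian_kde

    def normals_and_planarity(xyz, k=30):
        tree = cKDTree(xyz)
        _, idx = tree.query(xyz, k=k)
        normals = np.empty_like(xyz)
        planarity = np.empty(len(xyz))
        for i, nb in enumerate(idx):
            pts = xyz[nb] - xyz[nb].mean(axis=0)
            w, v = np.linalg.eigh(pts.T @ pts)     # ascending eigenvalues
            normals[i] = v[:, 0]                   # normal: smallest-eigenvalue direction
            planarity[i] = 1.0 - w[0] / w.sum()    # ~1 for coplanar neighbourhoods
        return normals, planarity

    def orientation_density(normals, planarity, cut=0.98):
        flat = normals[planarity > cut]            # keep only near-coplanar points
        return gaussian_kde(flat.T)                # evaluate on a sphere grid for peaks
    ```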

  5. Semi-automatic mapping of fault rocks on a Digital Outcrop Model, Gole Larghe Fault Zone (Southern Alps, Italy)

    NASA Astrophysics Data System (ADS)

    Mittempergher, Silvia; Vho, Alice; Bistacchi, Andrea

    2016-04-01

    A quantitative analysis of fault-rock distribution in outcrops of exhumed fault zones is of fundamental importance for studies of fault zone architecture, fault and earthquake mechanics, and fluid circulation. We present a semi-automatic workflow for fault-rock mapping on a Digital Outcrop Model (DOM), developed on the Gole Larghe Fault Zone (GLFZ), a well exposed strike-slip fault in the Adamello batholith (Italian Southern Alps). The GLFZ has been exhumed from ca. 8-10 km depth, and consists of hundreds of individual seismogenic slip surfaces lined by green cataclasites (crushed wall rocks cemented by hydrothermal epidote and K-feldspar) and black pseudotachylytes (solidified frictional melts, considered as a marker for seismic slip). A digital model of selected outcrop exposures was reconstructed with photogrammetric techniques, using a large number of high resolution digital photographs processed with VisualSFM software. The resulting DOM has a resolution up to 0.2 mm/pixel. Most of the outcrop was imaged with photographs each covering a 1 x 1 m2 area, while selected structural features, such as sidewall ripouts or stepovers, were covered with higher-resolution images covering 30 x 40 cm2 areas. Image processing algorithms were preliminarily tested using the ImageJ-Fiji package, then a workflow in Matlab was developed to process a large collection of images sequentially. Particularly in the detailed 30 x 40 cm2 images, cataclasites and hydrothermal veins were successfully identified using spectral analysis in RGB and HSV color spaces. This allows mapping the network of cataclasites and veins which provided the pathway for hydrothermal fluid circulation, and also the volume of mineralization, since we are able to measure the thickness of cataclasites and veins on the outcrop surface. The spectral signature of pseudotachylyte veins is indistinguishable from that of biotite grains in the wall rock (tonalite), so we tested morphological analysis tools to discriminate
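
    As a hedged illustration of the spectral-mapping step, the sketch below thresholds an outcrop photograph in HSV space to flag candidate green (epidote-rich) cataclasite and vein pixels. Every threshold value is invented for illustration and would in practice be tuned per image; the authors' Matlab workflow is not reproduced here.

    ```python
    # Hedged sketch: HSV thresholding for candidate cataclasite/vein pixels.
    import numpy as np
    from skimage import io, color

    def map_cataclasite(path, h_lo=0.20, h_hi=0.45, s_min=0.25, v_min=0.20):
        rgb = io.imread(path)[..., :3] / 255.0     # drop any alpha channel
        hsv = color.rgb2hsv(rgb)
        h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]
        mask = (h > h_lo) & (h < h_hi) & (s > s_min) & (v > v_min)
        return mask, mask.mean()                   # mask and areal coverage fraction
    ```

    Per-row run lengths of the resulting mask would then give thickness estimates of the mapped veins on the outcrop surface.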

  6. Comparison of a semi-automatic annotation tool and a natural language processing application for the generation of clinical statement entries

    PubMed Central

    Lin, Ching-Heng; Wu, Nai-Yuan; Lai, Wei-Shao; Liou, Der-Ming

    2015-01-01

    Background and objective Electronic medical records with encoded entries should enhance the semantic interoperability of document exchange. However, it remains a challenge to encode the narrative concept and to transform the coded concepts into a standard entry-level document. This study aimed to use a novel approach for the generation of entry-level interoperable clinical documents. Methods Using HL7 clinical document architecture (CDA) as the example, we developed three pipelines to generate entry-level CDA documents. The first approach was a semi-automatic annotation pipeline (SAAP), the second was a natural language processing (NLP) pipeline, and the third merged the above two pipelines. We randomly selected 50 test documents from the i2b2 corpora to evaluate the performance of the three pipelines. Results The 50 randomly selected test documents contained 9365 words, including 588 Observation terms and 123 Procedure terms. For the Observation terms, the merged pipeline had a significantly higher F-measure than the NLP pipeline (0.89 vs 0.80, p<0.0001), but a similar F-measure to that of the SAAP (0.89 vs 0.87). For the Procedure terms, the F-measure was not significantly different among the three pipelines. Conclusions The combination of a semi-automatic annotation approach and the NLP application seems to be a solution for generating entry-level interoperable clinical documents. PMID:25332357
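
    For clarity, the F-measure used to compare the pipelines is the harmonic mean of precision and recall over the extracted terms; a tiny sketch with purely hypothetical counts:

    ```python
    # F-measure from true-positive/false-positive/false-negative term counts.
    def f_measure(tp, fp, fn):
        precision = tp / (tp + fp)
        recall = tp / (tp + fn)
        return 2 * precision * recall / (precision + recall)

    # Hypothetical example: f_measure(tp=520, fp=60, fn=70) ~= 0.89
    ```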

  7. Semi-automatic 3D-volumetry of liver metastases from neuroendocrine tumors to improve combination therapy with 177Lu-DOTATOC and 90Y-DOTATOC

    PubMed Central

    Cieciera, Matthaeus; Kratochwil, Clemens; Moltz, Jan; Kauczor, Hans-Ulrich; Holland-Letz, Tim; Choyke, Peter; Mier, Walter; Haberkorn, Uwe; Giesel, Frederik L.

    2016-01-01

    PURPOSE Patients with neuroendocrine tumors (NET) often present with disseminated liver metastases and can be treated with a number of different nuclides or nuclide combinations in peptide receptor radionuclide therapy (PRRT) depending on tumor load and lesion diameter. For quantification of disseminated liver lesions, semi-automatic lesion detection is helpful to determine tumor burden and tumor diameter in a time efficient manner. Here, we aimed to evaluate semi-automated measurement of total metastatic burden for therapy stratification. METHODS Nineteen patients with liver metastasized NET underwent contrast-enhanced 1.5 T MRI using gadolinium-ethoxybenzyl diethylenetriaminepentaacetic acid. Liver metastases (n=1537) were segmented using Fraunhofer MEVIS Software for three-dimensional (3D) segmentation. All lesions were stratified according to longest 3D diameter >20 mm or ≤20 mm and relative contribution to tumor load was used for therapy stratification. RESULTS Mean count of lesions ≤20 mm was 67.5 and mean count of lesions >20 mm was 13.4. However, mean contribution to total tumor volume of lesions ≤20 mm was 24%, while contribution of lesions >20 mm was 76%. CONCLUSION Semi-automatic lesion analysis provides useful information about lesion distribution in predominantly liver metastasized NET patients prior to PRRT. As conventional manual lesion measurements are laborious, our study shows this new approach is more efficient and less operator-dependent and may prove to be useful in the decision making process selecting the best combination PRRT in each patient. PMID:27015320

  8. A semi-automatic microextraction in packed sorbent, using a digitally controlled syringe, combined with ultra-high pressure liquid chromatography as a new and ultra-fast approach for the determination of prenylflavonoids in beers.

    PubMed

    Gonçalves, João L; Alves, Vera L; Rodrigues, Fátima P; Figueira, José A; Câmara, José S

    2013-08-23

    In this work a highly selective and sensitive analytical procedure based on the semi-automatic microextraction by packed sorbents (MEPS) technique, using a new digitally controlled syringe (eVol(®)) combined with ultra-high pressure liquid chromatography (UHPLC), is proposed to determine the prenylated chalcone derived from the hop (Humulus lupulus L.), xanthohumol (XN), and its isomeric flavonone isoxanthohumol (IXN) in beers. Extraction and UHPLC parameters were accurately optimized to achieve the highest recoveries and to enhance the analytical characteristics of the method. Important parameters affecting MEPS performance, namely the type of sorbent material (C2, C8, C18, SIL, and M1), elution solvent system, number of extraction cycles (extract-discard), sample volume, elution volume, and sample pH, were evaluated. The optimal experimental conditions involve loading 500 μL of sample through a C18 sorbent in a MEPS syringe placed in the semi-automatic eVol(®) syringe, followed by elution using 250 μL of acetonitrile (ACN) in 10 extraction cycles (about 5 min for the entire sample preparation step). The obtained extract is directly analyzed in the UHPLC system using a binary mobile phase composed of aqueous 0.1% formic acid (eluent A) and ACN (eluent B) in the gradient elution mode (10 min total analysis). Under optimized conditions good results were obtained in terms of linearity within the established concentration range, with correlation coefficient (R) values higher than 0.986 and a residual deviation for each calibration point below 12%. The limit of detection (LOD) and limit of quantification (LOQ) obtained were 0.4 ng mL(-1) and 1.0 ng mL(-1) for IXN, and 0.9 ng mL(-1) and 3.0 ng mL(-1) for XN, respectively. Precision was lower than 4.6% for IXN and 8.4% for XN. Typical recoveries ranged between 67.1% and 99.3% for IXN and between 74.2% and 99.9% for XN, with relative standard deviations (%RSD) no larger than 8%. The applicability of the proposed analytical
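
    The linearity, LOD and LOQ figures above derive from a calibration curve. The sketch below shows one conventional way to obtain such figures (the 3.3 sigma/slope and 10 sigma/slope rules); the paper does not state its exact formulas, so these conventions, and all variable names, are assumptions.

    ```python
    # Hedged sketch: linear calibration with LOD/LOQ from residual scatter.
    import numpy as np

    def calibrate(conc, signal):
        conc, signal = np.asarray(conc, float), np.asarray(signal, float)
        slope, intercept = np.polyfit(conc, signal, 1)
        resid = signal - (slope * conc + intercept)
        sigma = resid.std(ddof=2)                     # residual standard deviation
        lod = 3.3 * sigma / slope                     # limit of detection
        loq = 10.0 * sigma / slope                    # limit of quantification
        r = np.corrcoef(conc, signal)[0, 1]           # correlation coefficient R
        return slope, intercept, lod, loq, r
    ```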

  9. New computing systems and their impact on computational mechanics

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1989-01-01

    Recent advances in computer technology that are likely to impact computational mechanics are reviewed. The technical needs for computational mechanics technology are outlined. The major features of new and projected computing systems, including supersystems, parallel processing machines, special-purpose computing hardware, and small systems are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed, and a novel partitioning strategy is outlined for maximizing the degree of parallelism on multiprocessor computers with a shared memory.

  10. Computer Security Systems Enable Access.

    ERIC Educational Resources Information Center

    Riggen, Gary

    1989-01-01

    A good security system enables access and protects information from damage or tampering, but the most important aspects of a security system aren't technical. A security procedures manual addresses the human element of computer security. (MLW)

  11. Robot, computer problem solving system

    NASA Technical Reports Server (NTRS)

    Becker, J. D.; Merriam, E. W.

    1973-01-01

    The TENEX computer system, the ARPA network, and computer language design technology were applied to support the complex system programs. By combining the pragmatic and theoretical aspects of robot development, an approach is created which is grounded in realism, but which also has at its disposal the power that comes from looking at complex problems from an abstract analytical point of view.

  12. Design and development of a prototypical software for semi-automatic generation of test methodologies and security checklists for IT vulnerability assessment in small- and medium-sized enterprises (SME)

    NASA Astrophysics Data System (ADS)

    Möller, Thomas; Bellin, Knut; Creutzburg, Reiner

    2015-03-01

    The aim of this paper is to show the recent progress in the design and prototypical development of a software suite Copra Breeder* for semi-automatic generation of test methodologies and security checklists for IT vulnerability assessment in small and medium-sized enterprises.

  13. Axonal Morphometry of Hippocampal Pyramidal Neurons Semi-Automatically Reconstructed After In-Vivo Labeling in Different CA3 Locations

    PubMed Central

    Ropireddy, Deepak; Scorcioni, Ruggero; Lasher, Bonnie; Buzsáki, Gyorgy; Ascoli, Giorgio A.

    2013-01-01

    Axonal arbors of principal neurons form the backbone of neuronal networks in the mammalian cortex. Three-dimensional reconstructions of complete axonal trees are invaluable for quantitative analysis and modeling. However, digital data are still sparse due to the labor intensity of reconstructing these complex structures. We augmented conventional tracing techniques with computational approaches to reconstruct fully labeled axonal morphologies. We digitized the axons of three rat hippocampal pyramidal cells intracellularly filled in-vivo from different CA3 sub-regions: two from areas CA3b and CA3c, respectively, toward the septal pole, and one from the posterior/ventral area (CA3pv) near the temporal pole. The reconstruction system was validated by comparing the morphology of the CA3c neuron with that traced from the same cell by a different operator on a standard commercial setup. Morphometric analysis revealed substantial differences among neurons. Total length ranged from 200 mm (CA3b) to 500 mm (CA3c), and axonal branching complexity peaked between 1 mm (CA3b and CA3pv) and 2 mm (CA3c) of Euclidean distance from the soma. Length distribution was analyzed among sub-regions (CA3a,b,c and CA1a,b,c), cytoarchitectonic layers, and longitudinal extent within a three-dimensional template of the rat hippocampus. The CA3b axon extended three times more collaterals within CA3 than into CA1. On the contrary, the CA3c projection into CA1 was double that within CA3. Moreover, the CA3b axon extension was equal between strata oriens and radiatum, while the CA3c axon displayed an oriens/radiatum ratio of 1:6. The axonal distribution of the CA3pv neuron was intermediate between those of the CA3b and CA3c neurons both relative to sub-regions and layers, with uniform collateral presence across CA3/CA1 and moderate preponderance of radiatum over oriens. In contrast with the dramatic sub-region and layer differences, the axon longitudinal spread around the soma was similar for the three neurons

  14. A web-based computer aided system for liver surgery planning: initial implementation on RayPlus

    NASA Astrophysics Data System (ADS)

    Luo, Ming; Yuan, Rong; Sun, Zhi; Li, Tianhong; Xie, Qingguo

    2016-03-01

    At present, computer aided systems for liver surgery design and risk evaluation are widely used in clinics all over the world. However, most systems are local applications that run on high-performance workstations, and the images have to be processed offline. Compared with local applications, a web-based system is accessible anywhere and to a range of users, regardless of relative processing power or operating system. RayPlus (http://rayplus.life.hust.edu.cn), a B/S platform for medical image processing, was developed to give a jump start to web-based medical image processing. In this paper, we implement a computer aided system for liver surgery planning on the architecture of RayPlus. The system consists of a series of processing steps applied to CT images, including filtering, segmentation, visualization and analysis. Each processing step is packaged into an executable program and runs on the server side. CT images in DICOM format are processed step by step to interactive modeling in the browser, with zero installation and server-side computing. The system allows users to semi-automatically segment the liver, intrahepatic vessels and tumor from the pre-processed images. Then, surface and volume models are built to analyze the vessel structure and the relative position between adjacent organs. The results show that the initial implementation meets its first-order objectives satisfactorily and provides an accurate 3D delineation of the liver anatomy. Vessel labeling and resection simulation are planned as future additions. The system is available on the Internet at the link mentioned above, and an open username for testing is offered.

  15. The UCLA MEDLARS computer system.

    PubMed

    Garvis, F J

    1966-01-01

    Under a subcontract with UCLA the Planning Research Corporation has changed the MEDLARS system to make it possible to use the IBM 7094/7040 direct-couple computer instead of the Honeywell 800 for demand searches. The major tasks were the rewriting of the programs in COBOL and the copying of the stored information onto the narrower tapes that IBM computers require. (In the future NLM will copy the tapes for IBM computer users.) The differences in the software required by the two computers are noted. Major and costly revisions would be needed to adapt the large MEDLARS system to the smaller IBM 1401 and 1410 computers. In general, MEDLARS is transferable to other computers of the IBM 7000 class, the new IBM 360, and those of like size, such as the CDC 1604 or UNIVAC 1108, although additional changes are necessary. Potential future improvements are suggested. PMID:5901355

  16. Students "Hacking" School Computer Systems

    ERIC Educational Resources Information Center

    Stover, Del

    2005-01-01

    This article deals with students hacking school computer systems. School districts are getting tough with students "hacking" into school computers to change grades, poke through files, or just pit their high-tech skills against district security. Dozens of students have been prosecuted recently under state laws on identity theft and unauthorized…

  17. Robot computer problem solving system

    NASA Technical Reports Server (NTRS)

    Becker, J. D.; Merriam, E. W.

    1974-01-01

    The conceptual, experimental, and practical aspects of the development of a robot computer problem solving system were investigated. The distinctive characteristics of the approach taken were formulated in relation to various studies of cognition and robotics. Vehicle and eye control systems were structured, and the information to be generated by the visual system was defined.

  18. User computer system pilot project

    SciTech Connect

    Eimutis, E.C.

    1989-09-06

    The User Computer System (UCS) is a general purpose unclassified, nonproduction system for Mound users. The UCS pilot project was successfully completed, and the system currently has more than 250 users. Over 100 tables were installed on the UCS for use by subscribers, including tables containing data on employees, budgets, and purchasing. In addition, a UCS training course was developed and implemented.

  19. Operating systems. [of computers

    NASA Technical Reports Server (NTRS)

    Denning, P. J.; Brown, R. L.

    1984-01-01

    A computer operating system creates a hierarchy of levels of abstraction, so that at a given level all details concerning lower levels can be ignored. This hierarchical structure separates functions according to their complexity, characteristic time scale, and level of abstraction. The lowest levels include the system's hardware; concepts associated explicitly with the coordination of multiple tasks appear at intermediate levels, which conduct 'primitive processes'. The software semaphore is the mechanism controlling primitive processes that must be synchronized. At higher levels lie, in rising order, the access to the secondary storage devices of a particular machine, a 'virtual memory' scheme for managing the main and secondary memories, communication between processes by way of a mechanism called a 'pipe', access to external input and output devices, and a hierarchy of directories cataloguing the hardware and software objects to which access must be controlled.

  20. A new generic method for semi-automatic extraction of river and road networks in low- and mid-resolution satellite images

    NASA Astrophysics Data System (ADS)

    Grazzini, Jacopo; Dillard, Scott; Soille, Pierre

    2010-10-01

    This paper addresses the problem of semi-automatic extraction of road or hydrographic networks in satellite images. For that purpose, we propose an approach combining concepts arising from mathematical morphology and hydrology. The method exploits both geometrical and topological characteristics of rivers/roads and their tributaries in order to reconstruct the complete networks. It assumes that the images satisfy the following two general assumptions, which are the minimum conditions for a road/river network to be identifiable and are usually verified in low- to mid-resolution satellite images: (i) visual constraint: most pixels composing the network have similar spectral signature that is distinguishable from most of the surrounding areas; (ii) geometric constraint: a line is a region that is relatively long and narrow, compared with other objects in the image. While this approach fully exploits local (roads/rivers are modeled as elongated regions with a smooth spectral signature in the image and a maximum width) and global (they are structured like a tree) characteristics of the networks, further directional information about the image structures is incorporated. Namely, an appropriate anisotropic metric is designed by using both the characteristic features of the target network and the eigen-decomposition of the gradient structure tensor of the image. Finally, geodesic propagation from a given network seed under this metric is combined with hydrological operators for overland flow simulation to extract the paths which contain the most line evidence and to identify them with the target network.
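
    The directional ingredient described above, the eigen-decomposition of the gradient structure tensor, can be sketched as follows. Per pixel it yields the dominant orientation and an anisotropy measure from which a direction-dependent (anisotropic) metric for the geodesic propagation could be built; the geodesic and hydrological steps themselves are not shown, and scikit-image's structure_tensor is an assumed stand-in for the authors' implementation.

    ```python
    # Hedged sketch: orientation and anisotropy from the structure tensor.
    import numpy as np
    from skimage.feature import structure_tensor

    def tensor_orientation(image, sigma=2.0):
        Arr, Arc, Acc = structure_tensor(image, sigma=sigma, order='rc')
        tr, det = Arr + Acc, Arr * Acc - Arc ** 2
        disc = np.sqrt(np.maximum(tr ** 2 / 4.0 - det, 0.0))
        lam1, lam2 = tr / 2.0 + disc, tr / 2.0 - disc    # eigenvalues, lam1 >= lam2
        anisotropy = (lam1 - lam2) / (lam1 + lam2 + 1e-12)  # ~1 on line-like structures
        theta = 0.5 * np.arctan2(2.0 * Arc, Acc - Arr)   # dominant orientation (row-col)
        return anisotropy, theta            # ingredients for an anisotropic speed map
    ```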

  1. Semi-automatic delineation of the spino-laminar junction curve on lateral x-ray radiographs of the cervical spine

    NASA Astrophysics Data System (ADS)

    Narang, Benjamin; Phillips, Michael; Knapp, Karen; Appelboam, Andy; Reuben, Adam; Slabaugh, Greg

    2015-03-01

    Assessment of the cervical spine using x-ray radiography is an important task when providing emergency room care to trauma patients suspected of a cervical spine injury. In routine clinical practice, a physician will inspect the alignment of the cervical spine vertebrae by mentally tracing three alignment curves along the anterior and posterior sides of the cervical vertebral bodies, as well as one along the spinolaminar junction. In this paper, we propose an algorithm to semi-automatically delineate the spinolaminar junction curve, given a single reference point and the corners of each vertebral body. From the reference point, our method extracts a region of interest, and performs template matching using normalized cross-correlation to find matching regions along the spinolaminar junction. Matching points are then fit to a third order spline, producing an interpolating curve. Experimental results are promising, on average producing a modified Hausdorff distance of 1.8 mm, validated on a dataset of 29 patients, including those with degenerative change, retrolisthesis, and fracture.
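
    A minimal sketch of the matching-and-fitting steps, normalized cross-correlation followed by a third-order spline through the match centres, is given below. The greedy patch-selection heuristic, the minimum-separation distance and all sizes are simplifications, not the paper's exact procedure.

    ```python
    # Hedged sketch: NCC template matching plus cubic-spline curve fitting.
    import numpy as np
    from skimage.feature import match_template
    from scipy.interpolate import splprep, splev

    def junction_curve(image, template, n_points=7, min_sep=20, samples=100):
        response = match_template(image, template, pad_input=True)
        order = np.argsort(response.ravel())[::-1]    # strongest matches first
        ys, xs = np.unravel_index(order, response.shape)
        pts = [(xs[0], ys[0])]
        for x, y in zip(xs[1:], ys[1:]):
            if all(np.hypot(x - px, y - py) > min_sep for px, py in pts):
                pts.append((x, y))                    # keep mutually distant matches
            if len(pts) == n_points:
                break
        pts = np.array(sorted(pts, key=lambda p: p[1]), float)  # order along the spine
        tck, _ = splprep([pts[:, 0], pts[:, 1]], k=3, s=len(pts))  # third-order spline
        return np.array(splev(np.linspace(0, 1, samples), tck)).T
    ```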

  2. Semi-automatic segmentation and modeling of the cervical spinal cord for volume quantification in multiple sclerosis patients from magnetic resonance images

    NASA Astrophysics Data System (ADS)

    Sonkova, Pavlina; Evangelou, Iordanis E.; Gallo, Antonio; Cantor, Fredric K.; Ohayon, Joan; McFarland, Henry F.; Bagnato, Francesca

    2008-03-01

    Spinal cord (SC) tissue loss is known to occur in some patients with multiple sclerosis (MS), resulting in SC atrophy. Currently, no measurement tools exist to determine the magnitude of SC atrophy from Magnetic Resonance Images (MRI). We have developed and implemented a novel semi-automatic method for quantifying the cervical SC volume (CSCV) from MRI based on level sets. The image dataset consisted of SC MRI exams obtained at 1.5 Tesla from 12 MS patients (10 relapsing-remitting and 2 secondary progressive) and 12 age- and gender-matched healthy volunteers (HVs). 3D high resolution image data were acquired using an IR-FSPGR sequence in the sagittal plane. The mid-sagittal slice (MSS) was automatically located based on the entropy calculation for each of the consecutive sagittal slices. The image data were then pre-processed by 3D anisotropic diffusion filtering for noise reduction and edge enhancement before segmentation with a level set formulation which did not require re-initialization. The developed method was tested against manual segmentation (considered ground truth), and intra-observer and inter-observer variabilities were evaluated.

  3. A new generic method for the semi-automatic extraction of river and road networks in low and mid-resolution satellite images

    SciTech Connect

    Grazzini, Jacopo; Dillard, Scott; Soille, Pierre

    2010-10-21

    This paper addresses the problem of semi-automatic extraction of road or hydrographic networks in satellite images. For that purpose, we propose an approach combining concepts arising from mathematical morphology and hydrology. The method exploits both geometrical and topological characteristics of rivers/roads and their tributaries in order to reconstruct the complete networks. It assumes that the images satisfy the following two general assumptions, which are the minimum conditions for a road/river network to be identifiable and are usually verified in low- to mid-resolution satellite images: (i) visual constraint: most pixels composing the network have similar spectral signature that is distinguishable from most of the surrounding areas; (ii) geometric constraint: a line is a region that is relatively long and narrow, compared with other objects in the image. While this approach fully exploits local (roads/rivers are modeled as elongated regions with a smooth spectral signature in the image and a maximum width) and global (they are structured like a tree) characteristics of the networks, further directional information about the image structures is incorporated. Namely, an appropriate anisotropic metric is designed by using both the characteristic features of the target network and the eigen-decomposition of the gradient structure tensor of the image. Finally, geodesic propagation from a given network seed under this metric is combined with hydrological operators for overland flow simulation to extract the paths which contain the most line evidence and to identify them with the target network.

  4. Revised adage graphics computer system

    NASA Technical Reports Server (NTRS)

    Tulppo, J. S.

    1980-01-01

    Bootstrap loader and mode-control options for the Adage Graphics Computer System significantly simplify operating procedures. Normal load and control functions are performed quickly and easily from the control console. Operating characteristics of the revised system include greatly increased speed, convenience, and reliability.

  5. Mission operations computing systems evolution

    NASA Technical Reports Server (NTRS)

    Kurzhals, P. R.

    1981-01-01

    As part of its preparation for the operational Shuttle era, the Goddard Space Flight Center (GSFC) is currently replacing most of the mission operations computing complexes that have supported near-earth space missions since the late 1960's. Major associated systems include the Metric Data Facility (MDF) which preprocesses, stores, and forwards all near-earth satellite tracking data; the Orbit Computation System (OCS) which determines related production orbit and attitude information; the Flight Dynamics System (FDS) which formulates spacecraft attitude and orbit maneuvers; and the Command Management System (CMS) which handles mission planning, scheduling, and command generation and integration. Management issues and experiences for the resultant replacement process are driven by a wide range of possible future mission requirements, flight-critical system aspects, complex internal system interfaces, extensive existing applications software, and phasing to optimize systems evolution.

  6. A semi-automatic method to extract canal pathways in 3D micro-CT images of Octocorals.

    PubMed

    Morales Pinzón, Alfredo; Orkisz, Maciej; Rodríguez Useche, Catalina María; Torres González, Juan Sebastián; Teillaud, Stanislas; Sánchez, Juan Armando; Hernández Hoyos, Marcela

    2014-01-01

    The long-term goal of our study is to understand the internal organization of the octocoral stem canals, as well as their physiological and functional role in the growth of the colonies, and finally to assess the influence of climatic changes on this species. Here we focus on imaging tools, namely acquisition and processing of three-dimensional high-resolution images, with emphasis on automated extraction of canal pathways. Our aim was to evaluate the feasibility of the whole process, to point out and solve - if possible - technical problems related to the specimen conditioning, to determine the best acquisition parameters and to develop necessary image-processing algorithms. The pathways extracted are expected to facilitate the structural analysis of the colonies, namely to help in observing the distribution, formation and number of canals along the colony. Five volumetric images of Muricea muricata specimens were successfully acquired by X-ray computed tomography with spatial resolution ranging from 4.5 to 25 micrometers. The success mainly depended on specimen immobilization. More than [Formula: see text] of the canals were successfully detected and tracked by the image-processing method developed. The three-dimensional representation of the canal network thus obtained was generated for the first time without the need for histological or other destructive methods. Several canal patterns were observed. Although most of them were simple, i.e. only followed the main branch or "turned" into a secondary branch, many others bifurcated or fused. A majority of bifurcations were observed at branching points. However, some canals appeared and/or ended anywhere along a branch. At the tip of a branch, all canals fused into a unique chamber. Three-dimensional high-resolution tomographic imaging gives a non-destructive insight into the coral ultrastructure and helps in understanding the organization of the canal network. Advanced image-processing techniques greatly reduce human observer

  7. A Semi-Automatic Method to Extract Canal Pathways in 3D Micro-CT Images of Octocorals

    PubMed Central

    Morales Pinzón, Alfredo; Orkisz, Maciej; Rodríguez Useche, Catalina María; Torres González, Juan Sebastián; Teillaud, Stanislas; Sánchez, Juan Armando; Hernández Hoyos, Marcela

    2014-01-01

    The long-term goal of our study is to understand the internal organization of the octocoral stem canals, as well as their physiological and functional role in the growth of the colonies, and finally to assess the influence of climatic changes on this species. Here we focus on imaging tools, namely acquisition and processing of three-dimensional high-resolution images, with emphasis on automated extraction of canal pathways. Our aim was to evaluate the feasibility of the whole process, to point out and solve – if possible – technical problems related to the specimen conditioning, to determine the best acquisition parameters and to develop necessary image-processing algorithms. The pathways extracted are expected to facilitate the structural analysis of the colonies, namely to help in observing the distribution, formation and number of canals along the colony. Five volumetric images of Muricea muricata specimens were successfully acquired by X-ray computed tomography with spatial resolution ranging from 4.5 to 25 micrometers. The success mainly depended on specimen immobilization. More than of the canals were successfully detected and tracked by the image-processing method developed. The three-dimensional representation of the canal network thus obtained was generated for the first time without the need for histological or other destructive methods. Several canal patterns were observed. Although most of them were simple, i.e. only followed the main branch or “turned” into a secondary branch, many others bifurcated or fused. A majority of bifurcations were observed at branching points. However, some canals appeared and/or ended anywhere along a branch. At the tip of a branch, all canals fused into a unique chamber. Three-dimensional high-resolution tomographic imaging gives a non-destructive insight into the coral ultrastructure and helps in understanding the organization of the canal network. Advanced image-processing techniques greatly reduce human observer's effort and

  8. Computational capabilities of physical systems.

    PubMed

    Wolpert, David H

    2002-01-01

    In this paper strong limits on the accuracy of real-world physical computation are established. To derive these results a non-Turing machine formulation of physical computation is used. First it is proven that there cannot be a physical computer C to which one can pose any and all computational tasks concerning the physical universe. Next it is proven that no physical computer C can correctly carry out every computational task in the subset of such tasks that could potentially be posed to C. This means in particular that there cannot be a physical computer that can be assured of correctly "processing information faster than the universe does." Because this result holds independent of how or if the computer is physically coupled to the rest of the universe, it also means that there cannot exist an infallible, general-purpose observation apparatus, nor an infallible, general-purpose control apparatus. These results do not rely on systems that are infinite, and/or nonclassical, and/or obey chaotic dynamics. They also hold even if one could use an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing machine (TM). After deriving these results analogs of the TM Halting theorem are derived for the novel kind of computer considered in this paper, as are results concerning the (im)possibility of certain kinds of error-correcting codes. In addition, an analog of algorithmic information complexity, "prediction complexity," is elaborated. A task-independent bound is derived on how much the prediction complexity of a computational task can differ for two different reference universal physical computers used to solve that task. This is analogous to the "encoding" bound governing how much the algorithm information complexity of a TM calculation can differ for two reference universal TMs. It is proven that either the Hamiltonian of our universe proscribes a certain type of computation, or prediction complexity is unique (unlike

  9. Computational capabilities of physical systems

    NASA Astrophysics Data System (ADS)

    Wolpert, David H.

    2002-01-01

    In this paper strong limits on the accuracy of real-world physical computation are established. To derive these results a non-Turing machine formulation of physical computation is used. First it is proven that there cannot be a physical computer C to which one can pose any and all computational tasks concerning the physical universe. Next it is proven that no physical computer C can correctly carry out every computational task in the subset of such tasks that could potentially be posed to C. This means in particular that there cannot be a physical computer that can be assured of correctly ``processing information faster than the universe does.'' Because this result holds independent of how or if the computer is physically coupled to the rest of the universe, it also means that there cannot exist an infallible, general-purpose observation apparatus, nor an infallible, general-purpose control apparatus. These results do not rely on systems that are infinite, and/or nonclassical, and/or obey chaotic dynamics. They also hold even if one could use an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing machine (TM). After deriving these results analogs of the TM Halting theorem are derived for the novel kind of computer considered in this paper, as are results concerning the (im)possibility of certain kinds of error-correcting codes. In addition, an analog of algorithmic information complexity, ``prediction complexity,'' is elaborated. A task-independent bound is derived on how much the prediction complexity of a computational task can differ for two different reference universal physical computers used to solve that task. This is analogous to the ``encoding'' bound governing how much the algorithm information complexity of a TM calculation can differ for two reference universal TMs. It is proven that either the Hamiltonian of our universe proscribes a certain type of computation, or prediction complexity is unique (unlike

  10. View planning and mesh refinement effects on a semi-automatic three-dimensional photorealistic texture mapping procedure

    NASA Astrophysics Data System (ADS)

    Shih, Chihhsiong; Yang, Yuanfan

    2012-02-01

    A novel three-dimensional (3-D) photorealistic texturing process is presented that applies a view-planning and view-sequencing algorithm to the 3-D coarse model to determine a set of best viewing angles for capturing images of the individual real-world objects/buildings. The best sequence of views will generate sets of visible edges in each view to serve as a guide for camera field shots by either manual adjustment or equipment alignment. The best view tries to cover as many object/building surfaces as possible in one shot. This will lead to a smaller total number of shots taken for a complete model reconstruction requiring texturing with photo-realistic effects. The direct linear transformation method (DLT) is used for reprojection of 3-D model vertices onto a two-dimensional (2-D) image plane for actual texture mapping. Given this method, the actual camera orientations do not have to be unique and can be set arbitrarily without heavy and expensive positioning equipment. We also present results of a study on the texture-mapping precision as a function of the level of visible mesh subdivision. In addition, the control-point selection for the DLT method used for reprojection of 3-D model vertices onto 2-D textured images is also investigated for its effects on mapping precision. By using DLT and perspective projection theories on coarse model feature points, this technique will allow accurate 3-D texture mapping of refined model meshes of real-world buildings. The novel integration flow of this research not only greatly reduces the human labor and intensive equipment requirements of traditional methods, but also generates a more appealing photo-realistic appearance of reconstructed models, which is useful in many multimedia applications. The roles of view planning (VP) are manifold. VP can (1) reduce the repetitive texture-mapping computation load and (2) present a set of visible model wireframe edges that can serve as a guide for images with sharp edges and
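
    For concreteness, DLT reprojection estimates 11 parameters from at least six known 3D-to-2D control-point correspondences by least squares, after which any model vertex can be reprojected into the photograph for texture lookup. A compact sketch (array names illustrative):

    ```python
    # Hedged sketch of the 11-parameter direct linear transformation (DLT).
    import numpy as np

    def dlt_fit(xyz, uv):
        """xyz: N x 3 control points, uv: N x 2 image points, N >= 6."""
        A, b = [], []
        for (X, Y, Z), (u, v) in zip(xyz, uv):
            A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z]); b.append(u)
            A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z]); b.append(v)
        L, *_ = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float), rcond=None)
        return L                                      # DLT parameters L1..L11

    def dlt_project(L, xyz):
        X, Y, Z = np.asarray(xyz, float).T
        w = L[8] * X + L[9] * Y + L[10] * Z + 1.0     # projective denominator
        u = (L[0] * X + L[1] * Y + L[2] * Z + L[3]) / w
        v = (L[4] * X + L[5] * Y + L[6] * Z + L[7]) / w
        return np.stack([u, v], axis=-1)
    ```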

  11. Analog computation with dynamical systems

    NASA Astrophysics Data System (ADS)

    Siegelmann, Hava T.; Fishman, Shmuel

    1998-09-01

    Physical systems exhibit various levels of complexity: their long term dynamics may converge to fixed points or exhibit complex chaotic behavior. This paper presents a theory that makes it possible to interpret natural processes as special-purpose analog computers. Since physical systems are naturally described in continuous time, a definition of computational complexity for continuous time systems is required. In analogy with the classical discrete theory we develop fundamentals of computational complexity for dynamical systems, discrete or continuous in time, on the basis of an intrinsic time scale of the system. Dissipative dynamical systems are classified into the computational complexity classes P_d, Co-RP_d, NP_d and EXP_d, corresponding to their standard counterparts, according to the complexity of their long term behavior. The complexity of chaotic attractors relative to regular ones leads to the conjecture P_d ≠ NP_d. Continuous time flows have been proven useful in solving various practical problems. Our theory provides the tools for an algorithmic analysis of such flows. As an example we analyze the continuous Hopfield network.
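
    As a concrete instance of the dissipative flows this theory classifies, the following sketch (network size, weights and integration parameters are illustrative, not from the paper) integrates a continuous Hopfield network and checks convergence to a fixed-point attractor:

      import numpy as np

      # Minimal continuous Hopfield flow: du/dt = -u + W @ tanh(u) + b.
      # A symmetric weight matrix makes the flow dissipative (a Lyapunov
      # function exists), so trajectories converge to fixed points.
      rng = np.random.default_rng(0)
      n = 8
      W = rng.normal(size=(n, n))
      W = (W + W.T) / 2            # symmetrize to guarantee convergence
      np.fill_diagonal(W, 0.0)
      b = rng.normal(size=n)

      u = rng.normal(size=n)
      dt = 0.01
      for _ in range(20000):       # simple forward-Euler integration
          u = u + dt * (-u + W @ np.tanh(u) + b)

      residual = np.linalg.norm(-u + W @ np.tanh(u) + b)
      print("fixed point residual:", residual)   # ~0 after convergence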

  12. A semi-automatic image-based close range 3D modeling pipeline using a multi-camera configuration.

    PubMed

    Rau, Jiann-Yeou; Yeh, Po-Chia

    2012-01-01

    The generation of photo-realistic 3D models is an important task for digital recording of cultural heritage objects. This study proposes an image-based 3D modeling pipeline which takes advantage of a multi-camera configuration and multi-image matching technique that does not require any markers on or around the object. Multiple digital single lens reflex (DSLR) cameras are adopted and fixed with invariant relative orientations. Instead of photo-triangulation after image acquisition, calibration is performed to estimate the exterior orientation parameters of the multi-camera configuration, which can be processed fully automatically using coded targets. The calibrated orientation parameters of all cameras are applied to images taken using the same camera configuration. This means that when performing multi-image matching for surface point cloud generation, the orientation parameters will remain the same as the calibrated results, even when the target has changed. Based on this invariant property, the whole 3D modeling pipeline can be performed completely automatically once the whole system has been calibrated and the software seamlessly integrated. Several experiments were conducted to prove the feasibility of the proposed system. The objects imaged include a human being, eight Buddhist statues, and a stone sculpture. The results for the stone sculpture, obtained with several multi-camera configurations, were compared with a reference model acquired by an ATOS-I 2M active scanner. The best result has an absolute accuracy of 0.26 mm and a relative accuracy of 1:17,333. This demonstrates the feasibility of the proposed low-cost image-based 3D modeling pipeline and its applicability to a large quantity of antiques stored in a museum. PMID:23112656

  14. In vivo semi-automatic segmentation of multicontrast cardiovascular magnetic resonance for prospective cohort studies on plaque tissue composition: initial experience.

    PubMed

    Yoneyama, Taku; Sun, Jie; Hippe, Daniel S; Balu, Niranjan; Xu, Dongxiang; Kerwin, William S; Hatsukami, Thomas S; Yuan, Chun

    2016-01-01

    Automatic in vivo segmentation of multicontrast (multisequence) carotid magnetic resonance for plaque composition has been proposed as a substitute for manual review to save time and reduce inter-reader variability in large-scale or multicenter studies. Using serial images from a prospective longitudinal study, we sought to compare a semi-automatic approach versus expert human reading in analyzing carotid atherosclerosis progression. Baseline and 6-month follow-up multicontrast carotid images from 59 asymptomatic subjects with 16-79% carotid stenosis were reviewed both by trained radiologists with 2-4 years of specialized experience in carotid plaque characterization with MRI and by a previously reported automatic atherosclerotic plaque segmentation algorithm, referred to as morphology-enhanced probabilistic plaque segmentation (MEPPS). Agreement on measurements from individual time points, as well as on compositional changes, was assessed using the intraclass correlation coefficient (ICC). There was good agreement between manual and MEPPS reviews on individual time points for calcification (CA) (area: ICC 0.85-0.91; volume: ICC 0.92-0.95) and lipid-rich necrotic core (LRNC) (area: ICC 0.78-0.82; volume: ICC 0.84-0.86). For compositional changes, agreement was good for CA volume change (ICC 0.78) and moderate for LRNC volume change (ICC 0.49). Factors associated with LRNC progression as detected by MEPPS review included intraplaque hemorrhage (positive association) and reduction in low-density lipoprotein cholesterol (negative association), which were consistent with previous findings from manual review. The automatic classifier for plaque composition produced results similar to expert manual review in a prospective serial MRI study of carotid atherosclerosis progression. Such automatic classification tools may be beneficial in large-scale multicenter studies by reducing image analysis time and avoiding bias between human reviewers. PMID:26169389
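
    For readers unfamiliar with the agreement statistic, the sketch below computes one common ICC form, the two-way random-effects ICC(2,1) of Shrout and Fleiss; the abstract does not state which ICC variant was used, so this is purely illustrative:

      import numpy as np

      def icc_2_1(data):
          """ICC(2,1): two-way random effects, absolute agreement.

          data: (n_subjects, k_raters) array, e.g. LRNC areas for the
          same lesions measured by a human reader and by an automatic
          classifier (one column per reader).
          """
          n, k = data.shape
          grand = data.mean()
          ms_rows = k * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)
          ms_cols = n * ((data.mean(axis=0) - grand) ** 2).sum() / (k - 1)
          ss_tot = ((data - grand) ** 2).sum()
          ss_err = ss_tot - ms_rows * (n - 1) - ms_cols * (k - 1)
          ms_err = ss_err / ((n - 1) * (k - 1))
          return (ms_rows - ms_err) / (
              ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

      # Perfect agreement yields ICC = 1.0:
      print(icc_2_1(np.array([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]])))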

  15. Estimating ice albedo from fine debris cover quantified by a semi-automatic method: the case study of Forni Glacier, Italian Alps

    NASA Astrophysics Data System (ADS)

    Azzoni, Roberto Sergio; Senese, Antonella; Zerboni, Andrea; Maugeri, Maurizio; Smiraglia, Claudio; Diolaiuti, Guglielmina Adele

    2016-03-01

    Despite the abundant literature focusing on fine debris deposition over glacier accumulation areas, less attention has been paid to the glacier melting surface. Accordingly, we proposed a novel method based on semi-automatic image analysis to estimate ice albedo from fine debris coverage (d). Our procedure was tested on the surface of a wide Alpine valley glacier (the Forni Glacier, Italy) in summer 2011, 2012 and 2013, acquiring parallel data sets of in situ measurements of ice albedo and high-resolution surface images. Analysis of 51 images yielded d values ranging from 0.01 to 0.63, and albedo was found to vary from 0.06 to 0.32. The estimated d values are in a linear relation with the natural logarithm of measured ice albedo (R = -0.84). The robustness of our approach in evaluating d was analyzed through five sensitivity tests, and we found that it is largely replicable. On the Forni Glacier, we also quantified a mean debris coverage rate (Cr) equal to 6 g m⁻² per day during the ablation season of 2013, thus supporting previous studies that describe ongoing darkening phenomena at the surface of Alpine debris-free glaciers. In addition to debris coverage, we also considered the impact of water (both from melt and rainfall) as a factor that modulates albedo: meltwater occurs during the central hours of the day, decreasing the albedo due to its lower reflectivity; instead, rainfall causes a subsequent mean daily albedo increase slightly higher than 20%, although it is short-lasting (from 1 to 4 days).
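
    The reported relation can be turned into a simple albedo estimator. The sketch below uses hypothetical d/albedo pairs spanning the reported ranges (not the paper's data) to fit ln(albedo) as a linear function of d:

      import numpy as np

      # Hypothetical samples within the reported ranges (d: 0.01-0.63,
      # albedo: 0.06-0.32), standing in for the paper's 51 images.
      d = np.array([0.01, 0.05, 0.10, 0.20, 0.35, 0.50, 0.63])
      albedo = np.array([0.32, 0.27, 0.22, 0.16, 0.11, 0.08, 0.06])

      slope, intercept = np.polyfit(d, np.log(albedo), 1)
      r = np.corrcoef(d, np.log(albedo))[0, 1]
      print(f"ln(albedo) = {intercept:.2f} + {slope:.2f} d   (R = {r:.2f})")

      def albedo_from_debris(d_frac):
          """Estimate ice albedo from semi-automatically measured d."""
          return np.exp(intercept + slope * d_frac)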

  16. Policy Information System Computer Program.

    ERIC Educational Resources Information Center

    Hamlin, Roger E.; And Others

    The concepts and methodologies outlined in "A Policy Information System for Vocational Education" are presented in a simple computer format in this booklet. It also contains a sample output representing 5-year projections of various planning needs for vocational education. Computerized figures in the eight areas corresponding to those in the…

  17. SYRIT Computer School Systems Report.

    ERIC Educational Resources Information Center

    Maldonado, Carmen

    The 1991-92 and 1993-94 audits of SYRIT Computer School Systems revealed noncompliance with applicable laws and regulations in certifying students for Tuition Assistance Program (TAP) awards. SYRIT was overpaid $2,817,394 because school officials incorrectly certified student eligibility. The audit also discovered that students graduated and were…

  18. Robot computer problem solving system

    NASA Technical Reports Server (NTRS)

    Becker, J. D.; Merriam, E. W.

    1974-01-01

    The conceptual, experimental, and practical phases of developing a robot computer problem solving system are outlined. Robot intelligence, conversion of the programming language SAIL to run under the TENEX monitor, and the use of the network to run several cooperating jobs at different sites are discussed.

  19. Computer aided coordinate measuring systems

    NASA Astrophysics Data System (ADS)

    Nastri, J. W.

    Sikorsky's computer-aided inspection system and the equipment utilized to assure that manufactured parts meet drawing tolerance specifications are discussed. An overview of the system is given, and the software is described, including the monitor console routine and commands and the language commands. The system's three coordinate measuring machines are discussed, and the part inspection methods are described in stepwise fashion. System benefits and time savings are detailed, including quick and accurate measurement of parts difficult to inspect by conventional methods, significant reduction in inspection time, a consistent baseline that highlights variances, and the use of personnel with lower skill levels to effectively inspect critical parts.

  20. Robot, computer problem solving system

    NASA Technical Reports Server (NTRS)

    Becker, J. D.

    1972-01-01

    The development of a computer problem solving system that considers physical problems faced by an artificial robot moving around in a complex environment is reported. Fundamental interaction constraints with a real environment are simulated for the robot by visual scan and creation of an internal environmental model. The programming system used in constructing the problem solving system for the simulated robot and its simulated world environment is outlined together with the task that the system is capable of performing. A very general framework for understanding the relationship between an observed behavior and an adequate description of that behavior is included.

  1. Semi-automatic delimitation of volcanic edifice boundaries: Validation and application to the cinder cones of the Tancitaro-Nueva Italia region (Michoacán-Guanajuato Volcanic Field, Mexico)

    NASA Astrophysics Data System (ADS)

    Di Traglia, Federico; Morelli, Stefano; Casagli, Nicola; Garduño Monroy, Victor Hugo

    2014-08-01

    The shape and size of monogenetic volcanoes are the result of complex evolutions involving the interaction of eruptive activity, structural setting and degradational processes. Morphological studies of cinder cones aim to evaluate volcanic hazard on the Earth and to decipher the origins of various structures on extraterrestrial planets. Efforts have so far been dedicated to characterizing cinder cone morphology in a systematic and comparable manner. However, manual delimitation is time-consuming and influenced by user subjectivity while, on the other hand, automatic boundary delimitation of volcanic terrains can be affected by irregular topography. In this work, the semi-automatic delimitation of volcanic edifice boundaries proposed by Grosse et al. (2009) for stratovolcanoes was tested for the first time on monogenetic cinder cones. The method, based on the integration of DEM-derived slope and curvature maps, is applied here to the Tancitaro-Nueva Italia region of the Michoacán-Guanajuato Volcanic Field (Mexico), where 309 Plio-Quaternary cinder cones are located. The semi-automatic extraction identified 137 of the 309 cinder cones of the Tancitaro-Nueva Italia region recognized by means of manual extraction, corresponding to 44.3% of the total. Analysis of vent alignments allowed us to identify NE-SW vent alignments and cone elongations, consistent with a NE-SW σ_max and a NW-SE σ_min. A vent intensity map, constructed by computing the number of vents within a radius r centred on each vent of the data set (with r = 5 km), yielded four vent intensity maxima: one positioned NW of Volcano Tancitaro, one to the NE, one to the S, and another vent cluster located at the SE boundary of the studied area. The spacing of the cluster centroids (24 km) can be related to the thickness of the crust (9-10 km) overlying the magma reservoir.
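
    The vent intensity statistic is straightforward to compute. A minimal sketch (vent coordinates hypothetical) using a k-d tree for the fixed-radius neighbor counts:

      import numpy as np
      from scipy.spatial import cKDTree

      def vent_intensity(coords_km, radius_km=5.0):
          """Number of other vents within radius_km of each vent.

          coords_km: (n, 2) easting/northing of cinder-cone vents in km.
          This is the per-vent count behind a vent intensity map.
          """
          tree = cKDTree(coords_km)
          # query_ball_point includes the vent itself, so subtract 1.
          return np.array([len(tree.query_ball_point(p, radius_km)) - 1
                           for p in coords_km])

      # Hypothetical demo: 300 vents scattered over a 60 x 60 km area.
      rng = np.random.default_rng(1)
      vents = rng.uniform(0.0, 60.0, size=(300, 2))
      counts = vent_intensity(vents)
      print("densest vent has", counts.max(), "neighbors within 5 km")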

  2. Robot computer problem solving system

    NASA Technical Reports Server (NTRS)

    Merriam, E. W.; Becker, J. D.

    1973-01-01

    A robot computer problem solving system which represents a robot exploration vehicle in a simulated Mars environment is described. The model exhibits changes and improvements made on a previously designed robot in a city environment. The Martian environment is modeled in Cartesian coordinates; objects are scattered about a plane; arbitrary restrictions on the robot's vision have been removed; and the robot's path contains arbitrary curves. New environmental features, particularly the visual occlusion of objects by other objects, were added to the model. Two different algorithms were developed for computing occlusion. Movement and vision capabilities of the robot were established in the Mars environment, using a LISP/FORTRAN interface for computational efficiency. The graphical display program was redesigned to reflect the change to the Mars-like environment.

  3. Computer access security code system

    NASA Technical Reports Server (NTRS)

    Collins, Earl R., Jr. (Inventor)

    1990-01-01

    A security code system for controlling access to computer and computer-controlled entry situations comprises a plurality of subsets of alpha-numeric characters disposed in random order in matrices of at least two dimensions forming theoretical rectangles, cubes, etc., such that when access is desired, at least one pair of previously unused character subsets not found in the same row or column of the matrix is chosen at random and transmitted by the computer. The proper response to gain access is transmittal of subsets which complete the rectangle, and/or a parallelepiped whose opposite corners were defined by first groups of code. Once used, subsets are not used again to absolutely defeat unauthorized access by eavesdropping, and the like.
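
    A minimal sketch of the scheme as described (matrix size, alphabet and function names are illustrative): characters are laid out uniquely in a 2-D matrix known to both sides, a challenge is a pair of characters whose cells share neither a row nor a column, the valid response is the pair of characters at the opposite corners of that rectangle, and challenges are never reused, so an eavesdropped exchange cannot be replayed.

      import random, string

      SIZE = 6
      cells = random.sample(string.ascii_uppercase + string.digits,
                            SIZE * SIZE)          # 36 unique characters
      matrix = [cells[i * SIZE:(i + 1) * SIZE] for i in range(SIZE)]
      pos = {ch: (r, c) for r, row in enumerate(matrix)
             for c, ch in enumerate(row)}
      used = set()                                # retired challenges

      def challenge():
          """Pick an unused pair of cells not sharing a row or column."""
          while True:
              a, b = random.sample(cells, 2)
              (r1, c1), (r2, c2) = pos[a], pos[b]
              if r1 != r2 and c1 != c2 and frozenset((a, b)) not in used:
                  used.add(frozenset((a, b)))
                  return a, b

      def verify(chal, response):
          """Valid response: the two characters completing the rectangle."""
          (r1, c1), (r2, c2) = pos[chal[0]], pos[chal[1]]
          return set(response) == {matrix[r1][c2], matrix[r2][c1]}

      # Demo: the authorized responder holds the same matrix.
      a, b = challenge()
      (r1, c1), (r2, c2) = pos[a], pos[b]
      print(verify((a, b), (matrix[r1][c2], matrix[r2][c1])))   # True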

  4. Visualizing Parallel Computer System Performance

    NASA Technical Reports Server (NTRS)

    Malony, Allen D.; Reed, Daniel A.

    1988-01-01

    Parallel computer systems are among the most complex of man's creations, making satisfactory performance characterization difficult. Despite this complexity, there are strong, indeed almost irresistible, incentives to quantify parallel system performance using a single metric. The fallacy lies in succumbing to such temptations. A complete performance characterization requires not only an analysis of the system's constituent levels, but also both static and dynamic characterizations. Static or average behavior analysis may mask transients that dramatically alter system performance. Although the human visual system is remarkably adept at interpreting and identifying anomalies in false color data, the importance of dynamic, visual scientific data presentation has only recently been recognized. Large, complex parallel systems pose equally vexing performance interpretation problems. Data from hardware and software performance monitors must be presented in ways that emphasize important events while eliding irrelevant details. Design approaches and tools for performance visualization are the subject of this paper.

  5. Computer-aided system design

    NASA Technical Reports Server (NTRS)

    Walker, Carrie K.

    1991-01-01

    A technique has been developed for combining features of a systems architecture design and assessment tool and a software development tool. This technique reduces simulation development time and expands simulation detail. The Architecture Design and Assessment System (ADAS), developed at the Research Triangle Institute, is a set of computer-assisted engineering tools for the design and analysis of computer systems. The ADAS system is based on directed graph concepts and supports the synthesis and analysis of software algorithms mapped to candidate hardware implementations. Greater simulation detail is provided by the ADAS functional simulator. With the functional simulator, programs written in either Ada or C can be used to provide a detailed description of graph nodes. A Computer-Aided Software Engineering tool developed at the Charles Stark Draper Laboratory (CSDL CASE) automatically generates Ada or C code from engineering block diagram specifications designed with an interactive graphical interface. A technique to use the tools together has been developed, which further automates the design process.

  6. Thermal Hydraulic Computer Code System.

    Energy Science and Technology Software Center (ESTSC)

    1999-07-16

    Version 00 RELAP5 was developed to describe the behavior of a light water reactor (LWR) subjected to postulated transients such as loss of coolant from large or small pipe breaks, pump failures, etc. RELAP5 calculates fluid conditions such as velocities, pressures, densities, qualities, temperatures; thermal conditions such as surface temperatures, temperature distributions, heat fluxes; pump conditions; trip conditions; reactor power and reactivity from point reactor kinetics; and control system variables. In addition to reactor applications, the program can be applied to transient analysis of other thermal-hydraulic systems with water as the fluid. This package contains RELAP5/MOD1/029 for CDC computers and RELAP5/MOD1/025 for VAX or IBM mainframe computers.

  7. Computing statistics for Hamiltonian systems

    NASA Astrophysics Data System (ADS)

    Tupper, P. F.

    2007-08-01

    We present the results of a set of numerical experiments designed to investigate the appropriateness of various integration schemes for molecular dynamics simulations. In particular, we wish to identify which numerical methods, when applied to an ergodic Hamiltonian system, sample the state-space in an unbiased manner. We do this by describing two Hamiltonian systems for which we can analytically compute some of the important statistical features of their trajectories, and then applying various numerical integration schemes to them. We can then compare the results from the numerical simulation against the exact results for the system and see how closely they agree. The statistic we study is the empirical distribution of particle velocity over long trajectories of the systems. We apply four methods: one symplectic method (Störmer-Verlet) and three energy-conserving step-and-project methods. The symplectic method performs better on both test problems, accurately computing empirical distributions for all step-lengths consistent with stability. Depending on the test system and the method, the step-and-project methods are either no longer ergodic for any step length (thus giving the wrong empirical distribution) or give the correct distribution only in the limit of step-size going to zero.
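
    The following sketch (a single harmonic oscillator with illustrative parameters, not the paper's test systems) shows the Störmer-Verlet scheme and the kind of empirical-versus-exact comparison described:

      import numpy as np

      def stormer_verlet(q, p, force, dt, steps):
          """Symplectic Stormer-Verlet integration of dq/dt = p, dp/dt = force(q)."""
          traj_p = np.empty(steps)
          for i in range(steps):
              p_half = p + 0.5 * dt * force(q)
              q = q + dt * p_half
              p = p_half + 0.5 * dt * force(q)
              traj_p[i] = p
          return traj_p

      # Harmonic oscillator H = p^2/2 + q^2/2 with E = 0.5. The exact
      # long-time average of p^2 is E (equipartition between q and p),
      # so the empirical second moment of the velocity should match it.
      p_traj = stormer_verlet(q=1.0, p=0.0, force=lambda q: -q,
                              dt=0.1, steps=200_000)
      print("empirical <p^2> =", (p_traj ** 2).mean(), " exact = 0.5")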

  8. CAESY - COMPUTER AIDED ENGINEERING SYSTEM

    NASA Technical Reports Server (NTRS)

    Wette, M. R.

    1994-01-01

    Many developers of software and algorithms for control system design have recognized that current tools have limits in both flexibility and efficiency. Many forces drive the development of new tools including the desire to make complex system modeling design and analysis easier and the need for quicker turnaround time in analysis and design. Other considerations include the desire to make use of advanced computer architectures to help in control system design, adopt new methodologies in control, and integrate design processes (e.g., structure, control, optics). CAESY was developed to provide a means to evaluate methods for dealing with user needs in computer-aided control system design. It is an interpreter for performing engineering calculations and incorporates features of both Ada and MATLAB. It is designed to be reasonably flexible and powerful. CAESY includes internally defined functions and procedures, as well as user defined ones. Support for matrix calculations is provided in the same manner as MATLAB. However, the development of CAESY is a research project, and while it provides some features which are not found in commercially sold tools, it does not exhibit the robustness that many commercially developed tools provide. CAESY is written in C-language for use on Sun4 series computers running SunOS 4.1.1 and later. The program is designed to optionally use the LAPACK math library. The LAPACK math routines are available through anonymous ftp from research.att.com. CAESY requires 4Mb of RAM for execution. The standard distribution medium is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. CAESY was developed in 1993 and is a copyrighted work with all copyright vested in NASA.

  9. Automated Computer Access Request System

    NASA Technical Reports Server (NTRS)

    Snook, Bryan E.

    2010-01-01

    The Automated Computer Access Request (AutoCAR) system is a Web-based account provisioning application that replaces the time-consuming paper-based computer-access request process at Johnson Space Center (JSC). AutoCAR combines rules-based and role-based functionality in one application to provide a centralized system that is easily and widely accessible. The system features a work-flow engine that facilitates request routing, a user registration directory containing contact information and user metadata, an access request submission and tracking process, and a system administrator account management component. This provides full, end-to-end disposition approval chain accountability from the moment a request is submitted. By blending both rules-based and role-based functionality, AutoCAR has the flexibility to route requests based on a user's nationality, JSC affiliation status, and other export-control requirements, while ensuring a user's request is addressed by either a primary or backup approver. All user accounts that are tracked in AutoCAR are recorded and mapped to the native operating system schema on the target platform where user accounts reside. This allows for future extensibility for supporting creation, deletion, and account management directly on the target platforms by way of AutoCAR. The system's directory-based lookup and day-to-day change analysis of directory information determines personnel moves, deletions, and additions, and automatically notifies a user via e-mail to revalidate his/her account access as a result of such changes. AutoCAR is a Microsoft classic active server page (ASP) application hosted on a Microsoft Internet Information Server (IIS).

  10. Research on computer systems benchmarking

    NASA Technical Reports Server (NTRS)

    Smith, Alan Jay (Principal Investigator)

    1996-01-01

    This grant addresses the topic of research on computer systems benchmarking and is more generally concerned with performance issues in computer systems. This report reviews work in those areas during the period of NASA support under this grant. The bulk of the work performed concerned benchmarking and analysis of CPUs, compilers, caches, and benchmark programs. The first part of this work concerned the issue of benchmark performance prediction. A new approach to benchmarking and machine characterization was reported, using a machine characterizer that measures the performance of a given system in terms of a Fortran abstract machine. Another report focused on analyzing compiler performance; the performance impact of optimization was assessed in the context of our methodology for CPU performance characterization, based on the abstract machine model. Benchmark programs are analyzed in another paper. A machine-independent model of program execution was developed to characterize both machine performance and program execution. By merging these machine and program characterizations, execution time can be estimated for arbitrary machine/program combinations. The work was continued into the domain of parallel and vector machines, including the issue of caches in vector processors and multiprocessors. All of the aforementioned accomplishments are more specifically summarized in this report, as well as those smaller in magnitude supported by this grant.

  11. Lower bounds on the computational efficiency of optical computing systems

    NASA Astrophysics Data System (ADS)

    Barakat, Richard; Reif, John

    1987-03-01

    A general model for determining the computational efficiency of optical computing systems, termed the VLSIO model, is described. It is a 3-dimensional generalization of the wire model of a 2-dimensional VLSI with optical beams (via Gabor's theorem) replacing the wires as communication channels. Lower bounds (in terms of simultaneous volume and time) on the computational resources of the VLSIO are obtained for computing various problems such as matrix multiplication.

  12. A face recognition embedded system

    NASA Astrophysics Data System (ADS)

    Pun, Kwok Ho; Moon, Yiu Sang; Tsang, Chi Chiu; Chow, Chun Tak; Chan, Siu Man

    2005-03-01

    This paper presents an experimental study of the implementation of a face recognition system in embedded systems. To investigate the feasibility and practicality of real-time face recognition on such systems, a door access control system based on face recognition is built. Due to the limited computational power of the embedded device, a semi-automatic scheme for face detection and eye location is proposed to solve these computationally hard problems. It is found that to achieve real-time performance, optimization of the core face recognition module is needed. As a result, extensive profiling is done to pinpoint the execution hotspots in the system, and optimizations are carried out. After careful precision analysis, all slow floating point calculations are replaced with their fixed-point versions. Experimental results show that real-time performance can be achieved without significant loss in recognition accuracy.
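
    The float-to-fixed substitution can be illustrated with a small sketch; the Q16.16 format and the dot-product inner loop are assumptions for illustration, not details taken from the paper:

      # Q16.16 fixed-point arithmetic: values are scaled integers, so the
      # embedded CPU never touches the FPU (or its software emulation).
      Q = 16                       # fractional bits
      ONE = 1 << Q

      def to_fx(x: float) -> int:
          return int(round(x * ONE))

      def fx_mul(a: int, b: int) -> int:
          # Wide intermediate product, renormalized back to Q16.16.
          return (a * b) >> Q

      # Example: a dot product between a probe feature vector and a
      # template, a typical matching inner loop (vectors illustrative).
      probe    = [to_fx(v) for v in (0.25, -0.50, 0.75)]
      template = [to_fx(v) for v in (0.30,  0.40, 0.80)]
      score = sum(fx_mul(a, b) for a, b in zip(probe, template))
      print(score / ONE)           # ~0.475, matching the float result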

  13. When does a physical system compute?

    PubMed Central

    Horsman, Clare; Stepney, Susan; Wagner, Rob C.; Kendon, Viv

    2014-01-01

    Computing is a high-level process of a physical system. Recent interest in non-standard computing systems, including quantum and biological computers, has brought this physical basis of computing to the forefront. There has been, however, no consensus on how to tell if a given physical system is acting as a computer or not, leading to confusion over novel computational devices and even claims that every physical event is a computation. In this paper, we introduce a formal framework that can be used to determine whether a physical system is performing a computation. We demonstrate how the abstract computational level interacts with the physical device level, in comparison with the use of mathematical models in experimental science. This powerful formulation allows a precise description of experiments, technology, computation and simulation, giving our central conclusion: physical computing is the use of a physical system to predict the outcome of an abstract evolution. We give conditions for computing, illustrated using a range of non-standard computing scenarios. The framework also covers broader computing contexts, where there is no obvious human computer user. We introduce the notion of a ‘computational entity’, and its critical role in defining when computing is taking place in physical systems. PMID:25197245

  14. Hydronic distribution system computer model

    SciTech Connect

    Andrews, J.W.; Strasser, J.J.

    1994-10-01

    A computer model of a hot-water boiler and its associated hydronic thermal distribution loop has been developed at Brookhaven National Laboratory (BNL). It is intended to be incorporated as a submodel in a comprehensive model of residential-scale thermal distribution systems developed at Lawrence Berkeley Laboratory (LBL). This will give the combined model the capability of modeling forced-air and hydronic distribution systems in the same house using the same supporting software. This report describes the development of the BNL hydronics model, initial results and internal consistency checks, and its intended relationship to the LBL model. A method of interacting with the LBL model that does not require physical integration of the two codes is described. This will provide capability now, with reduced up-front cost, as long as the number of runs required is not large.

  15. Computer systems and software engineering

    NASA Technical Reports Server (NTRS)

    Mckay, Charles W.

    1988-01-01

    The High Technologies Laboratory (HTL) was established in the fall of 1982 at the University of Houston Clear Lake. Research conducted at the High Tech Lab is focused upon computer systems and software engineering. There is a strong emphasis on the interrelationship of these areas of technology and the United States' space program. In Jan. of 1987, NASA Headquarters announced the formation of its first research center dedicated to software engineering. Operated by the High Tech Lab, the Software Engineering Research Center (SERC) was formed at the University of Houston Clear Lake. The High Tech Lab/Software Engineering Research Center promotes cooperative research among government, industry, and academia to advance the edge-of-knowledge and the state-of-the-practice in key topics of computer systems and software engineering which are critical to NASA. The center also recommends appropriate actions, guidelines, standards, and policies to NASA in matters pertinent to the center's research. Results of the research conducted at the High Tech Lab/Software Engineering Research Center have given direction to many decisions made by NASA concerning the Space Station Program.

  16. Knowledge-based system for computer security

    SciTech Connect

    Hunteman, W.J.

    1988-01-01

    The rapid expansion of computer security information and technology has provided little support for the security officer to identify and implement the safeguards needed to secure a computing system. The Department of Energy Center for Computer Security is developing a knowledge-based computer security system to provide expert knowledge to the security officer. The system is policy-based and incorporates a comprehensive list of system attack scenarios and safeguards that implement the required policy while defending against the attacks. 10 figs.

  17. Computer Aided Control System Design (CACSD)

    NASA Technical Reports Server (NTRS)

    Stoner, Frank T.

    1993-01-01

    The design of modern aerospace systems relies on the efficient utilization of computational resources and the availability of computational tools to provide accurate system modeling. This research focuses on the development of a computer aided control system design application which provides a full range of stability analysis and control design capabilities for aerospace vehicles.

  18. Software For Monitoring VAX Computer Systems

    NASA Technical Reports Server (NTRS)

    Farkas, Les; Don, Ken; Lavery, David; Baron, Amy

    1994-01-01

    VAX Continuous Monitoring System (VAXCMS) computer program developed at NASA Headquarters to aid system managers in monitoring performances of VAX computer systems through generation of graphic images summarizing trends in performance metrics over time. VAXCMS written in DCL and VAX FORTRAN for use with DEC VAX-series computers running VMS 5.1 or later.

  19. Using Expert Systems For Computational Tasks

    NASA Technical Reports Server (NTRS)

    Duke, Eugene L.; Regenie, Victoria A.; Brazee, Marylouise; Brumbaugh, Randal W.

    1990-01-01

    Transformation technique enables inefficient expert systems to run in real time. Paper suggests use of knowledge compiler to transform knowledge base and inference mechanism of expert-system computer program into conventional computer program. Main benefits: faster execution and reduced processing demands. In avionic systems, transformation reduces need for special-purpose computers.

  20. Computer Assisted Learning Systems in Pathology Teaching.

    ERIC Educational Resources Information Center

    Harkin, P. J. R.; And Others

    1986-01-01

    Describes the use of computer assisted instructional systems in the teaching of pathology. Explains the components of a typical computer-based system and compares interactive systems which use visual displays ranging from microfiche projectors to video discs. Discusses computer programs prepared for courses in general pathology and systemic…

  1. Transient Faults in Computer Systems

    NASA Technical Reports Server (NTRS)

    Masson, Gerald M.

    1993-01-01

    A powerful technique particularly appropriate for the detection of errors caused by transient faults in computer systems was developed. The technique can be implemented in either software or hardware; the research conducted thus far primarily considered software implementations. The error detection technique developed has the distinct advantage of having provably complete coverage of all errors caused by transient faults that affect the output produced by the execution of a program. In other words, the technique does not have to be tuned to a particular error model to enhance error coverage. Also, the correctness of the technique can be formally verified. The technique uses time and software redundancy. The foundation for an effective, low-overhead, software-based certification trail approach to real-time error detection resulting from transient fault phenomena was developed.
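
    The certification-trail idea can be sketched on a sorting task, a standard textbook illustration of the approach (this particular instance is illustrative, not taken from the report): the first execution produces its answer plus a trail, and a second, much cheaper execution uses the trail to verify the answer, so an error caused by a transient fault in either run is detected.

      def sort_with_trail(xs):
          """First execution: sort and record the permutation as a trail."""
          trail = sorted(range(len(xs)), key=lambda i: xs[i])
          return [xs[i] for i in trail], trail

      def certify(xs, result, trail):
          """Second execution: cheap O(n) check using the trail."""
          ok_perm = sorted(trail) == list(range(len(xs)))
          ok_vals = all(result[j] == xs[i] for j, i in enumerate(trail))
          ok_order = all(result[j] <= result[j + 1]
                         for j in range(len(result) - 1))
          return ok_perm and ok_vals and ok_order

      data = [5, 3, 8, 1]
      out, trail = sort_with_trail(data)
      assert certify(data, out, trail)        # passes for a correct run
      assert not certify(data, [1, 8, 3, 5], trail)   # corrupted output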

  2. Impact of new computing systems on finite element computations

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Storassili, O. O.; Fulton, R. E.

    1983-01-01

    Recent advances in computer technology that are likely to impact finite element computations are reviewed. The characteristics of supersystems, highly parallel systems, and small systems (mini and microcomputers) are summarized. The interrelations of numerical algorithms and software with parallel architectures are discussed. A scenario is presented for future hardware/software environment and finite element systems. A number of research areas which have high potential for improving the effectiveness of finite element analysis in the new environment are identified.

  3. System for Computer Automated Typesetting (SCAT) of Computer Authored Texts.

    ERIC Educational Resources Information Center

    Keeler, F. Laurence

    This description of the System for Computer Automated Typesetting (SCAT), an automated system for typesetting text and inserting special graphic symbols in programmed instructional materials created by the computer aided authoring system AUTHOR, provides an outline of the design architecture of the system and an overview including the component…

  4. Optimum spaceborne computer system design by simulation

    NASA Technical Reports Server (NTRS)

    Williams, T.; Kerner, H.; Weatherbee, J. E.; Taylor, D. S.; Hodges, B.

    1973-01-01

    A deterministic simulator is described which models the Automatically Reconfigurable Modular Multiprocessor System (ARMMS), a candidate computer system for future manned and unmanned space missions. Its use as a tool to study and determine the minimum computer system configuration necessary to satisfy the on-board computational requirements of a typical mission is presented. The paper describes how the computer system configuration is determined in order to satisfy the data processing demand of the various shuttle booster subsystems. The configuration which is developed as a result of studies with the simulator is optimal with respect to the efficient use of computer system resources.

  5. Technical Systems for Academic Computing.

    ERIC Educational Resources Information Center

    Watkins, Nellouise

    1980-01-01

    Numerous studies clearly indicate that Computer Assisted Instruction (CAI) is a successful teaching technique; students respond favorably and it saves learning time. A computer center director offers planning and buying criteria. (Author/TG)

  6. Digital optical computers at the optoelectronic computing systems center

    NASA Technical Reports Server (NTRS)

    Jordan, Harry F.

    1991-01-01

    The Digital Optical Computing Program within the National Science Foundation Engineering Research Center for Opto-electronic Computing Systems has as its specific goal research on optical computing architectures suitable for use at the highest possible speeds. The program can be targeted toward exploiting the time domain because other programs in the Center are pursuing research on parallel optical systems, exploiting optical interconnection and optical devices and materials. Using a general purpose computing architecture as the focus, we are developing design techniques, tools and architecture for operation at the speed of light limit. Experimental work is being done with the somewhat low speed components currently available but with architectures which will scale up in speed as faster devices are developed. The design algorithms and tools developed for a general purpose, stored program computer are being applied to other systems such as optimally controlled optical communication networks.

  7. Integrated Computer System of Management in Logistics

    NASA Astrophysics Data System (ADS)

    Chwesiuk, Krzysztof

    2011-06-01

    This paper aims at presenting a concept of an integrated computer system of management in logistics, particularly in supply and distribution chains. Consequently, the paper includes the basic idea of the concept of computer-based management in logistics and components of the system, such as CAM and CIM systems in production processes, and management systems for storage, materials flow, and for managing transport, forwarding and logistics companies. The platform which integrates computer-aided management systems is that of electronic data interchange.

  8. Buyer's Guide to Computer Based Instructional Systems.

    ERIC Educational Resources Information Center

    Fratini, Robert C.

    1981-01-01

    Examines the advantages and disadvantages of shared multiterminal computer based instruction (CBI) systems, dedicated multiterminal CBI systems, and stand-alone CBI systems. A series of questions guide consumers in matching a system's capabilities with an organization's needs. (MER)

  9. Teaching Environmental Systems Modelling Using Computer Simulation.

    ERIC Educational Resources Information Center

    Moffatt, Ian

    1986-01-01

    A computer modeling course in environmental systems and dynamics is presented. The course teaches senior undergraduates to analyze a system of interest, construct a system flow chart, and write computer programs to simulate real world environmental processes. An example is presented along with a course evaluation, figures, tables, and references.…

  10. Computer Programs For Automated Welding System

    NASA Technical Reports Server (NTRS)

    Agapakis, John E.

    1993-01-01

    Computer programs developed for use in controlling automated welding system described in MFS-28578. Together with control computer, computer input and output devices and control sensors and actuators, provide flexible capability for planning and implementation of schemes for automated welding of specific workpieces. Developed according to macro- and task-level programming schemes, which increases productivity and consistency by reducing amount of "teaching" of system by technician. System provides for three-dimensional mathematical modeling of workpieces, work cells, robots, and positioners.

  11. Specification of Computer Systems by Objectives.

    ERIC Educational Resources Information Center

    Eltoft, Douglas

    1989-01-01

    Discusses the evolution of mainframe and personal computers, and presents a case study of a network developed at the University of Iowa called the Iowa Computer-Aided Engineering Network (ICAEN) that combines Macintosh personal computers with Apollo workstations. Functional objectives are stressed as the best measure of system performance. (LRW)

  12. Reliability models for dataflow computer systems

    NASA Technical Reports Server (NTRS)

    Kavi, K. M.; Buckles, B. P.

    1985-01-01

    The demands for concurrent operation within a computer system and the representation of parallelism in programming languages have yielded a new form of program representation known as data flow (DENN 74, DENN 75, TREL 82a). A new model based on data flow principles for parallel computations and parallel computer systems is presented. Necessary conditions for liveness and deadlock freeness in data flow graphs are derived. The data flow graph is used as a model to represent asynchronous concurrent computer architectures including data flow computers.

  13. Symbolic computation in system simulation and design

    NASA Astrophysics Data System (ADS)

    Evans, Brian L.; Gu, Steve X.; Kalavade, Asa; Lee, Edward A.

    1995-06-01

    This paper examines some of the roles that symbolic computation plays in assisting system-level simulation and design. By symbolic computation, we mean programs like Mathematica that perform symbolic algebra and apply transformation rules based on algebraic identities. At a behavioral level, symbolic computation can compute parameters, generate new models, and optimize parameter settings. At the synthesis level, symbolic computation can work in tandem with synthesis tools to rewrite cascade and parallel combinations of components in subsystems to meet design constraints. Symbolic computation represents one type of tool that may be invoked in the complex flow of the system design process. The paper discusses the qualities that a formal infrastructure for managing system design should have. The paper also describes an implementation of this infrastructure called DesignMaker, implemented in the Ptolemy environment, which manages the flow of tool invocations in an efficient manner using a graphical file dependency mechanism.
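
    A small sketch of this kind of rewrite, using the open-source sympy package in place of Mathematica (the symbols and block structure are illustrative, not from the paper): two first-order LTI blocks are combined in cascade, parallel, and feedback form, and each interconnection is collapsed symbolically.

      import sympy as sp

      s, k1, k2, a, b = sp.symbols('s k1 k2 a b', positive=True)
      G1 = k1 / (s + a)            # first-order block
      G2 = k2 / (s + b)            # second block

      cascade  = sp.simplify(G1 * G2)              # series interconnection
      parallel = sp.simplify(G1 + G2)              # parallel interconnection
      feedback = sp.simplify(G1 / (1 + G1 * G2))   # feedback loop

      print(cascade)               # k1*k2/((a + s)*(b + s))
      print(parallel)
      print(feedback)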

  14. Optimum spaceborne computer system design by simulation

    NASA Technical Reports Server (NTRS)

    Williams, T.; Weatherbee, J. E.; Taylor, D. S.

    1972-01-01

    A deterministic digital simulation model is described which models the Automatically Reconfigurable Modular Multiprocessor System (ARMMS), a candidate computer system for future manned and unmanned space missions. Use of the model as a tool in configuring a minimum computer system for a typical mission is demonstrated. The configuration which is developed as a result of studies with the simulator is optimal with respect to the efficient use of computer system resources, i.e., the configuration derived is a minimal one. Other considerations such as increased reliability through the use of standby spares would be taken into account in the definition of a practical system for a given mission.

  15. Using the Computer in Systems Engineering Design

    ERIC Educational Resources Information Center

    Schmidt, W.

    1970-01-01

    With the aid of the programmed computer, the systems designer can analyze systems for which certain components have not yet been manufactured or even invented, and the power of solution-technique is greatly increased. (IR)

  16. Computers as Augmentative Communication Systems.

    ERIC Educational Resources Information Center

    Vanderheiden, Gregg C.

    The paper describes concepts and principles resulting in successful applications of computer technology to the needs of the disabled. The first part describes what a microcomputer is and is not, emphasizing the microcomputer as a machine that simply carries out instructions, the role of programming, and the use of prepared application programs.…

  17. Computer Literacy in a Distance Education System

    ERIC Educational Resources Information Center

    Farajollahi, Mehran; Zandi, Bahman; Sarmadi, Mohamadreza; Keshavarz, Mohsen

    2015-01-01

    In a Distance Education (DE) system, students must be equipped with seven skills of computer (ICDL) usage. This paper aims at investigating the effect of a DE system on the computer literacy of Master of Arts students at Tehran University. The design of this study is quasi-experimental. Pre-test and post-test were used in both control and…

  18. Computer controlled thermal fatigue test system

    SciTech Connect

    Schmale, D.T.; Jones, W.B.

    1986-01-01

    A servo-controlled hydraulic mechanical test system has been configured to conduct computer-controlled thermal fatigue tests. The system uses induction heating, a digital temperature controller, infrared pyrometry, forced air cooling, and quartz rod extensometry. In addition, a digital computer controls the tests and allows precise data analysis and interpretation.

  19. Computer-Controlled, Motorized Positioning System

    NASA Technical Reports Server (NTRS)

    Vargas-Aburto, Carlos; Liff, Dale R.

    1994-01-01

    Computer-controlled, motorized positioning system developed for use in robotic manipulation of samples in custom-built secondary-ion mass spectrometry (SIMS) system. Positions sample repeatably and accurately, even during analysis in three linear orthogonal coordinates and one angular coordinate under manual local control, or microprocessor-based local control or remote control by computer via general-purpose interface bus (GPIB).

  20. Interactive graphical computer-aided design system

    NASA Technical Reports Server (NTRS)

    Edge, T. M.

    1975-01-01

    System is used for design, layout, and modification of large-scale-integrated (LSI) metal-oxide semiconductor (MOS) arrays. System is structured around small computer which provides real-time support for graphics storage display unit with keyboard, slave display unit, hard copy unit, and graphics tablet for designer/computer interface.

  1. 32P-postlabeling assay for carcinogen-DNA adducts: description of beta shielding apparatus and semi-automatic spotting and washing devices that facilitate the handling of multiple samples.

    PubMed

    Reddy, M V; Blackburn, G R

    1990-04-01

    The utilization of the 32P-postlabeling assay in combination with TLC for the sensitive detection and estimation of aromatic DNA adducts has been increasing in the past few years. The procedure consists of 32P-labeling of carcinogen-adducted 3'-nucleotides in the DNA digests using [gamma-32P]ATP and polynucleotide kinase, separation of 32P-labeled adducts by TLC, and their detection by autoradiography. During both 32P-labeling and initial phases of TLC, a relatively high amount of [gamma-32P]ATP (3.0-4.5 mCi) is handled when 30 samples are processed simultaneously. We describe the design of acrylic shielding apparatus, semi-automatic TLC spotting devices, and devices for development and washing of multiple TLC plates, which not only provide substantial protection from exposure to 32P beta radiation, but also allow quick and easy handling of a large number of samples, thus expediting the assay workup and making it less labor-intensive. Specifically, the equipment includes: (i) a multi-tube carousel rack (7.5 cm diameter and 7.7 cm height) having 15 wells to hold capless Eppendorf tubes (0.5 ml) and a rotatable lid with an aperture to access individual tubes; (ii) a pipet shielder; (iii) two semi-automatic spotting devices to apply radioactive solutions to TLC plates; (iv) a multi-plate holder for TLC plates; and (v) a mechanical device for washing multiple TLC plates. Item (i) is small enough to be held in one hand, vortexed, and centrifuged to mix the solutions in each tube while beta radiation is shielded. Items (iii) to (v) aid in the automation of the assay. PMID:2323007

  2. Partitioning of regular computation on multiprocessor systems

    NASA Technical Reports Server (NTRS)

    Lee, Fung Fung

    1988-01-01

    Problem partitioning of regular computation over two-dimensional meshes on multiprocessor systems is examined. The regular computation model considered involves repetitive evaluation of values at each mesh point with local communication. The computational workload and the communication pattern are the same at each mesh point. The regular computation model arises in numerical solutions of partial differential equations and simulations of cellular automata. Given a communication pattern, a systematic way to generate a family of partitions is presented. The influence of various partitioning schemes on performance is compared on the basis of computation-to-communication ratio.
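
    The computation-to-communication ratio for two common partition families can be sketched directly; the 5-point stencil, unit work per mesh point, and boundary-length communication cost are assumptions for illustration, not the paper's exact model:

      import math

      def strip_ratio(N, P):
          """1-D strips: each processor holds N/P full-width rows."""
          comp = N * N / P             # points (work units) per processor
          comm = 2 * N                 # two full-width strip boundaries
          return comp / comm

      def block_ratio(N, P):
          """2-D blocks: square sub-blocks, P assumed a perfect square."""
          side = N / math.isqrt(P)
          comp = side * side
          comm = 4 * side              # four block edges
          return comp / comm

      N, P = 1024, 16
      print(strip_ratio(N, P), block_ratio(N, P))   # 32.0 vs 64.0:
      # blocks trade boundary length for more neighbors, and their
      # advantage grows with P for this local-communication model.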

  3. Aviation Safety Modeling and Simulation (ASMM) Propulsion Fleet Modeling: A Tool for Semi-Automatic Construction of CORBA-based Applications from Legacy Fortran Programs

    NASA Technical Reports Server (NTRS)

    Sang, Janche

    2003-01-01

    Within NASA's Aviation Safety Program, NASA GRC participates in the Modeling and Simulation Project called ASMM. NASA GRC's focus is to characterize propulsion system performance from a fleet management and maintenance perspective by modeling, and through simulation to predict the characteristics of two classes of commercial engines (CFM56 and GE90). In prior years, the High Performance Computing and Communication (HPCC) program funded NASA Glenn to develop a large-scale, detailed simulation for the analysis and design of aircraft engines called the Numerical Propulsion System Simulation (NPSS). Three major aspects of this modeling (the integration of different engine components, the coupling of multiple disciplines, and engine component zooming at the appropriate level of fidelity) require relatively tight coupling of different analysis codes. Most of these codes in aerodynamics and solid mechanics are written in Fortran. Refitting these legacy Fortran codes with distributed objects can increase their reusability. Aviation Safety's modeling and simulation use in characterizing fleet management has similar needs. The modeling and simulation of these propulsion systems use existing Fortran and C codes that are instrumental in determining the performance of the fleet. The research centers on building a CORBA-based development environment for programmers to easily wrap and couple legacy Fortran codes. This environment consists of a C++ wrapper library to hide the details of CORBA and an efficient remote variable scheme to facilitate data exchange between the client and the server model. Additionally, a Web Service model should also be constructed for evaluation of this technology's use over the next two to three years.

  4. Computer Bits: The Ideal Computer System for Your Center.

    ERIC Educational Resources Information Center

    Brown, Dennis; Neugebauer, Roger

    1986-01-01

    Reviews five computer systems that can address the needs of a child care center: (1) Sperry PC IT with Bernoulli Box, (2) Compaq DeskPro 286, (3) Macintosh Plus, (4) Epson Equity II, and (5) Leading Edge Model "D." (HOD)

  5. Computer Reconstruction of Plant Growth and Chlorophyll Fluorescence Emission in Three Spatial Dimensions

    PubMed Central

    Bellasio, Chandra; Olejníčková, Julie; Tesař, Radek; Šebela, David; Nedbal, Ladislav

    2012-01-01

    Plant leaves grow and change their orientation, as well as their emission of chlorophyll fluorescence, in time. All these dynamic plant properties can be semi-automatically monitored by a 3D imaging system that generates plant models by the method of coded light illumination, fluorescence imaging and computer 3D reconstruction. Here, we describe the essentials of the method, as well as the system hardware. We show that the technique can reconstruct, with high fidelity, the leaf size, the leaf angle and the plant height. The method fails with wilted plants when leaves overlap, obscuring their true area. This effect, naturally, also interferes when the method is applied to measure plant growth under water stress. The method is, however, very potent in capturing the plant dynamics under mild stress and without stress. The 3D reconstruction is also highly effective in correcting geometrical factors that distort measurements of chlorophyll fluorescence emission of naturally positioned plant leaves. PMID:22368511

  6. Autonomic Computing for Spacecraft Ground Systems

    NASA Technical Reports Server (NTRS)

    Li, Zhenping; Savkli, Cetin; Jones, Lori

    2007-01-01

    Autonomic computing for spacecraft ground systems increases system reliability and reduces the cost of spacecraft operations and software maintenance. In this paper, we present an autonomic computing solution for spacecraft ground systems at NASA Goddard Space Flight Center (GSFC), which consists of an open standard for a message-oriented architecture referred to as the GMSEC architecture (Goddard Mission Services Evolution Center), and an autonomic computing tool, the Criteria Action Table (CAT). This solution has been used in many upgraded ground systems for NASA's missions, and provides a framework for developing solutions with higher autonomic maturity.

  7. MTA Computer Based Evaluation System.

    ERIC Educational Resources Information Center

    Brenner, Lisa P.; And Others

    The MTA PLATO-based evaluation system, which has been implemented by a consortium of schools of medical technology, is designed to be general-purpose, modular, data-driven, and interactive, and to accommodate other national and local item banks. The system provides a comprehensive interactive item-banking system in conjunction with online student…

  8. Computer-Based Medical System

    NASA Technical Reports Server (NTRS)

    1998-01-01

    SYMED, Inc., developed a unique electronic medical records and information management system. The S2000 Medical Interactive Care System (MICS) incorporates both a comprehensive and interactive medical care support capability and an extensive array of digital medical reference materials in either text or high resolution graphic form. The system was designed, in cooperation with NASA, to improve the effectiveness and efficiency of physician practices. The S2000 is a Microsoft Windows-based software product which combines electronic forms, medical documents, records management, and features a comprehensive medical information system for medical diagnostic support and treatment. SYMED, Inc. offers access to its medical systems to all companies seeking competitive advantages.

  9. Computer Jet-Engine-Monitoring System

    NASA Technical Reports Server (NTRS)

    Disbrow, James D.; Duke, Eugene L.; Ray, Ronald J.

    1992-01-01

    "Intelligent Computer Assistant for Engine Monitoring" (ICAEM), computer-based monitoring system intended to distill and display data on conditions of operation of two turbofan engines of F-18, is in preliminary state of development. System reduces burden on propulsion engineer by providing single display of summary information on statuses of engines and alerting engineer to anomalous conditions. Effective use of prior engine-monitoring system requires continuous attention to multiple displays.

  10. Advanced Computed-Tomography Inspection System

    NASA Technical Reports Server (NTRS)

    Harris, Lowell D.; Gupta, Nand K.; Smith, Charles R.; Bernardi, Richard T.; Moore, John F.; Hediger, Lisa

    1993-01-01

    Advanced Computed Tomography Inspection System (ACTIS) is computed-tomography x-ray apparatus revealing internal structures of objects in wide range of sizes and materials. Three x-ray sources and adjustable scan geometry give system unprecedented versatility. Gantry contains translation and rotation mechanisms scanning x-ray beam through object inspected. Distance between source and detector towers varied to suit object. System used in such diverse applications as development of new materials, refinement of manufacturing processes, and inspection of components.

  11. Information-computational system: atmospheric chemistry

    NASA Astrophysics Data System (ADS)

    Adamov, Dmitri P.; Akhlyostin, Alexey Y.; Fazliev, Alexandre Z.; Gordov, Eugeni P.; Karyakin, Alexey S.; Mikhailov, Sergey A.; Rodimova, Olga B.

    1999-11-01

    The atmospheric chemistry information-computational system (ICS) with Internet access is presented. The ICS is aimed at summarizing fundamental data on atmospheric processes, determining the dynamics of complex chemical systems, and providing educational information. The system consists of three functional blocks: data preparation, computation, and information blocks, within which a user may choose the chemical reactions and atmospheric models, derive the relevant kinetic equations and conservation laws, solve the kinetic equations, visualize the results of calculations, and get access to related information.
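
    The workflow this abstract describes (choose reactions, derive the kinetic equations, solve them, visualize the results) amounts to building and integrating a coupled ODE system. Below is a minimal Python sketch of such a computation block, using a hypothetical two-reaction NOx/ozone scheme with illustrative rate constants; neither the chemistry nor the interface is the ICS's actual design.

      from scipy.integrate import solve_ivp

      # Hypothetical scheme (illustrative, not the ICS's reaction set):
      #   R1: O3 + NO -> NO2 + O2          rate r1 = k1*[O3][NO]
      #   R2: NO2 + hv -> NO + O3 (net)    rate r2 = j2*[NO2]
      # The O atom from photolysis is assumed to re-form O3 instantly.
      k1, j2 = 1.8e-14, 8.0e-3   # cm^3 s^-1 and s^-1, illustrative values

      def rhs(t, y):
          o3, no, no2 = y
          r1 = k1 * o3 * no
          r2 = j2 * no2
          return [-r1 + r2, -r1 + r2, r1 - r2]   # d[O3], d[NO], d[NO2] / dt

      y0 = [1.0e12, 2.5e10, 1.0e9]   # initial number densities, cm^-3
      sol = solve_ivp(rhs, (0.0, 3600.0), y0, method="LSODA")   # stiff-capable
      print(sol.y[:, -1])            # concentrations after one hour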

  12. Computer simulation of breathing systems for divers

    SciTech Connect

    Sexton, P.G.; Nuckols, M.L.

    1983-02-01

    A powerful new tool for the analysis and design of underwater breathing gas systems is being developed. A versatile computer simulator is described which makes possible the modular "construction" of any conceivable breathing gas system from computer memory-resident components. The analysis of a typical breathing gas system is demonstrated using this simulation technique, and the effects of system modifications on performance of the breathing system are shown. This modeling technique will ultimately serve as the foundation for a proposed breathing system simulator under development by the Navy. The marriage of this computer modeling technique with an interactive graphics system will provide the designer with an efficient, cost-effective tool for the development of new and improved diving systems.
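
    A minimal sketch of the modular "construction" idea, assuming each memory-resident component model can be reduced to a linearized flow resistance; the component names, values, and pressure-drop model are hypothetical stand-ins, not the simulator's component library.

      # A breathing loop is a list of component models evaluated in series;
      # reconfiguring the system is just editing the list.
      class Component:
          def __init__(self, name, resistance):   # linearized resistance (assumed)
              self.name, self.resistance = name, resistance

          def pressure_drop(self, flow):
              return self.resistance * flow

      loop = [Component("hose", 0.05), Component("scrubber", 0.30),
              Component("mouthpiece", 0.10)]
      flow = 60.0   # L/min, illustrative
      total = sum(c.pressure_drop(flow) for c in loop)
      print(f"loop pressure drop at {flow} L/min: {total:.1f} units")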

  13. High-performance computing and distributed systems

    SciTech Connect

    Loken, S.C.; Greiman, W.; Jacobson, V.L.; Johnston, W.E.; Robertson, D.W.; Tierney, B.L.

    1992-09-01

    We present a scenario for a fully distributed computing environment in which computing, storage, and I/O elements are configured on demand into "virtual systems" that are optimal for the solution of a particular problem. We also describe two pilot projects that illustrate some of the elements and issues of this scenario. The goal of this work is to make the most powerful computing systems those that are logically assembled from network-based components, and to make those systems available independent of the geographic location of the constituent elements.

  15. Method and system for benchmarking computers

    DOEpatents

    Gustafson, John L.

    1993-09-14

    A testing system and method for benchmarking computer systems. The system includes a store containing a scalable set of tasks to be performed to produce a solution in ever-increasing degrees of resolution as a larger number of the tasks are performed. A timing and control module allots to each computer a fixed benchmarking interval in which to perform the stored tasks. Means are provided for determining, after completion of the benchmarking interval, the degree of progress through the scalable set of tasks and for producing a benchmarking rating relating to the degree of progress for each computer.
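
    A minimal sketch of the fixed-time idea claimed here: every machine gets the same interval, and its rating is how far it progresses through a scalable task set. The task granularity and toy workload are assumptions for illustration, not the patent's actual benchmark.

      import time

      def benchmark(tasks, interval_s=10.0):
          # The deadline is checked between tasks, so task granularity
          # determines how precisely the interval is honored.
          deadline = time.monotonic() + interval_s
          completed = 0
          for task in tasks:
              if time.monotonic() >= deadline:
                  break
              task()           # one unit of the scalable workload
              completed += 1
          return completed     # degree of progress = benchmarking rating

      # Scalable task set: successively finer numerical integrations of x^2.
      def make_task(n):
          def task():
              h = 1.0 / n
              sum(h * (i * h) ** 2 for i in range(n))
          return task

      tasks = [make_task(2 ** k) for k in range(1, 24)]
      print("rating:", benchmark(tasks, interval_s=1.0))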

  16. Refurbishment program of HANARO control computer system

    SciTech Connect

    Kim, H. K.; Choe, Y. S.; Lee, M. W.; Doo, S. K.; Jung, H. S.

    2012-07-01

    HANARO, an open-tank-in-pool type research reactor with 30 MW thermal power, achieved its first criticality in 1995. The programmable controller system MLC (Multi Loop Controller) manufactured by MOORE has been used to control and regulate HANARO since 1995. We made a plan to replace the control computer because the system supplier no longer provided technical support and thus no spare parts were available. Aged and obsolete equipment and the shortage of spare parts could have caused serious problems. The first consideration of a replacement for the control computer dates back to 2007, when the supplier stopped producing MLC components, so that the system could no longer be guaranteed. We established the upgrade and refurbishment program in 2009 so as to keep HANARO up to date in terms of safety. We designed the new control computer system that would replace the MLC: the HANARO Control Computer System (HCCS). The refurbishment activity is in progress and will finish in 2013. The goal of the refurbishment program is a functional replacement of the reactor control system in consideration of suitable interfaces, compliance with no special outage for installation and commissioning, and no change to the well-proven operation philosophy. HCCS is a DCS (Discrete Control System) using PLCs manufactured by RTP. To enhance reliability, we adopt a triple processor system, a double I/O system, and a hot-swapping function. This paper describes the refurbishment program of the HANARO control system, including the design requirements of HCCS. (authors)

  17. Computer-aided dispatching system design specification

    SciTech Connect

    Briggs, M.G.

    1997-12-16

    This document defines the performance requirements for a graphic display dispatching system to support Hanford Patrol Operations Center. This document reflects the as-built requirements for the system that was delivered by GTE Northwest, Inc. This system provided a commercial off-the-shelf computer-aided dispatching system and alarm monitoring system currently in operation at the Hanford Patrol Operations Center, Building 2721E. This system also provides alarm back-up capability for the Plutonium Finishing Plant (PFP).

  18. Design of a modular digital computer system

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A Central Control Element (CCE) module, which controls the Automatically Reconfigurable Modular System (ARMS) and allows both redundant processing and multi-computing in the same computer with real-time mode switching, is discussed. The same hardware is used for reliability enhancement, speed enhancement, or a combination of both.

  19. A System for Cataloging Computer Software

    ERIC Educational Resources Information Center

    Pearson, Karl M., Jr.

    1973-01-01

    As a form of nonbook material, computer software can be cataloged and the collection managed by a library. The System Development Corporation (SDC) Technical Information Center has adapted the Anglo-American Cataloging Rules for descriptive cataloging of computer programs. (11 references) (Author/SJ)

  20. Design of a modular digital computer system

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A design tradeoff study is reported for a modular spaceborne computer system that is responsive to many mission types and phases. The computer uses redundancy to maximize reliability, and multiprocessing to maximize processing capacity. Fault detection and recovery features provide optimal reliability.

  1. Computational representation of biological systems

    SciTech Connect

    Frazier, Zach; McDermott, Jason E.; Guerquin, Michal; Samudrala, Ram

    2009-04-20

    Integration of large and diverse biological data sets is a daunting problem facing systems biology researchers. Exploring the complex issues of data validation, integration, and representation, we present a systematic approach for the management and analysis of large biological data sets based on data warehouses. Our system has been implemented in the Bioverse, a framework combining diverse protein information from a variety of knowledge areas such as molecular interactions, pathway localization, protein structure, and protein function.

  2. A computer primer: systems implementation.

    PubMed

    Alleyne, J

    1982-07-01

    It is important to recognize the process of implementing systems as a process of change. The hospital, through its steering committee, must manage this process, initiating change instead of responding to it. Only then will the implementation of information systems be an orderly process and the impact of these changes on the hospital's organization clearly controlled. The probability of success in implementing new systems would likely be increased if attention centers on gaining commitment to the project, gaining commitment to any changes necessitated by the new system, and assuring that the project is well defined and plans clearly specified. These issues, if monitored throughout the systems implementation, will lead to early identification of potential problems and probable failures. This greatly increases the chance of success. A probable failure, once identified, can be given specific attention to assure that associated problems are successfully resolved. The cost of this special attention, monitoring and managing systems implementation, is almost always much less than the cost of the eventual implementation failure. PMID:7106436

  3. Automatic system for computer program documentation

    NASA Technical Reports Server (NTRS)

    Simmons, D. B.; Elliott, R. W.; Arseven, S.; Colunga, D.

    1972-01-01

    Work done on a project to design an automatic system for computer program documentation aids is reported; a study was made to determine what existing programs could be used effectively to document computer programs. Results of the study are included in the form of an extensive bibliography and working papers on appropriate operating systems, text editors, program editors, data structures, standards, decision tables, flowchart systems, and proprietary documentation aids. The preliminary design for an automated documentation system is also included. An actual program has been documented in detail to demonstrate the types of output that can be produced by the proposed system.

  4. Computer Algebra Systems in Undergraduate Instruction.

    ERIC Educational Resources Information Center

    Small, Don; And Others

    1986-01-01

    Computer algebra systems (such as MACSYMA and muMath) can carry out many of the operations of calculus, linear algebra, and differential equations. Use of them with sketching graphs of rational functions and with other topics is discussed. (MNS)

  5. Computer automation for feedback system design

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Mathematical techniques and explanations of various steps used by an automated computer program to design feedback systems are summarized. Special attention was given to refining the automatic evaluation of suboptimal loop transmission and the translation of time-domain to frequency-domain specifications.

  6. Satellite system considerations for computer data transfer

    NASA Technical Reports Server (NTRS)

    Cook, W. L.; Kaul, A. K.

    1975-01-01

    Communications satellites will play a key role in the transmission of computer generated data through nationwide networks. This paper examines critical aspects of satellite system design as they relate to the computer data transfer task. In addition, it discusses the factors influencing the choice of error control technique, modulation scheme, multiple-access mode, and satellite beam configuration based on an evaluation of system requirements for a broad range of application areas including telemetry, terminal dialog, and bulk data transmission.

  7. Computer-aided surface roughness measurement system

    SciTech Connect

    Hughes, F.J.; Schankula, M.H.

    1983-11-01

    A diamond stylus profilometer with computer-based data acquisition/analysis system is being used to characterize surfaces of reactor components and materials, and to examine the effects of surface topography on thermal contact conductance. The current system is described; measurement problems and system development are discussed in general terms and possible future improvements are outlined.

  8. Computer Systems for Distributed and Distance Learning.

    ERIC Educational Resources Information Center

    Anderson, M.; Jackson, David

    2000-01-01

    Discussion of network-based learning focuses on a survey of computer systems for distributed and distance learning. Both Web-based systems and non-Web-based systems are reviewed in order to highlight some of the major trends of past projects and to suggest ways in which progress may be made in the future. (Contains 92 references.) (Author/LRW)

  9. Computer-Aided dispatching system design specification

    SciTech Connect

    Briggs, M.G.

    1996-05-03

    This document defines the performance requirements for a graphic display dispatching system to support Hanford Patrol emergency response. The system is defined as a commercial off-the-shelf computer dispatching system providing both text and graphical display information while interfacing with the diverse reporting systems within the Hanford Facility. The system also provides expansion capabilities to integrate Hanford Fire and the Occurrence Notification Center, and provides back-up capabilities for the Plutonium Processing Facility.

  10. Computational approaches for systems metabolomics.

    PubMed

    Krumsiek, Jan; Bartel, Jörg; Theis, Fabian J

    2016-06-01

    Systems genetics is defined as the simultaneous assessment and analysis of multi-omics datasets. In the past few years, metabolomics has been established as a robust tool describing an important functional layer in this approach. The metabolome of a biological system represents an integrated state of genetic and environmental factors and has been referred to as a 'link between genotype and phenotype'. In this review, we summarize recent progress in statistical analysis methods for metabolomics data in combination with other omics layers. We put a special focus on complex, multivariate statistical approaches as well as pathway-based and network-based analysis methods. Moreover, we outline current challenges and pitfalls of metabolomics-focused multi-omics analyses and discuss future steps for the field. PMID:27135552

  11. Managing secure computer systems and networks.

    PubMed

    Von Solms, B

    1996-10-01

    No computer system or computer network can today be operated without the necessary security measures to secure and protect the electronic assets stored, processed and transmitted using such systems and networks. Very often the effort of managing such security and protection measures is totally underestimated. This paper provides an overview of the security management needed to secure and protect a typical IT system and network. Special reference is made to this management effort in healthcare systems, and the role of the information security officer is also highlighted. PMID:8960921

  12. Airborne Advanced Reconfigurable Computer System (ARCS)

    NASA Technical Reports Server (NTRS)

    Bjurman, B. E.; Jenkins, G. M.; Masreliez, C. J.; Mcclellan, K. L.; Templeman, J. E.

    1976-01-01

    A digital computer subsystem fault-tolerant concept was defined, and the potential benefits and costs of such a subsystem were assessed when used as the central element of a new transport's flight control system. The derived advanced reconfigurable computer system (ARCS) is a triple-redundant computer subsystem that automatically reconfigures, under multiple fault conditions, from triplex to duplex to simplex operation, with redundancy recovery if the fault condition is transient. The study included criteria development covering factors at the aircraft's operation level that would influence the design of a fault-tolerant system for commercial airline use. A new reliability analysis tool was developed for evaluating redundant, fault-tolerant system availability and survivability; and a stringent digital system software design methodology was used to achieve design/implementation visibility.

  13. Computational studies of polymeric systems

    NASA Astrophysics Data System (ADS)

    Carrillo, Jan-Michael Y.

    Polymeric systems, including polyelectrolytes at surfaces and interfaces, semiflexible polyelectrolytes and biopolymers in solution, and complex polymeric systems with applications in nanotechnology, were modeled using coarse-grained molecular dynamics simulation. In the area of polyelectrolytes at surfaces and interfaces, the phenomenon of polyelectrolyte adsorption at an oppositely charged surface was investigated. Simulations found that the short-range van der Waals interaction was a major factor in determining the morphology and thickness of the adsorbed layer. Hydrophobic polyelectrolytes adsorbed on hydrophobic surfaces tend to be the most effective in forming multilayers because short-range attraction enhances the adsorption process. Adsorbed polyelectrolytes could move freely along the surface, in contrast to polyelectrolyte brushes. The morphologies of hydrophobic polyelectrolyte brushes were investigated, and simulations found that brushes had different morphologies depending on the strength of the short-range monomer-monomer attraction, the electrostatic interaction, and counterion condensation. Planar polyelectrolyte brushes formed: (1) vertically oriented cylindrical aggregates, (2) maze-like aggregate structures, or (3) a thin polymeric layer covering the substrate, while spherical polyelectrolyte brushes could take any of the previous morphologies or a micelle-like conformation with a dense core and a charged corona. In the area of biopolymers and semiflexible polyelectrolytes in solution, simulations demonstrated that the bending rigidity of these polymers is scale-dependent. The bond-bond correlation function describing a chain's orientational memory could be approximated by a sum of two exponential functions, manifesting the existence of two characteristic length scales. The existence of the two length scales challenges the current practice of describing chain-stretching experiments using a single length scale. In the field of nanotechnology

  14. Intelligent computational systems for space applications

    NASA Astrophysics Data System (ADS)

    Lum, Henry; Lau, Sonie

    Intelligent computational systems can be described as an adaptive computational system integrating both traditional computational approaches and artificial intelligence (AI) methodologies to meet the science and engineering data processing requirements imposed by specific mission objectives. These systems will be capable of integrating, interpreting, and understanding sensor input information; correlating that information to the "world model" stored within its data base and understanding the differences, if any; defining, verifying, and validating a command sequence to merge the "external world" with the "internal world model"; and, controlling the vehicle and/or platform to meet the scientific and engineering mission objectives. Performance and simulation data obtained to date indicate that the current flight processors baselined for many missions such as Space Station Freedom do not have the computational power to meet the challenges of advanced automation and robotics systems envisioned for the year 2000 era. Research issues which must be addressed to achieve greater than giga-flop performance for on-board intelligent computational systems have been identified, and a technology development program has been initiated to achieve the desired long-term system performance objectives.

  15. Lewis hybrid computing system, users manual

    NASA Technical Reports Server (NTRS)

    Bruton, W. M.; Cwynar, D. S.

    1979-01-01

    The Lewis Research Center's Hybrid Simulation Lab contains a collection of analog, digital, and hybrid (combined analog and digital) computing equipment suitable for the dynamic simulation and analysis of complex systems. This report is intended as a guide to users of these computing systems. The report describes the available equipment and outlines procedures for its use. Particular attention is given to the operation of the PACER 100 digital processor. System software to accomplish the usual digital tasks such as compiling, editing, etc., and Lewis-developed special-purpose software are described.

  16. Telemetry Computer System at Wallops Flight Center

    NASA Technical Reports Server (NTRS)

    Bell, H.; Strock, J.

    1980-01-01

    This paper describes the Telemetry Computer System in operation at NASA's Wallops Flight Center for real-time or off-line processing, storage, and display of telemetry data from rockets and aircraft. The system accepts one or two PCM data streams and one FM multiplex, converting each type of data into computer format and merging time-of-day information. A data compressor merges the active streams, and removes redundant data if desired. Dual minicomputers process data for display, while storing information on computer tape for further processing. Real-time displays are located at the station, at the rocket launch control center, and in the aircraft control tower. The system is set up and run by standard telemetry software under control of engineers and technicians. Expansion capability is built into the system to take care of possible future requirements.

  17. Data Integration in Computer Distributed Systems

    NASA Astrophysics Data System (ADS)

    Kwiecień, Błażej

    In this article the author analyzes the problem of data integration in computer distributed systems. The exchange of information between different levels in the integrated pyramid of an enterprise process is fundamental to efficient enterprise operation. Communication and data exchange between levels are not always uniform, owing to the use of different network protocols, communication media, system response times, etc.

  18. A New Computer-Based Examination System.

    ERIC Educational Resources Information Center

    Los Arcos, J. M.; Vano, E.

    1978-01-01

    Describes a computer-managed instructional system used to formulate, print, and evaluate true-false questions for testing purposes. The design of the system and its application in medical and nuclear engineering courses in two Spanish institutions of higher learning are detailed. (RAO)

  19. COMPUTER AIDED DESIGN OF DIFFUSED AERATION SYSTEMS

    EPA Science Inventory

    CADDAS (Computer Aided Design of Diffused Aeration Systems) is a microcomputer-based program that analyzes the cost and performance of diffused aeration used in activated sludge wastewater treatment systems. The program can analyze both coarse bubble and fine pore diffusers as we...

  20. Characterizing Computer Systems Used in Medical Education

    PubMed Central

    Church, V. E.; Tidball, C. S.

    1979-01-01

    The diversity in computer systems used in medical education is described, and the lack of consistent classifications and comparisons noted. A classification scheme based on those characteristics specific to the development and presentation of instructional software is proposed. A graphic system-profile approach is used to ensure clarity, while categorization of users and desirable features provides breadth and precision of coverage.

  1. Terrace Layout Using a Computer Assisted System

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Development of a web-based terrace design tool based on the MOTERR program is presented, along with representative layouts for conventional and parallel terrace systems. Using digital elevation maps and geographic information systems (GIS), this tool utilizes personal computers to rapidly construct ...

  2. Approach to constructing reconfigurable computer vision system

    NASA Astrophysics Data System (ADS)

    Xue, Jianru; Zheng, Nanning; Wang, Xiaoling; Zhang, Yongping

    2000-10-01

    In this paper, we propose an approach to constructing a reconfigurable vision system. We found that timely and efficient execution of early tasks can significantly enhance the performance of whole computer vision tasks, so we abstract out a set of basic, computationally intensive stream operations that may be performed in parallel and embody them in a series of specific front-end processors. These processors, based on FPGAs (field programmable gate arrays), can be reprogrammed to permit a range of different types of feature maps, such as edge detection and linking, and image filtering. The front-end processors and a powerful DSP constitute a computing platform which can perform many CV tasks. Additionally, we adopt focus-of-attention technologies to reduce the I/O and computational demands by performing early vision processing only within a particular region of interest. We then implement a multi-page, dual-ported image memory interface between the image input and the computing platform (including the front-end processors and DSP). Early vision features are loaded into banks of dual-ported image memory arrays, which are continually raster-scan updated at high speed from the input image or video data stream. Moreover, the computing platform has completely asynchronous, random access to the image data or any other early vision feature maps through the dual-ported memory banks. In this way, the computing platform resources can be properly allocated to a region of interest and decoupled from the task of dealing with a high-speed serial raster-scan input. Finally, we choose the PCI bus as the main channel between the PC and the computing platform. Consequently, the front-end processors' control registers and the DSP's program memory are mapped into the PC's memory space, which allows the user to reconfigure the system at any time. We also present test results of a computer vision application based on the system.

  3. Theoretical kinetic computations in complex reacting systems

    NASA Technical Reports Server (NTRS)

    Bittker, David A.

    1986-01-01

    NASA Lewis' studies of complex reacting systems at high temperature are discussed. The changes which occur are the result of many different chemical reactions occurring at the same time. Both an experimental and a theoretical approach are needed to fully understand what happens in these systems. The latter approach is discussed. The differential equations which describe the chemical and thermodynamic changes are given. Their solution by numerical techniques using a detailed chemical mechanism is described. Several different comparisons of computed results with experimental measurements are also given. These include the computation of (1) species concentration profiles in batch and flow reactions, (2) rocket performance in nozzle expansions, and (3) pressure versus time profiles in hydrocarbon ignition processes. The examples illustrate the use of detailed kinetic computations to elucidate a chemical mechanism and to compute practical quantities such as rocket performance, ignition delay times, and ignition lengths in flow processes.

  4. Computer surety: computer system inspection guidance. [Contains glossary

    SciTech Connect

    Not Available

    1981-07-01

    This document discusses computer surety in NRC-licensed nuclear facilities from the perspective of physical protection inspectors. It gives background information and a glossary of computer terms, along with threats and computer vulnerabilities, methods used to harden computer elements, and computer audit controls.

  5. Distributed-Computer System Optimizes SRB Joints

    NASA Technical Reports Server (NTRS)

    Rogers, James L., Jr.; Young, Katherine C.; Barthelemy, Jean-Francois M.

    1991-01-01

    Initial calculations of redesign of joint on solid rocket booster (SRB) that failed during Space Shuttle tragedy showed redesign increased weight. Optimization techniques applied to determine whether weight could be reduced while keeping joint closed and limiting stresses. Analysis system developed by use of existing software coupling structural analysis with optimization computations. Software designed executable on network of computer workstations. Took advantage of parallelism offered by finite-difference technique of computing gradients to enable several workstations to contribute simultaneously to solution of problem. Key features, effective use of redundancies in hardware and flexible software, enabling optimization to proceed with minimal delay and decreased overall time to completion.
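
    A minimal sketch of the parallelism exploited: each finite-difference gradient component needs one independent analysis run, so the perturbed evaluations can be farmed out to separate machines. Here a local process pool stands in for the workstation network, and a quadratic objective stands in for the SRB structural analysis.

      from concurrent.futures import ProcessPoolExecutor

      def objective(x):                      # stand-in for the structural analysis
          return sum((xi - 1.0) ** 2 for xi in x)

      def fd_gradient(f, x, h=1e-6):
          # One perturbed design point per gradient component; all of them
          # can be evaluated simultaneously.
          points = [x[:i] + [x[i] + h] + x[i + 1:] for i in range(len(x))]
          f0 = f(x)
          with ProcessPoolExecutor() as pool:
              f_perturbed = list(pool.map(f, points))
          return [(fp - f0) / h for fp in f_perturbed]

      if __name__ == "__main__":
          print(fd_gradient(objective, [0.0, 2.0, 3.0]))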

  6. Computed tomography of the central nervous system

    SciTech Connect

    Bentson, J.R.

    1982-01-01

    The objective of this chapter is to review the most pertinent articles published during the past year on the subject of computed tomography of the central nervous system. The chapter contains sections on pediatric computed tomography, and on the diagnostic use of CT in white matter disease, in infectious disease, for intracranial aneurysms, trauma, and intracranial tumors. Metrizamide flow studies and contrast enhancement are also examined. (KRM)

  7. System balance analysis for vector computers

    NASA Technical Reports Server (NTRS)

    Knight, J. C.; Poole, W. G., Jr.; Voight, R. G.

    1975-01-01

    The availability of vector processors capable of sustaining computing rates of 10 to the 8th power arithmetic results per second raised the question of whether peripheral storage devices representing current technology can keep such processors supplied with data. By examining the solution of a large banded linear system on these computers, it was found that even under ideal conditions, the processors will frequently be waiting for problem data.
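
    A back-of-envelope version of the balance argument, with assumed operand traffic (two 8-byte operands streamed per result):

      # At 1e8 arithmetic results per second, the processor demands ~1.6 GB/s
      # of sustained I/O -- orders of magnitude beyond the disks of the era,
      # so under these assumptions it frequently waits for problem data.
      results_per_s = 1e8
      bytes_per_result = 2 * 8   # assumed: two 64-bit operands per result
      print(f"required I/O: {results_per_s * bytes_per_result / 1e9:.1f} GB/s")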

  8. Fault tolerant hypercube computer system architecture

    NASA Technical Reports Server (NTRS)

    Madan, Herb S. (Inventor); Chow, Edward (Inventor)

    1989-01-01

    A fault-tolerant multiprocessor computer system of the hypercube type comprising a hierarchy of computers of like kind which can be functionally substituted for one another as necessary is disclosed. Communication between the working nodes is via one communications network while communications between the working nodes and watch dog nodes and load balancing nodes higher in the structure is via another communications network separate from the first. A typical branch of the hierarchy reporting to a master node or host computer comprises a plurality of first computing nodes; a first network of message conducting paths for interconnecting the first computing nodes as a hypercube. The first network provides a path for message transfer between the first computing nodes; a first watch dog node; and a second network of message connecting paths for connecting the first computing nodes to the first watch dog node independent from the first network, the second network provides an independent path for test message and reconfiguration affecting transfers between the first computing nodes and the first watch dog node. There is additionally a plurality of second computing nodes; a third network of message conducting paths for interconnecting the second computing nodes as a hypercube. The third network provides a path for message transfer between the second computing nodes; a fourth network of message conducting paths for connecting the second computing nodes to the first watch dog node independent from the third network. The fourth network provides an independent path for test message and reconfiguration affecting transfers between the second computing nodes and the first watch dog node; and a first multiplexer disposed between the first watch dog node and the second and fourth networks for allowing the first watch dog node to selectively communicate with individual ones of the computing nodes through the second and fourth networks; as well as a second watch dog node

  9. Computer-Aided dispatching system design specification

    SciTech Connect

    Briggs, M.G.

    1996-09-27

    This document defines the performance requirements for a graphic display dispatching system to support Hanford Patrol emergency response. This document outlines the negotiated requirements as agreed to by GTE Northwest during technical contract discussions. The system is defined as a commercial off-the-shelf computer dispatching system providing both text and graphic display information while interfacing with the diverse alarm reporting systems within the Hanford Site. The system provides expansion capability to integrate Hanford Fire and the Occurrence Notification Center, and also provides back-up capability for the Plutonium Processing Facility (PFP).

  10. Monitoring SLAC High Performance UNIX Computing Systems

    SciTech Connect

    Lettsome, Annette K.; /Bethune-Cookman Coll. /SLAC

    2005-12-15

    Knowledge of the effectiveness and efficiency of computers is important when working with high performance systems. The monitoring of such systems is advantageous in order to foresee possible misfortunes or system failures. Ganglia is a software system designed for high performance computing systems to retrieve specific monitoring information. An alternative storage facility for Ganglia's collected data is needed since its default storage system, the round-robin database (RRD), struggles with data integrity. The creation of a script-driven MySQL database solves this dilemma. This paper describes the process followed in the creation and implementation of the MySQL database for use by Ganglia. Comparisons between data storage by both databases are made using gnuplot and Ganglia's real-time graphical user interface.
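
    A minimal sketch of the script-driven relational store, assuming the monitoring output has already been parsed into (host, metric, value, timestamp) rows; the schema is hypothetical, and sqlite3 stands in for MySQL so the snippet is self-contained.

      import sqlite3   # stand-in DB-API backend; the project used MySQL

      conn = sqlite3.connect("metrics.db")
      conn.execute("""CREATE TABLE IF NOT EXISTS metrics (
          host TEXT, metric TEXT, value REAL, ts INTEGER)""")

      def store(rows):
          # Append rows instead of overwriting ring-buffer slots as RRD does,
          # so the full history is preserved.
          conn.executemany("INSERT INTO metrics VALUES (?, ?, ?, ?)", rows)
          conn.commit()

      store([("node01", "load_one", 0.42, 1133913600)])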

  11. Computer design and analysis of vacuum systems

    SciTech Connect

    Santeler, D.J.

    1987-07-01

    A computer program has been developed for an IBM compatible personal computer to assist in the design and analysis of vacuum systems. The program has a selection of 12 major schematics with several thousand minor variants incorporating diffusion, turbomolecular, cryogenic, ion, mechanical, and sorption pumps as well as circular tubes, bends, valves, traps, and purge gas connections. The gas throughput versus the inlet pressure of the pump is presented on a log-log graphical display. The conductance of each series component is sequentially added to the graph to obtain the net system behavior Q(P). The component conductances may be calculated either from the inlet area and the transmission probability or from the tube length and the diameter. The gas-flow calculations are valid for orifices, short tubes, and long tubes throughout the entire pressure range from molecular through viscous to choked and nonchoked exit flows. The roughing-pump and high-vacuum-pump characteristic curves are numerically integrated to provide a graphical presentation of the system pumpdown. Outgassing data for different materials is then combined to produce a graph of the net system "outgassing pressure." Computer routines are provided for differentiating a real pumpdown curve for system analysis. The computer program is included with the American Vacuum Society course, "Advanced Vacuum System Design and Analysis," or it may be purchased from Process Applications, Inc.
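
    The sequential addition of series components follows the reciprocal rule for conductances, and the chamber-side pumping speed combines the pump with the net conductance. A minimal sketch with illustrative molecular-flow values (the component conductances and pump speed are assumptions, not the program's data):

      def net_conductance(conductances):
          return 1.0 / sum(1.0 / c for c in conductances)   # 1/C = sum(1/Ci)

      def effective_speed(pump_speed, c_net):
          return 1.0 / (1.0 / pump_speed + 1.0 / c_net)     # speed at chamber

      components = [120.0, 300.0, 85.0]   # L/s: tube, bend, valve (assumed)
      S = effective_speed(1000.0, net_conductance(components))
      print(f"effective pumping speed: {S:.0f} L/s")
      # Throughput at chamber pressure P then follows as Q = S * P.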

  12. A semi-automatic method of generating subject-specific pediatric head finite element models for impact dynamic responses to head injury.

    PubMed

    Li, Zhigang; Han, Xiaoqiang; Ge, Hao; Ma, Chunsheng

    2016-07-01

    To account for the effects of realistic head morphological variation on the impact dynamic responses to head injury, it is necessary to develop multiple subject-specific pediatric head finite element (FE) models based on computed tomography (CT) or magnetic resonance imaging (MRI) scans. However, traditional manual model development is very time-consuming. In this study, a new automatic method was developed to extract anatomical points from pediatric head CT scans to represent pediatric head morphological features (head size/shape, skull thickness, and suture/fontanel width). Subsequently, a geometry-adaptive mesh morphing method based on radial basis functions was developed that can automatically morph a baseline pediatric head FE model into target FE models with geometries corresponding to the extracted head morphological features. In the end, five subject-specific head FE models of approximately six-month-olds (6MO) were automatically generated using the developed method. These validated models were employed to investigate differences in the head dynamic responses among subjects with different head morphologies. The results show that variations in head morphological features have a relatively large effect on pediatric head dynamic response. The results of this study indicate that pediatric head morphological variation should be taken into account when reconstructing pediatric head injuries due to traffic/fall accidents or child abuse using computational models, as well as when predicting head injury risk for children with obvious differences in head size and morphology. PMID:27058003
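
    A minimal sketch of radial-basis-function mesh morphing of the kind described: displacements known at extracted anatomical landmarks are interpolated to every node of the baseline mesh. The landmark coordinates and tiny "mesh" below are stand-ins, not the paper's anatomical points or head model.

      import numpy as np
      from scipy.interpolate import RBFInterpolator

      # Landmarks in the baseline geometry and their targets in the subject.
      src = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.],
                      [0., 0., 1.], [1., 1., 1.]])
      dst = np.array([[0., 0., 0.], [1.1, 0., 0.], [0., 0.9, 0.],
                      [0., 0., 1.05], [1.05, 1.0, 1.1]])

      # Fit an RBF to the landmark displacements, then apply it to all nodes.
      morph = RBFInterpolator(src, dst - src, kernel="thin_plate_spline")
      baseline_nodes = np.array([[0.5, 0.5, 0.5], [0.2, 0.1, 0.9]])
      morphed_nodes = baseline_nodes + morph(baseline_nodes)
      print(morphed_nodes)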

  13. 10 CFR Appendix J to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Automatic and Semi-Automatic Clothes...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... with an adaptive control system. Therefore, pursuant to 10 CFR 430.27, a waiver must be obtained to... adaptive control systems, must submit a petition for waiver pursuant to 10 CFR 430.27 to establish an... 10 Energy 3 2010-01-01 2010-01-01 false Uniform Test Method for Measuring the Energy...

  14. 10 CFR Appendix J to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Automatic and Semi-Automatic Clothes...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... with an adaptive control system. Therefore, pursuant to 10 CFR 430.27, a waiver must be obtained to... adaptive control systems, must submit a petition for waiver pursuant to 10 CFR 430.27 to establish an... 10 Energy 3 2011-01-01 2011-01-01 false Uniform Test Method for Measuring the Energy...

  15. 10 CFR Appendix J to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Automatic and Semi-Automatic Clothes...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... with an adaptive control system. Therefore, pursuant to 10 CFR 430.27, a waiver must be obtained to... cloth. The energy test cloth shall be clean and consist of the following: 2.6.1.1Pure finished bleached... adaptive control systems, must submit a petition for waiver pursuant to 10 CFR 430.27 to establish...

  16. Scalable Evolutionary Computation for Efficient Information Extraction from Remote Sensed Imagery

    NASA Astrophysics Data System (ADS)

    Almutairi, L. M.; Shetty, S.; Momm, H. G.

    2014-11-01

    Evolutionary computation, in the form of genetic programming, is used to aid the information extraction process from high-resolution satellite imagery in a semi-automatic fashion. Distributing and parallelizing the task of evaluating all candidate solutions during the evolutionary process could significantly reduce the inherent computational cost of evolving solutions that are composed of large multichannel images. In this study, we present the design and implementation of a system that leverages cloud-computing technology to expedite supervised solution development in a centralized evolutionary framework. The system uses the MapReduce programming model to implement a distributed version of the existing framework in a cloud-computing platform. The proposed system has two major subsystems: (i) data preparation, the generation of random spectral indices; and (ii) distributed processing, the distributed implementation of genetic programming, which is used to spectrally distinguish the features of interest from the remaining image background in the cloud-computing environment in order to improve scalability. The proposed system reduces response time by leveraging the vast computational and storage resources in a cloud-computing environment. The results demonstrate that distributing the candidate solutions reduces the execution time by 91.58%. These findings indicate that such technology could be applied to more complex problems that involve a larger population size and number of generations.
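
    A minimal sketch of the distributed-evaluation step: the "map" phase scores every candidate in parallel, and the serial evolutionary loop consumes the results. A local process pool stands in for the MapReduce layer, and the toy fitness function replaces the paper's spectral-index scoring.

      from concurrent.futures import ProcessPoolExecutor

      TARGET = [0.2, 0.8, 0.5]   # toy reference the candidates should match

      def fitness(candidate):    # stand-in for per-pixel spectral scoring
          return sum((g - t) ** 2 for g, t in zip(candidate, TARGET))

      population = [[0.1, 0.7, 0.4], [0.3, 0.9, 0.6], [0.2, 0.8, 0.5]]

      if __name__ == "__main__":
          with ProcessPoolExecutor() as pool:       # "map": score in parallel
              scores = list(pool.map(fitness, population))
          best_score, best = min(zip(scores, population))   # "reduce": select
          print(best_score, best)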

  17. Programmable hardware for reconfigurable computing systems

    NASA Astrophysics Data System (ADS)

    Smith, Stephen

    1996-10-01

    In 1945 the work of J. von Neumann and H. Goldstine created the principal architecture for electronic computation that has now lasted fifty years. Nevertheless alternative architectures have been created that have computational capability, for special tasks, far beyond that feasible with von Neumann machines. The emergence of high capacity programmable logic devices has made the realization of these architectures practical. The original ENIAC and EDVAC machines were conceived to solve special mathematical problems that were far from today's concept of 'killer applications.' In a similar vein programmable hardware computation is being used today to solve unique mathematical problems. Our programmable hardware activity is focused on the research and development of novel computational systems based upon the reconfigurability of our programmable logic devices. We explore our programmable logic architectures and their implications for programmable hardware. One programmable hardware board implementation is detailed.

  18. Distributed Storage Systems for Data Intensive Computing

    SciTech Connect

    Vazhkudai, Sudharshan S; Butt, Ali R; Ma, Xiaosong

    2012-01-01

    In this chapter, the authors present an overview of the utility of distributed storage systems in supporting modern applications that are increasingly becoming data intensive. Their coverage of distributed storage systems is based on the requirements imposed by data intensive computing and not a mere summary of storage systems. To this end, they delve into several aspects of supporting data-intensive analysis, such as data staging, offloading, checkpointing, and end-user access to terabytes of data, and illustrate the use of novel techniques and methodologies for realizing distributed storage systems therein. The data deluge from scientific experiments, observations, and simulations is affecting all of the aforementioned day-to-day operations in data-intensive computing. Modern distributed storage systems employ techniques that can help improve application performance, alleviate I/O bandwidth bottleneck, mask failures, and improve data availability. They present key guiding principles involved in the construction of such storage systems, associated tradeoffs, design, and architecture, all with an eye toward addressing challenges of data-intensive scientific applications. They highlight the concepts involved using several case studies of state-of-the-art storage systems that are currently available in the data-intensive computing landscape.

  19. NIF Integrated Computer Controls System Description

    SciTech Connect

    VanArsdall, P.

    1998-01-26

    This System Description introduces the NIF Integrated Computer Control System (ICCS). The architecture is sufficiently abstract to allow the construction of many similar applications from a common framework. As discussed below, over twenty software applications derived from the framework comprise the NIF control system. This document lays the essential foundation for understanding the ICCS architecture. The NIF design effort is motivated by the magnitude of the task. Figure 1 shows a cut-away rendition of the coliseum-sized facility. The NIF requires integration of about 40,000 atypical control points, must be highly automated and robust, and will operate continuously around the clock. The control system coordinates several experimental cycles concurrently, each at different stages of completion. Furthermore, facilities such as the NIF represent major capital investments that will be operated, maintained, and upgraded for decades. The computers, control subsystems, and functionality must be relatively easy to extend or replace periodically with newer technology.

  20. Scalable computer architecture for digital vascular systems

    NASA Astrophysics Data System (ADS)

    Goddard, Iain; Chao, Hui; Skalabrin, Mark

    1998-06-01

    Digital vascular computer systems are used for radiology and fluoroscopy (R/F), angiography, and cardiac applications. In the United States alone, about 26 million procedures of these types are performed annually: about 81% R/F, 11% cardiac, and 8% angiography. Digital vascular systems have a very wide range of performance requirements, especially in terms of data rates. In addition, new features are added over time as they are shown to be clinically efficacious. Application-specific processing modes such as roadmapping, peak opacification, and bolus chasing are particular to some vascular systems. New algorithms continue to be developed and proven, such as Cox and deJager's precise registration methods for masks and live images in digital subtraction angiography. A computer architecture must have high scalability and reconfigurability to meet the needs of this modality. Ideally, the architecture could also serve as the basis for a nonvascular R/F system.

  1. A Computer System for Mission Managers

    NASA Technical Reports Server (NTRS)

    Tolchin, Robert; Achar, Sathy; Yang, Tina; Lee, Tom

    1987-01-01

    Mission Managers' Workstation (MMW) is personal-computer-based system providing data management and reporting functions to assist Space Shuttle mission managers. Allows managers to relate events and stored data in timely and organized fashion. Using MMW, standard reports formatted, generated, edited, and electronically communicated with minimum clerical help. Written in PASCAL, BASIC, and assembler.

  2. Some Unexpected Results Using Computer Algebra Systems.

    ERIC Educational Resources Information Center

    Alonso, Felix; Garcia, Alfonsa; Garcia, Francisco; Hoya, Sara; Rodriguez, Gerardo; de la Villa, Agustin

    2001-01-01

    Shows how teachers can often use unexpected outputs from Computer Algebra Systems (CAS) to reinforce concepts and to show students the importance of thinking about how they use the software and reflecting on their results. Presents different examples where DERIVE, MAPLE, or Mathematica does not work as expected and suggests how to use them as a…

  3. Final Report Computational Analysis of Dynamical Systems

    SciTech Connect

    Guckenheimer, John

    2012-05-08

    This is the final report for DOE Grant DE-FG02-93ER25164, initiated in 1993. This grant supported research of John Guckenheimer on computational analysis of dynamical systems. During that period, seventeen individuals received PhD degrees under the supervision of Guckenheimer and over fifty publications related to the grant were produced. This document contains copies of these publications.

  4. Logical Access Control Mechanisms in Computer Systems.

    ERIC Educational Resources Information Center

    Hsiao, David K.

    The subject of access control mechanisms in computer systems is concerned with effective means to protect the anonymity of private information on the one hand, and to regulate the access to shareable information on the other hand. Effective means for access control may be considered on three levels: memory, process and logical. This report is a…

  5. A universal computer control system for motors

    NASA Technical Reports Server (NTRS)

    Szakaly, Zoltan F. (Inventor)

    1991-01-01

    A control system for a multi-motor system, such as a space telerobot, having a remote computational node and a local computational node interconnected with one another by a high-speed data link is described. A Universal Computer Control System (UCCS) for the telerobot is located at each node. Each node is provided with a multibus computer system which is characterized by a plurality of processors, with all processors being connected to a common bus, and including at least one command processor. The command processor communicates over the bus with a plurality of joint controller cards. A plurality of direct-current torque motors, of the type used in telerobot joints and telerobot hand-held controllers, are connected to the controller cards and respond to digital control signals from the command processor. Essential motor operating parameters are sensed by analog sensing circuits, and the sensed analog signals are converted to digital signals for storage at the controller cards, where such signals can be read during an address read/write cycle of the command processor.

  6. Space systems computer-aided design technology

    NASA Technical Reports Server (NTRS)

    Garrett, L. B.

    1984-01-01

    The interactive Design and Evaluation of Advanced Spacecraft (IDEAS) system is described, together with planned capability increases in the IDEAS system. The system's disciplines consist of interactive graphics and interactive computing. A single user at an interactive terminal can create, design, analyze, and conduct parametric studies of earth-orbiting satellites, which represents a timely and cost-effective method during the conceptual design phase where various missions and spacecraft options require evaluation. Spacecraft concepts evaluated include microwave radiometer satellites, communication satellite systems, solar-powered lasers, power platforms, and orbiting space stations.

  7. Adaptive Fuzzy Systems in Computational Intelligence

    NASA Technical Reports Server (NTRS)

    Berenji, Hamid R.

    1996-01-01

    In recent years, the interest in computational intelligence techniques, which currently include neural networks, fuzzy systems, and evolutionary programming, has grown significantly, and a number of their applications have been developed in government and industry. In the future, an essential element in these systems will be fuzzy systems that can learn from experience by using neural networks to refine their performance. The GARIC architecture, introduced earlier, is an example of a fuzzy reinforcement learning system which has been applied in several control domains, such as cart-pole balancing, simulation of Space Shuttle orbital operations, and tether control. A number of examples from GARIC's applications in these domains will be demonstrated.

  8. Integrative Genomics and Computational Systems Medicine

    SciTech Connect

    McDermott, Jason E.; Huang, Yufei; Zhang, Bing; Xu, Hua; Zhao, Zhongming

    2014-01-01

    The exponential growth in generation of large amounts of genomic data from biological samples has driven the emerging field of systems medicine. This field is promising because it improves our understanding of disease processes at the systems level. However, the field is still at an early stage. There exists a great need for novel computational methods and approaches to effectively utilize and integrate various omics data.

  9. A rule based computer aided design system

    NASA Technical Reports Server (NTRS)

    Premack, T.

    1986-01-01

    A Computer Aided Design (CAD) system is presented which supports the iterative process of design, the dimensional continuity between mating parts, and the hierarchical structure of the parts in their assembled configuration. Prolog, an interactive logic programming language, is used to represent and interpret the data base. The solid geometry representing the parts is defined in parameterized form using the swept volume method. The system is demonstrated with a design of a spring piston.

  10. Focus stacking: Comparing commercial top-end set-ups with a semi-automatic low budget approach. A possible solution for mass digitization of type specimens.

    PubMed

    Brecko, Jonathan; Mathys, Aurore; Dekoninck, Wouter; Leponce, Maurice; VandenSpiegel, Didier; Semal, Patrick

    2014-01-01

    In this manuscript we present a focus stacking system, composed of commercial photographic equipment. The system is inexpensive compared to high-end commercial focus stacking solutions. We tested this system and compared the results with several different software packages (CombineZP, Auto-Montage, Helicon Focus and Zerene Stacker). We tested our final stacked picture against pictures obtained from two high-end focus stacking solutions: a Leica MZ16A with DFC500 and a Leica Z6APO with DFC290. Zerene Stacker and Helicon Focus both provided satisfactory results. However, Zerene Stacker gives the user more possibilities in terms of control of the software, batch processing and retouching. The outcome of the test on high-end solutions demonstrates that our approach performs better in several ways. The resolution of the tested extended-focus pictures is much higher than those from the Leica systems. The flash lighting inside the Ikea closet creates an evenly illuminated picture, without struggling with filters, diffusers, etc. The largest benefit is the price of the set-up, approximately € 3,000, roughly one-eighth and one-tenth the price of the Leica Z6APO and Leica MZ16A set-ups, respectively. Overall, this enables institutions to purchase multiple solutions or to start digitising the type collection on a large scale even with a small budget. PMID:25589866

  12. 10 CFR Appendix J2 to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Automatic and Semi-Automatic Clothes...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... clothes washer with an adaptive control system. A waiver must be obtained pursuant to 10 CFR 430.27 to... cm by 86.4 ±1.3 cm) before washing. The energy test cloth shall be clean and shall not be used for... energy stuffer cloth shall be clean and shall not be used for more than 60 test runs...

  13. 10 CFR Appendix J1 to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Automatic and Semi-Automatic Clothes...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... control system. Therefore, pursuant to 10 CFR 430.27, a waiver must be obtained to establish an acceptable... Register in accordance with 5 U.S.C. 552(a) and 1 CFR part 51. Any subsequent amendment to a standard by... CFR 430.27 to establish an acceptable test procedure for that clothes washer. For these and...

  14. 10 CFR Appendix J2 to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Automatic and Semi-Automatic Clothes...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... clothes washer with an adaptive control system. A waiver must be obtained pursuant to 10 CFR 430.27 to... washing. The energy test cloth shall be clean and shall not be used for more than 60 test runs (after... 10 Energy 3 2013-01-01 2013-01-01 false Uniform Test Method for Measuring the Energy...

  15. 10 CFR Appendix J1 to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Automatic and Semi-Automatic Clothes...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... control system. Therefore, pursuant to 10 CFR 430.27, a waiver must be obtained to establish an acceptable...) before washing. The energy test cloth shall be clean and shall not be used for more than 60 test runs...) before washing. The energy stuffer cloth shall be clean and shall not be used for more than 60 test...

  16. Computer-aided protective system (CAPS)

    SciTech Connect

    Squire, R.K.

    1988-01-01

    A method of improving the security of materials in transit is described. The system provides a continuously monitored position location system for the transport vehicle, an internal computer-based geographic delimiter that makes continuous comparisons of actual positions with the preplanned routing and schedule, and a tamper detection/reaction system. The position comparison is utilized to institute preprogrammed reactive measures if the carrier is taken off course or schedule, penetrated, or otherwise interfered with. The geographic locator could be an independent internal platform or an external signal-dependent system utilizing GPS, Loran or a similar source of geographic information; a small (micro) computer could provide adequate memory and computational capacity; ensuring the integrity of the system indicates the need for a tamper-proof container and built-in intrusion sensors. A variant of the system could provide real-time transmission of the vehicle position and condition to a central control point; such transmission could be encrypted to preclude spoofing.

  17. Landauer bound for analog computing systems.

    PubMed

    Diamantini, M Cristina; Gammaitoni, Luca; Trugenberger, Carlo A

    2016-07-01

    By establishing a relation between information erasure and continuous phase transitions we generalize the Landauer bound to analog computing systems. The entropy production per degree of freedom during erasure of an analog variable (reset to a standard value) is given by the logarithm of the configurational volume measured in units of its minimal quantum. As a consequence, every computation has to be carried out with a finite number of bits, and infinite precision is forbidden by the fundamental laws of physics, since it would require an infinite amount of energy. PMID:27575108
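
    A hedged reading of the stated bound (notation ours, not the authors'): if the erased analog variable ranges over a configurational volume $V$ whose minimal quantum (resolution cell) is $v_{\min}$, then per degree of freedom

        \Delta S \;\ge\; k_B \ln\frac{V}{v_{\min}}, \qquad \Delta E \;\ge\; k_B T \ln\frac{V}{v_{\min}},

    which reduces to the familiar Landauer bound $k_B T \ln 2$ when $V/v_{\min} = 2$ (one bit) and diverges as $v_{\min} \to 0$, matching the abstract's conclusion that infinite precision would require infinite energy.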

  18. A look at computer system selection criteria

    NASA Technical Reports Server (NTRS)

    Poole, E. W.; Flowers, F. L.; Stanley, W. I. (Principal Investigator)

    1979-01-01

    There is no difficulty in identifying the criteria involved in the computer selection process; complexity arises in objectively evaluating various candidate configurations against the criteria, based on the user's specific needs. A model for formalizing the selection process consists of two major steps: verifying that the candidate configuration is adequate to the user's programming requirements, and determining an overall system evaluation rating based on cost, usability, adaptability, and availability. A 36 step instruction for computer sizing evaluation is included in the appendix along with a sample application of the configuration adequacy model. Selection criteria and the weighting process are also discussed.
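
    The two-step model lends itself to a compact illustration. The sketch below assumes hypothetical weights and scores for the four named criteria; neither the values nor the 36-step sizing procedure come from the report.

        # Hypothetical weighted-rating sketch of the two-step selection model:
        # a candidate must first pass the adequacy check, then is rated by a
        # weighted sum over the four criteria named in the abstract.
        WEIGHTS = {"cost": 0.35, "usability": 0.25, "adaptability": 0.20, "availability": 0.20}

        def overall_rating(scores, adequate):
            """scores: criterion -> value in [0, 10]; adequate: adequacy-check result."""
            if not adequate:          # step 1: configuration must meet programming requirements
                return None
            return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)  # step 2: weighted rating

        print(overall_rating({"cost": 7, "usability": 8, "adaptability": 6, "availability": 9}, True))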

  19. Cluster Computing for Embedded/Real-Time Systems

    NASA Technical Reports Server (NTRS)

    Katz, D.; Kepner, J.

    1999-01-01

    Embedded and real-time systems, like other computing systems, seek to maximize computing power for a given price, and thus can significantly benefit from the advancing capabilities of cluster computing.

  20. Embedded systems for supporting computer accessibility.

    PubMed

    Mulfari, Davide; Celesti, Antonio; Fazio, Maria; Villari, Massimo; Puliafito, Antonio

    2015-01-01

    Nowadays, customized AT software solutions allow their users to interact with various kinds of computer systems. Such tools are generally available on personal devices (e.g., smartphones, laptops and so on) commonly used by a person with a disability. In this paper, we investigate a way of using such AT equipment to access many different devices that lack assistive preferences. The solution takes advantage of open source hardware and its core component consists of an affordable Linux embedded system: it grabs data coming from the assistive software, which runs on the user's personal device, and then, after processing, it generates native keyboard and mouse HID commands for the target computing device controlled by the end user. This process supports any operating system available on the target machine and requires no specialized software installation; therefore the user with a disability can rely on a single assistive tool to control a wide range of computing platforms, including conventional computers and many kinds of mobile devices, which receive input commands through the USB HID protocol. PMID:26294501
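
    The paper gives no code; on Linux, one common way to emit the kind of native keyboard HID reports described is the USB HID gadget interface. The sketch below is an assumption-level illustration: the device node /dev/hidg0, the 8-byte boot-keyboard report layout, and the keycode table are not taken from the paper.

        # Minimal sketch: send one keypress ('a') as an 8-byte USB HID boot-keyboard
        # report. Assumes a Linux USB gadget already configured to expose /dev/hidg0.
        KEY_A = 0x04  # HID usage ID for 'a'

        def send_key(keycode, dev="/dev/hidg0"):
            press = bytes([0, 0, keycode, 0, 0, 0, 0, 0])   # modifier, reserved, 6 key slots
            release = bytes(8)                              # all zeros = no keys pressed
            with open(dev, "wb") as hid:
                hid.write(press)
                hid.flush()
                hid.write(release)

        send_key(KEY_A)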

  1. National Ignition Facility integrated computer control system

    NASA Astrophysics Data System (ADS)

    Van Arsdall, Paul J.; Bettenhausen, R. C.; Holloway, Frederick W.; Saroyan, R. A.; Woodruff, J. P.

    1999-07-01

    The NIF design team is developing the Integrated Computer Control System (ICCS), which is based on an object-oriented software framework applicable to event-driven control systems. The framework provides an open, extensible architecture that is sufficiently abstract to construct future mission-critical control systems. The ICCS will become operational when the first 8 out of 192 beams are activated in mid 2000. The ICCS consists of 300 front-end processors attached to 60,000 control points coordinated by a supervisory system. Computers running either Solaris or VxWorks are networked over a hybrid configuration of switched fast Ethernet and asynchronous transfer mode (ATM). ATM carries digital motion video from sensors to operator consoles. Supervisory software is constructed by extending the reusable framework components for each specific application. The framework incorporates services for database persistence, system configuration, graphical user interface, status monitoring, event logging, scripting language, alert management, and access control. More than twenty collaborating software applications are derived from the common framework. The framework is interoperable among different kinds of computers and functions as a plug-in software bus by leveraging a common object request brokering architecture (CORBA). CORBA transparently distributes the software objects across the network. Because of the pivotal role played, CORBA was tested to ensure adequate performance.

  2. National Ignition Facility integrated computer control system

    SciTech Connect

    Van Arsdall, P.J., LLNL

    1998-06-01

    The NIF design team is developing the Integrated Computer Control System (ICCS), which is based on an object-oriented software framework applicable to event-driven control systems. The framework provides an open, extensible architecture that is sufficiently abstract to construct future mission-critical control systems. The ICCS will become operational when the first 8 out of 192 beams are activated in mid 2000. The ICCS consists of 300 front-end processors attached to 60,000 control points coordinated by a supervisory system. Computers running either Solaris or VxWorks are networked over a hybrid configuration of switched fast Ethernet and asynchronous transfer mode (ATM). ATM carries digital motion video from sensors to operator consoles. Supervisory software is constructed by extending the reusable framework components for each specific application. The framework incorporates services for database persistence, system configuration, graphical user interface, status monitoring, event logging, scripting language, alert management, and access control. More than twenty collaborating software applications are derived from the common framework. The framework is interoperable among different kinds of computers and functions as a plug-in software bus by leveraging a common object request brokering architecture (CORBA). CORBA transparently distributes the software objects across the network. Because of the pivotal role played, CORBA was tested to ensure adequate performance.

  3. Autonomous Systems, Robotics, and Computing Systems Capability Roadmap: NRC Dialogue

    NASA Technical Reports Server (NTRS)

    Zornetzer, Steve; Gage, Douglas

    2005-01-01

    Contents include the following: Introduction. Process, Mission Drivers, Deliverables, and Interfaces. Autonomy. Crew-Centered and Remote Operations. Integrated Systems Health Management. Autonomous Vehicle Control. Autonomous Process Control. Robotics. Robotics for Solar System Exploration. Robotics for Lunar and Planetary Habitation. Robotics for In-Space Operations. Computing Systems. Conclusion.

  4. Risk analysis of computer system designs

    NASA Technical Reports Server (NTRS)

    Vallone, A.

    1981-01-01

    Adverse events during implementation can affect the final capabilities, schedule and cost of a computer system even though the system was accurately designed and evaluated. Risk analysis enables the manager to forecast the impact of those events and to request design revisions or contingency plans in good time, before making any decision. This paper presents a structured procedure for an effective risk analysis. The procedure identifies the required activities, separates subjective assessments from objective evaluations, and defines a risk measure to determine the analysis results. The procedure is consistent with the system design evaluation and enables a meaningful comparison among alternative designs.
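
    The paper's risk measure is described only abstractly here; as a hedged stand-in, a minimal sketch computes the expected impact of adverse implementation events per candidate design so that alternatives can be compared (all numbers hypothetical).

        # Stand-in risk measure: expected impact of adverse implementation events,
        # computed per design so alternative designs can be compared.
        def risk(events):
            """events: list of (probability, impact) pairs for one candidate design."""
            return sum(p * impact for p, impact in events)

        design_a = [(0.2, 100.0), (0.05, 400.0)]   # (probability, cost impact in $k)
        design_b = [(0.4, 60.0)]
        print(risk(design_a), risk(design_b))      # lower is better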

  5. Transient upset models in computer systems

    NASA Technical Reports Server (NTRS)

    Mason, G. M.

    1983-01-01

    Essential factors for the design of transient upset monitors for computers are discussed. The upset is a system level event that is software dependent. It can occur in the program flow, the opcode set, the opcode address domain, the read address domain, and the write address domain. Most upsets are in the program flow. It is shown that simple, external monitors functioning transparently relative to the system operations can be built if a detailed accounting is made of the characteristics of the faults that can happen. Sample applications are provided for different states of Z-80 and 8085 based systems.

  6. Analog system for computing sparse codes

    DOEpatents

    Rozell, Christopher John; Johnson, Don Herrick; Baraniuk, Richard Gordon; Olshausen, Bruno A.; Ortman, Robert Lowell

    2010-08-24

    A parallel dynamical system for computing sparse representations of data, i.e., where the data can be fully represented in terms of a small number of non-zero code elements, and for reconstructing compressively sensed images. The system is based on the principles of thresholding and local competition and solves a family of sparse approximation problems corresponding to various sparsity metrics. The system utilizes Locally Competitive Algorithms (LCAs), in which nodes in a population continually compete with neighboring units using (usually one-way) lateral inhibition to calculate coefficients representing an input in an overcomplete dictionary.
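
    The patent abstract outlines the LCA dynamics only in words; a standard formulation (our sketch, using soft thresholding and NumPy) is:

        # Sketch of a Locally Competitive Algorithm (LCA) for sparse coding:
        # leaky integrator states u compete through lateral inhibition (Phi^T Phi - I);
        # active coefficients a are a soft-thresholded copy of u.
        import numpy as np

        def lca(x, Phi, lam=0.1, tau=10.0, n_steps=200):
            """x: input vector; Phi: (dim, n_atoms) overcomplete dictionary."""
            G = Phi.T @ Phi - np.eye(Phi.shape[1])     # lateral inhibition weights
            b = Phi.T @ x                              # feedforward drive
            u = np.zeros(Phi.shape[1])                 # internal node states
            a = np.zeros_like(u)
            for _ in range(n_steps):
                a = np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)  # soft threshold
                u += (b - u - G @ a) / tau                         # competitive dynamics
            return a

        rng = np.random.default_rng(0)
        Phi = rng.standard_normal((16, 64))
        Phi /= np.linalg.norm(Phi, axis=0)             # unit-norm dictionary atoms
        x = Phi[:, :3].sum(axis=1)                     # signal built from three atoms
        print(np.count_nonzero(lca(x, Phi)), "active coefficients")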

  7. Computing Lyapunov exponents of switching systems

    NASA Astrophysics Data System (ADS)

    Guglielmi, Nicola; Protasov, Vladimir

    2016-06-01

    We discuss a new approach for constructing polytope Lyapunov functions for continuous-time linear switching systems. The method we propose makes it possible to decide the uniform stability of a switching system and to compute the Lyapunov exponent with arbitrary precision. The method relies on the discretization of the system and provides - for any given discretization stepsize - a lower and an upper bound for the Lyapunov exponent. The efficiency of the new method is illustrated by numerical examples. For a more extensive discussion we refer the reader to [8].
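
    The authors' algorithm builds polytope Lyapunov functions; the sketch below illustrates only the cruder bracketing idea suggested by the abstract, bounding the Lyapunov exponent by worst-case products of matrix exponentials over a discretization step (our illustration, not the method of the paper).

        # Crude bracketing sketch: for x' = M_sigma x, discretize with step h and
        # bound the Lyapunov exponent via worst-case products of exp(M_i * h).
        # Norms of length-k products give an upper bound; spectral radii a lower one.
        import itertools
        import numpy as np
        from scipy.linalg import expm

        def lyapunov_bounds(Ms, h=0.1, depth=6):
            A = [expm(M * h) for M in Ms]
            upper = -np.inf
            lower = -np.inf
            for word in itertools.product(range(len(A)), repeat=depth):
                P = np.eye(A[0].shape[0])
                for i in word:
                    P = A[i] @ P
                upper = max(upper, np.log(np.linalg.norm(P, 2)) / (depth * h))
                lower = max(lower, np.log(np.max(np.abs(np.linalg.eigvals(P)))) / (depth * h))
            return lower, upper

        M1 = np.array([[-1.0, 2.0], [0.0, -1.0]])
        M2 = np.array([[-1.0, 0.0], [2.0, -1.0]])
        print(lyapunov_bounds([M1, M2]))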

  8. Computer-controlled radiation monitoring system

    SciTech Connect

    Homann, S.G.

    1994-09-27

    A computer-controlled radiation monitoring system was designed and installed at the Lawrence Livermore National Laboratory's Multiuser Tandem Laboratory (10 MV tandem accelerator from High Voltage Engineering Corporation). The system continuously monitors the photon and neutron radiation environment associated with the facility and automatically suspends accelerator operation if preset radiation levels are exceeded. The system has provided reliable real-time radiation monitoring over the past five years, and has been a valuable tool for keeping personnel exposure as low as reasonably achievable.

  9. Semi-automatic 2D-to-3D conversion of human-centered videos enhanced by age and gender estimation

    NASA Astrophysics Data System (ADS)

    Fard, Mani B.; Bayazit, Ulug

    2014-01-01

    In this work, we propose a feasible 3D video generation method to enable high quality visual perception using a monocular uncalibrated camera. Anthropometric distances between standard face landmarks are approximated based on the person's age and gender. These measurements are used in a 2-stage approach to facilitate the construction of binocular stereo images. Specifically, one view of the background is registered in the initial stage of video shooting. It is followed by an automatically guided displacement of the camera toward its secondary position. At the secondary position the real-time capturing is started and the foreground (viewed person) region is extracted for each frame. After an accurate parallax estimation, the extracted foreground is placed in front of the background image that was captured at the initial position. So the constructed full view of the initial position, combined with the view of the secondary (current) position, forms the complete binocular pair during real-time video shooting. The subjective evaluation results demonstrate competent depth perception quality for the proposed system.

  10. Computer Visual System (CVS) reference manual

    SciTech Connect

    Snider, D.M.; Wagner, K.L.

    1985-01-01

    This is a reference manual for the Idaho National Engineering Laboratory (INEL) Computer Visual System (CVS). The manual contains a summary of the operation of the CVS program and describes the hardware requirements. It explains the main features and operation of the program as well as how to communicate with the host computer. CVS is a high performance color graphics system. This system enables color pictures to be generated and edited on a color raster terminal and subsequently output as 35mm color transparencies and gray-scale or black and white hard copies. CVS provides processors for the construction of word slides, tables, charts, and sketches. Commands for the creation and editing of images are entered by picking items from menus displayed on the terminal screen. The CVS program executes on a Control Data Corporation (CDC) CYBER 176 computer under the NOS operating system. The user interacts with CVS through a terminal, entering commands from a keyboard and tablet. Currently, only the Advanced Electronic Design (AED) line of color terminals is supported for graphic input to CVS. As other terminals become available, modules to drive these terminals will be included.

  11. Human systems dynamics: Toward a computational model

    NASA Astrophysics Data System (ADS)

    Eoyang, Glenda H.

    2012-09-01

    A robust and reliable computational model of complex human systems dynamics could support advancements in theory and practice for social systems at all levels, from intrapersonal experience to global politics and economics. Models of human interactions have evolved from traditional, Newtonian systems assumptions, which served a variety of practical and theoretical needs of the past. Another class of models has been inspired and informed by models and methods from nonlinear dynamics, chaos, and complexity science. None of the existing models, however, is able to represent the open, high dimension, and nonlinear self-organizing dynamics of social systems. An effective model will represent interactions at multiple levels to generate emergent patterns of social and political life of individuals and groups. Existing models and modeling methods are considered and assessed against characteristic pattern-forming processes in observed and experienced phenomena of human systems. A conceptual model, CDE Model, based on the conditions for self-organizing in human systems, is explored as an alternative to existing models and methods. While the new model overcomes the limitations of previous models, it also provides an explanatory base and foundation for prospective analysis to inform real-time meaning making and action taking in response to complex conditions in the real world. An invitation is extended to readers to engage in developing a computational model that incorporates the assumptions, meta-variables, and relationships of this open, high dimension, and nonlinear conceptual model of the complex dynamics of human systems.

  12. Computer aided system engineering for space construction

    NASA Technical Reports Server (NTRS)

    Racheli, Ugo

    1989-01-01

    This viewgraph presentation covers the following topics. Construction activities envisioned for the assembly of large platforms in space (as well as interplanetary spacecraft and bases on extraterrestrial surfaces) require computational tools that exceed the capability of conventional construction management programs. The Center for Space Construction is investigating the requirements for new computational tools and, at the same time, suggesting the expansion of graduate and undergraduate curricula to include proficiency in Computer Aided Engineering (CAE) through design courses and individual or team projects in advanced space systems design. In the center's research, special emphasis is placed on problems of constructability and of the interruptability of planned activity sequences to be carried out by crews operating under hostile environmental conditions. The departure point for the planned work is the acquisition of the MCAE I-DEAS software, developed by the Structural Dynamics Research Corporation (SDRC), and its expansion to the level of capability denoted by the acronym IDEAS**2 currently used for configuration maintenance on Space Station Freedom. In addition to improving proficiency in the use of I-DEAS and IDEAS**2, it is contemplated that new software modules will be developed to expand the architecture of IDEAS**2. Such modules will deal with those analyses that require the integration of a space platform's configuration with a breakdown of planned construction activities and with a failure modes analysis to support computer aided system engineering (CASE) applied to space construction.

  13. TU-A-9A-06: Semi-Automatic Segmentation of Skin Cancer in High-Frequency Ultrasound Images: Initial Comparison with Histology

    SciTech Connect

    Gao, Y; Li, X; Fishman, K; Yang, X; Liu, T

    2014-06-15

    Purpose: In skin-cancer radiotherapy, the assessment of skin lesions is challenging, particularly with important features such as depth and width hard to determine. The aim of this study is to develop an interactive segmentation method to delineate the tumor boundary using high-frequency ultrasound images and to correlate the segmentation results with the histopathological tumor dimensions. Methods: We analyzed 6 patients with a total of 10 skin lesions involving the face, scalp, and hand. The patients' various skin lesions were scanned using a high-frequency ultrasound system (Episcan, LONGPORT, INC., PA, U.S.A), with a 30-MHz single-element transducer. The lateral resolution was 14.6 micron and the axial resolution was 3.85 micron for the ultrasound image. Semi-automatic image segmentation was performed to extract the cancer region, using a robust statistics driven active contour algorithm. The corresponding histology images were also obtained after tumor resection and served as the reference standards in this study. Results: Eight out of the 10 lesions were successfully segmented. The ultrasound tumor delineation correlates well with the histology assessment in all measurements, such as depth, size, and shape. The depths measured by ultrasound have an average of 9.3% difference compared with those in the histology images. The remaining 2 cases suffered from mismatches between the pathology and ultrasound images. Conclusion: High-frequency ultrasound is a noninvasive, accurate and easily accessible modality for imaging skin cancer. Our segmentation method, combined with high-frequency ultrasound technology, provides a promising tool to estimate the extent of the tumor to guide the radiotherapy procedure and monitor treatment response.

  14. Reconfigurable neuromorphic computation in biochemical systems.

    PubMed

    Chiang, Hui-Ju Katherine; Jiang, Jie-Hong R; Fages, Francois

    2015-08-01

    Implementing application-specific computation and control tasks within a biochemical system has been an important pursuit in synthetic biology. Most synthetic designs to date have focused on realizing systems of fixed functions using specifically engineered components, thus lacking flexibility to adapt to uncertain and dynamically-changing environments. To remedy this limitation, an analog and modularized approach to realize reconfigurable neuromorphic computation with biochemical reactions is presented. We propose a biochemical neural network consisting of neuronal modules and interconnects that are both reconfigurable through external or internal control over the concentrations of certain molecular species. Case studies on classification and machine learning applications using DNA strand displacement technology demonstrate the effectiveness of our design in both reconfiguration and autonomous adaptation. PMID:26736417

  15. Checkpoint triggering in a computer system

    DOEpatents

    Cher, Chen-Yong

    2016-09-06

    According to an aspect, a method for triggering creation of a checkpoint in a computer system includes executing a task in a processing node of the computer system and determining whether it is time to read a monitor associated with a metric of the task. The monitor is read to determine a value of the metric based on determining that it is time to read the monitor. A threshold for triggering creation of the checkpoint is determined based on the value of the metric. Based on determining that the value of the metric has crossed the threshold, the checkpoint including state data of the task is created to enable restarting execution of the task upon a restart operation.
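
    A minimal sketch of the claimed flow, with every name and the threshold rule hypothetical:

        # Minimal sketch: periodically read a task metric, derive a threshold from
        # it, and checkpoint when the metric crosses that threshold. The task object
        # (done/step/state) and the doubling rule are hypothetical illustrations.
        import pickle, time

        def run_with_checkpoints(task, read_metric, interval=1.0, path="ckpt.bin"):
            threshold = None
            next_read = time.monotonic() + interval
            while not task.done():
                task.step()                              # execute the task
                if time.monotonic() < next_read:         # not yet time to read the monitor
                    continue
                next_read = time.monotonic() + interval
                metric = read_metric(task)               # read the monitor
                if threshold is None or metric > threshold:
                    if threshold is not None:            # metric crossed the threshold:
                        with open(path, "wb") as f:      # checkpoint state data for restart
                            pickle.dump(task.state(), f)
                    threshold = 2 * metric               # hypothetical rule: new threshold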

  16. Interactive computer-enhanced remote viewing system

    SciTech Connect

    Tourtellott, J.A.; Wagner, J.F.

    1995-10-01

    Remediation activities such as decontamination and decommissioning (D&D) typically involve materials and activities hazardous to humans. Robots are an attractive way to conduct such remediation, but for efficiency they need a good three-dimensional (3-D) computer model of the task space where they are to function. This model can be created from engineering plans and architectural drawings and from empirical data gathered by various sensors at the site. The model is used to plan robotic tasks and verify that selected paths are clear of obstacles. This report describes the development of an Interactive Computer-Enhanced Remote Viewing System (ICERVS), a software system to provide a reliable geometric description of a robotic task space, and enable robotic remediation to be conducted more effectively and more economically.

  17. Traffic study of a computer system.

    NASA Technical Reports Server (NTRS)

    Cramer, R. L.

    1973-01-01

    A study which may guide the operations of existing computer installations, as well as the design of future networks, is described. Performance data and evaluations are considered with reference to interarrival time, users' habits, waiting time for execution, time spent in a partition, figures of merit, and states of the system. The analysis of the variables proceeds from examination of typical data with appropriate statistical tests to conclusions about the possible state of nature.

  18. Computer systems to support census geography.

    PubMed

    Deecker, G; Cunningham, R; Kidd, K

    1986-01-01

    The development of computer systems that support the geography of the census in Canada is described. The authors "present proposals for the 1991 Census that increase the level of automation, whether the geography changes or not. In addition, opportunities for change and improvement, should the geography change, are outlined. To minimize the risks involved, it is important to give these different proposals serious consideration early in the planning of a census." PMID:12341362

  19. Visual Turing test for computer vision systems.

    PubMed

    Geman, Donald; Geman, Stuart; Hallonquist, Neil; Younes, Laurent

    2015-03-24

    Today, computer vision systems are tested by their accuracy in detecting and localizing instances of objects. As an alternative, and motivated by the ability of humans to provide far richer descriptions and even tell a story about an image, we construct a "visual Turing test": an operator-assisted device that produces a stochastic sequence of binary questions from a given test image. The query engine proposes a question; the operator either provides the correct answer or rejects the question as ambiguous; the engine proposes the next question ("just-in-time truthing"). The test is then administered to the computer-vision system, one question at a time. After the system's answer is recorded, the system is provided the correct answer and the next question. Parsing is trivial and deterministic; the system being tested requires no natural language processing. The query engine employs statistical constraints, learned from a training set, to produce questions with essentially unpredictable answers-the answer to a question, given the history of questions and their correct answers, is nearly equally likely to be positive or negative. In this sense, the test is only about vision. The system is designed to produce streams of questions that follow natural story lines, from the instantiation of a unique object, through an exploration of its properties, and on to its relationships with other uniquely instantiated objects. PMID:25755262
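
    The described selection rule, producing questions with essentially unpredictable answers, can be sketched as choosing the candidate whose predicted answer probability, given the history, is closest to 0.5 (the predictor and questions below are toy stand-ins):

        # Sketch of the query engine's selection rule: among candidate binary
        # questions, propose the one whose answer is least predictable given the
        # history of (question, answer) pairs.
        def next_question(candidates, history, predict_yes):
            """predict_yes(q, history) -> estimated P(answer is yes | history)."""
            return min(candidates, key=lambda q: abs(predict_yes(q, history) - 0.5))

        # Hypothetical usage with a toy predictor:
        history = [("is there a person?", True)]
        questions = ["is the person walking?", "is there a person?"]
        toy = lambda q, h: 1.0 if (q, True) in h else 0.55
        print(next_question(questions, history, toy))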

  20. Telementoring and teleparamedic communication platforms and robotic systems for battlefield biomedical applications

    NASA Astrophysics Data System (ADS)

    Jannson, Tomasz P.; Kostrzewski, Andrew A.; Zeltser, Gregory; Forrester, Thomas

    2000-08-01

    A new approach to C4, in the form of supercomputer-path soft communication and computing (SC2), provides an enabling technology baseline for teleparamedic and telementoring communication platforms and robotic systems. In particular, this new information technology offers full-duplex 2-D and/or 3-D wireless communication and interactive telepresence, as well as remotely-controlled semi-automatic sensing, within the so-called telementoring scheme, a specific brand of telemedicine. In this paper, we discuss the SC2 capabilities, including 20-times higher compression of digital multimedia data (especially digital video) than prior art, with computing power greater than that of 100 Pentiums. The further extension of SC2 technologies, combined with a nearly-autonomous teleparamedic scheme, is also discussed.

  1. Computing using delayed feedback systems: towards photonics

    NASA Astrophysics Data System (ADS)

    Appeltant, L.; Soriano, M. C.; Van der Sande, G.; Danckaert, J.; Massar, S.; Dambre, J.; Schrauwen, B.; Mirasso, C. R.; Fischer, I.

    2012-06-01

    Reservoir computing has recently been introduced as a new paradigm in the field of machine learning. It is based on the dynamical properties of a network of randomly connected nodes or neurons and shows to be very promising for solving complex classification problems in a computationally efficient way. The key idea is that an input generates nonlinear transient behavior, rendering transient reservoir states suitable for linear classification. Our goal is to study to what extent systems with delay, and especially photonic systems, can be used as reservoirs. Recently a new architecture has been proposed [1], based on a single nonlinear node with delayed feedback. An electronic [1] and an opto-electronic implementation [2, 3] have been demonstrated and both have proven to be very successful in terms of performance. This simple configuration, which replaces an entire network of randomly connected nonlinear nodes with one single hardware node and a delay line, is significantly easier to implement experimentally. It is no longer necessary to construct an entire network of hundreds or even thousands of circuits, each one representing a node. With this approach one node and a delay line suffice to construct a computational unit. In this manuscript, we present a further investigation of the properties of delayed feedback configurations used as a reservoir. Instead of quantifying the performance as an error obtained for a certain benchmark, we now investigate a task-independent property, the linear memory of the system.
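
    A minimal sketch of such a delay-line reservoir, with the input multiplexed over virtual nodes by a random mask (node model and parameters are illustrative, not those of the cited implementations):

        # Sketch of a single-nonlinearity delay-line reservoir: each input sample is
        # multiplied by a random mask and fed sequentially to one nonlinear node;
        # the N "virtual node" states along the delay line form the reservoir state,
        # which a linear readout can then classify.
        import numpy as np

        def delay_reservoir(u, n_virtual=50, eta=0.5, gamma=0.05, seed=0):
            rng = np.random.default_rng(seed)
            mask = rng.choice([-0.1, 0.1], size=n_virtual)
            x = np.zeros(n_virtual)                 # states along the delay line
            states = []
            for sample in u:
                prev = x.copy()                     # state one delay period ago
                for i in range(n_virtual):
                    # each virtual node sees its delayed state plus masked input
                    x[i] = np.tanh(eta * prev[i] + gamma * mask[i] * sample)
                states.append(x.copy())
            return np.array(states)                 # (time, n_virtual) readout features

        u = np.sin(np.linspace(0, 20, 200))
        print(delay_reservoir(u).shape)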

  2. Tools for Embedded Computing Systems Software

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and the key figures of each workshop presentation, together with chairmen's summaries, is presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  3. Some queuing network models of computer systems

    NASA Technical Reports Server (NTRS)

    Herndon, E. S.

    1980-01-01

    Queuing network models of a computer system operating with a single workload type are presented. Program algorithms are adapted for use on the Texas Instruments SR-52 programmable calculator. By slightly altering the algorithm to process the G and H matrices row by row instead of column by column, six devices and an unlimited job/terminal population could be handled on the SR-52. Techniques are also introduced for handling a simple load dependent server and for studying interactive systems with fixed multiprogramming limits.
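
    The G and H tables suggest a convolution-style algorithm; a standard single-class convolution (Buzen's algorithm, our sketch rather than the report's SR-52 program) computes the normalizing constants one device at a time:

        # Sketch of Buzen's convolution algorithm for a closed queueing network with
        # fixed-rate servers: G[n] is the normalizing constant for n jobs, built one
        # device at a time via g(n,k) = g(n,k-1) + d_k * g(n-1,k).
        def buzen_G(demands, n_jobs):
            """demands: per-device service demands; returns G[0..n_jobs]."""
            G = [1.0] + [0.0] * n_jobs
            for d in demands:                       # one device at a time
                for n in range(1, n_jobs + 1):
                    G[n] = G[n] + d * G[n - 1]      # in-place update realizes the recursion
            return G

        G = buzen_G([0.2, 0.4, 0.1], 5)
        throughput = G[4] / G[5]                    # X(N) = G(N-1) / G(N)
        print(G, throughput)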

  4. View southeast of computer controlled energy monitoring system. System replaced ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View southeast of computer controlled energy monitoring system. System replaced strip chart recorders and other instruments under the direct observation of the load dispatcher. - Thirtieth Street Station, Load Dispatch Center, Thirtieth & Market Streets, Railroad Station, Amtrak (formerly Pennsylvania Railroad Station), Philadelphia, Philadelphia County, PA

  5. System/360 Computer Assisted Network Scheduling (CANS) System

    NASA Technical Reports Server (NTRS)

    Brewer, A. C.

    1972-01-01

    Computer assisted scheduling techniques that produce conflict-free and efficient schedules have been developed and implemented to meet the needs of the Manned Space Flight Network. The CANS system provides effective management of resources in a complex scheduling environment. The system is an automated resource scheduling, controlling, planning, and information storage and retrieval tool.

  6. Interactive computer-enhanced remote viewing system

    SciTech Connect

    Tourtellott, J.A.; Wagner, J.F.

    1995-12-01

    Remediation activities such as decontamination and decommissioning (D&D) typically involve materials and activities hazardous to humans. Robots are an attractive way to conduct such remediation, but for efficiency they need a good three-dimensional (3-D) computer model of the task space where they are to function. This model can be created from engineering plans and architectural drawings and from empirical data gathered by various sensors at the site. The model is used to plan robotic tasks and verify that selected paths are clear of obstacles. This need for a task space model is most pronounced in the remediation of obsolete production facilities and underground storage tanks. Production facilities at many sites contain compact process machinery and systems that were used to produce weapons grade material. For many such systems, a complex maze of pipes (with potentially dangerous contents) must be removed, and this represents a significant D&D challenge. In an analogous way, the underground storage tanks at sites such as Hanford represent a challenge because of their limited entry and the tumbled profusion of in-tank hardware. In response to this need, the Interactive Computer-Enhanced Remote Viewing System (ICERVS) is being designed as a software system to: (1) Provide a reliable geometric description of a robotic task space, and (2) Enable robotic remediation to be conducted more effectively and more economically than with available techniques. A system such as ICERVS is needed because of the problems discussed below.

  7. Visual Turing test for computer vision systems

    PubMed Central

    Geman, Donald; Geman, Stuart; Hallonquist, Neil; Younes, Laurent

    2015-01-01

    Today, computer vision systems are tested by their accuracy in detecting and localizing instances of objects. As an alternative, and motivated by the ability of humans to provide far richer descriptions and even tell a story about an image, we construct a “visual Turing test”: an operator-assisted device that produces a stochastic sequence of binary questions from a given test image. The query engine proposes a question; the operator either provides the correct answer or rejects the question as ambiguous; the engine proposes the next question (“just-in-time truthing”). The test is then administered to the computer-vision system, one question at a time. After the system’s answer is recorded, the system is provided the correct answer and the next question. Parsing is trivial and deterministic; the system being tested requires no natural language processing. The query engine employs statistical constraints, learned from a training set, to produce questions with essentially unpredictable answers—the answer to a question, given the history of questions and their correct answers, is nearly equally likely to be positive or negative. In this sense, the test is only about vision. The system is designed to produce streams of questions that follow natural story lines, from the instantiation of a unique object, through an exploration of its properties, and on to its relationships with other uniquely instantiated objects. PMID:25755262

  8. Software simulator for multiple computer simulation system

    NASA Technical Reports Server (NTRS)

    Ogrady, E. P.

    1983-01-01

    A description is given of the structure and use of a computer program that simulates the operation of a parallel processor simulation system. The program is part of an investigation to determine algorithms that are suitable for simulating continuous systems on a parallel processor configuration. The simulator is designed to accurately simulate the problem-solving phase of a simulation study. Care has been taken to ensure the integrity and correctness of data exchanges and to correctly sequence periods of computation and periods of data exchange. It is pointed out that the functions performed during a problem-setup phase or a reset phase are not simulated. In particular, there is no attempt to simulate the downloading process that loads object code into the local, transfer, and mapping memories of processing elements or the memories of the run control processor and the system control processor. The main program of the simulator carries out some problem-setup functions of the system control processor in that it requests the user to enter values for simulation system parameters and problem parameters. The method by which these values are transferred to the other processors, however, is not simulated.

  9. Semi-Automatic Assembly of Learning Resources

    ERIC Educational Resources Information Center

    Verbert, K.; Ochoa, X.; Derntl, M.; Wolpers, M.; Pardo, A.; Duval, E.

    2012-01-01

    Technology Enhanced Learning is a research field that has matured considerably over the last decade. Many technical solutions to support design, authoring and use of learning activities and resources have been developed. The first datasets that reflect the tracking of actual use of these tools in real-life settings are beginning to become…

  10. Semi-automatic analysis of fire debris

    PubMed

    Touron; Malaquin; Gardebas; Nicolai

    2000-05-01

    Automated analysis of fire residues involves a strategy which deals with the wide variety of criminalistic samples received. Because the concentration of accelerant in a sample is unknown and the range of flammable products is wide, full attention from the analyst is required. Primary detection with a photoionisator resolves the first problem by determining the right method to use: either the less responsive classical head-space determination or absorption on an active charcoal tube, a method better adapted to low concentrations. The latter method is suitable for automatic thermal desorption (ATD400), avoiding any risk of cross contamination. A PONA column (50 m x 0.2 mm i.d.) allows the separation of volatile hydrocarbons from C(1) to C(15) and the updating of a database. A specific second column is used for heavy hydrocarbons. Heavy products (C(13) to C(40)) were extracted from residues using a very small amount of pentane, concentrated to 1 ml at 50 degrees C and then placed on an automatic carousel. Comparison of flammables with referenced chromatograms provided the expected identification, possibly using mass spectrometry. This analytical strategy belongs to the IRCGN quality program, resulting in the analysis of 1500 samples per year by two technicians. PMID:10802196

  11. Computer systems for automatic earthquake detection

    USGS Publications Warehouse

    Stewart, S.W.

    1974-01-01

    U.S. Geological Survey seismologists in Menlo Park, California, are utilizing the speed, reliability, and efficiency of minicomputers to monitor seismograph stations and to automatically detect earthquakes. An earthquake detection computer system, believed to be the only one of its kind in operation, automatically reports about 90 percent of all local earthquakes recorded by a network of over 100 central California seismograph stations. The system also monitors the stations for signs of malfunction or abnormal operation. Before the automatic system was put in operation, all of the earthquakes recorded had to be detected by manually searching the records, a time-consuming process. With the automatic detection system, the stations are efficiently monitored continuously.
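
    The article does not spell out the detection algorithm; the classic short-term/long-term average (STA/LTA) trigger widely used in seismic networks is a representative sketch:

        # Representative STA/LTA trigger: declare an event when the short-term
        # average signal power jumps relative to the long-term average. Window
        # lengths and threshold are illustrative, not from the article.
        import numpy as np

        def sta_lta_trigger(trace, sta=50, lta=1000, threshold=4.0):
            power = trace.astype(float) ** 2
            triggers = []
            for i in range(lta, len(power) - sta):
                short = power[i:i + sta].mean()
                long_ = power[i - lta:i].mean()
                if long_ > 0 and short / long_ > threshold:
                    triggers.append(i)
            return triggers

        rng = np.random.default_rng(1)
        trace = rng.standard_normal(5000)
        trace[3000:3200] += 8 * rng.standard_normal(200)   # synthetic "event"
        print(sta_lta_trigger(trace)[:1])                  # first trigger index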

  12. Architecture and applications of the HEP multiprocessor computer system

    SciTech Connect

    Smith, B.J.

    1981-01-01

    The HEP computer system is a large scale scientific parallel computer employing shared-resource MIMD architecture. The hardware and software facilities provided by the system are described, and techniques found useful in programming the system are discussed. 3 references.

  13. 14 CFR 417.123 - Computing systems and software.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...) Processor-interrupt software associated with previously designated safety-critical computer system functions... systems and software. (a) A launch operator must document a system safety process that identifies the... its computing systems and software. Safety-critical computing system and software functions...

  14. 14 CFR 417.123 - Computing systems and software.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...) Processor-interrupt software associated with previously designated safety-critical computer system functions... systems and software. (a) A launch operator must document a system safety process that identifies the... its computing systems and software. Safety-critical computing system and software functions...

  15. 14 CFR 417.123 - Computing systems and software.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...) Processor-interrupt software associated with previously designated safety-critical computer system functions... systems and software. (a) A launch operator must document a system safety process that identifies the... its computing systems and software. Safety-critical computing system and software functions...

  16. Computation in Dynamically Bounded Asymmetric Systems

    PubMed Central

    Rutishauser, Ueli; Slotine, Jean-Jacques; Douglas, Rodney

    2015-01-01

    Previous explanations of computations performed by recurrent networks have focused on symmetrically connected saturating neurons and their convergence toward attractors. Here we analyze the behavior of asymmetrically connected networks of linear threshold neurons, whose positive response is unbounded. We show that, for a wide range of parameters, this asymmetry brings interesting and computationally useful dynamical properties. When driven by input, the network explores potential solutions through highly unstable ‘expansion’ dynamics. This expansion is steered and constrained by negative divergence of the dynamics, which ensures that the dimensionality of the solution space continues to reduce until an acceptable solution manifold is reached. Then the system contracts stably on this manifold towards its final solution trajectory. The unstable positive feedback and cross inhibition that underlie expansion and divergence are common motifs in molecular and neuronal networks. Therefore we propose that very simple organizational constraints that combine these motifs can lead to spontaneous computation and so to the spontaneous modification of entropy that is characteristic of living systems. PMID:25617645
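
    A minimal simulation of the network class analyzed, linear threshold units with asymmetric connections mixing positive feedback and cross inhibition (weights illustrative, not from the paper):

        # Sketch of linear threshold dynamics tau x' = -x + [W x + b]+ for a small
        # network combining self-excitation with mutual (cross) inhibition; the
        # unit receiving the larger input suppresses the other.
        import numpy as np

        def simulate(W, b, steps=200, dt=0.05):
            x = np.zeros(len(b))
            for _ in range(steps):
                x += dt * (-x + np.maximum(W @ x + b, 0.0))  # rectified recurrence
            return x

        W = np.array([[0.6, -1.2],      # self-excitation with cross inhibition
                      [-1.2, 0.6]])
        print(simulate(W, b=np.array([1.0, 0.8])))   # winner-take-all style outcome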

  17. A computer system for clinical microbiology.

    PubMed Central

    Williams, K N; Davidson, J M; Lynn, R; Rice, E; Phillips, I

    1978-01-01

    The Department of Clinical Microbiology at St Thomas' Hospital has been producing bacteriological reports on a computer for more than three years and is now producing some 2300 reports per week. The system is operated entirely by laboratory staff without special training, and involves the use of optical mark reader (OMR) forms as worksheets, automatic validation and release of most reports, the use of local terminals, and scrutiny of reports by pathologists using a visual display unit. The OMR worksheet records not only the final result but also most of the tests and observations made on the samples; it is the only working document used by technicians. One specialist clinic submits its laboratory requests on an OMR form, which is subsequently used to record the results. The reports are printed and also filed in the computer to produce analyses for hospital, laboratory, and clinical management. PMID:748389

  18. System administration of ATLAS TDAQ computing environment

    NASA Astrophysics Data System (ADS)

    Adeel-Ur-Rehman, A.; Bujor, F.; Benes, J.; Caramarcu, C.; Dobson, M.; Dumitrescu, A.; Dumitru, I.; Leahu, M.; Valsan, L.; Oreshkin, A.; Popov, D.; Unel, G.; Zaytsev, A.

    2010-04-01

    This contribution gives a thorough overview of the activities of the ATLAS TDAQ SysAdmin group, which administers the TDAQ computing environment supporting the High Level Trigger, Event Filter and other subsystems of the ATLAS detector operating at the LHC collider at CERN. The current installation consists of approximately 1500 netbooted nodes managed by more than 60 dedicated servers, about 40 multi-screen user interface machines installed in the control rooms, and various hardware and service monitoring machines as well. In the final configuration, the online computer farm will be capable of hosting tens of thousands of applications running simultaneously. The software distribution requirements are matched by a two-level NFS based solution. The hardware and network monitoring systems of ATLAS TDAQ are based on NAGIOS, with a MySQL cluster behind it for accounting and storing the collected monitoring data, IPMI tools, CERN LANDB and dedicated tools developed by the group, e.g. ConfdbUI. The user management schema deployed in the TDAQ environment is founded on an authentication and role management system based on LDAP. External access to the ATLAS online computing facilities is provided by means of gateways supplied with an accounting system as well. Current activities of the group include deployment of the centralized storage system, testing and validating hardware solutions for future use within the ATLAS TDAQ environment including new multi-core blade servers, developing GUI tools for user authentication and role management, testing and validating 64-bit OS, and upgrading the existing TDAQ hardware components, authentication servers and gateways.

  19. Systems analysis of the space shuttle. [communication systems, computer systems, and power distribution

    NASA Technical Reports Server (NTRS)

    Schilling, D. L.; Oh, S. J.; Thau, F.

    1975-01-01

    Developments in communications systems, computer systems, and power distribution systems for the space shuttle are described. The use of high speed delta modulation for bit rate compression in the transmission of television signals is discussed. Simultaneous Multiprocessor Organization, an approach to computer organization, is presented. Methods of computer simulation and automatic malfunction detection for the shuttle power distribution system are also described.

  20. Multiscale Computational Models of Complex Biological Systems

    PubMed Central

    Walpole, Joseph; Papin, Jason A.; Peirce, Shayn M.

    2014-01-01

    Integration of data across spatial, temporal, and functional scales is a primary focus of biomedical engineering efforts. The advent of powerful computing platforms, coupled with quantitative data from high-throughput experimental platforms, has allowed multiscale modeling to expand as a means to more comprehensively investigate biological phenomena in experimentally relevant ways. This review aims to highlight recently published multiscale models of biological systems while using their successes to propose the best practices for future model development. We demonstrate that coupling continuous and discrete systems best captures biological information across spatial scales by selecting modeling techniques that are suited to the task. Further, we suggest how to best leverage these multiscale models to gain insight into biological systems using quantitative, biomedical engineering methods to analyze data in non-intuitive ways. These topics are discussed with a focus on the future of the field, the current challenges encountered, and opportunities yet to be realized. PMID:23642247

  1. Knowledge and intelligent computing system in medicine.

    PubMed

    Pandey, Babita; Mishra, R B

    2009-03-01

    Knowledge-based systems (KBS) and intelligent computing systems have been used in medical planning, diagnosis and treatment. KBS consist of rule-based reasoning (RBR), case-based reasoning (CBR) and model-based reasoning (MBR), whereas intelligent computing methods (ICM) encompass the genetic algorithm (GA), artificial neural network (ANN), fuzzy logic (FL) and others. The combinations of methods within KBS include CBR-RBR, CBR-MBR and RBR-CBR-MBR, and the combinations within ICM include ANN-GA, fuzzy-ANN, fuzzy-GA and fuzzy-ANN-GA. The combinations of methods across KBS and ICM include RBR-ANN, CBR-ANN, RBR-CBR-ANN, fuzzy-RBR, fuzzy-CBR and fuzzy-CBR-ANN. In this paper, we have made a study of different singular and combined methods (185 in number) applicable to the medical domain from the mid 1970s to 2008. The study is presented in tabular form, showing the methods and their salient features, processes and application areas in the medical domain (diagnosis, treatment and planning). It is observed that most of the methods are used in medical diagnosis, very few for planning, and a moderate number in treatment. The study and its presentation in this context would be helpful for novice researchers in the area of medical expert systems. PMID:19201398

  2. Embedded Computer System on the Rosetta Spacecraft

    NASA Astrophysics Data System (ADS)

    Baksa, A.; Balázs, A.; Pálos, Z.; Szalai, S.; Várhalmi, L.

    The KFKI Research Institute for Particle and Nuclear Physics is participating in the Rosetta mission of the European Space Agency and contributes to the project by providing the on-board computer system of the Rosetta Lander. The Rosetta spacecraft will rendezvous with a comet called Churyumov-Gerasimenko beyond the orbit of Mars and will deploy the Lander to descend to its surface. The lifetime of the Lander on the surface of the comet should be at least four days while it is powered by non-rechargeable primary batteries. Afterwards, solar panels may provide power even for several months.

  3. Development INTERDATA 8/32 computer system

    NASA Technical Reports Server (NTRS)

    Sonett, C. P.

    1983-01-01

    The capabilities of the Interdata 8/32 minicomputer were examined regarding data and word processing, editing, retrieval, and budgeting as well as data management demands of the user groups in the network. Based on four projected needs: (1) a hands on (open shop) computer for data analysis with large core and disc capability; (2) the expected requirements of the NASA data networks; (3) the need for intermittent large core capacity for theoretical modeling; (4) the ability to access data rapidly either directly from tape or from core onto hard copy, the system proved useful and adequate for the planned requirements.

  4. Memory System Technologies for Future High-End Computing Systems

    SciTech Connect

    McKee, S A; de Supinski, B R; Mueller, F; Tyson, G S

    2003-05-16

    Our ability to solve Grand Challenge Problems in computing hinges on the development of reliable and efficient High-End Computing systems. Unfortunately, the increasing gap between memory and processor speeds remains one of the major bottlenecks in modern architectures. Uniprocessor nodes still suffer, but symmetric multiprocessor nodes--where access to physical memory is shared among all processors--are among the hardest hit. In the latter case, the memory system must juggle multiple working sets and maintain memory coherence, on top of simply responding to access requests. To illustrate the severity of the current situation, consider two important examples: even the high-performance parallel supercomputers in use at Department of Energy National labs observe single-processor utilization rates as low as 5%, and transaction processing commercial workloads see utilizations of at most about 33%. A wealth of research demonstrates that traditional memory systems are incapable of bridging the processor/memory performance gap, and the problem continues to grow. The success of future High-End Computing platforms therefore depends on our developing hardware and software technologies to dramatically relieve the memory bottleneck. In order to take better advantage of the tremendous computing power of modern microprocessors and future High-End systems, we consider it crucial to develop the hardware for intelligent, adaptable memory systems; the middleware and OS modifications to manage them; and the compiler technology and performance tools to exploit them. Taken together, these will provide the foundations for meeting the requirements of future generations of performance-critical, parallel systems based on either uniprocessor or SMP nodes (including PIM organizations). We feel that such solutions should not be vendor-specific, but should be sufficiently general and adaptable such that the technologies could be leveraged by any commercial vendor of High-End Computing systems

  5. RASCAL: A Rudimentary Adaptive System for Computer-Aided Learning.

    ERIC Educational Resources Information Center

    Stewart, John Christopher

    Both the background of computer-assisted instruction (CAI) systems in general and the requirements of a computer-aided learning system which would be a reasonable assistant to a teacher are discussed. RASCAL (Rudimentary Adaptive System for Computer-Aided Learning) is a first attempt at defining a CAI system which would individualize the learning…

  6. 10 CFR 35.457 - Therapy-related computer systems.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Therapy-related computer systems. 35.457 Section 35.457... Therapy-related computer systems. The licensee shall perform acceptance testing on the treatment planning system of therapy-related computer systems in accordance with published protocols accepted by...

  7. 10 CFR 35.457 - Therapy-related computer systems.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Therapy-related computer systems. 35.457 Section 35.457... Therapy-related computer systems. The licensee shall perform acceptance testing on the treatment planning system of therapy-related computer systems in accordance with published protocols accepted by...

  8. 10 CFR 35.457 - Therapy-related computer systems.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Therapy-related computer systems. 35.457 Section 35.457... Therapy-related computer systems. The licensee shall perform acceptance testing on the treatment planning system of therapy-related computer systems in accordance with published protocols accepted by...

  9. 10 CFR 35.457 - Therapy-related computer systems.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Therapy-related computer systems. 35.457 Section 35.457... Therapy-related computer systems. The licensee shall perform acceptance testing on the treatment planning system of therapy-related computer systems in accordance with published protocols accepted by...

  10. 10 CFR 35.457 - Therapy-related computer systems.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Therapy-related computer systems. 35.457 Section 35.457... Therapy-related computer systems. The licensee shall perform acceptance testing on the treatment planning system of therapy-related computer systems in accordance with published protocols accepted by...

  11. Computer Assisted Reference Locator (CARL) System: An Overview.

    ERIC Educational Resources Information Center

    Sands, William A.

    The Computer Assisted Reference Locator (CARL) is a computer-based information retrieval system which uses coordinate indexing. Objectives established in designing the system are: (1) simplicity of reference query and retrieval; (2) ease of system maintenance; and (3) adaptability for alternative computer systems. The source documents input into…

  12. Interrupt handling in a multiprocessor computing system

    SciTech Connect

    D'Amico, L.W.; Guyer, J.M.

    1989-01-03

    A multiprocessor computing system is described comprising: a system bus, including an address bus for carrying an address phase of an instruction and a data bus for carrying a data phase of an instruction; a plurality of processing units connected to the system bus, each processing unit including means for generating broadcast interrupt origin request instructions on the system bus; asynchronous input/output channel controllers connected to the system bus, each of the input/output channel controllers including means for generating a synchronizing signal in response to completion of an address phase of a broadcast instruction on the system bus; and priority lines, each corresponding to a different one of the processing units, connected through each of the input/output channel controllers, the input/output channel controllers being arranged on the priority lines in order of priority, the priority lines being gated in an input/output channel controller so that priority is asserted over all lower priority input/output channel controllers on a priority line by an input/output channel controller if that controller has an interrupt pending for the processing unit corresponding to the priority line.
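
    Stripped of the claim language, the priority scheme reduces to a simple arbitration rule: the highest-priority controller with an interrupt pending for a given processing unit asserts the priority line and masks every controller downstream. The sketch below illustrates only that rule; the names and data structures are invented for illustration and are not taken from the patent.

        # Illustrative arbitration on one priority line; not the patented hardware.
        def resolve_interrupt(controllers, processing_unit):
            """Return the index of the winning controller, or None.

            `controllers` is ordered from highest to lowest priority; each entry
            is the set of processing-unit ids with interrupts pending there.
            """
            for index, pending in enumerate(controllers):
                if processing_unit in pending:
                    # This controller asserts the line, gating all lower-priority
                    # controllers downstream of it.
                    return index
            return None

        # Controllers 1 and 2 both have an interrupt pending for unit 0;
        # controller 1 is upstream on the line, so it wins arbitration.
        controllers = [set(), {0, 3}, {0}]
        print(resolve_interrupt(controllers, 0))  # -> 1
        print(resolve_interrupt(controllers, 2))  # -> None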

  13. An Applet-based Anonymous Distributed Computing System.

    ERIC Educational Resources Information Center

    Finkel, David; Wills, Craig E.; Ciaraldi, Michael J.; Amorin, Kevin; Covati, Adam; Lee, Michael

    2001-01-01

    Defines anonymous distributed computing systems and focuses on the specifics of a Java applet-based approach for large-scale, anonymous, distributed computing on the Internet. Explains the possibility of a large number of computers participating in a single computation and describes a test of the functionality of the system. (Author/LRW)

  14. Computationally efficient and flexible modular modelling approach for river and urban drainage systems based on surrogate conceptual models

    NASA Astrophysics Data System (ADS)

    Wolfs, Vincent; Willems, Patrick

    2015-04-01

    Water managers rely increasingly on mathematical simulation models that represent individual parts of the water system, such as the river, sewer system or waste water treatment plant. The current evolution towards integral water management requires the integration of these distinct components, leading to an increased model scale and scope. Besides this growing model complexity, certain applications have gained interest and importance, such as uncertainty and sensitivity analyses, auto-calibration of models and real time control. All these applications share the need for models with a very limited calculation time, either for performing a large number of simulations or for performing a long-term simulation followed by statistical post-processing of the results. The commonly applied detailed models that solve (part of) the de Saint-Venant equations are infeasible for these applications and for such integrated modelling, mainly because of their long simulation times and the inability to couple submodels built in different software environments. Instead, practitioners must use simplified models for these purposes. These models are characterized by empirical relationships and sacrifice model detail and accuracy for increased computational efficiency. The presented research discusses the development of a flexible integral modelling platform that complies with the following three key requirements: (1) Include a modelling approach for water quantity predictions for rivers, floodplains, sewer systems and rainfall runoff routing that requires a minimal calculation time; (2) A fast and semi-automatic model configuration, thereby making maximum use of data from existing detailed models and measurements; (3) Have a calculation scheme based on open source code to allow for future extensions or coupling with other models. First, a novel and flexible modular modelling approach based on the storage cell concept was developed. This approach divides each
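
    The storage-cell concept the abstract builds on can be shown with a minimal linear-reservoir sketch. The routing equation is the generic conceptual-model form; the storage constant and the inflow pulse below are made up for illustration and are not taken from the platform described.

        # One conceptual storage cell (linear reservoir): dS/dt = I - O, O = S/k,
        # integrated with an explicit Euler step. Parameter values are illustrative.
        def route_linear_reservoir(inflow, k, dt=1.0, storage0=0.0):
            storage = storage0
            outflow = []
            for i in inflow:
                o = storage / k          # outflow proportional to storage
                storage += (i - o) * dt  # water balance update
                outflow.append(o)
            return outflow

        hydrograph = [0, 5, 10, 5, 0, 0, 0, 0]   # inflow pulse, e.g. m^3/s
        print([round(q, 2) for q in route_linear_reservoir(hydrograph, k=3.0)])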

  15. The computer emergency response team system (CERT-System)

    SciTech Connect

    Schultz, E.E.

    1991-10-11

    This paper describes CERT-System, an international affiliation of computer security response teams. Formed after the WANK and OILZ worms attacked numerous systems connected to the Internet, an operational charter was signed by representatives of 11 response teams. This affiliation's purpose is to provide a forum for ideas about incident response and computer security, share information, solve common problems, and develop strategies for responding to threats, incidents, etc. The achievements and advantages of participation in CERT-System are presented along with suggested growth areas for this affiliation. The views presented in this paper are the views of one member, and do not necessarily represent the views of others affiliated with CERT-System.

  16. Coordinate System Issues in Binary Star Computations

    NASA Astrophysics Data System (ADS)

    Kaplan, George H.

    2015-08-01

    It has been estimated that half of all stars are components of binary or multiple systems. Yet the number of known orbits for astrometric and spectroscopic binary systems together is less than 7,000 (including redundancies), almost all of them for bright stars. A new generation of deep all-sky surveys such as Pan-STARRS, Gaia, and LSST is expected to lead to the discovery of millions of new systems. Although for many of these systems the orbits may be undersampled initially, it is to be expected that combinations of new and old data sources will eventually lead to many more orbits being known. As a result, a revolution in the scientific understanding of these systems may be upon us. The current database of visual (astrometric) binary orbits represents them relative to the “plane of the sky”, that is, the plane orthogonal to the line of sight. Although the line of sight to stars constantly changes due to proper motion, aberration, and other effects, there is no agreed-upon standard for what line of sight defines the orbital reference plane. Furthermore, the computation of differential coordinates (component B relative to A) for a given date must be based on the binary system’s direction at that date. Thus, a different “plane of the sky” is appropriate for each such date, i.e., each observation. However, projection effects between the reference planes, differential aberration, and the curvature of the sky are generally neglected in such computations. Usually the only correction applied is for the change in the north direction (position angle zero) due to precession (and sometimes also proper motion). This paper will present an algorithm for a more complete model of the geometry involved, and will show that such a model is necessary to avoid errors in the computed observables that are significant at modern astrometric accuracy. The paper will also suggest where conventions need to be established to avoid ambiguities in how quantities related to binary star
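
    For reference, the standard plane-of-the-sky relations connecting a measured separation ρ and position angle θ (from north through east) to the differential coordinates of component B relative to A are, in the usual small-angle form,

        \begin{align}
          \Delta\alpha\cos\delta &= \rho\,\sin\theta, \\
          \Delta\delta           &= \rho\,\cos\theta .
        \end{align}

    A change in the adopted north direction (for example, due to precession) shifts θ while leaving ρ unchanged, which is why the routine correction mentioned above is applied to the position angle alone. This is the textbook projection, not the fuller geometric model the paper proposes.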

  17. Computer-aided retinal photocoagulation system

    NASA Astrophysics Data System (ADS)

    Barrett, Steven F.; Wright, Cameron H.; Jerath, Maya R.; Lewis, R. Stephen; Dillard, Bryan C.; Rylander, Henry G.; Welch, Ashley J.

    1996-01-01

    Researchers at the University of Texas at Austin's Biomedical Engineering Laser Laboratory and the U.S. Air Force Academy's Department of Electrical Engineering are developing a computer-assisted prototype retinal photocoagulation system. The project goal is to rapidly and precisely automatically place laser lesions in the retina for the treatment of disorders such as diabetic retinopathy and retinal tears while dynamically controlling the extent of the lesion. Separate prototype subsystems have been developed to control lesion parameters (diameter or depth) using lesion reflectance feedback and lesion placement using retinal vessels as tracking landmarks. Successful subsystem testing results in vivo on pigmented rabbits using an argon continuous wave laser are presented. A prototype integrated system design to simultaneously control lesion parameters and placement at clinically significant speeds is provided.

  18. Computational Intelligence Techniques for Tactile Sensing Systems

    PubMed Central

    Gastaldo, Paolo; Pinna, Luigi; Seminara, Lucia; Valle, Maurizio; Zunino, Rodolfo

    2014-01-01

    Tactile sensing helps robots interact with humans and objects effectively in real environments. Piezoelectric polymer sensors provide the functional building blocks of the robotic electronic skin, mainly thanks to their flexibility and suitability for detecting dynamic contact events and for recognizing the touch modality. The paper focuses on the ability of tactile sensing systems to support the challenging recognition of certain qualities/modalities of touch. The research applies novel computational intelligence techniques and a tensor-based approach for the classification of touch modalities; its main results consist in providing a procedure to enhance system generalization ability and architecture for multi-class recognition applications. An experimental campaign involving 70 participants using three different modalities in touching the upper surface of the sensor array was conducted, and confirmed the validity of the approach. PMID:24949646

  19. Computer Information System For Nuclear Medicine

    NASA Astrophysics Data System (ADS)

    Cahill, P. T.; Knowles, R. J.; Tsen, O.

    1983-12-01

    To meet the complex needs of a nuclear medicine division serving an 1100-bed hospital, a computer information system has been developed in sequential phases. This database management system is based on a time-shared minicomputer linked to a broadband communications network. The database contains information on patient histories, billing, types of procedures, doses of radiopharmaceuticals, times of study, scanning equipment used, and the technician performing the procedure. These patient records are cycled through three levels of storage: (a) an active file of 100 studies for those patients currently scheduled, (b) a temporary storage level of 1000 studies, and (c) an archival level of 10,000 studies containing selected information. Merging of this information with reports and various statistical analyses is possible. This first phase has been in operation for well over a year. The second phase is an upgrade of the size of the various storage levels by a factor of ten.

  1. Potential of Cognitive Computing and Cognitive Systems

    NASA Astrophysics Data System (ADS)

    Noor, Ahmed K.

    2014-11-01

    Cognitive computing and cognitive technologies are game changers for future engineering systems, as well as for engineering practice and training. They are major drivers for knowledge automation work, and the creation of cognitive products with higher levels of intelligence than current smart products. This paper gives a brief review of cognitive computing and some of the cognitive engineering systems activities. The potential of cognitive technologies is outlined, along with a brief description of future cognitive environments, incorporating cognitive assistants - specialized proactive intelligent software agents designed to follow and interact with humans and other cognitive assistants across the environments. The cognitive assistants engage, individually or collectively, with humans through a combination of adaptive multimodal interfaces, and advanced visualization and navigation techniques. The realization of future cognitive environments requires the development of a cognitive innovation ecosystem for the engineering workforce. The continuously expanding major components of the ecosystem include integrated knowledge discovery and exploitation facilities (incorporating predictive and prescriptive big data analytics); novel cognitive modeling and visual simulation facilities; cognitive multimodal interfaces; and cognitive mobile and wearable devices. The ecosystem will provide timely, engaging, personalized/collaborative learning and effective decision making. It will stimulate creativity and innovation, and prepare the participants to work in future cognitive enterprises and develop new cognitive products of increasing complexity. http://www.aee.odu.edu/cognitivecomp

  2. Mammographic computer-aided detection systems.

    PubMed

    2003-04-01

    While mammography is regarded as the best means available to screen for breast cancer, reading mammograms is a tedious, error-prone task. Given the repetitiveness of the process and the fact that less than 1% of mammograms in the average screening population contain cancer, it's no wonder that a significant number of breast cancers--about 28%--are missed by radiologists. The fact that human error is such a significant obstacle makes mammography screening an ideal application for computer-aided detection (CAD) systems. CAD systems serve as a "second pair of eyes" to ensure that radiologists don't miss a suspect area on an image. They analyze patterns on a digitized mammographic image, identify regions that may contain an abnormality indicating cancer, and mark these regions. The marks are then inspected and classified by a radiologist. But CAD systems provide no diagnosis of any kind--it's up to the radiologist to analyze the marked area and decide if it shows cancer. In this Evaluation, we describe the challenges posed by screening mammography, the operating principles and overall efficacy of CAD systems, and the characteristics to consider when purchasing a system. We also compare the performance of two commercially available systems, iCAD's MammoReader and R2's ImageChecker. Because the two systems offer comparable sensitivity, our judgments are based on other performance characteristics, including their ease of use, the number of false marks they produce, the degree to which they can integrate with hospital information systems, and their processing speed. PMID:12760158

  3. Multiaxis, Lightweight, Computer-Controlled Exercise System

    NASA Technical Reports Server (NTRS)

    Haynes, Leonard; Bachrach, Benjamin; Harvey, William

    2006-01-01

    The multipurpose, multiaxial, isokinetic dynamometer (MMID) is a computer-controlled system of exercise machinery that can serve as a means for quantitatively assessing a subject's muscle coordination, range of motion, strength, and overall physical condition with respect to a wide variety of forces, motions, and exercise regimens. The MMID is easily reconfigurable and compactly stowable and, in comparison with prior computer-controlled exercise systems, it weighs less, costs less, and offers more capabilities. Whereas a typical prior isokinetic exercise machine is limited to operation in only one plane, the MMID can operate along any path. In addition, the MMID is not limited to the isokinetic (constant-speed) mode of operation. The MMID provides for control and/or measurement of position, force, and/or speed of exertion in as many as six degrees of freedom simultaneously; hence, it can accommodate more complex, more nearly natural combinations of motions and, in so doing, offers greater capabilities for physical conditioning and evaluation. The MMID (see figure) includes as many as eight active modules, each of which can be anchored to a floor, wall, ceiling, or other fixed object. A cable is payed out from a reel in each module to a bar or other suitable object that is gripped and manipulated by the subject. The reel is driven by a DC brushless motor or other suitable electric motor via a gear reduction unit. The motor can be made to function as either a driver or an electromagnetic brake, depending on the required nature of the interaction with the subject. The module includes a force and a displacement sensor for real-time monitoring of the tension in and displacement of the cable, respectively. In response to commands from a control computer, the motor can be operated to generate a required tension in the cable, to displace the cable a required distance, or to reel the cable in or out at a required speed. The computer can be programmed, either locally or via
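
    The tension-control behavior described can be pictured with a generic proportional-integral loop; the gains, saturation limit and the first-order cable/reel model below are invented for illustration and are not the MMID's actual control law.

        # Generic PI tension controller: command motor torque so measured cable
        # tension tracks a setpoint; positive torque drives the reel, negative
        # torque brakes it. All constants are made up.
        def pi_tension_step(setpoint, measured, integral, kp=2.0, ki=0.5,
                            dt=0.01, torque_limit=10.0):
            error = setpoint - measured
            integral += error * dt
            torque = kp * error + ki * integral
            torque = max(-torque_limit, min(torque_limit, torque))  # saturate
            return torque, integral

        tension, integral = 0.0, 0.0
        for _ in range(500):
            torque, integral = pi_tension_step(50.0, tension, integral)
            tension += 0.2 * torque       # crude first-order plant response
        print(round(tension, 1))          # settles near the 50 N setpoint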

  4. Evaluation and testing of computed radiography systems.

    PubMed

    Charnock, P; Connolly, P A; Hughes, D; Moores, B M

    2005-01-01

    The implementation of film replacement digital radiographic imaging systems throughout Europe is now gathering momentum. Such systems create the foundations for totally digital departments of radiology, since radiographic examinations constitute the most prevalent modality. Although this type of development will lead to improvements in the delivery and management of radiological service, such widespread implementation of new technology must be carefully monitored. The implementation of effective QA tests on installation, at periodic intervals and as part of a routine programme will aid this process. This paper presents the results of commissioning tests undertaken on a number of computed radiography imaging systems provided by different manufacturers. The aim of these tests was not only to provide baseline performance measurements against which subsequent measurements can be compared but also to explore any differences in performance, which might exist between different units. Results of measurements will be presented for (1) monitor and laser printer set-up; (2) imaging plates, including sensitivity, consistency and uniformity; (3) resolution and contrast detectability; and (4) signal and noise performance. Results from the latter are analysed in relationship with both system and quantum noise components. PMID:15933109

  5. Computer-controlled endoscopic performance assessment system.

    PubMed

    Hanna, G B; Drew, T; Clinch, P; Hunter, B; Cuschieri, A

    1998-07-01

    We have devised an advanced computer-controlled system (ADEPT) for the objective evaluation of endoscopic task performance. The system's hardware consists of a dual gimbal mechanism that accepts a variety of 5.0-mm standard endoscopic instruments for manipulation in a precisely mapped and enclosed work space. The target object consists of a sprung base plate incorporating various tasks. It is covered by a sprung perforated transparent top plate that has to be moved and held in the correct position by the operator to gain access to the various tasks. Standard video endoscope equipment provides the visual interface between the operator and the target-instrument field. Different target modules can be used, and the level of task difficulty can be adjusted by varying the manipulation, elevation, and azimuth angles. The system's software is designed to (a) prompt the surgeon with the information necessary to perform the task, (b) collect and collate data on performance during execution of specified tasks, and (c) save the data for future analysis. The system was alpha and beta tested to ensure that all functions operated correctly. PMID:9632879

  6. Computational dynamics of acoustically driven microsphere systems

    NASA Astrophysics Data System (ADS)

    Glosser, Connor; Piermarocchi, Carlo; Li, Jie; Dault, Dan; Shanker, B.

    2016-01-01

    We propose a computational framework for the self-consistent dynamics of a microsphere system driven by a pulsed acoustic field in an ideal fluid. Our framework combines a molecular dynamics integrator describing the dynamics of the microsphere system with a time-dependent integral equation solver for the acoustic field that makes use of fields represented as surface expansions in spherical harmonic basis functions. The presented approach allows us to describe the interparticle interaction induced by the field as well as the dynamics of trapping in counter-propagating acoustic pulses. The integral equation formulation leads to equations of motion for the microspheres describing the effect of nondissipative drag forces. We show (1) that the field-induced interactions between the microspheres give rise to effective dipolar interactions, with effective dipoles defined by their velocities and (2) that the dominant effect of an ultrasound pulse through a cloud of microspheres gives rise mainly to a translation of the system, though we also observe both expansion and contraction of the cloud determined by the initial system geometry.

  7. Computational dynamics of acoustically driven microsphere systems.

    PubMed

    Glosser, Connor; Piermarocchi, Carlo; Li, Jie; Dault, Dan; Shanker, B

    2016-01-01

    We propose a computational framework for the self-consistent dynamics of a microsphere system driven by a pulsed acoustic field in an ideal fluid. Our framework combines a molecular dynamics integrator describing the dynamics of the microsphere system with a time-dependent integral equation solver for the acoustic field that makes use of fields represented as surface expansions in spherical harmonic basis functions. The presented approach allows us to describe the interparticle interaction induced by the field as well as the dynamics of trapping in counter-propagating acoustic pulses. The integral equation formulation leads to equations of motion for the microspheres describing the effect of nondissipative drag forces. We show (1) that the field-induced interactions between the microspheres give rise to effective dipolar interactions, with effective dipoles defined by their velocities and (2) that the dominant effect of an ultrasound pulse through a cloud of microspheres gives rise mainly to a translation of the system, though we also observe both expansion and contraction of the cloud determined by the initial system geometry. PMID:26871188

  8. Intelligent Computer Vision System for Automated Classification

    SciTech Connect

    Jordanov, Ivan; Georgieva, Antoniya

    2010-05-21

    In this paper we investigate an Intelligent Computer Vision System applied to the recognition and classification of commercially available cork tiles. The system is capable of acquiring and processing gray images using several feature generation and analysis techniques. Its functionality includes image acquisition, feature extraction and preprocessing, and feature classification with neural networks (NN). We also discuss system test and validation results from the recognition and classification tasks. The system investigation also includes statistical feature processing (features number and dimensionality reduction techniques) and classifier design (NN architecture, target coding, learning complexity and performance, and training with our own metaheuristic optimization method). The NNs trained with our genetic low-discrepancy search method (GLPτS) for global optimisation demonstrated very good generalisation abilities. In our view, the reported testing success rate of up to 95% is due to several factors: combination of feature generation techniques; application of Analysis of Variance (ANOVA) and Principal Component Analysis (PCA), which appeared to be very efficient for preprocessing the data; and use of suitable NN design and learning method.
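
    The preprocessing-plus-classifier pattern reported here is straightforward to sketch with standard tools; the snippet below uses scikit-learn on synthetic stand-in data and ordinary gradient-based training, not the authors' texture features or their GLPτS metaheuristic.

        # PCA dimensionality reduction feeding a small neural-network classifier.
        from sklearn.datasets import make_classification
        from sklearn.decomposition import PCA
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline

        # Stand-in for feature vectors extracted from cork-tile images.
        X, y = make_classification(n_samples=600, n_features=40, n_informative=12,
                                   n_classes=3, random_state=0)
        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

        model = make_pipeline(PCA(n_components=10),
                              MLPClassifier(hidden_layer_sizes=(20,),
                                            max_iter=2000, random_state=0))
        model.fit(X_train, y_train)
        print(f"test accuracy: {model.score(X_test, y_test):.2f}")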

  9. 14 CFR 415.123 - Computing systems and software.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...-critical computer system function for any operation performed during launch processing or flight that could... safety review document must list and describe all safety-critical computer system functions involved in a...-critical computer system function, an applicant's safety review document must: (1) Describe all...

  10. 14 CFR 415.123 - Computing systems and software.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...-critical computer system function for any operation performed during launch processing or flight that could... safety review document must list and describe all safety-critical computer system functions involved in a...-critical computer system function, an applicant's safety review document must: (1) Describe all...

  11. 14 CFR 415.123 - Computing systems and software.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...-critical computer system function for any operation performed during launch processing or flight that could... safety review document must list and describe all safety-critical computer system functions involved in a...-critical computer system function, an applicant's safety review document must: (1) Describe all...

  12. 21 CFR 892.1200 - Emission computed tomography system.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Emission computed tomography system. 892.1200... (CONTINUED) MEDICAL DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.1200 Emission computed tomography system. (a) Identification. An emission computed tomography system is a device intended to detect...

  13. 14 CFR 415.123 - Computing systems and software.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Computing systems and software. 415.123... Launch Vehicle From a Non-Federal Launch Site § 415.123 Computing systems and software. (a) An applicant's safety review document must describe all computing systems and software that perform a...

  14. Laptop Computer - Based Facial Recognition System Assessment

    SciTech Connect

    R. A. Cain; G. B. Singleton

    2001-03-01

    The objective of this project was to assess the performance of the leading commercial-off-the-shelf (COTS) facial recognition software package when used as a laptop application. We performed the assessment to determine the system's usefulness for enrolling facial images in a database from remote locations and conducting real-time searches against a database of previously enrolled images. The assessment involved creating a database of 40 images and conducting 2 series of tests to determine the product's ability to recognize and match subject faces under varying conditions. This report describes the test results and includes a description of the factors affecting the results. After an extensive market survey and a review of the Facial Recognition Vendor Test 2000 (FRVT 2000), we selected Visionics' FaceIt® software package for evaluation. This test was co-sponsored by the US Department of Defense (DOD) Counterdrug Technology Development Program Office, the National Institute of Justice, and the Defense Advanced Research Projects Agency (DARPA). Administered in May-June 2000, the FRVT 2000 assessed the capabilities of facial recognition systems that were currently available for purchase on the US market. Our selection of this Visionics product does not indicate that it is the "best" facial recognition software package for all uses; it was the most appropriate package based on the specific requirements of this application. In this assessment, the system configuration was evaluated for effectiveness in identifying individuals by searching for facial images captured from video displays against those stored in a facial image database. An additional criterion was that the system be capable of operating discreetly. For this application, an operational facial recognition system would consist of one central computer hosting the master image database with multiple standalone systems configured with duplicates of the master operating in

  15. Computing the Moore-Penrose Inverse of a Matrix with a Computer Algebra System

    ERIC Educational Resources Information Center

    Schmidt, Karsten

    2008-01-01

    In this paper "Derive" functions are provided for the computation of the Moore-Penrose inverse of a matrix, as well as for solving systems of linear equations by means of the Moore-Penrose inverse. Making it possible to compute the Moore-Penrose inverse easily with one of the most commonly used Computer Algebra Systems--and to have the blueprint…

  16. Design of a modular digital computer system, DRL 4. [for meeting future requirements of spaceborne computers

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The design is reported of an advanced modular computer system designated the Automatically Reconfigurable Modular Multiprocessor System, which anticipates requirements for higher computing capacity and reliability for future spaceborne computers. Subjects discussed include: an overview of the architecture, mission analysis, synchronous and nonsynchronous scheduling control, reliability, and data transmission.

  17. System Matrix Analysis for Computed Tomography Imaging

    PubMed Central

    Flores, Liubov; Vidal, Vicent; Verdú, Gumersindo

    2015-01-01

    In practical applications of computed tomography imaging (CT), it is often the case that the set of projection data is incomplete owing to the physical conditions of the data acquisition process. On the other hand, the high radiation dose imposed on patients is also undesired. These issues demand that high quality CT images can be reconstructed from limited projection data. For this reason, iterative methods of image reconstruction have become a topic of increased research interest. Several algorithms have been proposed for few-view CT. We consider that the accurate solution of the reconstruction problem also depends on the system matrix that simulates the scanning process. In this work, we analyze the application of the Siddon method to generate elements of the matrix and we present results based on real projection data. PMID:26575482
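
    The Siddon method itself reduces to merging the parametric crossings of the ray with the two families of grid lines and converting consecutive crossings into per-pixel intersection lengths. The 2D sketch below follows that recipe; the geometry conventions (grid anchored at the origin, square pixels) are simplifications for illustration, not a production implementation.

        import math

        def ray_pixel_lengths(p0, p1, nx, ny, pixel=1.0):
            """Intersection length of the ray p0->p1 with each grid pixel."""
            (x0, y0), (x1, y1) = p0, p1
            length = math.hypot(x1 - x0, y1 - y0)
            alphas = {0.0, 1.0}
            if x1 != x0:   # parametric values where the ray crosses x grid lines
                alphas |= {(i * pixel - x0) / (x1 - x0) for i in range(nx + 1)}
            if y1 != y0:   # ... and where it crosses y grid lines
                alphas |= {(j * pixel - y0) / (y1 - y0) for j in range(ny + 1)}
            alphas = sorted(a for a in alphas if 0.0 <= a <= 1.0)

            lengths = {}
            for a, b in zip(alphas, alphas[1:]):
                mid = (a + b) / 2.0   # segment midpoint identifies the pixel
                ix = int((x0 + mid * (x1 - x0)) // pixel)
                iy = int((y0 + mid * (y1 - y0)) // pixel)
                if b > a and 0 <= ix < nx and 0 <= iy < ny:
                    lengths[(ix, iy)] = lengths.get((ix, iy), 0.0) + (b - a) * length
            return lengths

        # A diagonal ray across a 4x4 grid of unit pixels: each diagonal pixel
        # is crossed with length sqrt(2).
        print(ray_pixel_lengths((0.0, 0.0), (4.0, 4.0), 4, 4))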

  18. System Matrix Analysis for Computed Tomography Imaging.

    PubMed

    Flores, Liubov; Vidal, Vicent; Verdú, Gumersindo

    2015-01-01

    In practical applications of computed tomography imaging (CT), it is often the case that the set of projection data is incomplete owing to the physical conditions of the data acquisition process. On the other hand, the high radiation dose imposed on patients is also undesired. These issues demand that high quality CT images can be reconstructed from limited projection data. For this reason, iterative methods of image reconstruction have become a topic of increased research interest. Several algorithms have been proposed for few-view CT. We consider that the accurate solution of the reconstruction problem also depends on the system matrix that simulates the scanning process. In this work, we analyze the application of the Siddon method to generate elements of the matrix and we present results based on real projection data. PMID:26575482

  19. A micro-computer based system to compute magnetic variation

    NASA Technical Reports Server (NTRS)

    Kaul, R.

    1984-01-01

    A mathematical model of magnetic variation in the continental United States (COT48) was implemented in the Ohio University LORAN C receiver. The model is based on a least squares fit of a polynomial function. The implementation on the microprocessor-based LORAN C receiver is possible with the help of a math chip, the Am9511, which performs 32-bit floating point mathematical operations. A Peripheral Interface Adapter (M6520) is used to communicate between the 6502-based micro-computer and the Am9511 math chip. The implementation provides magnetic variation data to the pilot as a function of latitude and longitude. The model and its real-time implementation in the receiver are described.
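
    The least-squares polynomial idea is simple to sketch. The sample points and the low-order polynomial form below are invented for illustration; they are not the COT48 model's actual order or coefficients.

        import numpy as np

        # (latitude, longitude, magnetic variation in degrees): made-up samples.
        data = np.array([[30.0,  -81.0,  -5.1],
                         [35.0,  -90.0,   1.2],
                         [40.0, -105.0,  10.3],
                         [45.0, -122.0,  18.9],
                         [33.0, -112.0,  11.0],
                         [42.0,  -71.0, -14.8]])
        lat, lon, var = data.T

        # Fit var ~ c0 + c1*lat + c2*lon + c3*lat*lon by least squares.
        A = np.column_stack([np.ones_like(lat), lat, lon, lat * lon])
        coef, *_ = np.linalg.lstsq(A, var, rcond=None)

        def variation(latitude, longitude):
            return coef @ [1.0, latitude, longitude, latitude * longitude]

        print(round(float(variation(38.0, -100.0)), 2))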

  20. Computer vision for driver assistance systems

    NASA Astrophysics Data System (ADS)

    Handmann, Uwe; Kalinke, Thomas; Tzomakas, Christos; Werner, Martin; von Seelen, Werner

    1998-07-01

    Systems for automated image analysis are useful for a variety of tasks, and their importance is still increasing due to technological advances and growing social acceptance. Especially in the field of driver assistance systems, scientific progress has reached a level of high performance. Fully or partly autonomously guided vehicles, particularly for road-based traffic, pose high demands on the development of reliable algorithms due to the conditions imposed by natural environments. At the Institut für Neuroinformatik, methods for analyzing driving-relevant scenes by computer vision are developed in cooperation with several partners from the automobile industry. We introduce a system which extracts the important information from an image taken by a CCD camera installed at the rear-view mirror of a car. The approach combines sequential and parallel sensor and information processing. Three main tasks, namely initial segmentation (object detection), object tracking and object classification, are realized by integration in the sequential branch and by fusion in the parallel branch. The main gain of this approach is the integrative coupling of different algorithms providing partly redundant information.

  1. A computing system for LBB considerations

    SciTech Connect

    Ikonen, K.; Miettinen, J.; Raiko, H.; Keskinen, R.

    1997-04-01

    A computing system has been developed at VTT Energy for making efficient leak-before-break (LBB) evaluations of piping components. The system consists of fracture mechanics and leak rate analysis modules which are linked via an interactive user interface, LBBCAL. The system enables quick tentative analysis of standard geometric and loading situations by means of fracture mechanics estimation schemes such as the R6, FAD, EPRI J, Battelle, plastic limit load and moments methods. Complex situations are handled with a separate in-house finite-element code, EPFM3D, which uses 20-noded isoparametric solid elements, automatic mesh generators and advanced color graphics. Analytical formulas and numerical procedures are available for leak area evaluation. A novel contribution for leak rate analysis is the CRAFLO code, which is based on a nonequilibrium two-phase flow model with phase slip. Its predictions are essentially comparable with those of the well-known SQUIRT2 code; additionally, it provides outputs for temperature, pressure and velocity distributions in the crack depth direction. An illustrative application to a circumferentially cracked elbow indicates, as expected, that a small margin relative to the saturation temperature of the coolant reduces the leak rate and is likely to influence the LBB implementation for intermediate-diameter (300 mm) primary circuit piping of BWR plants.

  2. Reliable timing systems for computer controlled accelerators

    NASA Astrophysics Data System (ADS)

    Knott, Jürgen; Nettleton, Robert

    1986-06-01

    Over the past decade the use of computers has set new standards for control systems of accelerators, with ever increasing complexity coupled with stringent reliability criteria. In fact, with very slow cycling machines or storage rings, any erratic operation or timing pulse will cause the loss of precious particles and waste hours of preparation time and effort. Thus, for the CERN linac and LEAR (Low Energy Antiproton Ring) timing systems, reliability becomes a crucial factor in the sense that all components must operate practically without fault for very long periods compared to the effective machine cycle. This has been achieved by careful selection of components and design well below thermal and electrical limits, using error detection and correction where possible, as well as developing "safe" decoding techniques for serial data trains. Further, consistent structuring had to be applied in order to obtain simple and flexible modular configurations with very few components on critical paths and to minimize the exchange of information needed to synchronize accelerators. In addition, this structuring allows the development of efficient strategies for on-line and off-line fault diagnostics. As a result, the timing system for Linac 2 has so far been operating without fault for three years, and the one for LEAR for more than one year since its final debugging.

  3. A computed tomographic imaging system for experimentation

    NASA Astrophysics Data System (ADS)

    Lu, Yanping; Wang, Jue; Liu, Fenglin; Yu, Honglin

    2008-03-01

    Computed tomography (CT) is a non-invasive imaging technique which is widely applied in medicine for diagnosis and surgical planning, and in industry for non-destructive testing (NDT) and non-destructive evaluation (NDE). It is therefore valuable for college students to understand the fundamentals of CT. In this work, a CT imaging system named CD-50BG with a 50 mm field-of-view has been developed for experimental teaching at colleges. With its translate-rotate scanning mode, the system makes use of a 7.4×10⁸ Bq (20 mCi) 137Cs radioactive source, held in a tungsten alloy shield to contain the radiation and guarantee no harm to the human body, and a single plastic scintillator + photomultiplier detector, which is convenient for counting because of its short light pulses and good single-pulse response. At the same time, image processing software with functions for reconstruction, image processing and 3D visualization has also been developed to process the 16-bit acquired data. The reconstruction time for a 128×128 image is less than 0.1 second. High quality images with 0.8 mm spatial resolution and 2% contrast sensitivity can be obtained. So far in China, more than ten institutions of higher education, including Tsinghua University and Peking University, have already applied the system in elementary teaching.

  4. Requirements development for a patient computing system.

    PubMed Central

    Wald, J. S.; Pedraza, L. A.; Reilly, C. A.; Murphy, M. E.; Kuperman, G. J.

    2001-01-01

    Critical parts of the software development life cycle are concerned with eliciting, understanding, and managing requirements. Though the literature on this subject dates back for several decades, practicing effective requirements development remains a current and challenging area. Some projects flourish with a requirements development process (RDP) that is implicit and informal, but this approach may be overly risky, particularly for large projects that involve multiple individuals, groups, and systems over time. At Partners HealthCare System in Boston, Massachusetts, we have applied a more formal approach for requirements development to the Patient Computing Project. The goal of the project is to create web-based software that connects patients electronically with their physician's offices and has the potential to improve care efficiency and quality. It is a large project, with over 500 function points. Like most technological innovation, the successful introduction of this system requires as much attention to understanding the business needs and workflow details as it does to technical design and implementation. This paper describes our RDP approach, and key business requirements discovered through this process. We believe that a formal RDP is essential, and that informatics as a field must include proficiencies in this area. PMID:11825282

  5. Experimental analysis of computer system dependability

    NASA Technical Reports Server (NTRS)

    Iyer, Ravishankar, K.; Tang, Dong

    1993-01-01

    This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERRARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
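
    Of the statistical techniques listed, importance sampling is the easiest to illustrate: a rare event is estimated by sampling from a distribution shifted toward the event and re-weighting each sample by the likelihood ratio. The toy example below estimates a Gaussian tail probability; it is a generic illustration, not a dependability model from the survey.

        import numpy as np

        rng = np.random.default_rng(0)
        n, threshold, shift = 100_000, 4.0, 4.0

        # Naive Monte Carlo: almost no samples land in the rare-event region.
        naive = (rng.standard_normal(n) > threshold).mean()

        # Importance sampling: draw from N(shift, 1) and weight each sample by
        # the likelihood ratio phi(x) / phi(x - shift) = exp(shift^2/2 - shift*x).
        x = rng.standard_normal(n) + shift
        weights = np.exp(shift**2 / 2.0 - shift * x)
        is_estimate = np.mean((x > threshold) * weights)

        print(f"naive: {naive:.2e}  importance sampling: {is_estimate:.2e}")
        # The true tail probability P(X > 4) is about 3.17e-05.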

  6. Genost: A System for Introductory Computer Science Education with a Focus on Computational Thinking

    NASA Astrophysics Data System (ADS)

    Walliman, Garret

    Computational thinking, the creative thought process behind algorithmic design and programming, is a crucial introductory skill for both computer scientists and the population in general. In this thesis I perform an investigation into introductory computer science education in the United States and find that computational thinking is not effectively taught at either the high school or the college level. To remedy this, I present a new educational system intended to teach computational thinking called Genost. Genost consists of a software tool and a curriculum based on teaching computational thinking through fundamental programming structures and algorithm design. Genost's software design is informed by a review of eight major computer science educational software systems. Genost's curriculum is informed by a review of major literature on computational thinking. In two educational tests of Genost utilizing both college and high school students, Genost was shown to significantly increase computational thinking ability with a large effect size.

  7. 10 CFR 35.657 - Therapy-related computer systems.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Therapy-related computer systems. 35.657 Section 35.657... Units, Teletherapy Units, and Gamma Stereotactic Radiosurgery Units § 35.657 Therapy-related computer... computer systems in accordance with published protocols accepted by nationally recognized bodies. At...

  8. 10 CFR 35.657 - Therapy-related computer systems.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Therapy-related computer systems. 35.657 Section 35.657... Units, Teletherapy Units, and Gamma Stereotactic Radiosurgery Units § 35.657 Therapy-related computer... computer systems in accordance with published protocols accepted by nationally recognized bodies. At...

  9. 10 CFR 35.657 - Therapy-related computer systems.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Therapy-related computer systems. 35.657 Section 35.657... Units, Teletherapy Units, and Gamma Stereotactic Radiosurgery Units § 35.657 Therapy-related computer... computer systems in accordance with published protocols accepted by nationally recognized bodies. At...

  10. 10 CFR 35.657 - Therapy-related computer systems.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Therapy-related computer systems. 35.657 Section 35.657... Units, Teletherapy Units, and Gamma Stereotactic Radiosurgery Units § 35.657 Therapy-related computer... computer systems in accordance with published protocols accepted by nationally recognized bodies. At...

  11. 10 CFR 35.657 - Therapy-related computer systems.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Therapy-related computer systems. 35.657 Section 35.657... Units, Teletherapy Units, and Gamma Stereotactic Radiosurgery Units § 35.657 Therapy-related computer... computer systems in accordance with published protocols accepted by nationally recognized bodies. At...

  12. Search Engine Prototype System Based on Cloud Computing

    NASA Astrophysics Data System (ADS)

    Han, Jinyu; Hu, Min; Sun, Hongwei

    With the development of the Internet, IT support systems need to provide more storage space and faster computing power for Internet applications such as search engines. The emergence of cloud computing can effectively solve these problems. In this paper we present a search engine prototype system based on a cloud computing platform.

  13. New computing systems, future computing environment, and their implications on structural analysis and design

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  14. Distributed computing system with dual independent communications paths between computers and employing split tokens

    NASA Technical Reports Server (NTRS)

    Rasmussen, Robert D. (Inventor); Manning, Robert M. (Inventor); Lewis, Blair F. (Inventor); Bolotin, Gary S. (Inventor); Ward, Richard S. (Inventor)

    1990-01-01

    This is a distributed computing system providing flexible fault tolerance; ease of software design and concurrency specification; and dynamic balancing of loads. The system comprises a plurality of computers each having a first input/output interface and a second input/output interface for interfacing to communications networks, each second input/output interface including a bypass for bypassing the associated computer. A global communications network interconnects the first input/output interfaces, providing each computer the ability to broadcast messages simultaneously to the remainder of the computers. A meshwork communications network interconnects the second input/output interfaces, providing each computer with the ability to establish a communications link with another of the computers, bypassing the remainder of the computers. Each computer is controlled by a resident copy of a common operating system. Communication between respective computers is by means of split tokens, each having a moving first portion which is sent from computer to computer and a resident second portion which is disposed in the memory of at least one of the computers, the location of the second portion being part of the first portion. The split tokens represent both functions to be executed by the computers and data to be employed in the execution of the functions. The first input/output interfaces each include logic for detecting a collision between messages and for terminating the broadcasting of a message, whereby collisions between messages are detected and avoided.
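
    The split-token idea can be pictured as a data structure: a moving half that travels from computer to computer and carries the location of a resident half held in some computer's memory. The sketch below only mirrors the description above; all field names are hypothetical.

        from dataclasses import dataclass

        @dataclass
        class MovingPortion:          # travels between computers
            function: str             # operation the receiving computer executes
            home_node: int            # which computer holds the resident portion
            resident_key: str         # where in that node's memory it lives

        # Per-node memory holding resident portions (the data halves of tokens).
        node_memory = {2: {"matrix-A": b"...payload..."}}

        def handle_token(token: MovingPortion, local_node: int):
            """Execute a token, locating its resident half via the token itself."""
            if token.home_node == local_node:
                payload = node_memory[local_node][token.resident_key]
            else:
                payload = None        # would be fetched over the mesh network
            print(f"node {local_node}: run {token.function!r} on {payload!r}")

        handle_token(MovingPortion("invert", home_node=2, resident_key="matrix-A"), 2)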

  15. Computational studies of nonlinear dispersive plasma systems

    NASA Astrophysics Data System (ADS)

    Qian, Xin

    Plasma systems with dispersive waves are ubiquitous. Dispersive waves have the property that their wave velocity depends on the wave number of the wave. These waves show up in weakly as well as strongly coupled plasmas, and play a significant role in the underlying plasma dynamics. Dispersive waves bring new challenges to the computer simulation of nonlinear phenomena. The goal of this thesis is to discuss two computational studies of plasma phenomena, one drawn from strongly coupled complex or dusty plasmas, and the other from weakly coupled hydrogen plasmas. In the realm of dusty plasmas, we focus on the problem of three-dimensional (3D) Mach cones, which we study by means of Molecular Dynamics (MD) simulations, assuming that the dust particles interact via a Yukawa potential. While laboratory and MD simulations have explored thoroughly the properties of Mach cones in 2D, elucidating the important role of dispersive waves in the formation of multiple cones, the simulations presented in this thesis represent the first 3D MD studies of Mach cones in strongly coupled dusty plasmas. These results have qualitative similarities with experimental observations of 3D Mach cones from the PK-3 Plus project, which studies complex plasmas under microgravity conditions aboard the International Space Station. In the realm of weakly coupled plasmas, we present results on the application of non-oscillatory central schemes to Hall MHD reconnection problems, in which the presence of dispersive whistler waves presents a formidable challenge for numerical algorithms that rely on explicit time-stepping schemes. In particular, we focus on the semi-discrete central formulation of Kurganov and Tadmor (2000), which has the advantage that it allows for larger time steps, and with significantly smaller numerical viscosity, than fully discrete schemes. We implement the Hall MHD equations through the CentPACK software package that implements the Kurganov-Tadmor formulation for a wide range of
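
    The interaction model named here is easy to make concrete. The sketch below advances two particles under a Yukawa (screened Coulomb) potential, U(r) = (Q²/4πε₀) exp(-r/λD)/r, with a velocity-Verlet step; the reduced units and parameter values are made up, and this is the generic shape of such an MD update rather than the thesis code.

        import numpy as np

        LAMBDA_D = 1.0    # screening (Debye) length, reduced units
        COUPLING = 1.0    # Q^2 / (4*pi*eps0), reduced units
        MASS, DT = 1.0, 1e-3

        def yukawa_forces(pos):
            """Pairwise Yukawa forces for an (n, 3) array of positions."""
            f = np.zeros_like(pos)
            for i in range(len(pos)):
                for j in range(i + 1, len(pos)):
                    rvec = pos[i] - pos[j]
                    r = np.linalg.norm(rvec)
                    # F = -dU/dr = COUPLING * exp(-r/L) * (1/r^2 + 1/(L*r))
                    mag = COUPLING * np.exp(-r / LAMBDA_D) * (1/r**2 + 1/(LAMBDA_D*r))
                    f[i] += mag * rvec / r
                    f[j] -= mag * rvec / r
            return f

        pos = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
        vel = np.zeros_like(pos)
        forces = yukawa_forces(pos)
        for _ in range(1000):              # velocity-Verlet integration
            vel += 0.5 * DT * forces / MASS
            pos += DT * vel
            forces = yukawa_forces(pos)
            vel += 0.5 * DT * forces / MASS
        print(np.linalg.norm(pos[0] - pos[1]))   # like charges repel: gap grows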

  16. Computer program and user documentation medical data tape retrieval system

    NASA Technical Reports Server (NTRS)

    Anderson, J.

    1971-01-01

    This volume provides several levels of documentation for the program module of the NASA medical directorate mini-computer storage and retrieval system. A biomedical information system overview describes some of the reasons for the development of the mini-computer storage and retrieval system. It briefly outlines all of the program modules which constitute the system.

  17. Embedded computer systems for control applications in EBR-II

    SciTech Connect

    Carlson, R.B.; Start, S.E.

    1993-03-01

    The purpose of this paper is to describe the embedded computer systems approach taken at Experimental Breeder Reactor II (EBR-II) for non-safety related systems. The hardware and software structures for typical embedded systems are presented. The embedded systems development process is described. Three examples are given which illustrate typical embedded computer applications in EBR-II.

  18. An operating system for future aerospace vehicle computer systems

    NASA Technical Reports Server (NTRS)

    Foudriat, E. C.; Berman, W. J.; Will, R. W.; Bynum, W. L.

    1984-01-01

    The requirements for future aerospace vehicle computer operating systems are examined in this paper. The computer architecture is assumed to be distributed with a local area network connecting the nodes. Each node is assumed to provide a specific functionality. The network provides for communication so that the overall tasks of the vehicle are accomplished. The O/S structure is based upon the concept of objects. The mechanisms for integrating node unique objects with node common objects in order to implement both the autonomy and the cooperation between nodes is developed. The requirements for time critical performance and reliability and recovery are discussed. Time critical performance impacts all parts of the distributed operating system; e.g., its structure, the functional design of its objects, the language structure, etc. Throughout the paper the tradeoffs - concurrency, language structure, object recovery, binding, file structure, communication protocol, programmer freedom, etc. - are considered to arrive at a feasible, maximum performance design. Reliability of the network system is considered. A parallel multipath bus structure is proposed for the control of delivery time for time critical messages. The architecture also supports immediate recovery for the time critical message system after a communication failure.

  19. Biocellion: accelerating computer simulation of multicellular biological system models

    PubMed Central

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-01-01

    Motivation: Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. Results: We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling in the function bodies of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in a soil aggregate as case studies. Availability and implementation: Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. Contact: seunghwa.kang@pnnl.gov PMID:25064572

  1. SRS Computer Animation and Drive Train System

    NASA Technical Reports Server (NTRS)

    Arthun, Daniel; Schachner, Christian

    2001-01-01

    The spinning rocket simulator (SRS) is an ongoing project at Oral Roberts University. The goal of the SRS is to gather crucial data concerning a spinning rocket under thrust for the purpose of analyzing and correcting the coning motion experienced by this type of spacecraft maneuver. The computer animation simulates a virtual, scale model of the component of the SRS that represents the spacecraft itself. This component is known as the virtual spacecraft model (VSM). During actual physical simulation, this component of the SRS will experience a coning motion. The goal of the animation is to cone the VSM within the expected range of coning angles to accurately represent the motion of the actual simulator. The drive system of the SRS is the apparatus that turns the actual simulator. It consists of a drive motor, motor mount and chain to power the simulator into motion. The motor mount is adjustable and rigid for high-torque application. A digital stepper motor controller actuates the main drive motor for linear acceleration. The chain transfers power from the motor to the simulator via sprockets on both ends.

  2. Learning with Computers: Implementation of an Integrated Learning System for Computer Assisted Instruction (CAI).

    ERIC Educational Resources Information Center

    Texas State Dept. of Criminal Justice, Huntsville. Windham School System.

    This publication provides information on implementation of an integrated learning system for computer-assisted instruction (CAI) in adult learning environments. The first of the document's nine chapters is an introduction to computer-delivered instruction that addresses the appropriateness of computers in instruction and types of CAI activities.…

  3. Architecture and grid application of cluster computing system

    NASA Astrophysics Data System (ADS)

    Lv, Yi; Yu, Shuiqin; Mao, Youju

    2004-11-01

    Grid technology has recently attracted increasing attention. It can not only connect all kinds of resources in the network, but also combine them into a transparent computing environment in which customers can realize meta-computing that shares computing resources. Traditional parallel computing systems, such as SMP (symmetric multiprocessor) and MPP (massively parallel processor) machines, use multiple processors in a tightly coupled way to raise computing speed, so their flexibility and scalability are limited; as a result, they cannot meet the requirements of grid technology. In this paper, the architecture of a cluster computing system applied in grid nodes is introduced. It mainly includes the following aspects. First, the network architecture of the cluster computing system in grid nodes is analyzed and designed. Second, it discusses how the cluster computing system realizes distributed computing (including coordinated computing and shared computing) within grid nodes to construct virtual node computers. Last, communication among grid nodes is analyzed; that is, how to present a single system image so that all customer service requests can be met by dispatching them to the grid nodes.

  4. Computer controlled vent and pressurization system

    NASA Technical Reports Server (NTRS)

    Cieslewicz, E. J.

    1975-01-01

    The Centaur space launch vehicle airborne computer, which was primarily used to perform guidance, navigation, and sequencing tasks, was further used to monitor and control inflight pressurization and venting of the cryogenic propellant tanks. Computer software flexibility also provided a failure detection and correction capability necessary to adopt and operate redundant hardware techniques and enhance the overall vehicle reliability.

  5. Computer Human Interaction for Image Information Systems.

    ERIC Educational Resources Information Center

    Beard, David Volk

    1991-01-01

    Presents an approach to developing viable image computer-human interactions (CHI) involving user metaphors for comprehending image data and methods for locating, accessing, and displaying computer images. A medical-image radiology workstation application is used as an example, and feedback and evaluation methods are discussed. (41 references) (LRW)

  6. PLAID- A COMPUTER AIDED DESIGN SYSTEM

    NASA Technical Reports Server (NTRS)

    Brown, J. W.

    1994-01-01

    PLAID is a three-dimensional Computer Aided Design (CAD) system which enables the user to interactively construct, manipulate, and display sets of highly complex geometric models. PLAID was initially developed by NASA to assist in the design of Space Shuttle crewstation panels, and the detection of payload object collisions. It has evolved into a more general program for convenient use in many engineering applications. Special effort was made to incorporate CAD techniques and features which minimize the users workload in designing and managing PLAID models. PLAID consists of three major modules: the Primitive Object Generator (BUILD), the Composite Object Generator (COG), and the DISPLAY Processor. The BUILD module provides a means of constructing simple geometric objects called primitives. The primitives are created from polygons which are defined either explicitly by vertex coordinates, or graphically by use of terminal crosshairs or a digitizer. Solid objects are constructed by combining, rotating, or translating the polygons. Corner rounding, hole punching, milling, and contouring are special features available in BUILD. The COG module hierarchically organizes and manipulates primitives and other previously defined COG objects to form complex assemblies. The composite object is constructed by applying transformations to simpler objects. The transformations which can be applied are scalings, rotations, and translations. These transformations may be defined explicitly or defined graphically using the interactive COG commands. The DISPLAY module enables the user to view COG assemblies from arbitrary viewpoints (inside or outside the object) both in wireframe and hidden line renderings. The PLAID projection of a three-dimensional object can be either orthographic or with perspective. A conflict analysis option enables detection of spatial conflicts or collisions. DISPLAY provides camera functions to simulate a view of the model through different lenses. Other
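
    The COG module's hierarchical composition of scalings, rotations, and translations is the standard homogeneous-transform pattern; the NumPy sketch below shows the idea. The function names and the example chain are illustrative only, not PLAID's actual commands.

        import numpy as np

        def scaling(sx, sy, sz):
            return np.diag([sx, sy, sz, 1.0])

        def rotation_z(theta):
            c, s = np.cos(theta), np.sin(theta)
            return np.array([[c, -s, 0, 0],
                             [s,  c, 0, 0],
                             [0,  0, 1, 0],
                             [0,  0, 0, 1.0]])

        def translation(tx, ty, tz):
            m = np.eye(4)
            m[:3, 3] = [tx, ty, tz]
            return m

        # A composite object applies a transform chain to a simpler object's
        # vertices, right to left: scale first, then rotate, then translate.
        vertices = np.array([[0., 0, 0, 1], [1, 0, 0, 1], [0, 1, 0, 1]]).T
        composite = translation(5, 0, 0) @ rotation_z(np.pi / 2) @ scaling(2, 2, 2)
        print((composite @ vertices)[:3].T)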

  7. The hack attack - Increasing computer system awareness of vulnerability threats

    NASA Technical Reports Server (NTRS)

    Quann, John; Belford, Peter

    1987-01-01

    The paper discusses the issue of electronic vulnerability of computer based systems supporting NASA Goddard Space Flight Center (GSFC) by unauthorized users. To test the security of the system and increase security awareness, NYMA, Inc. employed computer 'hackers' to attempt to infiltrate the system(s) under controlled conditions. Penetration procedures, methods, and descriptions are detailed in the paper. The procedure increased the security consciousness of GSFC management to the electronic vulnerability of the system(s).

  8. 14 CFR 417.123 - Computing systems and software.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) Software used for fault detection in safety-critical computer hardware or software. (4) Software that... Section 417.123 Computing systems and software. (a) A launch operator must document a system safety process that identifies...

  9. A Computer-Based System for Studies in Learning.

    ERIC Educational Resources Information Center

    Gentner, Donald R.; And Others

    A computer-based system, called the FLOW system, was used in experimental studies of human learning. The student learns a simple computer language from printed instructions and can run his programs interactively on the FLOW system. An automated tutor simulates a human tutor who watches over the student and gives help when the student has…

  10. Overview of ASC Capability Computing System Governance Model

    SciTech Connect

    Doebling, Scott W.

    2012-07-11

    This document contains a description of the Advanced Simulation and Computing Program's Capability Computing System Governance Model. Objectives of the Governance Model are to ensure that the capability system resources are allocated on a priority-driven basis according to the Program requirements; and to utilize ASC Capability Systems for the large capability jobs for which they were designed and procured.

  11. Report on the Total System Computer Program for Medical Libraries.

    ERIC Educational Resources Information Center

    Divett, Robert T.; Jones, W. Wayne

    The objective of this project was to develop an integrated computer program for the total operations of a medical library including acquisitions, cataloging, circulation, reference, a computer catalog, serials controls, and current awareness services. The report describes two systems approaches: the batch system and the terminal system. The batch…

  12. Computer Output Laser Disk (COLD) Systems--COM Replacement Units.

    ERIC Educational Resources Information Center

    Bolnick, Franklin I.

    1993-01-01

    Explains the COLD (Computer Output Laser Disk) system and describes current applications. Use of the COLD system to replace COM (Computer Output Microfilm) is discussed; advantages and disadvantages of the COLD system are considered; optical disks OD-WORM (Optical Disk-Write Once Read Many) versus CD-ROM are compared; and equipment and software…

  13. Architecture and applications of the HEP multiprocessor computer system

    SciTech Connect

    Smith, B.J.; Fink, D.J.

    1982-01-01

    The HEP computer system is a large scale scientific parallel computer employing shared resource MIMD architecture. The hardware and software facilities provided by the system are described, and techniques found to be useful in programming the system are also discussed. 3 references.

  14. A Very Tentative Computer System Model. Occasional Paper No. 3.

    ERIC Educational Resources Information Center

    Breslow, Martin P.

    The developmental paper, one of a series written as the Management Information System for Occupational Education (MISOE) was conceptualized, is a first attempt to picture the computer system necessary to carry out the project's goals. It describes the basic structure and the anticipated strategies of development of the computer system to be used.…

  15. Open Source Live Distributions for Computer Forensics

    NASA Astrophysics Data System (ADS)

    Giustini, Giancarlo; Andreolini, Mauro; Colajanni, Michele

    Current distributions of open source forensic software provide digital investigators with a large set of heterogeneous tools. Their use is not always focused on the target and requires high technical expertise. We present a new GNU/Linux live distribution, named CAINE (Computer Aided INvestigative Environment), that contains a collection of tools wrapped up into a user-friendly environment. The CAINE forensic framework introduces important novel features aimed at filling the interoperability gap across different forensic tools. Moreover, it provides a homogeneous graphical interface that guides digital investigators during the acquisition and analysis of electronic evidence, and it offers a semi-automatic mechanism for the creation of the final report.

  16. Top 10 Threats to Computer Systems Include Professors and Students

    ERIC Educational Resources Information Center

    Young, Jeffrey R.

    2008-01-01

    User awareness is growing in importance when it comes to computer security. Not long ago, keeping college networks safe from cyberattackers mainly involved making sure computers around campus had the latest software patches. New computer worms or viruses would pop up, taking advantage of some digital hole in the Windows operating system or in…

  17. Development of a Computer-Assisted Behavioral Skill Training System.

    ERIC Educational Resources Information Center

    Waller, Wayne

    1983-01-01

    As part of an iatrosedation program, a computer-assisted system was developed to enhance evaluation and feedback processes central to course design. Computer-controlled audio and video playback devices and computer technology are used to record and play back physician-patient interviews and print a record of the interview evaluation session. (MSE)

  18. Computer Assisted Instruction in Navy Technical Training Using a Small Dedicated Computer System: Final Report.

    ERIC Educational Resources Information Center

    Ford, John D.; And Others

    An investigation was made of the feasibility of Computer Assisted Instruction (CAI) for Navy technical training. The computer system used was the IBM 1500 system. Five CAI modules were developed which could replace 92 hours of the class curriculum. CAI provided very effective and efficient instruction. CAI students scored higher than…

  19. A Fourth Generation Distance Education System: Integrating Computer-Assisted Learning and Computer Conferencing.

    ERIC Educational Resources Information Center

    Lauzon, Allan C.; Moore, George A. B.

    1989-01-01

    Reviews the literature on Keller's Personalized System of Instruction (PSI), computer-assisted learning (CAL), computer conferencing (CC), and forms of instruction, then discusses how they can be integrated into a delivery system to enhance distance learning. Asynchronous individualized instruction and group instruction are also discussed. (28…

  20. Distributed computing environments for future space control systems

    NASA Technical Reports Server (NTRS)

    Viallefont, Pierre

    1993-01-01

    The aim of this paper is to present the results of a CNES research project on distributed computing systems. The purpose of this research was to study the impact of the use of new computer technologies in the design and development of future space applications. The first part of this study was a state-of-the-art review of distributed computing systems. One of the interesting ideas arising from this review is the concept of a 'virtual computer' allowing the distributed hardware architecture to be hidden from a software application. The 'virtual computer' can improve system performance by adapting the best architecture (addition of computers) to the software application without having to modify its source code. This concept can also decrease the cost and obsolescence of the hardware architecture. In order to verify the feasibility of the 'virtual computer' concept, a prototype representative of a distributed space application is being developed independently of the hardware architecture.

  1. Functional requirements for gas characterization system computer software

    SciTech Connect

    Tate, D.D.

    1996-01-01

    This document provides the Functional Requirements for the Computer Software operating the Gas Characterization System (GCS), which monitors the combustible gases in the vapor space of selected tanks. Necessary computer functions are defined to support design, testing, operation, and change control. The GCS requires several individual computers to address the control and data acquisition functions of instruments and sensors. These computers are networked for communication and must multi-task to accommodate operation in parallel.

  2. Safety Metrics for Human-Computer Controlled Systems

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy G; Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  3. Computer Generated Hologram System for Wavefront Measurement System Calibration

    NASA Technical Reports Server (NTRS)

    Olczak, Gene

    2011-01-01

    Computer Generated Holograms (CGHs) have been used for some time to calibrate interferometers that require nulling optics. A typical scenario is the testing of aspheric surfaces with an interferometer placed near the paraxial center of curvature. Existing CGH technology suffers from a reduced capacity to calibrate middle and high spatial frequencies. The root cause of this shortcoming is as follows: the CGH is not placed at an image conjugate of the asphere due to limitations imposed by the geometry of the test and the allowable size of the CGH. This innovation provides a calibration system where the imaging properties in calibration can be made comparable to the test configuration. Thus, if the test is designed to have good imaging properties, then middle and high spatial frequency errors in the test system can be well calibrated. The improved imaging properties are provided by a rudimentary auxiliary optic as part of the calibration system. The auxiliary optic is simple to characterize and align to the CGH. Use of the auxiliary optic also reduces the size of the CGH required for calibration and the density of the lines required for the CGH. The resulting CGH is less expensive than the existing technology and has reduced write error and alignment error sensitivities. This CGH system is suitable for any kind of calibration using an interferometer when high spatial resolution is required. It is especially well suited for tests that include segmented optical components or large apertures.

  4. A comparison of queueing, cluster and distributed computing systems

    NASA Technical Reports Server (NTRS)

    Kaplan, Joseph A.; Nelson, Michael L.

    1993-01-01

    Using workstation clusters for distributed computing has become popular with the proliferation of inexpensive, powerful workstations. Workstation clusters offer both a cost effective alternative to batch processing and an easy entry into parallel computing. However, a number of workstations on a network does not constitute a cluster. Cluster management software is necessary to harness the collective computing power. A variety of cluster management and queuing systems are compared: Distributed Queueing Systems (DQS), Condor, Load Leveler, Load Balancer, Load Sharing Facility (LSF - formerly Utopia), Distributed Job Manager (DJM), Computing in Distributed Networked Environments (CODINE), and NQS/Exec. The systems differ in their design philosophy and implementation. Based on published reports on the different systems and conversations with the systems' developers and vendors, a comparison of the systems is made on the integral issues of clustered computing.

  5. Research on computer virus database management system

    NASA Astrophysics Data System (ADS)

    Qi, Guoquan

    2011-12-01

    The growing proliferation of computer viruses has become a lethal threat to network information security and a focus of security research. New viruses keep emerging, the number of viruses is growing, and virus classification is increasingly complex. Virus naming cannot be unified because agencies capture samples at different times. Although each agency has its own virus database, communication between agencies is lacking, virus information is often incomplete, or only a small number of samples are described. This paper introduces the current state of virus database construction at home and abroad, analyzes how to standardize and completely describe virus characteristics, and then gives a computer virus database design scheme that provides information integrity, storage security, and manageability.

  6. National electronic medical records integration on cloud computing system.

    PubMed

    Mirza, Hebah; El-Masri, Samir

    2013-01-01

    Few healthcare providers have an advanced level of Electronic Medical Record (EMR) adoption; others have a low level, and most have no EMR at all. Cloud computing is a new emerging technology that has been used in other industries with great success. Despite its great features, cloud computing has not yet been widely utilized in the healthcare industry. This study presents an innovative healthcare cloud computing system for integrating Electronic Health Records (EHR). The proposed system applies cloud computing technology to EHR systems to present a comprehensive, integrated EHR environment. PMID:23920993

  7. Methods for operating parallel computing systems employing sequenced communications

    DOEpatents

    Benner, R.E.; Gustafson, J.L.; Montry, G.R.

    1999-08-10

    A parallel computing system and method are disclosed having improved performance where a program is concurrently run on a plurality of nodes for reducing total processing time, each node having a processor, a memory, and a predetermined number of communication channels connected to the node and independently connected directly to other nodes. The present invention improves performance of the parallel computing system by providing a system which can provide efficient communication between the processors and between the system and input and output devices. A method is also disclosed which can locate defective nodes with the computing system. 15 figs.

  8. Methods for operating parallel computing systems employing sequenced communications

    DOEpatents

    Benner, Robert E.; Gustafson, John L.; Montry, Gary R.

    1999-01-01

    A parallel computing system and method having improved performance where a program is concurrently run on a plurality of nodes for reducing total processing time, each node having a processor, a memory, and a predetermined number of communication channels connected to the node and independently connected directly to other nodes. The present invention improves performance of the parallel computing system by providing a system which can provide efficient communication between the processors and between the system and input and output devices. A method is also disclosed which can locate defective nodes with the computing system.
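
    As a rough illustration of how direct node-to-node channels also support locating defective nodes, the sketch below has every live node probe its directly connected neighbors and flags any node that no neighbor can reach. This is an invented simulation of the general idea, not the patented method; the topology and function names are hypothetical.

        def probe_nodes(channels, alive):
            """channels: {node_id: [neighbor ids]}; alive: set of healthy nodes."""
            reports = {}
            for node, neighbors in channels.items():
                if node not in alive:
                    continue  # a dead node cannot probe its neighbors
                for nbr in neighbors:
                    ok = nbr in alive  # stands in for a real ping over the channel
                    reports.setdefault(nbr, []).append(ok)
            # A node reported unreachable by every live neighbor is flagged defective.
            return {n for n, oks in reports.items() if not any(oks)}

        channels = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
        print(probe_nodes(channels, alive={0, 1, 3}))  # -> {2}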

  9. Surveillance Analysis Computer System (SACS) software requirements specification (SRS)

    SciTech Connect

    Glasscock, J.A.; Flanagan, M.J.

    1995-09-01

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) Database, an Impact Level 3Q system. The purpose is to provide the customer and the performing organization with the requirements for the SACS Project.

  10. Computer optimization of reactor-thermoelectric space power systems

    NASA Technical Reports Server (NTRS)

    Maag, W. L.; Finnegan, P. M.; Fishbach, L. H.

    1973-01-01

    A computer simulation and optimization code that has been developed for nuclear space power systems is described. The results of using this code to analyze two reactor-thermoelectric systems are presented.

  11. A CLIPS based personal computer hardware diagnostic system

    NASA Technical Reports Server (NTRS)

    Whitson, George M.

    1991-01-01

    Often the person designated to repair personal computers has little or no knowledge of how to repair a computer. Described here is a simple expert system to aid these inexperienced repair people. The first component of the system leads the repair person through a number of simple system checks, such as making sure that all cables are tight and that the DIP switches are set correctly. The second component of the system assists the repair person in evaluating error codes generated by the computer. The final component of the system applies a large knowledge base to attempt to identify the component of the personal computer that is malfunctioning. We have implemented and tested our design with a full system to diagnose problems for an IBM compatible system based on the 8088 chip. In our tests, the inexperienced repair people found the system very useful in diagnosing hardware problems.
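
    The three components map naturally onto a data-driven rule structure. The Python sketch below mimics that flow with a handful of classic IBM PC POST error codes; it illustrates the staged approach only and is not the actual CLIPS knowledge base.

        BASIC_CHECKS = [
            "Verify all cables are seated firmly.",
            "Verify DIP switch settings against the system manual.",
        ]

        ERROR_CODE_RULES = {  # classic IBM PC POST codes, advice text invented
            "301": "Keyboard error: check the keyboard connector.",
            "601": "Floppy drive error: check drive cabling and controller.",
            "1701": "Hard disk error: check drive power and controller card.",
        }

        def diagnose(error_code=None):
            steps = list(BASIC_CHECKS)                  # stage 1: simple system checks
            if error_code in ERROR_CODE_RULES:          # stage 2: error-code evaluation
                steps.append(ERROR_CODE_RULES[error_code])
            else:                                       # stage 3: deeper knowledge base
                steps.append("Run component-level isolation tests.")
            return steps

        for step in diagnose("1701"):
            print(step)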

  12. Data systems and computer science programs: Overview

    NASA Technical Reports Server (NTRS)

    Smith, Paul H.; Hunter, Paul

    1991-01-01

    An external review of the Integrated Technology Plan for the Civil Space Program is presented. The topics are presented in viewgraph form and include the following: onboard memory and storage technology; advanced flight computers; special purpose flight processors; onboard networking and testbeds; information archive, access, and retrieval; visualization; neural networks; software engineering; and flight control and operations.

  13. Cloud Computing Based E-Learning System

    ERIC Educational Resources Information Center

    Al-Zoube, Mohammed; El-Seoud, Samir Abou; Wyne, Mudasser F.

    2010-01-01

    Cloud computing technologies although in their early stages, have managed to change the way applications are going to be developed and accessed. These technologies are aimed at running applications as services over the internet on a flexible infrastructure. Microsoft office applications, such as word processing, excel spreadsheet, access database…

  14. A Computer-Based Dietary Counseling System.

    ERIC Educational Resources Information Center

    Slack, Warner V.; And Others

    1976-01-01

    The preliminary trial of a program in which principles of patient-computer dialogue have been applied to dietary counseling is described. The program was designed to obtain historical information from overweight patients and to provide instruction and guidance regarding dietary behavior. Beginning with a teaching sequence, 25 non-overweight…

  15. Computers and Information Systems in Education.

    ERIC Educational Resources Information Center

    Goodlad, John I.; And Others

    In an effort to increase the awareness of educators about the potential of electronic data processing (EDP) in education and acquaint the EDP specialists with current educational problems, this book discusses the routine uses of EDP for business and student accounting, as well as its innovative uses in instruction. A description of computers and…

  16. Space data systems: Advanced flight computers

    NASA Technical Reports Server (NTRS)

    Benz, Harry F.

    1991-01-01

    The technical objectives are to develop high-performance, space-qualifiable, onboard computing, storage, and networking technologies. The topics are presented in viewgraph form and include the following: technology challenges; state-of-the-art assessment; program description; relationship to external programs; and cooperation and coordination effort.

  17. Computer system for forecasting surgery on the eye muscles

    NASA Astrophysics Data System (ADS)

    Avrunin, Oleg G.; Kukharenko, Dmitriy V.; Romanyuk, Sergii O.; Kalizhanova, Aliya; Toygozhinova, Aynur; Gromaszek, Konrad

    2015-12-01

    For successful surgery on the eye muscles, it is recommended to use a computer system for preoperative planning of the surgical correction of strabismus. By using the computer system during surgery planning, the ophthalmic surgeon will be able to choose the best surgical treatment and surgery dosage for a particular patient.

  18. Students Develop Real-World Web and Pervasive Computing Systems.

    ERIC Educational Resources Information Center

    Tappert, Charles C.

    In the academic year 2001-2002, Pace University (New York) Computer Science and Information Systems (CSIS) students developed real-world Web and pervasive computing systems for actual customers. This paper describes the general use of team projects in CSIS at Pace University, the real-world projects from this academic year, the benefits of…

  19. 21 CFR 892.1200 - Emission computed tomography system.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Section 892.1200 (Food and Drugs; Food and Drug Administration, Department of Health and Human Services; Medical Devices; Radiology Devices; Diagnostic Devices): Emission computed tomography system. (a) Identification. An...

  20. Computer aided analysis and optimization of mechanical system dynamics

    NASA Technical Reports Server (NTRS)

    Haug, E. J.

    1984-01-01

    The purpose is to outline a computational approach to spatial dynamics of mechanical systems that substantially enlarges the scope of consideration to include flexible bodies, feedback control, hydraulics, and related interdisciplinary effects. Design sensitivity analysis and optimization is the ultimate goal. The approach to computer generation and solution of the system dynamic equations and graphical methods for creating animations as output is outlined.

  1. Software For Computer-Aided Design Of Control Systems

    NASA Technical Reports Server (NTRS)

    Wette, Matthew

    1994-01-01

    Computer Aided Engineering System (CAESY) software developed to provide means to evaluate methods for dealing with users' needs in computer-aided design of control systems. Interpreter program for performing engineering calculations. Incorporates features of both Ada and MATLAB. Designed to be flexible and powerful. Includes internally defined functions, procedures and provides for definition of functions and procedures by user. Written in C language.

  2. CART V: recent advancements in computer-aided camouflage assessment

    NASA Astrophysics Data System (ADS)

    Müller, Thomas; Müller, Markus

    2011-05-01

    In order to facilitate systematic, computer-aided improvements of camouflage and concealment assessment methods, the software system CART (Camouflage Assessment in Real-Time) was built for the camouflage assessment of objects in multispectral image sequences (see contributions to SPIE 2007-2010 [1], [2], [3], [4]). It comprises a semi-automatic marking of target objects (ground truth generation) including their propagation over the image sequence and the evaluation via user-defined feature extractors, as well as methods to assess the object's movement conspicuity. In this fifth part of an annual series at the SPIE conference in Orlando, this paper presents the enhancements over the recent year and addresses the camouflage assessment of static and moving objects in multispectral image data that can show noise or image artefacts. The presented methods explore the correlations between image processing and camouflage assessment. A novel algorithm is presented, based on template matching, to assess the structural inconspicuity of an object objectively and quantitatively. The results can easily be combined with an MTI (moving target indication) based movement conspicuity assessment function in order to explore the influence of object movement on a camouflage effect in different environments. As the results show, the presented methods provide a significant benefit in the field of camouflage assessment.
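
    A template-matching conspicuity measure in the spirit of the one described can be sketched with OpenCV: correlate the object patch against its surroundings, and call the object structurally inconspicuous when the background matches it well. The masking and scoring details below are assumptions for illustration, not CART's algorithm.

        import cv2
        import numpy as np

        def structural_conspicuity(image_gray, object_bbox):
            """Higher score = object patch matches its surroundings poorly."""
            x, y, w, h = object_bbox
            template = image_gray[y:y + h, x:x + w]
            # Normalized cross-correlation of the object patch over the image.
            result = cv2.matchTemplate(image_gray, template, cv2.TM_CCOEFF_NORMED)
            # Mask out positions overlapping the object itself.
            result[max(0, y - h):y + h, max(0, x - w):x + w] = -1.0
            best_background_match = float(result.max())
            return 1.0 - best_background_match  # poor background match -> conspicuous

        img = np.random.randint(0, 255, (200, 200), dtype=np.uint8)
        print(structural_conspicuity(img, (50, 60, 24, 24)))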

  3. Impact of new computing systems on computational mechanics and flight-vehicle structures technology

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Storaasli, O. O.; Fulton, R. E.

    1984-01-01

    Advances in computer technology which may have an impact on computational mechanics and flight vehicle structures technology were reviewed. The characteristics of supersystems, highly parallel systems, and small systems are summarized. The interrelations of numerical algorithms and software with parallel architectures are discussed. A scenario for future hardware/software environment and engineering analysis systems is presented. Research areas with potential for improving the effectiveness of analysis methods in the new environment are identified.

  4. On the writing of programming systems for spacecraft computers.

    NASA Technical Reports Server (NTRS)

    Mathur, F. P.; Rohr, J. A.

    1972-01-01

    Consideration of the systems designed to generate programs for the increasingly complex digital computers being used on board unmanned deep-space probes. Such programming systems must accommodate the special-purpose features incorporated in the hardware. The use of higher-level language facilities in the programming system can significantly simplify the task. Computers for Mariner and for the Outer Planets Grand Tour are briefly described, as well as their programming systems. Aspects of the higher level languages are considered.

  5. Computer-Assisted Monitoring Of A Complex System

    NASA Technical Reports Server (NTRS)

    Beil, Bob J.; Mickelson, Eric M.; Sterritt, John M.; Costantino, Rob W.; Houvener, Bob C.; Super, Mike A.

    1995-01-01

    Propulsion System Advisor (PSA) computer-based system assists engineers and technicians in analyzing masses of sensory data indicative of operating conditions of space shuttle propulsion system during pre-launch and launch activities. Designed solely for monitoring; does not perform any control functions. Although PSA developed for highly specialized application, serves as prototype of noncontrolling, computer-based subsystems for monitoring other complex systems like electric-power-distribution networks and factories.

  6. Computer-aided design of flight control systems

    NASA Technical Reports Server (NTRS)

    Stengel, Robert F.; Sircar, Subrata

    1991-01-01

    A computer program is presented for facilitating the development and assessment of flight control systems, and application to a control design is discussed. The program is a computer-aided control-system design program based on direct digital synthesis of a proportional-integral-filter controller with scheduled linear-quadratic-Gaussian gains and command generator tracking of pilot inputs. The FlightCAD system concentrates on aircraft dynamics, flight-control systems, stability and performance, and has practical engineering applications.

  7. Architectural issues in fault-tolerant, secure computing systems

    SciTech Connect

    Joseph, M.K.

    1988-01-01

    This dissertation explores several facets of the applicability of fault-tolerance techniques to secure computer design, these being: (1) how fault-tolerance techniques can be used on unsolved problems in computer security (e.g., computer viruses and denial-of-service); (2) how fault-tolerance techniques can be used to support classical computer-security mechanisms in the presence of accidental and deliberate faults; and (3) the problems involved in designing a fault-tolerant, secure computer system (e.g., how computer security can degrade along with both the computational and fault-tolerance capabilities of a computer system). The approach taken in this research is almost as important as its results. It is different from current computer-security research in that a design paradigm for fault-tolerant computer design is used. This led to an extensive fault and error classification of many typical security threats. Throughout this work, a fault-tolerance perspective is taken. However, the author did not ignore basic computer-security technology. For some problems he investigated how to support and extend basic security mechanisms (e.g., the trusted computing base), instead of trying to achieve the same result with purely fault-tolerance techniques.

  8. Evaluation of computer-based ultrasonic inservice inspection systems

    SciTech Connect

    Harris, R.V. Jr.; Angel, L.J.; Doctor, S.R.; Park, W.R.; Schuster, G.J.; Taylor, T.T.

    1994-03-01

    This report presents the principles, practices, terminology, and technology of computer-based ultrasonic testing for inservice inspection (UT/ISI) of nuclear power plants, with extensive use of drawings, diagrams, and UT images. The presentation is technical but assumes limited specific knowledge of ultrasonics or computers. The report is divided into 9 sections covering conventional UT, computer-based UT, and evaluation methodology. Conventional UT topics include coordinate axes, scanning, instrument operation, RF and video signals, and A-, B-, and C-scans. Computer-based topics include sampling, digitization, signal analysis, image presentation, SAFT, ultrasonic holography, transducer arrays, and data interpretation. An evaluation methodology for computer-based UT/ISI systems is presented, including questions, detailed procedures, and test block designs. Brief evaluations of several computer-based UT/ISI systems are given; supplementary volumes will provide detailed evaluations of selected systems.

  9. Software fault tolerance in computer operating systems

    NASA Technical Reports Server (NTRS)

    Iyer, Ravishankar K.; Lee, Inhwan

    1994-01-01

    This chapter provides data and analysis of the dependability and fault tolerance for three operating systems: the Tandem/GUARDIAN fault-tolerant system, the VAX/VMS distributed system, and the IBM/MVS system. Based on measurements from these systems, basic software error characteristics are investigated. Fault tolerance in operating systems resulting from the use of process pairs and recovery routines is evaluated. Two levels of models are developed to analyze error and recovery processes inside an operating system and interactions among multiple instances of an operating system running in a distributed environment. The measurements show that the use of process pairs in Tandem systems, which was originally intended for tolerating hardware faults, allows the system to tolerate about 70% of defects in system software that result in processor failures. The loose coupling between processors which results in the backup execution (the processor state and the sequence of events occurring) being different from the original execution is a major reason for the measured software fault tolerance. The IBM/MVS system fault tolerance almost doubles when recovery routines are provided, in comparison to the case in which no recovery routines are available. However, even when recovery routines are provided, there is almost a 50% chance of system failure when critical system jobs are involved.
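
    The process-pair mechanism measured above can be sketched in miniature: a backup takes over from a checkpointed state when the primary fails. Real Tandem process pairs checkpoint state between processors; this single-machine Python sketch, with an invented workload and failure point, only illustrates the failure-detection and takeover logic.

        import multiprocessing as mp
        import time

        def worker(role, checkpoint):
            start = checkpoint.value          # resume from the last checkpointed step
            for step in range(start, 5):
                checkpoint.value = step       # checkpoint progress to shared state
                print(f"{role} processing step {step}")
                time.sleep(0.1)
                if role == "primary" and step == 2:
                    raise SystemExit(1)       # simulate a primary failure

        if __name__ == "__main__":
            checkpoint = mp.Value("i", 0)
            primary = mp.Process(target=worker, args=("primary", checkpoint))
            primary.start()
            primary.join()
            if primary.exitcode != 0:         # failure detected: backup takes over
                backup = mp.Process(target=worker, args=("backup", checkpoint))
                backup.start()
                backup.join()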

  10. Computer graphics application in the engineering design integration system

    NASA Technical Reports Server (NTRS)

    Glatt, C. R.; Abel, R. W.; Hirsch, G. N.; Alford, G. E.; Colquitt, W. N.; Stewart, W. A.

    1975-01-01

    The computer graphics aspect of the Engineering Design Integration (EDIN) system and its application to design problems were discussed. Three basic types of computer graphics may be used with the EDIN system for the evaluation of aerospace vehicle preliminary designs: offline graphics systems using vellum-inking or photographic processes, online graphics systems characterized by direct coupled low cost storage tube terminals with limited interactive capabilities, and a minicomputer based refresh terminal offering highly interactive capabilities. The offline systems are characterized by high quality (resolution better than 0.254 mm) and slow turnaround (one to four days). The online systems are characterized by low cost, instant visualization of the computer results, slow line speed (300 BAUD), poor hard copy, and the early limitations on vector graphic input capabilities. The recent acquisition of the Adage 330 Graphic Display system has greatly enhanced the potential for interactive computer aided design.

  11. Middleware in Modern High Performance Computing System Architectures

    SciTech Connect

    Engelmann, Christian; Ong, Hong Hoe; Scott, Stephen L

    2007-01-01

    A recent trend in modern high performance computing (HPC) system architectures employs "lean" compute nodes running a lightweight operating system (OS). Certain parts of the OS as well as other system software services are moved to service nodes in order to increase performance and scalability. This paper examines the impact of this HPC system architecture trend on HPC "middleware" software solutions, which traditionally equip HPC systems with advanced features, such as parallel and distributed programming models, appropriate system resource management mechanisms, remote application steering and user interaction techniques. Since the approach of keeping the compute node software stack small and simple is orthogonal to the middleware concept of adding missing OS features between OS and application, the role and architecture of middleware in modern HPC systems needs to be revisited. The result is a paradigm shift in HPC middleware design, where single middleware services are moved to service nodes, while runtime environments (RTEs) continue to reside on compute nodes.

  12. [Computer-assisted system for interstitial hyperthermia].

    PubMed

    Kneschaurek, P; Weisser, M

    1987-03-01

    The combination of interstitial radiotherapy and interstitial hyperthermia is more promising in the treatment of tumors than either of these methods alone. The unit developed by us uses the afterloading needles for heating the tumor tissue with ohmic current and for controlling the distribution of temperature in the target volume. Up to twelve needles are supplied by one commutator, with the RF current controlled by the computer. The temperature is measured by three thermistors per needle, which are arranged at an axial distance of 2 cm each. The linearization of the thermistor characteristics and the control of the commutator and RF generator are performed by the computer over an interface constructed by us. In order to achieve a homogeneous distribution of temperature in the target volume and to avoid hot spots, we have examined several needle configurations by measurements in a homogeneous phantom. PMID:3563878

  13. TRL Computer System User’s Guide

    SciTech Connect

    Engel, David W.; Dalton, Angela C.

    2014-01-31

    We have developed a wiki-based graphical user-interface system that implements our technology readiness level (TRL) uncertainty models. This document contains the instructions for using this wiki-based system.

  14. Computer Sciences and Data Systems, volume 1

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Topics addressed include: software engineering; university grants; institutes; concurrent processing; sparse distributed memory; distributed operating systems; intelligent data management processes; expert system for image analysis; fault tolerant software; and architecture research.

  15. An Annotated and Cross-Referenced Bibliography on Computer Security and Access Control in Computer Systems.

    ERIC Educational Resources Information Center

    Bergart, Jeffrey G.; And Others

    This paper represents a careful study of published works on computer security and access control in computer systems. The study includes a selective annotated bibliography of some eighty-five important published results in the field and, based on these papers, analyzes the state of the art. In annotating these works, the authors try to be…

  16. Avoiding pitfalls in simulating real-time computer systems

    NASA Technical Reports Server (NTRS)

    Smith, R. S.

    1984-01-01

    The software simulation of a computer target system on a computer host system, known as an interpretive computer simulator (ICS), functionally models and implements the action of the target hardware. For an ICS to function as efficiently as possible and to avoid certain pitfalls in designing an ICS, it is important that the details of the hardware architectural design of both the target and the host computers be known. This paper discusses both host selection considerations and ICS design features that, without proper consideration, could make the resulting ICS too slow to use or too costly to maintain and expand.

  17. Comprehensive automatic assessment of retinal vascular abnormalities for computer-assisted retinopathy grading.

    PubMed

    Joshi, Vinayak; Agurto, Carla; VanNess, Richard; Nemeth, Sheila; Soliz, Peter; Barriga, Simon

    2014-01-01

    One of the most important signs of systemic disease that presents on the retina is vascular abnormality, such as in hypertensive retinopathy. Manual analysis of fundus images by human readers is qualitative and lacks accuracy, consistency, and repeatability. Present semi-automatic methods for vascular evaluation are reported to increase accuracy and reduce reader variability, but require extensive reader interaction, thus limiting the efficiency gained from software assistance. Automation thus holds a twofold promise: first, decreased variability with increased accuracy, and second, increased efficiency. In this paper we propose fully automated software as a second-reader system for comprehensive assessment of retinal vasculature, which aids the readers in the quantitative characterization of vessel abnormalities in fundus images. This system provides the reader with objective measures of vascular morphology such as tortuosity and branching angles, as well as highlights of areas with abnormalities such as artery-venous nicking, copper and silver wiring, and retinal emboli, in order for the reader to make a final screening decision. To test the efficacy of our system, we evaluated the change in performance of a newly certified retinal reader when grading a set of 40 color fundus images with and without the assistance of the software. The results demonstrated an improvement in the reader's performance with the software assistance, in terms of accuracy of detection of vessel abnormalities, determination of retinopathy, and reading time. This system enables the reader to make computer-assisted vasculature assessments with high accuracy and consistency, at a reduced reading time. PMID:25571442
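
    Two of the objective morphology measures mentioned, tortuosity and branching angle, have standard formulations that can be sketched directly. The arc-to-chord tortuosity index below is the common textbook definition and may differ from the exact metrics this system computes.

        import numpy as np

        def tortuosity(centerline):
            """centerline: (N, 2) array of ordered vessel centerline points."""
            segs = np.diff(centerline, axis=0)
            arc_length = np.sum(np.linalg.norm(segs, axis=1))
            chord_length = np.linalg.norm(centerline[-1] - centerline[0])
            return arc_length / chord_length  # 1.0 for a perfectly straight vessel

        def branching_angle(junction, end_a, end_b):
            """Angle in degrees between two daughter vessels leaving a junction."""
            va, vb = end_a - junction, end_b - junction
            cos = np.dot(va, vb) / (np.linalg.norm(va) * np.linalg.norm(vb))
            return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

        pts = np.array([[0, 0], [1, 0.4], [2, -0.3], [3, 0.2], [4, 0.0]])
        print(tortuosity(pts))
        print(branching_angle(np.array([0, 0]), np.array([1, 1]), np.array([1, -1])))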

  18. On the Computational Capabilities of Physical Systems. Part 1; The Impossibility of Infallible Computation

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.; Koga, Dennis (Technical Monitor)

    2000-01-01

    In this first of two papers, strong limits on the accuracy of physical computation are established. First it is proven that there cannot be a physical computer C to which one can pose any and all computational tasks concerning the physical universe. Next it is proven that no physical computer C can correctly carry out any computational task in the subset of such tasks that can be posed to C. This result holds whether the computational tasks concern a system that is physically isolated from C, or instead concern a system that is coupled to C. As a particular example, this result means that there cannot be a physical computer that can, for any physical system external to that computer, take the specification of that external system's state as input and then correctly predict its future state before that future state actually occurs; one cannot build a physical computer that can be assured of correctly 'processing information faster than the universe does'. The results also mean that there cannot exist an infallible, general-purpose observation apparatus, and that there cannot be an infallible, general-purpose control apparatus. These results do not rely on systems that are infinite, and/or non-classical, and/or obey chaotic dynamics. They also hold even if one uses an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing Machine. This generality is a direct consequence of the fact that a novel definition of computation - a definition of 'physical computation' - is needed to address the issues considered in these papers. While this definition does not fit into the traditional Chomsky hierarchy, the mathematical structure and impossibility results associated with it have parallels in the mathematics of the Chomsky hierarchy. The second in this pair of papers presents a preliminary exploration of some of this mathematical structure, including in particular that of prediction complexity, which is a 'physical computation

  19. Application of ubiquitous computing in personal health monitoring systems.

    PubMed

    Kunze, C; Grossmann, U; Stork, W; Müller-Glaser, K D

    2002-01-01

    One possibility for significantly reducing the costs of public health systems is to make greater use of information technology. The Laboratory for Information Processing Technology (ITIV) at the University of Karlsruhe is developing a personal health monitoring system, which should improve health care and at the same time reduce costs by combining micro-technological smart sensors with personalized, mobile computing systems. In this paper we present how ubiquitous computing theory can be applied in the health-care domain. PMID:12451864

  20. Multiple-User, Multitasking, Virtual-Memory Computer System

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.; Roth, Don J.; Stang, David B.

    1993-01-01

    Computer system designed and programmed to serve multiple users in research laboratory. Provides for computer control and monitoring of laboratory instruments, acquisition and analysis of data from those instruments, and interaction with users via remote terminals. System provides fast access to shared central processing units and associated large (from megabytes to gigabytes) memories. Underlying concept of system also applicable to monitoring and control of industrial processes.

  1. Computer simulator for a mobile telephone system

    NASA Technical Reports Server (NTRS)

    Schilling, D. L.

    1981-01-01

    A software simulator was developed to assist NASA in the design of the land mobile satellite service. Structured programming techniques were used by developing the algorithm in an ALGOL-like pseudo-language and then encoding the algorithm into FORTRAN IV. The basic input data to the system is a sine wave signal, although future plans call for actual sampled voice as the input signal. The simulator is capable of studying all the possible combinations of types and modes of calls through the use of five communication scenarios: single hop system; double hop, single gateway system; double hop, double gateway system; mobile to wireline system; and wireline to mobile system. The transmitter, fading channel, and interference source simulation are also discussed.

  2. 14 CFR 417.123 - Computing systems and software.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Section 417.123 Computing systems and software. (a) A launch operator must document a system safety process that identifies the... (b) A launch operator must identify all safety-critical functions associated...

  3. Performance Models for Split-execution Computing Systems

    SciTech Connect

    Humble, Travis S; McCaskey, Alex; Schrock, Jonathan; Seddiqi, Hadayat; Britt, Keith A; Imam, Neena

    2016-01-01

    Split-execution computing leverages the capabilities of multiple computational models to solve problems, but splitting program execution across different computational models incurs costs associated with the translation between domains. We analyze the performance of a split-execution computing system developed from conventional and quantum processing units (QPUs) by using behavioral models that track resource usage. We focus on asymmetric processing models built using conventional CPUs and a family of special-purpose QPUs that employ quantum computing principles. Our performance models account for the translation of a classical optimization problem into the physical representation required by the quantum processor while also accounting for hardware limitations and conventional processor speed and memory. We conclude that the bottleneck in this split-execution computing system lies at the quantum-classical interface and that the primary time cost is independent of quantum processor behavior.
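
    A toy behavioral model in the spirit of the paper's conclusion makes the bottleneck claim concrete: if translation and interface costs grow with problem size while pure QPU sampling time does not, the quantum-classical interface dominates. All constants below are invented illustrations, not measured values from the paper.

        def split_execution_time(n_vars, qpu_anneal_s=20e-6, reads=1000,
                                 embed_s_per_var=5e-3, io_s_per_var=1e-4):
            translate = embed_s_per_var * n_vars      # map problem onto QPU topology
            interface = io_s_per_var * n_vars * 2     # send problem, read back results
            quantum = qpu_anneal_s * reads            # actual QPU sampling time
            return {"translate": translate, "interface": interface, "quantum": quantum}

        for n in (100, 1000):
            t = split_execution_time(n)
            print(n, t, "bottleneck:", max(t, key=t.get))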

  4. Parallel Computing Environments and Methods for Power Distribution System Simulation

    SciTech Connect

    Lu, Ning; Taylor, Zachary T.; Chassin, David P.; Guttromson, Ross T.; Studham, Scott S.

    2005-11-10

    The development of cost-effective high-performance parallel computing on multi-processor supercomputers makes it attractive to port excessively time-consuming simulation software from personal computers (PCs) to supercomputers. The power distribution system simulator (PDSS) takes a bottom-up approach and simulates load at the appliance level, where detailed thermal models for appliances are used. This approach works well for a small power distribution system consisting of a few thousand appliances. When the number of appliances increases, the simulation uses up the PC memory and its run time increases to a point where the approach is no longer feasible for modeling a practical large power distribution system. This paper presents an effort made to port a PC-based power distribution system simulator (PDSS) to a 128-processor shared-memory supercomputer. The paper offers an overview of the parallel computing environment and a description of the modifications made to the PDSS model. The performance of the PDSS running on a standalone PC and on the supercomputer is compared. Future research directions for utilizing parallel computing in power distribution system simulation are also addressed.
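
    The bottom-up, appliance-level pattern parallelizes naturally because each appliance model is independent within a timestep. The sketch below distributes a placeholder per-appliance model across a process pool; it illustrates the porting strategy only, and the one-line "model" is not PDSS's detailed thermal model.

        from multiprocessing import Pool
        import random

        def appliance_load_kw(seed):
            """Placeholder per-appliance thermal/load model."""
            rng = random.Random(seed)
            return rng.uniform(0.1, 3.0)   # kW drawn by this appliance this timestep

        def feeder_load_kw(appliance_seeds, processes=4):
            # Each worker process evaluates a chunk of appliances independently.
            with Pool(processes) as pool:
                return sum(pool.map(appliance_load_kw, appliance_seeds, chunksize=1000))

        if __name__ == "__main__":
            print(f"{feeder_load_kw(range(100_000)):.0f} kW")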

  5. Artificial intelligence, expert systems, computer vision, and natural language processing

    NASA Technical Reports Server (NTRS)

    Gevarter, W. B.

    1984-01-01

    An overview of artificial intelligence (AI), its core ingredients, and its applications is presented. The knowledge representation, logic, problem solving approaches, languages, and computers pertaining to AI are examined, and the state of the art in AI is reviewed. The use of AI in expert systems, computer vision, natural language processing, speech recognition and understanding, speech synthesis, problem solving, and planning is examined. Basic AI topics, including automation, search-oriented problem solving, knowledge representation, and computational logic, are discussed.

  6. Automated drafting system uses computer techniques

    NASA Technical Reports Server (NTRS)

    Millenson, D. H.

    1966-01-01

    Automated drafting system produces schematic and block diagrams from the design engineer's freehand sketches. This system codes conventional drafting symbols and their coordinate locations on standard size drawings for entry on tapes that are used to drive a high speed photocomposition machine.

  7. Computing Operating Characteristics Of Bearing/Shaft Systems

    NASA Technical Reports Server (NTRS)

    Moore, James D.

    1996-01-01

    SHABERTH computer program predicts operating characteristics of bearings in multibearing load-support system. Lubricated and nonlubricated bearings modeled. Calculates loads, torques, temperatures, and fatigue lives of ball and/or roller bearings on single shaft. Provides for analysis of reaction of system to termination of supply of lubricant to bearings and other lubricated mechanical elements. Valuable in design and analysis of shaft/bearing systems. Two versions of SHABERTH available. Cray version (LEW-14860), "Computing Thermal Performances Of Shafts and Bearings". IBM PC version (MFS-28818), written for IBM PC-series and compatible computers running MS-DOS.
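
    As a flavor of the fatigue-life calculations such a code performs, the sketch below evaluates the standard Lundberg-Palmgren basic rating life (L10) relation. This textbook formula is only a stand-in; SHABERTH's actual life model accounts for far more (lubrication, temperatures, internal load distribution).

        def l10_life_hours(dynamic_capacity_n, equivalent_load_n, rpm, ball_bearing=True):
            """Basic rating life: L10 = (C/P)^p million revolutions, in hours."""
            p = 3.0 if ball_bearing else 10.0 / 3.0   # load-life exponent
            l10_mrev = (dynamic_capacity_n / equivalent_load_n) ** p
            return l10_mrev * 1e6 / (rpm * 60.0)

        # Example: C = 30 kN ball bearing under a 5 kN equivalent load at 3000 rpm.
        print(f"{l10_life_hours(30e3, 5e3, 3000):.0f} hours")  # -> 1200 hours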

  8. Computational simulation for concurrent engineering of aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1993-01-01

    Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties (all fundamental to developing such methods), is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  9. Computational simulation of concurrent engineering for aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1992-01-01

    Results are summarized from an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties (all fundamental to developing such methods), is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and facets needing early attention in the development are outlined.

  10. Bringing the CMS distributed computing system into scalable operations

    NASA Astrophysics Data System (ADS)

    Belforte, S.; Fanfani, A.; Fisk, I.; Flix, J.; Hernández, J. M.; Kress, T.; Letts, J.; Magini, N.; Miccio, V.; Sciabà, A.

    2010-04-01

    Establishing efficient and scalable operations of the CMS distributed computing system critically relies on the proper integration, commissioning, and scale testing of the data and workload management tools, the various computing workflows, and the underlying computing infrastructure, located at more than 50 computing centres worldwide and interconnected by the Worldwide LHC Computing Grid. Computing challenges undertaken by CMS in past years with increasing scale and complexity have revealed the need for a sustained effort on computing integration and commissioning activities. The Processing and Data Access (PADA) Task Force was established at the beginning of 2008 within the CMS Computing Program with the mandate of validating the infrastructure for organized processing and user analysis, including the sites and the workload and data management tools; validating the distributed production system by performing functionality, reliability, and scale tests; helping sites to commission, configure, and optimize their networking and storage through scale-testing data transfers and data processing; and improving the efficiency of accessing data across the CMS computing system, from global transfers to local access. This contribution reports on the tools and procedures developed by CMS for computing commissioning and scale testing, as well as the improvements accomplished towards efficient, reliable, and scalable computing operations. The activities include the development and operation of load generators for job submission and data transfers, which stress the experiment and Grid data management and workload management systems; site commissioning procedures and tools to monitor and improve site availability and reliability; and activities targeted at the commissioning of the distributed production, user analysis, and monitoring systems.
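
    As a purely illustrative sketch of the load generators mentioned above, stress-testing a workload management system can be reduced to a loop that submits dummy jobs at a controlled rate and tallies outcomes per site. None of the names below correspond to actual CMS tools; submit_job() is a stand-in for a real Grid submission call.

```python
# Hypothetical sketch of a job-submission load generator of the kind
# described above: submit dummy jobs at a fixed rate and tally results.
# submit_job() stands in for whatever Grid submission command a real
# system would call; it is not part of any actual CMS tool.
import time
import random

def submit_job(site):
    """Placeholder for a real Grid submission; randomly fail 5% of jobs."""
    return random.random() > 0.05

def run_load_test(sites, jobs_per_minute, duration_minutes):
    interval = 60.0 / jobs_per_minute
    stats = {site: {"ok": 0, "failed": 0} for site in sites}
    deadline = time.time() + duration_minutes * 60
    while time.time() < deadline:
        site = random.choice(sites)
        stats[site]["ok" if submit_job(site) else "failed"] += 1
        time.sleep(interval)  # pace submissions to the requested rate
    return stats

print(run_load_test(["T2_example_A", "T2_example_B"],
                    jobs_per_minute=600, duration_minutes=0.05))
```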

  11. Terahertz Computed Tomography of NASA Thermal Protection System Materials

    NASA Technical Reports Server (NTRS)

    Roth, D. J.; Reyes-Rodriguez, S.; Zimdars, D. A.; Rauser, R. W.; Ussery, W. W.

    2011-01-01

    A terahertz axial computed tomography system has been developed that uses time-domain measurements to form cross-sectional image slices and three-dimensional volume renderings of terahertz-transparent materials. The system can inspect samples as large as 0.0283 cubic meters (1 cubic foot) without the safety concerns associated with x-ray computed tomography. In this study, the system is evaluated for its ability to detect and characterize flat-bottom holes, drilled holes, and embedded voids in foam materials utilized as thermal protection on the external fuel tanks for the Space Shuttle. X-ray micro-computed tomography was also performed on the samples to compare against the terahertz computed tomography results and to better define embedded voids. Limits of detectability based on depth and size for the samples used in this study are loosely defined. Image sharpness and morphology characterization ability for terahertz computed tomography are qualitatively described.
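
    The reconstruction step common to axial computed tomography, whether terahertz or x-ray, can be sketched as back-projection of the measured projections. The sketch below is the simplest unfiltered variant (a practical system filters the projections first) and assumes the time-domain data have already been converted into a sinogram.

```python
# Minimal sketch of unfiltered back-projection, the simplest form of the
# axial computed-tomography reconstruction described above. A real system
# would apply filtered back-projection or an iterative method instead.
import numpy as np
from scipy.ndimage import rotate

def back_project(sinogram, angles_deg):
    """Reconstruct an n x n slice from an (n_angles, n) sinogram."""
    n = sinogram.shape[1]
    image = np.zeros((n, n))
    for projection, angle in zip(sinogram, angles_deg):
        # Smear each 1-D projection across the image plane...
        smear = np.tile(projection, (n, 1))
        # ...then rotate the smear back to its acquisition angle.
        image += rotate(smear, angle, reshape=False, order=1)
    return image / len(angles_deg)

# Toy example: a centered point target gives a constant-position trace.
angles = np.linspace(0.0, 180.0, 60, endpoint=False)
sinogram = np.zeros((60, 64))
sinogram[:, 32] = 1.0
slice_image = back_project(sinogram, angles)
print(slice_image.shape)  # (64, 64)
```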

  12. Evolutionary Computational Methods for Identifying Emergent Behavior in Autonomous Systems

    NASA Technical Reports Server (NTRS)

    Terrile, Richard J.; Guillaume, Alexandre

    2011-01-01

    A technique based on Evolutionary Computational Methods (ECMs) was developed that allows for the automated optimization of complex computationally modeled systems, such as autonomous systems. The primary technology, which enables the ECM to find optimal solutions in complex search spaces, derives from evolutionary algorithms such as the genetic algorithm and differential evolution. These methods are based on biological processes, particularly genetics, and define an iterative process that evolves parameter sets into an optimum. Evolutionary computation is a method that operates on a population of existing computational-based engineering models (or simulators) and competes them using biologically inspired genetic operators on large parallel cluster computers. The result is the ability to automatically find design optimizations and trades, and thereby greatly amplify the role of the system engineer.
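
    Differential evolution, one of the two algorithms named above, is compact enough to sketch directly: a population of parameter vectors is mutated with scaled vector differences, recombined by crossover, and the better of each parent/trial pair survives. The quadratic objective below is a stand-in for a computational engineering model, and the sketch omits refinements (e.g. excluding the parent from the difference vectors) that production implementations include.

```python
# Minimal sketch of differential evolution: mutation by scaled vector
# differences, crossover, and greedy parent/trial selection.
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    return np.sum(x ** 2)  # stand-in for an engineering simulator score

def differential_evolution(dim=5, pop_size=20, generations=200, f=0.8, cr=0.9):
    pop = rng.uniform(-5.0, 5.0, size=(pop_size, dim))
    scores = np.array([objective(ind) for ind in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: combine three random population members.
            a, b, c = pop[rng.choice(pop_size, size=3, replace=False)]
            mutant = a + f * (b - c)
            # Crossover: mix mutant and parent dimension by dimension.
            mask = rng.random(dim) < cr
            trial = np.where(mask, mutant, pop[i])
            # Selection: keep whichever of parent/trial scores better.
            trial_score = objective(trial)
            if trial_score < scores[i]:
                pop[i], scores[i] = trial, trial_score
    return pop[np.argmin(scores)], scores.min()

best, best_score = differential_evolution()
print(best_score)  # approaches 0 as the population converges
```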

  13. Research in computer vision for autonomous systems

    NASA Astrophysics Data System (ADS)

    Kak, Avi; Yoder, Mark; Andress, Keith; Blask, Steve; Underwood, Tom

    1988-09-01

    This report addresses FLIR processing, LADAR processing, and electronic terrain board modeling. The discussion of FLIR processing analyzes issues including the classifiability of FLIR features, computationally efficient algorithms for target segmentation, and metrics. The discussion of LADAR includes a comparison of a number of different approaches to the segmentation of target surfaces from range images, extraction of silhouettes at different ranges, and reasoning strategies for the recognition of targets and estimation of their aspects. Regarding electronic terrain board modeling, it was shown how the readily available wire-frame data for strategic targets can be converted into volumetric models using the concepts of constructive solid geometry; it was then shown how the resulting volumetric models can be used to generate synthetic range images that are very similar to real LADAR images. Also shown is how sensor noise can be added to these synthetic images to make them even more realistic.
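
    The final step described, adding sensor noise to synthetic range images, can be sketched in a few lines. The noise model below (additive Gaussian range error plus occasional dropouts) is an illustrative assumption, not the report's actual sensor model.

```python
# Illustrative sketch of noising a synthetic range image so it better
# resembles real LADAR data. The Gaussian-plus-dropout model and all
# parameter values are assumptions for demonstration only.
import numpy as np

rng = np.random.default_rng(42)

def add_ladar_noise(range_image, sigma_m=0.15, dropout_prob=0.01, max_range_m=100.0):
    noisy = range_image + rng.normal(0.0, sigma_m, range_image.shape)
    dropouts = rng.random(range_image.shape) < dropout_prob
    noisy[dropouts] = max_range_m  # dropped returns read as max range
    return np.clip(noisy, 0.0, max_range_m)

synthetic = np.full((128, 128), 40.0)   # flat background at 40 m
synthetic[40:90, 50:80] = 35.0          # a box-shaped target 5 m closer
print(add_ladar_noise(synthetic).mean())
```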

  14. Fault tolerant computing: A preamble for assuring viability of large computer systems

    NASA Technical Reports Server (NTRS)

    Lim, R. S.

    1977-01-01

    The need for fault-tolerant computing is addressed from the viewpoints of (1) why it is needed, (2) how to apply it in the current state of technology, and (3) what it means in the context of the Phoenix computer system and other related systems. To this end, the value of concurrent error detection and correction is described. User protection, program retry, and repair are among the factors considered. The technology of algebraic codes to protect memory systems and arithmetic codes to protect arithmetic operations is discussed.
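
    Arithmetic codes of the kind mentioned can be illustrated with the classic AN code: operands are scaled by a check constant A before computation, and any result no longer divisible by A signals an error. A minimal sketch with A = 3, which detects every single-bit error because no power of two is divisible by three:

```python
# Minimal sketch of an AN arithmetic code: operands are scaled by a
# check constant A, addition is performed on the coded values, and a
# result that is not divisible by A reveals a fault.
A = 3  # detects all single-bit errors: 2**k is never a multiple of 3

def encode(x):
    return A * x

def decode(coded):
    if coded % A != 0:
        raise ValueError("arithmetic error detected")
    return coded // A

# Addition works directly on coded operands: A*x + A*y = A*(x + y).
s = encode(17) + encode(25)
print(decode(s))              # 42

# A single-bit fault in the adder output breaks divisibility by A:
faulty = s ^ (1 << 4)         # flip bit 4 of the result
try:
    decode(faulty)
except ValueError as e:
    print(e)                  # arithmetic error detected
```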

  15. Scientific computation systems quality branch manual

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A manual is presented which is designed to familiarize the GE 635 user with the configuration and operation of the overall system. Work submission, programming standards, restrictions, testing and debugging, and related general information are provided for the GE 635 programmer.

  16. Computer Simulation Models of Economic Systems in Higher Education.

    ERIC Educational Resources Information Center

    Smith, Lester Sanford

    The increasing complexity of educational operations makes analytical tools, such as computer simulation models, especially desirable for educational administrators. This MA thesis examined the feasibility of developing computer simulation models of economic systems in higher education to assist decision makers in allocating resources. The report…

  17. A Proposed Programming System for Knuth's Mix Computer.

    ERIC Educational Resources Information Center

    Akers, Max Neil

    A programming system using a hypothetical computer is proposed for use in teaching machine and assembly language programming courses. Major components such as the monitor, assembler, interpreter, grader, and diagnostics are described. The interpreter is programmed and documented for use on an IBM 360/67 computer. The interpreter can be used for teaching…
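
    The heart of such an instructional interpreter is a fetch-decode-execute loop. The toy machine below is a hypothetical illustration in the same spirit, not Knuth's actual MIX instruction set:

```python
# Illustrative fetch-decode-execute loop for a toy teaching machine.
# Opcode names echo MIX conventions, but the instruction set here is
# invented for demonstration.
def run(program, memory_size=100):
    memory = [0] * memory_size
    acc = 0                        # accumulator register
    pc = 0                         # program counter
    while pc < len(program):
        op, arg = program[pc]      # fetch and decode
        pc += 1
        if op == "LDA":    acc = memory[arg]   # load accumulator
        elif op == "STA":  memory[arg] = acc   # store accumulator
        elif op == "ADD":  acc += memory[arg]
        elif op == "ENTA": acc = arg           # load immediate
        elif op == "JMP":  pc = arg            # unconditional jump
        elif op == "HLT":  break
        else: raise ValueError(f"unknown opcode {op}")
    return acc, memory

# 2 + 3 computed through memory cells 0 and 1:
program = [("ENTA", 2), ("STA", 0), ("ENTA", 3), ("STA", 1),
           ("LDA", 0), ("ADD", 1), ("HLT", 0)]
acc, _ = run(program)
print(acc)  # 5
```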

  18. System optimization of gasdynamic lasers, computer program user's manual

    NASA Technical Reports Server (NTRS)

    Otten, L. J., III; Saunders, R. C., III; Morris, S. J.

    1978-01-01

    The user's manual for a computer program that performs system optimization of gasdynamic lasers is provided. Detailed input/output formats are given for CDC 7600/6600 computers using a dialect of FORTRAN. Sample input/output data are provided to verify correct program operation, along with a program listing.

  19. Two Year Computer System Technology Curricula for the '80's.

    ERIC Educational Resources Information Center

    Palko, Donald N.; Hata, David M.

    1982-01-01

    The computer industry is viewed as being on a collision course with a human-resources crisis. Changes expected during the next decade are outlined, with the expectation that hardware and software skills will merge in the technician's skill set. Essential curricular components of a computer system technology program are detailed. (MP)

  20. Computer-Aided Personalized System of Instruction: A Program Evaluation.

    ERIC Educational Resources Information Center

    Pear, Joseph J.; Novak, Mark

    1996-01-01

    Presents an evaluation of a computer-aided personalized system of instruction program in two undergraduate psychology courses. The computer presented short essay tests and arranged for students who had completed various assignments satisfactorily to help evaluate other students' mastery of those assignments. Student response generally was…