Benefits of Automatic Duplexing Fact Sheet
This resource provides a how-to guide for duplex printing, explains the benefits and cost savings of automatic duplexing, and lists several off-the-shelf print management software options for your facility.
Speeding response, saving lives: automatic vehicle location capabilities for emergency services.
DOT National Transportation Integrated Search
1999-01-01
Information from automatic vehicle location systems, when combined with computer-aided dispatch software, can provide a rich source of data for analyzing emergency vehicle operations and evaluating agency performance.
[Development of a Software for Automatically Generated Contours in Eclipse TPS].
Xie, Zhao; Hu, Jinyou; Zou, Lian; Zhang, Weisha; Zou, Yuxin; Luo, Kelin; Liu, Xiangxiang; Yu, Luxin
2015-03-01
The automatic generation of planning target and auxiliary contours has been achieved in Eclipse TPS 11.0. The scripting language AutoHotkey was used to develop software for automatically generating contours in Eclipse TPS. This software, named Contour Auto Margin (CAM), is composed of contour operation functions, script generation visualization and script file operations. Ten cases of different cancers were separately selected; in Eclipse TPS 11.0, scripts generated by the software could not only automatically generate contours but also perform contour post-processing. For the different cancers, there was no difference between automatically generated contours and manually created contours. CAM is a user-friendly and powerful piece of software that can rapidly generate contours automatically in Eclipse TPS 11.0. With the help of CAM, plan preparation time is greatly reduced and the working efficiency of radiation therapy physicists is improved.
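As a rough illustration of the margin operation such a script automates, a structure mask can be grown isotropically with a Euclidean distance transform. This is a generic Python sketch, not the CAM/AutoHotkey code; the array sizes, margin and voxel spacing are made up.

```python
# Illustrative margin expansion (not the CAM source). All values are made up.
import numpy as np
from scipy import ndimage

def expand_contour(mask: np.ndarray, margin_mm: float, spacing_mm: tuple) -> np.ndarray:
    """Grow a binary structure mask by an isotropic margin in millimetres."""
    # Distance (in mm) from every outside voxel to the structure surface.
    dist = ndimage.distance_transform_edt(~mask, sampling=spacing_mm)
    return mask | (dist <= margin_mm)

# Example: a 5 mm margin around a small target on a 2 mm voxel grid.
target = np.zeros((40, 40, 40), dtype=bool)
target[18:22, 18:22, 18:22] = True
ptv = expand_contour(target, margin_mm=5.0, spacing_mm=(2.0, 2.0, 2.0))
print(target.sum(), "->", ptv.sum(), "voxels")
```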
Application of software technology to automatic test data analysis
NASA Technical Reports Server (NTRS)
Stagner, J. R.
1991-01-01
The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.
Scholtz, Jan-Erik; Wichmann, Julian L; Kaup, Moritz; Fischer, Sebastian; Kerl, J Matthias; Lehnert, Thomas; Vogl, Thomas J; Bauer, Ralf W
2015-03-01
To evaluate software for automatic segmentation, labeling and reformation of anatomically aligned axial images of the thoracolumbar spine on CT in terms of accuracy, potential for time savings and workflow improvement. 77 patients (28 women, 49 men, mean age 65.3±14.4 years) with known or suspected spinal disorders (degenerative spine disease n=32; disc herniation n=36; traumatic vertebral fractures n=9) underwent 64-slice MDCT with thin-slab reconstruction. Time for automatic labeling of the thoracolumbar spine and reconstruction of double-angulated axial images of the pathological vertebrae was compared with manually performed reconstruction of anatomically aligned axial images. Reformatted images from both reconstruction methods were assessed by two observers regarding the accuracy of symmetric depiction of anatomical structures. In 33 cases double-angulated axial images were created for 1 vertebra, in 28 cases for 2 vertebrae and in 16 cases for 3 vertebrae. Correct automatic labeling was achieved in 72 of 77 patients (93.5%). Errors could be manually corrected in 4 cases. Automatic labeling required 1 min on average. In cases where anatomically aligned axial images of 1 vertebra were created, reconstructions made by hand were significantly faster (p<0.05). Automatic reconstruction was time-saving in cases of 2 or more vertebrae (p<0.05). Both reconstruction methods revealed good image quality with excellent inter-observer agreement. The evaluated software for automatic labeling and anatomically aligned, double-angulated axial image reconstruction of the thoracolumbar spine on CT is time-saving when reconstructions of 2 or more vertebrae are performed. Checking the results of automatic labeling is necessary to prevent labeling errors.
[Development of a Compared Software for Automatically Generated DVH in Eclipse TPS].
Xie, Zhao; Luo, Kelin; Zou, Lian; Hu, Jinyou
2016-03-01
This study aimed to automatically calculate the dose volume histogram (DVH) for a treatment plan and compare it with the requirements of the doctor's prescription. The scripting language AutoHotkey and the programming language C# were used to develop comparison software for automatically generated DVHs in Eclipse TPS. This software, named Show Dose Volume Histogram (ShowDVH), is composed of prescription document generation, DVH operation functions, software visualization and DVH comparison report generation. Ten cases of different cancers were separately selected; in Eclipse TPS 11.0, ShowDVH could not only automatically generate DVH reports but also accurately determine whether treatment plans met the requirements of the doctor's prescription, and the reports then gave direction for setting the optimization parameters of intensity-modulated radiation therapy. ShowDVH is a user-friendly and powerful piece of software that can rapidly generate DVH comparison reports in Eclipse TPS 11.0. With the help of ShowDVH, plan design time is greatly reduced and the working efficiency of radiation therapy physicists is improved.
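The core computation such a tool performs can be sketched in a few lines. The following is an illustrative Python reconstruction, not the ShowDVH code; the constraint values and synthetic dose array are invented for the example.

```python
# Sketch of a cumulative DVH plus prescription-style checks such as
# "V20Gy <= 30%" or "D95% >= 50 Gy". Not the published tool.
import numpy as np

def cumulative_dvh(dose_in_structure: np.ndarray, bin_gy: float = 0.1):
    edges = np.arange(0.0, dose_in_structure.max() + bin_gy, bin_gy)
    # Fraction of the structure volume receiving at least each dose level.
    volume_pct = np.array([(dose_in_structure >= d).mean() * 100.0 for d in edges])
    return edges, volume_pct

def v_at_dose(edges, volume_pct, dose_gy):          # VxGy, in percent
    return float(np.interp(dose_gy, edges, volume_pct))

def d_at_volume(edges, volume_pct, volume_percent): # Dx%, in Gy
    # volume_pct decreases monotonically, so flip both arrays for interp.
    return float(np.interp(volume_percent, volume_pct[::-1], edges[::-1]))

rng = np.random.default_rng(0)
lung_dose = rng.gamma(shape=2.0, scale=6.0, size=100_000)  # synthetic doses, Gy
edges, vol = cumulative_dvh(lung_dose)
print("V20Gy = %.1f%% (constraint <= 30%%)" % v_at_dose(edges, vol, 20.0))
print("D95%%  = %.1f Gy" % d_at_volume(edges, vol, 95.0))
```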
Gambacorta, Maria A; Boldrini, Luca; Valentini, Chiara; Dinapoli, Nicola; Mattiucci, Gian C; Chiloiro, Giuditta; Pasini, Danilo; Manfrida, Stefania; Caria, Nicola; Minsky, Bruce D; Valentini, Vincenzo
2016-07-05
To validate autocontouring software (AS) in clinical practice, including a two-step delineation quality assurance (QA) procedure. The existing delineation agreement among experts for rectal cancer and the overlap and time criteria that have to be verified to allow the use of AS were defined. Median Dice Similarity Coefficient (MDSC), Mean Slicewise Hausdorff Distance (MSHD) and Total Time saving (TT) were analyzed. Two expert Radiation Oncologists reviewed CT scans of 44 patients and agreed on the reference CTV: the first 14 consecutive cases were used to populate the software atlas and 30 were used as tests. Each expert performed a manual (group A) and an automatic delineation (group B) of 15 test patients. The delineations were compared with the reference contours. The overlap between the manual and automatic delineations, measured with MDSC and MSHD, and the TT were analyzed. Three acceptance criteria were set: MDSC ≥ 0.75, MSHD ≤ 1 mm and TT saving ≥ 50%. At least 2 criteria had to be met, one of which had to be TT saving, to validate the system. The MDSC was 0.75, MSHD 2.00 mm and the TT saving 55.5% between group A and group B. MDSC among experts was 0.84. Autosegmentation systems in rectal cancer partially met the acceptability criteria with the present version.
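The metrics and the two-of-three acceptance rule from this study are simple to express in code. Below is a hedged Python sketch (not the authors' implementation); the Dice computation is standard, and the final call uses the values reported above.

```python
# Sketch of the study's metrics and acceptance rule, not the authors' code.
import numpy as np

def dice(a: np.ndarray, b: np.ndarray) -> float:
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

def system_validated(mdsc: float, mshd_mm: float, tt_saving_pct: float) -> bool:
    criteria = [mdsc >= 0.75, mshd_mm <= 1.0, tt_saving_pct >= 50.0]
    # At least two criteria must hold, one of which must be the time saving.
    return sum(criteria) >= 2 and tt_saving_pct >= 50.0

# Quick check of dice() on overlapping 1D masks:
print(dice(np.array([1, 1, 1, 0]), np.array([0, 1, 1, 1])))  # 0.667
# The values reported above: two of three criteria met, driven by time saving.
print(system_validated(mdsc=0.75, mshd_mm=2.00, tt_saving_pct=55.5))  # True
```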
La Macchia, Mariangela; Fellin, Francesco; Amichetti, Maurizio; Cianchetti, Marco; Gianolini, Stefano; Paola, Vitali; Lomax, Antony J; Widesott, Lamberto
2012-09-18
To validate, in the context of adaptive radiotherapy, three commercial software solutions for atlas-based segmentation. Fifteen patients, five for each group, with cancer of the head and neck, pleura, and prostate were enrolled in the study. In addition to the treatment planning CT (pCT) images, one replanning CT (rCT) image set was acquired for each patient during the RT course. Three experienced physicians outlined all the volumes of interest (VOIs) on the pCT and rCT. We used three software solutions (VelocityAI 2.6.2 (V), MIM 5.1.1 (M) by MIMVista and ABAS 2.0 (A) by CMS-Elekta) to generate the automatic contouring on the repeated CT. All the VOIs obtained with automatic contouring (AC) were subsequently corrected manually. We recorded the time needed for: 1) ex novo ROI definition on the rCT; 2) generation of AC by the three software solutions; 3) manual correction of the AC. To compare the quality of the volumes obtained automatically by the software and manually corrected with those drawn from scratch on the rCT, we used the following indexes: overlap coefficient (DICE), sensitivity, inclusiveness index, difference in volume, and displacement differences on three axes (x, y, z) from the isocenter. The time saved by the three software solutions for all the sites, compared to manual contouring from scratch, is statistically significant and similar for all three software solutions. The time saved for each site is as follows: about an hour for head and neck, about 40 minutes for prostate, and about 20 minutes for mesothelioma. The best DICE similarity coefficient index after manual correction was obtained with: A (contours for prostate), A and M (contours for H&N), and M (contours for mesothelioma). From a clinical point of view, the automated contouring workflow was shown to be significantly shorter than the manual contouring process, even though manual correction of the VOIs is always needed.
Egger, Jan; Kappus, Christoph; Freisleben, Bernd; Nimsky, Christopher
2012-08-01
In this contribution, a medical software system for volumetric analysis of different cerebral pathologies in magnetic resonance imaging (MRI) data is presented. The software system is based on a semi-automatic segmentation algorithm and helps to overcome the time-consuming process of volume determination during monitoring of a patient. After imaging, the parameter settings (including a seed point) are set up in the system and an automatic segmentation is performed by a novel graph-based approach. Manual review of the result leads to reseeding, adding seed points, or automatic surface mesh generation. The mesh is saved for monitoring the patient and for comparisons with follow-up scans. Based on the mesh, the system performs voxelization and volume calculation, which leads to a diagnosis and therefore to further treatment decisions. The overall system has been tested with different cerebral pathologies (glioblastoma multiforme, pituitary adenomas and cerebral aneurysms) and evaluated against manual expert segmentations using the Dice Similarity Coefficient (DSC). Additionally, intra-physician segmentations have been performed to provide a quality measure for the presented system.
NASA Astrophysics Data System (ADS)
Ye, Jinzuo; Chi, Chongwei; Zhang, Shuang; Ma, Xibo; Tian, Jie
2014-02-01
Sentinel lymph node (SLN) in vivo detection is vital in breast cancer surgery. This paper presents new imaging software for a near-infrared fluorescence-based surgical navigation system (SNS), developed by our research group for SLN detection surgery. The software is based on the fluorescence-based surgical navigation hardware system (SNHS) developed in our lab, and is designed specifically for intraoperative imaging and postoperative data analysis. The surgical navigation imaging software consists of the following modules: the control module, the image grabbing module, the real-time display module, the data saving module and the image processing module. Several algorithms were designed to achieve the required performance, for example, an image registration algorithm based on correlation matching. Some of the key features of the software include: setting the control parameters of the SNS; automatically acquiring, displaying and storing the intraoperative imaging data in real time; and analyzing and processing the saved image data. The software has been used to successfully detect the SLNs in 21 breast cancer patients. In the near future, we plan to improve the software's performance so that it can be used extensively for clinical purposes.
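Correlation matching for registration can be illustrated with a brute-force search over integer shifts. This is a simplified stand-in for the software's registration module, with assumed image sizes and shift ranges, not the SNS code.

```python
# Toy correlation-matching registration: find the integer shift maximizing
# normalized cross-correlation between a frame and a reference image.
import numpy as np

def ncc(a, b):
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float((a * b).mean())

def best_shift(reference, frame, max_shift=10):
    best = (0, 0, -2.0)                      # (dy, dx, score)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(frame, dy, axis=0), dx, axis=1)
            score = ncc(reference, shifted)
            if score > best[2]:
                best = (dy, dx, score)
    return best

rng = np.random.default_rng(1)
ref = rng.random((64, 64))
moved = np.roll(ref, (3, -2), axis=(0, 1)) + 0.01 * rng.random((64, 64))
print(best_shift(ref, moved))  # expect roughly (-3, 2, ~1.0)
```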
Automatic Energy Schemes for High Performance Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sundriyal, Vaibhav
Although high-performance computing traditionally focuses on the efficient execution of large-scale applications, both energy and power have become critical concerns when approaching exascale. Drastic increases in the power consumption of supercomputers significantly affect their operating costs and failure rates. In modern microprocessor architectures, equipped with dynamic voltage and frequency scaling (DVFS) and CPU clock modulation (throttling), the power consumption may be controlled in software. Additionally, the network interconnect, such as InfiniBand, may be exploited to maximize energy savings, while the application performance loss and frequency switching overheads must be carefully balanced. This work first studies two important collective communication operations, all-to-all and allgather, and proposes energy saving strategies on a per-call basis. Next, it targets point-to-point communications, grouping them into phases and applying frequency scaling to save energy by exploiting architectural and communication stalls. Finally, it proposes an automatic runtime system which combines both collective and point-to-point communications into phases, and applies throttling in addition to DVFS to maximize energy savings. Experimental results are presented for NAS parallel benchmark problems as well as for realistic parallel electronic structure calculations performed by the widely used quantum chemistry package GAMESS. Close to the maximum energy savings were obtained with a substantially low performance loss on the given platform.
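The trade-off such a runtime balances can be captured by a back-of-the-envelope model. The sketch below is an illustration with invented power and timing numbers, not code or data from the thesis.

```python
# Rough model: energy saved by lowering CPU frequency during communication
# phases, where stalls make the performance loss small. All numbers invented.
def energy_savings(t_comm, t_compute, p_high, p_low, switch_overhead_s, slowdown):
    # Baseline: everything runs at the high frequency/power.
    baseline = (t_comm + t_compute) * p_high
    # Scaled: communication runs at low power, slightly stretched, plus a
    # fixed cost representing the frequency-switching overhead.
    scaled = t_comm * slowdown * p_low + t_compute * p_high + switch_overhead_s * p_high
    return baseline, scaled, 100.0 * (baseline - scaled) / baseline

base, scaled, pct = energy_savings(
    t_comm=30.0, t_compute=70.0,      # seconds spent in MPI vs. compute
    p_high=120.0, p_low=80.0,         # watts at nominal vs. reduced frequency
    switch_overhead_s=0.5, slowdown=1.05)
print(f"{base:.0f} J -> {scaled:.0f} J ({pct:.1f}% saved)")
```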
NASA Astrophysics Data System (ADS)
Sun, Wenhao; Cai, Xudong; Meng, Qiao
2016-04-01
Complex automatic protection functions are being added to the onboard software of the Alpha Magnetic Spectrometer. A hardware-in-the-loop simulation method has been introduced to overcome the difficulties of ground testing posed by hardware and environmental limitations. We devised a time-saving approach that reuses flight data as the data source of the simulation system instead of mathematical models. This approach is easy to implement and works efficiently. This paper presents the system framework, implementation details and some application examples.
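The replay idea can be sketched compactly: archived flight records, rather than a plant model, drive the logic under test one record per control cycle. Everything below (record format, field names, trip threshold) is invented for illustration.

```python
# Conceptual replay of recorded flight data into protection logic under test.
import csv, io

RECORDED = """t,voltage,temperature
0.0,28.1,35.2
1.0,27.9,41.8
2.0,27.8,61.5
"""

def protection_logic(sample):           # stand-in for the code under test
    return "TRIP" if float(sample["temperature"]) > 60.0 else "OK"

def replay(stream):
    for row in csv.DictReader(stream):  # one record per control cycle
        yield row["t"], protection_logic(row)

for t, verdict in replay(io.StringIO(RECORDED)):
    print(t, verdict)
```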
Automated detection of diabetic retinopathy in retinal images.
Valverde, Carmen; Garcia, Maria; Hornero, Roberto; Lopez-Galvez, Maria I
2016-01-01
Diabetic retinopathy (DR) is a disease with an increasing prevalence and the main cause of blindness among the working-age population. The risk of severe vision loss can be significantly reduced by timely diagnosis and treatment. Systematic screening for DR has been identified as a cost-effective way to save health service resources. Automatic retinal image analysis is emerging as an important screening tool for early DR detection, which can reduce the workload associated with manual grading as well as save diagnosis costs and time. Many research efforts in recent years have been devoted to developing automatic tools to help in the detection and evaluation of DR lesions. However, there is a large variability in the databases and evaluation criteria used in the literature, which hampers a direct comparison of the different studies. This work is aimed at summarizing the results of the available algorithms for the detection and classification of DR pathology. A detailed literature search was conducted using PubMed. Selected relevant studies from the last 10 years were scrutinized and included in the review. Furthermore, we give an overview of the available commercial software for automatic retinal image analysis.
NASA Technical Reports Server (NTRS)
Dunne, Matthew J.
2011-01-01
The development of computer software as a tool to generate visual displays has led to an overall expansion of automated computer-generated images in the aerospace industry. These visual overlays are generated by combining raw data with pre-existing data on the object or objects being analyzed on the screen. The National Aeronautics and Space Administration (NASA) uses this computer software to generate on-screen overlays when a Visiting Vehicle (VV) is berthing with the International Space Station (ISS). In order for Mission Control Center personnel to be a contributing factor in the VV berthing process, computer software similar to that on the ISS must be readily available on the ground to be used for analysis. In addition, this software must perform engineering calculations and save data for further analysis.
Architectural Analysis of Complex Evolving Systems of Systems
NASA Technical Reports Server (NTRS)
Lindvall, Mikael; Stratton, William C.; Sibol, Deane E.; Ray, Arnab; Ackemann, Chris; Yonkwa, Lyly; Ganesan, Dharma
2009-01-01
The goal of this collaborative project between FC-MD, APL, and GSFC, supported by the NASA IV&V Software Assurance Research Program (SARP), was to develop a tool, Dynamic SAVE, or Dyn-SAVE for short, for analyzing architectures of systems of systems. The project team comprised the principal investigator (PI) from FC-MD and four other FC-MD scientists (part time) and several FC-MD students (full time), as well as two APL software architects (part time) and one NASA POC (part time). The PI and FC-MD scientists, together with the APL architects, were responsible for requirements analysis and for applying and evaluating the Dyn-SAVE tool and method. The PI and a group of FC-MD scientists were responsible for improving the method and conducting outreach activities, while another group of FC-MD scientists was responsible for development and improvement of the tool. Oversight and reporting were conducted by the PI and NASA POC. The project team produced many results, including several prototypes of the Dyn-SAVE tool and method, several case studies documenting how the tool and method were applied to APL's software systems, and several published papers in highly respected conferences and journals. Dyn-SAVE, as developed and enhanced throughout this research period, is a software tool intended for software developers and architects, software integration testers, and persons who need to analyze software systems from the point of view of how they communicate with other systems. Using the tool, the user specifies the planned communication behavior of the system, modeled as a sequence diagram. The user then captures and imports the actual communication behavior of the system, which is converted and visualized as a sequence diagram by Dyn-SAVE. After mapping the planned behavior to the actual and specifying parameter and timing constraints, Dyn-SAVE detects and highlights deviations between the planned and the actual behavior. Requirements were specified based on the need to analyze two inter-system communication protocols that are representative of protocols used in the aerospace industry. The protocols are related to APL's Common Ground System (CGS) as used in the MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) and Radiation Belt Storm Probes (RBSP) missions. The analyzed communications were implementations of the telemetry protocol and the CCSDS File Delivery Protocol (CFDP). Based on these requirements, three prototypes of Dyn-SAVE were developed and applied to these protocols, resulting in the detection of several issues. Dyn-SAVE was also applied to several testbeds that had been used for experimentation earlier in this project, as well as to other protocols and logs, to test its broader applicability. For example, Dyn-SAVE was used to analyze 1) the communication pattern between a web browser and a web server, 2) the system log of a computer in order to detect off-nominal computer shut-down behavior, and 3) the actual test cases of NASA Goddard's Core Flight System (CFS) and automatically generated test cases in order to determine the overlap between the two sets of test cases. In all cases, Dyn-SAVE assisted in providing insightful conclusions about each of the cases identified above.
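The planned-versus-actual comparison at the heart of Dyn-SAVE can be approximated with a sequence alignment plus timing checks. The sketch below is a reconstruction with made-up message names and constraints, not the Dyn-SAVE implementation.

```python
# Toy planned-vs-actual comparison: report missing or unexpected messages
# and violations of an assumed timing constraint.
from difflib import SequenceMatcher

planned = ["CONNECT", "AUTH", "TELEMETRY_REQ", "TELEMETRY_DATA", "CLOSE"]
actual  = [("CONNECT", 0.00), ("AUTH", 0.02), ("TELEMETRY_REQ", 0.05),
           ("RETRY", 0.30), ("TELEMETRY_DATA", 2.10), ("CLOSE", 2.15)]
MAX_GAP_S = 1.0  # assumed constraint between consecutive messages

names = [m for m, _ in actual]
for op, i1, i2, j1, j2 in SequenceMatcher(None, planned, names).get_opcodes():
    if op == "delete":
        print("missing from actual:", planned[i1:i2])
    elif op == "insert":
        print("unexpected messages:", names[j1:j2])

for (m1, t1), (m2, t2) in zip(actual, actual[1:]):
    if t2 - t1 > MAX_GAP_S:
        print(f"timing violation: {m1} -> {m2} took {t2 - t1:.2f}s")
```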
Microcomputer-controlled world time display for public area viewing
NASA Astrophysics Data System (ADS)
Yep, S.; Rashidian, M.
1982-05-01
The design, development, and implementation of a microcomputer-controlled world clock is discussed. The system, designated the International Time Display System (ITDS), integrates a Geochron Calendar Map and a microcomputer-based digital display to automatically compensate for daylight saving time, leap years, and time zone differences. An in-depth technical description of the design and development of the electronic hardware, firmware, and software systems is provided. Reference material on the time zones, fabrication techniques, and electronic subsystems is also provided.
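A modern equivalent of the ITDS timekeeping logic fits in a few lines of Python, since IANA time-zone rules already encode daylight saving time and leap-year handling; the city list below is arbitrary.

```python
# World-time display: zone rules handle DST and leap years automatically.
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

CITIES = {"New York": "America/New_York", "London": "Europe/London",
          "Tokyo": "Asia/Tokyo", "Sydney": "Australia/Sydney"}

now_utc = datetime.now(ZoneInfo("UTC"))
for city, tz in CITIES.items():
    local = now_utc.astimezone(ZoneInfo(tz))
    print(f"{city:10s} {local:%Y-%m-%d %H:%M} ({local.tzname()})")
```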
Wang, Wen-Bin; Li, Jang-Yuan; Wu, Qi-Jun
2007-01-01
A LabVIEW-based self-constructed chemical virtual instrument (VI) has been developed for determining temperatures and pressures. It can be put together easily and quickly by selecting hardware modules, such as a PCI-DAQ card or serial-port method, different kinds of sensors, signal-conditioning circuits or finished chemical instruments, and software modules such as data acquisition, saving, and processing. The VI system provides individual and extremely flexible solutions for automatic measurements in physical chemistry research. PMID:17671611
González, Jorge Ernesto; Radl, Analía; Romero, Ivonne; Barquinero, Joan Francesc; García, Omar; Di Giorgio, Marina
2016-12-01
Mitotic index (MI) estimation, expressed as the percentage of cells in mitosis, plays an important role as a quality control endpoint. To this end, MI is applied to check the lots of media and reagents to be used throughout the assay and also to check cellular viability after blood sample shipping, indicating satisfactory/unsatisfactory conditions for the progression of cell culture. The objective of this paper was to apply the CellProfiler open-source software for automatic detection of mitotic and nuclear figures from digitized images of cultured human lymphocytes for MI assessment, and to compare its performance against semi-automatic and visual detection. Lymphocytes were irradiated and cultured for mitosis detection. Sets of images from cultures were analyzed visually and the findings were compared with those obtained using the CellProfiler software. The CellProfiler pipeline detects nuclei and mitoses with 80% sensitivity and more than 99% specificity. We conclude that CellProfiler is a reliable tool for counting mitoses and nuclei in cytogenetic images, saves considerable time compared to manual operation, and reduces the variability derived from the scoring criteria of different scorers. The CellProfiler automated pipeline achieves good agreement with the visual counting workflow, i.e. it allows fully automated mitotic and nuclei scoring in cytogenetic images, yielding reliable information with minimal user intervention.
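The endpoint itself is elementary once the pipeline has produced per-image counts. A toy sketch follows, with invented counts chosen to echo the sensitivity and specificity figures above; the detection itself is done by the CellProfiler pipeline, not this code.

```python
# Mitotic index and detection statistics from per-image object counts.
def mitotic_index(n_mitoses: int, n_nuclei: int) -> float:
    return 100.0 * n_mitoses / n_nuclei if n_nuclei else 0.0

def sensitivity_specificity(tp, fn, tn, fp):
    return tp / (tp + fn), tn / (tn + fp)

print(f"MI = {mitotic_index(n_mitoses=42, n_nuclei=1000):.1f}%")
sens, spec = sensitivity_specificity(tp=40, fn=10, tn=9900, fp=8)
print(f"sensitivity {sens:.0%}, specificity {spec:.2%}")  # 80%, >99%
```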
Ensemble Eclipse: A Process for Prefab Development Environment for the Ensemble Project
NASA Technical Reports Server (NTRS)
Wallick, Michael N.; Mittman, David S.; Shams, Khawaja S.; Bachmann, Andrew G.; Ludowise, Melissa
2013-01-01
This software simplifies the process of setting up an Eclipse IDE programming environment for the members of the cross-NASA-center project Ensemble. It achieves this by assembling all the necessary add-ons and custom tools/preferences. This software is unique in that it allows developers in the Ensemble Project (approximately 20 to 40 at any time) across multiple NASA centers to set up a development environment almost instantly and work on Ensemble software. The software automatically includes the source code repositories and other vital information and settings. The Eclipse IDE is an open-source development framework. The NASA (Ensemble-specific) version of the software includes Ensemble-specific plug-ins as well as settings for the Ensemble project. This software saves developers the time and hassle of setting up a programming environment, making sure that everything is set up in the correct manner for Ensemble development. Existing software (i.e., standard Eclipse) requires an intensive setup process that is both time-consuming and error-prone. This software is built once by a single user and tested, allowing other developers to simply download and use the software.
UniPrime2: a web service providing easier Universal Primer design.
Boutros, Robin; Stokes, Nicola; Bekaert, Michaël; Teeling, Emma C
2009-07-01
The UniPrime2 web server is a publicly available online resource which automatically designs large sets of universal primers when given a gene reference ID or FASTA sequence input by a user. UniPrime2 works by automatically retrieving and aligning homologous sequences from GenBank, identifying regions of conservation within the alignment, and generating suitable primers that can be used to amplify variable genomic regions. In essence, UniPrime2 is a suite of publicly available software packages (Blastn, T-Coffee, GramAlign, Primer3) which reduces the laborious process of primer design by integrating these programs into a single software pipeline. Hence, UniPrime2 differs from previous primer design web services in that all steps are automated, linked, saved and phylogenetically delimited, requiring only a single user-defined gene reference ID or input sequence. We provide an overview of the web service and wet-laboratory validation of the primers generated. The system is freely accessible at: http://uniprime.batlab.eu. UniPrime2 is licensed under a Creative Commons Attribution Noncommercial-Share Alike 3.0 Licence.
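One pipeline stage, finding conserved regions in the alignment, can be illustrated directly. This is a simplification, not UniPrime2 code; the toy alignment and thresholds are invented.

```python
# Scan a multiple alignment for conserved windows that could seed primers.
def conserved_windows(alignment, window=20, min_identity=0.95):
    length = len(alignment[0])
    for start in range(length - window + 1):
        cols = [{seq[i] for seq in alignment} for i in range(start, start + window)]
        # A column is conserved if all sequences agree and carry no gap.
        identity = sum(len(c) == 1 and "-" not in c for c in cols) / window
        if identity >= min_identity:
            yield start, alignment[0][start:start + window]

aln = ["ATGGCGTACGTTAGCATCGGATCGATCGTACG",
       "ATGGCGTACGTTAGCATCGGATCGATCTACCG",
       "ATGGCGTACGTTAGCATCGGATCGATCGTAAG"]
for pos, seq in conserved_windows(aln, window=20, min_identity=1.0):
    print(pos, seq)
```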
Standardizing Activation Analysis: New Software for Photon Activation Analysis
NASA Astrophysics Data System (ADS)
Sun, Z. J.; Wells, D.; Segebade, C.; Green, J.
2011-06-01
Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time and manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and ASP.NET technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it usable anywhere the internet is accessible. By switching the underlying nuclide library and the related formulas, the software can easily be expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimal input from the user, the software has proven to be fast, user-friendly and reliable.
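The identification step can be sketched as peak-to-library matching followed by a relative concentration estimate. The library entries and peak areas below are placeholders for illustration, not values from the paper.

```python
# Match detected gamma peaks to a small nuclide library, then quantify
# against a standard of known concentration (relative method).
LIBRARY = {846.8: "Mn-56", 1368.6: "Na-24", 1778.9: "Al-28"}  # keV -> product

def identify(peaks_kev, tolerance_kev=1.5):
    for peak in peaks_kev:
        for ref, nuclide in LIBRARY.items():
            if abs(peak - ref) <= tolerance_kev:
                yield peak, nuclide

def concentration_ppm(area_sample, area_standard, conc_standard_ppm):
    # Assumes identical irradiation and counting geometry for both samples.
    return conc_standard_ppm * area_sample / area_standard

for energy, nuclide in identify([846.5, 1368.9]):
    print(f"peak at {energy} keV -> {nuclide}")
print(f"estimated concentration: {concentration_ppm(5200, 4100, 100.0):.0f} ppm")
```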
Analyzing the Core Flight Software (CFS) with SAVE
NASA Technical Reports Server (NTRS)
Ganesan, Dharmalingam; Lindvall, Mikael; McComas, David
2008-01-01
This viewgraph presentation describes the SAVE tool and its application to the Core Flight Software (CFS). The contents include: 1) Fraunhofer - a short intro; 2) Context of this Collaboration; 3) CFS - Core Flight Software?; 4) The SAVE Tool; 5) Applying SAVE to CFS - a few example analyses; and 6) Goals.
Advisory Systems Save Time, Fuel for Airlines
NASA Technical Reports Server (NTRS)
2012-01-01
Heinz Erzberger never thought the sky was falling, but he knew it could benefit from enhanced traffic control. Throughout the 1990s, Erzberger led a team at Ames Research Center to develop a suite of automated tools to reduce restrictions and improve the efficiency of air traffic control operations. Called CTAS, or Center-TRACON (Terminal Radar Approach Control) Automation System, the software won NASA's Software of the Year award in 1998, and one of the tools in the suite, the traffic management advisor, was adopted by the Federal Aviation Administration and implemented at traffic control centers across the United States. Another one of the tools, Direct-To, has followed a different path. The idea behind Direct-To, explains Erzberger, a senior scientist at Ames, was that airlines could save fuel and money by shortening the routes they flew between take-off and landing. Aircraft are often limited to following established airways comprised of inefficient route segments. The routes are not easily adjusted because neither the pilot nor the controller can anticipate the constantly changing air traffic situation. To make routes more direct while in flight, Erzberger came up with an idea for a software algorithm that could automatically examine air traffic in real time, check to see if a shortcut was available, and then check for conflicts. If there were no conflicts and the shortcut saved more than 1 minute of flight time, the controller could be notified. "I was trying to figure out what goes on in the pilot's and controller's minds when they decide to guide the aircraft in a certain way. That resulted in a different kind of analysis," Erzberger says. As the engineer's idea went from theory to practice, in 2001, NASA demonstrated Direct-To in the airspace of Dallas-Ft. Worth. Estimates based on the demonstration found the technology was capable of saving 900 flying minutes per day for the aircraft in the test area.
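The decision rule described, offer the shortcut only if it is conflict-free and saves more than one minute, is easy to sketch. The geometry, speed, and waypoints below are invented for illustration; this is not the CTAS/Direct-To code.

```python
# Toy Direct-To test: compare filed-route time against the direct route.
from math import hypot

def flight_time_min(path, speed_nm_per_min=8.0):  # ~480 knots, assumed
    legs = zip(path, path[1:])
    return sum(hypot(x2 - x1, y2 - y1) for (x1, y1), (x2, y2) in legs) / speed_nm_per_min

def advise_direct_to(current, waypoints, destination, conflict_free):
    filed = flight_time_min([current, *waypoints, destination])
    direct = flight_time_min([current, destination])
    if conflict_free and filed - direct > 1.0:
        return f"DIRECT-TO saves {filed - direct:.1f} min"
    return "keep filed route"

print(advise_direct_to((0, 0), [(40, 30), (80, 10)], (120, 0), conflict_free=True))
```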
FISH Finder: a high-throughput tool for analyzing FISH images
Shirley, James W.; Ty, Sereyvathana; Takebayashi, Shin-ichiro; Liu, Xiuwen; Gilbert, David M.
2011-01-01
Motivation: Fluorescence in situ hybridization (FISH) is used to study the organization and the positioning of specific DNA sequences within the cell nucleus. Analyzing the data from FISH images is a tedious process that involves an element of subjectivity. Automated FISH image analysis offers savings in time as well as the benefit of objective data analysis. While several FISH image analysis software tools have been developed, they often use a threshold-based segmentation algorithm for nucleus segmentation. As fluorescence signal intensities can vary significantly from experiment to experiment, from cell to cell, and within a cell, threshold-based segmentation is inflexible and often insufficient for automatic image analysis, leading to additional manual segmentation and potential subjective bias. To overcome these problems, we developed a graphical software tool called FISH Finder to automatically analyze FISH images that vary significantly. By posing nucleus segmentation as a classification problem, a compound Bayesian classifier is employed so that contextual information is utilized, resulting in reliable classification and boundary extraction. This makes it possible to analyze FISH images efficiently and objectively without adjustment of input parameters. Additionally, FISH Finder was designed to analyze the distances between differentially stained FISH probes. Availability: FISH Finder is a standalone MATLAB application and platform-independent software. The program is freely available from: http://code.google.com/p/fishfinder/downloads/list Contact: gilbert@bio.fsu.edu PMID:21310746
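The contrast between a fixed threshold and a learned classifier can be demonstrated on synthetic pixel features. This sketch uses a Gaussian naive Bayes model as a simple stand-in for the compound Bayesian classifier described, with invented intensities; it is not the FISH Finder MATLAB code.

```python
# Fixed threshold vs. learned classifier when signal intensity varies.
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)

def pixels(bg, fg, n=500):
    # Two features per pixel (intensity, local-mean intensity), two classes.
    X = np.r_[rng.normal(bg, 5, (n, 2)), rng.normal(fg, 5, (n, 2))]
    y = np.r_[np.zeros(n), np.ones(n)]          # 0 = background, 1 = nucleus
    return X, y

X1, y1 = pixels(bg=20, fg=120)   # bright experiment
X2, y2 = pixels(bg=20, fg=60)    # dim experiment

fixed_threshold = 90.0           # tuned on the bright experiment only
print("threshold acc on dim:", ((X2[:, 0] > fixed_threshold) == y2).mean())

clf = GaussianNB().fit(np.r_[X1, X2], np.r_[y1, y2])  # trained across both
print("classifier acc on dim:", (clf.predict(X2) == y2).mean())
```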
Parallel line analysis: multifunctional software for the biomedical sciences
NASA Technical Reports Server (NTRS)
Swank, P. R.; Lewis, M. L.; Damron, K. L.; Morrison, D. R.
1990-01-01
An easy-to-use, interactive FORTRAN program for analyzing the results of parallel line assays is described. The program is menu driven and consists of five major components: data entry, data editing, manual analysis, manual plotting, and automatic analysis and plotting. Data can be entered from the terminal or from previously created data files. The data editing portion of the program is used to inspect and modify data and to statistically identify outliers. The manual analysis component is used to test the assumptions necessary for parallel line assays using analysis of covariance techniques and to determine potency ratios with confidence limits. The manual plotting component provides a graphic display of the data on the terminal screen or on a standard line printer. The automatic portion runs through multiple analyses without operator input. Data may be saved in a special file to expedite input at a future time.
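The central parallel-line computation, a common-slope fit whose horizontal offset yields the relative potency, can be sketched with NumPy in place of the FORTRAN routines; the dose-response numbers are invented.

```python
# Parallel-line assay: fit both preparations with a shared slope on
# log-dose, then read the shift between lines as log relative potency.
import numpy as np

log_dose = np.log10([1, 2, 4, 8] * 2)
response = np.array([10.1, 14.8, 20.2, 25.1,     # standard
                     12.0, 16.9, 22.3, 26.8])    # test preparation
is_test = np.array([0, 0, 0, 0, 1, 1, 1, 1])

# Design: separate intercepts, shared slope  ->  y = a_std + d*is_test + b*x
X = np.column_stack([np.ones(8), is_test, log_dose])
(a_std, d, b), *_ = np.linalg.lstsq(X, response, rcond=None)

rel_potency = 10 ** (d / b)   # horizontal shift along the log-dose axis
print(f"common slope {b:.2f}, relative potency {rel_potency:.2f}")
```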
Quick, Accurate, Smart: 3D Computer Vision Technology Helps Assessing Confined Animals' Behaviour.
Barnard, Shanis; Calderara, Simone; Pistocchi, Simone; Cucchiara, Rita; Podaliri-Vulpiani, Michele; Messori, Stefano; Ferri, Nicola
2016-01-01
Mankind directly controls the environment and lifestyles of several domestic species for purposes ranging from production and research to conservation and companionship. These environments and lifestyles may not offer these animals the best quality of life. Behaviour is a direct reflection of how the animal is coping with its environment. Behavioural indicators are thus among the preferred parameters to assess welfare. However, behavioural recording (usually from video) can be very time consuming and the accuracy and reliability of the output rely on the experience and background of the observers. The outburst of new video technology and computer image processing gives the basis for promising solutions. In this pilot study, we present a new prototype software able to automatically infer the behaviour of dogs housed in kennels from 3D visual data and through structured machine learning frameworks. Depth information acquired through 3D features, body part detection and training are the key elements that allow the machine to recognise postures, trajectories inside the kennel and patterns of movement that can be later labelled at convenience. The main innovation of the software is its ability to automatically cluster frequently observed temporal patterns of movement without any pre-set ethogram. Conversely, when common patterns are defined through training, a deviation from normal behaviour in time or between individuals could be assessed. The software accuracy in correctly detecting the dogs' behaviour was checked through a validation process. An automatic behaviour recognition system, independent from human subjectivity, could add scientific knowledge on animals' quality of life in confinement as well as saving time and resources. This 3D framework was designed to be invariant to the dog's shape and size and could be extended to farm, laboratory and zoo quadrupeds in artificial housing. The computer vision technique applied to this software is innovative in non-human animal behaviour science. Further improvements and validation are needed, and future applications and limitations are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, H; Tan, J; Kavanaugh, J
Purpose: Radiotherapy (RT) contours delineated either manually or semi-automatically require verification before clinical usage. Manual evaluation is very time consuming. A new integrated software tool using supervised pattern recognition of contours was thus developed to facilitate this process. Methods: The contouring tool was developed using the object-oriented programming language C# and application programming interfaces, e.g. the Visualization Toolkit (VTK). The C# language served as the basis of the tool design. The Accord.NET scientific computing libraries were utilized for the required statistical data processing and pattern recognition, while VTK was used to build and render 3-D mesh models of critical RT structures in real time with 360° visualization. Principal component analysis (PCA) was used for system self-updating of the geometry variations of normal structures, based on physician-approved RT contours as a training dataset. The in-house supervised PCA-based contour recognition method was used to automatically evaluate contour normality/abnormality. The function for reporting the contour evaluation results was implemented using C# and the Windows Form Designer. Results: The software input was RT simulation images and RT structures from commercial clinical treatment planning systems. Several capabilities were demonstrated: automatic assessment of RT contours, file loading/saving of medical images of various modalities and RT contours, and generation/visualization of 3-D images and anatomical models. Moreover, it supported 360° rendering of the RT structures in a multi-slice view, which allows physicians to visually check and edit abnormally contoured structures. Conclusion: This new software integrates the supervised learning framework with image processing and graphical visualization modules for RT contour verification. This tool has great potential for facilitating treatment planning, with an automatic contour evaluation module helping physicians/dosimetrists avoid unnecessary manual verification. In addition, its nature as a compact and stand-alone tool allows for future extensibility to include additional functions for physicians' clinical needs.
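The PCA screening idea can be sketched in Python (the tool itself is C# with Accord.NET): learn a shape subspace from approved contours and flag a new contour whose reconstruction error is unusually large. All data below are synthetic.

```python
# PCA-based abnormality screening on synthetic "contour" shape vectors.
import numpy as np

rng = np.random.default_rng(0)
# 50 approved shape vectors from a 5-dimensional latent space plus noise.
latent = rng.normal(size=(50, 5)) @ rng.normal(size=(5, 40))
approved = latent + 0.1 * rng.normal(size=(50, 40))

mean = approved.mean(axis=0)
U, S, Vt = np.linalg.svd(approved - mean, full_matrices=False)
k = int(np.searchsorted(np.cumsum(S**2) / np.sum(S**2), 0.95)) + 1
P = Vt[:k]                          # principal shape components (~95% var)

def reconstruction_error(x):
    c = (x - mean) @ P.T            # coordinates in the learned subspace
    return float(np.linalg.norm((x - mean) - c @ P))

threshold = max(reconstruction_error(x) for x in approved)
suspect = approved[0] + rng.normal(scale=3.0, size=40)   # distorted contour
print(reconstruction_error(suspect) > threshold)          # expect True -> flag
```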
The impact of translucent fabric shades and control strategies on energy savings and visual quality
NASA Astrophysics Data System (ADS)
Wankanapon, Pimonmart
Translucent fabric shades provide opportunities for building occupants to control sunlight penetration for heat reduction, thermal comfort, and visual quality. Regulating shades affects building energy use and can potentially reduce the size of mechanical cooling systems. Shades are not normally included in energy model studies during the design process, even though shades potentially impact energy use. This is because occupants normally leave shades closed a large fraction of the time, while models are generally run with no shades. Automatic shade control is now available, so it is necessary to understand the impact of shades on visual quality and their energy saving potential in order to optimize their overall performance. Very few studies have addressed shades and their integrated performance in terms of energy consumption and visual quality, and most of these do not reflect modern shade types and their applications. The goals of this study are: first, to determine the impact of shades on total, heating, cooling and lighting energy savings with different design and operation parameters; second, to study and develop different automatic shade control strategies to promote and optimize energy savings and visual quality. A simulation-based approach using EnergyPlus in a parametric study provides a better understanding of energy savings under different shade conditions. The parametric runs addressed various building parameters such as geometry, orientation, site climate, glazing/shade properties, and shade control strategies with integrated lighting control. The impact of shades was determined for total building and space heating, cooling and lighting energy savings. The effect of shades on visual quality was studied using EnergyPlus, AGI32 and DAYSIM for several indices such as daylight glare index (DGI), work plane illuminance, luminance ratios and view. Different shade control strategies and integrated lighting control were considered with two translucent fabric shade colors. The results clearly show the benefit of automatic shade control strategies with integrated lighting control over a condition where shades are closed all day. The main contributor to the total energy savings is lighting energy savings, followed by cooling energy savings. Shades provide greater benefit in hot and moderate climates than in a cold climate. Different control strategies provide savings in the range of 7-35% of annual total space energy, with higher savings for light-colored shades. Shade control strategies should be selected and optimized based on climate, orientation, window area, and window/shade properties. High-performance glazings, when equipped with shades, show lower energy savings compared to standard glazings. High-transmittance/reflectance shades, such as white shades, perform better than dark shades in most cases due to the higher lighting energy savings obtained with automatic electric lighting control and the resulting cooling energy savings from rejection of some solar energy and a reduction in the heat from lights. A south orientation showed the least benefit from automatic control of shades compared to other orientations due to the large fraction of time shades are required to provide visual comfort. Under automatic shade control, energy savings are higher the more often the shades can be raised. The different automatic control strategies present tradeoffs between energy savings and comfort.
With regard to visual quality, daylight quality assessments on view, glare, luminance ratios, and UDI can be used to assess shade control strategies. Automatic shade control can increase the number of view hours while controlling sunlight penetration. With automatic shade control, more daylight hours can be provided within the beneficial range of 100-2000 lux compared to shades that are closed all day. For a person facing the window, discomfort glare is likely to increase the more often the shades are raised. Keeping the shades down ensures an acceptable glare condition, but limits energy savings. Luminance ratios are another metric that can be used to assess shade performance. With white shades, the luminance ratios between the task and proximate surfaces are improved. Dark shades help improve the luminance ratios between the task and distant surfaces. When the shades are left open, even with no direct sunlight in the space, task to window luminance ratios will often exceed 1:10.
ATD-1 Team Completes Flight Tests
2017-02-24
Members of a NASA-led research team pose in front of a trio of aircraft, which on Feb. 22 concluded racking up enough air miles to circle the planet four times, all in the name of testing a new cockpit-based air traffic management tool. The prototype hardware and software is designed to automatically provide pilots with more precise spacing information on approach into a busy airport so that more planes can safely land in a given time. The technology is intended to help airplanes spend less time in the air, save money on fuel, and reduce engine emissions – all the while improving schedule efficiency to help passengers arrive on time.
Clinical evaluation of atlas and deep learning based automatic contouring for lung cancer.
Lustberg, Tim; van Soest, Johan; Gooding, Mark; Peressutti, Devis; Aljabar, Paul; van der Stoep, Judith; van Elmpt, Wouter; Dekker, Andre
2018-02-01
Contouring of organs at risk (OARs) is an important but time consuming part of radiotherapy treatment planning. The aim of this study was to investigate whether software-generated contours, created with institutional data, save time when used as a starting point for manual OAR contouring for lung cancer patients. Twenty CT scans of stage I-III NSCLC patients were used to compare user-adjusted contours, after atlas-based and deep learning contouring, against manual delineation. The lungs, esophagus, spinal cord, heart and mediastinum were contoured for this study. The time to perform the manual tasks was recorded. With a median time of 20 min for manual contouring, the total median time saved was 7.8 min when using atlas-based contouring and 10 min for deep learning contouring. Both atlas-based and deep learning adjustment times were significantly lower than the manual contouring time for all OARs, except for the left lung and esophagus with atlas-based contouring. User adjustment of software-generated contours is a viable strategy to reduce the contouring time of OARs for lung radiotherapy while conforming to local clinical standards. In addition, deep learning contouring shows promising results compared to existing solutions.
SU-F-T-94: Plan2pdf - a Software Tool for Automatic Plan Report for Philips Pinnacle TPS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, C
Purpose: To implement an automatic electronic PDF plan reporting tool for the Philips Pinnacle treatment planning system (TPS). Methods: We developed electronic treatment plan reporting software that enables fully automatic PDF reporting from Pinnacle TPS to external EMR programs such as MOSAIQ. The tool is named "plan2pdf". plan2pdf is implemented using Pinnacle scripts, Java and UNIX shell scripts, without any external program needed. plan2pdf supports a full auto mode and a manual reporting mode. In full auto mode, with a single mouse click, plan2pdf will generate a detailed Pinnacle plan report in PDF format, which includes a customizable cover page, the Pinnacle plan summary, orthogonal views through each plan POI and the maximum dose point, a DRR for each beam, serial transverse views captured throughout the dose grid at a user-specified interval, and the DVH and scorecard windows. The final PDF report is also automatically bookmarked for each section above for convenient plan review. The final PDF report can either be saved in a user-specified folder on Pinnacle, or it can be automatically exported to an EMR import folder via a user-configured FTP service. In manual capture mode, plan2pdf allows users to capture any Pinnacle plan by full screen, individual window or a rectangular ROI drawn on screen. Furthermore, to avoid possible mix-up of patients' plans during auto-mode reporting, a user conflict check feature is included in plan2pdf: it prompts the user to wait if another patient is being exported by plan2pdf by another user. Results: plan2pdf was tested extensively and successfully at our institution, which consists of 5 centers, 15 dosimetrists and 10 physicists, running Pinnacle version 9.10 on Enterprise servers. Conclusion: plan2pdf provides a highly efficient, user-friendly and clinically proven platform for all Philips Pinnacle users to generate a detailed plan report in PDF format for external EMR systems.
Tokutomi, Tomoharu; Fukushima, Akimune; Yamamoto, Kayono; Bansho, Yasushi; Hachiya, Tsuyoshi; Shimizu, Atsushi
2017-07-14
The Tohoku Medical Megabank project aims to create a next-generation personalized healthcare system by conducting large-scale genome-cohort studies involving three generations of local residents in the areas affected by the Great East Japan Earthquake. We collected medical and genomic information for developing a biobank to be used for this healthcare system. We designed a questionnaire-based pedigree-creation software program named "f-treeGC," which enables even less experienced medical practitioners to accurately and rapidly collect family health histories and create pedigree charts. f-treeGC may be run on Adobe AIR. Pedigree charts are created in the following manner: 1) At system startup, the client is prompted to provide required information on the presence or absence of children; f-treeGC is capable of creating a pedigree spanning up to three generations. 2) An interviewer fills out a multiple-choice questionnaire on genealogical information. 3) The information requested includes name, age, gender, general status, infertility status, pregnancy status, fetal status, and physical features or health conditions of individuals over three generations. In addition, information regarding the client and the proband, and birth order information, including multiple gestation, custody, multiple individuals, donor or surrogate, adoption, and consanguinity, may be included. 4) f-treeGC shows only marriages between first cousins via the overlay function. 5) f-treeGC automatically creates a pedigree chart, and the chart-creation process is visible for inspection on the screen in real time. 6) The genealogical data may be saved as a file in the original format. The created/modified date and time may be changed as required, and the file may be password-protected and/or saved in read-only format. To enable sorting or searching from the database, the file name automatically contains the terms typed into the entry fields, including physical features or health conditions, by default. 7) Alternatively, family histories are collected using a completed foldable paper interview sheet named "f-sheet," which is identical to the questionnaire in f-treeGC. We developed questionnaire-based family-tree-creation software, named f-treeGC, which is fully compliant with international recommendations for standardized human pedigree nomenclature. The present software simplifies the process of collecting family histories and pedigrees, and has a variety of uses, from genome cohort studies or primary care to genetic counseling.
An Automated Motion Detection and Reward System for Animal Training.
Miller, Brad; Lim, Audrey N; Heidbreder, Arnold F; Black, Kevin J
2015-12-04
A variety of approaches has been used to minimize head movement during functional brain imaging studies in awake laboratory animals. Many laboratories expend substantial effort and time training animals to remain essentially motionless during such studies. We could not locate an "off-the-shelf" automated training system that suited our needs. We developed a time- and labor-saving automated system to train animals to hold still for extended periods of time. The system uses a personal computer and modest external hardware to provide stimulus cues, monitor movement using commercial video surveillance components, and dispense rewards. A custom computer program automatically increases the motionless duration required for rewards based on performance during the training session but allows changes during sessions. This system was used to train cynomolgus monkeys (Macaca fascicularis) for awake neuroimaging studies using positron emission tomography (PET) and functional magnetic resonance imaging (fMRI). The automated system saved the trainer substantial time, presented stimuli and rewards in a highly consistent manner, and automatically documented training sessions. We have limited data to prove the training system's success, drawn from the automated records during training sessions, but we believe others may find it useful. The system can be adapted to a range of behavioral training/recording activities for research or commercial applications, and the software is freely available for non-commercial use.
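The adaptive schedule described, lengthening the required motionless duration after successes and easing it after failures, can be sketched as a simple loop; the step sizes, probabilities, and limits below are invented, and the random draw stands in for the video-motion monitor.

```python
# Adaptive motionless-duration training loop with a simulated animal.
import random

def training_session(trials=200, start_s=1.0, step=1.1, floor_s=0.5, cap_s=60.0):
    required, rewards = start_s, 0
    for _ in range(trials):
        # Stand-in for the motion monitor: the longer the requirement,
        # the less likely the animal holds still the whole time.
        held_still = random.random() < max(0.2, 1.0 - required / 30.0)
        if held_still:
            rewards += 1
            required = min(cap_s, required * step)   # make it a bit harder
        else:
            required = max(floor_s, required / step) # ease off after a miss
    return required, rewards

random.seed(0)
final_required, n_rewards = training_session()
print(f"final requirement {final_required:.1f}s after {n_rewards} rewards")
```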
Unified Software Solution for Efficient SPR Data Analysis in Drug Research
Dahl, Göran; Steigele, Stephan; Hillertz, Per; Tigerström, Anna; Egnéus, Anders; Mehrle, Alexander; Ginkel, Martin; Edfeldt, Fredrik; Holdgate, Geoff; O’Connell, Nichole; Kappler, Bernd; Brodte, Annette; Rawlins, Philip B.; Davies, Gareth; Westberg, Eva-Lotta; Folmer, Rutger H. A.; Heyse, Stephan
2016-01-01
Surface plasmon resonance (SPR) is a powerful method for obtaining detailed molecular interaction parameters. Modern instrumentation with its increased throughput has enabled routine screening by SPR in hit-to-lead and lead optimization programs, and SPR has become a mainstream drug discovery technology. However, the processing and reporting of SPR data in drug discovery are typically performed manually, which is both time-consuming and tedious. Here, we present the workflow concept, design and experiences with a software module relying on a single, browser-based software platform for the processing, analysis, and reporting of SPR data. The efficiency of this concept lies in the immediate availability of end results: data are processed and analyzed upon loading the raw data file, allowing the user to immediately quality control the results. Once completed, the user can automatically report those results to data repositories for corporate access and quickly generate printed reports or documents. The software module has resulted in a very efficient and effective workflow through saved time and improved quality control. We discuss these benefits and show how this process defines a new benchmark in the drug discovery industry for the handling, interpretation, visualization, and sharing of SPR data. PMID:27789754
Development of 6-DOF painting robot control system
NASA Astrophysics Data System (ADS)
Huang, Junbiao; Liu, Jianqun; Gao, Weiqiang
2017-01-01
With the development of society, spraying technology in China's manufacturing industry has changed from manual operation to automatic spraying by 6-DOF (degree of freedom) robots. Spray painting robots can not only take over work that is harmful to human beings, but also improve production efficiency and save labor costs. The control system is the most critical part of a 6-DOF robot; however, there is still a lack of relevant technology research in China. It is therefore necessary to study a control system for 6-DOF spray painting robots that is easy to operate, efficient, and stable. Using the Googol controller platform, this paper develops programs based on the Windows CE embedded system to control the robot to perform painting work. Software development is the core of the robot control system, including the direct teaching module, playback module, motion control module, settings module, man-machine interface, alarm module, log module, etc. All the development work of the entire software system has been completed, and it has been verified that the software works stably and efficiently.
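The direct-teaching and playback pair can be illustrated with a minimal record-and-replay loop. This is a toy sketch, not the Googol/Windows CE code; the joint readings are faked and the timing is coarse.

```python
# Record joint angles during hand guiding, then replay with original timing.
import time

class TeachPlayback:
    def __init__(self):
        self.trajectory = []                     # (timestamp, joint angles)

    def teach(self, read_joints, duration_s=2.0, period_s=0.5):
        """Sample joint angles while the arm is guided by hand."""
        t0 = time.time()
        while (now := time.time() - t0) < duration_s:
            self.trajectory.append((now, read_joints()))
            time.sleep(period_s)

    def playback(self, send_joints):
        """Re-issue the recorded motion on the same schedule."""
        t0 = time.time()
        for t, joints in self.trajectory:
            time.sleep(max(0.0, t - (time.time() - t0)))
            send_joints(joints)

readings = iter([[0, 0, 0, 0, 0, 0], [10, 5, 0, 0, 0, 0],
                 [20, 10, 5, 0, 0, 0], [30, 15, 10, 5, 0, 0],
                 [40, 20, 15, 10, 5, 0]])
robot = TeachPlayback()
robot.teach(lambda: next(readings))
robot.playback(lambda j: print("move to", j))
```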
Automatic Differentiation in Quantum Chemistry with Applications to Fully Variational Hartree-Fock.
Tamayo-Mendoza, Teresa; Kreisbeck, Christoph; Lindh, Roland; Aspuru-Guzik, Alán
2018-05-23
Automatic differentiation (AD) is a powerful tool that allows calculating derivatives of implemented algorithms with respect to all of their parameters up to machine precision, without the need to explicitly add any additional functions. Thus, AD has great potential in quantum chemistry, where gradients are omnipresent but also difficult to obtain, and researchers typically spend a considerable amount of time finding suitable analytical forms when implementing derivatives. Here, we demonstrate that AD can be used to compute gradients with respect to any parameter throughout a complete quantum chemistry method. We present DiffiQult , a Hartree-Fock implementation, entirely differentiated with the use of AD tools. DiffiQult is a software package written in plain Python with minimal deviation from standard code which illustrates the capability of AD to save human effort and time in implementations of exact gradients in quantum chemistry. We leverage the obtained gradients to optimize the parameters of one-particle basis sets in the context of the floating Gaussian framework.
Automatic Differentiation in Quantum Chemistry with Applications to Fully Variational Hartree–Fock
2018-01-01
Automatic differentiation (AD) is a powerful tool that allows calculating derivatives of implemented algorithms with respect to all of their parameters up to machine precision, without the need to explicitly add any additional functions. Thus, AD has great potential in quantum chemistry, where gradients are omnipresent but also difficult to obtain, and researchers typically spend a considerable amount of time finding suitable analytical forms when implementing derivatives. Here, we demonstrate that AD can be used to compute gradients with respect to any parameter throughout a complete quantum chemistry method. We present DiffiQult, a Hartree–Fock implementation, entirely differentiated with the use of AD tools. DiffiQult is a software package written in plain Python with minimal deviation from standard code which illustrates the capability of AD to save human effort and time in implementations of exact gradients in quantum chemistry. We leverage the obtained gradients to optimize the parameters of one-particle basis sets in the context of the floating Gaussian framework.
Automatic Tools for Enhancing the Collaborative Experience in Large Projects
NASA Astrophysics Data System (ADS)
Bourilkov, D.; Rodriquez, J. L.
2014-06-01
With the explosion of big data in many fields, the efficient management of knowledge about all aspects of the data analysis gains in importance. A key feature of collaboration in large scale projects is keeping a log of what is being done and how - for private use, reuse, and for sharing selected parts with collaborators and peers, often distributed geographically on an increasingly global scale. Even better if the log is automatically created on the fly while the scientist or software developer is working in a habitual way, without the need for extra efforts. This saves time and enables a team to do more with the same resources. The CODESH - COllaborative DEvelopment SHell - and CAVES - Collaborative Analysis Versioning Environment System projects address this problem in a novel way. They build on the concepts of virtual states and transitions to enhance the collaborative experience by providing automatic persistent virtual logbooks. CAVES is designed for sessions of distributed data analysis using the popular ROOT framework, while CODESH generalizes the approach for any type of work on the command line in typical UNIX shells like bash or tcsh. Repositories of sessions can be configured dynamically to record and make available the knowledge accumulated in the course of a scientific or software endeavor. Access can be controlled to define logbooks of private sessions or sessions shared within or between collaborating groups. A typical use case is building working scalable systems for analysis of Petascale volumes of data as encountered in the LHC experiments. Our approach is general enough to find applications in many fields.
The research of automatic speed control algorithm based on Green CBTC
NASA Astrophysics Data System (ADS)
Lin, Ying; Xiong, Hui; Wang, Xiaoliang; Wu, Youyou; Zhang, Chuanqi
2017-06-01
Automatic speed control algorithm is one of the core technologies of train operation control system. It’s a typical multi-objective optimization control algorithm, which achieve the train speed control for timing, comfort, energy-saving and precise parking. At present, the train speed automatic control technology is widely used in metro and inter-city railways. It has been found that the automatic speed control technology can effectively reduce the driver’s intensity, and improve the operation quality. However, the current used algorithm is poor at energy-saving, even not as good as manual driving. In order to solve the problem of energy-saving, this paper proposes an automatic speed control algorithm based on Green CBTC system. Based on the Green CBTC system, the algorithm can adjust the operation status of the train to improve the efficient using rate of regenerative braking feedback energy while ensuring the timing, comfort and precise parking targets. Due to the reason, the energy-using of Green CBTC system is lower than traditional CBTC system. The simulation results show that the algorithm based on Green CBTC system can effectively reduce the energy-using due to the improvement of the using rate of regenerative braking feedback energy.
NASA Astrophysics Data System (ADS)
Bhattacharjee, T.; Kumar, P.; Fillipe, L.
2018-02-01
Vibrational spectroscopy, especially FTIR and Raman, has shown enormous potential in disease diagnosis, especially in cancers. Their potential for detecting varied pathological conditions are regularly reported. However, to prove their applicability in clinics, large multi-center multi-national studies need to be undertaken; and these will result in enormous amount of data. A parallel effort to develop analytical methods, including user-friendly software that can quickly pre-process data and subject them to required multivariate analysis is warranted in order to obtain results in real time. This study reports a MATLAB based script that can automatically import data, preprocess spectra— interpolation, derivatives, normalization, and then carry out Principal Component Analysis (PCA) followed by Linear Discriminant Analysis (LDA) of the first 10 PCs; all with a single click. The software has been verified on data obtained from cell lines, animal models, and in vivo patient datasets, and gives results comparable to Minitab 16 software. The software can be used to import variety of file extensions, asc, .txt., .xls, and many others. Options to ignore noisy data, plot all possible graphs with PCA factors 1 to 5, and save loading factors, confusion matrices and other parameters are also present. The software can provide results for a dataset of 300 spectra within 0.01 s. We believe that the software will be vital not only in clinical trials using vibrational spectroscopic data, but also to obtain rapid results when these tools get translated into clinics.
Wang, Yudan; Wen, Guojun; Chen, Han
2017-04-27
The drilling length is an important parameter in the process of horizontal directional drilling (HDD) exploration and recovery, but there has been a lack of accurate, automatically obtained statistics regarding this parameter. Herein, a technique for real-time HDD length detection and a management system based on the electromagnetic detection method with a microprocessor and two magnetoresistive sensors employing the software LabVIEW are proposed. The basic principle is to detect the change in the magnetic-field strength near a current coil while the drill stem and drill-stem joint successively pass through the current coil forward or backward. The detection system consists of a hardware subsystem and a software subsystem. The hardware subsystem employs a single-chip microprocessor as the main controller. A current coil is installed in front of the clamping unit, and two magneto resistive sensors are installed on the sides of the coil symmetrically and perpendicular to the direction of movement of the drill pipe. Their responses are used to judge whether the drill-stem joint is passing through the clamping unit; then, the order of their responses is used to judge the movement direction. The software subsystem is composed of a visual software running on the host computer and a software running in the slave microprocessor. The host-computer software processes, displays, and saves the drilling-length data, whereas the slave microprocessor software operates the hardware system. A combined test demonstrated the feasibility of the entire drilling-length detection system.
Wang, Yudan; Wen, Guojun; Chen, Han
2017-01-01
The drilling length is an important parameter in the process of horizontal directional drilling (HDD) exploration and recovery, but there has been a lack of accurate, automatically obtained statistics regarding this parameter. Herein, a technique for real-time HDD length detection and a management system based on the electromagnetic detection method with a microprocessor and two magnetoresistive sensors employing the software LabVIEW are proposed. The basic principle is to detect the change in the magnetic-field strength near a current coil while the drill stem and drill-stem joint successively pass through the current coil forward or backward. The detection system consists of a hardware subsystem and a software subsystem. The hardware subsystem employs a single-chip microprocessor as the main controller. A current coil is installed in front of the clamping unit, and two magneto resistive sensors are installed on the sides of the coil symmetrically and perpendicular to the direction of movement of the drill pipe. Their responses are used to judge whether the drill-stem joint is passing through the clamping unit; then, the order of their responses is used to judge the movement direction. The software subsystem is composed of a visual software running on the host computer and a software running in the slave microprocessor. The host-computer software processes, displays, and saves the drilling-length data, whereas the slave microprocessor software operates the hardware system. A combined test demonstrated the feasibility of the entire drilling-length detection system. PMID:28448445
Providers issue brief: automated external defibrillators.
Rothouse, M
1999-06-25
With expanded access to automatic external defibrillators, hundreds of lives could be saved on a daily basis. By training nonphysician providers, such as emergency medical service personnel or first responders, this life-saving medical equipment could help improve the survival rates for people suffering from cardiac arrest. During the last two years, state lawmakers have begun to enact legislation that develops training standards and provides immunity from civil liability for automatic external defibrillator users.
Ambrozy, C; Kolar, N A; Rattay, F
2010-01-01
For measurement value logging of board angle values during balance training, it is necessary to develop a measurement system. This study will provide data for a balance study using the smartcard. The data acquisition comes automatically. An individually training plan for each proband is necessary. To store the proband identification a smartcard with an I2C data bus protocol and an E2PROM memory system is used. For reading the smartcard data a smartcard reader is connected via universal serial bus (USB) to a notebook. The data acquisition and smartcard read programme is designed with Microsoft® Visual C#. A training plan file contains the individual training plan for each proband. The data of the test persons are saved in a proband directory. Each event is automatically saved as a log-file for the exact documentation. This system makes study development easy and time-saving.
Automated Installation Verification of COMSOL via LiveLink for MATLAB
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crowell, Michael W
Verifying that a local software installation performs as the developer intends is a potentially time-consuming but necessary step for nuclear safety-related codes. Automating this process not only saves time, but can increase reliability and scope of verification compared to ‘hand’ comparisons. While COMSOL does not include automatic installation verification as many commercial codes do, it does provide tools such as LiveLink™ for MATLAB® and the COMSOL API for use with Java® through which the user can automate the process. Here we present a successful automated verification example of a local COMSOL 5.0 installation for nuclear safety-related calculations at the Oakmore » Ridge National Laboratory’s High Flux Isotope Reactor (HFIR).« less
Measurement and classification of heart and lung sounds by using LabView for educational use.
Altrabsheh, B
2010-01-01
This study presents the design, development and implementation of a simple low-cost method of phonocardiography signal detection. Human heart and lung signals are detected by using a simple microphone through a personal computer; the signals are recorded and analysed using LabView software. Amplitude and frequency analyses are carried out for various phonocardiography pathological cases. Methods for automatic classification of normal and abnormal heart sounds, murmurs and lung sounds are presented. Various cases of heart and lung sound measurement are recorded and analysed. The measurements can be saved for further analysis. The method in this study can be used by doctors as a detection tool aid and may be useful for teaching purposes at medical and nursing schools.
Systems Maintenance Automated Repair Tasks (SMART)
NASA Technical Reports Server (NTRS)
2008-01-01
SMART is an interactive decision analysis and refinement software system that uses evaluation criteria for discrepant conditions to automatically provide and populate a document/procedure with predefined steps necessary to repair a discrepancy safely, effectively, and efficiently. SMART can store the tacit (corporate) knowledge merging the hardware specification requirements with the actual "how to" repair methods, sequences, and required equipment, all within a user-friendly interface. Besides helping organizations retain repair knowledge in streamlined procedures and sequences, SMART can also help them in saving processing time and expense, increasing productivity, improving quality, and adhering more closely to safety and other guidelines. Though SMART was developed for Space Shuttle applications, its interface is easily adaptable to any hardware that can be broken down by component, subcomponent, discrepancy, and repair.
The School Advanced Ventilation Engineering Software (SAVES)
The School Advanced Ventilation Engineering Software (SAVES) package is a tool to help school designers assess the potential financial payback and indoor humidity control benefits of Energy Recovery Ventilation (ERV) systems for school applications.
Wang, Lin; Liu, Simin; Niu, Tianhua; Xu, Xin
2005-03-18
Single nucleotide polymorphisms (SNPs) provide an important tool in pinpointing susceptibility genes for complex diseases and in unveiling human molecular evolution. Selection and retrieval of an optimal SNP set from publicly available databases have emerged as the foremost bottlenecks in designing large-scale linkage disequilibrium studies, particularly in case-control settings. We describe the architectural structure and implementations of a novel software program, SNPHunter, which allows for both ad hoc-mode and batch-mode SNP search, automatic SNP filtering, and retrieval of SNP data, including physical position, function class, flanking sequences at user-defined lengths, and heterozygosity from NCBI dbSNP. The SNP data extracted from dbSNP via SNPHunter can be exported and saved in plain text format for further down-stream analyses. As an illustration, we applied SNPHunter for selecting SNPs for 10 major candidate genes for type 2 diabetes, including CAPN10, FABP4, IL6, NOS3, PPARG, TNF, UCP2, CRP, ESR1, and AR. SNPHunter constitutes an efficient and user-friendly tool for SNP screening, selection, and acquisition. The executable and user's manual are available at http://www.hsph.harvard.edu/ppg/software.htm
NASA Astrophysics Data System (ADS)
Weber, Walter H.; Mair, H. Douglas; Jansen, Dion
2003-03-01
A suite of basic signal processors has been developed. These basic building blocks can be cascaded together to form more complex processors without the need for programming. The data structures between each of the processors are handled automatically. This allows a processor built for one purpose to be applied to any type of data such as images, waveform arrays and single values. The processors are part of Winspect Data Acquisition software. The new processors are fast enough to work on A-scan signals live while scanning. Their primary use is to extract features, reduce noise or to calculate material properties. The cascaded processors work equally well on live A-scan displays, live gated data or as a post-processing engine on saved data. Researchers are able to call their own MATLAB or C-code from anywhere within the processor structure. A built-in formula node processor that uses a simple algebraic editor may make external user programs unnecessary. This paper also discusses the problems associated with ad hoc software development and how graphical programming languages can tie up researchers writing software rather than designing experiments.
Financial management using a computerized system for evaluating health care invoices.
Magnezi, Racheli; Ashkenazi, Isaac
2005-02-01
The Medical Corps of the Israel Defense Forces (IDF) provides health care services for hundreds of thousands of soldiers in IDF clinics and by purchasing services from civilian institutes. Monthly invoices from civilian institutes are so numerous that most are paid with insufficient scrutiny and valuable information regarding soldiers' health care is lost. Our objective was to develop a computerized system for reviewing invoices and gathering data. Based on Oracle software (Oracle, Redwood Shores, California), the system stores the terms of agreements with medical institutes, enters billing data, calculates invoice totals, manages information, and generates reports. It automatically checks for duplicate invoices and confirms payment. The system allows users to view data for decision-making, creates insurance claim files, identifies incorrect charges, assists in quality assurance, and maintains personal patient records. With the system in operation since 2001, savings significantly increased, to approximately 5% of the IDF health care budget. On the basis of information gathered by the system, changes in medical procedures were implemented that are expected to generate even greater savings.
Folks, Russell D; Garcia, Ernest V; Taylor, Andrew T
2007-03-01
Quantitative nuclear renography has numerous potential sources of error. We previously reported the initial development of a computer software module for comprehensively addressing the issue of quality control (QC) in the analysis of radionuclide renal images. The objective of this study was to prospectively test the QC software. The QC software works in conjunction with standard quantitative renal image analysis using a renal quantification program. The software saves a text file that summarizes QC findings as possible errors in user-entered values, calculated values that may be unreliable because of the patient's clinical condition, and problems relating to acquisition or processing. To test the QC software, a technologist not involved in software development processed 83 consecutive nontransplant clinical studies. The QC findings of the software were then tabulated. QC events were defined as technical (study descriptors that were out of range or were entered and then changed, unusually sized or positioned regions of interest, or missing frames in the dynamic image set) or clinical (calculated functional values judged to be erroneous or unreliable). Technical QC events were identified in 36 (43%) of 83 studies. Clinical QC events were identified in 37 (45%) of 83 studies. Specific QC events included starting the camera after the bolus had reached the kidney, dose infiltration, oversubtraction of background activity, and missing frames in the dynamic image set. QC software has been developed to automatically verify user input, monitor calculation of renal functional parameters, summarize QC findings, and flag potentially unreliable values for the nuclear medicine physician. Incorporation of automated QC features into commercial or local renal software can reduce errors and improve technologist performance and should improve the efficiency and accuracy of image interpretation.
Using Automatic Code Generation in the Attitude Control Flight Software Engineering Process
NASA Technical Reports Server (NTRS)
McComas, David; O'Donnell, James R., Jr.; Andrews, Stephen F.
1999-01-01
This paper presents an overview of the attitude control subsystem flight software development process, identifies how the process has changed due to automatic code generation, analyzes each software development phase in detail, and concludes with a summary of our lessons learned.
Effectiveness of an automatic tracking software in underwater motion analysis.
Magalhaes, Fabrício A; Sawacha, Zimi; Di Michele, Rocco; Cortesi, Matteo; Gatta, Giorgio; Fantozzi, Silvia
2013-01-01
Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analysis that interests both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of a software developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 markers' positions) were manually tracked to determine the markers' center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and to correct the position of the cursor when the distance between the calculated marker's coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% less manual interventions for DVP than for COM. In conclusion, based on these results, the developed automatic tracking software presented can be used as a valid and useful tool for underwater motion analysis. Key PointsThe availability of effective software for automatic tracking would represent a significant advance for the practical use of kinematic analysis in swimming and other aquatic sports.An important feature of automatic tracking software is to require limited human interventions and supervision, thus allowing short processing time.When tracking underwater movements, the degree of automation of the tracking procedure is influenced by the capability of the algorithm to overcome difficulties linked to the small target size, the low image quality and the presence of background clutters.The newly developed feature-tracking algorithm has shown a good automatic tracking effectiveness in underwater motion analysis with significantly smaller percentage of required manual interventions when compared to a commercial software.
Zöllner, Frank G; Daab, Markus; Sourbron, Steven P; Schad, Lothar R; Schoenberg, Stefan O; Weisser, Gerald
2016-01-14
Perfusion imaging has become an important image based tool to derive the physiological information in various applications, like tumor diagnostics and therapy, stroke, (cardio-) vascular diseases, or functional assessment of organs. However, even after 20 years of intense research in this field, perfusion imaging still remains a research tool without a broad clinical usage. One problem is the lack of standardization in technical aspects which have to be considered for successful quantitative evaluation; the second problem is a lack of tools that allow a direct integration into the diagnostic workflow in radiology. Five compartment models, namely, a one compartment model (1CP), a two compartment exchange (2CXM), a two compartment uptake model (2CUM), a two compartment filtration model (2FM) and eventually the extended Toft's model (ETM) were implemented as plugin for the DICOM workstation OsiriX. Moreover, the plugin has a clean graphical user interface and provides means for quality management during the perfusion data analysis. Based on reference test data, the implementation was validated against a reference implementation. No differences were found in the calculated parameters. We developed open source software to analyse DCE-MRI perfusion data. The software is designed as plugin for the DICOM Workstation OsiriX. It features a clean GUI and provides a simple workflow for data analysis while it could also be seen as a toolbox providing an implementation of several recent compartment models to be applied in research tasks. Integration into the infrastructure of a radiology department is given via OsiriX. Results can be saved automatically and reports generated automatically during data analysis ensure certain quality control.
Real-time bus location monitoring using Arduino
NASA Astrophysics Data System (ADS)
Ibrahim, Mohammad Y. M.; Audah, Lukman
2017-09-01
The Internet of Things (IoT) is the network of objects, such as a vehicles, mobile devices, and buildings that have electronic components, software, and network connectivity that enable them to collect data, run commands, and be controlled through the Internet. Controlling physical items from the Internet will increase efficiency and save time. The growing number of devices used by people increases the practicality of having IoT devices on the market. The IoT is also an opportunity to develop products that can save money and time and increase work efficiency. Initially, they need more efficiency for real-time bus location systems, especially in university campuses. This system can easily find the accurate locations of and distances between each bus stop and the estimated time to reach a new location. This system has been separated into two parts, which are the hardware and the software. The hardware parts are the Arduino Uno and the Global Positioning System (GPS), while Google Earth and GpsGate are the software parts. The GPS continuously takes input data from the satellite and stores the latitude and longitude values in the Arduino Uno. If we want to track the vehicle, we need to send the longitude and latitude as a message to the Google Earth software to convert these into maps for navigation. Once the Arduino Uno is activated, it takes the last received latitude and longitude positions' values from GpsGate and sends a message to Google Earth. Once the message has been sent to Google Earth, the current location will be shown, and navigation will be activated automatically. Then it will be broadcast using ManyCam, Google+ Hangouts, and YouTube, as well as Facebook, and appear to users. The additional features use Google Forms for determining problems faced by students, who can also take immediate action against the responsible department. Then after several successful simulations, the results will be shown in real time on a map.
Mining Software Usage with the Automatic Library Tracking Database (ALTD)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hadri, Bilel; Fahey, Mark R
2013-01-01
Tracking software usage is important for HPC centers, computer vendors, code developers and funding agencies to provide more efficient and targeted software support, and to forecast needs and guide HPC software effort towards the Exascale era. However, accurately tracking software usage on HPC systems has been a challenging task. In this paper, we present a tool called Automatic Library Tracking Database (ALTD) that has been developed and put in production on several Cray systems. The ALTD infrastructure prototype automatically and transparently stores information about libraries linked into an application at compilation time and also the executables launched in a batchmore » job. We will illustrate the usage of libraries, compilers and third party software applications on a system managed by the National Institute for Computational Sciences.« less
Kim, Jeong Rye; Shim, Woo Hyun; Yoon, Hee Mang; Hong, Sang Hyup; Lee, Jin Seong; Cho, Young Ah; Kim, Sangki
2017-12-01
The purpose of this study is to evaluate the accuracy and efficiency of a new automatic software system for bone age assessment and to validate its feasibility in clinical practice. A Greulich-Pyle method-based deep-learning technique was used to develop the automatic software system for bone age determination. Using this software, bone age was estimated from left-hand radiographs of 200 patients (3-17 years old) using first-rank bone age (software only), computer-assisted bone age (two radiologists with software assistance), and Greulich-Pyle atlas-assisted bone age (two radiologists with Greulich-Pyle atlas assistance only). The reference bone age was determined by the consensus of two experienced radiologists. First-rank bone ages determined by the automatic software system showed a 69.5% concordance rate and significant correlations with the reference bone age (r = 0.992; p < 0.001). Concordance rates increased with the use of the automatic software system for both reviewer 1 (63.0% for Greulich-Pyle atlas-assisted bone age vs 72.5% for computer-assisted bone age) and reviewer 2 (49.5% for Greulich-Pyle atlas-assisted bone age vs 57.5% for computer-assisted bone age). Reading times were reduced by 18.0% and 40.0% for reviewers 1 and 2, respectively. Automatic software system showed reliably accurate bone age estimations and appeared to enhance efficiency by reducing reading times without compromising the diagnostic accuracy.
[Statistical analysis using freely-available "EZR (Easy R)" software].
Kanda, Yoshinobu
2015-10-01
Clinicians must often perform statistical analyses for purposes such evaluating preexisting evidence and designing or executing clinical studies. R is a free software environment for statistical computing. R supports many statistical analysis functions, but does not incorporate a statistical graphical user interface (GUI). The R commander provides an easy-to-use basic-statistics GUI for R. However, the statistical function of the R commander is limited, especially in the field of biostatistics. Therefore, the author added several important statistical functions to the R commander and named it "EZR (Easy R)", which is now being distributed on the following website: http://www.jichi.ac.jp/saitama-sct/. EZR allows the application of statistical functions that are frequently used in clinical studies, such as survival analyses, including competing risk analyses and the use of time-dependent covariates and so on, by point-and-click access. In addition, by saving the script automatically created by EZR, users can learn R script writing, maintain the traceability of the analysis, and assure that the statistical process is overseen by a supervisor.
Standardization: Hardware and Software Standardization Can Reduce Costs and Save Time
ERIC Educational Resources Information Center
Brooks-Young, Susan
2005-01-01
Sadly, technical support doesn't come cheap. One money-saving strategy that's gained popularity among school technicians is equipment and software standardization. When it works, standardization can be very effective. However, standardization has its drawbacks. This article discusses the advantages and disadvantages of standardization.
NASA Astrophysics Data System (ADS)
Le Bras, Ronan; Kushida, Noriyuki; Mialle, Pierrick; Tomuta, Elena; Arora, Nimar
2017-04-01
The Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has been developing a Bayesian method and software to perform the key step of automatic association of seismological, hydroacoustic, and infrasound (SHI) parametric data. In our preliminary testing in the CTBTO, NET_VISA shows much better performance than its currently operating automatic association module, with a rate for automatic events matching the analyst-reviewed events increased by 10%, signifying that the percentage of missed events is lowered by 40%. Initial tests involving analysts also showed that the new software will complete the automatic bulletins of the CTBTO by adding previously missed events. Because products by the CTBTO are also widely distributed to its member States as well as throughout the seismological community, the introduction of a new technology must be carried out carefully, and the first step of operational integration is to first use NET-VISA results within the interactive analysts' software so that the analysts can check the robustness of the Bayesian approach. We report on the latest results both on the progress for automatic processing and for the initial introduction of NET-VISA results in the analyst review process
Development of automatic visceral fat volume calculation software for CT volume data.
Nemoto, Mitsutaka; Yeernuer, Tusufuhan; Masutani, Yoshitaka; Nomura, Yukihiro; Hanaoka, Shouhei; Miki, Soichiro; Yoshikawa, Takeharu; Hayashi, Naoto; Ohtomo, Kuni
2014-01-01
To develop automatic visceral fat volume calculation software for computed tomography (CT) volume data and to evaluate its feasibility. A total of 24 sets of whole-body CT volume data and anthropometric measurements were obtained, with three sets for each of four BMI categories (under 20, 20 to 25, 25 to 30, and over 30) in both sexes. True visceral fat volumes were defined on the basis of manual segmentation of the whole-body CT volume data by an experienced radiologist. Software to automatically calculate visceral fat volumes was developed using a region segmentation technique based on morphological analysis with CT value threshold. Automatically calculated visceral fat volumes were evaluated in terms of the correlation coefficient with the true volumes and the error relative to the true volume. Automatic visceral fat volume calculation results of all 24 data sets were obtained successfully and the average calculation time was 252.7 seconds/case. The correlation coefficients between the true visceral fat volume and the automatically calculated visceral fat volume were over 0.999. The newly developed software is feasible for calculating visceral fat volumes in a reasonable time and was proved to have high accuracy.
İnce, Fatma Demet; Ellidağ, Hamit Yaşar; Koseoğlu, Mehmet; Şimşek, Neşe; Yalçın, Hülya; Zengin, Mustafa Osman
2016-08-01
Urinalysis is one of the most commonly performed tests in the clinical laboratory. However, manual microscopic sediment examination is labor-intensive, time-consuming, and lacks standardization in high-volume laboratories. In this study, the concordance of analyses between manual microscopic examination and two different automatic urine sediment analyzers has been evaluated. 209 urine samples were analyzed by the Iris iQ200 ELITE (İris Diagnostics, USA), Dirui FUS-200 (DIRUI Industrial Co., China) automatic urine sediment analyzers and by manual microscopic examination. The degree of concordance (Kappa coefficient) and the rates within the same grading were evaluated. For erythrocytes, leukocytes, epithelial cells, bacteria, crystals and yeasts, the degree of concordance between the two instruments was better than the degree of concordance between the manual microscopic method and the individual devices. There was no concordance between all methods for casts. The results from the automated analyzers for erythrocytes, leukocytes and epithelial cells were similar to the result of microscopic examination. However, in order to avoid any error or uncertainty, some images (particularly: dysmorphic cells, bacteria, yeasts, casts and crystals) have to be analyzed by manual microscopic examination by trained staff. Therefore, the software programs which are used in automatic urine sediment analysers need further development to recognize urinary shaped elements more accurately. Automated systems are important in terms of time saving and standardization.
Speeding response -- saving lives : automatic vehicle location capabilities for emergency vehicles
DOT National Transportation Integrated Search
1999-01-01
This brochure focuses on the application of automatic vehicle location systems to emergency services. It discusses how AVL works with emergency vehicles. how it accommodates a wide range of emergency situations, and the benefits of its use.
Analysis of key technologies for virtual instruments metrology
NASA Astrophysics Data System (ADS)
Liu, Guixiong; Xu, Qingui; Gao, Furong; Guan, Qiuju; Fang, Qiang
2008-12-01
Virtual instruments (VIs) require metrological verification when applied as measuring instruments. Owing to the software-centered architecture, metrological evaluation of VIs includes two aspects: measurement functions and software characteristics. Complexity of software imposes difficulties on metrological testing of VIs. Key approaches and technologies for metrology evaluation of virtual instruments are investigated and analyzed in this paper. The principal issue is evaluation of measurement uncertainty. The nature and regularity of measurement uncertainty caused by software and algorithms can be evaluated by modeling, simulation, analysis, testing and statistics with support of powerful computing capability of PC. Another concern is evaluation of software features like correctness, reliability, stability, security and real-time of VIs. Technologies from software engineering, software testing and computer security domain can be used for these purposes. For example, a variety of black-box testing, white-box testing and modeling approaches can be used to evaluate the reliability of modules, components, applications and the whole VI software. The security of a VI can be assessed by methods like vulnerability scanning and penetration analysis. In order to facilitate metrology institutions to perform metrological verification of VIs efficiently, an automatic metrological tool for the above validation is essential. Based on technologies of numerical simulation, software testing and system benchmarking, a framework for the automatic tool is proposed in this paper. Investigation on implementation of existing automatic tools that perform calculation of measurement uncertainty, software testing and security assessment demonstrates the feasibility of the automatic framework advanced.
Design and implementation of a simple acousto optic dual control circuit
NASA Astrophysics Data System (ADS)
Li, Biqing; Li, Zhao
2017-04-01
This page proposed a simple light control circuit which designed by using power supply circuit, sonic circuits, electric circuit and delay circuit four parts. The main chip for CD4011, have inside of the four and to complete the sonic or circuit, electric, delay logic circuit. During the day, no matter how much a pedestrian voice, is ever shine light bulb. Dark night, circuit in a body to make the microphone as long as testing noise, and will automatically be bright for pedestrians lighting, several minutes after the automatic and put out, effective energy saving. Applicable scope and the working principle of the circuit principle diagram and given device parameters selection, power saving effect is obvious, at the same time greatly reduce the maintenance quantity, saving money, use effect is good.
Automatic programming of simulation models
NASA Technical Reports Server (NTRS)
Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.
1990-01-01
The concepts of software engineering were used to improve the simulation modeling environment. Emphasis was placed on the application of an element of rapid prototyping, or automatic programming, to assist the modeler define the problem specification. Then, once the problem specification has been defined, an automatic code generator is used to write the simulation code. The following two domains were selected for evaluating the concepts of software engineering for discrete event simulation: manufacturing domain and a spacecraft countdown network sequence. The specific tasks were to: (1) define the software requirements for a graphical user interface to the Automatic Manufacturing Programming System (AMPS) system; (2) develop a graphical user interface for AMPS; and (3) compare the AMPS graphical interface with the AMPS interactive user interface.
Software feedback for monochromator tuning at UNICAT (abstract)
NASA Astrophysics Data System (ADS)
Jemian, Pete R.
2002-03-01
Automatic tuning of double-crystal monochromators presents an interesting challenge in software. The goal is to either maximize, or hold constant, the throughput of the monochromator. An additional goal of the software feedback is to disable itself when there is no beam and then, at the user's discretion, re-enable itself when the beam returns. These and other routine goals, such as adherence to limits of travel for positioners, are maintained by software controls. Many solutions exist to lock in and maintain a fixed throughput. Among these include a hardware solution involving a wave form generator, and a lock-in amplifier to autocorrelate the movement of a piezoelectric transducer (PZT) providing fine adjustment of the second crystal Bragg angle. This solution does not work when the positioner is a slow acting device such as a stepping motor. Proportional integral differential (PID) loops have been used to provide feedback through software but additional controls must be provided to maximize the monochromator throughput. Presented here is a software variation of the PID loop which meets the above goals. By using two floating point variables as inputs, representing the intensity of x rays measured before and after the monochromator, it attempts to maximize (or hold constant) the ratio of these two inputs by adjusting an output floating point variable. These floating point variables are connected to hardware channels corresponding to detectors and positioners. When the inputs go out of range, the software will stop making adjustments to the control output. Not limited to monochromator feedback, the software could be used, with beam steering positioners, to maintain a measure of beam position. Advantages of this software feedback are the flexibility of its various components. It has been used with stepping motors and PZTs as positioners. Various devices such as ion chambers, scintillation counters, photodiodes, and photoelectron collectors have been used as detectors. The software provides significant cost savings over hardware feedback methods. Presently implemented in EPICS, the software is sufficiently general to any automated instrument control system.
Development of software for computing forming information using a component based approach
NASA Astrophysics Data System (ADS)
Ko, Kwang Hee; Park, Jiing Seo; Kim, Jung; Kim, Young Bum; Shin, Jong Gye
2009-12-01
In shipbuilding industry, the manufacturing technology> has advanced at an unprecedented pace for the last decade. As a result, many automatic systems for cutting, welding, etc. have been developed and employed in the manufacturing process and accordingly the productivity has been increased drastically. Despite such improvement in the manufacturing technology', however, development of an automatic system for fabricating a curved hull plate remains at the beginning stage since hardware and software for the automation of the curved hull fabrication process should be developed differently depending on the dimensions of plates, forming methods and manufacturing processes of each shipyard. To deal with this problem, it is necessary> to create a "plug-in ''framework, which can adopt various kinds of hardware and software to construct a full automatic fabrication system. In this paper, a frame-work for automatic fabrication of curved hull plates is proposed, which consists of four components and related software. In particular the software module for computing fabrication information is developed by using the ooCBD development methodology; which can interface with other hardware and software with minimum effort. Examples of the proposed framework applied to medium and large shipyards are presented.
Co-PylotDB - A Python-Based Single-Window User Interface for Transmitting Information to a Database
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barnette, Daniel W.
2012-01-05
Co-PylotDB, written completely in Python, provides a user interface (UI) with which to select user and data file(s), directories, and file content, and provide or capture various other information for sending data collected from running any computer program to a pre-formatted database table for persistent storage. The interface allows the user to select input, output, make, source, executable, and qsub files. It also provides fields for specifying the machine name on which the software was run, capturing compile and execution lines, and listing relevant user comments. Data automatically captured by Co-PylotDB and sent to the database are user, current directory,more » local hostname, current date, and time of send. The UI provides fields for logging into a local or remote database server, specifying a database and a table, and sending the information to the selected database table. If a server is not available, the UI provides for saving the command that would have saved the information to a database table for either later submission or for sending via email to a collaborator who has access to the desired database.« less
Speeding response, saving lives : automatic vehicle location capabilities for emergency services
DOT National Transportation Integrated Search
1999-01-01
This brochure focuses on the application of automatic vehicle location systems to emergency services. It discusses how AVL works with emergency vehicles and how it accommodates a wide range of emergency situations, and the benefits of ITS use. (3 p.)
Saturn S-2 Automatic Software System /SASS/
NASA Technical Reports Server (NTRS)
Parker, P. E.
1967-01-01
SATURN S-2 Automatic Software System /SASS/ was designed and implemented to aid SATURN S-2 program development and to increase the overall operating efficiency within the S-2 data laboratory. This program is written in FORTRAN 2 for SDS 920 computers.
Automatic Commercial Permit Sets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grana, Paul
Final report for Folsom Labs’ Solar Permit Generator project, which has successfully completed, resulting in the development and commercialization of a software toolkit within the cloud-based HelioScope software environment that enables solar engineers to automatically generate and manage draft documents for permit submission.
Development of energy saving automatic air conditioner
DOE Office of Scientific and Technical Information (OSTI.GOV)
Okada, T.; Iijima, T.; Kakinuma, A.
1986-01-01
This paper discusses an automatic air conditioner which adopts a new energy saving control method for controlling heat exchange at the heater and the cooler instead of the conventional reheat air-mix one. In this new air conditioner, the cooler does not work when the passenger room is heated and similarly the heater does not work when the passenger room is cooled, minimizing the use rate of the cooler which accounts for the most of the air conditioner's power consumption. Nonetheless, the heat released from the air conditioner to the room can be adjusted smoothly from maximum cooling to maximum heatingmore » just the same as in the conventional type. The results of on-vehicle comparison tests of the above two methods have shown that the energy saving control method saves nearly half of the energy which is consumed in a year with the conventional one, with the room being kept around 25/sup 0/C (77/sup 0/F).« less
NASA Technical Reports Server (NTRS)
Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; DelGuercio, Chris
2008-01-01
The Sequence History Update Tool performs Web-based sequence statistics archiving for Mars Reconnaissance Orbiter (MRO). Using a single UNIX command, the software takes advantage of sequencing conventions to automatically extract the needed statistics from multiple files. This information is then used to populate a PHP database, which is then seamlessly formatted into a dynamic Web page. This tool replaces a previous tedious and error-prone process of manually editing HTML code to construct a Web-based table. Because the tool manages all of the statistics gathering and file delivery to and from multiple data sources spread across multiple servers, there is also a considerable time and effort savings. With the use of The Sequence History Update Tool what previously took minutes is now done in less than 30 seconds, and now provides a more accurate archival record of the sequence commanding for MRO.
The Effect of Providing Peer Information on Retirement Savings Decisions
Beshears, John; Choi, James J.; Laibson, David; Madrian, Brigitte C.; Milkman, Katherine L.
2015-01-01
Using a field experiment in a 401(k) plan, we measure the effect of disseminating information about peer behavior on savings. Low-saving employees received simplified plan enrollment or contribution increase forms. A randomized subset of forms stated the fraction of age-matched coworkers participating in the plan or age-matched participants contributing at least 6% of pay to the plan. We document an oppositional reaction: the presence of peer information decreased the savings of nonparticipants who were ineligible for 401(k) automatic enrollment, and higher observed peer savings rates also decreased savings. Discouragement from upward social comparisons seems to drive this reaction. PMID:26045629
EDMUS, a European database for multiple sclerosis.
Confavreux, C; Compston, D A; Hommes, O R; McDonald, W I; Thompson, A J
1992-08-01
EDMUS is a minimal descriptive record developed for research purposes to document clinical and laboratory data in patients with multiple sclerosis (MS). It has been designed by a committee of the European Concerted Action for MS, organised under the auspices of the Commission of the European Communities. The software is user-friendly and fast, with a minimal set of obligatory data. Priority has been given to analytical data and the system is capable of automatically generating data, such as diagnosis classification, using appropriate algorithms. This procedure saves time, ensures a uniform approach to individual cases and allows automatic updating of the classification whenever additional information becomes available. It is also compatible with future developments and requirements since new algorithms can be entered in the programme when necessary. This system is flexible and may be adapted to the users needs. It is run on Apple and IBM-PC personal microcomputers. Great care has been taken to preserve confidentiality of the data. It is anticipated that this "common" language will enable the collection of appropriate cases for specific purposes, including population-based studies of MS and will be particularly useful in projects where the collaboration of several centres is needed to recruit a critical number of patients.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dagle, J.E.
1992-09-01
The Pacific Northwest Laboratory identified energy savings potential of automatic equipment-room lighting controls, which was demonstrated by the field experiment described in this report. Occupancy sensor applications have gained popularity in recent years due to improved technology that enhances reliability and reduces cost. Automatic lighting control using occupancy sensors has been accepted as an energy-conservation measure because it reduces wasted lighting. This study focused on lighting control for equipment rooms, which have inherent conditions ideal for automatic lighting control, i.e., an area which is seldom occupied, multiple users of the area who would not know if others are in themore » room when they leave, and high lighting energy intensity in the area. Two rooms were selected for this study: a small equipment room in the basement of the 337 Building, and a large equipment area in the upper level of the 329 Building. The rooms were selected to demonstrate the various degrees of complexity which may be encountered in equipment rooms throughout the Hanford Site. The 337 Building equipment-room test case demonstrated a 97% reduction in lighting energy consumption, with an annual energy savings of $184. Including lamp-replacement savings, a total savings of $306 per year is offset by an initial installation cost of $1,100. The installation demonstrates a positive net present value of $2,858 when the lamp-replacement costs are included in a life-cycle analysis. This also corresponds to a 4.0-year payback period. The 329 Building equipment-room installation resulted in a 92% reduction in lighting energy consumption. This corresponds to annual energy savings of $1,372, and a total annual savings of $2,104 per year including lamp-replacement savings. The life-cycle cost analysis shows a net present value of $15,855, with a 5.8-year payback period.« less
The design of automatic software testing module for civil aviation information system
NASA Astrophysics Data System (ADS)
Qi, Qi; Sun, Yang
2018-05-01
In this paper, the practical innovation design is carried out according to the urgent needs of the automatic testing module of civil aviation information system. Firstly, the background and significance of the automatic testing module of civil aviation information system is expounded, and the current research status of automatic testing module and the advantages and disadvantages of related software are analyzed. Then, from the three aspects of macro demand, module functional requirement and module nonfunctional demand, we further study the needs of automatic testing module of civil aviation information system. Finally, from the four aspects of module structure, module core function, database and security, we have made an innovative plan for the automatic testing module of civil aviation information system.
NASA Astrophysics Data System (ADS)
Pellegrin, F.; Jeram, B.; Haucke, J.; Feyrin, S.
2016-07-01
The paper describes the introduction of a new automatized build and test infrastructure, based on the open-source software Jenkins1, into the ESO Very Large Telescope control software to replace the preexisting in-house solution. A brief introduction to software quality practices is given, a description of the previous solution, the limitations of it and new upcoming requirements. Modifications required to adapt the new system are described, how these were implemented to current software and the results obtained. An overview on how the new system may be used in future projects is also presented.
Energy-Saving Opportunities for Manufacturing Enterprises (International English Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This fact sheet provides information about the Industrial Technologies Program Save Energy Now energy audit process, software tools, training, energy management standards, and energy efficient technologies to help U.S. companies identify energy cost savings.
12 CFR 7.1018 - Automatic payment plan account.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7.1018 Banks and Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY BANK ACTIVITIES AND... dividends at the current rate paid on regular savings accounts. The depositor, upon reaching a previously designated age, receives his or her accumulated savings and earned interest in installments of equal amounts...
Shaping electromagnetic waves using software-automatically-designed metasurfaces.
Zhang, Qian; Wan, Xiang; Liu, Shuo; Yin, Jia Yuan; Zhang, Lei; Cui, Tie Jun
2017-06-15
We present a fully digital procedure for designing reflective coding metasurfaces to shape reflected electromagnetic waves. The design procedure is completely automatic, controlled by a personal computer. In detail, the macro coding units of the metasurface are automatically divided into several types (e.g. two types for 1-bit coding, four types for 2-bit coding, etc.), and each type of macro coding unit is formed by a discrete, random arrangement of micro coding units. By combining an optimization algorithm and commercial electromagnetic software, the digital patterns of the macro coding units are optimized to possess a constant phase difference for the reflected waves. The apertures of the designed reflective metasurfaces are formed by arranging the macro coding units in a certain coding sequence. To experimentally verify the performance, a coding metasurface is fabricated by automatically designing two digital 1-bit unit cells, which are arranged in an array to constitute a periodic coding metasurface that generates the required four-beam radiation with specific directions. Two complicated functional metasurfaces with circularly and elliptically shaped radiation beams are realized by automatically designing 4-bit macro coding units, showing the excellent performance of the automatic software designs. The proposed method provides a smart tool to realize various functional devices and systems automatically.
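The 1-bit coding idea can be illustrated with a small numerical sketch: a random binary pattern assigns a 0 or pi reflection phase to each macro unit, and an array-factor cut shows the resulting scattered beam. All sizes and spacings below are illustrative assumptions, and no optimization step is included.

```python
import numpy as np

# Illustrative parameters: 20x20 macro units on a half-wavelength grid.
N, d = 20, 0.5
rng = np.random.default_rng(0)

# 1-bit coding: each macro unit reflects with phase 0 or pi.
bits = rng.integers(0, 2, size=(N, N))
phase = np.pi * bits

# Array-factor cut (theta scan, one row of units, isotropic elements assumed).
theta = np.linspace(-np.pi / 2, np.pi / 2, 721)
k = 2 * np.pi                        # wavenumber with lengths in wavelengths
x = np.arange(N) * d
af = np.abs(np.exp(1j * (k * np.outer(np.sin(theta), x) + phase[0])).sum(axis=1))
print(f"main response at theta = {np.degrees(theta[af.argmax()]):.1f} deg")
```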
MatchGUI: A Graphical MATLAB-Based Tool for Automatic Image Co-Registration
NASA Technical Reports Server (NTRS)
Ansar, Adnan I.
2011-01-01
MatchGUI software, based on MATLAB, automatically matches two images and displays the match result by superimposing one image on the other. A slider bar allows focus to shift between the two images. There are tools for zoom, auto-crop to the overlap region, and basic image markup. Given a pair of ortho-rectified images (focused primarily on Mars orbital imagery for now), this software automatically co-registers the imagery so that corresponding image pixels are aligned. MatchGUI requires minimal user input, and performs a registration over scale and in-plane rotation fully automatically.
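MatchGUI's actual matching algorithm is not described in the abstract; as a stand-in, the sketch below co-registers a translated image pair by phase correlation using scikit-image, illustrating the "align corresponding pixels" step on synthetic data.

```python
import numpy as np
from scipy import ndimage
from skimage.registration import phase_cross_correlation

# Synthetic pair: the "moving" image is the reference shifted by a known offset.
ref = np.random.default_rng(1).random((256, 256))
moving = ndimage.shift(ref, (5.0, -3.0), mode="nearest")

# Estimate the translation registering moving onto ref (subpixel, 0.1 px steps).
shift, error, _ = phase_cross_correlation(ref, moving, upsample_factor=10)
aligned = ndimage.shift(moving, shift, mode="nearest")
print("estimated shift:", shift)  # approximately (-5.0, 3.0)
```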
Automatic Generation of Test Oracles - From Pilot Studies to Application
NASA Technical Reports Server (NTRS)
Feather, Martin S.; Smith, Ben
1998-01-01
There is a trend towards the increased use of automation in V&V. Automation can yield savings in time and effort. For critical systems, where thorough V&V is required, these savings can be substantial. We describe a progression from pilot studies to development and use of V&V automation. We used pilot studies to ascertain opportunities for, and suitability of, automating various analyses whose results would contribute to V&V. These studies culminated in the development of an automatic generator of automated test oracles. This was then applied and extended in the course of testing an AI planning system that is a key component of an autonomous spacecraft.
Kim, Won-Seok; Zeng, Pengcheng; Shi, Jian Qing; Lee, Youngjo; Paik, Nam-Jong
2017-01-01
Motion analysis of the hyoid bone via videofluoroscopic study has been used in clinical research, but the classical manual tracking method is generally labor intensive and time consuming. Although some automatic tracking methods have been developed, masked points could not be tracked, and smoothing and segmentation, which are necessary for functional motion analysis prior to registration, were not provided by the previous software. We developed software to track hyoid bone motion semi-automatically. It works even when the hyoid bone is masked by the mandible, and it has been validated in dysphagia patients with stroke. In addition, we added functions for semi-automatic smoothing and segmentation. A total of 30 patients' data were used to develop the software, and data collected from 17 patients were used for validation, of which the trajectories of 8 patients were partly masked. Pearson correlation coefficients between manual and automatic tracking are high and statistically significant (0.942 to 0.991, P-value < 0.0001). Relative errors between automatic and manual tracking in terms of the x-axis, y-axis, and 2D range of hyoid bone excursion range from 3.3% to 9.2%. We also developed an automatic method to segment each hyoid bone trajectory into four phases (elevation phase, anterior movement phase, descending phase, and returning phase). The semi-automatic hyoid bone tracking from VFSS data by our software is valid compared to the conventional manual tracking method. In addition, the ability to automatically indicate when to switch from automatic to manual mode in extreme cases, and calibration without attaching a radiopaque object, are convenient and useful for users. Semi-automatic smoothing and segmentation provide further information for functional motion analysis, which benefits further statistical analysis such as functional classification and prognostication for dysphagia. Therefore, this software provides researchers in the field of dysphagia with a convenient, useful, all-in-one platform for analyzing hyoid bone motion. Further development of our method to track other swallowing-related structures or objects, such as the epiglottis and bolus, and to carry out 2D curve registration may be needed for a more comprehensive functional data analysis of dysphagia with big data.
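As a rough sketch of the smoothing and phase-segmentation steps described above (not the authors' implementation), the following smooths a synthetic hyoid trajectory with a Savitzky-Golay filter and splits it into elevation and descent phases from the sign of the vertical velocity.

```python
import numpy as np
from scipy.signal import savgol_filter

# Synthetic trajectory: frame-by-frame (x, y) hyoid positions in pixels.
rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 120)
x = 8 * np.sin(np.pi * t) ** 2 + rng.normal(0, 0.3, t.size)   # anterior + return
y = 12 * np.sin(np.pi * t) ** 2 + rng.normal(0, 0.3, t.size)  # elevation + descent

# Smoothing step, approximated here with a Savitzky-Golay filter.
xs = savgol_filter(x, window_length=11, polyorder=3)
ys = savgol_filter(y, window_length=11, polyorder=3)

# Crude segmentation: elevation while vy > 0 before peak height, descent after.
vy = np.gradient(ys)
peak = int(ys.argmax())
elevation = np.arange(peak)[vy[:peak] > 0]
descending = np.arange(peak, t.size)[vy[peak:] < 0]
print(f"elevation ends near frame {elevation[-1]}; descent spans {descending.size} frames")
```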
Wallner, Jürgen; Hochegger, Kerstin; Chen, Xiaojun; Mischak, Irene; Reinbacher, Knut; Pau, Mauro; Zrnc, Tomislav; Schwenzer-Zimmerer, Katja; Zemann, Wolfgang; Schmalstieg, Dieter; Egger, Jan
2018-01-01
Computer-assisted technologies based on algorithmic software segmentation are a topic of increasing interest in complex surgical cases. However, due to functional instability, time-consuming software processes, personnel resources, or license-based financial costs, many segmentation processes are often outsourced from clinical centers to third parties and the industry. Therefore, the aim of this trial was to assess the practical feasibility of an easily available, functionally stable, and license-free segmentation approach for use in clinical practice. In this retrospective, randomized, controlled trial, the accuracy and accordance of the open-source based segmentation algorithm GrowCut was assessed through comparison to the manually generated ground truth of the same anatomy, using 10 CT lower jaw data sets from the clinical routine. Assessment parameters were the segmentation time, the volume, the voxel number, the Dice Score, and the Hausdorff distance. Overall, semi-automatic GrowCut segmentation times were about one minute. Mean Dice Score values of over 85% and Hausdorff distances below 33.5 voxels were achieved between the algorithmic GrowCut-based segmentations and the manually generated ground truth schemes. Statistical differences between the assessment parameters were not significant (p<0.05) and correlation coefficients were close to one (r > 0.94) for all comparisons made between the two groups. Completely functionally stable and time-saving segmentations with high accuracy and high positive correlation could be performed with the presented interactive open-source based approach. In the cranio-maxillofacial complex, the method could represent an algorithmic alternative for image-based segmentation in clinical practice, e.g. for surgical treatment planning or visualization of postoperative results, and offers several advantages. Due to its open-source basis, the method could be further developed by other groups or specialists. Systematic comparisons to other segmentation approaches, or with a greater amount of data, are areas of future work.
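The two reported accuracy metrics are standard and easy to reproduce. A minimal sketch, using SciPy's directed Hausdorff routine and toy masks in place of real mandible segmentations:

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def dice(a, b):
    """Dice similarity coefficient between two binary masks."""
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def hausdorff(a, b):
    """Symmetric Hausdorff distance (voxels) between the masks' voxel sets."""
    pa, pb = np.argwhere(a), np.argwhere(b)
    return max(directed_hausdorff(pa, pb)[0], directed_hausdorff(pb, pa)[0])

# Toy 3-D masks standing in for algorithmic vs. manual segmentations.
auto = np.zeros((40, 40, 40), bool); auto[10:30, 10:30, 10:30] = True
manual = np.zeros_like(auto);        manual[12:30, 10:30, 10:30] = True
print(f"Dice = {dice(auto, manual):.3f}, Hausdorff = {hausdorff(auto, manual):.1f} voxels")
```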
Dryer cuts fuel usage by equivalent of 6-million scf/yr
DOE Office of Scientific and Technical Information (OSTI.GOV)
List, K.H.; Powers, J.
Drying is an integral part of the production of ZnO pellets, a specific requirement for producing a uniform end product. The dryer has to be able to cure the product for a finite period at temperatures up to 500°C in the same unit. Substantial savings have been realized because the dryer has enabled the user to optimize operating conditions. Continuous on-stream operations requiring minimum operator attendance, automatic controls, simplicity of design and construction, and the unit's ability to use waste heat also contribute to these savings. Gas exhausted from a nearby kiln provides the total heat requirement for the dryer. Positive delivery of the hot flue gas is assured by a blower and automatically controlled dampers. Annual fuel savings based on the use of waste heat amount to the equivalent of 6 million scf of natural gas.
Assessing Your Assets: Systems for Tracking and Managing IT Assets Can Save Time and Dollars
ERIC Educational Resources Information Center
Holub, Patricia A.
2007-01-01
The average school district loses more than $80,000 per year because of lost or damaged IT assets, according to a QED survey cosponsored by Follett Software Company. And many districts--59 percent--still use manual systems to track assets. Enter asset management systems. Software for managing assets, when implemented properly, can save time,…
Energy-Saving Traffic Scheduling in Hybrid Software Defined Wireless Rechargeable Sensor Networks
Wei, Yunkai; Ma, Xiaohui; Yang, Ning; Chen, Yijin
2017-01-01
Software Defined Wireless Rechargeable Sensor Networks (SDWRSNs) are an inexorable trend for Wireless Sensor Networks (WSNs), including Wireless Rechargeable Sensor Networks (WRSNs). However, traditional network devices cannot be completely substituted in the short term, so hybrid SDWRSNs, where software-defined devices and traditional devices coexist, will last for a long time. Hybrid SDWRSNs bring new challenges as well as opportunities for energy saving, which remains a key problem considering that the wireless chargers are also exhaustible, especially in harsh environments away from the mains supply. Numerous energy saving schemes for WSNs, and even some for WRSNs, are no longer suitable for the new features of hybrid SDWRSNs. To solve this problem, this paper puts forward an Energy-saving Traffic Scheduling (ETS) algorithm. The ETS algorithm adequately considers the new characteristics of hybrid SDWRSNs, and takes advantage of the Software Defined Networking (SDN) controller's direct control over SDN nodes and indirect control over normal nodes. The simulation results show that, compared with the traditional Minimum Transmission Energy (MTE) protocol, ETS can substantially improve the energy efficiency in hybrid SDWRSNs by 20-40% while ensuring feasible data delay. PMID:28914816
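The paper's ETS algorithm itself is not reproduced here; the sketch below only illustrates the underlying idea of energy-aware route selection, computing a minimum-transmission-energy path over a toy topology with Dijkstra's algorithm. Node names and energy costs are invented.

```python
import heapq

# Toy topology: node -> list of (neighbor, transmission_energy_mJ). The SDN
# nodes (s1, s2) are assumed cheaper hops that the controller can schedule.
graph = {
    "src":  [("s1", 2.0), ("n1", 5.0)],
    "s1":   [("s2", 2.0), ("n1", 3.0)],
    "n1":   [("sink", 6.0)],
    "s2":   [("sink", 2.5)],
    "sink": [],
}

def min_energy_path(graph, src, dst):
    """Dijkstra over per-hop transmission energy (not the paper's full ETS)."""
    heap, seen = [(0.0, src, [src])], set()
    while heap:
        cost, node, path = heapq.heappop(heap)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, e in graph[node]:
            if nbr not in seen:
                heapq.heappush(heap, (cost + e, nbr, path + [nbr]))
    return float("inf"), []

print(min_energy_path(graph, "src", "sink"))  # (6.5, ['src', 's1', 's2', 'sink'])
```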
Dynamic Weather Routes: A Weather Avoidance Concept for Trajectory-Based Operations
NASA Technical Reports Server (NTRS)
McNally, B. David; Love, John
2011-01-01
The integration of convective weather modeling with trajectory automation for conflict detection, trial planning, direct routing, and auto resolution has uncovered a concept that could help controllers, dispatchers, and pilots identify improved weather routes that result in significant savings in flying time and fuel burn. Trajectory automation continuously and automatically monitors aircraft in flight to find those that could potentially benefit from improved weather reroutes. Controllers, dispatchers, and pilots then evaluate reroute options to assess their suitability given current weather and traffic. In today's operations, aircraft fly convective weather avoidance routes that were implemented often hours before the aircraft approach the weather, and automation does not exist to automatically monitor traffic for improved weather routes that open up due to changing weather conditions. The automation concept runs in real time and employs two key steps. First, a direct routing algorithm automatically identifies flights with large dog legs in their routes and therefore potentially large savings in flying time. These are common - and usually necessary - during convective weather operations, and analysis of Fort Worth Center traffic shows many aircraft with short cuts that indicate savings on the order of 10 flying minutes. The second and most critical step is to apply trajectory automation with weather modeling to determine what savings could be achieved by modifying the direct route such that it avoids weather and traffic and is acceptable to controllers and flight crews. Initial analysis of Fort Worth Center traffic suggests a savings of roughly 50% of the direct route savings could be achievable. The core concept is to apply trajectory automation with convective weather modeling in real time to identify a reroute that is free of weather and traffic conflicts and indicates enough time and fuel savings to be considered. The concept is interoperable with today's integrated FMS/datalink. Auxiliary (lat/long) waypoints define a minimum delay reroute between current position and a downstream capture fix beyond the weather. These auxiliary waypoints can be uplinked to equipped aircraft and auto-loaded into the FMS. Alternatively, for unequipped aircraft, auxiliary waypoints can be replaced by nearby named fixes, but this could reduce potential savings. The presentation includes an overview of the automation approach and focuses on several cases in terms of potential savings, reroute complexity, best auxiliary waypoint solution vs. named fix solution, and other metrics.
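The dog-leg detection step reduces to comparing the filed route's length against the direct great-circle distance. A minimal sketch with invented waypoints and an assumed ground speed:

```python
import math

def haversine_nm(a, b):
    """Great-circle distance in nautical miles between (lat, lon) in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 3440.065 * math.asin(math.sqrt(h))  # Earth radius in nm

# Hypothetical filed route with a weather dog-leg vs. the direct route.
route = [(32.9, -97.0), (34.5, -99.0), (36.1, -97.5), (38.9, -94.7)]  # via fixes
filed = sum(haversine_nm(route[i], route[i + 1]) for i in range(len(route) - 1))
direct = haversine_nm(route[0], route[-1])

ground_speed_kt = 450.0  # assumed cruise ground speed
savings_min = (filed - direct) / ground_speed_kt * 60
print(f"filed {filed:.0f} nm, direct {direct:.0f} nm, ~{savings_min:.1f} min saved")
```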
An overview of very high level software design methods
NASA Technical Reports Server (NTRS)
Asdjodi, Maryam; Hooper, James W.
1988-01-01
Very High Level design methods emphasize the automatic transfer of requirements to formal design specifications, and/or may concentrate on the automatic transformation of formal design specifications that include some semantic information of the system into machine-executable form. Very high level design methods range from general, domain-independent methods to approaches implementable for specific applications or domains. Different approaches for higher-level software design are being developed by applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming, and other methods. Though a given approach does not always fall exactly into any specific class, this paper provides a classification of very high level design methods, including examples for each class. These methods are analyzed and compared based on their basic approaches, strengths, and feasibility for future expansion toward automatic development of software systems.
Semi-automatic computerized approach to radiological quantification in rheumatoid arthritis
NASA Astrophysics Data System (ADS)
Steiner, Wolfgang; Schoeffmann, Sylvia; Prommegger, Andrea; Boegl, Karl; Klinger, Thomas; Peloschek, Philipp; Kainberger, Franz
2004-04-01
Rheumatoid Arthritis (RA) is a common systemic disease predominantly involving the joints. Precise diagnosis and follow-up therapy require objective quantification. For this purpose, radiological analyses using standardized scoring systems are considered to be the most appropriate method. The aim of our study is to develop semi-automatic image analysis software especially applicable to the scoring of joints in rheumatic disorders. The X-Ray RheumaCoach software delivers various scoring systems (Larsen score and Ratingen-Rau score) which can be applied by the scorer. In addition to the qualitative assessment of joints performed by the radiologist, a semi-automatic image analysis for joint detection and measurements of bone diameters and swollen tissue supports the image assessment process. More than 3000 radiographs from the hands and feet of more than 200 RA patients were collected, analyzed, and statistically evaluated. Radiographs were quantified using the conventional paper-based Larsen score and the X-Ray RheumaCoach software. The use of the software shortened the scoring time by about 25 percent and reduced the rate of erroneous scorings in all our studies. Compared to paper-based scoring methods, the X-Ray RheumaCoach software offers several advantages: (i) structured data analysis and input that minimizes variance by standardization, (ii) faster and more precise calculation of sum scores and indices, (iii) permanent data storage and fast access to the software's database, (iv) the possibility of cross-calculation to other scores, (v) semi-automatic assessment of images, and (vi) reliable documentation of results in the form of graphical printouts.
A cascading failure analysis tool for post processing TRANSCARE simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
This is a MATLAB-based tool to post-process simulation results from the EPRI software TRANSCARE, for massive cascading failure analysis following severe disturbances. There are a few key modules available in this tool: 1. automatically create a contingency list to run TRANSCARE simulations, including substation outages above a certain kV threshold, N-k (1, 2 or 3) generator outages, and branch outages; 2. read in and analyze a CKO file of PCG definitions, an initiating event list, and a CDN file; 3. post-process all the simulation results saved in a CDN file and perform critical event corridor analysis; 4. provide a summary of TRANSCARE simulations; 5. identify the most frequently occurring event corridors in the system; and 6. rank the contingencies using a user-defined security index to quantify consequences in terms of total load loss, total number of cascades, etc.
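Module 6, ranking contingencies by a user-defined security index, can be sketched in a few lines (in Python here, although the tool itself is MATLAB-based); the weights and contingency records below are invented for illustration.

```python
# Hypothetical post-processing step: rank contingencies by a user-defined
# security index combining total load loss and number of cascading events.
contingencies = [
    {"name": "N-2 gen A+B", "load_loss_mw": 320.0, "cascades": 5},
    {"name": "sub 230kV X", "load_loss_mw": 810.0, "cascades": 2},
    {"name": "branch Y-Z",  "load_loss_mw":  40.0, "cascades": 9},
]

W_LOSS, W_CASC = 1.0, 50.0  # illustrative, user-tunable weights

for c in contingencies:
    c["index"] = W_LOSS * c["load_loss_mw"] + W_CASC * c["cascades"]

# Highest index first: the contingencies deserving the closest review.
for c in sorted(contingencies, key=lambda c: c["index"], reverse=True):
    print(f'{c["name"]:<12} index={c["index"]:7.1f}')
```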
Project of Near-Real-Time Generation of ShakeMaps and a New Hazard Map in Austria
NASA Astrophysics Data System (ADS)
Jia, Yan; Weginger, Stefan; Horn, Nikolaus; Hausmann, Helmut; Lenhardt, Wolfgang
2016-04-01
Target-oriented prevention and effective crisis management can reduce or avoid damage and save lives in case of a strong earthquake. To achieve this goal, a project for automatically generated ShakeMaps (maps of ground motion and shaking intensity) and for updating the Austrian hazard map was started at ZAMG (Zentralanstalt für Meteorologie und Geodynamik) in 2015. The first goal of the project is the near-real-time generation of ShakeMaps following strong earthquakes in Austria, to provide rapid, accurate, and official information in support of governmental crisis management. Using methods and software newly developed by SHARE (Seismic Hazard Harmonization in Europe) and GEM (Global Earthquake Model), which allow a transnational analysis at the European level, a new generation of Austrian hazard maps will ultimately be calculated. This presentation gives more information and the current status of the project.
ERIC Educational Resources Information Center
Chen, Howard Hao-Jan
2011-01-01
Oral communication ability has become increasingly important to many EFL students. Several commercial software programs based on automatic speech recognition (ASR) technologies are available but their prices are not affordable for many students. This paper will demonstrate how the Microsoft Speech Application Software Development Kit (SASDK), a…
Automatic Speech Recognition: Reliability and Pedagogical Implications for Teaching Pronunciation
ERIC Educational Resources Information Center
Kim, In-Seok
2006-01-01
This study examines the reliability of automatic speech recognition (ASR) software used to teach English pronunciation, focusing on one particular piece of software, "FluSpeak," as a typical example. Thirty-six Korean English as a Foreign Language (EFL) college students participated in an experiment in which they listened to 15 sentences…
Use of an Automatic Problem Generator to Teach Basic Skills in a First Course in Assembly Language.
ERIC Educational Resources Information Center
Benander, Alan; And Others
1989-01-01
Discussion of the use of computer aided instruction (CAI) and instructional software in college level courses highlights an automatic problem generator, AUTOGEN, that was written for computer science students learning assembly language. Design of the software is explained, and student responses are reported. (nine references) (LRW)
Automatic programming for critical applications
NASA Technical Reports Server (NTRS)
Loganantharaj, Raj L.
1988-01-01
The important phases of a software life cycle include verification and maintenance. Usually, execution performance is an expected requirement in a software development process. Unfortunately, the verification and maintenance of programs are the time-consuming and frustrating aspects of software engineering. Verification cannot be waived for programs used in critical applications such as military, space, and nuclear plants. As a consequence, synthesis of programs from specifications, an alternative way of developing correct programs, is becoming popular. What is understood by automatic programming has changed with our expectations. At present, the goal of automatic programming is the automation of the programming process. Specifically, it means the application of artificial intelligence to software engineering in order to define techniques and create environments that help in the creation of high-level programs. The automatic programming process may be divided into two phases: the problem acquisition phase and the program synthesis phase. In the problem acquisition phase, an informal specification of the problem is transformed into an unambiguous specification, while in the program synthesis phase such a specification is further transformed into a concrete, executable program.
NASA Astrophysics Data System (ADS)
Bharoto; Suparno, Nadi; Putra, Edy Giri Rachman
2015-04-01
In 2005, the main computer for the data acquisition and control system of the Small-angle Neutron Scattering (SANS) BATAN Spectrometer (SMARTer) was replaced after it ceased to operate the spectrometer. Following this replacement, new software for the data acquisition and control system was developed in-house, using the Visual Basic programming language. In the last two years, many developments have been made in both the hardware and the software to make experiments more effective and efficient. Recently, the previous motor controller card (ISA card) was replaced with a programmable motor controller card (PCI card) for driving one motor of the position sensitive detector (PSD), eight motors of four collimators, and six motors of six pinhole discs. The new control system software allows all motors to be moved simultaneously, significantly reducing the time needed to set up the instrument before running an experiment. Along with that development, new data acquisition software under the MS Windows operating system was also developed to drive a beam stopper in the X-Y directions, to read the equipment status such as the positions of the collimators and PSD, to acquire neutron counts on the monitor and PSD detectors, and to manage 12 sample positions automatically. A timer object, set to one second, reads the equipment status via the serial port of the computer (RS232C), and a general purpose interface board (GPIB) reads the total counts of each pixel of the PSD from histogram memory. Experimental results are displayed in real time on the main window, and the data are saved in a special format for further data reduction and analysis. The new software has been implemented and used for experiments in preset-count or preset-time mode for the absolute scattering intensity method.
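A one-second status-polling loop of the kind described can be sketched with pyserial; the port name and query command below are placeholders, since the SMARTer protocol itself is not given in the abstract, and the script assumes a serial device is attached.

```python
import time
import serial  # pyserial

# Hypothetical port and query string standing in for the real status protocol.
ser = serial.Serial("/dev/ttyS0", 9600, timeout=1)

def poll_status():
    """One status poll, mirroring the one-second timer object described above."""
    ser.write(b"STATUS?\r\n")                       # placeholder command
    return ser.readline().decode(errors="replace").strip()

try:
    while True:
        print(poll_status())   # e.g. collimator and PSD positions
        time.sleep(1.0)        # the abstract's timer fires once per second
except KeyboardInterrupt:
    ser.close()
```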
Study of the Acquisition of Peripheral Equipment for Use with Automatic Data Processing Systems.
ERIC Educational Resources Information Center
Comptroller General of the U.S., Washington, DC.
The General Accounting Office (GAO) performed this study because: preliminary indications showed that significant savings could be achieved in the procurement of selected computer components; the Federal Government is investing increasing amounts of money in Automatic Data Processing (ADP) equipment; and there is a widespread congressional…
Developing an Approach for Analyzing and Verifying System Communication
NASA Technical Reports Server (NTRS)
Stratton, William C.; Lindvall, Mikael; Ackermann, Chris; Sibol, Deane E.; Godfrey, Sally
2009-01-01
This slide presentation reviews a project for developing an approach for analyzing and verifying inter-system communication. The motivation for the study was that software systems in the aerospace domain are inherently complex and operate under tight resource constraints, so systems of systems must communicate with each other to fulfill their tasks, and that communication must be reliable. The technical approach was to develop a system, DynSAVE, that detects communication problems among the systems. The project enhanced the proven Software Architecture Visualization and Evaluation (SAVE) tool to create Dynamic SAVE (DynSAVE). The approach monitors and records low-level network traffic, converts the low-level traffic into meaningful messages, and displays the messages in a way that allows issues to be detected.
1977-05-01
(C3I) programs; (4) simulator/trainer programs; and (5) automatic test equipment software. Each of these five types of software represents a problem... coded in the same source language, say JOVIAL, then source-language statements would be a better measure, since that would automatically compensate... whether done at no (visible) cost or by renegotiation of the contract. Fig. 2.3 illustrates these with solid lines. It is conjectured that the change
Automatic Certification of Kalman Filters for Reliable Code Generation
NASA Technical Reports Server (NTRS)
Denney, Ewen; Fischer, Bernd; Schumann, Johann; Richardson, Julian
2005-01-01
AUTOFILTER is a tool for automatically deriving Kalman filter code from high-level declarative specifications of state estimation problems. It can generate code with a range of algorithmic characteristics and for several target platforms. The tool has been designed with reliability of the generated code in mind and is able to automatically certify that the code it generates is free from various error classes. Since documentation is an important part of software assurance, AUTOFILTER can also automatically generate various human-readable documents, containing both design and safety related information. We discuss how these features address software assurance standards such as DO-178B.
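AUTOFILTER's generated code is not shown in the abstract; for orientation, here is a minimal hand-written linear Kalman filter of the kind such a generator emits (1-D constant-velocity state, position-only measurements), a sketch rather than the tool's actual output.

```python
import numpy as np

dt = 0.1
F = np.array([[1, dt], [0, 1]])   # state transition for [position, velocity]
H = np.array([[1.0, 0.0]])        # we observe position only
Q = 1e-4 * np.eye(2)              # process noise covariance (assumed)
R = np.array([[0.05]])            # measurement noise covariance (assumed)

x = np.zeros(2)                   # initial state estimate
P = np.eye(2)                     # initial estimate covariance

rng = np.random.default_rng(0)
for k in range(50):
    z = 0.7 * k * dt + rng.normal(0, 0.2)   # noisy position measurement
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    y = z - H @ x                           # innovation
    S = H @ P @ H.T + R                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P

print(f"estimated velocity = {x[1]:.2f} (true 0.7)")
```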
RT-Syn: A real-time software system generator
NASA Technical Reports Server (NTRS)
Setliff, Dorothy E.
1992-01-01
This paper presents research into providing highly reusable and maintainable components by using automatic software synthesis techniques. This proposal uses domain knowledge combined with automatic software synthesis techniques to engineer large-scale, mission-critical, real-time software. The hypothesis centers on a software synthesis architecture that specifically incorporates application-specific (in this case real-time) knowledge. This architecture synthesizes complex system software to meet a behavioral specification and external interaction design constraints. Some examples of these external constraints are communication protocols, precisions, timing, and space limitations. The incorporation of application-specific knowledge facilitates the generation of mathematical software metrics which are used to narrow the design space, thereby making software synthesis tractable. Success has the potential to dramatically reduce mission-critical system life-cycle costs, not only by reducing development time but, more importantly, by facilitating maintenance, modifications, and extensions of complex mission-critical software systems, which currently dominate life-cycle costs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boyer, W.B.
1979-09-01
This report describes both the hardware and software components of an automatic calibration and signal system (Autocal) for the data acquisition system for the Sandia particle beam fusion research accelerators Hydra, Proto I, and Proto II. The Autocal hardware consists of off-the-shelf commercial equipment. The various hardware components, special modifications and overall system configuration are described. Special software has been developed to support the Autocal hardware. Software operation and maintenance are described.
Automated Flight Dynamics Product Generation for the EOS AM-1 Spacecraft
NASA Technical Reports Server (NTRS)
Matusow, Carla
1999-01-01
As part of NASA's Earth Science Enterprise, the Earth Observing System (EOS) AM-1 spacecraft is designed to monitor long-term, global, environmental changes. Because of the complexity of the AM-1 spacecraft, the mission operations center requires more than 80 distinct flight dynamics products (reports). To create these products, the AM-1 Flight Dynamics Team (FDT) will use a combination of modified commercial software packages (e.g., Analytical Graphics' Satellite ToolKit) and NASA-developed software applications. While providing the most cost-effective solution to meeting the mission requirements, the integration of these software applications raises several operational concerns: (1) routine product generation requires knowledge of multiple applications executing on a variety of hardware platforms; (2) generating products is a highly interactive process requiring a user to interact with each application multiple times to generate each product; (3) routine product generation requires several hours to complete; (4) user interaction with each application introduces the potential for errors, since users are required to manually enter filenames and input parameters as well as run applications in the correct sequence; and (5) generating products requires some level of flight dynamics expertise to determine the appropriate inputs and sequencing. To address these issues, the FDT developed an automation software tool called AutoProducts, which runs on a single hardware platform and provides all necessary coordination and communication among the various flight dynamics software applications. AutoProducts autonomously retrieves necessary files, sequences and executes applications with correct input parameters, and delivers the final flight dynamics products to the appropriate customers. Although AutoProducts will normally generate pre-programmed sets of routine products, its graphical interface allows for easy configuration of customized and one-of-a-kind products. Additionally, AutoProducts has been designed as a mission-independent tool, and can be easily reconfigured to support other missions or incorporate new flight dynamics software packages. After the AM-1 launch, AutoProducts will run automatically at pre-determined time intervals. The AutoProducts tool reduces many of the concerns associated with flight dynamics product generation. Although AutoProducts required a significant development effort because of the complexity of the interfaces involved, its use will provide significant cost savings through reduced operator time and maximum product reliability. In addition, user satisfaction is significantly improved, and flight dynamics experts have more time to perform valuable analysis work. This paper will describe the evolution of the AutoProducts tool, highlighting the cost savings and customer satisfaction resulting from its development. It will also provide details about the tool, including its graphical interface and operational capabilities.
Automatic measurement of skin textures of the dorsal hand in evaluating skin aging.
Gao, Qian; Yu, Jiaming; Wang, Fang; Ge, Tiantian; Hu, Liwen; Liu, Yang
2013-05-01
Changes in skin textures have been used to evaluate skin aging in many studies. In our previous study, we built some skin texture parameters which can be used to evaluate skin aging of the human dorsal hand. However, extracting this information from digital skin images by manual work takes too much time and effort. We therefore wanted to build a simple and effective method to automatically count some of those skin texture parameters by using digital image-processing technology. A total of 100 subjects aged 30 years and above were involved. Sun exposure history and demographic information were collected using a questionnaire. The skin image of each subject's dorsal hand was obtained using a portable skin detector. The number of grids, one of the skin texture parameters built in our previous study, was measured manually and automatically. The automated image analysis program was developed using Matlab 7.1 software. The number of grids counted automatically (NGA) was significantly correlated with the number of grids counted manually (NGM) (r = 0.9287, P < 0.0001), and in each age group there were no significant differences between NGA and NGM. The NGA was negatively correlated with age and lifetime sun exposure, and decreased with increasing Beagley-Gibson score from 3 to 6. In addition, even after adjusting for NGA, the standard deviation of grid areas for each image was positively correlated with age, sun exposure, and Beagley-Gibson score. The method introduced in the present study can be used to measure some skin aging parameters automatically and objectively. It will save much time, reduce labor, and avoid measurement errors between different investigators when evaluating a great number of skin images in a short time. © 2013 John Wiley & Sons A/S. Published by Blackwell Publishing Ltd.
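A grid-counting step along these lines can be sketched with scikit-image: threshold the image, label the connected bright regions between texture lines, and report their count and area spread. The synthetic image and size filter below are assumptions, not the paper's pipeline.

```python
import numpy as np
from skimage import filters, measure

# Synthetic stand-in for a dorsal-hand image: dark texture lines on bright skin.
img = np.full((200, 200), 0.8)
img[::20, :] = 0.2   # horizontal texture lines every 20 px
img[:, ::20] = 0.2   # vertical texture lines

# Threshold (Otsu) and label the bright plateaus between lines as "grids".
binary = img > filters.threshold_otsu(img)
labels = measure.label(binary, connectivity=1)

# Discard tiny regions likely to be noise, then count and size the grids.
regions = [r for r in measure.regionprops(labels) if r.area > 20]
areas = np.array([r.area for r in regions])
print(f"NGA = {len(regions)}, grid-area SD = {areas.std():.1f} px^2")
```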
Huang, Huiyuan; Ding, Zhongxiang; Mao, Dewang; Yuan, Jianhua; Zhu, Fangmei; Chen, Shuda; Xu, Yan; Lou, Lin; Feng, Xiaoyan; Qi, Le; Qiu, Wusi; Zhang, Han; Zang, Yu-Feng
2016-10-01
The main goal of brain tumor surgery is to maximize tumor resection while minimizing the risk of irreversible postoperative functional sequelae. Eloquent functional areas should be delineated preoperatively, particularly for patients with tumors near eloquent areas. Functional magnetic resonance imaging (fMRI) is a noninvasive technique that demonstrates great promise for presurgical planning. However, specialized data processing toolkits for presurgical planning remain lacking. Based on several functions in open-source software such as Statistical Parametric Mapping (SPM), the Resting-State fMRI Data Analysis Toolkit (REST), the Data Processing Assistant for Resting-State fMRI (DPARSF), and Multiple Independent Component Analysis (MICA), we introduce here an open-source MATLAB toolbox named PreSurgMapp. This toolbox can reveal eloquent areas using comprehensive methods and various complementary fMRI modalities. For example, PreSurgMapp supports both model-based (general linear model, GLM, and seed correlation) and data-driven (independent component analysis, ICA) methods and processes both task-based and resting-state fMRI data. PreSurgMapp is designed for highly automatic and individualized functional mapping with a user-friendly graphical user interface (GUI) for time-saving pipeline processing. For example, sensorimotor and language-related components can be automatically identified without human input, using an effective, accurate component-identification algorithm based on a discriminability index. All the results generated can be further evaluated and compared by neuro-radiologists or neurosurgeons. This software has substantial value for clinical neuro-radiology and neuro-oncology, including application to patients with low- and high-grade brain tumors and those with epilepsy foci in the dominant language hemisphere who are planning to undergo a temporal lobectomy.
A preliminary architecture for building communication software from traffic captures
NASA Astrophysics Data System (ADS)
Acosta, Jaime C.; Estrada, Pedro
2017-05-01
Security analysts are tasked with identifying and mitigating network service vulnerabilities. A common problem associated with in-depth testing of network protocols is the availability of software that communicates across disparate protocols. Many times, the software required to communicate with these services is not publicly available. Developing this software is a time-consuming undertaking that requires expertise and understanding of the protocol specification. The work described in this paper aims at developing a software package that is capable of automatically creating communication clients by using packet capture (pcap) and TShark dissectors. Currently, our focus is on simple protocols with fixed fields. The methodologies developed as part of this work will extend to more complex protocols such as the Gateway Load Balancing Protocol (GLBP), Port Aggregation Protocol (PAgP), and Open Shortest Path First (OSPF). Thus far, we have architected a modular pipeline for an automatic traffic-based software generator. We start the transformation of captured network traffic by employing TShark to convert packets into a Packet Details Markup Language (PDML) file. The PDML file contains a parsed, textual representation of the packet data. Then, we extract field data and types, along with inter- and intra-packet dependencies. This information is then utilized to construct an XML file that encompasses the protocol state machine and field vocabulary. Finally, this XML is converted into executable code. Using our methodology, and as a starting point, we have succeeded in automatically generating software that communicates with other hosts using an automatically generated Internet Control Message Protocol (ICMP) client program.
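PDML is plain XML, so the field-extraction stage can be sketched with the standard library; the snippet below walks the ICMP fields of each packet in a PDML export (the file name is a placeholder).

```python
import xml.etree.ElementTree as ET

# Parse a PDML export (e.g. tshark -r capture.pcap -T pdml > capture.pdml)
# and pull out each packet's dissected ICMP fields.
tree = ET.parse("capture.pdml")

for i, packet in enumerate(tree.getroot().iter("packet")):
    for proto in packet.iter("proto"):
        if proto.get("name") != "icmp":
            continue
        # Each <field> carries the dissector's name, display and raw value.
        for field in proto.iter("field"):
            print(i, field.get("name"), field.get("show"), field.get("value"))
```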
Software engineering with application-specific languages
NASA Technical Reports Server (NTRS)
Campbell, David J.; Barker, Linda; Mitchell, Deborah; Pollack, Robert H.
1993-01-01
Application-Specific Languages (ASL's) are small, special-purpose languages that are targeted to solve a specific class of problems. Using ASL's on software development projects can provide considerable cost savings, reduce risk, and enhance quality and reliability. ASL's provide a platform for reuse within a project or across many projects and enable less-experienced programmers to tap into the expertise of application-area experts. ASL's have been used on several software development projects for the Space Shuttle Program. On these projects, the use of ASL's resulted in considerable cost savings over conventional development techniques. Two of these projects are described.
A VxD-based automatic blending system using multithreaded programming.
Wang, L; Jiang, X; Chen, Y; Tan, K C
2004-01-01
This paper discusses the object-oriented software design for an automatic blending system. By combining the advantages of a programmable logic controller (PLC) and an industrial control PC (ICPC), an automatic blending control system is developed for a chemical plant. The system structure and multithread-based communication approach are first presented in this paper. The overall software design issues, such as system requirements and functionalities, are then discussed in detail. Furthermore, by replacing the conventional dynamic link library (DLL) with virtual X device drivers (VxD's), a practical and cost-effective solution is provided to improve the robustness of the Windows platform-based automatic blending system in small- and medium-sized plants.
Ignizio, Drew A.; O'Donnell, Michael S.; Talbert, Colin B.
2014-01-01
Creating compliant metadata for scientific data products is mandated for all federal Geographic Information Systems professionals and is a best practice for members of the geospatial data community. However, the complexity of the Federal Geographic Data Committee's Content Standards for Digital Geospatial Metadata, the limited availability of easy-to-use tools, and recent changes in the ESRI software environment continue to make metadata creation a challenge. Staff at the U.S. Geological Survey Fort Collins Science Center have developed a Python toolbox for ESRI ArcDesktop to facilitate a semi-automated workflow for creating and updating metadata records in ESRI's 10.x software. The U.S. Geological Survey Metadata Wizard tool automatically populates several metadata elements: the spatial reference, spatial extent, geospatial presentation format, vector feature count or raster column/row count, native system/processing environment, and the metadata creation date. Once the software auto-populates these elements, users can easily add attribute definitions and other relevant information in a simple Graphical User Interface. The tool, which offers a simple design free of esoteric metadata language, has the potential to save many government and non-government organizations a significant amount of time and cost by facilitating the development of metadata compliant with the Federal Geographic Data Committee's Content Standards for Digital Geospatial Metadata for ESRI software users. A working version of the tool is now available for ESRI ArcDesktop versions 10.0, 10.1, and 10.2 (downloadable at http://www.sciencebase.gov/metadatawizard).
OpenStudio: A Platform for Ex Ante Incentive Programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roth, Amir; Brackney, Larry; Parker, Andrew
Many utilities operate programs that provide ex ante (up-front) incentives for building energy conservation measures (ECMs). A typical incentive program covers two kinds of ECMs. ECMs that deliver similar savings in different contexts are associated with pre-calculated 'deemed' savings values. ECMs that deliver different savings in different contexts are evaluated on a 'custom' per-project basis. Incentive programs often operate at less than peak efficiency because both deemed ECMs and custom projects have lengthy, effort-intensive review processes: deemed ECMs to gain confidence that they are sufficiently context-insensitive, custom projects to ensure that savings are claimed appropriately. DOE's OpenStudio platform can be used to automate ex ante processes and help utilities operate programs more efficiently, consistently, and transparently, resulting in greater project throughput and energy savings. A key concept of the platform is the OpenStudio Measure, a script that queries and transforms building energy models. Measures can be simple or surgical, e.g., applying different transformations based on space type, orientation, etc. Measures represent ECMs explicitly and are easier to review than ECMs that are represented implicitly as the difference between with-ECM and without-ECM models. Measures can be automatically applied to large numbers of prototype models, and instantiated from uncertainty distributions, facilitating the large-scale analysis required to develop deemed savings values. For custom projects, Measures can also be used to calibrate existing building models, to automatically create code baseline models, and to perform quality assurance screening.
NASA Astrophysics Data System (ADS)
Möller, Thomas; Bellin, Knut; Creutzburg, Reiner
2015-03-01
The aim of this paper is to show the recent progress in the design and prototypical development of a software suite, Copra Breeder, for semi-automatic generation of test methodologies and security checklists for IT vulnerability assessment in small and medium-sized enterprises.
Slatina, E
2000-01-01
This poster shows a concept, as well as some practical solutions, for a database intended to be used by all members and institutions of the international Emergency Medical Care network. It shows how data can be processed automatically, rapidly, and with good quality, saving lives while also saving time and money.
Research progress of on-line automatic monitoring of chemical oxygen demand (COD) of water
NASA Astrophysics Data System (ADS)
Cai, Youfa; Fu, Xing; Gao, Xiaolu; Li, Lianyin
2018-02-01
With increasingly strict control of pollutant emission in China, on-line automatic monitoring of water quality is particularly urgent. The chemical oxygen demand (COD) is a comprehensive index of the contamination caused by organic matter, and it is therefore taken as an important index of energy saving and emission reduction in China's "Twelve-Five" program. So far, COD on-line automatic monitoring instruments have played an important role in the field of sewage monitoring. This paper reviews the existing methods for on-line automatic monitoring of COD and, on that basis, points out the future trend of COD on-line automatic monitoring instruments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, B; Lee, S; Chen, S
Purpose: Monitoring the delivered dose is an important task for adaptive radiotherapy (ART) and for determining when to re-plan. A software tool which enables automatic delivered-dose calculation using cone-beam CT (CBCT) has been developed and tested. Methods: The tool consists of four components: a CBCT Collecting Module (CCM), a Plan Registration Module (PRM), a Dose Calculation Module (DCM), and an Evaluation and Action Module (EAM). The CCM is triggered periodically (e.g. every 1:00 AM) to search for newly acquired CBCTs of patients of interest and then export the DICOM files of the images and related registrations defined in ARIA, followed by triggering the PRM. The PRM imports the DICOM images and registrations and links the CBCTs to the related treatment plan of the patient in the planning system (RayStation V4.5, RaySearch, Stockholm, Sweden). A pre-determined CT-to-density table is automatically generated for dose calculation. The current version of the DCM uses a rigid registration which regards the treatment isocenter of the CBCT as the isocenter of the treatment plan, and then starts the dose calculation automatically. The EAM evaluates the plan using pre-determined plan evaluation parameters: PTV dose-volume metrics and critical organ doses. The tool has been tested on 10 patients. Results: Automatic plans are generated and saved in the order of the treatment dates in the Adaptive Planning module of the RayStation planning system, without any manual intervention. Once the CTV dose deviates by more than 3%, both email and page alerts are sent to the patient's physician and physicist so that they can examine the case closely. Conclusion: The tool is capable of performing automatic dose tracking and alerting clinicians when an action is needed. It is clinically useful for off-line adaptive therapy to catch any gross error. A practical way of determining alarm levels for OARs is under development.
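A schematic of the CCM-trigger-plus-EAM-alert flow might look like the following; the folder path is hypothetical and the dose-deviation function is a stub, since the real DCM runs inside the planning system.

```python
import time
from pathlib import Path

WATCH_DIR = Path("/data/cbct_exports")  # hypothetical CBCT drop folder
CTV_TOLERANCE = 0.03                    # the 3% deviation threshold from the text
seen = set()

def ctv_dose_deviation(dicom_path):
    """Stub: the real DCM recalculates dose in the planning system and
    compares CTV coverage against the plan; returns fractional deviation."""
    return 0.0  # replace with a planning-system scripting call

while True:                             # CCM analogue: a periodic trigger
    for f in sorted(WATCH_DIR.glob("*.dcm")):
        if f in seen:
            continue
        seen.add(f)
        dev = ctv_dose_deviation(f)
        if abs(dev) > CTV_TOLERANCE:    # EAM analogue: alert on deviation
            print(f"ALERT: {f.name} CTV dose off by {dev:+.1%}; paging physicist")
    time.sleep(24 * 3600)               # e.g. scheduled nightly at 1:00 AM
```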
National Highway Safety Administration. Automatic collision notice field test summary.
2001-10-01
From 1995 to 2000, the National Highway Traffic Safety Administration (NHTSA) sponsored an initiative to create and operate an Automatic Collision Notification (ACN) system on a demonstration basis in a rural area to provide faster and smarter emergency medical responses and in an attempt to save lives and reduce disabilities from injuries. This article is a brief summary of that demonstration.
Simmat, I; Georg, P; Georg, D; Birkfellner, W; Goldner, G; Stock, M
2012-09-01
The goal of the current study was to evaluate commercially available atlas-based autosegmentation software for clinical use in prostate radiotherapy. The accuracy was benchmarked against interobserver variability. A total of 20 planning computed tomographs (CTs) and 10 cone-beam CTs (CBCTs) were selected for prostate, rectum, and bladder delineation. The images varied with regard to individual (age, body mass index) and setup parameters (contrast agent, rectal balloon, implanted markers). Automatically created contours with ABAS® and iPlan® were compared to an expert's delineation by calculating the Dice similarity coefficient (DSC) and conformity index. Demo atlases of both systems showed different results for bladder (DSC ABAS 0.86 ± 0.17, DSC iPlan 0.51 ± 0.30) and prostate (DSC ABAS 0.71 ± 0.14, DSC iPlan 0.57 ± 0.19). Rectum delineation (DSC ABAS 0.78 ± 0.11, DSC iPlan 0.84 ± 0.08) demonstrated differences between the systems but better correlation of the automatically drawn volumes. ABAS® was closest to the interobserver benchmark. Autosegmentation with iPlan®, ABAS® and manual segmentation took 0.5, 4, and 15-20 min, respectively. Automatic contouring on CBCT showed high dependence on image quality (DSC bladder 0.54, rectum 0.42, prostate 0.34). For clinical routine, efforts are still necessary to either redesign the algorithms implemented in autosegmentation or to optimize image quality for CBCT to guarantee the accuracy and time savings required for adaptive radiotherapy.
MARZ: Manual and automatic redshifting software
NASA Astrophysics Data System (ADS)
Hinton, S. R.; Davis, Tamara M.; Lidman, C.; Glazebrook, K.; Lewis, G. F.
2016-04-01
The Australian Dark Energy Survey (OzDES) is a 100-night spectroscopic survey underway on the Anglo-Australian Telescope using the fibre-fed 2-degree-field (2dF) spectrograph. We have developed a new redshifting application MARZ with greater usability, flexibility, and the capacity to analyse a wider range of object types than the RUNZ software package previously used for redshifting spectra from 2dF. MARZ is an open-source, client-based, Javascript web-application which provides an intuitive interface and powerful automatic matching capabilities on spectra generated from the AAOmega spectrograph to produce high quality spectroscopic redshift measurements. The software can be run interactively or via the command line, and is easily adaptable to other instruments and pipelines if conforming to the current FITS file standard is not possible. Behind the scenes, a modified version of the AUTOZ cross-correlation algorithm is used to match input spectra against a variety of stellar and galaxy templates, and automatic matching performance for OzDES spectra has increased from 54% (RUNZ) to 91% (MARZ). Spectra not matched correctly by the automatic algorithm can be easily redshifted manually by cycling automatic results, manual template comparison, or marking spectral features.
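A toy version of template cross-correlation redshifting (much simpler than the modified AUTOZ used by MARZ) scans trial redshifts and keeps the best-matching one; the template and noise model below are invented.

```python
import numpy as np

rng = np.random.default_rng(4)
wave = np.linspace(4000.0, 8000.0, 2000)  # observed wavelengths (Angstroms)

def template_flux(rest_wave):
    """Fake galaxy template: flat continuum plus one emission line at 3727 A."""
    return 1.0 + 4.0 * np.exp(-0.5 * ((rest_wave - 3727.0) / 5.0) ** 2)

z_true = 0.42
observed = template_flux(wave / (1 + z_true)) + rng.normal(0, 0.3, wave.size)

def xcorr(z):
    """Mean-subtracted correlation of the observed spectrum with the template
    placed at trial redshift z."""
    t = template_flux(wave / (1 + z))
    return np.dot(observed - observed.mean(), t - t.mean())

trial_z = np.linspace(0.0, 1.0, 2001)
score = np.array([xcorr(z) for z in trial_z])
print(f"best z = {trial_z[score.argmax()]:.3f} (true {z_true})")
```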
Automatic thermographic image defect detection of composites
NASA Astrophysics Data System (ADS)
Luo, Bin; Liebenberg, Bjorn; Raymont, Jeff; Santospirito, SP
2011-05-01
Detecting defects, and especially reliably measuring defect sizes, are critical objectives in automatic NDT defect detection applications. In this work, the Sentence software is proposed for the analysis of pulsed thermography and near IR images of composite materials. Furthermore, the Sentence software delivers an end-to-end, user friendly platform for engineers to perform complete manual inspections, as well as tools that allow senior engineers to develop inspection templates and profiles, reducing the requisite thermographic skill level of the operating engineer. Finally, the Sentence software can also offer complete independence of operator decisions by the fully automated "Beep on Defect" detection functionality. The end-to-end automatic inspection system includes sub-systems for defining a panel profile, generating an inspection plan, controlling a robot-arm and capturing thermographic images to detect defects. A statistical model has been built to analyze the entire image, evaluate grey-scale ranges, import sentencing criteria and automatically detect impact damage defects. A full width half maximum algorithm has been used to quantify the flaw sizes. The identified defects are imported into the sentencing engine which then sentences (automatically compares analysis results against acceptance criteria) the inspection by comparing the most significant defect or group of defects against the inspection standards.
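The full-width-half-maximum sizing step is straightforward to sketch: locate the two half-maximum crossings of a contrast profile across the flaw and interpolate between samples. The Gaussian test profile below is synthetic, not data from the Sentence software.

```python
import numpy as np

def fwhm(profile, dx=1.0):
    """Full width at half maximum of a 1-D profile, linearly interpolated."""
    half = (profile.max() + profile.min()) / 2.0
    above = np.nonzero(profile >= half)[0]
    i, j = above[0], above[-1]   # first/last samples at or above half max
    # Interpolate the two half-max crossings between neighbouring samples.
    left = i - (profile[i] - half) / (profile[i] - profile[i - 1])
    right = j + (profile[j] - half) / (profile[j] - profile[j + 1])
    return (right - left) * dx

# Temperature-contrast profile across a flaw (Gaussian, sigma = 4 px).
x = np.arange(100)
prof = np.exp(-0.5 * ((x - 50) / 4.0) ** 2)
print(f"flaw width = {fwhm(prof):.2f} px (theory: 2.355 * 4 = 9.42)")
```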
NASA Technical Reports Server (NTRS)
Dobrinskaya, Tatiana
2015-01-01
This paper suggests a new method for optimizing yaw maneuvers on the International Space Station (ISS). Yaw rotations are the most common large maneuvers on the ISS, often used for docking and undocking operations as well as for other activities. When maneuver optimization is used, large maneuvers that would otherwise be performed on thrusters can be performed either using control moment gyroscopes (CMGs) or with significantly reduced thruster firings. Maneuver optimization helps to save expensive propellant and reduce structural loads - an important factor for the ISS service life. In addition, optimized maneuvers reduce contamination of critical elements of the vehicle structure, such as the solar arrays. This paper presents an analytical solution for optimizing yaw attitude maneuvers. Equations describing the pitch and roll motion needed to counteract the major torques during a yaw maneuver are obtained, and a yaw rate profile is proposed. The paper also describes the physical basis of the suggested optimization approach. In the optimized case obtained, the torques are significantly reduced. This torque reduction was compared to the existing optimization method, which utilizes a computational solution. It was shown that the attitude profiles and the torque reduction match well between the two methods, and simulations using the ISS flight software showed similar propellant consumption for both. The analytical solution proposed in this paper has major benefits with respect to the computational approach. In contrast to the current computational solution, which can only be calculated on the ground, the analytical solution does not require extensive computational resources and can be implemented in the onboard software, thus making the maneuver execution automatic. An automatic maneuver significantly simplifies operations and, if necessary, allows a maneuver to be performed without communication with the ground. It also reduces the probability of command errors. The suggested analytical solution provides a new method of maneuver optimization which is less complicated, automatic, and more universal. The maneuver optimization approach presented in this paper can be used not only for the ISS, but for other orbiting space vehicles.
Chipster: user-friendly analysis software for microarray and other high-throughput data.
Kallio, M Aleksi; Tuimala, Jarno T; Hupponen, Taavi; Klemelä, Petri; Gentile, Massimiliano; Scheinin, Ilari; Koski, Mikko; Käki, Janne; Korpelainen, Eija I
2011-10-14
The growth of high-throughput technologies such as microarrays and next generation sequencing has been accompanied by active research in data analysis methodology, producing new analysis methods at a rapid pace. While most of the newly developed methods are freely available, their use requires substantial computational skills. In order to enable non-programming biologists to benefit from the method development in a timely manner, we have created the Chipster software. Chipster (http://chipster.csc.fi/) brings a powerful collection of data analysis methods within the reach of bioscientists via its intuitive graphical user interface. Users can analyze and integrate different data types such as gene expression, miRNA and aCGH. The analysis functionality is complemented with rich interactive visualizations, allowing users to select datapoints and create new gene lists based on these selections. Importantly, users can save the performed analysis steps as reusable, automatic workflows, which can also be shared with other users. Being a versatile and easily extendable platform, Chipster can be used for microarray, proteomics and sequencing data. In this article we describe its comprehensive collection of analysis and visualization tools for microarray data using three case studies. Chipster is user-friendly analysis software for high-throughput data. Its intuitive graphical user interface enables biologists to access a powerful collection of data analysis and integration tools, and to visualize data interactively. Users can collaborate by sharing analysis sessions and workflows. Chipster is open source, and the server installation package is freely available.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Yaowei; Hu, Jiansheng, E-mail: hujs@ipp.ac.cn; Wan, Zhao
2016-03-15
Deuterium pressure in deuterium-helium mixture gas is successfully measured by a common quadrupole mass spectrometer (model: RGA200) with a resolution of ~0.5 atomic mass unit (AMU), by using varied ionization energy together with newly developed software and a dedicated calibration for the RGA200. The new software is developed using MATLAB with new functions: electron energy (EE) scanning, deuterium partial pressure measurement, and automatic data saving. The RGA200 with the new software is calibrated in pure deuterium and pure helium over 1.0 × 10^-6 to 5.0 × 10^-2 Pa, and the relation between pressure and ion current at AMU 4 under EE = 25 eV and EE = 70 eV is obtained. From the calibration result and RGA200 scanning with varied ionization energy in deuterium-helium mixture gas, both the deuterium partial pressure (P_D2) and the helium partial pressure (P_He) can be obtained. The results show that the deuterium partial pressure can be measured if P_D2 > 10^-6 Pa (limited by the ultimate pressure of the calibration vessel), that the helium pressure can be measured only if P_He/P_D2 > 0.45, and that the measurement error is evaluated as 15%. This method was successfully employed in the EAST 2015 summer campaign to monitor deuterium outgassing/desorption during helium discharge cleaning.
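A minimal sketch of the two-energy idea (the sensitivities below are hypothetical, not the paper's calibration values): since helium is barely ionized at 25 eV, ion currents at AMU 4 measured at two electron energies give a 2x2 linear system in the two partial pressures.

import numpy as np

# Hypothetical calibration sensitivities (A/Pa) for ion current at AMU 4:
# at EE = 25 eV helium barely contributes; at EE = 70 eV both species do.
S = np.array([[2.0e-6, 0.0],      # I(25 eV) = s_D2_25 * P_D2 + s_He_25 * P_He
              [5.0e-6, 3.0e-6]])  # I(70 eV) = s_D2_70 * P_D2 + s_He_70 * P_He

def partial_pressures(i_25ev, i_70ev):
    """Solve the 2x2 linear system for (P_D2, P_He) from the two scans."""
    return np.linalg.solve(S, [i_25ev, i_70ev])

p_d2, p_he = partial_pressures(2.0e-9, 8.0e-9)
print(f"P_D2 = {p_d2:.2e} Pa, P_He = {p_he:.2e} Pa")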
NASA Technical Reports Server (NTRS)
Wolverton, David A.; Dickson, Richard W.; Clinedinst, Winston C.; Slominski, Christopher J.
1993-01-01
The flight software developed for the Flight Management/Flight Controls (FM/FC) MicroVAX computer used on the Transport Systems Research Vehicle for Advanced Transport Operating Systems (ATOPS) research is described. The FM/FC software computes navigation position estimates, guidance commands, and the commands issued to the control surfaces to direct the aircraft in flight. Various modes of flight are provided for, ranging from computer-assisted manual modes to fully automatic modes, including automatic landing. A high-level system overview as well as a description of each software module comprising the system is provided. Digital systems diagrams are included for each major flight control component and selected flight management functions.
Front-End/Gateway Software: Availability and Usefulness.
ERIC Educational Resources Information Center
Kesselman, Martin
1985-01-01
Reviews features of front-end software packages (interface between user and online system)--database selection, search strategy development, saving and downloading, hardware and software requirements, training and documentation, online systems and database accession, and costs--and discusses gateway services (user searches through intermediary…
Path Searching Based Fault Automated Recovery Scheme for Distribution Grid with DG
NASA Astrophysics Data System (ADS)
Xia, Lin; Qun, Wang; Hui, Xue; Simeng, Zhu
2016-12-01
Applying path searching based on distribution network topology in setting software has proven effective, and a path searching method that accounts for DG power sources is likewise applicable to the automatic generation and division of planned islands after a fault. This paper applies a path searching algorithm to the automatic division of planned islands after faults: the search starts from the fault-isolation switch and ends at each power source, and, according to the line loads traversed by the search path and the important loads integrated along the optimized path, an optimized division scheme of planned islands is formed in which each DG serves as a power source balanced against the local important loads. Finally, the COBASE software and the distribution network automation software in use are applied to illustrate the effectiveness of the automatic restoration scheme.
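A minimal sketch of the island-division idea under a toy feeder model: a breadth-first search spreads outward from a DG node, bounded by the fault-isolation switch, and adds loads only while the DG can still balance them. All node names, loads and capacities below are hypothetical.

from collections import deque

# Hypothetical feeder: adjacency list, per-node load (kW), DG capacity (kW).
feeder = {"S1": ["N1"], "N1": ["S1", "N2", "DG1"], "N2": ["N1", "N3"],
          "N3": ["N2"], "DG1": ["N1"]}
load = {"N1": 40, "N2": 30, "N3": 50}
dg_capacity = {"DG1": 90}

def plan_island(dg, fault_switch):
    """BFS outward from a DG node, adding loads while the DG can still
    balance them; the fault-isolation switch bounds the island."""
    budget, island = dg_capacity[dg], set()
    queue, seen = deque([dg]), {dg, fault_switch}
    while queue:
        node = queue.popleft()
        island.add(node)
        for nb in feeder.get(node, []):
            if nb in seen:
                continue
            seen.add(nb)
            if budget - load.get(nb, 0) >= 0:   # keep the island balanced
                budget -= load.get(nb, 0)
                queue.append(nb)
    return island

print(plan_island("DG1", fault_switch="S1"))  # {'DG1', 'N1', 'N2'}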
Automatic extraction and visualization of object-oriented software design metrics
NASA Astrophysics Data System (ADS)
Lakshminarayana, Anuradha; Newman, Timothy S.; Li, Wei; Talburt, John
2000-02-01
Software visualization is a graphical representation of software characteristics and behavior. Certain modes of software visualization can be useful in isolating problems and identifying unanticipated behavior. In this paper we present a new approach to aid understanding of object-oriented software through 3D visualization of software metrics that can be extracted from the design phase of software development. The focus of the paper is a metric extraction method and a new collection of glyphs for multi-dimensional metric visualization. Our approach utilizes the extensibility interface of a popular CASE tool to access and automatically extract the metrics from Unified Modeling Language class diagrams. Following the extraction of the design metrics, a 3D visualization of these metrics is generated for each class in the design, utilizing intuitively meaningful 3D glyphs that represent the ensemble of metrics. Extraction and visualization of design metrics can aid software developers in the early study and understanding of design complexity.
Review of automatic detection of pig behaviours by using image analysis
NASA Astrophysics Data System (ADS)
Han, Shuqing; Zhang, Jianhua; Zhu, Mengshuai; Wu, Jianzhai; Kong, Fantao
2017-06-01
Automatic detection of lying, moving, feeding, drinking, and aggressive behaviours of pigs by means of image analysis can save staff observation effort. It can help staff detect diseases or injuries of pigs early during breeding and improve the management efficiency of the swine industry. This study describes the progress of pig behaviour detection based on image analysis and advances in image segmentation of the pig body, segmentation of adhering pigs and extraction of pig behaviour characteristic parameters. Challenges for achieving automatic detection of pig behaviours are summarized.
Ideas for the rapid development of the structural models in mechanical engineering
NASA Astrophysics Data System (ADS)
Oanta, E.; Raicu, A.; Panait, C.
2017-08-01
Conceiving computer-based instruments has been a long-running concern of the authors. Some of their original solutions include: optimal processing of large matrices, interfaces between programming languages, approximation theory using spline functions, and increased numerical accuracy based on extended arbitrary-precision libraries. For the rapid development of models we identified the following directions: atomization, 'librarization', parameterization, automatization and integration. Each of these directions has particular aspects depending on whether we approach mechanical design problems or software development. Atomization means a thorough top-down decomposition analysis which offers insight into the basic features of the phenomenon. Creating libraries of reusable mechanical parts and libraries of programs (data types, functions) saves time, cost and effort when a new model must be conceived. Parameterization leads to flexible definitions of mechanical parts, the values of the parameters being changed either by a dimensioning program or in accordance with other parts belonging to the same assembly; the resulting templates may also be included in libraries. Original software applications are useful for generating the model's input data, for feeding the data into commercial CAD/FEA applications and for integrating the data of the various types of studies included in the same project.
Sankari, Ziad; Adeli, Hojjat
2011-04-01
A mobile medical device, dubbed HeartSaver, is developed for real-time monitoring of a patient's electrocardiogram (ECG) and automatic detection of several cardiac pathologies, including atrial fibrillation, myocardial infarction and atrio-ventricular block. HeartSaver is based on the adroit integration of four modern technologies in the service of medicine: electronics, wireless communication, computing, and information technology. The physical device consists of four modules: a sensor and ECG processing unit, a microcontroller, a link between the microcontroller and the cell phone, and the mobile software associated with the system. HeartSaver includes automated cardiac pathology detection algorithms. These algorithms are simple enough to be implemented on a low-cost, limited-power microcontroller but powerful enough to detect the relevant cardiac pathologies. When an abnormality is detected, the microcontroller sends a signal to a cell phone. This operation triggers application software on the cell phone that sends a text message transmitting information about the patient's physiological condition and location promptly to a physician or a guardian. HeartSaver can be used by millions of cardiac patients, with the potential to transform cardiac diagnosis, care, and treatment and save thousands of lives. Copyright © 2011 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Schoppers, Marcel
1994-01-01
The design of a flexible, real-time software architecture for trajectory planning and automatic control of redundant manipulators is described. Emphasis is placed on a technique of designing control systems that are both flexible and robust yet have good real-time performance. The solution presented involves an artificial intelligence algorithm that dynamically reprograms the real-time control system while planning system behavior.
NASA Astrophysics Data System (ADS)
Patanè, Domenico; Ferrari, Ferruccio; Giampiccolo, Elisabetta; Gresta, Stefano
A few automated data acquisition and processing systems operate on mainframes, some run on UNIX-based workstations and others on personal computers, equipped with either DOS/WINDOWS or UNIX-derived operating systems. Several large and complex software packages for automatic and interactive analysis of seismic data have been developed in recent years (mainly for UNIX-based systems). Some of these programs use a variety of artificial intelligence techniques. The first operational version of a new software package, named PC-Seism, for analyzing seismic data from a local network is presented in Patanè et al. (1999). This package, composed of three separate modules, provides an example of a new generation of visual object-oriented programs for interactive and automatic seismic data processing running on a personal computer. In this work, we mainly discuss the automatic procedures implemented in the ASDP (Automatic Seismic Data-Processing) module and its real-time application to data acquired by a seismic network running in eastern Sicily. This software uses a multi-algorithm approach and a new procedure, MSA (multi-station analysis), for signal detection, phase grouping and event identification and location. It is designed for efficient and accurate processing of local earthquake records provided by single-site and array stations. Results from ASDP processing of two different data sets recorded at Mt. Etna volcano by a regional network are analyzed to evaluate its performance. By comparing the ASDP pickings with those revised manually, the detection and subsequently the location capabilities of this software are assessed. The first data set is composed of 330 local earthquakes recorded in the Mt. Etna area during 1997 by the telemetry analog seismic network. The second data set comprises about 970 automatic locations of more than 2600 local events recorded at Mt. Etna during the last eruption (July 2001) by the present network. For the former data set, a comparison of the automatic results with the manual picks indicates that the ASDP module can accurately pick 80% of the P-waves and 65% of the S-waves. The on-line application to the latter data set shows that automatic locations are affected by larger errors, due to the preliminary setting of the configuration parameters in the program. However, both automatic ASDP and manual hypocenter locations are comparable within the estimated error bounds. New improvements of the PC-Seism software for on-line analysis are also discussed.
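The MSA procedure itself is multi-algorithm; as a minimal illustration of one widely used single-trace detector in this family, a short-term/long-term average (STA/LTA) trigger can be sketched as follows (window lengths and threshold are illustrative, not the ASDP settings):

import numpy as np

def sta_lta(signal, fs, sta_win=0.5, lta_win=10.0, threshold=3.0):
    """Classic STA/LTA detector: returns sample indices where the
    short-term/long-term energy ratio first crosses the trigger level."""
    n_sta, n_lta = int(sta_win * fs), int(lta_win * fs)
    energy = np.asarray(signal, dtype=float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(energy)))
    triggers, armed = [], True
    for i in range(n_lta, len(energy) - n_sta):
        sta = (csum[i + n_sta] - csum[i]) / n_sta
        lta = (csum[i] - csum[i - n_lta]) / n_lta
        ratio = sta / lta if lta > 0 else 0.0
        if ratio > threshold and armed:
            triggers.append(i)   # candidate P-phase onset
            armed = False        # wait for the ratio to fall before re-arming
        elif ratio < 1.0:
            armed = True
    return triggers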
NASA Astrophysics Data System (ADS)
Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.
2018-01-01
Most of the procedures in the neutron activation analysis (NAA) process established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s have been performed manually. These manual procedures, carried out by the NAA laboratory personnel, are time consuming and inefficient, especially for the sample counting and measurement process: the sample needs to be changed and the measurement software set up for every one-hour counting interval, and both steps are performed manually for every sample. Hence, an automatic sample changer (ASC) system consisting of hardware and software has been developed to automate the sample counting process for up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, which is designed and developed to control the ASC hardware and call the GammaVision software for sample measurement. The software is developed using the National Instruments LabVIEW development package.
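A minimal sketch of the control loop described above, with move_to_position and run_gamma_measurement as hypothetical stand-ins for the LabVIEW motor-control and GammaVision calls (the real software drives hardware and a commercial acquisition package):

import time

SAMPLES = 30            # magazine capacity reported for the ASC
COUNT_TIME_S = 3600     # one-hour counting time per sample

def move_to_position(slot: int) -> None:
    # hypothetical stand-in for the changer's motor-control command
    print(f"changer -> slot {slot}")

def run_gamma_measurement(slot: int, seconds: int) -> None:
    # hypothetical stand-in for starting acquisition and saving the spectrum
    print(f"counting sample {slot} for {seconds} s")
    time.sleep(0.01)     # placeholder for the real acquisition wait

for slot in range(1, SAMPLES + 1):
    move_to_position(slot)                       # load the next sample
    run_gamma_measurement(slot, COUNT_TIME_S)    # acquire and save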
Accuracy of computerized automatic identification of cephalometric landmarks by a designed software.
Shahidi, Sh; Shahidi, S; Oshagh, M; Gozin, F; Salehi, P; Danaei, S M
2013-01-01
The purpose of this study was to design software for the localization of cephalometric landmarks and to evaluate its accuracy in finding landmarks. Forty digital cephalometric radiographs were randomly selected, and 16 landmarks important in most cephalometric analyses were chosen for identification. Three expert orthodontists manually identified the landmarks twice. The mean of the two measurements of each landmark was defined as the baseline landmark. The computer could then compare the automatic system's estimate of a landmark with the baseline landmark. The software was designed using the Delphi and Matlab programming languages. The techniques were template matching, edge enhancement and some accessory techniques. The total mean error between manually and automatically identified landmarks was 2.59 mm; 12.5% of landmarks had mean errors less than 1 mm, and 43.75% had mean errors less than 2 mm. The mean errors of all landmarks except the anterior nasal spine were less than 4 mm. This software had significant accuracy for the localization of cephalometric landmarks and could be used in future applications. The accuracy obtained with the software developed in this study appears better than that of previous automated systems that used model-based and knowledge-based approaches.
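As a minimal sketch of the template-matching step (a simplified stand-in for the paper's Delphi/Matlab pipeline), normalized cross-correlation can locate the best match of a landmark template in a radiograph:

import numpy as np

def locate_landmark(image, template):
    """Exhaustive normalized cross-correlation search: returns the
    (row, col) of the best template match and its correlation score."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best, best_pos = -2.0, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            w = image[r:r + th, c:c + tw]
            w = w - w.mean()
            denom = np.sqrt((w ** 2).sum()) * t_norm
            score = (w * t).sum() / denom if denom > 0 else -1.0
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos, best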
Development of an Automatic Echo-counting Program for HROFFT Spectrograms
NASA Astrophysics Data System (ADS)
Noguchi, Kazuya; Yamamoto, Masa-Yuki
2008-06-01
Radio meteor observations by Ham-band beacon or FM radio broadcasts using “Ham-band Radio meteor Observation Fast Fourier Transform” (HROFFT) an automatic operating software have been performed widely in recent days. Previously, counting of meteor echoes on the spectrograms of radio meteor observation was performed manually by observers. In the present paper, we introduce an automatic meteor echo counting software application. Although output images of the HROFFT contain both the features of meteor echoes and those of various types of noises, a newly developed image processing technique has been applied, resulting in software that enables a useful auto-counting tool. There exists a slight error in the processing on spectrograms when the observation site is affected by many disturbing noises. Nevertheless, comparison between software and manual counting revealed an agreement of almost 90%. Therefore, we can easily obtain a dataset of detection time, duration time, signal strength, and Doppler shift of each meteor echo from the HROFFT spectrograms. Using this software, statistical analyses of meteor activities is based on the results obtained at many Ham-band Radio meteor Observation (HRO) sites throughout the world, resulting in a very useful “standard” for monitoring meteor stream activities in real time.
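A minimal sketch of the counting idea, assuming the spectrogram is available as a 2-D array of signal levels: threshold, label connected bright regions, and discard tiny blobs as impulsive noise (parameter values are illustrative, not the published ones):

import numpy as np
from scipy import ndimage

def count_echoes(spectrogram, threshold_db=10.0, min_pixels=5):
    """Threshold the spectrogram image and count connected bright blobs;
    components smaller than min_pixels are discarded as noise."""
    mask = spectrogram > threshold_db
    labels, n = ndimage.label(mask)                 # connected components
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    echoes = [i + 1 for i, s in enumerate(sizes) if s >= min_pixels]
    return len(echoes)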
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ian Metzger, Jesse Dean
2010-12-31
This software requires inputs of simple water fixture inventory information and calculates the water/energy and cost benefits of various retrofit opportunities. The tool includes water conservation measures for low-flow toilets, low-flow urinals, low-flow faucets, and low-flow showerheads. It calculates water savings, energy savings, demand reduction, cost savings, and building life-cycle costs, including simple payback, discounted payback, net present value, and savings-to-investment ratio. In addition, the tool displays the environmental benefits of a project.
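The life-cycle metrics named above follow standard engineering-economics formulas; a minimal sketch with illustrative inputs (not the tool's internals):

def retrofit_economics(install_cost, annual_savings, years=15, rate=0.03):
    """Life-cycle metrics for a fixture retrofit: simple payback, net
    present value and savings-to-investment ratio (inputs illustrative)."""
    simple_payback = install_cost / annual_savings
    pv_savings = sum(annual_savings / (1 + rate) ** t
                     for t in range(1, years + 1))
    npv = pv_savings - install_cost          # discounted savings less cost
    sir = pv_savings / install_cost          # SIR > 1 means cost-effective
    return simple_payback, npv, sir

pb, npv, sir = retrofit_economics(install_cost=4000.0, annual_savings=900.0)
print(f"payback {pb:.1f} yr, NPV ${npv:,.0f}, SIR {sir:.2f}")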
Wallner, Jürgen; Hochegger, Kerstin; Chen, Xiaojun; Mischak, Irene; Reinbacher, Knut; Pau, Mauro; Zrnc, Tomislav; Schwenzer-Zimmerer, Katja; Zemann, Wolfgang; Schmalstieg, Dieter
2018-01-01
Introduction: Computer-assisted technologies based on algorithmic software segmentation are a topic of increasing interest in complex surgical cases. However, due to functional instability, time-consuming software processes, personnel resources or license-based financial costs, many segmentation processes are outsourced from clinical centers to third parties and industry. Therefore, the aim of this trial was to assess the practical feasibility of an easily available, functionally stable and license-free segmentation approach for use in clinical practice. Material and methods: In this retrospective, randomized, controlled trial, the accuracy and accordance of the open-source segmentation algorithm GrowCut was assessed through comparison to a manually generated ground truth of the same anatomy using 10 CT lower-jaw data sets from the clinical routine. Assessment parameters were segmentation time, volume, voxel number, Dice score and Hausdorff distance. Results: Overall, semi-automatic GrowCut segmentation times were about one minute. Mean Dice score values of over 85% and Hausdorff distances below 33.5 voxels were achieved between the algorithmic GrowCut-based segmentations and the manually generated ground truth schemes. Statistical differences between the assessment parameters were not significant (p<0.05), and correlation coefficients were close to one (r > 0.94) for every comparison made between the two groups. Discussion: Fully functionally stable and time-saving segmentations with high accuracy and high positive correlation could be performed with the presented interactive open-source approach. In the cranio-maxillofacial complex, the method could represent an algorithmic alternative for image-based segmentation in clinical practice, e.g. for surgical treatment planning or visualization of postoperative results, and it offers several advantages. Owing to its open-source basis, the method can be further developed by other groups or specialists. Systematic comparisons to other segmentation approaches and studies with larger data sets are areas of future work. PMID:29746490
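The two assessment metrics are standard and easy to reproduce; a minimal sketch for binary masks, using SciPy's directed Hausdorff distance (a simplification of surface-based variants):

import numpy as np
from scipy.spatial.distance import directed_hausdorff

def dice(a, b):
    """Dice similarity coefficient between two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def hausdorff(a, b):
    """Symmetric Hausdorff distance (in voxels) between the two sets
    of foreground voxel coordinates."""
    pa, pb = np.argwhere(a), np.argwhere(b)
    return max(directed_hausdorff(pa, pb)[0], directed_hausdorff(pb, pa)[0])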
Energy savings modelling of re-tuning energy conservation measures in large office buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fernandez, Nick; Katipamula, Srinivas; Wang, Weimin
Today, many large commercial buildings use sophisticated building automation systems (BASs) to manage a wide range of building equipment. While the capabilities of BASs have increased over time, many buildings still do not fully use the BAS's capabilities and are not properly commissioned, operated or maintained, which leads to inefficient operation, increased energy use, and reduced equipment lifetimes. This paper investigates the energy savings potential of several common HVAC system re-tuning measures on a typical large office building, using the Department of Energy's building energy modeling software, EnergyPlus. The baseline prototype model uses roughly as much energy as an average large office building in the existing building stock, but does not utilize any re-tuning measures. Individual re-tuning measures simulated against this baseline include automatic schedule adjustments, damper minimum flow adjustments, thermostat adjustments, as well as dynamic resets (set points that change continuously with building and/or outdoor conditions) of static pressure, supply-air temperature, condenser water temperature, chilled and hot water temperature, and chilled and hot water differential pressure set points. Six combinations of these individual measures have been formulated, each designed to conform to limitations on the implementation of certain individual measures that might exist in typical buildings. All the individual measures and combinations were simulated in 16 climate locations representative of specific U.S. climate zones. The modeling results suggest that the most effective energy savings measures are those that affect the demand side of the building (air systems and schedules). Many of the demand-side individual measures were capable of reducing annual total HVAC system energy consumption by over 20% in most cities that were modeled. Supply-side measures affecting HVAC plant conditions were only modestly successful (less than 5% annual HVAC energy savings for most cities for all measures). Combining many of the re-tuning measures revealed deep savings potential: some of the more aggressive combinations yielded 35-75% reductions in annual HVAC energy consumption, depending on climate and building vintage.
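As a minimal illustration of one dynamic-reset measure, a supply-air-temperature reset can be expressed as a linear schedule against outdoor air temperature; the breakpoints below are illustrative, not the study's values:

def supply_air_reset(oat_c, oat_lo=10.0, oat_hi=21.0, sat_hi=18.0, sat_lo=13.0):
    """Linear supply-air-temperature reset: warm SAT in cold weather,
    cool SAT in hot weather, interpolated in between."""
    if oat_c <= oat_lo:
        return sat_hi
    if oat_c >= oat_hi:
        return sat_lo
    frac = (oat_c - oat_lo) / (oat_hi - oat_lo)
    return sat_hi - frac * (sat_hi - sat_lo)

print(supply_air_reset(15.5))  # mid-range OAT -> SAT between 13 and 18 degC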
Specdata: Automated Analysis Software for Broadband Spectra
NASA Astrophysics Data System (ADS)
Oliveira, Jasmine N.; Martin-Drumel, Marie-Aline; McCarthy, Michael C.
2017-06-01
With the advancement of chirped-pulse techniques, broadband rotational spectra with a few tens to several hundred GHz of spectral coverage are now routinely recorded. When studying multi-component mixtures that might result, for example, from the use of an electrical discharge, lines of new chemical species are often obscured by those of known compounds, and analysis can be laborious. To address this issue, we have developed SPECdata, an open-source, interactive tool designed to simplify and greatly accelerate spectral analysis and discovery. Our software tool combines automated and manual components that free the user from computation while giving them considerable flexibility to assign, manipulate, interpret and export their analysis. The automated, and key, component of the new software is a database query system that rapidly assigns transitions of known species in an experimental spectrum. For each experiment, the software identifies spectral features and subsequently assigns them to known molecules within an in-house database (Pickett .cat files, lists of frequencies, etc.) or those catalogued in Splatalogue (using automatic on-line queries). With suggested assignments, control is then handed over to the user, who can choose to accept, decline or add additional species. Data visualization, statistical information, and interactive widgets assist the user in making decisions about their data. SPECdata has several other useful features intended to improve the user experience. Exporting a full report of the analysis, or a peak file in which assigned lines are removed, is among several options. A user may also save their progress to continue at another time. Additional features of SPECdata help the user to maintain and expand their database for future use. A user-friendly interface allows one to search, upload, edit or update catalog or experiment entries.
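A minimal sketch of the catalog-query idea (catalog entries and tolerance below are hypothetical): match experimental peak frequencies against known transitions and hand ambiguous or unmatched lines back to the user.

KNOWN_LINES_MHZ = {            # hypothetical in-house catalog entries
    "HC3N J=2-1": 18196.31,
    "OCS J=2-1": 24325.92,
}

def assign_peaks(peaks_mhz, tol_mhz=0.05):
    assignments, unknown = {}, []
    for f in peaks_mhz:
        hits = [name for name, f0 in KNOWN_LINES_MHZ.items()
                if abs(f - f0) <= tol_mhz]
        if hits:
            assignments[f] = hits   # final accept/decline is the user's call
        else:
            unknown.append(f)       # candidate new-species line
    return assignments, unknown

print(assign_peaks([18196.29, 21000.00]))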
NASA Technical Reports Server (NTRS)
1989-01-01
001 is an integrated tool suited for automatically developing ultra-reliable models, simulations and software systems. Developed and marketed by Hamilton Technologies, Inc. (HTI), it has been applied in engineering, manufacturing, banking and software tools development. The software provides the ability to simplify the complex. A system developed with 001 can be a prototype or fully developed with production-quality code. It is free of interface errors, consistent, logically complete and has no data or control flow errors. Systems can be designed, developed and maintained with maximum productivity. Margaret Hamilton, President of Hamilton Technologies, also directed the research and development of USE.IT, an earlier product which was the first computer-aided software engineering product in the industry to concentrate on automatically supporting the development of an ultra-reliable system throughout its life cycle. Both products originated in NASA technology developed under a Johnson Space Center contract.
NASA Astrophysics Data System (ADS)
Luo, Jiasai; Guo, Yongcai; Wang, Xin
2018-06-01
This paper puts forward a novel method for the fabrication of a sandwich-structured BCE using a detachable micro-hole array (MHA) prepared by 3D printing. Compared with most traditional methods, 3D printing enables direct micro-fabrication of a curved BCE without pattern transfer and substrate reshaping. This 3D fabrication method allows rapid fabrication of the curved BCE and automatic assembly of the detachable MHA using a custom-built mold under negative pressure. A multi-focusing micro-lens array (MLA) was formed by adjusting the parameters of the curved detachable MHA. Imaging performance was effectively enhanced by the sandwich structure, which consists of the multi-focusing MLA, the outer detachable MHA and the inner solidified MHA. The method is suitable for mass production owing to its time-saving, cost-effective and simple process. Optical design software was used to analyze the optical properties, and an imaging simulation was performed.
NASA Astrophysics Data System (ADS)
Topolskiy, D.; Topolskiy, N.; Solomin, E.; Topolskaya, I.
2016-04-01
In the present paper the authors discuss some ways of solving energy-saving problems in mechanical engineering. In the authors' opinion, one way of solving this problem is the integrated modernization of the power engineering objects of mechanical engineering companies, intended to increase the efficiency of energy supply control and to improve the commercial accounting of electric energy. The authors propose the use of digital current and voltage transformers for these purposes. To check the compliance of this equipment with the IEC 61850 International Standard, a mathematical model of the data exchange process between the measuring transformers and a universal SCADA system was built. The modeling results show that the discussed equipment meets the Standard's requirements and that using a universal SCADA system for these purposes is preferable and economically reasonable. In the modeling the authors used the following software: MasterScada, Master OPC_DI_61850, and OPNET.
Development of Software for Automatic Analysis of Intervention in the Field of Homeopathy.
Jain, Rajesh Kumar; Goyal, Shagun; Bhat, Sushma N; Rao, Srinath; Sakthidharan, Vivek; Kumar, Prasanna; Sajan, Kannanaikal Rappayi; Jindal, Sameer Kumar; Jindal, Ghanshyam D
2018-05-01
To study the effect of homeopathic medicines (in higher potencies) on normal subjects, a Peripheral Pulse Analyzer (PPA) was used to record physiologic variability parameters before and after administration of the medicine/placebo in 210 normal subjects. Data were acquired in seven rounds: placebo was administered in rounds 1 and 2, and medicine in potencies 6, 30, 200, 1M, and 10M was administered in rounds 3 to 7, respectively. Five different medicines in the said potencies were given to groups of around 40 subjects each. Although processing the data required human intervention, a software application has been developed to analyze the processed data and detect the response, eliminating undue delay as well as human bias in subjective analysis. This utility, named Automatic Analysis of Intervention in the Field of Homeopathy, is run on the processed PPA data, and its outcome has been compared with the manual analysis. The application software uses an adaptive threshold based on statistics for detecting responses, in contrast to the fixed threshold used in manual analysis. The automatic analysis detected 12.96% more responses than the subjective analysis; these additional responses were manually verified to be true positives, indicating the robustness of the application software. The automatic analysis software was also run on another set of pulse harmonic parameters derived from the same data set to study cardiovascular susceptibility, and 385 responses were detected in contrast to 272 for the variability parameters; 65% of the responding subjects were common to both. This not only validates the software utility as giving a consistent yield but also reveals the certainty of the response. This development may lead to electronic proving of homeopathic medicines (e-proving).
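A minimal sketch of the adaptive-threshold idea, assuming the response measure is a per-subject parameter change: the cutoff adapts to the cohort's statistics instead of being fixed.

import numpy as np

def detect_responses(deltas, k=2.0):
    """Flag parameter changes exceeding an adaptive threshold of
    mean + k*std of the cohort, rather than a fixed cutoff."""
    deltas = np.asarray(deltas, dtype=float)
    adaptive = deltas.mean() + k * deltas.std()
    return deltas > adaptive

changes = np.array([0.2, 0.1, 0.3, 2.5, 0.2, 0.1])
print(detect_responses(changes))   # only the outlying change is a "response"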
Split-personality transmission: shifts like an automatic, saves fuel like a manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott, D.
1981-11-01
The design, operation and performance of a British-invented automatic transmission which claims to achieve fuel economy values equal to those attained with manual shifts are described. Developed in both 4-speed and 6-speed versions, this transmission uses standard parts made for existing manual transmissions, rearranges the gear pairings, and relies on a microcomputer to pick the optimal shift points according to load requirements. (LCL)
Fully automatic segmentation of white matter hyperintensities in MR images of the elderly.
Admiraal-Behloul, F; van den Heuvel, D M J; Olofsen, H; van Osch, M J P; van der Grond, J; van Buchem, M A; Reiber, J H C
2005-11-15
The role of quantitative image analysis in large clinical trials is continuously increasing. Several methods are available for performing white matter hyperintensity (WMH) volume quantification; they vary in the amount of human interaction involved. In this paper, we describe a fully automatic segmentation that was used to quantify WMHs in a large clinical trial on elderly subjects. Our segmentation method combines information from three different MR images: proton density (PD), T2-weighted and fluid-attenuated inversion recovery (FLAIR) images; it uses an established artificial intelligence technique (a fuzzy inference system) and does not require extensive computation. The reproducibility of the segmentation was evaluated in 9 patients who underwent scan-rescan with repositioning; an intraclass correlation coefficient (ICC) of 0.91 was obtained. The effect of differences in image resolution was tested in 44 patients scanned with 6- and 3-mm slice thickness FLAIR images; we obtained an ICC value of 0.99. The accuracy of the segmentation was evaluated on 100 patients for whom manual delineation of WMHs was available; the obtained ICC was 0.98 and the similarity index was 0.75. Besides demonstrating very high volumetric and spatial agreement with expert delineation, the software did not require more than 2 min per patient (from loading the images to saving the results) on a Pentium-4 processor (512 MB RAM).
NASA Astrophysics Data System (ADS)
Akilan, A.; Nagasubramanian, V.; Chaudhry, A.; Reddy, D. Rajesh; Sudheer Reddy, D.; Usha Devi, R.; Tirupati, T.; Radhadevi, P. V.; Varadan, G.
2014-11-01
Block adjustment is a technique for large-area mapping from images obtained from different remote sensing satellites. The challenge in this process is to handle, at the system level, huge numbers of satellite images from different sources with different resolutions and accuracies. This paper describes a system with various tools and techniques to effectively handle the end-to-end chain in large-area mapping and production with a good level of automation, along with provisions for intuitive analysis of the final results in 3D and 2D environments. In addition, the interface for using open-source ortho and DEM references, viz. ETM, SRTM, etc., and for displaying ESRI shapes of the image footprints is explained. Rigorous theory, mathematical modelling, workflow automation and sophisticated software engineering tools are included to ensure high photogrammetric accuracy and productivity. Major building blocks of the block adjustment solution, such as the georeferencing, geo-capturing and geo-modelling tools, are explained in this paper. To provide an optimal bundle block adjustment solution with high-precision results, the system has been optimized in many stages to fully utilize the hardware resources. The robustness of the system is ensured by handling failures in the automatic procedure and saving the process state at every stage for subsequent restoration from the point of interruption. The results obtained from the various stages of the system are presented in the paper.
A New Tool For The Hospital Lab
NASA Technical Reports Server (NTRS)
1979-01-01
The multi-module AutoMicrobic System (AMS), whose development stemmed from space-biomedical research, is an automatic, time-saving system for detecting and identifying disease-producing microorganisms in the human body.
Applying Hypertext Structures to Software Documentation.
ERIC Educational Resources Information Center
French, James C.; And Others
1997-01-01
Describes a prototype system for software documentation management called SLEUTH (Software Literacy Enhancing Usefulness to Humans) being developed at the University of Virginia. Highlights include information retrieval techniques, hypertext links that are installed automatically, a WAIS (Wide Area Information Server) search engine, user…
NASA Astrophysics Data System (ADS)
Rahman, Fuad; Tarnikova, Yuliya; Hartono, Rachmat; Alam, Hassan
2006-01-01
This paper presents a novel automatic web publishing solution, PageView (R). PageView (R) is a complete working solution for document processing and management. The principal aim of this tool is to allow workgroups to share, access and publish documents on-line on a regular basis. For example, assume a person is working on some documents: the user will, in some fashion, organize the work either in a local directory or on a shared network drive. Now extend that concept to a workgroup. Within a workgroup, several users work together on documents and save them in a directory structure somewhere in a document repository. The next stage of this reasoning is a workgroup that works on documents and wants to publish them routinely on-line. The members may be using different editing tools, different software, and different graphics tools, so the resulting documents may be in PDF, Microsoft Office (R), HTML, or WordPerfect format, to name just a few. In general, this process requires the documents to be converted to HTML, after which a web designer must work on the collection to make it available on-line. PageView (R) takes care of this whole process automatically, making the document workflow clean and easy to follow. The PageView (R) Server publishes documents, complete with the directory structure, for online use. The documents are automatically converted to HTML and PDF so that users can view the content without downloading the original files or browser plug-ins. Once published, other users can access the documents as if accessing them from their local folders. The paper describes the complete working system and discusses possible applications within document management research.
Web-Based Environment for Maintaining Legacy Software
NASA Technical Reports Server (NTRS)
Tigges, Michael; Thompson, Nelson; Orr, Mark; Fox, Richard
2007-01-01
Advanced Tool Integration Environment (ATIE) is the name of both a software system and the Web-based environment it creates for maintaining an archive of legacy software and of the expertise involved in developing that software. ATIE can also be used in modifying legacy software and developing new software. The information that can be encapsulated in ATIE includes experts' documentation, input and output data of test cases, source code, and compilation scripts. All of this information is available within a common environment and retained in a database for ease of access and recovery using powerful search engines. ATIE also accommodates the embedding of supporting software that users require for their work, and even enables access to supporting commercial off-the-shelf (COTS) software within the flow of the expert's work. The flow of work can be captured by saving the sequence of computer programs that the expert uses. A user gains access to ATIE via a Web browser. A modern Web-based graphical user interface promotes efficiency in the retrieval, execution, and modification of legacy code. Thus, ATIE saves time and money in the support of new and pre-existing programs.
Automatic Coding of Short Text Responses via Clustering in Educational Assessment
ERIC Educational Resources Information Center
Zehner, Fabian; Sälzer, Christine; Goldhammer, Frank
2016-01-01
Automatic coding of short text responses opens new doors in assessment. We implemented and integrated baseline methods of natural language processing and statistical modelling by means of software components that are available under open licenses. The accuracy of automatic text coding is demonstrated by using data collected in the "Programme…
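A minimal sketch of the baseline approach described, using scikit-learn (the responses are toy examples; in the study, cluster labels would be mapped to scoring codes):

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Toy short text responses standing in for thousands of assessment answers
responses = ["the sun heats the water", "water is heated by sunlight",
             "because of the moon", "moon gravity", "i do not know"]

vectors = TfidfVectorizer().fit_transform(responses)  # bag-of-words features
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vectors)
print(labels)  # responses sharing a cluster get the same provisional code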
Automatic discovery of the communication network topology for building a supercomputer model
NASA Astrophysics Data System (ADS)
Sobolev, Sergey; Stefanov, Konstantin; Voevodin, Vadim
2016-10-01
The Research Computing Center of Lomonosov Moscow State University is developing the Octotron software suite for automatic monitoring and mitigation of emergency situations in supercomputers so as to maximize hardware reliability. The suite is based on a software model of the supercomputer. The model uses a graph to describe the computing system components and their interconnections. One of the most complex components of a supercomputer that needs to be included in the model is its communication network. This work describes the proposed approach for automatically discovering the Ethernet communication network topology in a supercomputer and its description in terms of the Octotron model. This suite automatically detects computing nodes and switches, collects information about them and identifies their interconnections. The application of this approach is demonstrated on the "Lomonosov" and "Lomonosov-2" supercomputers.
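A minimal sketch of the discovery step, assuming per-device neighbor tables have already been collected (e.g. from switch management interfaces): fold them into the vertices and edges of a model graph. Device names below are hypothetical.

neighbor_tables = {                # hypothetical collected data
    "switch1": ["node1", "node2", "switch2"],
    "switch2": ["switch1", "node3"],
}

def build_topology(tables):
    vertices, edges = set(), set()
    for device, neighbors in tables.items():
        vertices.add(device)
        for nb in neighbors:
            vertices.add(nb)
            edges.add(tuple(sorted((device, nb))))  # undirected link
    return vertices, edges

verts, links = build_topology(neighbor_tables)
print(sorted(verts), sorted(links), sep="\n")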
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2008-07-01
This case study describes how the Kaiser Aluminum plant in Sherman, Texas, achieved annual savings of $360,000 and 45,000 MMBtu, and improved furnace energy intensity by 11.1% after receiving a DOE Save Energy Now energy assessment and implementing recommendations to improve the efficiency of its process heating system.
Diagnosis diagrams for passing signals on an automatic block signaling railway section
NASA Astrophysics Data System (ADS)
Spunei, E.; Piroi, I.; Chioncel, C. P.; Piroi, F.
2018-01-01
This work presents a diagnosis method for railway traffic security installations. More specifically, the authors present a series of diagnosis charts for the passing signals on a railway block equipped with an automatic block signaling installation. These charts are based on the operational electric schemes and are subsequently used to develop a diagnosis software package. The software package contributes substantially to reducing failure detection and remedy times for these types of installation faults, and its use eliminates wrong decisions in the fault detection process, decisions that may result in longer remedy times and, sometimes, in railway traffic events.
Integrated Functional and Executional Modelling of Software Using Web-Based Databases
NASA Technical Reports Server (NTRS)
Kulkarni, Deepak; Marietta, Roberta
1998-01-01
NASA's software subsystems undergo extensive modification and updates over their operational lifetimes. It is imperative that modified software satisfy safety goals. This report discusses the difficulties encountered in doing so and presents a solution based on integrated modelling of software, the use of automatic information extraction tools, web technology and databases.
A Reconfigurable Simulation-Based Test System for Automatically Assessing Software Operating Skills
ERIC Educational Resources Information Center
Su, Jun-Ming; Lin, Huan-Yu
2015-01-01
In recent years, software operating skills, the ability in computer literacy to solve problems using specific software, has become much more important. A great deal of research has also proven that students' software operating skills can be efficiently improved by practicing customized virtual and simulated examinations. However, constructing…
Automated Translation of Safety Critical Application Software Specifications into PLC Ladder Logic
NASA Technical Reports Server (NTRS)
Leucht, Kurt W.; Semmel, Glenn S.
2008-01-01
The numerous benefits of automatic application code generation are widely accepted within the software engineering community. A few of these benefits include raising the abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at the NASA Kennedy Space Center (KSC) recognized the need for PLC code generation while developing their new ground checkout and launch processing system. They developed a process and a prototype software tool that automatically translates a high-level representation or specification of safety critical application software into ladder logic that executes on a PLC. This process and tool are expected to increase the reliability of the PLC code over that which is written manually, and may even lower life-cycle costs and shorten the development schedule of the new control system at KSC. This paper examines the problem domain and discusses the process and software tool that were prototyped by the KSC software engineers.
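Purely as an illustration of the translation concept (the specification grammar below is invented, not KSC's), a tiny boolean specification can be mechanically lowered to instruction-list text for a PLC rung:

# Illustrative only: translate a toy boolean spec into textual
# instruction list (IL) for a single PLC rung.
def spec_to_il(spec: str) -> str:
    lhs, coil = [s.strip() for s in spec.split("->")]
    terms = [t.strip() for t in lhs.split("AND")]
    lines = [f"LD  {terms[0]}"] + [f"AND {t}" for t in terms[1:]]
    lines.append(f"ST  {coil}")
    return "\n".join(lines)

print(spec_to_il("DoorClosed AND PressureOK -> EnableLaunch"))
# LD  DoorClosed / AND PressureOK / ST  EnableLaunch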
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lasche, George P.
2009-10-01
Cambio is an application intended to automatically read and display any spectrum file of any format in the world that the nuclear emergency response community might encounter. Cambio also provides an analysis capability suitable for HPGe spectra when the detector response and scattering environment are not well known. Why is Cambio needed: (1) Cambio solves the following problem: with over 50 types of formats from instruments used in the field and new format variations appearing frequently, it is impractical for every responder to have current versions of the manufacturer's software for every instrument used in the field. (2) Cambio converts field spectra to any one of several common formats that are used for analysis, saving valuable time in an emergency situation. (3) Cambio provides basic tools for comparing spectra, calibrating spectra, and isotope identification, with analysis suited especially for HPGe spectra. (4) Cambio has a batch processing capability to automatically translate a large number of archival spectral files of any format to one of several common formats, such as the IAEA SPE or the DHS N42. Currently over 540 analysts and members of the nuclear emergency response community worldwide are on the distribution list for updates to Cambio. Cambio users come from all levels of government, university, and commercial partners around the world that support efforts to counter terrorist nuclear activities. Cambio is Unclassified Unlimited Release (UUR) and distributed by internet downloads with email notifications whenever a new build of Cambio provides new formats, bug fixes, or new or improved capabilities. Cambio is also provided as a DLL to the Karlsruhe Institute for Transuranium Elements so that Cambio's automatic file-reading capability can be included at the Nucleonica web site.
An Interactive Graphical User Interface for Maritime Security Services
NASA Astrophysics Data System (ADS)
Reize, T.; Müller, R.; Kiefl, R.
2013-10-01
In order to analyse optical satellite images for maritime security issues in near-real-time (NRT), an interactive graphical user interface (GUI) based on NASA World Wind was developed and is presented in this article. Targets and activities can be detected, measured and classified with this tool simply and quickly. The service uses optical satellite images, currently from six sensors: WorldView-1, WorldView-2, Ikonos, Quickbird, GeoEye-1 and EROS-B. The GUI can also handle SAR images, airborne images and UAV images. Software configurations are provided in a job-order file, so all preparation tasks, such as image installation, are performed fully automatically. The imagery can be overlaid with vessels derived by an automatic detection processor. These potential vessel layers can be zoomed in on with a single click and sorted with an adapted method. Further object properties, such as vessel type or the confidence level of the identification, can be added manually by the operator. The heading angle can be refined by dragging the vessel's head or flipping it by 180° with a single click. Further vessels or other relevant objects can be added. An object's length, width, heading and position are calculated automatically from three clicks: on the top, the bottom, and an arbitrary point on one of the object's longer sides. In the case of activity detection, the detected objects can be grouped into areas of interest (AOIs) and classified according to the ordered activities. All relevant information is finally written to an exchange file after quality control and any necessary correction procedures are performed. If required, image thumbnails can be cut around objects or around whole areas of interest and saved as separate, geo-referenced images.
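The three-click geometry reduces to elementary vector algebra; a minimal sketch in image coordinates (a real implementation would work in geo-coordinates with a proper north reference):

import math

def object_geometry(top, bottom, side):
    """Length, width and heading of a vessel from three clicked points:
    bow ('top'), stern ('bottom') and any point on a longer side."""
    (x1, y1), (x2, y2), (x3, y3) = top, bottom, side
    dx, dy = x1 - x2, y1 - y2
    length = math.hypot(dx, dy)
    # perpendicular distance of the side point from the bow-stern axis
    half_width = abs(dx * (y2 - y3) - dy * (x2 - x3)) / length
    heading = math.degrees(math.atan2(dx, dy)) % 360  # 0 deg along +y
    return length, 2 * half_width, heading

print(object_geometry((0, 50), (0, 0), (4, 25)))  # (50.0, 8.0, 0.0)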
Iwasaki, Wataru; Yamamoto, Yasunori; Takagi, Toshihisa
2010-12-13
In this paper, we describe a server/client literature management system specialized for the life science domain, the TogoDoc system (Togo, pronounced Toe-Go, is a romanization of a Japanese word for integration). The server and the client program cooperate closely over the Internet to provide life scientists with an effective literature recommendation service and efficient literature management. The content-based and personalized literature recommendation helps researchers to isolate interesting papers from the "tsunami" of literature, in which, on average, more than one biomedical paper is added to MEDLINE every minute. Because researchers these days need to cover updates of much wider topics to generate hypotheses using massive datasets obtained from public databases or omics experiments, the importance of having an effective literature recommendation service is rising. The automatic recommendation is based on the content of personal literature libraries of electronic PDF papers. The client program automatically analyzes these files, which are sometimes deeply buried in storage disks of researchers' personal computers. Just saving PDF papers to the designated folders makes the client program automatically analyze and retrieve metadata, rename file names, synchronize the data to the server, and receive the recommendation lists of newly published papers, thus accomplishing effortless literature management. In addition, the tag suggestion and associative search functions are provided for easy classification of and access to past papers (researchers who read many papers sometimes only vaguely remember or completely forget what they read in the past). The TogoDoc system is available for both Windows and Mac OS X and is free. The TogoDoc Client software is available at http://tdc.cb.k.u-tokyo.ac.jp/, and the TogoDoc server is available at https://docman.dbcls.jp/pubmed_recom.
Automated Sequence Generation Process and Software
NASA Technical Reports Server (NTRS)
Gladden, Roy
2007-01-01
"Automated sequence generation" (autogen) signifies both a process and software used to automatically generate sequences of commands to operate various spacecraft. The autogen software comprises the autogen script plus the Activity Plan Generator (APGEN) program. APGEN can be used for planning missions and command sequences.
2014-07-01
different value and pressing Enter. The PRC-Calc session can be saved for future use with these new values using the Save Session button in the upper… describe (a) how the PRC Correction Calculator (PRC-Calc) uses the model of Fernandez et al. (2009), (b) how well its performance compares against experimental data, (c) how the user may prepare their computer with software to use the PRC calculator, and then (d) how to use PRC-Calc to process PRC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, S; Wu, Y; Chang, X
Purpose: A novel computer software system, APDV (Automatic Pre-Delivery Verification), has been developed to verify patient treatment plan parameters immediately prior to treatment delivery in order to automatically detect and prevent catastrophic errors. Methods: APDV is designed to continuously monitor new DICOM plan files on the TMS computer at the treatment console. When new plans to be delivered are detected, APDV checks the consistency of plan parameters and high-level plan statistics using underlying rules and statistical properties for the given treatment site, technique and modality. These rules were quantitatively derived by retrospectively analyzing all the EBRT treatment plans of the past 8 years at the authors' institution. Therapists and physicists are notified with a warning message displayed on the TMS computer if any critical errors are detected, and the check results, confirmations, and dismissal actions are saved to a database for further review. Results: APDV was implemented as a stand-alone program in C# to ensure the required real-time performance. Mean values and standard deviations were quantitatively derived for various plan parameters including MLC usage, MU/cGy ratio, beam SSD, beam weighting, and beam gantry angles (for lateral targets only) per treatment site, technique and modality. 2D rules combining the MU/cGy ratio and averaged SSD values were also derived using joint probabilities of confidence error ellipses. The statistics of these major treatment plan parameters quantitatively evaluate the consistency of any treatment plan, which facilitates the automatic APDV checking procedures. Conclusion: APDV could be useful in detecting and preventing catastrophic errors immediately before treatment delivery. Future plans include automatic patient identity and patient setup checks after daily patient images are acquired by the machine and become available on the TMS computer. This project is supported by the Agency for Healthcare Research and Quality (AHRQ) under award 1R01HS0222888. The senior author received research grants from ViewRay Inc. and Varian Medical System.
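A minimal sketch of the two kinds of checks described, with illustrative statistics: a univariate mean ± k·σ rule and a 2-D confidence-ellipse rule via the Mahalanobis distance (9.21 is roughly the 99% chi-square quantile for two degrees of freedom).

import numpy as np

def univariate_flag(x, mean, std, k=3.0):
    """Flag a plan parameter outside mean +/- k*std for its site/technique."""
    return abs(x - mean) > k * std

def ellipse_flag(xy, mu, cov, chi2_cut=9.21):
    """Flag a (MU/cGy, mean SSD) pair outside a 2-D confidence ellipse
    via its squared Mahalanobis distance."""
    d = np.asarray(xy, dtype=float) - np.asarray(mu, dtype=float)
    return float(d @ np.linalg.inv(cov) @ d) > chi2_cut

cov = np.array([[0.04, 0.01], [0.01, 0.25]])   # illustrative statistics
print(univariate_flag(1.9, mean=1.2, std=0.2))           # True: > 3 sigma
print(ellipse_flag([1.9, 91.0], mu=[1.2, 90.0], cov=cov))  # True: outside ellipse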
Learning from examples - Generation and evaluation of decision trees for software resource analysis
NASA Technical Reports Server (NTRS)
Selby, Richard W.; Porter, Adam A.
1988-01-01
A general solution method for the automatic generation of decision (or classification) trees is investigated. The approach is to provide insights through in-depth empirical characterization and evaluation of decision trees for software resource data analysis. The trees identify classes of objects (software modules) that had high development effort. Sixteen software systems ranging from 3,000 to 112,000 source lines were selected for analysis from a NASA production environment. The collection and analysis of 74 attributes (or metrics), for over 4,700 objects, captured information about the development effort, faults, changes, design style, and implementation style. A total of 9,600 decision trees were automatically generated and evaluated. The trees correctly identified 79.3 percent of the software modules that had high development effort or faults, and the trees generated from the best parameter combinations correctly identified 88.4 percent of the modules on the average.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, AS; Piper, J; Curry, K
2015-06-15
Purpose: Prostate MRI plays an important role in diagnosis, biopsy guidance, and therapy planning for prostate cancer. Prostate MRI contours can be used to aid in image fusion for ultrasound biopsy guidance and delivery of radiation. Our goal in this study is to evaluate an automatic atlas-based segmentation method for generating prostate and peripheral zone (PZ) contours on MRI. Methods: T2-weighted MRIs were acquired on 3T-Discovery MR750 System (GE, Milwaukee). The Volumes of Interest (VOIs): prostate and PZ were outlined by an expert radiation oncologist and used to create an atlas library for atlas-based segmentation. The atlas-segmentation accuracy was evaluatedmore » using a leave-one-out analysis. The method involved automatically finding the atlas subject that best matched the test subject followed by a normalized intensity-based free-form deformable registration of the atlas subject to the test subject. The prostate and PZ contours were transformed to the test subject using the same deformation. For each test subject the three best matches were used and the final contour was combined using Majority Vote. The atlas-segmentation process was fully automatic. Dice similarity coefficients (DSC) and mean Hausdorff values were used for comparison. Results: VOIs contours were available for 28 subjects. For the prostate, the atlas-based segmentation method resulted in an average DSC of 0.88+/−0.08 and a mean Hausdorff distance of 1.1+/−0.9mm. The number of patients (#) in DSC ranges are as follows: 0.60–0.69(1), 0.70–0.79(2), 0.80–0.89(13), >0.89(11). For the PZ, the average DSC was 0.72+/−0.17 and average Hausdorff of 0.9+/−0.9mm. The number of patients (#) in DSC ranges are as follows: <0.60(4), 0.60–0.69(6), 0.70–0.79(7), 0.80–0.89(9), >0.89(1). Conclusion: The MRI atlas-based segmentation method achieved good results for both the whole prostate and PZ compared to expert defined VOIs. The technique is fast, fully automatic, and has the potential to provide significant time savings for prostate VOI definition. AS Nelson and J Piper are partial owners of MIM Software, Inc. AS Nelson, J Piper, K Curry, and A Swallen are current employees at MIM Software, Inc.« less
Specifications and programs for computer software validation
NASA Technical Reports Server (NTRS)
Browne, J. C.; Kleir, R.; Davis, T.; Henneman, M.; Haller, A.; Lasseter, G. L.
1973-01-01
Three software products developed during the study are reported and include: (1) FORTRAN Automatic Code Evaluation System, (2) the Specification Language System, and (3) the Array Index Validation System.
Automatic Evolution of Molecular Nanotechnology Designs
NASA Technical Reports Server (NTRS)
Globus, Al; Lawton, John; Wipke, Todd; Saini, Subhash (Technical Monitor)
1998-01-01
This paper describes strategies for automatically generating designs for analog circuits at the molecular level. Software maps out the edges and vertices of potential nanotechnology systems on graphs, then selects appropriate ones through evolutionary or genetic paradigms.
A Cost Estimation Analysis of U.S. Navy Ship Fuel-Savings Techniques and Technologies
2009-09-01
readings to the boiler operator. The PLC will provide constant automatic trimming of the excess oxygen based upon real time SGA readings. An SCD...the author): The Aegis Combat System is controlled by an advanced, automatic detect-and-track, multi-function three-dimensional passive...subsequently offloaded. An Online Wash System would reduce these maintenance costs and improve fuel efficiency of these engines by keeping the engines
Software Solution Saves Dollars
ERIC Educational Resources Information Center
Trotter, Andrew
2004-01-01
This article discusses computer software that can give classrooms and computer labs the capabilities of costly PC's at a small fraction of the cost. A growing number of cost-conscious school districts are finding budget relief in low-cost computer software known as "open source" that can do everything from manage school Web sites to equip…
Messier, Erik
2016-08-01
A Multichannel Systems (MCS) microelectrode array data acquisition (DAQ) unit is used to collect multichannel electrograms (EGM) from a Langendorff perfused rabbit heart system to study sudden cardiac death (SCD). MCS provides software through which data being processed by the DAQ unit can be displayed and saved, but this software's combined utility with MATLAB is not very effective. MCSs software stores recorded EGM data in a MathCad (MCD) format, which is then converted to a text file format. These text files are very large, and it is therefore very time consuming to import the EGM data into MATLAB for real-time analysis. Therefore, customized MATLAB software was developed to control the acquisition of data from the MCS DAQ unit, and provide specific laboratory accommodations for this study of SCD. The developed DAQ unit control software will be able to accurately: provide real time display of EGM signals; record and save EGM signals in MATLAB in a desired format; and produce real time analysis of the EGM signals; all through an intuitive GUI.
An automatic tooth preparation technique: A preliminary study
NASA Astrophysics Data System (ADS)
Yuan, Fusong; Wang, Yong; Zhang, Yaopeng; Sun, Yuchun; Wang, Dangxiao; Lyu, Peijun
2016-04-01
The aim of this study is to validate the feasibility and accuracy of a new automatic tooth preparation technique in dental healthcare. An automatic tooth preparation robotic device with three-dimensional motion planning software was developed, which controlled an ultra-short pulse laser (USPL) beam (wavelength 1,064 nm, pulse width 15 ps, output power 30 W, and repeat frequency rate 100 kHz) to complete the tooth preparation process. A total of 15 freshly extracted human intact first molars were collected and fixed into a phantom head, and the target preparation shapes of these molars were designed using customised computer-aided design (CAD) software. The accuracy of tooth preparation was evaluated using the Geomagic Studio and Imageware software, and the preparing time of each tooth was recorded. Compared with the target preparation shape, the average shape error of the 15 prepared molars was 0.05-0.17 mm, the preparation depth error of the occlusal surface was approximately 0.097 mm, and the error of the convergence angle was approximately 1.0°. The average preparation time was 17 minutes. These results validated the accuracy and feasibility of the automatic tooth preparation technique.
An automatic tooth preparation technique: A preliminary study.
Yuan, Fusong; Wang, Yong; Zhang, Yaopeng; Sun, Yuchun; Wang, Dangxiao; Lyu, Peijun
2016-04-29
The aim of this study is to validate the feasibility and accuracy of a new automatic tooth preparation technique in dental healthcare. An automatic tooth preparation robotic device with three-dimensional motion planning software was developed, which controlled an ultra-short pulse laser (USPL) beam (wavelength 1,064 nm, pulse width 15 ps, output power 30 W, and repeat frequency rate 100 kHz) to complete the tooth preparation process. A total of 15 freshly extracted human intact first molars were collected and fixed into a phantom head, and the target preparation shapes of these molars were designed using customised computer-aided design (CAD) software. The accuracy of tooth preparation was evaluated using the Geomagic Studio and Imageware software, and the preparing time of each tooth was recorded. Compared with the target preparation shape, the average shape error of the 15 prepared molars was 0.05-0.17 mm, the preparation depth error of the occlusal surface was approximately 0.097 mm, and the error of the convergence angle was approximately 1.0°. The average preparation time was 17 minutes. These results validated the accuracy and feasibility of the automatic tooth preparation technique.
DNAGear--a free software for spa type identification in Staphylococcus aureus.
AL-Tam, Faroq; Brunel, Anne-Sophie; Bouzinbi, Nicolas; Corne, Philippe; Bañuls, Anne-Laure; Shahbazkia, Hamid Reza
2012-11-19
Staphylococcus aureus is both human commensal and an important human pathogen, responsible for community-acquired and nosocomial infections ranging from superficial wound infections to invasive infections, such as osteomyelitis, bacteremia and endocarditis, pneumonia or toxin shock syndrome with a mortality rate up to 40%. S. aureus reveals a high genetic polymorphism and detecting the genotypes is extremely useful to manage and prevent possible outbreaks and to understand the route of infection. One of current and expanded typing method is based on the X region of the spa gene composed of a succession of repeats of 21 to 27 bp. More than 10000 types are known. Extracting the repeats is impossible by hand and needs a dedicated software. Unfortunately the only software on the market is a commercial program from Ridom. This article presents DNAGear, a free and open source software with a user friendly interface written all in Java on top of NetBeans Platform to perform spa typing, detecting new repeats and new spa types and synchronizing automatically the files with the open access database. The installation is easy and the application is platform independent. In fact, the SPA identification is a formal regular expression matching problem and the results are 100% exact. As the program is using Java embedded modules written over string manipulation of well established algorithms, the exactitude of the solution is perfectly established. DNAGear is able to identify the types of the S. aureus sequences and detect both new types and repeats. Comparing to manual processing, which is time consuming and error prone, this application saves a lot of time and effort and gives very reliable results. Additionally, the users do not need to prepare the forward-reverse sequences manually, or even by using additional tools. They can simply create them in DNAGear and perform the typing task. In short, researchers who do not have commercial software will benefit a lot from this application.
DNAGear- a free software for spa type identification in Staphylococcus aureus
2012-01-01
Background Staphylococcus aureus is both human commensal and an important human pathogen, responsible for community-acquired and nosocomial infections ranging from superficial wound infections to invasive infections, such as osteomyelitis, bacteremia and endocarditis, pneumonia or toxin shock syndrome with a mortality rate up to 40%. S. aureus reveals a high genetic polymorphism and detecting the genotypes is extremely useful to manage and prevent possible outbreaks and to understand the route of infection. One of current and expanded typing method is based on the X region of the spa gene composed of a succession of repeats of 21 to 27 bp. More than 10000 types are known. Extracting the repeats is impossible by hand and needs a dedicated software. Unfortunately the only software on the market is a commercial program from Ridom. Findings This article presents DNAGear, a free and open source software with a user friendly interface written all in Java on top of NetBeans Platform to perform spa typing, detecting new repeats and new spa types and synchronizing automatically the files with the open access database. The installation is easy and the application is platform independent. In fact, the SPA identification is a formal regular expression matching problem and the results are 100% exact. As the program is using Java embedded modules written over string manipulation of well established algorithms, the exactitude of the solution is perfectly established. Conclusions DNAGear is able to identify the types of the S. aureus sequences and detect both new types and repeats. Comparing to manual processing, which is time consuming and error prone, this application saves a lot of time and effort and gives very reliable results. Additionally, the users do not need to prepare the forward-reverse sequences manually, or even by using additional tools. They can simply create them in DNAGear and perform the typing task. In short, researchers who do not have commercial software will benefit a lot from this application. PMID:23164452
Tracker: Image-Processing and Object-Tracking System Developed
NASA Technical Reports Server (NTRS)
Klimek, Robert B.; Wright, Theodore W.
1999-01-01
Tracker is an object-tracking and image-processing program designed and developed at the NASA Lewis Research Center to help with the analysis of images generated by microgravity combustion and fluid physics experiments. Experiments are often recorded on film or videotape for analysis later. Tracker automates the process of examining each frame of the recorded experiment, performing image-processing operations to bring out the desired detail, and recording the positions of the objects of interest. It can load sequences of images from disk files or acquire images (via a frame grabber) from film transports, videotape, laser disks, or a live camera. Tracker controls the image source to automatically advance to the next frame. It can employ a large array of image-processing operations to enhance the detail of the acquired images and can analyze an arbitrarily large number of objects simultaneously. Several different tracking algorithms are available, including conventional threshold and correlation-based techniques, and more esoteric procedures such as "snake" tracking and automated recognition of character data in the image. The Tracker software was written to be operated by researchers, thus every attempt was made to make the software as user friendly and self-explanatory as possible. Tracker is used by most of the microgravity combustion and fluid physics experiments performed by Lewis, and by visiting researchers. This includes experiments performed on the space shuttles, Mir, sounding rockets, zero-g research airplanes, drop towers, and ground-based laboratories. This software automates the analysis of the flame or liquid s physical parameters such as position, velocity, acceleration, size, shape, intensity characteristics, color, and centroid, as well as a number of other measurements. It can perform these operations on multiple objects simultaneously. Another key feature of Tracker is that it performs optical character recognition (OCR). This feature is useful in extracting numerical instrumentation data that are embedded in images. All the results are saved in files for further data reduction and graphing. There are currently three Tracking Systems (workstations) operating near the laboratories and offices of Lewis Microgravity Science Division researchers. These systems are used independently by students, scientists, and university-based principal investigators. The researchers bring their tapes or films to the workstation and perform the tracking analysis. The resultant data files generated by the tracking process can then be analyzed on the spot, although most of the time researchers prefer to transfer them via the network to their offices for further analysis or plotting. In addition, many researchers have installed Tracker on computers in their office for desktop analysis of digital image sequences, which can be digitized by the Tracking System or some other means. Tracker has not only provided a capability to efficiently and automatically analyze large volumes of data, saving many hours of tedious work, but has also provided new capabilities to extract valuable information and phenomena that was heretofore undetected and unexploited.
Shen, Elizabeth P; Chen, Wei-Li; Hu, Fung-Rong
2010-03-01
To compare the efficacy and safety of manual limbal markings and wavefront-guided treatment with iris-registration software in laser in situ keratomileusis (LASIK) for myopic astigmatism. National Taiwan University Hospital, Taipei, Taiwan. Eyes with myopic astigmatism had LASIK with a Technolas 217z laser. Eyes in the limbal-marking group had conventional LASIK (PlanoScan or Zyoptix tissue-saving algorithm) with manual cyclotorsional-error adjustments according to 2 limbal marks. Eyes in the iris-registration group had wavefront-guided ablation (Zyoptix) in which cyclotorsional errors were automatically detected and adjusted. Refraction, corneal topography, and visual acuity data were compared between groups. Vector analysis was by the Alpins method. The mean preoperative spherical equivalent (SE) was -6.64 diopters (D) +/- 1.99 (SD) in the limbal-marking group and -6.72 +/- 1.86 D in the iris-registration group (P = .92). At 6 months, the mean SE was -0.42 +/- 0.63 D and -0.47 +/- 0.62 D, respectively (P = .08). There was no statistically significant difference between groups in the astigmatism correction, success, or flattening index values using 6-month postoperative refractive data. The angle of error was within +/-10 degrees in 73% of eyes in the limbal-marking group and 75% of eyes in the iris-registration group. Manual limbal markings and iris-registration software were equally effective and safe in LASIK for myopic astigmatism, showing that checking cyclotorsion by manual limbal markings is a safe alternative when automated systems are not available. Copyright 2010 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Research on Generating Method of Embedded Software Test Document Based on Dynamic Model
NASA Astrophysics Data System (ADS)
Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Liu, Ying
2018-03-01
This paper provides a dynamic model-based test document generation method for embedded software that provides automatic generation of two documents: test requirements specification documentation and configuration item test documentation. This method enables dynamic test requirements to be implemented in dynamic models, enabling dynamic test demand tracking to be easily generated; able to automatically generate standardized, standardized test requirements and test documentation, improved document-related content inconsistency and lack of integrity And other issues, improve the efficiency.
R-189 (C-620) air compressor control logic software documentation. Revision 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walter, K.E.
1995-06-08
This relates to FFTF plant air compressors. Purpose of this document is to provide an updated Computer Software Description for the software to be used on R-189 (C-620-C) air compressor programmable controllers. Logic software design changes were required to allow automatic starting of a compressor that had not been previously started.
Automatic Generation of Just-in-Time Online Assessments from Software Design Models
ERIC Educational Resources Information Center
Zualkernan, Imran A.; El-Naaj, Salim Abou; Papadopoulos, Maria; Al-Amoudi, Budoor K.; Matthews, Charles E.
2009-01-01
Computer software is pervasive in today's society. The rate at which new versions of computer software products are released is phenomenal when compared to the release rate of new products in traditional industries such as aircraft building. This rapid rate of change can partially explain why most certifications in the software industry are…
Sampling theory and automated simulations for vertical sections, applied to human brain.
Cruz-Orive, L M; Gelšvartas, J; Roberts, N
2014-02-01
In recent years, there have been substantial developments in both magnetic resonance imaging techniques and automatic image analysis software. The purpose of this paper is to develop stereological image sampling theory (i.e. unbiased sampling rules) that can be used by image analysts for estimating geometric quantities such as surface area and volume, and to illustrate its implementation. The methods will ideally be applied automatically on segmented, properly sampled 2D images - although convenient manual application is always an option - and they are of wide applicability in many disciplines. In particular, the vertical sections design to estimate surface area is described in detail and applied to estimate the area of the pial surface and of the boundary between cortex and underlying white matter (i.e. subcortical surface area). For completeness, cortical volume and mean cortical thickness are also estimated. The aforementioned surfaces were triangulated in 3D with the aid of FreeSurfer software, which provided accurate surface area measures that served as gold standards. Furthermore, a software was developed to produce digitized trace curves of the triangulated target surfaces automatically from virtual sections. From such traces, a new method (called the 'lambda method') is presented to estimate surface area automatically. In addition, with the new software, intersections could be counted automatically between the relevant surface traces and a cycloid test grid for the classical design. This capability, together with the aforementioned gold standard, enabled us to thoroughly check the performance and the variability of the different estimators by Monte Carlo simulations for studying the human brain. In particular, new methods are offered to split the total error variance into the orientations, sectioning and cycloid components. The latter prediction was hitherto unavailable--one is proposed here and checked by way of simulations on a given set of digitized vertical sections with automatically superimposed cycloid grids of three different sizes. Concrete and detailed recommendations are given to implement the methods. © 2013 The Authors Journal of Microscopy © 2013 Royal Microscopical Society.
Long-Run Savings and Investment Strategy Optimization
Gerrard, Russell; Guillén, Montserrat; Pérez-Marín, Ana M.
2014-01-01
We focus on automatic strategies to optimize life cycle savings and investment. Classical optimal savings theory establishes that, given the level of risk aversion, a saver would keep the same relative amount invested in risky assets at any given time. We show that, when optimizing lifecycle investment, performance and risk assessment have to take into account the investor's risk aversion and the maximum amount the investor could lose, simultaneously. When risk aversion and maximum possible loss are considered jointly, an optimal savings strategy is obtained, which follows from constant rather than relative absolute risk aversion. This result is fundamental to prove that if risk aversion and the maximum possible loss are both high, then holding a constant amount invested in the risky asset is optimal for a standard lifetime saving/pension process and outperforms some other simple strategies. Performance comparisons are based on downside risk-adjusted equivalence that is used in our illustration. PMID:24711728
Long-run savings and investment strategy optimization.
Gerrard, Russell; Guillén, Montserrat; Nielsen, Jens Perch; Pérez-Marín, Ana M
2014-01-01
We focus on automatic strategies to optimize life cycle savings and investment. Classical optimal savings theory establishes that, given the level of risk aversion, a saver would keep the same relative amount invested in risky assets at any given time. We show that, when optimizing lifecycle investment, performance and risk assessment have to take into account the investor's risk aversion and the maximum amount the investor could lose, simultaneously. When risk aversion and maximum possible loss are considered jointly, an optimal savings strategy is obtained, which follows from constant rather than relative absolute risk aversion. This result is fundamental to prove that if risk aversion and the maximum possible loss are both high, then holding a constant amount invested in the risky asset is optimal for a standard lifetime saving/pension process and outperforms some other simple strategies. Performance comparisons are based on downside risk-adjusted equivalence that is used in our illustration.
A Survey of Automatic Code Generating Software
1988-09-01
11189 Cincinnati, OH 45211 513-662-2300 69 LIST OF REFERENCES 1. Boehm, Barry W., "Software and Its Impact: A Quantita- tive Assessment," Daamtin, Vol...Decision SuDvort Systems: An Organizational Perspective, pp. 11- 12, Addison-Wesley Publishing Company, Inc., 1978. 6. Pressman , Roger S., Software
Development of Data Processing Software for NBI Spectroscopic Analysis System
NASA Astrophysics Data System (ADS)
Zhang, Xiaodan; Hu, Chundong; Sheng, Peng; Zhao, Yuanzhe; Wu, Deyun; Cui, Qinglong
2015-04-01
A set of data processing software is presented in this paper for processing NBI spectroscopic data. For better and more scientific managment and querying these data, they are managed uniformly by the NBI data server. The data processing software offers the functions of uploading beam spectral original and analytic data to the data server manually and automatically, querying and downloading all the NBI data, as well as dealing with local LZO data. The set software is composed of a server program and a client program. The server software is programmed in C/C++ under a CentOS development environment. The client software is developed under a VC 6.0 platform, which offers convenient operational human interfaces. The network communications between the server and the client are based on TCP. With the help of this set software, the NBI spectroscopic analysis system realizes the unattended automatic operation, and the clear interface also makes it much more convenient to offer beam intensity distribution data and beam power data to operators for operation decision-making. supported by National Natural Science Foundation of China (No. 11075183), the Chinese Academy of Sciences Knowledge Innovation
Klatte, J Michael; Selvarangan, Rangaraj; Jackson, Mary Anne; Myers, Angela L
2016-01-01
Study objectives included addressing overuse of Clostridium difficile laboratory testing by decreasing submission rates of nondiarrheal stool specimens and specimens from children ≤12 months of age and determining resultant patient and laboratory cost savings associated with decreased testing. A multifaceted initiative was developed, and components included multiple provider education methods, computerized order entry modifications, and automatic declination from laboratory on testing stool specimens of nondiarrheal consistency and from children ≤12 months old. A run chart, demonstrating numbers of nondiarrheal plus infant stool specimens submitted over time, was developed to analyze the initiative's impact on clinicians' test-ordering practices. A p-chart was generated to evaluate the percentage of these submitted specimens tested biweekly over a 12-month period. Cost savings for patients and the laboratory were assessed at the study period's conclusion. Run chart analysis revealed an initial shift after the interventions, suggesting a temporary decrease in testing submission; however, no sustained differences in numbers of specimens submitted biweekly were observed over time. On the p-chart, the mean percentage of specimens tested before the intervention was 100%. After the intervention, the average percentage of specimens tested dropped to 53.8%. Resultant laboratory cost savings totaled nearly $3600, and patient savings on testing charges were ∼$32 000. Automatic laboratory declination of nondiarrheal stools submitted for CDI testing resulted in a sustained decrease in the number of specimens tested, resulting in significant laboratory and patient cost savings. Despite multiple educational efforts, no sustained changes in physician ordering practices were observed. Copyright © 2016 by the American Academy of Pediatrics.
Automatic generation of randomized trial sequences for priming experiments.
Ihrke, Matthias; Behrendt, Jörg
2011-01-01
In most psychological experiments, a randomized presentation of successive displays is crucial for the validity of the results. For some paradigms, this is not a trivial issue because trials are interdependent, e.g., priming paradigms. We present a software that automatically generates optimized trial sequences for (negative-) priming experiments. Our implementation is based on an optimization heuristic known as genetic algorithms that allows for an intuitive interpretation due to its similarity to natural evolution. The program features a graphical user interface that allows the user to generate trial sequences and to interactively improve them. The software is based on freely available software and is released under the GNU General Public License.
The AST3 controlling and operating software suite for automatic sky survey
NASA Astrophysics Data System (ADS)
Hu, Yi; Shang, Zhaohui; Ma, Bin; Hu, Keliang
2016-07-01
We have developed a specialized software package, called ast3suite, to achieve the remote control and automatic sky survey for AST3 (Antarctic Survey Telescope) from scratch. It includes several daemon servers and many basic commands. Each program does only one single task, and they work together to make AST3 a robotic telescope. A survey script calls basic commands to carry out automatic sky survey. Ast3suite was carefully tested in Mohe, China in 2013 and has been used at Dome, Antarctica in 2015 and 2016 with the real hardware for practical sky survey. Both test results and practical using showed that ast3suite had worked very well without any manual auxiliary as we expected.
Astrometrica: Astrometric data reduction of CCD images
NASA Astrophysics Data System (ADS)
Raab, Herbert
2012-03-01
Astrometrica is an interactive software tool for scientific grade astrometric data reduction of CCD images. The current version of the software is for the Windows 32bit operating system family. Astrometrica reads FITS (8, 16 and 32 bit integer files) and SBIG image files. The size of the images is limited only by available memory. It also offers automatic image calibration (Dark Frame and Flat Field correction), automatic reference star identification, automatic moving object detection and identification, and access to new-generation star catalogs (PPMXL, UCAC 3 and CMC-14), in addition to online help and other features. Astrometrica is shareware, available for use for a limited period of time (100 days) for free; special arrangements can be made for educational projects.
StochKit2: software for discrete stochastic simulation of biochemical systems with events.
Sanft, Kevin R; Wu, Sheng; Roh, Min; Fu, Jin; Lim, Rone Kwei; Petzold, Linda R
2011-09-01
StochKit2 is the first major upgrade of the popular StochKit stochastic simulation software package. StochKit2 provides highly efficient implementations of several variants of Gillespie's stochastic simulation algorithm (SSA), and tau-leaping with automatic step size selection. StochKit2 features include automatic selection of the optimal SSA method based on model properties, event handling, and automatic parallelism on multicore architectures. The underlying structure of the code has been completely updated to provide a flexible framework for extending its functionality. StochKit2 runs on Linux/Unix, Mac OS X and Windows. It is freely available under GPL version 3 and can be downloaded from http://sourceforge.net/projects/stochkit/. petzold@engineering.ucsb.edu.
Source Lines Counter (SLiC) Version 4.0
NASA Technical Reports Server (NTRS)
Monson, Erik W.; Smith, Kevin A.; Newport, Brian J.; Gostelow, Roli D.; Hihn, Jairus M.; Kandt, Ronald K.
2011-01-01
Source Lines Counter (SLiC) is a software utility designed to measure software source code size using logical source statements and other common measures for 22 of the programming languages commonly used at NASA and the aerospace industry. Such metrics can be used in a wide variety of applications, from parametric cost estimation to software defect analysis. SLiC has a variety of unique features such as automatic code search, automatic file detection, hierarchical directory totals, and spreadsheet-compatible output. SLiC was written for extensibility; new programming language support can be added with minimal effort in a short amount of time. SLiC runs on a variety of platforms including UNIX, Windows, and Mac OSX. Its straightforward command-line interface allows for customization and incorporation into the software build process for tracking development metrics. T
Selvester scoring in patients with strict LBBB using the QUARESS software.
Xia, Xiaojuan; Chaudhry, Uzma; Wieslander, Björn; Borgquist, Rasmus; Wagner, Galen S; Strauss, David G; Platonov, Pyotr; Ugander, Martin; Couderc, Jean-Philippe
2015-01-01
Estimation of the infarct size from body-surface ECGs in post-myocardial infarction patients has become possible using the Selvester scoring method. Automation of this scoring has been proposed in order to speed-up the measurement of the score and improving the inter-observer variability in computing a score that requires strong expertise in electrocardiography. In this work, we evaluated the quality of the QuAReSS software for delivering correct Selvester scoring in a set of standard 12-lead ECGs. Standard 12-lead ECGs were recorded in 105 post-MI patients prescribed implantation of an implantable cardiodefibrillator (ICD). Amongst the 105 patients with standard clinical left bundle branch block (LBBB) patterns, 67 had a LBBB pattern meeting the strict criteria. The QuAReSS software was applied to these 67 tracings by two independent groups of cardiologists (from a clinical group and an ECG core laboratory) to measure the Selvester score semi-automatically. Using various level of agreement metrics, we compared the scores between groups and when automatically measured by the software. The average of the absolute difference in Selvester scores measured by the two independent groups was 1.4±1.5 score points, whereas the difference between automatic method and the two manual adjudications were 1.2±1.2 and 1.3±1.2 points. Eighty-two percent score agreement was observed between the two independent measurements when the difference of score was within two point ranges, while 90% and 84% score agreements were reached using the automatic method compared to the two manual adjudications. The study confirms that the QuAReSS software provides valid measurements of the Selvester score in patients with strict LBBB with minimal correction from cardiologists. Copyright © 2015 Elsevier Inc. All rights reserved.
AIRSAR Web-Based Data Processing
NASA Technical Reports Server (NTRS)
Chu, Anhua; Van Zyl, Jakob; Kim, Yunjin; Hensley, Scott; Lou, Yunling; Madsen, Soren; Chapman, Bruce; Imel, David; Durden, Stephen; Tung, Wayne
2007-01-01
The AIRSAR automated, Web-based data processing and distribution system is an integrated, end-to-end synthetic aperture radar (SAR) processing system. Designed to function under limited resources and rigorous demands, AIRSAR eliminates operational errors and provides for paperless archiving. Also, it provides a yearly tune-up of the processor on flight missions, as well as quality assurance with new radar modes and anomalous data compensation. The software fully integrates a Web-based SAR data-user request subsystem, a data processing system to automatically generate co-registered multi-frequency images from both polarimetric and interferometric data collection modes in 80/40/20 MHz bandwidth, an automated verification quality assurance subsystem, and an automatic data distribution system for use in the remote-sensor community. Features include Survey Automation Processing in which the software can automatically generate a quick-look image from an entire 90-GB SAR raw data 32-MB/s tape overnight without operator intervention. Also, the software allows product ordering and distribution via a Web-based user request system. To make AIRSAR more user friendly, it has been designed to let users search by entering the desired mission flight line (Missions Searching), or to search for any mission flight line by entering the desired latitude and longitude (Map Searching). For precision image automation processing, the software generates the products according to each data processing request stored in the database via a Queue management system. Users are able to have automatic generation of coregistered multi-frequency images as the software generates polarimetric and/or interferometric SAR data processing in ground and/or slant projection according to user processing requests for one of the 12 radar modes.
Finding Savings in Community Use of Schools
ERIC Educational Resources Information Center
Gandy, Julia
2013-01-01
This article reports on the growing challenge of managing community groups using educational facilities for meetings, athletics, and special events. It describes how, by using an online scheduling software program, one school district was able to track payments and save time and money with its event and facility scheduling process.
Clothes Dryer Automatic Termination Evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
TeGrotenhuis, Ward E.
Volume 2: Improved Sensor and Control Designs Many residential clothes dryers on the market today provide automatic cycles that are intended to stop when the clothes are dry, as determined by the final remaining moisture content (RMC). However, testing of automatic termination cycles has shown that many dryers are susceptible to over-drying of loads, leading to excess energy consumption. In particular, tests performed using the DOE Test Procedure in Appendix D2 of 10 CFR 430 subpart B have shown that as much as 62% of the energy used in a cycle may be from over-drying. Volume 1 of this reportmore » shows an average of 20% excess energy from over-drying when running automatic cycles with various load compositions and dryer settings. Consequently, improving automatic termination sensors and algorithms has the potential for substantial energy savings in the U.S.« less
Gaburro, Julie; Duchemin, Jean-Bernard; Paradkar, Prasad N; Nahavandi, Saeid; Bhatti, Asim
2016-11-18
Widespread in the tropics, the mosquito Aedes aegypti is an important vector of many viruses, posing a significant threat to human health. Vector monitoring often requires fecundity estimation by counting eggs laid by female mosquitoes. Traditionally, manual data analyses have been used but this requires a lot of effort and is the methods are prone to errors. An easy tool to assess the number of eggs laid would facilitate experimentation and vector control operations. This study introduces a built-in software called ICount allowing automatic egg counting of the mosquito vector, Aedes aegypti. ICount egg estimation compared to manual counting is statistically equivalent, making the software effective for automatic and semi-automatic data analysis. This technique also allows rapid analysis compared to manual methods. Finally, the software has been used to assess p-cresol oviposition choices under laboratory conditions in order to test the system with different egg densities. ICount is a powerful tool for fast and precise egg count analysis, freeing experimenters from manual data processing. Software access is free and its user-friendly interface allows easy use by non-experts. Its efficiency has been tested in our laboratory with oviposition dual choices of Aedes aegypti females. The next step will be the development of a mobile application, based on the ICount platform, for vector monitoring surveys in the field.
NASA Astrophysics Data System (ADS)
Boyarnikov, A. V.; Boyarnikova, L. V.; Kozhushko, A. A.; Sekachev, A. F.
2017-08-01
In the article the process of verification (calibration) of oil metering units secondary equipment is considered. The purpose of the work is to increase the reliability and reduce the complexity of this process by developing a software and hardware system that provides automated verification and calibration. The hardware part of this complex carries out the commutation of the measuring channels of the verified controller and the reference channels of the calibrator in accordance with the introduced algorithm. The developed software allows controlling the commutation of channels, setting values on the calibrator, reading the measured data from the controller, calculating errors and compiling protocols. This system can be used for checking the controllers of the secondary equipment of the oil metering units in the automatic verification mode (with the open communication protocol) or in the semi-automatic verification mode (without it). The peculiar feature of the approach used is the development of a universal signal switch operating under software control, which can be configured for various verification methods (calibration), which allows to cover the entire range of controllers of metering units secondary equipment. The use of automatic verification with the help of a hardware and software system allows to shorten the verification time by 5-10 times and to increase the reliability of measurements, excluding the influence of the human factor.
NASA Technical Reports Server (NTRS)
2015-01-01
Topics covered include: 3D Endoscope to Boost Safety, Cut Cost of Surgery; Audio App Brings a Better Night's Sleep Liquid Cooling Technology Increases Exercise Efficiency; Algae-Derived Dietary Ingredients Nourish Animals; Space Grant Research Launches Rehabilitation Chair; Vision Trainer Teaches Focusing Techniques at Home; Aircraft Geared Architecture Reduces Fuel Cost and Noise; Ubiquitous Supercritical Wing Design Cuts Billions in Fuel Costs; Flight Controller Software Protects Lightweight Flexible Aircraft; Cabin Pressure Monitors Notify Pilots to Save Lives; Ionospheric Mapping Software Ensures Accuracy of Pilots' GPS; Water Mapping Technology Rebuilds Lives in Arid Regions; Shock Absorbers Save Structures and Lives during Earthquakes; Software Facilitates Sharing of Water Quality Data Worldwide; Underwater Adhesives Retrofit Pipelines with Advanced Sensors; Laser Imaging Video Camera Sees through Fire, Fog, Smoke; 3D Lasers Increase Efficiency, Safety of Moving Machines; Air Revitalization System Enables Excursions to the Stratosphere; Magnetic Fluids Deliver Better Speaker Sound Quality; Bioreactor Yields Extracts for Skin Cream; Private Astronaut Training Prepares Commercial Crews of Tomorrow; Activity Monitors Help Users Get Optimum Sun Exposure; LEDs Illuminate Bulbs for Better Sleep, Wake Cycles; Charged Particles Kill Pathogens and Round Up Dust; Balance Devices Train Golfers for a Consistent Swing; Landsat Imagery Enables Global Studies of Surface Trends; Ruggedized Spectrometers Are Built for Tough Jobs; Gas Conversion Systems Reclaim Fuel for Industry; Remote Sensing Technologies Mitigate Drought; Satellite Data Inform Forecasts of Crop Growth; Probes Measure Gases for Environmental Research; Cloud Computing Technologies Facilitate Earth Research; Software Cuts Homebuilding Costs, Increases Energy Efficiency; Portable Planetariums Teach Science; Schedule Analysis Software Saves Time for Project Planners; Sound Modeling Simplifies Vehicle Noise Management; Custom 3D Printers Revolutionize Space Supply Chain; Improved Calibration Shows Images' True Colors; Micromachined Parts Advance Medicine, Astrophysics, and More; Metalworking Techniques Unlock a Unique Alloy; Low-Cost Sensors Deliver Nanometer-Accurate Measurements; Electrical Monitoring Devices Save on Time and Cost; Dry Lubricant Smooths the Way for Space Travel, Industry; and Compact Vapor Chamber Cools Critical Components.
NASA Technical Reports Server (NTRS)
Simmons, D. B.; Marchbanks, M. P., Jr.; Quick, M. J.
1982-01-01
The results of an effort to thoroughly and objectively analyze the statistical and historical information gathered during the development of the Shuttle Orbiter Primary Flight Software are given. The particular areas of interest include cost of the software, reliability of the software, requirements for the software and how the requirements changed during development of the system. Data related to the current version of the software system produced some interesting results. Suggestions are made for the saving of additional data which will allow additional investigation.
Design of a real-time tax-data monitoring intelligent card system
NASA Astrophysics Data System (ADS)
Gu, Yajun; Bi, Guotang; Chen, Liwei; Wang, Zhiyuan
2009-07-01
To solve the current problem of low efficiency of domestic Oil Station's information management, Oil Station's realtime tax data monitoring system has been developed to automatically access tax data of Oil pumping machines, realizing Oil-pumping machines' real-time automatic data collection, displaying and saving. The monitoring system uses the noncontact intelligent card or network to directly collect data which can not be artificially modified and so seals the loopholes and improves the tax collection's automatic level. It can perform real-time collection and management of the Oil Station information, and find the problem promptly, achieves the automatic management for the entire process covering Oil sales accounting and reporting. It can also perform remote query to the Oil Station's operation data. This system has broad application future and economic value.
Foam-Mixing-And-Dispensing Machine
NASA Technical Reports Server (NTRS)
Chong, Keith Y.; Toombs, Gordon R.; Jackson, Richard J.
1996-01-01
Time-and-money-saving machine produces consistent, homogeneously mixed foam, enhancing production efficiency. Automatically mixes and dispenses polyurethane foam in quantities specified by weight. Consists of cart-mounted, air-driven proportioning unit; air-activated mechanical mixing gun; programmable timer/counter, and controller.
Software for marine ecological environment comprehensive monitoring system based on MCGS
NASA Astrophysics Data System (ADS)
Wang, X. H.; Ma, R.; Cao, X.; Cao, L.; Chu, D. Z.; Zhang, L.; Zhang, T. P.
2017-08-01
The automatic integrated monitoring software for marine ecological environment based on MCGS configuration software is designed and developed to realize real-time automatic monitoring of many marine ecological parameters. The DTU data transmission terminal performs network communication and transmits the data to the user data center in a timely manner. The software adopts the modular design and has the advantages of stable and flexible data structure, strong portability and scalability, clear interface, simple user operation and convenient maintenance. Continuous site comparison test of 6 months showed that, the relative error of the parameters monitored by the system such as temperature, salinity, turbidity, pH, dissolved oxygen was controlled within 5% with the standard method and the relative error of the nutrient parameters was within 15%. Meanwhile, the system had few maintenance times, low failure rate, stable and efficient continuous monitoring capabilities. The field application shows that the software is stable and the data communication is reliable, and it has a good application prospect in the field of marine ecological environment comprehensive monitoring.
Automatic Rotational Sky Quality Meter (R-SQM) Design and Software for Astronomical Observatories
NASA Astrophysics Data System (ADS)
Dogan, E.; Ozbaldan, E. E.; Shameoni, Niaei M.; Yesilyaprak, C.
2016-12-01
We have presented the new design of Sky Quality Meter (SQM) device that is an automatic rotational model of sky quality meter (R-SQM) carried out by DAG (Eastern Anatolia Observatory) Technical Team. R-SQM is required for determining the long-term changes of sky quality of an astronomical observatory and consists of four SQM devices mounted on a rotating shaft with different angles for scanning all sky. This system is controlled by a Raspberry Pi control card and a step motor with its driver and a special software.
Periodic, On-Demand, and User-Specified Information Reconciliation
NASA Technical Reports Server (NTRS)
Kolano, Paul
2007-01-01
Automated sequence generation (autogen) signifies both a process and software used to automatically generate sequences of commands to operate various spacecraft. Autogen requires fewer workers than are needed for older manual sequence-generation processes and reduces sequence-generation times from weeks to minutes. The autogen software comprises the autogen script plus the Activity Plan Generator (APGEN) program. APGEN can be used for planning missions and command sequences. APGEN includes a graphical user interface that facilitates scheduling of activities on a time line and affords a capability to automatically expand, decompose, and schedule activities.
NASA Technical Reports Server (NTRS)
1994-01-01
An aerial color infrared (CIR) mapping system developed by Kennedy Space Center enables Florida's Charlotte County to accurately appraise its citrus groves while reducing appraisal costs. The technology was further advanced by development of a dual video system making it possible to simultaneously view images of the same area and detect changes. An image analysis system automatically surveys and photo interprets grove images as well as automatically counts trees and reports totals. The system, which saves both time and money, has potential beyond citrus grove valuation.
Improving Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2012-02-01
New test procedure evaluates quality and accuracy of energy analysis tools for the residential building retrofit market. Reducing the energy use of existing homes in the United States offers significant energy-saving opportunities, which can be identified through building simulation software tools that calculate optimal packages of efficiency measures. To improve the accuracy of energy analysis for residential buildings, the National Renewable Energy Laboratory's (NREL) Buildings Research team developed the Building Energy Simulation Test for Existing Homes (BESTEST-EX), a method for diagnosing and correcting errors in building energy audit software and calibration procedures. BESTEST-EX consists of building physics and utility billmore » calibration test cases, which software developers can use to compare their tools simulation findings to reference results generated with state-of-the-art simulation tools. Overall, the BESTEST-EX methodology: (1) Tests software predictions of retrofit energy savings in existing homes; (2) Ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) Quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX is helping software developers identify and correct bugs in their software, as well as develop and test utility bill calibration procedures.« less
NREL Improves Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2012-01-01
This technical highlight describes NREL research to develop Building Energy Simulation Test for Existing Homes (BESTEST-EX) to increase the quality and accuracy of energy analysis tools for the building retrofit market. Researchers at the National Renewable Energy Laboratory (NREL) have developed a new test procedure to increase the quality and accuracy of energy analysis tools for the building retrofit market. The Building Energy Simulation Test for Existing Homes (BESTEST-EX) is a test procedure that enables software developers to evaluate the performance of their audit tools in modeling energy use and savings in existing homes when utility bills are available formore » model calibration. Similar to NREL's previous energy analysis tests, such as HERS BESTEST and other BESTEST suites included in ANSI/ASHRAE Standard 140, BESTEST-EX compares software simulation findings to reference results generated with state-of-the-art simulation tools such as EnergyPlus, SUNREL, and DOE-2.1E. The BESTEST-EX methodology: (1) Tests software predictions of retrofit energy savings in existing homes; (2) Ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) Quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX includes building physics and utility bill calibration test cases. The diagram illustrates the utility bill calibration test cases. Participants are given input ranges and synthetic utility bills. Software tools use the utility bills to calibrate key model inputs and predict energy savings for the retrofit cases. Participant energy savings predictions using calibrated models are compared to NREL predictions using state-of-the-art building energy simulation programs.« less
NASA Astrophysics Data System (ADS)
Sagnotti, Leonardo
2013-04-01
Modern rock magnetometers and stepwise demagnetization procedures result in the production of large datasets, which need a versatile and fast software for their display and analysis. Various software packages for paleomagnetic analyses have been recently developed to overcome the problems linked to the limited capability and the loss of operability of early codes written in obsolete computer languages and/or platforms, not compatible with modern 64 bit processors. The Demagnetization Analysis in Excel (DAIE) workbook is a new software that has been designed to make the analysis of demagnetization data easy and accessible on an application (Microsoft Excel) widely diffused and available on both the Microsoft Windows and Mac OS X operating systems. The widespread diffusion of Excel should guarantee a long term working life, since compatibility and functionality of current Excel files should be most likely maintained during the development of new processors and operating systems. DAIE is designed for viewing and analyzing stepwise demagnetization data of both discrete and u-channel samples. DAIE consists of a single file and has an open modular structure organized in 10 distinct worksheets. The standard demagnetization diagrams and various parameters of common use are shown on the same worksheet including selectable parameters and user's choices. The remanence characteristic components may be computed by principal component analysis (PCA) on a selected interval of demagnetization steps. Saving of the PCA data can be done both sample by sample, or in automatic by applying the selected choices to all the samples included in the file. The DAIE open structure allows easy personalization, development and improvement. The workbook has the following features which may be valuable for various users: - Operability in nearly all the computers and platforms; - Easy inputs of demagnetization data by "copy and paste" from ASCII files; - Easy export of computed parameters and demagnetization plots; - Complete control of the whole workflow and possibility of implementation of the workbook by any user; - Modular structure in distinct worksheets for each type of analyses and plots, in order to make implementation and personalization easier; - Opportunity to use the workbook for educational purposes, since all the computations and analyses are easily traceable and accessible; - Automatic and fast analysis of a large batch of demagnetization data, such as those measured on u-channel samples. The DAIE workbook and the "User manual" are available for download on a dedicated web site (http://roma2.rm.ingv.it/en/facilities/software/49/daie).
EMT-defibrillation: a recipe for saving lives.
Paris, P M
1988-05-01
Sudden cardiac death is the number-one cause of death in this country. It has long been known that most of these deaths occur outside of the hospital, therefore necessitating an approach to the problem involving prehospital care. The development of advanced life support emergency medical systems has had a dramatic impact on improving survival in selected communities. Most of the country continues to see little result because of our inability to provide timely defibrillation. Automatic external defibrillators now provide a safe, reliable, proven method to increase the number of "saves" in rural, urban, and suburban communities. This new tool, if widely used, will allow us to save scores of "hearts too good to die."
US Army Proposed Automatic Test Equipment Software Development and Support Facility.
1982-10-29
programs would be prepared as weapon and prime system operating software. The ATE Software Development and Support Facility will help prevent the TPS...
Integrated Functional and Executional Modelling of Software Using Web-Based Databases
NASA Technical Reports Server (NTRS)
Kulkarni, Deepak; Marietta, Roberta
1998-01-01
NASA's software subsystems undergo extensive modification and updates over their operational lifetimes, and it is imperative that the modified software continue to satisfy safety goals. This report discusses the difficulties encountered in doing so and presents a solution based on integrated modelling of software, automatic information extraction tools, web technology, and databases. To appear in the Journal of Database Management.
Tang, Wei; Peled, Noam; Vallejo, Deborah I.; Borzello, Mia; Dougherty, Darin D.; Eskandar, Emad N.; Widge, Alik S.; Cash, Sydney S.; Stufflebeam, Steven M.
2018-01-01
Purpose Existing methods for sorting, labeling, registering, and across-subject localization of electrodes in intracranial electroencephalography (iEEG) may involve laborious work requiring manual inspection of radiological images. Methods We describe a new open-source software package, the interactive electrode localization utility, which presents a full pipeline for the registration, localization, and labeling of iEEG electrodes from CT and MR images. In addition, we describe a method to automatically sort and label electrodes from subdural grids of known geometry. Results We validated our software against manual inspection methods in twelve subjects undergoing iEEG for medically intractable epilepsy. Our algorithm for sorting and labeling correctly identified 96% of the electrodes. Conclusions The sorting and labeling methods we describe offer nearly perfect performance, and the software package we have distributed may simplify the process of registering, sorting, labeling, and localizing subdural iEEG grid electrodes that is otherwise performed by manual inspection. PMID:27915398
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mittal, Sparsh; Zhang, Zhao
With each CMOS technology generation, leakage energy consumption has increased dramatically, and hence managing the leakage power of large last-level caches (LLCs) has become a critical issue in modern processor design. In this paper, we present EnCache, a novel software-based technique which uses dynamic profiling-based cache reconfiguration for saving cache leakage energy. EnCache uses a simple hardware component called a profiling cache, which dynamically predicts the energy efficiency of an application for 32 possible cache configurations. Using these estimates, system software reconfigures the cache to the most energy-efficient configuration. Because EnCache uses dynamic cache reconfiguration, it does not require offline profiling or per-application parameter tuning. Furthermore, EnCache optimizes directly for the energy efficiency of the overall memory subsystem (LLC and main memory) instead of the LLC alone. Experiments performed with an x86-64 simulator and workloads from the SPEC2006 suite confirm that EnCache provides larger energy savings than a conventional energy saving scheme. For single-core and dual-core system configurations, the average savings in memory subsystem energy over a shared baseline configuration are 30.0% and 27.3%, respectively.
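A minimal sketch of the selection step described above, assuming the profiling cache has already produced per-configuration energy estimates; the configuration names and the estimate source are illustrative, not EnCache's actual interface.

```python
# Each LLC configuration is (associativity, fraction of active ways); the
# profiling cache is assumed to have predicted total memory-subsystem
# energy (J) for every candidate over the last profiling interval.
def most_efficient_config(predicted_energy):
    """Return the configuration with the lowest predicted energy."""
    return min(predicted_energy, key=predicted_energy.get)

predicted_energy = {("8-way", 1.0): 4.1, ("8-way", 0.5): 3.2,
                    ("4-way", 0.5): 3.5, ("2-way", 0.25): 3.9}  # stand-ins
best = most_efficient_config(predicted_energy)
# System software would now reconfigure the LLC to `best`, flushing or
# migrating lines from the ways about to be power-gated.
print(best)
```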
Design and Realization of Controllable Ultrasonic Fault Detector Automatic Verification System
NASA Astrophysics Data System (ADS)
Sun, Jing-Feng; Liu, Hui-Ying; Guo, Hui-Juan; Shu, Rong; Wei, Kai-Li
Ultrasonic flaw detection equipment with a remote-control interface was investigated and an automatic verification system for it was developed. By using the extensible markup language (XML) to build the instruction-set agreement and the database of data-analysis methods, the system software achieves a controllable design and copes with the diversity of undisclosed device interfaces and protocols. By cascading a signal generator with a fixed attenuator, a dynamic error-compensation method is proposed that performs the role of the fixed attenuator in traditional verification and improves the accuracy of verification results. Operating results of the automatic verification system confirm the feasibility of the hardware and software architecture and the correctness of the analysis method, while replacing the cumbersome operations of the traditional verification process and reducing the labor intensity for test personnel.
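The abstract does not publish the XML schema, but a tiny sketch conveys the general idea of an XML instruction-set agreement: abstract verification steps are mapped to device-specific command strings, so a new, undisclosed protocol only requires a new XML file. All tag, attribute and command names below are invented for illustration.

```python
import xml.etree.ElementTree as ET

# Hypothetical instruction-set agreement for one flaw-detector model.
AGREEMENT = """
<instrument name="flaw-detector-A">
  <command step="set_gain"  format="GAIN {db:.1f}"/>
  <command step="read_echo" format="READ?"/>
</instrument>
"""

def build_instruction(xml_text, step, **params):
    """Look up a verification step and fill in its parameters."""
    root = ET.fromstring(xml_text)
    for cmd in root.iter("command"):
        if cmd.get("step") == step:
            return cmd.get("format").format(**params)
    raise KeyError(f"step not defined in agreement: {step}")

print(build_instruction(AGREEMENT, "set_gain", db=24.0))  # -> "GAIN 24.0"
```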
Analog Input Data Acquisition Software
NASA Technical Reports Server (NTRS)
Arens, Ellen
2009-01-01
DAQ Master Software allows users to easily set up a system to monitor up to five analog input channels and save the data after acquisition. This program was written in LabVIEW 8.0, and requires the LabVIEW runtime engine 8.0 to run the executable.
Mason, R.R.; Hill, C.L.
1988-01-01
The U.S. Geological Survey has developed software that interfaces with the Automated Data Processing System to facilitate and expedite preparation of the annual water-resources data report. This software incorporates a feature that prepares daily values tables and appends them to previously edited files containing station manuscripts. Other features collate the merged files with miscellaneous sections of the report. The report is then printed as page-size, camera-ready copy. All system components reside on a minicomputer; this provides easy access and use by remote field offices. Automation of the annual report preparation process results in significant savings of labor and cost. Use of the system for producing the 1986 annual report in the North Carolina District realized a labor savings of over two man-months. A fully implemented system would produce a greater savings and speed release of the report to users.
Loborec, Steven M; Johnson, Shawn E; Keating, Ellen A
2016-02-01
The results of a study to assess the financial impact of an automatic formulary substitution of ipratropium-albuterol nebulization solution for ipratropium-albuterol metered-dose inhalers (MDIs) at an academic health system are reported. The study was conducted at a 1242-bed urban academic health system. Data were collected regarding all respiratory medication administrations during a three-month period before the MDI-to-nebulizer substitution (October-December 2012) and the same period of 2013 (after the substitution was implemented). Purchasing data were compared between the two time periods to measure the impact of the formulary substitution on pharmacy department costs, and documented administrations were assessed to evaluate associated changes in respiratory therapist (RT) workload. With 100% prescriber compliance with the formulary substitution, the number of MDI administrations of ipratropium-albuterol declined from 13,667 in October-December 2012 to zero in the same period of 2013. The substitution required expenditures for equipment (vibrating mesh nebulizer technology and patient-specific kits) and RT personnel (one additional RT was hired), but those added costs were substantially outweighed by cost savings resulting from a substantial reduction in overall respiratory drug spending. An automatic substitution of ipratropium-albuterol nebulization solution for MDIs resulted in a three-month savings of $99,359 in drug cost and an extrapolated full-year savings of $397,436. When additional costs associated with the substitution were taken into account, there was an overall savings of $146,806 during the implementation year and a projected savings of $257,936 for each following year. Copyright © 2016 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
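The reported figures are internally consistent; a quick check of the extrapolation arithmetic (the ongoing-cost figures below are inferred by subtraction, not stated in the abstract):

```python
quarterly_drug_saving = 99_359                   # Oct-Dec 2013 drug-cost saving ($)
annual_drug_saving = 4 * quarterly_drug_saving   # 397,436: matches the stated figure

# Net savings after equipment and RT staffing costs, as reported:
implementation_year_net = 146_806
steady_state_net = 257_936

# Implied annual costs of the substitution (inferred, not stated):
implied_steady_state_cost = annual_drug_saving - steady_state_net        # $139,500
implied_first_year_cost = annual_drug_saving - implementation_year_net   # $250,630
print(implied_steady_state_cost, implied_first_year_cost)
```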
NASA Technical Reports Server (NTRS)
Egbert, James Allen
2016-01-01
In support of ground system development for the Space Launch System (SLS), engineers are tasked with building immense engineering models of extreme complexity. The various systems require rigorous analysis of pneumatic, hydraulic, cryogenic, and hypergolic systems. Each of these systems must meet certain standards, documented in pressure vessel system (PVS) certification reports. These reports can be hundreds of pages long and require many hours to compile. Traditionally, each component is analyzed individually, often using hand calculations in the design process. The objective of this effort is to perform these analyses in an integrated fashion within the parametric CAD/CAE environment, which allows systems to be analyzed at the assembly level in a semi-automated fashion and greatly improves accuracy and efficiency. To accomplish this, component-specific parameters based on spec control drawings were stored in the Windchill database and attached to individual Creo Parametric models. These parameters were then accessed through the MathCAD Prime analysis integration within Creo Parametric. MathCAD Prime spreadsheets were created that automatically extracted the parameters, performed calculations, and generated reports. The reports described component compatibility based on local conditions such as pressure, temperature, density, and flow rates. They also determined component pairing compatibility, such as properly sizing relief valves with regulators, and stored the input conditions used to determine compatibility, increasing the traceability of component selection. The desired workflow for this tool begins with a Creo Schematics diagram of a PVS, which stores local conditions and the locations of components. The schematic then populates an assembly within Creo Parametric using Windchill database parts. Because these parts already have their attributes assigned, the MathCAD spreadsheets can run through the database parts to determine which components are suited for specific locations within the assembly. This eliminates a significant amount of time from the design process and makes initial analysis assessments more accurate. Each component checked for a location within the assembly generates a report showing whether the component is compatible, and these reports can be used to generate the PVS report without performing the same analysis multiple times. The process also has the potential to further automate PVS reporting: software codes or macros could automatically check hundreds of parts for each location on the schematic, and if the software could recognize which type of component is necessary for each location, simply starting the macro could select all the components needed for the schematic, and in turn the system. This would save many hours of initial component selection, which could in turn save money. Overall, this process helps automate initial component selection for PVS systems to fit local design specifications. The selections automatically generate reports showing how the design criteria are met by the chosen component, and these reports contribute to easier compilation of the PVS certification reports, which currently take a great amount of time and effort to produce.
NASA Technical Reports Server (NTRS)
Stinnett, W. G.
1980-01-01
The modifications, additions, and test results for a version of the Deep Space Station command software, generated to support the Voyager Saturn encounter, are discussed. The software update requirements included efforts to: (1) recode portions of the software to recover approximately 2000 words of memory; (2) correct five Voyager Ground Data System liens; (3) provide the capability to automatically turn off the command processor assembly's local printer during periods of low activity; and (4) correct anomalies existing in the software.
Software engineering project management - A state-of-the-art report
NASA Technical Reports Server (NTRS)
Thayer, R. H.; Lehman, J. H.
1977-01-01
The management of software engineering projects in the aerospace industry was investigated. The survey assessed such features as contract type, specification preparation techniques, software documentation required by customers, planning and cost-estimating, quality control, the use of advanced program practices, software tools and test procedures, the education levels of project managers, programmers and analysts, work assignment, automatic software monitoring capabilities, design and coding reviews, production times, success rates, and organizational structure of the projects.
Improving a data-acquisition software system with abstract data type components
NASA Technical Reports Server (NTRS)
Howard, S. D.
1990-01-01
Abstract data types and object-oriented design are active research areas in computer science and software engineering. Much of the interest is aimed at new software development. Abstract data type packages developed for a discontinued software project were used to improve a real-time data-acquisition system under maintenance. The result saved effort and contributed to a significant improvement in the performance, maintainability, and reliability of the Goldstone Solar System Radar Data Acquisition System.
Solar Asset Management Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iverson, Aaron; Zviagin, George
Ra Power Management (RPM) has developed a cloud based software platform that manages the financial and operational functions of third party financed solar projects throughout their lifecycle. RPM’s software streamlines and automates the sales, financing, and management of a portfolio of solar assets. The software helps solar developers automate the most difficult aspects of asset management, leading to increased transparency, efficiency, and reduction in human error. More importantly, our platform will help developers save money by improving their operating margins.
NASA Astrophysics Data System (ADS)
Pena-Verdeal, Hugo; Garcia-Resua, Carlos; Yebra-Pimentel, Eva; Giraldez, Maria J.
2017-08-01
Purpose: Different lower tear meniscus parameters can be clinically assessed in dry eye diagnosis. The aim of this study was to propose, and analyse the variability of, a semi-automatic method for measuring the lower tear meniscus central area (TMCA) using open-source software. Material and methods: In a group of 105 subjects, a video of the lower tear meniscus after fluorescein instillation was captured with a digital camera attached to a slit-lamp, using a short light beam (3x5 mm) with moderate illumination in the central portion of the meniscus (6 o'clock). Images were extracted from each video by a masked observer. Using open-source software based on Java (NIH ImageJ), a second observer measured, in masked and randomized order, the TMCA in the illuminated area by two methods: (1) a manual method, in which the TMCA was measured by hand; and (2) a semi-automatic method, in which the image was converted to an 8-bit binary image, holes inside the shape were filled, and the area of the isolated shape was obtained. Finally, the manual and semi-automatic measurements were compared. Results: A paired t-test showed no statistical difference between the results of the two techniques (p = 0.102). The Pearson correlation between techniques showed a significant, near-perfect positive correlation (r = 0.99; p < 0.001). Conclusions: This study demonstrated a useful tool for objectively measuring the frontal central area of the meniscus in photographs with free open-source software.
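A minimal Python re-creation of the semi-automatic pipeline described above (the study itself used NIH ImageJ); the threshold and the mm-per-pixel calibration are hypothetical placeholders.

```python
import numpy as np
from scipy import ndimage

def meniscus_area(gray_image, threshold=128, mm_per_px=0.01):
    """Binarize, fill holes, isolate the largest shape, return area in mm^2.

    gray_image: 2D uint8 array of the fluorescein-stained meniscus region.
    """
    binary = gray_image > threshold              # 8-bit -> binary image
    filled = ndimage.binary_fill_holes(binary)   # fill holes inside the shape
    labels, n = ndimage.label(filled)            # isolate connected shapes
    if n == 0:
        return 0.0
    sizes = ndimage.sum(filled, labels, index=range(1, n + 1))
    return float(max(sizes)) * mm_per_px ** 2    # area of the largest shape

# Usage with a synthetic image standing in for an extracted video frame:
img = np.zeros((100, 200), dtype=np.uint8)
img[40:60, 50:150] = 200                         # bright meniscus band
print(meniscus_area(img))                        # -> 0.2 (mm^2)
```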
NASA Astrophysics Data System (ADS)
Aggarwal, Kashish; Dagar, Nidhi; Sethi, Shikha; Tiwari, Shobhit; Kumar, Arun
2011-12-01
We have designed a security system using MATLAB. The system uses optimal components and is very useful for tracking visitors to homes, offices and other meeting places. The system works in two modes: Automatic Mode and Manual Mode. In the Automatic Mode, a snapshot of the person standing outside is taken with the help of a webcam and compared with a set of predefined pictures in the database; if it matches, the door is opened automatically. In the Manual Mode, a live preview is sent to the laptop screen and recorded according to the user's commands. The door is opened, and the date, time and recording are saved in the database.
Hiscox, Lucy; Leonavičiūtė, Erika; Humby, Trevor
2014-08-01
Dyslexia is associated with difficulties in language-specific skills such as spelling, writing and reading; the difficulty in acquiring literacy skills is not a result of low intelligence or the absence of learning opportunity, but these issues will persist throughout life and could affect long-term education. Writing is a complex process involving many different functions, integrated by the working memory system; people with dyslexia have a working memory deficit, which means that concentration on writing quality may be detrimental to understanding. We confirm impaired working memory in a sample of university students with (compensated) dyslexia, and using a within-subject design with three test conditions, we show that these participants demonstrated better understanding of a piece of text if they had used automatic spelling correction software during a dictation/transcription task. We hypothesize that the use of the autocorrecting software reduced demand on working memory, by allowing word writing to be more automatic, thus enabling better processing and understanding of the content of the transcriptions and improved recall. Long-term and regular use of autocorrecting assistive software should be beneficial for people with and without dyslexia and may improve confidence, written work, academic achievement and self-esteem, which are all affected in dyslexia. Copyright © 2014 John Wiley & Sons, Ltd.
Semi-Automatic Segmentation Software for Quantitative Clinical Brain Glioblastoma Evaluation
Zhu, Y; Young, G; Xue, Z; Huang, R; You, H; Setayesh, K; Hatabu, H; Cao, F; Wong, S.T.
2012-01-01
Rationale and Objectives Quantitative measurement provides essential information about disease progression and treatment response in patients with glioblastoma multiforme (GBM). The goal of this paper is to present and validate a software pipeline for semi-automatic GBM segmentation, called AFINITI (Assisted Follow-up in NeuroImaging of Therapeutic Intervention), using clinical data from GBM patients. Materials and Methods Our software adopts current state-of-the-art tumor segmentation algorithms and combines them into one clinically usable pipeline. The advantages of both traditional voxel-based and deformable shape-based segmentation are embedded in the software pipeline: the former provides an automatic tumor segmentation scheme based on T1- and T2-weighted MR brain data, and the latter refines the segmentation results with minimal manual input. Results Twenty-six clinical MR brain images of GBM patients were processed and compared with manual results. The results can be visualized using the embedded graphical user interface (GUI). Conclusion Validation using clinical GBM data showed high correlation between the AFINITI results and manual annotation. Compared to voxel-wise segmentation, AFINITI yielded more accurate results in segmenting enhancing GBM from multimodality MRI data. The proposed pipeline could provide additional information for interpreting MR brain images in neuroradiology. PMID:22591720
Risking Your Life without a Second Thought: Intuitive Decision-Making and Extreme Altruism
Rand, David G.; Epstein, Ziv G.
2014-01-01
When faced with the chance to help someone in mortal danger, what is our first response? Do we leap into action, only later considering the risks to ourselves? Or must instinctive self-preservation be overcome by will-power in order to act? We investigate this question by examining the testimony of Carnegie Hero Medal Recipients (CHMRs), extreme altruists who risked their lives to save others. We collected published interviews with CHMRs where they described their decisions to help. We then had participants rate the intuitiveness versus deliberativeness of the decision-making process described in each CHMR statement. The statements were judged to be overwhelmingly dominated by intuition; to be significantly more intuitive than a set of control statements describing deliberative decision-making; and to not differ significantly from a set of intuitive control statements. This remained true when restricting to scenarios in which the CHMRs had sufficient time to reflect before acting if they had so chosen. Text-analysis software found similar results. These findings suggest that high-stakes extreme altruism may be largely motivated by automatic, intuitive processes. PMID:25333876
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fang, Y; Huang, H; Su, T
Purpose: Texture-based quantification of image heterogeneity has been a popular topic in imaging studies in recent years. As previous studies mainly focus on oncological applications, we report our recent efforts to apply such techniques to cardiac perfusion imaging. A fully automated procedure was developed to perform texture analysis for measuring image heterogeneity, and clinical data were used to evaluate its preliminary performance. Methods: Myocardial perfusion images from Thallium-201 scans were collected from 293 patients with suspected coronary artery disease. Each subject underwent a Tl-201 scan and a percutaneous coronary intervention (PCI) within three months. The PCI result was used as the gold standard for coronary ischemia of more than 70% stenosis. Each Tl-201 scan was spatially normalized to an image template for fully automatic segmentation of the LV. The segmented voxel intensities were then carried into texture analysis with our open-source software, the Chang Gung Image Texture Analysis toolbox (CGITA). To evaluate the clinical performance of image heterogeneity for detecting coronary stenosis, receiver operating characteristic (ROC) analysis was used to compute the overall accuracy, sensitivity and specificity as well as the area under the curve (AUC). Those indices were compared to those obtained from the commercially available semi-automatic software QPS. Results: With the fully automatic procedure for quantifying heterogeneity from Tl-201 scans, we achieved good discrimination, with good accuracy (74%), sensitivity (73%), specificity (77%) and an AUC of 0.82. This performance is similar to that of the semi-automatic QPS software, which gives a sensitivity of 71% and specificity of 77%. Conclusion: Based on fully automatic data-processing procedures, our preliminary data indicate that the image heterogeneity of myocardial perfusion imaging can provide useful information for automatic determination of myocardial ischemia.
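A sketch of the ROC evaluation described above, with stand-in data: `scores` plays the role of a per-patient heterogeneity index from texture analysis and `ischemia` the PCI-confirmed labels (>70% stenosis). The operating point chosen here (Youden's J) is one common convention, not necessarily the authors'.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)
ischemia = rng.integers(0, 2, size=293)                 # stand-in labels
scores = ischemia + rng.normal(0, 0.8, size=293)        # stand-in indices

auc = roc_auc_score(ischemia, scores)
fpr, tpr, thresholds = roc_curve(ischemia, scores)
best = np.argmax(tpr - fpr)                             # Youden's J statistic
sensitivity, specificity = tpr[best], 1 - fpr[best]
print(f"AUC={auc:.2f} sensitivity={sensitivity:.2f} specificity={specificity:.2f}")
```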
Automatic Publication of a MIS Product to GeoNetwork: Case of the AIS Indexer
2012-11-01
An Automatic Identification System (AIS) reception indexer Java application was developed in the summer of 2011, based on the work of Lapinski and... The report includes instructions for installing and configuring the software packages Java 1.6 and MySQL 5.5 on which the indexer depends. (Approved for public release; distribution unlimited.)
Making the most of a translator.
Dannenfeldt, D
1994-01-01
Anaheim Memorial Hospital in California is a trailblazer. It's one of the first hospitals in the nation to use translation software for both supply procurement and claims-related transactions. Initially, it acquired the software to streamline the ordering of supplies by shifting to standard electronic formats. Today, the hospital is using the software to receive electronic remittance advice, and it has plans for other labor-saving applications.
Use of Semi-Autonomous Tools for ISS Commanding and Monitoring
NASA Technical Reports Server (NTRS)
Brzezinski, Amy S.
2014-01-01
As the International Space Station (ISS) has moved into a utilization phase, operations have shifted to become more ground-based with fewer mission control personnel monitoring and commanding multiple ISS systems. This shift to fewer people monitoring more systems has prompted use of semi-autonomous console tools in the ISS Mission Control Center (MCC) to help flight controllers command and monitor the ISS. These console tools perform routine operational procedures while keeping the human operator "in the loop" to monitor and intervene when off-nominal events arise. Two such tools, the Pre-positioned Load (PPL) Loader and Automatic Operators Recorder Manager (AutoORM), are used by the ISS Communications RF Onboard Networks Utilization Specialist (CRONUS) flight control position. CRONUS is responsible for simultaneously commanding and monitoring the ISS Command & Data Handling (C&DH) and Communications and Tracking (C&T) systems. PPL Loader is used to uplink small pieces of frequently changed software data tables, called PPLs, to ISS computers to support different ISS operations. In order to uplink a PPL, a data load command must be built that contains multiple user-input fields. Next, a multiple step commanding and verification procedure must be performed to enable an onboard computer for software uplink, uplink the PPL, verify the PPL has incorporated correctly, and disable the computer for software uplink. PPL Loader provides different levels of automation in both building and uplinking these commands. In its manual mode, PPL Loader automatically builds the PPL data load commands but allows the flight controller to verify and save the commands for future uplink. In its auto mode, PPL Loader automatically builds the PPL data load commands for flight controller verification, but automatically performs the PPL uplink procedure by sending commands and performing verification checks while notifying CRONUS of procedure step completion. If an off-nominal condition occurs during procedure execution, PPL Loader notifies CRONUS through popup messages, allowing CRONUS to examine the situation and choose an option of how PPL loader should proceed with the procedure. The use of PPL Loader to perform frequent, routine PPL uplinks offloads CRONUS to better monitor two ISS systems. It also reduces procedure performance time and decreases risk of command errors. AutoORM identifies ISS communication outage periods and builds commands to lock, playback, and unlock ISS Operations Recorder files. Operation Recorder files are circular buffer files of continually recorded ISS telemetry data. Sections of these files can be locked from further writing, be played back to capture telemetry data that occurred during an ISS loss of signal (LOS) period, and then be unlocked for future recording use. Downlinked Operation Recorder files are used by mission support teams for data analysis, especially if failures occur during LOS. The commands to lock, playback, and unlock Operations Recorder files are encompassed in three different operational procedures and contain multiple user-input fields. AutoORM provides different levels of automation for building and uplinking the commands to lock, playback, and unlock Operations Recorder files. In its automatic mode, AutoORM automatically detects ISS LOS periods, then generates and uplinks the commands to lock, playback, and unlock Operations Recorder files when MCC regains signal with ISS. 
AutoORM also features semi-autonomous and manual modes, which integrate CRONUS more into the command verification and uplink process. AutoORM's ability to automatically detect ISS LOS periods and build the necessary commands to preserve, play back, and release recorded telemetry data greatly offloads CRONUS to perform more high-level cognitive tasks, such as mission planning and anomaly troubleshooting. Additionally, since Operations Recorder commands contain numerical time input fields that are tedious for a human to build manually, AutoORM's ability to build commands automatically reduces operational command errors. PPL Loader and AutoORM demonstrate principles of semi-autonomous operational tools that will benefit future space mission operations. Both tools employ different levels of automation to perform simple and routine procedures, thereby offloading human operators to perform higher-level cognitive tasks. Because both tools provide procedure execution status and highlight off-nominal indications, the flight controller is able to intervene during procedure execution if needed. Semi-autonomous tools and systems that can perform routine procedures, yet keep human operators informed of execution, will be essential in future long-duration missions where the onboard crew will be solely responsible for spacecraft monitoring and control.
ERIC Educational Resources Information Center
Lowe, John B.
1988-01-01
If the CD-ROM revolution is likened to gambling, players are information providers and consumers; the stakes are development, production, distribution, hardware, and software costs; and betting is represented by the costs of updating disks and hardware and software maintenance, and by pricing. Strategy should take into account cost savings,…
Accuracy of Automatic Cephalometric Software on Landmark Identification
NASA Astrophysics Data System (ADS)
Anuwongnukroh, N.; Dechkunakorn, S.; Damrongsri, S.; Nilwarat, C.; Pudpong, N.; Radomsutthisarn, W.; Kangern, S.
2017-11-01
This study assessed the accuracy of automatic cephalometric analysis software in the identification of cephalometric landmarks. Thirty randomly selected digital lateral cephalograms of patients undergoing orthodontic treatment were used. Thirteen landmarks (S, N, Or, A-point, U1T, U1A, B-point, Gn, Pog, Me, Go, L1T, and L1A) were identified on the digital image by the automatic cephalometric software and on cephalometric tracings by the manual method. Superimposition of the printed image and manual tracing was done by registration on the soft tissue profiles. The accuracy of landmarks located by the automatic method was compared with that of the manually identified landmarks by measuring the mean differences in the distances of each landmark on a Cartesian plane whose X and Y coordinate axes passed through the center of the ear rod. A one-sample t-test was used to evaluate the mean differences. Statistically significant mean differences (p<0.05) were found for 5 landmarks (Or, A-point, Me, L1T, and L1A) in the horizontal direction and 7 landmarks (Or, A-point, U1T, U1A, B-point, Me, and L1A) in the vertical direction. Four landmarks (Or, A-point, Me, and L1A) showed significant (p<0.05) mean differences in both horizontal and vertical directions. Small mean differences (<0.5 mm) were found for S, N, B-point, Gn, and Pog in the horizontal direction and for N, Gn, Me, and L1T in the vertical direction. Large mean differences were found for A-point (3.0-3.5 mm) in the horizontal direction and L1A (>4 mm) in the vertical direction. Only 5 of 13 landmarks (38.46%; S, N, Gn, Pog, and Go) showed no significant mean difference between the automatic and manual landmarking methods. It is concluded that if this automatic cephalometric analysis software is used for orthodontic diagnosis, the orthodontist must correct or modify the positions of landmarks in order to increase the accuracy of the cephalometric analysis.
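A sketch of the statistical test named above: for one landmark, `dx` holds the automatic-minus-manual horizontal differences (mm) across the 30 cephalograms, and a one-sample t-test checks whether their mean differs from zero. The numbers are placeholders, not study data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
dx = rng.normal(0.4, 0.9, size=30)   # per-film differences for, e.g., Or (mm)

t, p = stats.ttest_1samp(dx, popmean=0.0)
print(f"mean diff = {dx.mean():.2f} mm, t = {t:.2f}, p = {p:.3f}")
```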
Automatic forensic face recognition from digital images.
Peacock, C; Goode, A; Brett, A
2004-01-01
Digital image evidence is now widely available from criminal investigations and surveillance operations, often captured by security and surveillance CCTV. This has resulted in a growing demand from law enforcement agencies for automatic person-recognition based on image data. In forensic science, a fundamental requirement for such automatic face recognition is to evaluate the weight that can justifiably be attached to this recognition evidence in a scientific framework. This paper describes a pilot study carried out by the Forensic Science Service (UK) which explores the use of digital facial images in forensic investigation. For the purpose of the experiment a specific software package was chosen (Image Metrics Optasia). The paper does not describe the techniques used by the software to reach its decision of probabilistic matches to facial images, but accepts the output of the software as though it were a 'black box'. In this way, the paper lays a foundation for how face recognition systems can be compared in a forensic framework. The aim of the paper is to explore how reliably and under what conditions digital facial images can be presented in evidence.
Techniques for Developing an Acquisition Strategy by Profiling Software Risks
2006-08-01
The BMW 745Li is a good illustration of the increasing software control of hardware systems in automobiles. Among its many features are roll stabilization, dynamic brake control, coded drive-away protection, an adaptive automatic transmission, and iDrive systems. This list can be...
Research flight software engineering and MUST, an integrated system of support tools
NASA Technical Reports Server (NTRS)
Straeter, T. A.; Foudriat, E. C.; Will, R. W.
1977-01-01
Consideration is given to software development to support NASA flight research. The Multipurpose User-Oriented Software Technology (MUST) program, designed to integrate digital systems into flight research, is discussed. Particular attention is given to the program's special interactive user interface, subroutine library, assemblers, compiler, automatic documentation tools, and test and simulation subsystems.
NASA Astrophysics Data System (ADS)
Yu, You; Han, Yanchao; Xu, Miao; Zhang, Lingling; Dong, Shaojun
2016-04-01
Inverted illumination compensation is important in energy-saving projects, artificial photosynthesis and some forms of agriculture, such as hydroponics. However, only a few illumination adjustments based on self-powered biodetectors that quantitatively detect the intensity of visible light have been reported. We constructed an automatic illumination compensation device based on a photoelectrochemical biofuel cell (PBFC) driven by visible light. The PBFC consisted of a glucose dehydrogenase modified bioanode and a p-type semiconductor cuprous oxide photocathode. The PBFC had a high power output of 161.4 μW cm-2 and an open circuit potential that responded rapidly to visible light. It adjusted the amount of illumination inversely irrespective of how the external illumination was changed. This rational design of utilizing PBFCs provides new insights into automatic light adjustable devices and may be of benefit to intelligent applications. Electronic supplementary information (ESI) available. See DOI: 10.1039/c6nr00759g
Methodology of decreasing software complexity using ontology
NASA Astrophysics Data System (ADS)
Dąbrowska-Kubik, Katarzyna
2015-09-01
In this paper a model of a web application's source code, based on the OSD (Ontology for Software Development) ontology, is proposed. This model is applied to the implementation and maintenance phases of the software development process through the DevOntoCreator tool [5]. The aim of this solution is to decrease the software complexity of the source code using various maintenance techniques, such as creating documentation and eliminating dead code, cloned code, or previously known bugs [1][2]. With this approach, savings in the software maintenance costs of web applications will be possible.
DOT National Transportation Integrated Search
1998-05-01
Recent technological advances in computer hardware, software, and image processing have led to the development of automated license plate reading equipment. This equipment has primarily been developed for enforcement and security applications, such a...
76 FR 12342 - Notice of Proposed Information Collection Requests
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-07
... savings account. The application collects the necessary bank account information that allows the U.S... automatic debiting of all monthly payments may provide bank account information that allows them to... DEPARTMENT OF EDUCATION Notice of Proposed Information Collection Requests AGENCY: Department of...
GISentinel: a software platform for automatic ulcer detection on capsule endoscopy videos
NASA Astrophysics Data System (ADS)
Yi, Steven; Jiao, Heng; Meng, Fan; Leighton, Jonathon A.; Shabana, Pasha; Rentz, Lauri
2014-03-01
In this paper, we present a novel and clinically valuable software platform for automatic ulcer detection in the gastrointestinal (GI) tract from Capsule Endoscopy (CE) videos. Typical CE videos run about 8 hours and must be reviewed manually by physicians to detect and locate diseases such as ulcers and bleeding. The process is time consuming, and the long manual review makes missed findings likely. Working with our collaborators, we developed a software platform called GISentinel for fully automated GI tract ulcer detection and classification. The software comprises three parts: frequency-based Log-Gabor filter extraction of regions of interest (ROI); a unique feature selection and validation method (e.g. illumination-invariant features, color-independent features, and symmetrical texture features); and a cascade SVM classification for handling "ulcer vs. non-ulcer" cases. In experiments, the software gave decent results: frame-wise, the ulcer detection rate was 69.65% (319/458); instance-wise, the ulcer detection rate was 82.35% (28/34), with a false alarm rate of 16.43% (34/207). This work is part of our innovative 2D/3D-based GI tract disease detection software platform, whose final goal is the intelligent detection and classification of major GI tract diseases, such as bleeding, ulcers, and polyps, from CE videos. This paper mainly describes the automatic ulcer detection functional module.
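The paper does not give its filter parameters, but the log-Gabor transfer function itself is standard; below is a frequency-domain sketch of such a band-pass front end, with an illustrative centre frequency and bandwidth, as one plausible reading of the ROI-extraction step.

```python
import numpy as np

def log_gabor_response(img, f0=0.1, sigma_ratio=0.55):
    """Apply an isotropic log-Gabor band-pass filter in the frequency domain.

    img: 2D float array; f0: centre frequency (cycles/pixel);
    sigma_ratio: bandwidth parameter (sigma/f0). Returns the response
    magnitude, whose bright regions could seed candidate ulcer ROIs.
    """
    rows, cols = img.shape
    fy = np.fft.fftfreq(rows)[:, None]
    fx = np.fft.fftfreq(cols)[None, :]
    radius = np.hypot(fx, fy)
    radius[0, 0] = 1.0                       # avoid log(0) at the DC bin
    log_gabor = np.exp(-(np.log(radius / f0) ** 2) /
                       (2 * np.log(sigma_ratio) ** 2))
    log_gabor[0, 0] = 0.0                    # zero DC response (no mean term)
    return np.abs(np.fft.ifft2(np.fft.fft2(img) * log_gabor))
```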
SU-E-T-50: Automatic Validation of Megavoltage Beams Modeled for Clinical Use in Radiation Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Melchior, M; Salinas Aranda, F; 21st Century Oncology, Ft. Myers, FL
2014-06-01
Purpose: To automatically validate megavoltage beams modeled in the XiO™ 4.50 (Elekta, Stockholm, Sweden) and Varian Eclipse™ treatment planning systems (TPS) (Varian Associates, Palo Alto, CA, USA), reducing validation time before beam-on for clinical use. Methods: A software application that can automatically read and analyze DICOM RT Dose and W2CAD files was developed using the MatLab integrated development environment. TPS-calculated dose distributions, in DICOM RT Dose format, were compared with dose values measured in different Varian Clinac beams, in W2CAD format. The experimental beam data were those acquired for beam commissioning, collected on a water phantom with a 2D automatic beam scanning system. Two methods were chosen to evaluate the fit of the dose distributions: gamma analysis and the point tests described in Appendix E of IAEA TECDOC-1583. Depth dose curves and beam profiles were evaluated for both open and wedged beams. The tolerance parameters chosen for the gamma analysis were 3% dose and 3 mm distance, respectively. Absolute dose was measured independently at the points proposed in Appendix E of TECDOC-1583 to validate the software results. Results: TPS-calculated depth dose distributions agree with measured beam data within fixed precision values at all depths analyzed. Measured beam dose profiles match TPS-calculated doses with high accuracy in both open and wedged beams. The depth and profile dose distribution fitting analyses show gamma values < 1, and relative errors at the points proposed in Appendix E of TECDOC-1583 meet the tolerances recommended therein. Independent absolute dose measurements at those points confirm the software results. Conclusion: Automatic validation of megavoltage beams modeled for clinical use was accomplished. The software tool developed proved efficient, giving users a convenient and reliable environment in which to decide whether or not to accept a beam model for clinical use. Validation time before beam-on for clinical use was reduced to a few hours.
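For readers unfamiliar with the 3%/3 mm criterion, here is a deliberately simple brute-force gamma computation; it is a generic sketch (global normalization, wrap-around at the edges via np.roll) and not the authors' implementation.

```python
import numpy as np

def gamma_pass_rate(ref, ev, spacing, dta=3.0, dd=0.03):
    """Global 2D gamma analysis (3%/3 mm by default).

    ref, ev: dose grids on the same lattice; spacing: pixel size in mm;
    dta: distance-to-agreement (mm); dd: dose criterion as a fraction of
    the reference maximum. Returns the fraction of points with gamma <= 1.
    """
    search = int(np.ceil(dta / spacing))          # search radius in pixels
    norm = dd * ref.max()                         # global dose criterion
    gamma2 = np.full(ref.shape, np.inf)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            dist2 = (dy * spacing) ** 2 + (dx * spacing) ** 2
            if dist2 > dta ** 2:
                continue
            shifted = np.roll(np.roll(ev, dy, axis=0), dx, axis=1)
            g2 = (shifted - ref) ** 2 / norm ** 2 + dist2 / dta ** 2
            gamma2 = np.minimum(gamma2, g2)       # keep best match per point
    return float(np.mean(np.sqrt(gamma2) <= 1.0))

# A 1% global dose offset should pass comfortably under 3%/3 mm:
ref = np.random.default_rng(0).random((50, 50)) + 1.0
print(gamma_pass_rate(ref, ref * 1.01, spacing=1.0))   # -> 1.0
```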
Higashigaito, K; Becker, A S; Sprengel, K; Simmen, H-P; Wanner, G; Alkadhi, H
2016-09-01
To demonstrate the feasibility and accuracy of automatic radiation dose monitoring software for computed tomography (CT) of trauma patients in a clinical setting over time, and to evaluate the potential of radiation dose reduction using iterative reconstruction (IR). In a time period of 18 months, data from 378 consecutive thoraco-abdominal CT examinations of trauma patients were extracted using automatic radiation dose monitoring software, and patients were split into three cohorts: cohort 1, 64-section CT with filtered back projection, 200 mAs tube current-time product; cohort 2, 128-section CT with IR and identical imaging protocol; cohort 3, 128-section CT with IR, 150 mAs tube current-time product. Radiation dose parameters from the software were compared with the individual patient protocols. Image noise was measured and image quality was semi-quantitatively determined. Automatic extraction of radiation dose metrics was feasible and accurate in all (100%) patients. All CT examinations were of diagnostic quality. There were no differences between cohorts 1 and 2 regarding volume CT dose index (CTDIvol; p=0.62), dose-length product (DLP), and effective dose (ED, both p=0.95), while noise was significantly lower (chest and abdomen, both -38%, p<0.017). Compared to cohort 1, CTDIvol, DLP, and ED in cohort 3 were significantly lower (all -25%, p<0.017), similar to the noise in the chest (-32%) and abdomen (-27%, both p<0.017). Compared to cohort 2, CTDIvol (-28%), DLP, and ED (both -26%) in cohort 3 was significantly lower (all, p<0.017), while noise in the chest (+9%) and abdomen (+18%) was significantly higher (all, p<0.017). Automatic radiation dose monitoring software is feasible and accurate, and can be implemented in a clinical setting for evaluating the effects of lowering radiation doses of CT protocols over time. Copyright © 2016 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
Delivering Savings with Open Architecture and Product Lines
2011-04-30
Chair: Christopher Deegan, Executive Director, Program Executive Office for Integrated Warfare Systems. "Delivering Savings with Open Architectures," Walt Scacchi and Thomas Alspaugh, Institute for Software Research. Mr. Deegan directs the development, acquisition, and fleet support of 150 combat weapon system programs managed by 350...
Using Technology to Control Costs
ERIC Educational Resources Information Center
Ho, Simon; Schoenberg, Doug; Richards, Dan; Morath, Michael
2009-01-01
In this article, the authors examines the use of technology to control costs in the child care industry. One of these technology solutions is Software-as-a-Service (SaaS). SaaS solutions can help child care providers save money in many aspects of center management. In addition to cost savings, SaaS solutions are also particularly appealing to…
LogiKit - assisting complex logic specification and implementation for embedded control systems
NASA Astrophysics Data System (ADS)
Diglio, A.; Nicolodi, B.
2002-07-01
LogiKit provides an overall lifecycle solution. It is a powerful software engineering CASE toolkit for requirements specification, simulation and documentation, and it also provides an automatic Ada software design, code and unit-test generator.
Recent advances in the CRANK software suite for experimental phasing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pannu, Navraj S., E-mail: raj@chem.leidenuniv.nl; Waterreus, Willem-Jan; Skubák, Pavol
2011-04-01
Recent developments in the CRANK software suite for experimental phasing have led to many more structures being built automatically. For its first release in 2004, CRANK was shown to effectively detect and phase anomalous scatterers from single-wavelength anomalous diffraction data. Since then, CRANK has been significantly improved, and many more structures can be built automatically with single- or multiple-wavelength anomalous diffraction or single isomorphous replacement with anomalous scattering data. Here, the new algorithms that have led to these substantial improvements are discussed, and CRANK's performance on over 100 real data sets is shown. The latest version of CRANK is freely available for download at http://www.bfsc.leidenuniv.nl/software/crank/ and from CCP4 (http://www.ccp4.ac.uk/).
TED-Time and life saving External Defibrillator for home-use.
Weiss, Teddy A; Rosenheck, Shimon; Gorni, Shraga; Katz, Ioni; Mendelbaum, Mendel; Gilon, Dan
2014-06-01
Sudden cardiac death (SCD) is a major unmet health problem that needs an urgent and prompt solution. AICDs are very expensive, carry risk, and are indicated only for a small group of patients at the highest risk. AEDs (Automatic External Defibrillators) are designed for public places and, although safe, cannot enter the home market because of their cost and need for constant, high-cost maintenance. We developed TED, a low-cost AED that draws its energy from the mains, designed for home use to save the lives of SCD victims. Copyright © 2014 Elsevier Inc. All rights reserved.
Saving Endangered Species: Using Technology to Teach Thematically.
ERIC Educational Resources Information Center
Wepner, Shelly B.; Seminoff, Nancy E.
1994-01-01
Describes a project using software in kindergarten instruction. Seven pieces of software used in a unit on endangered species that included social studies, math, art, language arts, and music emphases are briefly described. Ideas for managing a one-computer classroom and general recommendations drawn from the study are given. (KRN)
No-Fail Software Gifts for Kids.
ERIC Educational Resources Information Center
Buckleitner, Warren
1996-01-01
Reviews children's software packages: (1) "Fun 'N Games"--nonviolent games and activities; (2) "Putt-Putt Saves the Zoo"--matching, logic games, and animal facts; (3) "Big Job"--12 logic games with video from job sites; (4) "JumpStart First Grade"--15 activities introducing typical school lessons; and (5) "Read, Write, & Type!"--progressively…
Information Technology Wants to Be Free
ERIC Educational Resources Information Center
Poritz, Jonathan A.
2012-01-01
It makes sense for college and university faculty to ally with the free and open-source software community. They share common values. A marvelous additional benefit is that free software on campuses would significantly advance pedagogy and scholarship, increase efficiency, and save money. Only unquestioning obedience to market fundamentalism--or…
Recording Computer-Based Demonstrations and Board Work
ERIC Educational Resources Information Center
Spencer, Neil H.
2010-01-01
This article describes how a demonstration of statistical (or other) software can be recorded without expensive video equipment and saved as a presentation to be displayed with software such as Microsoft PowerPoint. Work carried out on a tablet PC, for example, can also be recorded in this fashion.
Wang, Rui; Meinel, Felix G; Schoepf, U Joseph; Canstein, Christian; Spearman, James V; De Cecco, Carlo N
2015-12-01
To evaluate the accuracy, reliability and time-saving potential of a novel cardiac CT (CCT)-based automated software for the assessment of segmental left ventricular function, compared to visual and manual quantitative assessment of CCT and cardiac magnetic resonance (CMR). Forty-seven patients with suspected or known coronary artery disease (CAD) were enrolled in the study. Wall thickening was calculated, and segmental LV wall motion was automatically computed and shown as a colour-coded polar map. Processing time for each method was recorded. Mean wall thickness in the diastolic and systolic phases on the polar map, CCT, and CMR was 9.2 ± 0.1 mm and 14.9 ± 0.2 mm, 8.9 ± 0.1 mm and 14.5 ± 0.1 mm, and 8.3 ± 0.1 mm and 13.6 ± 0.1 mm, respectively. Mean wall thickening was 68.4 ± 1.5%, 64.8 ± 1.4% and 67.1 ± 1.4%, respectively. Agreement in the assessment of LV wall motion between CCT, CMR and the polar maps was good. Bland-Altman plots and ICCs indicated good agreement between CCT, CMR and the automated polar maps for diastolic and systolic segmental wall thickness and thickening. Processing time using the polar map was significantly decreased compared with CCT and CMR. Automated evaluation of segmental LV function with polar maps provides measurements similar to manual CCT and CMR evaluation, albeit with substantially reduced analysis time. • Cardiac computed tomography (CCT) can accurately assess segmental left ventricular wall function. • A novel automated software permits accurate and fast evaluation of wall function. • The software may improve the clinical implementation of segmental functional analysis.
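As a note on the arithmetic: the conventional definition of thickening used below is an assumption, since the abstract does not spell it out.

```python
# Segmental wall thickening is conventionally defined as
# (systolic - diastolic) / diastolic * 100%. Checking against the
# reported polar-map means (diastole 9.2 mm, systole 14.9 mm):
thickening_from_means = (14.9 - 9.2) / 9.2 * 100     # ~62.0%
# The reported 68.4% is the mean of per-segment thickening values,
# which need not equal the thickening computed from mean thicknesses.
print(f"{thickening_from_means:.1f}%")
```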
Gennaro, G; Ballaminut, A; Contento, G
2017-09-01
This study illustrates a multiparametric automatic method for monitoring the long-term reproducibility of digital mammography systems and its application on a large scale. Twenty-five digital mammography systems employed within a regional screening programme were controlled weekly using the same type of phantom, whose images were analysed by an automatic software tool. To assess system reproducibility levels, 15 image quality indices (IQIs) were extracted and compared with the corresponding indices previously determined by a baseline procedure. The coefficients of variation (COVs) of the IQIs were used to assess the overall variability. A total of 2553 phantom images were collected from the 25 digital mammography systems from March 2013 to December 2014. Most of the systems showed excellent image quality reproducibility over the surveillance interval, with mean variability below 5%. The variability of each IQI was within 5%, with the exception of one index, associated with the smallest phantom objects (0.25 mm), which stayed below 10%. The method applied for the reproducibility tests (multi-detail phantoms, a cloud-based automatic software tool measuring multiple image quality indices, and statistical process control) proved effective and applicable on a large scale to any type of digital mammography system. • Reproducibility of mammography image quality should be monitored by appropriate quality controls. • Use of automatic software tools allows image quality evaluation by multiple indices. • System reproducibility can be assessed by comparing current index values with baseline data. • Overall system reproducibility of modern digital mammography systems is excellent. • The method proposed and applied is cost-effective and easily scalable.
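A sketch of the reproducibility check described above: weekly values of one image-quality index (IQI) are compared with a baseline mean, and the coefficient of variation flags drift beyond the 5% action level. The array contents and the exact control rule are illustrative.

```python
import numpy as np

weekly_iqi = np.array([0.98, 1.01, 1.02, 0.97, 1.00, 1.03])  # stand-in values
baseline_mean = 1.00                                          # from baseline run

cov = weekly_iqi.std(ddof=1) / weekly_iqi.mean()              # overall variability
bias = (weekly_iqi.mean() - baseline_mean) / baseline_mean    # drift vs baseline
if cov > 0.05 or abs(bias) > 0.05:
    print("IQI out of control: investigate the mammography unit")
print(f"COV = {cov:.1%}, bias vs baseline = {bias:+.1%}")
```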
SLAM, a Mathematica interface for SUSY spectrum generators
NASA Astrophysics Data System (ADS)
Marquard, Peter; Zerf, Nikolai
2014-03-01
We present and publish a Mathematica package which can be used to automatically obtain any numerical MSSM input parameter from SUSY spectrum generators that follow the SLHA standard, such as SPheno, SOFTSUSY, SuSeFLAV or Suspect. The package enables a very comfortable way of performing numerical evaluations within the MSSM using Mathematica. It implements easy-to-use predefined high-scale and low-scale scenarios like mSUGRA or mhmax and, if needed, enables the user to specify directly the input required by the spectrum generators. In addition it supports automatic saving and loading of SUSY spectra to and from a SQL database, avoiding the rerun of a spectrum generator for a known spectrum.
Catalogue identifier: AERX_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERX_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 4387
No. of bytes in distributed program, including test data, etc.: 37748
Distribution format: tar.gz
Programming language: Mathematica
Computer: Any computer running Mathematica version 6 or higher and providing bash and sed
Operating system: Linux
Classification: 11.1
External routines: A SUSY spectrum generator such as SPheno, SOFTSUSY, SuSeFLAV or SUSPECT
Nature of problem: Interfacing published spectrum generators for automated creation, saving and loading of SUSY particle spectra.
Solution method: SLAM automatically writes/reads SLHA spectrum generator input/output and is able to save/load generated data in/from a database.
Restrictions: No general restrictions; specific restrictions are given in the manuscript.
Running time: A single spectrum calculation takes much less than one second on a modern PC.
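SLAM itself is written in Mathematica, but the SLHA format it reads is plain text; below is a hedged Python sketch of parsing one simple (key, value) block such as MASS, handling only single-key entries, as an illustration of what "reading SLHA spectrum generator output" involves.

```python
def read_slha_block(path, block):
    """Read a simple (key, value) SLHA block, e.g. MASS, from a spectrum file.

    Handles only 'key value # comment' entries; multi-key blocks and
    DECAY tables are skipped.
    """
    entries, active = {}, False
    with open(path) as fh:
        for line in fh:
            line = line.split('#', 1)[0].strip()     # drop comments
            if not line:
                continue
            up = line.upper()
            if up.startswith('BLOCK') or up.startswith('DECAY'):
                active = up.startswith('BLOCK') and \
                    line.split()[1].upper() == block.upper()
                continue
            if active:
                fields = line.split()
                if len(fields) == 2:
                    entries[int(fields[0])] = float(fields[1])
    return entries

# e.g. masses = read_slha_block('SPheno.spc', 'MASS'); masses[25] -> m_h
```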
Software for simulation of a computed tomography imaging spectrometer using optical design software
NASA Astrophysics Data System (ADS)
Spuhler, Peter T.; Willer, Mark R.; Volin, Curtis E.; Descour, Michael R.; Dereniak, Eustace L.
2000-11-01
Our imaging spectrometer simulation software, known as Eikon, should improve and speed up the design of computed tomography imaging spectrometers (CTIS). Eikon uses existing raytracing software to simulate a virtual instrument, enabling designers to virtually run through design, calibration and data acquisition, and saving significant cost and time when designing an instrument. We anticipate that Eikon simulations will improve future CTIS designs by allowing engineers to explore more instrument options.
The associate principal astronomer telescope operations model
NASA Technical Reports Server (NTRS)
Drummond, Mark; Bresina, John; Swanson, Keith; Edgington, Will; Henry, Greg
1994-01-01
This paper outlines a new telescope operations model that is intended to achieve low operating costs with high operating efficiency and high scientific productivity. The model is based on the existing Principal Astronomer approach used in conjunction with ATIS, a language for commanding remotely located automatic telescopes. This paper introduces the notion of an Associate Principal Astronomer, or APA. At the heart of the APA is automatic observation loading and scheduling software, and it is this software that is expected to help achieve efficient and productive telescope operations. The purpose of the APA system is to make it possible for astronomers to submit observation requests to and obtain resulting data from remote automatic telescopes, via the Internet, in a highly-automated way that minimizes human interaction with the system and maximizes the scientific return from observing time.
The Accuracy of GBM GRB Localizations
NASA Astrophysics Data System (ADS)
Briggs, Michael Stephen; Connaughton, V.; Meegan, C.; Hurley, K.
2010-03-01
We report a study of the accuracy of GBM GRB localizations, analyzing three types of localizations: those produced automatically by the GBM Flight Software on board GBM, those produced automatically with ground software in near real time, and localizations produced with human guidance. The two types of automatic locations are distributed in near real time via GCN Notices; the human-guided locations are distributed on timescales of many minutes or hours via GCN Circulars. This work uses a Bayesian analysis that models the distribution of the GBM total location error by comparing GBM locations to more accurate locations obtained with other instruments. Reference locations are obtained from Swift, Super-AGILE, the LAT, and the IPN. We model the GBM total location errors as having systematic errors in addition to the statistical errors and use the Bayesian analysis to constrain the systematic errors.
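The error model can be illustrated with a much-simplified sketch: assume each burst's total offset from its reference location is Rayleigh-distributed with scale sqrt(sigma_stat^2 + sigma_sys^2), and compute a grid posterior for a single systematic term under a flat prior. The paper's actual model is richer, and the numbers below are made up.

```python
# Toy posterior for a single systematic location-error term (illustrative only).
import numpy as np

offsets = np.array([3.1, 5.4, 2.2, 8.0, 4.7])     # deg: GBM vs. reference separation
sigma_stat = np.array([2.0, 3.5, 1.5, 5.0, 3.0])  # deg: per-burst statistical error

sys_grid = np.linspace(0.01, 15.0, 500)           # candidate systematic errors (deg)
logpost = np.empty_like(sys_grid)
for k, s in enumerate(sys_grid):
    scale2 = sigma_stat**2 + s**2
    # Rayleigh log-density for each offset r: log(r/scale2) - r^2 / (2*scale2)
    logpost[k] = np.sum(np.log(offsets / scale2) - offsets**2 / (2.0 * scale2))

post = np.exp(logpost - logpost.max())            # flat prior; normalize on the grid
dx = sys_grid[1] - sys_grid[0]
post /= post.sum() * dx
print(f"posterior mean sigma_sys ~ {np.sum(sys_grid * post) * dx:.2f} deg")
```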
Application of software for automated canal management (SacMan) to the WM lateral canal
USDA-ARS?s Scientific Manuscript database
Simulation studies have demonstrated that automatic control of canals is more effective when feedforward scheduling, or routing of known demand changes, is combined with centralized, automatic, distant, downstream-water-level control. In practice, few canals use this approach. To help further develop...
32 CFR 2001.30 - Automatic declassification.
Code of Federal Regulations, 2013 CFR
2013-07-01
... transferred in conjunction with a transfer of functions, and not merely for storage, the receiving agency... contamination by a hazardous substance; and (iii) Electronic media if the media is subject to issues of software... the automatic declassification of a specific series of records as defined in section 6.1(r) of the...
32 CFR 2001.30 - Automatic declassification.
Code of Federal Regulations, 2010 CFR
2010-07-01
... transferred in conjunction with a transfer of functions, and not merely for storage, the receiving agency... contamination by a hazardous substance; and (iii) Electronic media if the media is subject to issues of software... the automatic declassification of a specific series of records as defined in section 6.1(r) of the...
32 CFR 2001.30 - Automatic declassification.
Code of Federal Regulations, 2014 CFR
2014-07-01
... transferred in conjunction with a transfer of functions, and not merely for storage, the receiving agency... contamination by a hazardous substance; and (iii) Electronic media if the media is subject to issues of software... the automatic declassification of a specific series of records as defined in section 6.1(r) of the...
32 CFR 2001.30 - Automatic declassification.
Code of Federal Regulations, 2012 CFR
2012-07-01
... transferred in conjunction with a transfer of functions, and not merely for storage, the receiving agency... contamination by a hazardous substance; and (iii) Electronic media if the media is subject to issues of software... the automatic declassification of a specific series of records as defined in section 6.1(r) of the...
32 CFR 2001.30 - Automatic declassification.
Code of Federal Regulations, 2011 CFR
2011-07-01
... transferred in conjunction with a transfer of functions, and not merely for storage, the receiving agency... contamination by a hazardous substance; and (iii) Electronic media if the media is subject to issues of software... the automatic declassification of a specific series of records as defined in section 6.1(r) of the...
A Flexible and Configurable Architecture for Automatic Control Remote Laboratories
ERIC Educational Resources Information Center
Kalúz, Martin; García-Zubía, Javier; Fikar, Miroslav; Cirka, Luboš
2015-01-01
In this paper, we propose a novel approach to hardware and software architecture design for the implementation of remote laboratories for automatic control. In our contribution, we show a solution with flexible connectivity at the back-end, providing features of multipurpose usage with different types of experimental devices, and fully configurable…
Urschler, Martin; Grassegger, Sabine; Štern, Darko
2015-01-01
Age estimation of individuals is important in human biology and has various medical and forensic applications. Recent interest in MR-based methods aims to investigate alternatives to established methods involving ionising radiation. Automatic, software-based methods additionally promise improved estimation objectivity. Our aim was to investigate how informative automatically selected image features are regarding their ability to discriminate age, by exploring a recently proposed software-based age estimation method for MR images of the left hand and wrist. One hundred and two MR datasets of left hand images were used to evaluate age estimation performance, consisting of bone and epiphyseal gap volume localisation, computation of one age regression model per bone mapping image features to age, and fusion of the individual bone age predictions into a final age estimate. Quantitative results of the software-based method show an age estimation performance with a mean absolute difference of 0.85 years (SD = 0.58 years) to chronological age, as determined by a cross-validation experiment. Qualitatively, it is demonstrated how feature selection works and which image features of skeletal maturation are automatically chosen to model the non-linear regression function. Feasibility of automatic age estimation based on MRI data is shown, and the selected image features are found to be informative for describing anatomical changes during physical maturation in male adolescents.
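The per-bone regression and fusion steps can be sketched as follows, with synthetic data standing in for the learned image features; the real method uses non-linear regressors on automatically selected features, so the quadratic fits here are only placeholders.

```python
# Sketch of the fusion step: one simple regression model per bone maps a
# feature to age, and per-bone predictions are averaged into the final estimate.
import numpy as np

rng = np.random.default_rng(0)
ages = rng.uniform(13.0, 20.0, size=102)             # chronological ages (years)
n_bones = 11
# synthetic per-bone maturation features, noisily monotone in age
feats = ages[None, :] + rng.normal(0.0, 1.0, size=(n_bones, ages.size))

models = [np.polyfit(feats[b], ages, deg=2) for b in range(n_bones)]

def estimate_age(sample_feats):
    """Fuse per-bone predictions (here: a plain mean) into one age estimate."""
    preds = [np.polyval(models[b], sample_feats[b]) for b in range(n_bones)]
    return float(np.mean(preds))

print(estimate_age(feats[:, 0]), "vs true", ages[0])
```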
Wide-Field Imaging Telescope-0 (WIT0) with automatic observing system
NASA Astrophysics Data System (ADS)
Ji, Tae-Geun; Byeon, Seoyeon; Lee, Hye-In; Park, Woojin; Lee, Sang-Yun; Hwang, Sungyong; Choi, Changsu; Gibson, Coyne Andrew; Kuehne, John W.; Prochaska, Travis; Marshall, Jennifer L.; Im, Myungshin; Pak, Soojong
2018-01-01
We introduce Wide-Field Imaging Telescope-0 (WIT0), with an automatic observing system. It was developed for monitoring the variability of many sources at a time, e.g. young stellar objects and active galactic nuclei. It can also find the locations of transient sources such as supernovae or gamma-ray bursts. In 2017 February, we installed the wide-field 10-inch telescope (Takahashi CCA-250) as a piggyback system on the 30-inch telescope at the McDonald Observatory in Texas, US. The 10-inch telescope has a 2.35 × 2.35 deg field-of-view with a 4k × 4k CCD camera (FLI ML16803). To improve the observational efficiency of the system, we developed new automatic observing software, KAOS30 (KHU Automatic Observing Software for the McDonald 30-inch telescope), written in Visual C++ for the Windows operating system. The software consists of four control packages: the Telescope Control Package (TCP), the Data Acquisition Package (DAP), the Auto Focus Package (AFP), and the Script Mode Package (SMP). Since it also supports instruments that use the ASCOM driver, additional hardware installation is greatly simplified. We commissioned KAOS30 in 2017 August and are in the process of testing it. Based on the WIT0 experience, we will extend KAOS30 to control multiple telescopes in future projects.
SWIFT MODELLER: a Java based GUI for molecular modeling.
Mathur, Abhinav; Shankaracharya; Vidyarthi, Ambarish S
2011-10-01
MODELLER is command-line-based software that requires tedious formatting of inputs and the writing of Python scripts, which most people are not comfortable with. The visualization of output also becomes cumbersome due to verbose files. This makes the whole software protocol very complex and requires extensive study of MODELLER manuals and tutorials. Here we describe SWIFT MODELLER, a GUI that automates the formatting, scripting and data extraction processes and presents them in an interactive way, making MODELLER much easier to use than before. The screens in SWIFT MODELLER are designed with homology modeling in mind, and their flow is a depiction of its steps. It eliminates the formatting of inputs, scripting processes and analysis of verbose output files through automation, making the pasting of the target sequence the only prerequisite. Jmol (a 3D structure visualization tool) has been integrated into the GUI, which opens and displays the protein data bank files created by the MODELLER software. All files required and created by the software are saved in a folder named after the work instance's date and time of execution. SWIFT MODELLER lowers the skill level required for the software through automation of many of the steps in the original software protocol, thus saving an enormous amount of time per instance and making MODELLER very easy to work with.
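For context, the kind of script SWIFT MODELLER writes and runs for the user is the standard MODELLER comparative-modeling recipe; the alignment, template and sequence names below are placeholders, and a licensed MODELLER installation is required.

```python
# Canonical MODELLER homology-modeling script (placeholder file/code names).
from modeller import *
from modeller.automodel import automodel

env = environ()
env.io.atom_files_directory = ['.']          # where template PDB files live

a = automodel(env,
              alnfile='target_template.ali', # PIR alignment (placeholder name)
              knowns='template',             # code of the template structure
              sequence='target')             # code of the target sequence
a.starting_model = 1
a.ending_model = 5                           # build five candidate models
a.make()
```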
45 CFR 95.617 - Software and ownership rights.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 45 Public Welfare 1 2010-10-01 2010-10-01 false Software and ownership rights. 95.617 Section 95.617 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION GENERAL... PROGRAMS) Automatic Data Processing Equipment and Services-Conditions for Federal Financial Participation...
Automatic Requirements Specification Extraction from Natural Language (ARSENAL)
2014-10-01
Natural language supports the communication of technical descriptions between the various stakeholders (e.g., customers, designers, implementers) involved in the design of software systems. However, natural language descriptions can be informal, incomplete, and imprecise. The report also discusses the accuracy of the natural language processing stage, the degree of automation, and robustness to noise.
Technical Support | Division of Cancer Prevention
To view the live webinar, you will need to have the software, Microsoft Live Meeting, downloaded onto your computer before the event. In most cases, the software will automatically download when you open the program on your system. However, in the event that you need to download it manually, you can access the software at the link below: Download the Microsoft Office Live
An Analysis of Botnet Vulnerabilities
2007-06-01
Currently, the primary defense against botnets is prompt patching of vulnerable systems and antivirus software; network monitoring can also identify infections. Although vulnerabilities in IRCd software were sought, none were identified during this effort. Bots are software agents designed to automatically perform tasks; examples include web spiders that catalog the Internet and bots found in popular online games.
Continuous integration and quality control for scientific software
NASA Astrophysics Data System (ADS)
Neidhardt, A.; Ettl, M.; Brisken, W.; Dassing, R.
2013-08-01
Modern software has to be stable, portable, fast and reliable. This is becoming more and more important for scientific software as well, but it requires a sophisticated way to inspect, check and evaluate the quality of source code with a suitable, automated infrastructure. A centralized server with a software repository and a version control system is one essential part, used to manage the code base and to control the different development versions. While each project can be compiled separately, the whole code base can also be compiled with one central “Makefile”. This is used to create automated, nightly builds. Additionally, all sources are inspected automatically with static code analysis and inspection tools, which check for well-known error situations, memory and resource leaks, performance issues, and style issues. In combination with an automatic documentation generator, it is possible to create the developer documentation directly from the code and the inline comments. All reports and generated information are presented as HTML pages on a web server. Because this environment has increased the stability and quality of the software of the Geodetic Observatory Wettzell tremendously, it is now also available to scientific communities. One regular user is already the developer group of the DiFX software correlator project.
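A nightly-build driver of the kind described can be sketched in a few lines; the tool names (make, cppcheck), paths and report location below are assumptions for illustration, not the Wettzell setup.

```python
# Minimal nightly-build sketch: compile the code base, run a static analyzer,
# and publish an HTML summary on the web server.
import datetime
import html
import subprocess

def run(cmd):
    p = subprocess.run(cmd, capture_output=True, text=True)
    return p.returncode, p.stdout + p.stderr

steps = {
    "build": ["make", "-C", "/srv/codebase"],                    # central Makefile
    "static analysis": ["cppcheck", "--enable=all", "/srv/codebase/src"],
}

rows = []
for name, cmd in steps.items():
    rc, out = run(cmd)
    status = "OK" if rc == 0 else "FAILED"
    rows.append(f"<tr><td>{name}</td><td>{status}</td>"
                f"<td><pre>{html.escape(out[-2000:])}</pre></td></tr>")

with open("/var/www/html/nightly.html", "w") as f:
    f.write(f"<h1>Nightly build {datetime.date.today()}</h1>"
            f"<table border=1>{''.join(rows)}</table>")
```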
Automatic target detection using binary template matching
NASA Astrophysics Data System (ADS)
Jun, Dong-San; Sun, Sun-Gu; Park, HyunWook
2005-03-01
This paper presents a new automatic target detection (ATD) algorithm to detect targets such as battle tanks and armored personnel carriers in ground-to-ground scenarios. Whereas most ATD algorithms have been developed for forward-looking infrared (FLIR) images, we have developed an ATD algorithm for charge-coupled device (CCD) images, which have superior quality to FLIR images in daylight. The proposed algorithm uses fast binary template matching with an adaptive binarization, which is robust to the various lighting conditions in CCD images and saves computation time. Experimental results show that the proposed method has good detection performance.
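The two core steps, adaptive binarization and binary template matching, might look as follows for grayscale numpy images; the window size, bias and agreement score are illustrative choices, not the paper's exact parameters.

```python
# Sketch: local-mean adaptive threshold, then binary template matching scored
# by the fraction of agreeing pixels at each offset.
import numpy as np
from scipy.ndimage import uniform_filter

def adaptive_binarize(img, win=15, bias=5.0):
    """A pixel is 'on' when brighter than its local mean plus a small bias,
    which keeps the result stable under varying illumination."""
    local_mean = uniform_filter(img.astype(float), size=win)
    return (img > local_mean + bias).astype(np.uint8)

def match_binary_template(binary, template):
    """Return the offset where template and image agree on the most pixels."""
    th, tw = template.shape
    best, best_pos = -1.0, (0, 0)
    for y in range(binary.shape[0] - th + 1):
        for x in range(binary.shape[1] - tw + 1):
            patch = binary[y:y+th, x:x+tw]
            score = np.mean(patch == template)   # agreement on on/off pixels
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos, best
```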
Solar-Powered Water Distillation
NASA Technical Reports Server (NTRS)
Menninger, F. J.; Elder, R. J.
1985-01-01
Solar-powered still produces pure water at rate of 6,000 gallons per year. Still fully automatic and gravity-fed. Only outside electric power is timer clock and solenoid-operated valve. Still saves $5,000 yearly in energy costs and pays for itself in 3 1/2 years.
Higher-order neural network software for distortion invariant object recognition
NASA Technical Reports Server (NTRS)
Reid, Max B.; Spirkovska, Lilly
1991-01-01
The state-of-the-art in pattern recognition for such applications as automatic target recognition and industrial robotic vision relies on digital image processing. We present a higher-order neural network model and software which performs the complete feature extraction-pattern classification paradigm required for automatic pattern recognition. Using a third-order neural network, we demonstrate complete, 100 percent accurate invariance to distortions of scale, position, and in-plane rotation. In a higher-order neural network, feature extraction is built into the network and does not have to be learned; only the relatively simple classification step must be learned. This is key to achieving very rapid training. The training set is much smaller than with standard neural network software because the higher-order network only has to be shown one view of each object to be learned, not every possible view. The software and graphical user interface run on any Sun workstation. Results of the use of the neural software in autonomous robotic vision systems are presented. Such a system could have extensive application in robotic manufacturing.
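The third-order idea can be sketched directly: every triple of 'on' pixels contributes a feature indexed by the binned, sorted interior angles of the triangle it forms, and since those angles are unchanged by translation, scaling and in-plane rotation, the invariance is built in rather than learned. This is an illustration of the principle, not the authors' code.

```python
# Sketch of translation/scale/rotation-invariant third-order features.
import itertools
import math
import numpy as np

def triangle_angles(a, b, c):
    def ang(p, q, r):                       # interior angle at vertex p
        v1, v2 = q - p, r - p
        d = np.linalg.norm(v1) * np.linalg.norm(v2)
        if d == 0:
            return 0.0
        return math.acos(np.clip(np.dot(v1, v2) / d, -1.0, 1.0))
    return ang(a, b, c), ang(b, a, c), ang(c, a, b)

def invariant_features(img, nbins=18):
    pts = [p.astype(float) for p in np.argwhere(img > 0)]
    feats = {}
    for a, b, c in itertools.combinations(pts, 3):
        key = tuple(sorted(min(int(t / math.pi * nbins), nbins - 1)
                           for t in triangle_angles(a, b, c)))
        feats[key] = feats.get(key, 0) + 1
    return feats   # a simple perceptron over these keys learns the classes
```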
NASA Astrophysics Data System (ADS)
Zollo, Aldo
2016-04-01
RISS S.r.l. is a spin-off company recently born from the initiative of the research group constituting the Seismology Laboratory of the Department of Physics of the University of Naples Federico II. RISS is an innovative start-up, based on the decade-long experience of its members in earthquake monitoring systems and seismic data analysis, whose major goal is to transform the most recent innovations of scientific research into technological products and prototypes. With this aim, RISS has recently started the development of new software, which is an elegant solution for managing and analysing seismic data and for creating automatic earthquake bulletins. The software was initially developed to manage data recorded at the ISNet network (Irpinia Seismic Network), a network of seismic stations deployed in the Southern Apennines along the active fault system responsible for the 1980, November 23, MS 6.9 Irpinia earthquake. The software, however, is fully exportable and can be used to manage data from different networks, with any kind of station geometry or network configuration, and is able to provide reliable estimates of earthquake source parameters whatever the background seismicity level of the area of interest. Here we present the real-time automated procedures and the analyses performed by the software package, which is essentially a chain of different modules, each of them aimed at the automatic computation of a specific source parameter. The P-wave arrival times are first detected on the real-time streaming of data, and then the software performs the phase association and event binding. As soon as an event is automatically detected by the binder, the earthquake location coordinates and the origin time are rapidly estimated using a probabilistic, non-linear exploration algorithm. The software is then able to automatically provide three different magnitude estimates. First, the local magnitude (Ml) is computed using the peak-to-peak amplitude of the equivalent Wood-Anderson displacement recordings. The moment magnitude (Mw) is then estimated from the inversion of displacement spectra. The duration magnitude (Md) is rapidly computed, based on a simple and automatic measurement of the seismic wave coda duration. Starting from the magnitude estimates, other relevant pieces of information are also computed, such as the corner frequency, the seismic moment, the source radius and the seismic energy. Ground-shaking maps on a Google map are produced for peak ground acceleration (PGA), peak ground velocity (PGV) and instrumental intensity (in SHAKEMAP® format), or a plot of the measured peak ground values. Furthermore, based on a specific decisional scheme, the software automatically discriminates between local earthquakes occurring within the network and regional/teleseismic events occurring outside the network. Finally, for the largest events, if a sufficient number of P-wave polarity readings are available, the focal mechanism is also computed. For each event, all of the available pieces of information are stored in a local database, and the results of the automatic analyses are published on an interactive web page. "The Bulletin" shows a map with event location and stations, as well as a table listing all the events with the associated parameters. The catalogue fields are the event ID, the origin date and time, latitude, longitude, depth, Ml, Mw, Md, the number of triggered stations, the S-displacement spectra, and shaking maps.
Some of these entries also provide additional information, such as the focal mechanism (when available). The picked traces are uploaded to the database, and from the web interface of the Bulletin the traces can be downloaded for more specific analysis. This innovative software represents a smart solution, with a friendly and interactive interface, for high-level analysis of seismic data, and it may represent a relevant tool not only for seismologists but also for non-expert external users who are interested in seismological data. The software is a valid tool for the automatic analysis of background seismicity at different time scales and can be a relevant tool for the monitoring of both natural and induced seismicity.
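As a concrete illustration of the magnitude modules, the sketch below computes Ml from the peak Wood-Anderson amplitude using the standard Hutton and Boore (1987) relation, and Md from coda duration with placeholder coefficients; the ISNet software presumably uses its own regional calibrations, so treat both parameter sets as assumptions.

```python
# Sketch of Ml and Md computations (coefficients illustrative; see lead-in).
import math

def local_magnitude(wa_amp_mm, hypo_dist_km):
    """Ml from peak Wood-Anderson displacement amplitude (mm), Hutton & Boore."""
    return (math.log10(wa_amp_mm)
            + 1.110 * math.log10(hypo_dist_km / 100.0)
            + 0.00189 * (hypo_dist_km - 100.0) + 3.0)

def duration_magnitude(coda_s, hypo_dist_km, a=-0.87, b=2.0, c=0.0035):
    """Md from coda duration (s); a, b, c are placeholder calibration terms."""
    return a + b * math.log10(coda_s) + c * hypo_dist_km

print(local_magnitude(wa_amp_mm=1.0, hypo_dist_km=100.0))  # 3.0 by definition
```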
Aquila, Iolanda; González, Ariana; Fernández-Golfín, Covadonga; Rincón, Luis Miguel; Casas, Eduardo; García, Ana; Hinojar, Rocio; Jiménez-Nacher, José Julio; Zamorano, José Luis
2016-05-17
3D transesophageal echocardiography (TEE) is superior to 2D TEE in quantitative anatomic evaluation of the mitral valve (MV) but shows limitations regarding automatic quantification. Here, we tested the inter- and intra-observer reproducibility of novel fully automated software in the evaluation of MV anatomy compared to manual 3D assessment. Thirty-six out of 61 screened patients referred to our Cardiac Imaging Unit for TEE were retrospectively included. 3D TEE analysis was performed both manually and with the automated software by two independent operators. Mitral annular area, intercommissural distance, anterior leaflet length and posterior leaflet length were assessed. A significant correlation between both methods was found for all variables: intercommissural diameter (r = 0.84, p < 0.01), mitral annular area (r = 0.94, p < 0.01), anterior leaflet length (r = 0.83, p < 0.01) and posterior leaflet length (r = 0.67, p < 0.01). Interobserver variability assessed by the intraclass correlation coefficient was superior for the automatic software: intercommissural distance 0.997 vs. 0.76; mitral annular area 0.957 vs. 0.858; anterior leaflet length 0.963 vs. 0.734; and posterior leaflet length 0.936 vs. 0.838. Intraobserver variability was good for both methods, with a better level of agreement with the automatic software. The novel 3D automated software is reproducible in MV anatomy assessment. The incorporation of this new tool in clinical MV assessment may improve patient selection and outcomes for MV interventions as well as patient diagnosis and prognosis stratification. Yet, high-quality 3D images are indispensable.
Oliveira, M; Lopez, G; Geambastiani, P; Ubeda, C
2018-05-01
A quality assurance (QA) program is a valuable tool for the continuous production of optimal-quality images. The aim of this paper is to assess newly developed automatic computer software for image quality (IQ) evaluation in fluoroscopy X-ray systems. Test object images were acquired using one fluoroscopy system, a Siemens Axiom Artis model (Siemens AG, Medical Solutions, Erlangen, Germany). The software was developed as an ImageJ plugin. Two image quality parameters were assessed: high-contrast spatial resolution (HCSR) and signal-to-noise ratio (SNR). The times required by the manual and automatic image quality assessment procedures were compared. The paired t-test was used to assess the data. p values of less than 0.05 were considered significant. The Fluoro-QC software generated faster IQ evaluation results (mean = 0.31 ± 0.08 min) than the manual procedure (mean = 4.68 ± 0.09 min). The mean difference between techniques was 4.36 min. Discrepancies were identified in the region of interest (ROI) areas drawn manually, with evidence of user dependence. The new software presented the results of two tests (HCSR = 3.06, SNR = 5.17) and also collected information from the DICOM header. Significant differences were not identified between manual and automatic measures of SNR (p value = 0.22) and HCSR (p value = 0.46). The Fluoro-QC software is a feasible, fast and free-to-use method for evaluating image quality parameters on fluoroscopy systems. Copyright © 2017 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.
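An automated SNR test of this kind reduces to fixed-position ROI statistics once the phantom geometry is known, which is precisely what removes the user dependence seen in the manually drawn ROIs. A sketch with placeholder ROI coordinates and one common SNR definition (definitions vary between protocols):

```python
# Sketch of an automated SNR measurement on a test-object image.
import numpy as np

def snr(img, signal_roi, background_roi):
    (y0, y1, x0, x1) = signal_roi
    (by0, by1, bx0, bx1) = background_roi
    signal = img[y0:y1, x0:x1].mean()
    bg = img[by0:by1, bx0:bx1]
    return (signal - bg.mean()) / bg.std(ddof=1)

img = np.random.default_rng(1).normal(100.0, 5.0, (512, 512))
img[200:260, 200:260] += 40.0                    # synthetic test-object insert
print(snr(img, (200, 260, 200, 260), (30, 90, 30, 90)))
```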
3D Orbit Visualization for Earth-Observing Missions
NASA Technical Reports Server (NTRS)
Jacob, Joseph C.; Plesea, Lucian; Chafin, Brian G.; Weiss, Barry H.
2011-01-01
This software visualizes orbit paths for the Orbiting Carbon Observatory (OCO), but was designed to be general and applicable to any Earth-observing mission. The software uses the Google Earth user interface to provide a visual mechanism to explore spacecraft orbit paths, ground footprint locations, and local cloud cover conditions. In addition, a drill-down capability allows users to point and click on a particular observation frame to pop up ancillary information such as data product filenames and directory paths, latitude, longitude, time stamp, column-average dry air mole fraction of carbon dioxide, and solar zenith angle. This software can be integrated with the ground data system for any Earth-observing mission to automatically generate daily orbit path data products in Google Earth KML format. These KML data products can be directly loaded into the Google Earth application for interactive 3D visualization of the orbit paths for each mission day. Each time the application runs, the daily orbit paths are encapsulated in a KML file for each mission day since the last time the application ran. Alternatively, the daily KML for a specified mission day may be generated. The application automatically extracts the spacecraft position and ground footprint geometry as a function of time from a daily Level 1B data product created and archived by the mission's ground data system software. In addition, ancillary data, such as the column-averaged dry air mole fraction of carbon dioxide and solar zenith angle, are automatically extracted from a Level 2 mission data product. Zoom, pan, and rotate capabilities are provided through the standard Google Earth interface. Cloud cover is indicated with an image layer from the MODIS (Moderate Resolution Imaging Spectroradiometer) aboard the Aqua satellite, which is automatically retrieved from JPL's OnEarth Web service.
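Generating the daily KML product is mostly a matter of wrapping the ground track in a KML LineString; a sketch with a synthetic track follows (the real application reads positions from the Level 1B product).

```python
# Sketch: write a day's orbit track as a KML LineString loadable in Google Earth.
KML = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>{name}</name>
      <LineString>
        <tessellate>1</tessellate>
        <coordinates>{coords}</coordinates>
      </LineString>
    </Placemark>
  </Document>
</kml>"""

def orbit_kml(name, track):
    """track: iterable of (lon_deg, lat_deg, alt_m) spacecraft positions."""
    coords = " ".join(f"{lon:.4f},{lat:.4f},{alt:.0f}" for lon, lat, alt in track)
    return KML.format(name=name, coords=coords)

track = [(-120.0 + i, -30.0 + 0.5 * i, 705000.0) for i in range(60)]  # synthetic
with open("orbit_day001.kml", "w") as f:
    f.write(orbit_kml("OCO orbit, day 1", track))
```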
NASA Technical Reports Server (NTRS)
Larsen, D. Gail; Schwieder, Paul R.
1993-01-01
Network video conferencing is advancing rapidly throughout the nation, and the Idaho National Engineering Laboratory (INEL), a Department of Energy (DOE) facility, is at the forefront of the development. Engineers at INEL/EG&G designed and installed a unique DOE videoconferencing system offering many outstanding features, including true multipoint conferencing, user-friendly design and operation with no full-time operators required, and the potential for cost-effective expansion of the system. One area where INEL/EG&G engineers made a significant contribution to video conferencing was in the development of effective, user-friendly, end-station-driven scheduling software. A PC at each user site is used to schedule conferences via a windows package. This software interface provides information to the users concerning conference availability, scheduling, initiation, and termination. The menus are mouse-controlled. Once a conference is scheduled, a workstation at the hub monitors the network to initiate all scheduled conferences. No active operator participation is required once a user schedules a conference through the local PC; the workstation automatically initiates and terminates the conference as scheduled. As each conference is scheduled, hard-copy notification is also printed at each participating site. Video conferencing is the wave of the future. The use of these user-friendly systems will save millions in lost productivity and travel costs throughout the nation. Ease of operation and conference scheduling will play a key role in the extent to which industry uses this new technology. INEL/EG&G has developed a prototype scheduling system for both commercial and federal government use.
Low Cost Efficient Deliverying Video Surveillance Service to Moving Guard for Smart Home.
Gualotuña, Tatiana; Macías, Elsa; Suárez, Álvaro; C, Efraín R Fonseca; Rivadeneira, Andrés
2018-03-01
Low-cost video surveillance systems are attractive for Smart Home applications (especially in emerging economies). Those systems use the flexibility of the Internet of Things to operate the video camera only when an intrusion is detected. We are the only ones who focus on the design of protocols based on intelligent agents to communicate the video of an intrusion in real time to the guards over wireless or mobile networks. The goal is to communicate the video, in real time, to guards who may be moving towards the smart home. However, this communication suffers from sporadic disruptions that hinder control and drastically reduce user satisfaction and system operability. In a novel way, we have designed a generic software architecture based on design patterns that can be adapted to any hardware in a simple way. The hardware deployed is of very low economic cost; the software frameworks are free. In experimental tests we have shown that it is possible to deliver intrusion notifications (by e-mail and by instant messaging) and the first video frames to the moving guard in less than 20 s. In addition, we automatically recovered the video frames lost in the disruptions in a way transparent to the user, we supported vertical handover processes, and we could save energy in the smartphone's battery. Most important, however, was the high satisfaction of the people who used the system.
Wellness engineering for better quality of life of aging baby boomer
NASA Astrophysics Data System (ADS)
Szu, Harold
2007-04-01
The current health care system serving 78M aging baby boomers is no longer sustainable: its cost, about 1/5 of GDP, will reach 1/4 of GDP when all boomers have retired in the coming decades, unless the system is changed. We design a high-tech safety net to enhance the timeliness of early, correct treatment execution (the lack of which causes roughly 1/4 of mortality, along with an escalating waste in legal fees). We follow the common sense that "a stitch in time saves nine," and adopt military surveillance know-how in designing an early-warning health management system comprising smart sensor pairs for point-of-care surveillance. However, a grand plan of affordable smart sensor hardware for households requires ODM and OEM teaming to conduct a parallel-design and sequential-marketing strategy. A military software strategy for combating a treacherous adversary matches well with point-of-care surveillance that must overcome real-world microorganism variability. Moreover, such smart military software provides self-referenced change detection, not by a traditional cohort ensemble average, but by an individual's own higher-order statistics (HOS) independent component analysis (ICA), which takes advantage of the known initial condition for each individual and desirable oversampling of daily dynamics. The triggering of a warning follows military algorithms comprising Receiver Operating Characteristics (ROC) and Automatic Target Recognition (ATR). To further reduce the unwanted false-negative rate, a benchmark is made against the traditional cohort-ensemble baseline average and the upper and lower bounds of variance as adopted by the gatekeepers: Medical Doctors (MDs) and nurses.
Recce NG: from Recce sensor to image intelligence (IMINT)
NASA Astrophysics Data System (ADS)
Larroque, Serge
2001-12-01
Recce NG (Reconnaissance New Generation) is presented as a complete and optimized tactical reconnaissance system. Based on a new-generation pod integrating high-resolution dual-band sensors, the system has been designed with the operational lessons learnt from the recent peace-keeping operations in Bosnia and Kosovo. The technical solutions retained as component modules of a full IMINT acquisition system take advantage of the state of the art in the following key technologies: an advanced mission planning system for long-range stand-off manned reconnaissance and aircraft and/or pod tasking, operating sophisticated back-up software tools, high-resolution 3D geo data and improved, combat-proven MMI to reduce planning delays; mature dual-band sensor technology to achieve the day-and-night reconnaissance mission, including advanced automatic operational functions such as azimuth and roll tracking capabilities; low risk in pod integration and in carrier avionics, controls and displays upgrades, to save time in operational turnover and maintenance; a high-rate imagery downlink for real-time or near-real-time transmission, fully compatible with STANAG 7085 requirements; and an advanced IMINT exploitation ground segment, combat proven and NATO interoperable (STANAG 7023), integrating high-value software tools for accurate location, improved radiometric image processing and an open link to C4ISR systems. The choice of an industrial prime contractor with mastery across the full system and all of the key products and technologies listed above is mandatory for successful delivery in terms of low cost, risk and time schedule.
The unbalanced signal measuring of automotive brake drum
NASA Astrophysics Data System (ADS)
Wang, Xiao-Dong; Ye, Sheng-Hua; Zhang, Bang-Cheng
2005-04-01
For the purpose of the research and development of an automatic balancing system based on mass removal, this paper deals with the measuring method for the unbalance signal and the design of the automatic balancing equipment and its software. The paper focuses on the testing system of the balancer for automotive brake drums. A band-pass filter with good automatic frequency-tracking capability, filtering performance and stability is designed. An automatic mass-removal balancing system based on virtual instrumentation is designed, and a laboratory system has been constructed. The results of comparison experiments indicate the notable effect of single-plane automatic balancing and the high precision of dynamic balancing, demonstrating the application value of the system.
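The core measurement, extracting the once-per-revolution unbalance component, can be sketched by synchronous demodulation against the rotation frequency; the signal below is synthetic and the sampling parameters are illustrative.

```python
# Sketch: recover amplitude and phase of the 1x (once-per-rev) component,
# which is the quantity single-plane balancing acts on.
import numpy as np

fs, f_rot = 5000.0, 25.0                   # sample rate and rotation freq (Hz)
t = np.arange(0, 2.0, 1.0 / fs)            # 2 s = an integer number of revs
signal = (0.8 * np.cos(2 * np.pi * f_rot * t + 0.6)
          + 0.3 * np.random.default_rng(2).normal(size=t.size))

ref = np.exp(-1j * 2 * np.pi * f_rot * t)  # reference rotor phasor
phasor = 2.0 * np.mean(signal * ref)       # complex 1x component
amp, phase = np.abs(phasor), np.angle(phasor)
print(f"unbalance amplitude {amp:.3f}, angle {np.degrees(phase):.1f} deg")
# amp scales with the unbalance mass; phase locates where to remove material
```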
Model Checking Satellite Operational Procedures
NASA Astrophysics Data System (ADS)
Cavaliere, Federico; Mari, Federico; Melatti, Igor; Minei, Giovanni; Salvo, Ivano; Tronci, Enrico; Verzino, Giovanni; Yushtein, Yuri
2011-08-01
We present a model checking approach for the automatic verification of satellite operational procedures (OPs). Building a model for a system as complex as a satellite is a hard task. We overcome this obstacle by using a suitable simulator (SIMSAT) for the satellite. Our approach aims at improving OP quality assurance by automatic exhaustive exploration of all possible simulation scenarios. Moreover, our solution decreases OP verification costs by using a model checker (CMurphi) to automatically drive the simulator. We model OPs as user-executed programs observing the simulator telemetries and sending telecommands to the simulator. In order to assess the feasibility of our approach, we present experimental results on a simple meaningful scenario. Our results show that we can save up to 90% of verification time.
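The exploration strategy can be sketched as a breadth-first search over simulator states, with the checker choosing telecommands and testing telemetry invariants in every reached state; the simulator interface below is a hypothetical stand-in for SIMSAT, and states are assumed hashable.

```python
# Sketch of exhaustive exploration of all simulation scenarios.
from collections import deque

def explore(sim, initial, commands, invariant):
    """Breadth-first search over all reachable simulator states.
    sim.step(state, cmd) and sim.telemetry(state) are stand-in interfaces."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        if not invariant(sim.telemetry(state)):
            return state                      # counterexample found
        for cmd in commands(state):
            nxt = sim.step(state, cmd)        # deterministic simulator step
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return None                               # invariant holds everywhere
```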
NASA Technical Reports Server (NTRS)
2014-01-01
Topics covered include: Innovative Software Tools Measure Behavioral Alertness; Miniaturized, Portable Sensors Monitor Metabolic Health; Patient Simulators Train Emergency Caregivers; Solar Refrigerators Store Life-Saving Vaccines; Monitors Enable Medication Management in Patients' Homes; Handheld Diagnostic Device Delivers Quick Medical Readings; Experiments Result in Safer, Spin-Resistant Aircraft; Interfaces Visualize Data for Airline Safety, Efficiency; Data Mining Tools Make Flights Safer, More Efficient; NASA Standards Inform Comfortable Car Seats; Heat Shield Paves the Way for Commercial Space; Air Systems Provide Life Support to Miners; Coatings Preserve Metal, Stone, Tile, and Concrete; Robots Spur Software That Lends a Hand; Cloud-Based Data Sharing Connects Emergency Managers; Catalytic Converters Maintain Air Quality in Mines; NASA-Enhanced Water Bottles Filter Water on the Go; Brainwave Monitoring Software Improves Distracted Minds; Thermal Materials Protect Priceless, Personal Keepsakes; Home Air Purifiers Eradicate Harmful Pathogens; Thermal Materials Drive Professional Apparel Line; Radiant Barriers Save Energy in Buildings; Open Source Initiative Powers Real-Time Data Streams; Shuttle Engine Designs Revolutionize Solar Power; Procedure-Authoring Tool Improves Safety on Oil Rigs; Satellite Data Aid Monitoring of Nation's Forests; Mars Technologies Spawn Durable Wind Turbines; Programs Visualize Earth and Space for Interactive Education; Processor Units Reduce Satellite Construction Costs; Software Accelerates Computing Time for Complex Math; Simulation Tools Prevent Signal Interference on Spacecraft; Software Simplifies the Sharing of Numerical Models; Virtual Machine Language Controls Remote Devices; Micro-Accelerometers Monitor Equipment Health; Reactors Save Energy, Costs for Hydrogen Production; Cameras Monitor Spacecraft Integrity to Prevent Failures; Testing Devices Garner Data on Insulation Performance; Smart Sensors Gather Information for Machine Diagnostics; Oxygen Sensors Monitor Bioreactors and Ensure Health and Safety; Vision Algorithms Catch Defects in Screen Displays; and Deformable Mirrors Capture Exoplanet Data, Reflect Lasers.
Abstracts Produced Using Computer Assistance.
ERIC Educational Resources Information Center
Craven, Timothy C.
2000-01-01
Describes an experiment that evaluated features of TEXNET abstracting software, compared the use of keywords and phrases that were automatically extracted, tested hypotheses about relations between abstractors' backgrounds and their reactions to abstracting assistance software, and obtained ideas for further features to be developed in TEXNET.…
A Summary of the Naval Postgraduate School Research Program
1989-08-30
Topics include: Fundamental Theory for Automatically Combining Changes to Software Systems; a Database-System Approach to Software Engineering Environments (SEEs); Multilevel Database Security; Temporal Database Management and Real-Time Database Computers; and the Multi-lingual, Multi-Model, Multi-Backend Database...
The development of automated behavior analysis software
NASA Astrophysics Data System (ADS)
Jaana, Yuki; Prima, Oky Dicky A.; Imabuchi, Takashi; Ito, Hisayoshi; Hosogoe, Kumiko
2015-03-01
The measurement of behavior for participants in a conversation scene involves verbal and nonverbal communication. Measurement validity may vary across observers owing to factors such as human error, poorly designed measurement systems, and inadequate observer training. Although some systems have been introduced in previous studies to measure behaviors automatically, these systems prevent participants from talking in a natural way. In this study, we propose a software application program to automatically analyze behaviors of the participants, including utterances, facial expressions (happy or neutral), head nods, and poses, using only a single omnidirectional camera. The camera is small enough to be embedded into a table, allowing participants to have a spontaneous conversation. The proposed software utilizes facial feature tracking based on a constrained local model to observe the changes of the facial features captured by the camera, and the Japanese Female Facial Expression database to recognize expressions. Our experimental results show significant correlations between measurements obtained by the observers and by the software.
Using PAFEC as a preprocessor for COSMIC/NASTRAN
NASA Technical Reports Server (NTRS)
Gray, W. H.; Baudry, T. V.
1983-01-01
Programs for Automatic Finite Element Calculations (PAFEC) is a general-purpose, three-dimensional, linear and nonlinear finite element program (ref. 1). PAFEC's features include free-format input utilizing engineering keywords, powerful mesh generating facilities, sophisticated database management procedures, and extensive data validation checks. Presented here is a description of a software interface that permits PAFEC to be used as a preprocessor for COSMIC/NASTRAN. This user-friendly software, called PAFCOS, frees the stress analyst from the laborious and error-prone procedure of creating and debugging a rigid-format COSMIC/NASTRAN bulk data deck. By interactively creating and debugging a finite element model with PAFEC, thus taking full advantage of the free-format, engineering-keyword-oriented data structure of PAFEC, the amount of time spent during model generation can be drastically reduced. The PAFCOS software automatically converts a PAFEC data structure into a COSMIC/NASTRAN bulk data deck. The capabilities and limitations of the PAFCOS software are fully discussed in this report.
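The flavor of the PAFCOS translation can be sketched for the simplest case, emitting NASTRAN small-field bulk data (8-character fields) for a node list; a real converter must of course also map elements, properties and loads.

```python
# Sketch: write NASTRAN small-field GRID cards (8-character fields) for nodes.
def grid_card(nid, x, y, z, cp=""):
    fields = ["GRID", str(nid), str(cp), f"{x:.4G}", f"{y:.4G}", f"{z:.4G}"]
    return "".join(f"{f:<8.8}" for f in fields)   # pad/truncate each field to 8

nodes = [(1, 0.0, 0.0, 0.0), (2, 10.0, 0.0, 0.0), (3, 10.0, 5.0, 0.0)]
with open("model.bdf", "w") as f:
    f.write("BEGIN BULK\n")
    for nid, x, y, z in nodes:
        f.write(grid_card(nid, x, y, z) + "\n")
    f.write("ENDDATA\n")
```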
Verification and Validation in a Rapid Software Development Process
NASA Technical Reports Server (NTRS)
Callahan, John R.; Easterbrook, Steve M.
1997-01-01
The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in the use of new methods in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing of the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.
Evaluation of the efficiency and fault density of software generated by code generators
NASA Technical Reports Server (NTRS)
Schreur, Barbara
1993-01-01
Flight computers and flight software are used for GN&C (guidance, navigation, and control), engine controllers, and avionics during missions. Software development requires the generation of a considerable amount of code. The engineers who generate the code make mistakes, and the generation of a large body of code with high reliability requires considerable time. Computer-aided software engineering (CASE) tools are available which generate code automatically from inputs supplied through graphical interfaces. These tools are referred to as code generators. In theory, code generators could write highly reliable code quickly and inexpensively. The various code generators offer different levels of reliability checking: some check only the finished product, while some allow checking of individual modules and combined sets of modules as well. Considering NASA's requirements for reliability, a comparison with in-house, manually generated code is needed. Furthermore, automatically generated code is reputed to be as efficient as the best manually generated code when executed. In-house verification is warranted.
Integrating Multibody Simulation and CFD: toward Complex Multidisciplinary Design Optimization
NASA Astrophysics Data System (ADS)
Pieri, Stefano; Poloni, Carlo; Mühlmeier, Martin
This paper describes the use of integrated multidisciplinary analysis and optimization of a race car model on a predefined circuit. The objective is the definition of the most efficient geometric configuration that can guarantee the lowest lap time. To carry out this study, it has been necessary to interface the design optimization software modeFRONTIER with the following software packages: CATIA v5, a three-dimensional CAD package, used for the definition of the parametric geometry; A.D.A.M.S./Motorsport, a multi-body dynamic simulation package; IcemCFD, a mesh generator, for the automatic generation of the CFD grid; and CFX, a Navier-Stokes code, for the prediction of the fluid-dynamic forces. The process integration makes it possible to compute, for each geometrical configuration, a set of aerodynamic coefficients that are then used in the multibody simulation for the computation of the lap time. Finally, an automatic optimization procedure is started and the lap time minimized. The whole process is executed on a Linux cluster running CFD simulations in parallel.
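The optimization loop being integrated can be sketched as follows, with trivial analytic stand-ins for the CAD, CFD and multibody tools (the real chain drives CATIA, IcemCFD/CFX and ADAMS through modeFRONTIER, so every function below is a hypothetical placeholder):

```python
# Sketch of the geometry -> CFD -> lap-time optimization loop.
from scipy.optimize import minimize

def build_geometry(params):        # stand-in for the CATIA parametric model
    wing_angle, ride_height = params
    return {"wing_angle": wing_angle, "ride_height": ride_height}

def cfd_coefficients(geom):        # stand-in for IcemCFD + CFX
    cd = 0.9 + 0.01 * geom["wing_angle"] ** 2
    cl = 2.5 + 0.3 * geom["wing_angle"] - 0.1 * geom["ride_height"] ** 2
    return cd, cl

def lap_time(cd, cl):              # stand-in for the ADAMS lap simulation
    return 90.0 + 5.0 * cd - 2.0 * cl

def objective(params):
    cd, cl = cfd_coefficients(build_geometry(params))
    return lap_time(cd, cl)

res = minimize(objective, x0=[2.0, 3.0], method="Nelder-Mead")
print("best parameters:", res.x, "lap time:", res.fun)
```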
Auto identification technology and its impact on patient safety in the Operating Room of the Future.
Egan, Marie T; Sandberg, Warren S
2007-03-01
Automatic identification technologies, such as bar coding and radio frequency identification, are ubiquitous in everyday life but virtually nonexistent in the operating room. User expectations, based on everyday experience with automatic identification technologies, have generated much anticipation that these systems will improve readiness, workflow, and safety in the operating room, with minimal training requirements. We report, in narrative form, a multi-year experience with various automatic identification technologies in the Operating Room of the Future Project at Massachusetts General Hospital. In each case, the additional human labor required to make these 'labor-saving' technologies function in the medical environment has proved to be their undoing. We conclude that while automatic identification technologies show promise, significant barriers to realizing their potential still exist. Nevertheless, overcoming these obstacles is necessary if the vision of an operating room of the future, in which all processes are monitored, controlled, and optimized, is to be achieved.
Pouplin, Samuel; Roche, Nicolas; Antoine, Jean-Yves; Vaugier, Isabelle; Pottier, Sandra; Figere, Marjorie; Bensmail, Djamel
2017-06-01
To determine whether activation of the frequency-of-use and automatic-learning parameters of word prediction software has an impact on text input speed. Forty-five participants with cervical spinal cord injury between C4 and C8, ASIA A or B, agreed to participate in this study. Participants were separated into two groups: a high-lesion group, for participants whose lesion level is at or above C5 (ASIA AIS A or B), and a low-lesion group, for participants whose lesion is between C6 and C8 (ASIA AIS A or B). A single evaluation session was carried out for each participant. Text input speed was evaluated during three copying tasks:
• without word prediction software (WITHOUT condition);
• with automatic learning of words and frequency of use deactivated (NOT_ACTIV condition);
• with automatic learning of words and frequency of use activated (ACTIV condition).
Results: Text input speed was significantly higher in the WITHOUT than in the NOT_ACTIV (p < 0.001) or ACTIV (p = 0.02) conditions for participants with low lesions. Text input speed was significantly higher in the ACTIV than in the NOT_ACTIV (p = 0.002) or WITHOUT (p < 0.001) conditions for participants with high lesions. Use of word prediction software with frequency of use and automatic learning activated increased text input speed in participants with high-level tetraplegia. For participants with low-level tetraplegia, using word prediction software with frequency of use and automatic learning activated only decreased the number of errors. Implications for rehabilitation: Access to technology can be difficult for persons with disabilities such as cervical spinal cord injury (SCI). Several methods have been developed to increase text input speed, such as word prediction software. This study shows that a parameter of word prediction software (frequency of use) affected text input speed in persons with cervical SCI, with effects that differed according to the level of the lesion.
• For persons with high-level lesions, our results suggest that this parameter must be activated so that text input speed is increased.
• For persons in the low-lesion group, this parameter must be activated so that the number of errors is decreased.
• In all cases, activation of the frequency-of-use parameter is essential to improve the efficiency of the word prediction software.
• Health-related professionals should use these results in their clinical practice for better results and therefore better patient satisfaction.
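The two parameters under study, frequency of use and automatic learning, can be sketched with a toy predictor: completions are ranked by use count, and the learning switch controls whether confirmed words re-weight the lexicon. This is an illustration of the concepts, not the software used in the study.

```python
# Toy word predictor with frequency-of-use ranking and automatic learning.
from collections import Counter

class WordPredictor:
    def __init__(self, lexicon, learning=True):
        self.freq = Counter(lexicon)        # word -> frequency of use
        self.learning = learning            # the "automatic learning" switch

    def predict(self, prefix, k=3):
        hits = [w for w in self.freq if w.startswith(prefix)]
        return sorted(hits, key=lambda w: -self.freq[w])[:k]

    def accept(self, word):
        """Called when the user confirms a word."""
        if self.learning:
            self.freq[word] += 1            # frequency of use updated

p = WordPredictor(["the", "there", "therapy", "wheel", "wheelchair"])
p.accept("therapy"); p.accept("therapy")
print(p.predict("the"))                     # 'therapy' now ranked first
```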
NASA Technical Reports Server (NTRS)
Chan, William M.; Akien, Edwin (Technical Monitor)
2002-01-01
For many years, the generation of overset grids for complex configurations has required the use of a number of independently developed software utilities. Results created by each step were then visualized using a separate visualization tool before moving on to the next. A new software tool called OVERGRID has been developed that allows the user to perform all the grid generation steps and visualization in one environment. OVERGRID provides grid diagnostic functions such as surface tangent and normal checks, as well as grid manipulation functions such as extraction, extrapolation, concatenation, redistribution, smoothing, and projection. Moreover, it also contains hyperbolic surface and volume grid generation modules that are specifically suited for overset grid generation. This is the first time that such a unified interface has existed for the creation of overset grids for complex geometries. New concepts for automatic overset surface grid generation around surface discontinuities are also briefly presented. Special control curves on the surface, such as intersection curves, sharp edges and open boundaries, are called seam curves. The seam curves are first automatically extracted from a multiple-panel-network description of the surface. Points where three or more seam curves meet are automatically identified and are called seam corners. Seam corner surface grids are automatically generated using a singular axis topology. Hyperbolic surface grids are then grown from the seam curves, which are automatically trimmed away from the seam corners.
Windows VPN Set Up | High-Performance Computing | NREL
Download the hpcvpn-win.conf file and save it in your My Documents folder. Configure the client software using that conf file: start the Endian Connect App, configure the connection using the hpcvpn-win.conf file, uncheck the "save password" option, and add your UserID.
Software for Testing Electroactive Structural Components
NASA Technical Reports Server (NTRS)
Moses, Robert W.; Fox, Robert L.; Dimery, Archie D.; Bryant, Robert G.; Shams, Qamar
2003-01-01
A computer program generates a graphical user interface that, in combination with its other features, facilitates the acquisition and preprocessing of experimental data on the strain response, hysteresis, and power consumption of a multilayer composite-material structural component containing one or more built-in sensors and/or actuators based on piezoelectric materials. This program runs in conjunction with LabVIEW software in a computer-controlled instrumentation system. For a test, a specimen is instrumented with applied-voltage and current sensors and with strain gauges. Once the computational connection to the test setup has been made via the LabVIEW software, this program causes the test instrumentation to step through specified configurations. If the user is satisfied with the test results as displayed by the software, the user activates an icon on a front-panel display, causing the raw current, voltage, and strain data to be digitized and saved. The data are also put into a spreadsheet and can be plotted on a graph. Graphical displays are saved in an image file for future reference. The program also computes and displays the power and the phase angle between voltage and current.
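The final computation described, power and the voltage-current phase angle, can be sketched for a single drive frequency; the synthetic signals below stand in for the digitized test data.

```python
# Sketch: phase angle and average real power from digitized v(t), i(t).
import numpy as np

fs, f0 = 50000.0, 100.0                       # sample rate, drive frequency (Hz)
t = np.arange(0, 0.5, 1.0 / fs)               # 0.5 s = integer number of cycles
v = 120.0 * np.cos(2 * np.pi * f0 * t)                 # applied voltage
i = 0.02 * np.cos(2 * np.pi * f0 * t - 1.3)            # current, lagging

ref = np.exp(-1j * 2 * np.pi * f0 * t)
V, I = 2.0 * np.mean(v * ref), 2.0 * np.mean(i * ref)  # complex amplitudes
phase = np.angle(V) - np.angle(I)                       # V-I phase angle (rad)
p_real = 0.5 * np.real(V * np.conj(I))                  # average real power (W)
print(f"phase {np.degrees(phase):.1f} deg, real power {p_real:.3f} W")
```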
Fire Protection. Honeywell Planning Guide.
ERIC Educational Resources Information Center
Honeywell, Inc., Minneapolis, Minn.
A general discussion of fire alarms and protection is provided by a manufacturer of automated monitoring and control systems. Background information describes old and new fire alarm systems, comparing system components, wage savings, and cost analysis. Different kinds of automatic systems are listed, including--(1) local system, (2) auxiliary…
Occupancy Fire Record: Schools.
ERIC Educational Resources Information Center
National Fire Protection Association, Boston, MA.
The considerations of human safety and preservation of facilities are examined in relation to school fires. Various aspects of planning which would decrease the probability of fires and thereby save life and property are reviewed and include--(1) causes, (2) automatic protection devices, (3) evacuation and fire drills, and (4) construction…
31 CFR 363.59 - What is a payroll savings plan?
Code of Federal Regulations, 2014 CFR
2014-07-01
... zero-percent certificate of indebtedness. (See Subpart D for more information about a payroll zero-percent certificate of indebtedness.) When you have accumulated a sufficient amount of payroll zero... that you selected, the TreasuryDirect® system will automatically redeem your payroll zero-percent...
Evidence Arguments for Using Formal Methods in Software Certification
NASA Technical Reports Server (NTRS)
Denney, Ewen W.; Pai, Ganesh
2013-01-01
We describe a generic approach for automatically integrating the output generated from a formal method/tool into a software safety assurance case, as an evidence argument, by (a) encoding the underlying reasoning as a safety case pattern, and (b) instantiating it using the data produced from the method/tool. We believe this approach not only improves the trustworthiness of the evidence generated from a formal method/tool, by explicitly presenting the reasoning and mechanisms underlying its genesis, but also provides a way to gauge the suitability of the evidence in the context of the wider assurance case. We illustrate our work by application to a real example-an unmanned aircraft system- where we invoke a formal code analysis tool from its autopilot software safety case, automatically transform the verification output into an evidence argument, and then integrate it into the former.
Scalable and fail-safe deployment of the ATLAS Distributed Data Management system Rucio
NASA Astrophysics Data System (ADS)
Lassnig, M.; Vigne, R.; Beermann, T.; Barisits, M.; Garonne, V.; Serfon, C.
2015-12-01
This contribution details the deployment of Rucio, the ATLAS Distributed Data Management system. The main complication is that Rucio interacts with a wide variety of external services, and connects globally distributed data centres under different technological and administrative control, at an unprecedented data volume. It is therefore not possible to create a duplicate instance of Rucio for testing or integration. Every software upgrade or configuration change is thus potentially disruptive and requires fail-safe software and automatic error recovery. Rucio uses a three-layer scaling and mitigation strategy based on quasi-realtime monitoring. This strategy mainly employs independent stateless services, automatic failover, and service migration. The technologies used for deployment and mitigation include OpenStack, Puppet, Graphite, HAProxy and Apache. In this contribution, the interplay between these components, their deployment, software mitigation, and the monitoring strategy are discussed.
Automatic counting and classification of bacterial colonies using hyperspectral imaging
USDA-ARS?s Scientific Manuscript database
Detection and counting of bacterial colonies on agar plates is a routine microbiology practice to get a rough estimate of the number of viable cells in a sample. There have been a variety of different automatic colony counting systems and software algorithms mainly based on color or gray-scale pictu...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iverson, Aaron
Ra Power Management (RPM) has developed a cloud based software platform that manages the financial and operational functions of third party financed solar projects throughout their lifecycle. RPM’s software streamlines and automates the sales, financing, and management of a portfolio of solar assets. The software helps solar developers automate the most difficult aspects of asset management, leading to increased transparency, efficiency, and reduction in human error. More importantly, our platform will help developers save money by improving their operating margins
An automatic method for segmentation of fission tracks in epidote crystal photomicrographs
NASA Astrophysics Data System (ADS)
de Siqueira, Alexandre Fioravante; Nakasuga, Wagner Massayuki; Pagamisse, Aylton; Tello Saenz, Carlos Alberto; Job, Aldo Eloizo
2014-08-01
Manual identification of fission tracks has practical problems, such as variation due to observe-observation efficiency. An automatic processing method that could identify fission tracks in a photomicrograph could solve this problem and improve the speed of track counting. However, separation of nontrivial images is one of the most difficult tasks in image processing. Several commercial and free softwares are available, but these softwares are meant to be used in specific images. In this paper, an automatic method based on starlet wavelets is presented in order to separate fission tracks in mineral photomicrographs. Automatization is obtained by the Matthews correlation coefficient, and results are evaluated by precision, recall and accuracy. This technique is an improvement of a method aimed at segmentation of scanning electron microscopy images. This method is applied in photomicrographs of epidote phenocrystals, in which accuracy higher than 89% was obtained in fission track segmentation, even for difficult images. Algorithms corresponding to the proposed method are available for download. Using the method presented here, a user could easily determine fission tracks in photomicrographs of mineral samples.
Automatic Texture Mapping of Architectural and Archaeological 3d Models
NASA Astrophysics Data System (ADS)
Kersten, T. P.; Stallmann, D.
2012-07-01
Today, detailed, complete and exact 3D models with photo-realistic textures are increasingly demanded for numerous applications in architecture and archaeology. Manual texture mapping of 3D models by digital photographs with software packages, such as Maxon Cinema 4D, Autodesk 3Ds Max or Maya, still requires a complex and time-consuming workflow. So, procedures for automatic texture mapping of 3D models are in demand. In this paper two automatic procedures are presented. The first procedure generates 3D surface models with textures by web services, while the second procedure textures already existing 3D models with the software tmapper. The program tmapper is based on the Multi Layer 3D image (ML3DImage) algorithm and developed in the programming language C++. The studies showing that the visibility analysis using the ML3DImage algorithm is not sufficient to obtain acceptable results of automatic texture mapping. To overcome the visibility problem the Point Cloud Painter algorithm in combination with the Z-buffer-procedure will be applied in the future.
Development of on line automatic separation device for apple and sleeve
NASA Astrophysics Data System (ADS)
Xin, Dengke; Ning, Duo; Wang, Kangle; Han, Yuhang
2018-04-01
Based on STM32F407 single chip microcomputer as control core, automatic separation device of fruit sleeve is designed. This design consists of hardware and software. In hardware, it includes mechanical tooth separator and three degree of freedom manipulator, as well as industrial control computer, image data acquisition card, end effector and other structures. The software system is based on Visual C++ development environment, to achieve localization and recognition of fruit sleeve with the technology of image processing and machine vision, drive manipulator of foam net sets of capture, transfer, the designated position task. Test shows: The automatic separation device of the fruit sleeve has the advantages of quick response speed and high separation success rate, and can realize separation of the apple and plastic foam sleeve, and lays the foundation for further studying and realizing the application of the enterprise production line.
Software Cost Measuring and Reporting. One of the Software Acquisition Engineering Guidebook Series.
1979-01-02
through the peripherals. How- and performance criteria), ever, his interaction is usually minimal since, by difinition , the automatic test Since TS...performs its Software estimating is still heavily intended functions properly. dependent on experienced judgement. However, quantitative methods...apply to systems of totally different can be distributed to specialists who content. The Quantitative guideline may are most familiar with the work. One
ERIC Educational Resources Information Center
Millan, Eva; Belmonte, Maria-Victoria; Ruiz-Montiel, Manuela; Gavilanes, Juan; Perez-de-la-Cruz, Jose-Luis
2016-01-01
In this paper, we present BH-ShaDe, a new software tool to assist architecture students learning the ill-structured domain/task of housing design. The software tool provides students with automatic or interactively generated floor plan schemas for basic houses. The students can then use the generated schemas as initial seeds to develop complete…
Computer-Aided Software Engineering - An approach to real-time software development
NASA Technical Reports Server (NTRS)
Walker, Carrie K.; Turkovich, John J.
1989-01-01
A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and a history of CASE is given. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.
Cascade Apartments: Deep Energy Multifamily Retrofit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gordon, A.; Mattheis, L.; Kunkle, R.
2014-02-01
In December of 2009-10, King County Housing Authority (KCHA) implemented energy retrofit improvements in the Cascade multifamily community, located in Kent, Washington (marine climate.)This research effort involved significant coordination from stakeholders KCHA, WA State Department of Commerce, utility Puget Sound Energy, and Cascade tenants. This report focuses on the following three primary BA research questions : 1. What are the modeled energy savings using DOE low income weatherization approved TREAT software? 2. How did the modeled energy savings compare with measured energy savings from aggregate utility billing analysis? 3. What is the Savings to Investment Ratio (SIR) of the retrofitmore » package after considering utility window incentives and KCHA capitol improvement funding.« less
Cascade Apartments - Deep Energy Multifamily Retrofit , Kent, Washington (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
2014-02-01
In December of 2009-10, King County Housing Authority (KCHA) implemented energy retrofit improvements in the Cascade multifamily community, located in Kent, Washington (marine climate.)This research effort involved significant coordination from stakeholders KCHA, WA State Department of Commerce, utility Puget Sound Energy, and Cascade tenants. This report focuses on the following three primary BA research questions : 1. What are the modeled energy savings using DOE low income weatherization approved TREAT software? 2. How did the modeled energy savings compare with measured energy savings from aggregate utility billing analysis? 3. What is the Savings to Investment Ratio (SIR) of the retrofitmore » package after considering utility window incentives and KCHA capitol improvement funding.« less
Cascade Apartments: Deep Energy Multifamily Retrofit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gordon, A.; Mattheis, L.; Kunkle, R.
2014-02-01
In December of 2009-10, King County Housing Authority (KCHA) implemented energy retrofit improvements in the Cascade multifamily community, located in Kent, Washington (marine climate.)This research effort involved significant coordination from stakeholders KCHA, WA State Department of Commerce, utility Puget Sound Energy, and Cascade tenants. This report focuses on the following three primary BA research questions: 1. What are the modeled energy savings using DOE low income weatherization approved TREAT software? 2. How did the modeled energy savings compare with measured energy savings from aggregate utility billing analysis? 3. What is the Savings to Investment Ratio (SIR) of the retrofit packagemore » after considering utility window incentives and KCHA capitol improvement funding.« less
Infrared Sky Imager (IRSI) Instrument Handbook
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morris, Victor R.
2016-04-01
The Infrared Sky Imager (IRSI) deployed at the Atmospheric Radiation Measurement (ARM) Climate Research Facility is a Solmirus Corp. All Sky Infrared Visible Analyzer. The IRSI is an automatic, continuously operating, digital imaging and software system designed to capture hemispheric sky images and provide time series retrievals of fractional sky cover during both the day and night. The instrument provides diurnal, radiometrically calibrated sky imagery in the mid-infrared atmospheric window and imagery in the visible wavelengths for cloud retrievals during daylight hours. The software automatically identifies cloudy and clear regions at user-defined intervals and calculates fractional sky cover, providing amore » real-time display of sky conditions.« less
Translating expert system rules into Ada code with validation and verification
NASA Technical Reports Server (NTRS)
Becker, Lee; Duckworth, R. James; Green, Peter; Michalson, Bill; Gosselin, Dave; Nainani, Krishan; Pease, Adam
1991-01-01
The purpose of this ongoing research and development program is to develop software tools which enable the rapid development, upgrading, and maintenance of embedded real-time artificial intelligence systems. The goals of this phase of the research were to investigate the feasibility of developing software tools which automatically translate expert system rules into Ada code and develop methods for performing validation and verification testing of the resultant expert system. A prototype system was demonstrated which automatically translated rules from an Air Force expert system was demonstrated which detected errors in the execution of the resultant system. The method and prototype tools for converting AI representations into Ada code by converting the rules into Ada code modules and then linking them with an Activation Framework based run-time environment to form an executable load module are discussed. This method is based upon the use of Evidence Flow Graphs which are a data flow representation for intelligent systems. The development of prototype test generation and evaluation software which was used to test the resultant code is discussed. This testing was performed automatically using Monte-Carlo techniques based upon a constraint based description of the required performance for the system.
Tranquart, F; Mercier, L; Frinking, P; Gaud, E; Arditi, M
2012-07-01
With contrast-enhanced ultrasound (CEUS) now established as a valuable imaging modality for many applications, a more specific demand has recently emerged for quantifying perfusion and using measured parameters as objective indicators for various disease states. However, CEUS perfusion quantification remains challenging and is not well integrated in daily clinical practice. The development of VueBox™ alleviates existing limitations and enables quantification in a standardized way. VueBox™ operates as an off-line software application, after dynamic contrast-enhanced ultrasound (DCE-US) is performed. It enables linearization of DICOM clips, assessment of perfusion using patented curve-fitting models, and generation of parametric images by synthesizing perfusion information at the pixel level using color coding. VueBox™ is compatible with most of the available ultrasound platforms (nonlinear contrast-enabled), has the ability to process both bolus and disruption-replenishment kinetics loops, allows analysis results and their context to be saved, and generates analysis reports automatically. Specific features have been added to VueBox™, such as fully automatic in-plane motion compensation and an easy-to-use clip editor. Processing time has been reduced as a result of parallel programming optimized for multi-core processors. A long list of perfusion parameters is available for each of the two administration modes to address all possible demands currently reported in the literature for diagnosis or treatment monitoring. In conclusion, VueBox™ is a valid and robust quantification tool to be used for standardizing perfusion quantification and to improve the reproducibility of results across centers. © Georg Thieme Verlag KG Stuttgart · New York.
Hu, Meng; Krauss, Martin; Brack, Werner; Schulze, Tobias
2016-11-01
Liquid chromatography-high resolution mass spectrometry (LC-HRMS) is a well-established technique for nontarget screening of contaminants in complex environmental samples. Automatic peak detection is essential, but its performance has only rarely been assessed and optimized so far. With the aim to fill this gap, we used pristine water extracts spiked with 78 contaminants as a test case to evaluate and optimize chromatogram and spectral data processing. To assess whether data acquisition strategies have a significant impact on peak detection, three values of MS cycle time (CT) of an LTQ Orbitrap instrument were tested. Furthermore, the key parameter settings of the data processing software MZmine 2 were optimized to detect the maximum number of target peaks from the samples by the design of experiments (DoE) approach and compared to a manual evaluation. The results indicate that short CT significantly improves the quality of automatic peak detection, which means that full scan acquisition without additional MS 2 experiments is suggested for nontarget screening. MZmine 2 detected 75-100 % of the peaks compared to manual peak detection at an intensity level of 10 5 in a validation dataset on both spiked and real water samples under optimal parameter settings. Finally, we provide an optimization workflow of MZmine 2 for LC-HRMS data processing that is applicable for environmental samples for nontarget screening. The results also show that the DoE approach is useful and effort-saving for optimizing data processing parameters. Graphical Abstract ᅟ.
Computer program for analysis of hemodynamic response to head-up tilt test
NASA Astrophysics Data System (ADS)
ŚwiÄ tek, Eliza; Cybulski, Gerard; Koźluk, Edward; PiÄ tkowska, Agnieszka; Niewiadomski, Wiktor
2014-11-01
The aim of this work was to create a computer program, written in the MATLAB environment, which enables the visualization and analysis of hemodynamic parameters recorded during a passive tilt test using the CNS Task Force Monitor System. The application was created to help in the assessment of the relationship between the values and dynamics of changes of the selected parameters and the risk of orthostatic syncope. The signal analysis included: R-R intervals (RRI), heart rate (HR), systolic blood pressure (sBP), diastolic blood pressure (dBP), mean blood pressure (mBP), stroke volume (SV), stroke index (SI), cardiac output (CO), cardiac index (CI), total peripheral resistance (TPR), total peripheral resistance index (TPRI), ventricular ejection time (LVET) and thoracic fluid content (TFC). The program enables the user to visualize waveforms for a selected parameter and to perform smoothing with selected moving average parameters. It allows one to construct the graph of means for any range, and the Poincare plot for a selected time range. The program automatically determines the average value of the parameter before tilt, its minimum and maximum value immediately after changing positions and the times of their occurrence. It is possible to correct the automatically detected points manually. For the RR interval, it determines the acceleration index (AI) and the brake index (BI). It is possible to save calculated values to an XLS with a name specified by user. The application has a user-friendly graphical interface and can run on a computer that has no MATLAB software.
Vignally, P; Fondi, G; Taggi, F; Pitidis, A
2011-03-31
In Italy the European Union Injury Database reports the involvement of chemical products in 0.9% of home and leisure accidents. The Emergency Department registry on domestic accidents in Italy and the Poison Control Centres record that 90% of cases of exposure to toxic substances occur in the home. It is not rare for the effects of chemical agents to be observed in hospitals, with a high potential risk of damage - the rate of this cause of hospital admission is double the domestic injury average. The aim of this study was to monitor the effects of injuries caused by caustic agents in Italy using automatic free-text recognition in Emergency Department medical databases. We created a Stata software program to automatically identify caustic or corrosive injury cases using an agent-specific list of keywords. We focused attention on the procedure's sensitivity and specificity. Ten hospitals in six regions of Italy participated in the study. The program identified 112 cases of injury by caustic or corrosive agents. Checking the cases by quality controls (based on manual reading of ED reports), we assessed 99 cases as true positive, i.e. 88.4% of the patients were automatically recognized by the software as being affected by caustic substances (99% CI: 80.6%- 96.2%), that is to say 0.59% (99% CI: 0.45%-0.76%) of the whole sample of home injuries, a value almost three times as high as that expected (p < 0.0001) from European codified information. False positives were 11.6% of the recognized cases (99% CI: 5.1%- 21.5%). Our automatic procedure for caustic agent identification proved to have excellent product recognition capacity with an acceptable level of excess sensitivity. Contrary to our a priori hypothesis, the automatic recognition system provided a level of identification of agents possessing caustic effects that was significantly much greater than was predictable on the basis of the values from current codifications reported in the European Database.
Telemetry and Science Data Software System
NASA Technical Reports Server (NTRS)
Bates, Lakesha; Hong, Liang
2011-01-01
The Telemetry and Science Data Software System (TSDSS) was designed to validate the operational health of a spacecraft, ease test verification, assist in debugging system anomalies, and provide trending data and advanced science analysis. In doing so, the system parses, processes, and organizes raw data from the Aquarius instrument both on the ground and while in space. In addition, it provides a user-friendly telemetry viewer, and an instant pushbutton test report generator. Existing ground data systems can parse and provide simple data processing, but have limitations in advanced science analysis and instant report generation. The TSDSS functions as an offline data analysis system during I&T (integration and test) and mission operations phases. After raw data are downloaded from an instrument, TSDSS ingests the data files, parses, converts telemetry to engineering units, and applies advanced algorithms to produce science level 0, 1, and 2 data products. Meanwhile, it automatically schedules upload of the raw data to a remote server and archives all intermediate and final values in a MySQL database in time order. All data saved in the system can be straightforwardly retrieved, exported, and migrated. Using TSDSS s interactive data visualization tool, a user can conveniently choose any combination and mathematical computation of interesting telemetry points from a large range of time periods (life cycle of mission ground data and mission operations testing), and display a graphical and statistical view of the data. With this graphical user interface (GUI), the data queried graphs can be exported and saved in multiple formats. This GUI is especially useful in trending data analysis, debugging anomalies, and advanced data analysis. At the request of the user, mission-specific instrument performance assessment reports can be generated with a simple click of a button on the GUI. From instrument level to observatory level, the TSDSS has been operating supporting functional and performance tests and refining system calibration algorithms and coefficients, in sync with the Aquarius/SAC-D spacecraft. At the time of this reporting, it was prepared and set up to perform anomaly investigation for mission operations preceding the Aquarius/SAC-D spacecraft launch on June 10, 2011.
Secure Video Surveillance System Acquisition Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
2009-12-04
The SVSS Acquisition Software collects and displays video images from two cameras through a VPN, and store the images onto a collection controller. The software is configured to allow a user to enter a time window to display up to 2 1/2, hours of video review. The software collects images from the cameras at a rate of 1 image per second and automatically deletes images older than 3 hours. The software code operates in a linux environment and can be run in a virtual machine on Windows XP. The Sandia software integrates the different COTS software together to build themore » video review system.« less
Saving Time with Automated Account Management
ERIC Educational Resources Information Center
School Business Affairs, 2013
2013-01-01
Thanks to intelligent solutions, schools, colleges, and universities no longer need to manage user account life cycles by using scripts or tedious manual procedures. The solutions house the scripts and manual procedures. Accounts can be automatically created, modified, or deleted in all applications within the school. This article describes how an…
Fast approach for toner saving
NASA Astrophysics Data System (ADS)
Safonov, Ilia V.; Kurilin, Ilya V.; Rychagov, Michael N.; Lee, Hokeun; Kim, Sangho; Choi, Donchul
2011-01-01
Reducing toner consumption is an important task in modern printing devices and has a significant positive ecological impact. Existing toner saving approaches have two main drawbacks: appearance of hardcopy in toner saving mode is worse in comparison with normal mode; processing of whole rendered page bitmap requires significant computational costs. We propose to add small holes of various shapes and sizes to random places inside a character bitmap stored in font cache. Such random perforation scheme is based on processing pipeline in RIP of standard printer languages Postscript and PCL. Processing of text characters only, and moreover, processing of each character for given font and size alone, is an extremely fast procedure. The approach does not deteriorate halftoned bitmap and business graphics and provide toner saving for typical office documents up to 15-20%. Rate of toner saving is adjustable. Alteration of resulted characters' appearance is almost indistinguishable in comparison with solid black text due to random placement of small holes inside the character regions. The suggested method automatically skips small fonts to preserve its quality. Readability of text processed by proposed method is fine. OCR programs process that scanned hardcopy successfully too.
Retina Image Screening and Analysis Software Version 2.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tobin, Jr., Kenneth W.; Karnowski, Thomas P.; Aykac, Deniz
2009-04-01
The software allows physicians or researchers to ground-truth images of retinas, identifying key physiological features and lesions that are indicative of disease. The software features methods to automatically detect the physiological features and lesions. The software contains code to measure the quality of images received from a telemedicine network; create and populate a database for a telemedicine network; review and report the diagnosis of a set of images; and also contains components to transmit images from a Zeiss camera to the network through SFTP.
Management of natural resources through automatic cartographic inventory
NASA Technical Reports Server (NTRS)
Rey, P.; Gourinard, Y.; Cambou, F. (Principal Investigator)
1973-01-01
The author has identified the following significant results. Significant results of the ARNICA program from August 1972 - January 1973 have been: (1) establishment of image to object correspondence codes for all types of soil use and forestry in northern Spain; (2) establishment of a transfer procedure between qualitative (remote identification and remote interpretation) and quantitative (numerization, storage, automatic statistical cartography) use of images; (3) organization of microdensitometric data processing and automatic cartography software; and (4) development of a system for measuring reflectance simultaneous with imagery.
The Associate Principal Astronomer for AI Management of Automatic Telescopes
NASA Technical Reports Server (NTRS)
Henry, Gregory W.
1998-01-01
This research program in scheduling and management of automatic telescopes had the following objectives: 1. To field test the 1993 Automatic Telescope Instruction Set (ATIS93) programming language, which was specifically developed to allow real-time control of an automatic telescope via an artificial intelligence scheduler running on a remote computer. 2. To develop and test the procedures for two-way communication between a telescope controller and remote scheduler via the Internet. 3. To test various concepts in Al scheduling being developed at NASA Ames Research Center on an automatic telescope operated by Tennessee State University at the Fairborn Observatory site in southern Arizona. and 4. To develop a prototype software package, dubbed the Associate Principal Astronomer, for the efficient scheduling and management of automatic telescopes.
SABRE--A Novel Software Tool for Bibliographic Post-Processing.
ERIC Educational Resources Information Center
Burge, Cecil D.
1989-01-01
Describes the software architecture and application of SABRE (Semi-Automated Bibliographic Environment), which is one of the first products to provide a semi-automatic environment for relevancy ranking of citations obtained from searches of bibliographic databases. Features designed to meet the review, categorization, culling, and reporting needs…
75 FR 80677 - The Low-Income Definition
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-23
... original regulatory text so it is consistent with the geo-coding software the agency uses to make the low... Union Act (Act) authorizes the NCUA Board (Board) to define ``low-income members'' so that credit unions... process of implementing geo- coding software to make the calculation automatically for credit unions...
Daisne, Jean-François; Blumhofer, Andreas
2013-06-26
Intensity modulated radiotherapy for head and neck cancer necessitates accurate definition of organs at risk (OAR) and clinical target volumes (CTV). This crucial step is time consuming and prone to inter- and intra-observer variations. Automatic segmentation by atlas deformable registration may help to reduce time and variations. We aim to test a new commercial atlas algorithm for automatic segmentation of OAR and CTV in both ideal and clinical conditions. The updated Brainlab automatic head and neck atlas segmentation was tested on 20 patients: 10 cN0-stages (ideal population) and 10 unselected N-stages (clinical population). Following manual delineation of OAR and CTV, automatic segmentation of the same set of structures was performed and afterwards manually corrected. Dice Similarity Coefficient (DSC), Average Surface Distance (ASD) and Maximal Surface Distance (MSD) were calculated for "manual to automatic" and "manual to corrected" volumes comparisons. In both groups, automatic segmentation saved about 40% of the corresponding manual segmentation time. This effect was more pronounced for OAR than for CTV. The edition of the automatically obtained contours significantly improved DSC, ASD and MSD. Large distortions of normal anatomy or lack of iodine contrast were the limiting factors. The updated Brainlab atlas-based automatic segmentation tool for head and neck Cancer patients is timesaving but still necessitates review and corrections by an expert.
Software Techniques for Balancing Computation & Communication in Parallel Systems
1994-07-01
boer of Tasks: 15 PE Loand Yaltanc: 0.0000 K ] PE Loed Ya tance: 0.0000 Into-Tas Com: LInter-Task Com: 116 Ntwok traffic: ±16 PE LAYMT 1, Networkc...confusion. Because past versions for all files were saved and documented within SCCS, software developers were able to roll back to various combinations of
ERIC Educational Resources Information Center
Paquet, Katherine G.
2013-01-01
Cloud computing may provide cost benefits for organizations by eliminating the overhead costs of software, hardware, and maintenance (e.g., license renewals, upgrading software, servers and their physical storage space, administration along with funding a large IT department). In addition to the promised savings, the organization may require…
DSN system performance test software
NASA Technical Reports Server (NTRS)
Martin, M.
1978-01-01
The system performance test software is currently being modified to include additional capabilities and enhancements. Additional software programs are currently being developed for the Command Store and Forward System and the Automatic Total Recall System. The test executive is the main program. It controls the input and output of the individual test programs by routing data blocks and operator directives to those programs. It also processes data block dump requests from the operator.
NASA Astrophysics Data System (ADS)
Martsynkovskyy, V. A.; Deineka, A.; Kovalenko, V.
2017-08-01
The article presents forced axial vibrations of the rotor with an automatic unloading machine in an oxidizer pump. A feature of the design is the use in the autoloading system of slotted throttles with mutually inverse throttling. Their conductivity is determined by a numerical experiment in the ANSYS CFX software package.
ERIC Educational Resources Information Center
Sidgi, Lina Fathi Sidig; Shaari, Ahmad Jelani
2017-01-01
The use of technology, such as computer-assisted language learning (CALL), is used in teaching and learning in the foreign language classrooms where it is most needed. One promising emerging technology that supports language learning is automatic speech recognition (ASR). Integrating such technology, especially in the instruction of pronunciation…
A Multiple Sensor Machine Vision System for Automatic Hardwood Feature Detection
D. Earl Kline; Richard W. Conners; Daniel L. Schmoldt; Philip A. Araman; Robert L. Brisbin
1993-01-01
A multiple sensor machine vision prototype is being developed to scan full size hardwood lumber at industrial speeds for automatically detecting features such as knots holes, wane, stain, splits, checks, and color. The prototype integrates a multiple sensor imaging system, a materials handling system, a computer system, and application software. The prototype provides...
Feasibility Study on Fully Automatic High Quality Translation: Volume I. Final Technical Report.
ERIC Educational Resources Information Center
Lehmann, Winifred P.; Stachowitz, Rolf
The object of this theoretical inquiry is to examine the controversial issue of a fully automatic high quality translation (FAHQT) in the light of past and projected advances in linguistic theory and hardware/software capability. This first volume of a two-volume report discusses the requirements of translation and aspects of human and machine…
Automatic finite element generators
NASA Technical Reports Server (NTRS)
Wang, P. S.
1984-01-01
The design and implementation of a software system for generating finite elements and related computations are described. Exact symbolic computational techniques are employed to derive strain-displacement matrices and element stiffness matrices. Methods for dealing with the excessive growth of symbolic expressions are discussed. Automatic FORTRAN code generation is described with emphasis on improving the efficiency of the resultant code.
NASA Technical Reports Server (NTRS)
Allen, Bradley P.; Holtzman, Peter L.
1987-01-01
An overview is presented of the Automated Software Development Workstation Project, an effort to explore knowledge-based approaches to increasing software productivity. The project focuses on applying the concept of domain specific automatic programming systems (D-SAPSs) to application domains at NASA's Johnson Space Center. A version of a D-SAPS developed in Phase 1 of the project for the domain of space station momentum management is described. How problems encountered during its implementation led researchers to concentrate on simplifying the process of building and extending such systems is discussed. Researchers propose to do this by attacking three observed bottlenecks in the D-SAPS development process through the increased automation of the acquisition of programming knowledge and the use of an object oriented development methodology at all stages of the program design. How these ideas are being implemented in the Bauhaus, a prototype workstation for D-SAPS development is discussed.
NASA Technical Reports Server (NTRS)
Allen, Bradley P.; Holtzman, Peter L.
1988-01-01
An overview is presented of the Automated Software Development Workstation Project, an effort to explore knowledge-based approaches to increasing software productivity. The project focuses on applying the concept of domain specific automatic programming systems (D-SAPSs) to application domains at NASA's Johnson Space Flight Center. A version of a D-SAPS developed in Phase 1 of the project for the domain of space station momentum management is described. How problems encountered during its implementation led researchers to concentrate on simplifying the process of building and extending such systems is discussed. Researchers propose to do this by attacking three observed bottlenecks in the D-SAPS development process through the increased automation of the acquisition of programming knowledge and the use of an object oriented development methodology at all stages of the program design. How these ideas are being implemented in the Bauhaus, a prototype workstation for D-SAPS development is discussed.
Dorofeeva, A A; Khrustalev, A V; Krylov, Iu V; Bocharov, D A; Negasheva, M A
2010-01-01
Digital images of the iris were received for study peculiarities of the iris color during the anthropological examination of 578 students aged 16-24 years. Simultaneously with the registration of the digital images, the visual assessment of the eye color was carried out using the traditional scale of Bunak, based on 12 ocular prostheses. Original software for automatic determination of the iris color based on 12 classes scale of Bunak was designed, and computer version of that scale was developed. The software proposed allows to conduct the determination of the iris color with high validity based on numerical evaluation; its application may reduce the bias due to subjective assessment and methodological divergences of the different researchers. The software designed for automatic determination of the iris color may help develop both theoretical and applied anthropology, it may be used in forensic and emergency medicine, sports medicine, medico-genetic counseling and professional selection.
NASA Astrophysics Data System (ADS)
Le Bras, R. J.; Arora, N. S.; Kushida, N.; Kebede, F.; Feitio, P.; Tomuta, E.
2017-12-01
The International Monitoring System of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has reached out to the broader scientific community through a series of conferences, the later one of which took place in June 2017 in Vienna, Austria. Stemming out of this outreach effort, after the inception of research and development efforts in 2009, the NET-VISA software, following a Bayesian modelling approach, has been elaborated to improve on the key step of automatic association of joint seismic, hydro-acoustic, and infrasound detections. When compared with the current operational system, it has been consistently shown on off-line tests to improve the overlap with the analyst-reviewed Reviewed Event Bulletin (REB) by ten percent for an average of 85% overlap, while the inconsistency rate is essentially the same at about 50%. Testing by analysts in realistic conditions on a few days of data has also demonstrated the software performance in finding additional events which qualify for publication in the REB. Starting in August 2017, the automatic events produced by the software will be reviewed by analysts at the CTBTO, and we report on the initial evaluation of this introduction into operations.
Existing Whole-House Solutions Case Study: Cascade Apartments - Deep Energy Multifamily Retrofit
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2014-02-01
In December of 2009-10, King County Housing Authority (KCHA) implemented energy retrofit improvements in the Cascade multifamily community, located in Kent, Washington, which resulted in annual energy cost savings of 22%, improved comfort and air quality for residents, and increased durability of the units. This research effort involved significant coordination from stakeholders KCHA, WA State Department of Commerce, utility Puget Sound Energy, and Cascade tenants. This report focuses on the following three primary Building America research questions: 1. What are the modeled energy savings using DOE low income weatherization approved TREAT software? 2. How did the modeled energy savings comparemore » with measured energy savings from aggregate utility billing analysis? 3. What is the Savings to Investment Ratio of the retrofit package after considering utility window incentives and KCHA capital improvement funding.« less
Global moment tensor computation at GFZ Potsdam
NASA Astrophysics Data System (ADS)
Saul, J.; Becker, J.; Hanka, W.
2011-12-01
As part of its earthquake information service, GFZ Potsdam has started to provide seismic moment tensor solutions for significant earthquakes world-wide. The software used to compute the moment tensors is a GFZ-Potsdam in-house development, which uses the framework of the software SeisComP 3 (Hanka et al., 2010). SeisComP 3 (SC3) is a software package for seismological data acquisition, archival, quality control and analysis. SC3 is developed by GFZ Potsdam with significant contributions from its user community. The moment tensor inversion technique uses a combination of several wave types, time windows and frequency bands depending on magnitude and station distance. Wave types include body, surface and mantle waves as well as the so-called 'W-Phase' (Kanamori and Rivera, 2008). The inversion is currently performed in the time domain only. An iterative centroid search can be performed independently both horizontally and in depth. Moment tensors are currently computed in a semi-automatic fashion. This involves inversions that are performed automatically in near-real time, followed by analyst review prior to publication. The automatic results are quite often good enough to be published without further improvements, sometimes in less than 30 minutes from origin time. In those cases where a manual interaction is still required, the automatic inversion usually does a good job at pre-selecting those traces that are the most relevant for the inversion, keeping the work required for the analyst at a minimum. Our published moment tensors are generally in good agreement with those published by the Global Centroid-Moment-Tensor (GCMT) project for earthquakes above a magnitude of about Mw 5. Additionally we provide solutions for smaller earthquakes above about Mw 4 in Europe, which are normally not analyzed by the GCMT project. We find that for earthquakes above Mw 6, the most robust automatic inversions can usually be obtained using the W-Phase time window. The GFZ earthquake bulletin is located at http://geofon.gfz-potsdam.de/eqinfo For more information on the SeisComP 3 software visit http://www.seiscomp3.org
Automatic Calibration of an Airborne Imaging System to an Inertial Navigation Unit
NASA Technical Reports Server (NTRS)
Ansar, Adnan I.; Clouse, Daniel S.; McHenry, Michael C.; Zarzhitsky, Dimitri V.; Pagdett, Curtis W.
2013-01-01
This software automatically calibrates a camera or an imaging array to an inertial navigation system (INS) that is rigidly mounted to the array or imager. In effect, it recovers the coordinate frame transformation between the reference frame of the imager and the reference frame of the INS. This innovation can automatically derive the camera-to-INS alignment using image data only. The assumption is that the camera fixates on an area while the aircraft flies on orbit. The system then, fully automatically, solves for the camera orientation in the INS frame. No manual intervention or ground tie point data is required.
NASA Technical Reports Server (NTRS)
Straeter, T. A.; Foudriat, E. C.; Will, R. W.
1977-01-01
The objectives of NASA's MUST (Multipurpose User-oriented Software Technology) program at Langley Research Center are to cut the cost of producing software which effectively utilizes digital systems for flight research. These objectives will be accomplished by providing an integrated system of support software tools for use throughout the research flight software development process. A description of the overall MUST program and its progress toward the release of a first MUST system will be presented. This release includes: a special interactive user interface, a library of subroutines, assemblers, a compiler, automatic documentation tools, and a test and simulation system.
Automatic AVHRR image navigation software
NASA Technical Reports Server (NTRS)
Baldwin, Dan; Emery, William
1992-01-01
This is the final report describing the work done on the project entitled Automatic AVHRR Image Navigation Software funded through NASA-Washington, award NAGW-3224, Account 153-7529. At the onset of this project, we had developed image navigation software capable of producing geo-registered images from AVHRR data. The registrations were highly accurate but required a priori knowledge of the spacecraft's axes alignment deviations, commonly known as attitude. The three angles needed to describe the attitude are called roll, pitch, and yaw, and are the components of the deviations in the along scan, along track and about center directions. The inclusion of the attitude corrections in the navigation software results in highly accurate georegistrations, however, the computation of the angles is very tedious and involves human interpretation for several steps. The technique also requires easily identifiable ground features which may not be available due to cloud cover or for ocean data. The current project was motivated by the need for a navigation system which was automatic and did not require human intervention or ground control points. The first step in creating such a system must be the ability to parameterize the spacecraft's attitude. The immediate goal of this project was to study the attitude fluctuations and determine if they displayed any systematic behavior which could be modeled or parameterized. We chose a period in 1991-1992 to study the attitude of the NOAA 11 spacecraft using data from the Tiros receiving station at the Colorado Center for Astrodynamic Research (CCAR) at the University of Colorado.
A clinically viable capsule endoscopy video analysis platform for automatic bleeding detection
NASA Astrophysics Data System (ADS)
Yi, Steven; Jiao, Heng; Xie, Jean; Mui, Peter; Leighton, Jonathan A.; Pasha, Shabana; Rentz, Lauri; Abedi, Mahmood
2013-02-01
In this paper, we present a novel and clinically valuable software platform for automatic bleeding detection on gastrointestinal (GI) tract from Capsule Endoscopy (CE) videos. Typical CE videos for GI tract run about 8 hours and are manually reviewed by physicians to locate diseases such as bleedings and polyps. As a result, the process is time consuming and is prone to disease miss-finding. While researchers have made efforts to automate this process, however, no clinically acceptable software is available on the marketplace today. Working with our collaborators, we have developed a clinically viable software platform called GISentinel for fully automated GI tract bleeding detection and classification. Major functional modules of the SW include: the innovative graph based NCut segmentation algorithm, the unique feature selection and validation method (e.g. illumination invariant features, color independent features, and symmetrical texture features), and the cascade SVM classification for handling various GI tract scenes (e.g. normal tissue, food particles, bubbles, fluid, and specular reflection). Initial evaluation results on the SW have shown zero bleeding instance miss-finding rate and 4.03% false alarm rate. This work is part of our innovative 2D/3D based GI tract disease detection software platform. While the overall SW framework is designed for intelligent finding and classification of major GI tract diseases such as bleeding, ulcer, and polyp from the CE videos, this paper will focus on the automatic bleeding detection functional module.
On the engineering of crucial software
NASA Technical Reports Server (NTRS)
Pratt, T. W.; Knight, J. C.; Gregory, S. T.
1983-01-01
The various aspects of the conventional software development cycle are examined. This cycle was the basis of the augmented approach contained in the original grant proposal. This cycle was found inadequate for crucial software development, and the justification for this opinion is presented. Several possible enhancements to the conventional software cycle are discussed. Software fault tolerance, a possible enhancement of major importance, is discussed separately. Formal verification using mathematical proof is considered. Automatic programming is a radical alternative to the conventional cycle and is discussed. Recommendations for a comprehensive approach are presented, and various experiments which could be conducted in AIRLAB are described.
Software architecture of the Magdalena Ridge Observatory Interferometer
NASA Astrophysics Data System (ADS)
Farris, Allen; Klinglesmith, Dan; Seamons, John; Torres, Nicolas; Buscher, David; Young, John
2010-07-01
Merging software from 36 independent work packages into a coherent, unified software system with a lifespan of twenty years is the challenge faced by the Magdalena Ridge Observatory Interferometer (MROI). We solve this problem by using standardized interface software automatically generated from simple highlevel descriptions of these systems, relying only on Linux, GNU, and POSIX without complex software such as CORBA. This approach, based on gigabit Ethernet with a TCP/IP protocol, provides the flexibility to integrate and manage diverse, independent systems using a centralized supervisory system that provides a database manager, data collectors, fault handling, and an operator interface.
Software and electronic developments for TUG - T60 robotic telescope
NASA Astrophysics Data System (ADS)
Parmaksizoglu, M.; Dindar, M.; Kirbiyik, H.; Helhel, S.
2014-12-01
A robotic telescope is a telescope that can make observations without hands-on human control. Its low level behavior is automatic and computer-controlled. Robotic telescopes usually run under the control of a scheduler, which provides high-level control by selecting astronomical targets for observation. TUBITAK National Observatory (TUG) T60 Robotic Telescope is controlled by open source OCAAS software, formally named TALON. This study introduces the improvements on TALON software, new electronic and mechanic designs. The designs and software improvements were implemented in the T60 telescope control software and tested on the real system successfully.
Forkert, N D; Cheng, B; Kemmling, A; Thomalla, G; Fiehler, J
2014-01-01
The objective of this work is to present the software tool ANTONIA, which has been developed to facilitate a quantitative analysis of perfusion-weighted MRI (PWI) datasets in general as well as the subsequent multi-parametric analysis of additional datasets for the specific purpose of acute ischemic stroke patient dataset evaluation. Three different methods for the analysis of DSC or DCE PWI datasets are currently implemented in ANTONIA, which can be case-specifically selected based on the study protocol. These methods comprise a curve fitting method as well as a deconvolution-based and deconvolution-free method integrating a previously defined arterial input function. The perfusion analysis is extended for the purpose of acute ischemic stroke analysis by additional methods that enable an automatic atlas-based selection of the arterial input function, an analysis of the perfusion-diffusion and DWI-FLAIR mismatch as well as segmentation-based volumetric analyses. For reliability evaluation, the described software tool was used by two observers for quantitative analysis of 15 datasets from acute ischemic stroke patients to extract the acute lesion core volume, FLAIR ratio, perfusion-diffusion mismatch volume with manually as well as automatically selected arterial input functions, and follow-up lesion volume. The results of this evaluation revealed that the described software tool leads to highly reproducible results for all parameters if the automatic arterial input function selection method is used. Due to the broad selection of processing methods that are available in the software tool, ANTONIA is especially helpful to support image-based perfusion and acute ischemic stroke research projects.
Large - scale Rectangular Ruler Automated Verification Device
NASA Astrophysics Data System (ADS)
Chen, Hao; Chang, Luping; Xing, Minjian; Xie, Xie
2018-03-01
This paper introduces a large-scale rectangular ruler automated verification device, which consists of photoelectric autocollimator and self-designed mechanical drive car and data automatic acquisition system. The design of mechanical structure part of the device refer to optical axis design, drive part, fixture device and wheel design. The design of control system of the device refer to hardware design and software design, and the hardware mainly uses singlechip system, and the software design is the process of the photoelectric autocollimator and the automatic data acquisition process. This devices can automated achieve vertical measurement data. The reliability of the device is verified by experimental comparison. The conclusion meets the requirement of the right angle test procedure.
Final Report Ra Power Management 1255 10-15-16 FINAL_Public
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iverson, Aaron
Ra Power Management (RPM) has developed a cloud based software platform that manages the financial and operational functions of third party financed solar projects throughout their lifecycle. RPM’s software streamlines and automates the sales, financing, and management of a portfolio of solar assets. The software helps solar developers automate the most difficult aspects of asset management, leading to increased transparency, efficiency, and reduction in human error. More importantly, our platform will help developers save money by improving their operating margins
Automatic Rock Detection and Mapping from HiRISE Imagery
NASA Technical Reports Server (NTRS)
Huertas, Andres; Adams, Douglas S.; Cheng, Yang
2008-01-01
This system includes a C-code software program and a set of MATLAB software tools for statistical analysis and rock distribution mapping. The major functions include rock detection and rock detection validation. The rock detection code has been evolved into a production tool that can be used by engineers and geologists with minor training.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Hang Bae
A reliability testing was performed for the software of Shutdown(SDS) Computers for Wolsong Nuclear Power Plants Units 2, 3 and 4. profiles to the SDS Computers and compared the outputs with the predicted results generated by the oracle. Test softwares were written to execute the test automatically. Random test profiles were generated using analysis code. 11 refs., 1 fig.
ERIC Educational Resources Information Center
Cordier, Deborah
2009-01-01
A renewed focus on foreign language (FL) learning and speech for communication has resulted in computer-assisted language learning (CALL) software developed with Automatic Speech Recognition (ASR). ASR features for FL pronunciation (Lafford, 2004) are functional components of CALL designs used for FL teaching and learning. The ASR features…
2010-09-01
cards and software in almost any computer can communicate with each other seamlessly. The cable modem protocol is another example of competing...streaming: it means that special client software applications known as podcatchers (such as Apple Inc.’s iTunes or Nullsoft’s Winamp) can automatically
Using Software Tools to Automate the Assessment of Student Programs.
ERIC Educational Resources Information Center
Jackson, David
1991-01-01
Argues that the advent of computer-aided instruction (CAI) systems for teaching introductory computer programming makes it imperative that software be developed to automate assessment and grading of student programs. Examples of typical student programming problems are given, and application of the Unix tools Lex and Yacc to the automatic assessment of…
Polisher (conflicting versions 2.0.8 on IM Form, 1.0 on abstract)
DOE Office of Scientific and Technical Information (OSTI.GOV)
2008-09-18
Polisher is a software package designed to facilitate the error correction of an assembled genome using Illumina read data. The software addresses substandard regions by automatically correcting consensus errors and/or suggesting primer walking reactions to improve the quality of the bases. This is done by performing the following: …
Adaptive System Modeling for Spacecraft Simulation
NASA Technical Reports Server (NTRS)
Thomas, Justin
2011-01-01
This invention introduces a methodology and associated software tools for automatically learning spacecraft system models without any assumptions regarding system behavior. Data stream mining techniques were used to learn models for critical portions of the International Space Station (ISS) Electrical Power System (EPS). Evaluation on historical ISS telemetry data shows that adaptive system modeling reduces simulation error anywhere from 50 to 90 percent over existing approaches. The purpose of the methodology is to outline how someone can create accurate system models from sensor (telemetry) data. The purpose of the software is to support the methodology. The software provides analysis tools to design the adaptive models, as well as the algorithms to initially build system models and continuously update them from the latest streaming sensor data. The main strengths are as follows: it creates accurate spacecraft system models without in-depth system knowledge or any assumptions about system behavior; it automatically updates/calibrates system models using the latest streaming sensor data; it creates device-specific models that capture the exact behavior of devices of the same type; it adapts to evolving systems; and it can reduce computational complexity (faster simulations).
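The abstract does not disclose the specific learning algorithms, so the following is a minimal sketch of one conventional way to update a model continuously from streaming telemetry: recursive least squares with a forgetting factor. The class name, channel layout, and constants are illustrative assumptions, not the NASA tool's design.

```python
import numpy as np

class StreamingLinearModel:
    """Recursive least-squares model of one telemetry channel.

    Illustrative only: a stand-in for the data stream mining methods the
    abstract describes, updated one sample at a time as telemetry arrives.
    """
    def __init__(self, n_inputs: int, forgetting: float = 0.99):
        self.w = np.zeros(n_inputs)         # model coefficients
        self.P = np.eye(n_inputs) * 1e3     # inverse covariance estimate
        self.lam = forgetting               # discounts old samples

    def update(self, x: np.ndarray, y: float) -> float:
        """Fold one new sensor sample into the model; return prediction error."""
        err = y - self.w @ x
        Px = self.P @ x
        gain = Px / (self.lam + x @ Px)
        self.w += gain * err
        self.P = (self.P - np.outer(gain, Px)) / self.lam
        return err
```

Because each update costs a fixed amount of arithmetic, a model like this can be recalibrated continuously against live telemetry, which is the behavior the abstract attributes to the invention.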
Traffic-Light-Preemption Vehicle-Transponder Software Module
NASA Technical Reports Server (NTRS)
Bachelder, Aaron; Foster, Conrad
2005-01-01
A prototype wireless data-communication and control system automatically modifies the switching of traffic lights to give priority to emergency vehicles. The system, which was reported in several NASA Tech Briefs articles at earlier stages of development, includes a transponder on each emergency vehicle, a monitoring and control unit (an intersection controller) at each intersection equipped with traffic lights, and a central monitoring subsystem. An essential component of the system is a software module executed by a microcontroller in each transponder. This module integrates and broadcasts data on the position, velocity, acceleration, and emergency status of the vehicle. The position, velocity, and acceleration data are derived partly from the Global Positioning System, partly from deductive reckoning, and partly from a diagnostic computer aboard the vehicle. The software module also monitors similar broadcasts from other vehicles and from intersection controllers, informs the driver of which intersections it controls, and generates visible and audible alerts to inform the driver of any other emergency vehicles that are close enough to create a potential hazard. The execution of the software module can be monitored remotely, and the module can be upgraded remotely and, hence, automatically.
Optomechanical design software for segmented mirrors
NASA Astrophysics Data System (ADS)
Marrero, Juan
2016-08-01
The software package presented in this paper, still under development, was born to help analyze the influence of the many parameters involved in the design of a large segmented mirror telescope. In summary, it is a set of tools which were added to a common framework as they were needed. Great emphasis has been placed on the graphical presentation, as scientific visualization nowadays cannot be conceived without the use of a helpful 3D environment, showing the analyzed system as close to reality as possible. Use of third-party software packages is limited to ANSYS, which needs to be available in the system only if FEM results are required. Among the various functionalities of the software, the following are worth mentioning here: automatic 3D model construction of a segmented mirror from a set of parameters, geometric ray tracing, automatic 3D model construction of a telescope structure around the defined mirrors from a set of parameters, segmented mirror human access assessment, analysis of integration tolerances, assessment of segment collisions, structural deformation under gravity and thermal variation, mirror support system analysis including warping harness mechanisms, etc.
New software tools for enhanced precision in robot-assisted laser phonomicrosurgery.
Dagnino, Giulio; Mattos, Leonardo S; Caldwell, Darwin G
2012-01-01
This paper describes a new software package created to enhance precision during robot-assisted laser phonomicrosurgery procedures. The new software is composed of three tools for camera calibration, automatic tumor segmentation, and laser tracking. These were designed and developed to improve the outcome of this demanding microsurgical technique, and were tested herein to produce quantitative performance data. The experimental setup was based on the motorized laser micromanipulator created by Istituto Italiano di Tecnologia, and the experimental protocols followed are fully described in this paper. The results show the new tools are robust and effective: the camera calibration tool reduced residual errors (RMSE) to 0.009 ± 0.002 mm under 40× microscope magnification; the automatic tumor segmentation tool resulted in deep lesion segmentations comparable to manual segmentations (RMSE = 0.160 ± 0.028 mm under 40× magnification); and the laser tracker tool proved to be reliable even during cutting procedures (RMSE = 0.073 ± 0.023 mm under 40× magnification). These results demonstrate the new software package can provide excellent improvements to the previous microsurgical system, leading to important enhancements in surgical outcome.
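For readers unfamiliar with the camera-calibration step, the sketch below shows the standard OpenCV chessboard routine, whose RMS reprojection error is the kind of residual the paper reports. The board geometry, square size, and file names are assumptions for illustration, not the paper's protocol.

```python
import cv2
import numpy as np

# Hypothetical 9x6 chessboard target with 0.5 mm squares imaged under the
# microscope; real calibration targets and magnifications will differ.
pattern = (9, 6)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * 0.5

obj_points, img_points, image_size = [], [], None
for fname in ["calib_01.png", "calib_02.png", "calib_03.png"]:
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)
        image_size = gray.shape[::-1]

# rms is the RMS reprojection error; K and dist are intrinsics/distortion
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)
print(f"RMS reprojection error: {rms:.4f}")
```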
Pandora Operation and Analysis Software
NASA Technical Reports Server (NTRS)
Herman, Jay; Cede, Alexander; Abuhassan, Nader
2012-01-01
Pandora Operation and Analysis Software controls the Pandora Sun- and sky-pointing optical head and built-in filter wheels (neutral density, UV bandpass, polarization filters, and opaque). The software also controls the attached spectrometer exposure time and thermoelectric cooler to maintain the spectrometer temperature to within 1 °C. All functions are available through a GUI so as to be easily accessible by the user. The data are automatically stored on a miniature computer (netbook) for automatic download to a designated server at user defined intervals (once per day, once per week, etc.), or to a USB external device. An additional software component reduces the raw data (spectrometer counts) to preliminary scientific products for quick-view purposes. The Pandora systems are built from off-the-shelf commercial parts and from mechanical parts machined using electronic machine shop drawings. The Pandora spectrometer system is designed to look at the Sun (tracking to within 0.1°), or to look at the sky at any zenith or azimuth angle, to gather information about the amount of trace gases or aerosols that are present.
New generation of the health monitoring system SMS 2001
NASA Astrophysics Data System (ADS)
Berndt, Rolf-Dietrich; Schwesinger, Peter
2001-08-01
The Structure Monitoring System SMS 2001 (patent applied for) is a modular multi-component measurement device for use under outdoor conditions. Besides the usual continuous (static) measurements of, e.g., environmental parameters and structure-related responses, the SMS is able to register short-term dynamic events automatically, with measurement frequencies up to 1 kHz. A large range of electrical sensors can be used. On demand, a solar-based power supply can be provided. The SMS 2001 is adaptable over a wide range, is space-saving in its geometric structure, and can meet very diverse user demands. The system is applicable preferably to small and medium-sized concrete and steel structures (besides buildings and bridges, also special cases). It is suitable to support the efficient concept of controlled lifetime extension, especially in the case of pre-damaged structures. The interactive communication between the SMS and the central office is completely remote-controlled. Two-point or multi-point connections using the internet can be realized. The measurement data are stored in a central databank. Safe access, supported by software modules, can be organized at different levels, e.g. for scientific evaluation, service reasons, or the needs of authorities.
Real time remote monitoring and pre-warning system for Highway landslide in mountain area.
Zhang, Yonghui; Li, Hongxu; Sheng, Qian; Wu, Kai; Chen, Guoliang
2011-06-01
A wire-pulling trigger displacement meter with a precision of 1 mm and a grid pluviometer with a precision of 0.1 mm are used to monitor the surface displacement and rainfall of highway slopes, and the measured data are transferred to a remote computer in real time over the general packet radio service (GPRS) network of China Telecom. The wire-pulling trigger displacement meter, grid pluviometer, data acquisition and transmission unit, and solar power supply device are integrated to form a comprehensive monitoring hardware system for highway landslides in mountain areas, which proved to be economical, energy-saving, automatic, and highly efficient. Meanwhile, based on the map and geographic information system (MAPGIS) platform, a software system was also developed for three-dimensional (3D) geological modeling and visualization, data inquiry and drawing, stability calculation, displacement forecasting, and real-time pre-warning. Moreover, pre-warning methods based on monitored displacement and rainfall are discussed. The monitoring and forecasting system for highway landslides has been successfully applied in engineering practice to provide security for highway transportation and construction and to reduce environmental disruption. Copyright © 2011 The Research Centre for Eco-Environmental Sciences, Chinese Academy of Sciences. Published by Elsevier B.V. All rights reserved.
CCDST: A free Canadian climate data scraping tool
NASA Astrophysics Data System (ADS)
Bonifacio, Charmaine; Barchyn, Thomas E.; Hugenholtz, Chris H.; Kienzle, Stefan W.
2015-02-01
In this paper we present a new software tool that automatically fetches, downloads and consolidates climate data from a Web database where the data are contained on multiple Web pages. The tool is called the Canadian Climate Data Scraping Tool (CCDST) and was developed to enhance access and simplify analysis of climate data from Canada's National Climate Data and Information Archive (NCDIA). The CCDST deconstructs a URL for a particular climate station in the NCDIA and then iteratively modifies the date parameters to download large volumes of data, remove individual file headers, and merge data files into one output file. This automated sequence enhances access to climate data by substantially reducing the time needed to manually download data from multiple Web pages. To this end, we present a case study of the temporal dynamics of blowing snow events that resulted in ~3.1 weeks of time savings. Without the CCDST, the time involved in manually downloading climate data limits access and discourages researchers and students from exploring climate trends. The tool is coded as a Microsoft Excel macro and is available to researchers and students for free. The main concept and structure of the tool can be modified for other Web databases hosting geophysical data.
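The deconstruct-iterate-merge sequence the abstract describes can be sketched in a few lines. The real CCDST is an Excel macro; the Python below is a stand-in, and the URL template, parameter names, and one-row header are hypothetical placeholders for the NCDIA's actual page layout.

```python
import csv
import io
import urllib.request

# Hypothetical URL template; the real NCDIA endpoint and parameter names
# are not reproduced here.
URL = ("https://climate.example.ca/bulk_data"
       "?stationID={station}&Year={year}&Month={month}&format=csv")

def fetch_station(station: int, years: range, out_path: str) -> None:
    """Iterate over the date parameters in the URL, strip the per-file
    header from each download, and merge everything into one CSV file."""
    with open(out_path, "w", newline="") as out:
        writer = csv.writer(out)
        wrote_header = False
        for year in years:
            for month in range(1, 13):
                url = URL.format(station=station, year=year, month=month)
                with urllib.request.urlopen(url) as resp:
                    text = resp.read().decode("utf-8")
                rows = list(csv.reader(io.StringIO(text)))
                start = 0 if not wrote_header else 1  # drop repeated header
                writer.writerows(rows[start:])
                wrote_header = True

fetch_station(2205, range(1990, 2000), "station_2205.csv")
```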
Modulating skylight. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1983-01-01
A reflector-boosted skylight with automatic controls, which maintain the interior light level and maximize heat capture by opening and closing the insulated reflector, was built and operated. Light levels maintained by the skylight in a small test building were monitored for a one-year period and the energy savings computed by comparison to conventional fluorescent lighting.
Mathematical models of water application for a variable rate irrigating hill-seeder
USDA-ARS?s Scientific Manuscript database
A variable rate irrigating hill-seeder can adjust water application automatically according to the difference in soil moisture content in the field to alleviate drought and save water. Two key problems to realize variable rate water application are how to determine the right amount of water for the ...
Leaders Look to Technology for Savings and Change
ERIC Educational Resources Information Center
Young, Jeffrey R.
2012-01-01
Students are bringing the latest devices to campuses expecting to use them as learning tools, and colleges are trying to deliver. Some of the world's best-known universities tried some experiments with a new model of online learning, in which students watch short video lectures, take automatically graded quizzes, and use online communities to work…
[Implementation of a new electronic patient record in surgery].
Eggli, S; Holm, J
2001-12-01
The increasing amount of clinical data, the intensified interest of patients in medical information, medical quality management, and the recent cost explosion in health care systems have forced medical institutions to improve their strategies for handling medical data. In the orthopedic department (3,600 surgeries, 75 beds, 14,000 consultations), a software application for comprehensive patient data management was developed. When implementing the electronic patient history, the following criteria were evaluated: 1. software evaluation, 2. implementation, 3. work flow, 4. data security/system stability. In the first phase the functional character was defined. Implementation required 3 months after parametrization. The expense amounted to 130,000 DM (30 clients). The training requirements were one afternoon for the secretaries and a 2-h session for the residents. The access speed for medically relevant data averaged under 3 s. The average saving in working hours was approximately 5 h/week for the secretaries and 4 h/week for the residents. The saving in paper amounted to 36,000 sheets/year. In 3 years of operation there were 3 server breakdowns. Evaluation of the saving in working hours showed that such a system can amortize within a year. The latest improvements in hardware and software technology have made the electronic medical record with integrated quality control practicable without massive expenditure. The system supplies an extensive platform of information for patient treatment and an instrument to evaluate the efficiency of therapy strategies independent of the clinical field.
Blanco, Jesús; García, Andrés; Morenas, Javier de Las
2018-06-09
Energy saving has become a major concern for today's developed societies. This paper presents a Wireless Sensor and Actuator Network (WSAN) designed to support an automatic intelligent system, based on the Internet of Things (IoT), that enables responsible consumption of energy. The proposed overall system performs efficient energy management of devices, machines and processes, optimizing their operation to achieve a reduction in their overall energy usage at any given time. For this purpose, relevant data are collected from intelligent sensors, which are installed at the required locations, as well as from the energy market through the Internet. This information is analysed to provide knowledge about energy utilization and to improve efficiency. The system takes autonomous decisions automatically, based on the available information and the specific requirements of each case. The proposed system has been implemented and tested in a food factory. Results show a great improvement in energy efficiency and substantial energy and cost savings.
A comparison of time-shared vs. batch development of space software
NASA Technical Reports Server (NTRS)
Forthofer, M.
1977-01-01
In connection with a study regarding the ground support software development for the Space Shuttle, an investigation was conducted concerning the most suitable software development techniques to be employed. A time-sharing 'trial period' was used to determine whether or not time-sharing would be a cost-effective software development technique for the Ground Based Shuttle system. It was found that time-sharing substantially improved job turnaround and programmer access to the computer for the representative group of ground support programmers. Moreover, this improvement resulted in an estimated saving of over fifty programmer days during the trial period.
Diagnostics Tools Identify Faults Prior to Failure
NASA Technical Reports Server (NTRS)
2013-01-01
Through the SBIR program, Rochester, New York-based Impact Technologies LLC collaborated with Ames Research Center to commercialize the Center's Hybrid Diagnostic Engine, or HyDE, software. The fault-detecting program is now incorporated into a software suite that identifies potential faults early in the design phase of systems ranging from printers to vehicles and robots, saving time and money.
Running Gaussian16 Software Jobs on the Peregrine System
Parallel setup is taken care of automatically based on settings in the PBS script. Scratch space on a filesystem called /dev/shm is set automatically by the example script. An example script for batch submission is given below: #!/bin/bash #PBS -l nodes=2 #PBS -l
Automatic Astrometric and Photometric Calibration with SCAMP
NASA Astrophysics Data System (ADS)
Bertin, E.
2006-07-01
Astrometric and photometric calibrations have remained the most tiresome step in the reduction of large imaging surveys. I present a new software package, SCAMP which has been written to address this problem. SCAMP efficiently computes accurate astrometric and photometric solutions for any arbitrary sequence of FITS images in a completely automatic way. SCAMP is released under the GNU General Public Licence.
Casiraghi, Elena; Cossa, Mara; Huber, Veronica; Rivoltini, Licia; Tozzi, Matteo; Villa, Antonello; Vergani, Barbara
2017-11-02
In clinical practice, automatic image analysis methods that quickly quantify histological results by objective and replicable means are becoming more and more necessary and widespread. Although several commercial software products are available for this task, they offer very little flexibility and are provided as black boxes without modifiable source code. To overcome the aforementioned problems, we employed the commonly used MATLAB platform to develop an automatic method, MIAQuant, for the analysis of histochemical and immunohistochemical images, stained with various methods and acquired by different tools. It automatically extracts and quantifies markers characterized by various colors and shapes; furthermore, it aligns contiguous tissue slices stained by different markers and overlaps them with differing colors for visual comparison of their localization. Application of MIAQuant in clinical research fields, such as oncology and cardiovascular disease studies, has proven its efficacy, robustness and flexibility with respect to various problems; we highlight that the flexibility of MIAQuant makes it an important tool to be exploited for basic research, where needs are constantly changing. The MIAQuant software and its user manual are freely available for clinical studies, pathological research, and diagnosis.
Effectiveness of sequential automatic-manual home respiratory polygraphy scoring.
Masa, Juan F; Corral, Jaime; Pereira, Ricardo; Duran-Cantolla, Joaquin; Cabello, Marta; Hernández-Blasco, Luis; Monasterio, Carmen; Alonso-Fernandez, Alberto; Chiner, Eusebi; Vázquez-Polo, Francisco-José; Montserrat, Jose M
2013-04-01
Automatic home respiratory polygraphy (HRP) scoring functions can potentially confirm the diagnosis of sleep apnoea-hypopnoea syndrome (SAHS) (obviating technician scoring) in a substantial number of patients. The result would have important management and cost implications. The aim of this study was to determine the diagnostic cost-effectiveness of a sequential HRP scoring protocol (automatic and then manual for residual cases) compared with manual HRP scoring, and with in-hospital polysomnography. We included suspected SAHS patients in a multicentre study and assigned them to home and hospital protocols at random. We constructed receiver operating characteristic (ROC) curves for manual and automatic scoring. Diagnostic agreement for several cut-off points was explored and costs for two equally effective alternatives were calculated. Of 366 randomised patients, 348 completed the protocol. Manual scoring produced better ROC curves than automatic scoring. There was no sensitive automatic or subsequent manual HRP apnoea-hypopnoea index (AHI) cut-off point. The specific cut-off points for automatic and subsequent manual HRP scorings (AHI >25 and >20, respectively) had a specificity of 93% for automatic and 94% for manual scorings. The costs of manual protocol were 9% higher than sequential HRP protocol; these were 69% and 64%, respectively, of the cost of the polysomnography. A sequential HRP scoring protocol is a cost-effective alternative to polysomnography, although with limited cost savings compared to HRP manual scoring.
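The sequential protocol is essentially a two-stage decision rule, sketched below as a minimal function. The cut-offs come from the abstract (automatic AHI > 25, manual AHI > 20); how residual and negative cases are routed in the full protocol is an assumption of this sketch.

```python
from typing import Optional

def sequential_hrp_result(auto_ahi: float,
                          manual_ahi: Optional[float] = None) -> str:
    """Sketch of sequential scoring: accept automatic scoring when it
    exceeds its high-specificity cut-off, otherwise fall back to manual
    (technician) scoring; negative-case handling is assumed here."""
    if auto_ahi > 25:                    # ~93% specificity per the abstract
        return "SAHS confirmed by automatic scoring"
    if manual_ahi is None:
        return "residual case: technician scoring required"
    if manual_ahi > 20:                  # ~94% specificity per the abstract
        return "SAHS confirmed by manual scoring"
    return "not confirmed: consider in-hospital polysomnography"
```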
Software development for teleroentgenogram analysis
NASA Astrophysics Data System (ADS)
Goshkoderov, A. A.; Khlebnikov, N. A.; Obabkov, I. N.; Serkov, K. V.; Gajniyarov, I. M.; Aliev, A. A.
2017-09-01
A framework for the analysis and calculation of teleroentgenograms was developed. Software development was carried out in the Department of Children's Dentistry and Orthodontics at Ural State Medical University. The software calculates the teleroentgenogram by an original method developed in this department, and it also allows users to design their own methods for calculating teleroentgenograms. It is planned to incorporate machine learning (neural networks) into the software; this will make the process of calculating teleroentgenograms easier, because methodological points will be placed automatically.
Applying CASE Tools for On-Board Software Development
NASA Astrophysics Data System (ADS)
Brammer, U.; Hönle, A.
For many space projects the software development is facing great pressure with respect to quality, costs and schedule. One way to cope with these challenges is the application of CASE tools for automatic generation of code and documentation. This paper describes two CASE tools: Rhapsody (I-Logix) featuring UML and ISG (BSSE) that provides modeling of finite state machines. Both tools have been used at Kayser-Threde in different space projects for the development of on-board software. The tools are discussed with regard to the full software development cycle.
The Role of Computers in Research and Development at Langley Research Center
NASA Technical Reports Server (NTRS)
Wieseman, Carol D. (Compiler)
1994-01-01
This document is a compilation of presentations given at a workshop on the role of computers in research and development at the Langley Research Center. The objectives of the workshop were to inform the Langley Research Center community of the current software systems and software practices in use at Langley. The workshop was organized in 10 sessions: Software Engineering; Software Engineering Standards, Methods, and CASE Tools; Solutions of Equations; Automatic Differentiation; Mosaic and the World Wide Web; Graphics and Image Processing; System Design Integration; CAE Tools; Languages; and Advanced Topics.
Language and Program for Documenting Software Design
NASA Technical Reports Server (NTRS)
Kleine, H.; Zepko, T. M.
1986-01-01
The Software Design and Documentation Language (SDDL) provides an effective communication medium to support the design and documentation of complex software applications. SDDL supports communication among all members of a software design team and provides for the production of informative documentation on the design effort. Use of the SDDL-generated document to analyze the design makes it possible to eliminate many errors that would otherwise not be detected until coding and testing are attempted. The SDDL processor program translates the designer's creative thinking into an effective document for communication. The processor performs as many automatic functions as possible, freeing the designer's energy for creative effort. The SDDL processor program is written in PASCAL.
Documenting clinical pharmacist intervention before and after the introduction of a web-based tool.
Nurgat, Zubeir A; Al-Jazairi, Abdulrazaq S; Abu-Shraie, Nada; Al-Jedai, Ahmed
2011-04-01
To develop a database for documenting pharmacist interventions through a web-based application. The secondary endpoint was to determine if the new, web-based application provides any benefits with regard to documentation compliance by clinical pharmacists and ease of calculating cost savings compared with our previous method of documenting pharmacist interventions. A tertiary care hospital in Saudi Arabia. The documentation of interventions using a web-based documentation application was retrospectively compared with previous methods of documentation of clinical pharmacists' interventions (multi-user PC software). The number and types of interventions recorded by pharmacists, data mining of archived data, efficiency, cost savings, and the accuracy of the data generated. The number of documented clinical interventions increased from 4,926, using the multi-user PC software, to 6,840 for the web-based application. On average, we observed 653 interventions per clinical pharmacist using the web-based application, an increase compared to an average of 493 interventions using the old multi-user PC software. However, using a paired Student's t-test, there was no statistically significant difference between the two means (P = 0.201). Using a χ² test, which captured management level and the type of system used, we found a strong effect of management level (P < 2.2 × 10⁻¹⁶) on the number of documented interventions. We also found a moderately significant relationship between educational level and the number of interventions documented (P = 0.045). The mean ± SD time required to document an intervention using the web-based application was 66.55 ± 8.98 s. Using the web-based application, 29.06% of documented interventions resulted in cost savings, while using the multi-user PC software only 4.75% of interventions did so. The majority of cost savings across both platforms resulted from the discontinuation of unnecessary drugs and changes in dosage regimens. Data collection using the web-based application was consistently more complete than with the multi-user PC software. The web-based application is an efficient system for documenting pharmacist interventions. Its flexibility and accessibility, as well as its detailed report functionality, make it a useful tool that will hopefully encourage other primary and secondary care facilities to adopt similar applications.
The Computer Revolution and Physical Chemistry.
ERIC Educational Resources Information Center
O'Brien, James F.
1989-01-01
Describes laboratory-oriented software programs that are short and time-saving, eliminate computational errors, and are not found in public-domain courseware. Program availability for IBM and Apple microcomputers is included. (RT)
NASA Astrophysics Data System (ADS)
Dore, C.; Murphy, M.
2013-02-01
This paper outlines a new approach for generating digital heritage models from laser scan or photogrammetric data using Historic Building Information Modelling (HBIM). HBIM is a plug-in for Building Information Modelling (BIM) software that uses parametric library objects and procedural modelling techniques to automate the modelling stage. The HBIM process involves a reverse engineering solution whereby parametric interactive objects representing architectural elements are mapped onto laser scan or photogrammetric survey data. A library of parametric architectural objects has been designed from historic manuscripts and architectural pattern books. These parametric objects were built using an embedded programming language within the ArchiCAD BIM software called Geometric Description Language (GDL). Procedural modelling techniques have been implemented with the same language to create a parametric building façade which automatically combines library objects based on architectural rules and proportions. Different configurations of the façade are controlled by user parameter adjustment. The automatically positioned elements of the façade can be subsequently refined using graphical editing while overlaying the model with orthographic imagery. Along with this semi-automatic method for generating façade models, manual plotting of library objects can also be used to generate a BIM model from survey data. After the 3D model has been completed conservation documents such as plans, sections, elevations and 3D views can be automatically generated for conservation projects.
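The procedural rules-and-proportions idea is easiest to see in code. The HBIM library itself is written in ArchiCAD's Geometric Description Language (GDL); the Python below is only a stand-in to illustrate the concept, and the object names, spacing rule, and proportions are invented for the sketch.

```python
from dataclasses import dataclass

@dataclass
class LibraryObject:
    """Stand-in for a parametric GDL library object."""
    name: str
    width: float   # metres
    height: float

def facade_layout(total_width: float, storey_heights: list,
                  window: LibraryObject, door: LibraryObject) -> list:
    """Combine library objects into a facade by simple proportional rules:
    evenly spaced windows per storey, one centred door at ground level."""
    placements = []
    y = 0.0
    for storey, h in enumerate(storey_heights):
        n = max(1, int(total_width // (window.width * 2)))   # spacing rule
        step = total_width / n
        for i in range(n):
            x = (i + 0.5) * step - window.width / 2
            centred = abs(x + window.width / 2 - total_width / 2) < step / 2
            if storey == 0 and centred:
                placements.append((door.name, total_width / 2 - door.width / 2, y))
            else:
                placements.append((window.name, x, y))
        y += h
    return placements

print(facade_layout(12.0, [4.0, 3.5],
                    LibraryObject("sash_window", 1.2, 2.0),
                    LibraryObject("panel_door", 1.4, 2.4)))
```

Changing the user parameters (total width, storey heights) regenerates a different but rule-consistent configuration, which is the behavior the paper describes for its GDL facades.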
Scheduling algorithms for automatic control systems for technological processes
NASA Astrophysics Data System (ADS)
Chernigovskiy, A. S.; Tsarev, R. Yu; Kapulin, D. V.
2017-01-01
Wide use of automatic process control systems, and the use of high-performance systems containing a number of computers (processors), open opportunities for creating high-quality and fast production that increases the competitiveness of an enterprise. Exact and fast calculations, control computation, and processing of big data arrays all require a high level of productivity and, at the same time, minimum time for data handling and receiving results. In order to achieve the best time, it is necessary not only to use computing resources optimally, but also to design and develop the software so that the time gain is maximal. For this purpose, task (job or operation) scheduling techniques for multi-machine/multiprocessor systems are applied. Some basic task scheduling methods for multi-machine process control systems are considered in this paper, their advantages and disadvantages are highlighted, and some considerations for their use in developing software for automatic process control systems are given.
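As a concrete instance of the kind of basic method such surveys cover, here is a sketch of Longest-Processing-Time (LPT) list scheduling, a classic greedy heuristic for minimising makespan on identical machines. The paper's exact set of algorithms is not listed in the abstract, so this choice is illustrative.

```python
import heapq

def lpt_schedule(tasks: dict, n_machines: int) -> dict:
    """LPT list scheduling: sort tasks by decreasing duration, then always
    assign the next task to the currently least-loaded machine."""
    heap = [(0.0, m) for m in range(n_machines)]   # (load, machine id)
    heapq.heapify(heap)
    assignment = {m: [] for m in range(n_machines)}
    for name, dur in sorted(tasks.items(), key=lambda kv: -kv[1]):
        load, m = heapq.heappop(heap)              # least-loaded machine
        assignment[m].append(name)
        heapq.heappush(heap, (load + dur, m))
    return assignment

# Example: five control-computation jobs on two processors
print(lpt_schedule({"A": 7, "B": 5, "C": 4, "D": 3, "E": 2}, 2))
```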
FlowerMorphology: fully automatic flower morphometry software.
Rozov, Sergey M; Deineko, Elena V; Deyneko, Igor V
2018-05-01
The software FlowerMorphology is designed for automatic morphometry of actinomorphic flowers. The novel complex parameters of flowers calculated by FlowerMorphology allowed us to quantitatively characterize a polyploid series of tobacco. Morphological differences between plants representing closely related lineages or mutants are mostly quantitative. Very often, there are only very fine variations in plant morphology. Therefore, accurate and high-throughput methods are needed for their quantification. In addition, new characteristics are necessary for reliable detection of subtle changes in morphology. FlowerMorphology is an all-in-one software package to automatically image and analyze five-petal actinomorphic flowers of dicotyledonous plants. Sixteen directly measured parameters and ten calculated complex parameters of a flower allow us to characterize variations with high accuracy. The program was developed for the needs of automatic characterization of Nicotiana tabacum flowers, but is applicable to many other plants with five-petal actinomorphic flowers and can be adapted for flowers of other merosity. A genetically similar polyploid series of N. tabacum plants was used to investigate differences in flower morphology. For the first time, we could quantify the dependence between ploidy and the size and form of the tobacco flowers. We found that the radius of inner petal incisions shows a persistent positive correlation with the chromosome number. In contrast, a commonly used parameter, the radius of the outer corolla, does not discriminate 2n and 4n plants. Other parameters show that polyploidy leads to significant aberrations in flower symmetry that are also positively correlated with chromosome number. Executables of FlowerMorphology, source code, documentation, and examples are available at the program website: https://github.com/Deyneko/FlowerMorphology
NASA Astrophysics Data System (ADS)
Ismail, M. A. M.; Kumar, N. S.; Abidin, M. H. Z.; Madun, A.
2018-04-01
This study presents a systematic approach to photogrammetric surveying for the extraction of elevation data for geophysical surveys in hilly terrain using Unmanned Aerial Vehicles (UAVs). The outcome is the acquisition of high-quality geophysical data from areas of varying elevation by locating the best survey lines. The study area is located at the proposed construction site for the development of a water reservoir and related infrastructure at Kampus Pauh Putra, Universiti Malaysia Perlis. Seismic refraction surveys were carried out to model the subsurface for detailed site investigation. A study was carried out to identify the accuracy of the digital elevation model (DEM) produced from a UAV. At 100 m altitude (flying height), over 135 overlapping images were acquired using a DJI Phantom 3 quadcopter. All acquired images were processed for automatic 3D photo-reconstruction using Agisoft PhotoScan digital photogrammetric software, which was applied to all photogrammetric stages. The products generated included a 3D model, dense point cloud, mesh surface, digital orthophoto, and DEM. To validate the accuracy of the produced DEM, the coordinates of selected ground control points (GCPs) along the survey line in the imaging area were extracted from the generated DEM with the aid of Global Mapper software. These coordinates were compared with the GCPs obtained using a real-time kinematic global positioning system. The maximum difference between the GCPs and the photogrammetric survey is 13.3%. UAVs are suitable for acquiring elevation data for geophysical surveys, which can save time and cost.
Traffic Aware Planner (TAP) Flight Evaluation
NASA Technical Reports Server (NTRS)
Maris, John M.; Haynes, Mark A.; Wing, David J.; Burke, Kelly A.; Henderson, Jeff; Woods, Sharon E.
2014-01-01
NASA's Traffic Aware Planner (TAP) is a cockpit decision support tool that has the potential to achieve significant fuel and time savings when it is embedded in the data-rich Next Generation Air Transportation System (NextGen) airspace. To address a key step towards the operational deployment of TAP and the NASA concept of Traffic Aware Strategic Aircrew Requests (TASAR), a system evaluation was conducted in a representative flight environment in November, 2013. Numerous challenges were overcome to achieve this goal, including the porting of the foundational Autonomous Operations Planner (AOP) software from its original simulation-based, avionics-embedded environment to an Electronic Flight Bag (EFB) platform. A flight-test aircraft was modified to host the EFB, the TAP application, an Automatic Dependent Surveillance Broadcast (ADS-B) processor, and a satellite broadband datalink. Nine Evaluation Pilots conducted 26 hours of TAP assessments using four route profiles in the complex eastern and north-eastern United States airspace. Extensive avionics and video data were collected, supplemented by comprehensive inflight and post-flight questionnaires. TAP was verified to function properly in the live avionics and ADS-B environment, characterized by recorded data dropouts, latency, and ADS-B message fluctuations. Twelve TAP-generated optimization requests were submitted to ATC, of which nine were approved, and all of which resulted in fuel and/or time savings. Analysis of subjective workload data indicated that pilot interaction with TAP during flight operations did not induce additional cognitive loading. Additionally, analyses of post-flight questionnaire data showed that the pilots perceived TAP to be useful, understandable, intuitive, and easy to use. All program objectives were met, and the next phase of TAP development and evaluations with partner airlines is in planning for 2015.
Automatic pelvis segmentation from x-ray images of a mouse model
NASA Astrophysics Data System (ADS)
Al Okashi, Omar M.; Du, Hongbo; Al-Assam, Hisham
2017-05-01
The automatic detection and quantification of skeletal structures has a variety of different applications for biological research. Accurate segmentation of the pelvis from X-ray images of mice in a high-throughput project such as the Mouse Genomes Project not only saves time and cost but also helps achieving an unbiased quantitative analysis within the phenotyping pipeline. This paper proposes an automatic solution for pelvis segmentation based on structural and orientation properties of the pelvis in X-ray images. The solution consists of three stages including pre-processing image to extract pelvis area, initial pelvis mask preparation and final pelvis segmentation. Experimental results on a set of 100 X-ray images showed consistent performance of the algorithm. The automated solution overcomes the weaknesses of a manual annotation procedure where intra- and inter-observer variations cannot be avoided.
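The three stages the paper names (pre-processing, initial mask, final segmentation) map naturally onto a classic threshold-and-morphology pipeline, sketched below with scikit-image. Thresholds, structuring-element sizes, and the final selection rule are invented for the sketch; the paper's structural and orientation criteria are not reproduced here.

```python
import numpy as np
from skimage import filters, morphology, measure

def segment_pelvis(xray: np.ndarray) -> np.ndarray:
    """Toy three-stage pipeline mirroring the paper's outline."""
    # Stage 1: pre-processing -- isolate the brighter skeletal structures
    t = filters.threshold_otsu(xray)
    bone = xray > t
    # Stage 2: initial mask -- clean speckle and close small gaps
    mask = morphology.remove_small_objects(bone, min_size=500)
    mask = morphology.binary_closing(mask, morphology.disk(5))
    # Stage 3: final segmentation -- keep the largest connected component,
    # standing in for the structural/orientation selection of the pelvis
    labels = measure.label(mask)
    largest = max(measure.regionprops(labels), key=lambda r: r.area)
    return labels == largest.label
```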
Automatic Layout Design for Power Module
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ning, Puqi; Wang, Fei; Ngo, Khai
The layout of power modules is one of the most important elements in power module design, especially at high power densities, where couplings are increased. In this paper, an automatic design process using a genetic algorithm (GA) is presented, and some practical considerations are introduced into the optimization of the module layout design. Detailed GA implementations are introduced for both the outer loop and the inner loop. As verified by a design example, the results of the automatic design process presented here are better than those from manual design and also better than the results from a popular design software package. This automatic design procedure could be a major step toward improving the overall performance of future layout designs.
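To make the GA machinery concrete, here is a minimal genetic-algorithm skeleton for ordering components so as to shorten interconnects. The one-dimensional encoding, wire-length-only fitness, and GA settings are illustrative assumptions; the paper's actual encoding, fitness terms (parasitics, thermal coupling), and nested-loop structure are richer.

```python
import random

def fitness(order: list, nets: list, pitch: float = 1.0) -> float:
    """Higher is better: negative total wire length for a 1-D row layout."""
    pos = {comp: i * pitch for i, comp in enumerate(order)}
    return -sum(abs(pos[a] - pos[b]) for a, b in nets)

def evolve(n_comp: int, nets: list, pop: int = 40,
           gens: int = 200, pmut: float = 0.2) -> list:
    population = [random.sample(range(n_comp), n_comp) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda o: fitness(o, nets), reverse=True)
        parents = population[: pop // 2]           # elitist selection
        children = []
        while len(children) < pop - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_comp)      # order crossover
            child = a[:cut] + [g for g in b if g not in a[:cut]]
            if random.random() < pmut:             # swap mutation
                i, j = random.sample(range(n_comp), 2)
                child[i], child[j] = child[j], child[i]
            children.append(child)
        population = parents + children
    return max(population, key=lambda o: fitness(o, nets))

print(evolve(6, [(0, 3), (1, 4), (2, 5), (0, 5)]))
```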
Reproducible research in palaeomagnetism
NASA Astrophysics Data System (ADS)
Lurcock, Pontus; Florindo, Fabio
2015-04-01
The reproducibility of research findings is attracting increasing attention across all scientific disciplines. In palaeomagnetism as elsewhere, computer-based analysis techniques are becoming more commonplace, complex, and diverse. Analyses can often be difficult to reproduce from scratch, both for the original researchers and for others seeking to build on the work. We present a palaeomagnetic plotting and analysis program designed to make reproducibility easier. Part of the problem is the divide between interactive and scripted (batch) analysis programs. An interactive desktop program with a graphical interface is a powerful tool for exploring data and iteratively refining analyses, but usually cannot operate without human interaction. This makes it impossible to re-run an analysis automatically, or to integrate it into a larger automated scientific workflow - for example, a script to generate figures and tables for a paper. In some cases the parameters of the analysis process itself are not saved explicitly, making it hard to repeat or improve the analysis even with human interaction. Conversely, non-interactive batch tools can be controlled by pre-written scripts and configuration files, allowing an analysis to be 'replayed' automatically from the raw data. However, this advantage comes at the expense of exploratory capability: iteratively improving an analysis entails a time-consuming cycle of editing scripts, running them, and viewing the output. Batch tools also tend to require more computer expertise from their users. PuffinPlot is a palaeomagnetic plotting and analysis program which aims to bridge this gap. First released in 2012, it offers both an interactive, user-friendly desktop interface and a batch scripting interface, both making use of the same core library of palaeomagnetic functions. We present new improvements to the program that help to integrate the interactive and batch approaches, allowing an analysis to be interactively explored and refined, then saved as a self-contained configuration which can be re-run without human interaction. PuffinPlot can thus be used as a component of a larger scientific workflow, integrated with workflow management tools such as Kepler, without compromising its capabilities as an exploratory tool. Since both PuffinPlot and the platform it runs on (Java) are Free/Open Source software, even the most fundamental components of an analysis can be verified and reproduced.
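The shared-core design the abstract describes, one analysis library behind both the GUI and the batch interface, can be sketched generically. This illustrates the pattern only; it is not PuffinPlot's actual (Java) API, and the JSON configuration layout is invented.

```python
# One core routine serves both paths: the GUI calls it interactively and
# writes a JSON session file; the batch entry point replays that file with
# no human interaction, e.g. from a figure-generating workflow script.
import json
import sys

def mean_vector(steps: list) -> tuple:
    """Placeholder core analysis routine shared by GUI and batch paths."""
    n = len(steps)
    return tuple(sum(s[i] for s in steps) / n for i in range(3))

def run_batch(config_path: str) -> None:
    """Replay a saved analysis from its self-contained configuration."""
    with open(config_path) as f:
        cfg = json.load(f)
    for sample in cfg["samples"]:
        print(sample["name"], mean_vector(sample["steps"]))

if __name__ == "__main__":
    run_batch(sys.argv[1])
```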
A MultiDiscipline Approach to Digitizing Historic Seismograms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartlett, Andrew
2016-04-07
Retriever Technology has developed and has made available free of charge a seismogram digitization software package called SKATE (Seismogram Kit for Automatic Trace Extraction). We have developed an extensive set of algorithms that process seismogram image files, provide editing tools, and output time series data. The software is available online and free of charge at seismo.redfish.com. To demonstrate the speed and cost effectiveness of the software, we have processed over 30,000 images.
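The core trace-extraction idea can be illustrated with a deliberately simplified sketch: take the darkest pixel in each column of the scanned record. SKATE's actual algorithms also handle curvature correction, crossing traces, and gaps, none of which appear below, and the scale factors are user-supplied assumptions.

```python
import numpy as np
from PIL import Image

def extract_trace(image_path: str, px_per_second: float, px_per_mm: float):
    """Return (time, amplitude) arrays recovered from a greyscale scan by
    locating the darkest row in each pixel column (ink vs. paper)."""
    img = np.asarray(Image.open(image_path).convert("L"), dtype=float)
    darkest_row = img.argmin(axis=0)              # one trace point per column
    t = np.arange(img.shape[1]) / px_per_second   # seconds along the x axis
    amplitude_mm = (img.shape[0] / 2 - darkest_row) / px_per_mm
    return t, amplitude_mm
```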
Accuracy of open-source software segmentation and paper-based printed three-dimensional models.
Szymor, Piotr; Kozakiewicz, Marcin; Olszewski, Raphael
2016-02-01
In this study, we aimed to verify the accuracy of models created with the help of open-source Slicer 3.6.3 software (Surgical Planning Lab, Harvard Medical School, Harvard University, Boston, MA, USA) and the Mcor Matrix 300 paper-based 3D printer. Our study focused on the accuracy of recreating the walls of the right orbit of a cadaveric skull. Cone beam computed tomography (CBCT) of the skull was performed (0.25-mm pixel size, 0.5-mm slice thickness). Acquired DICOM data were imported into Slicer 3.6.3 software, where segmentation was performed. A virtual model was created and saved as an .STL file and imported into Netfabb Studio professional 4.9.5 software. Three different virtual models were created by cutting the original file along three different planes (coronal, sagittal, and axial). All models were printed with a Selective Deposition Lamination Technology Matrix 300 3D printer using 80 gsm A4 paper. The models were printed so that their cutting plane was parallel to the paper sheets creating the model. Each model (coronal, sagittal, and axial) consisted of three separate parts (∼200 sheets of paper each) that were glued together to form a final model. The skull and created models were scanned with a three-dimensional (3D) optical scanner (Breuckmann smart SCAN) and were saved as .STL files. Comparisons of the orbital walls of the skull, the virtual model, and each of the three paper models were carried out with GOM Inspect 7.5SR1 software. Deviations measured between the models analysed were presented in the form of a colour-labelled map and covered with an evenly distributed network of points automatically generated by the software. An average of 804.43 ± 19.39 points for each measurement was created. Differences measured in each point were exported as a .csv file. The results were statistically analysed using Statistica 10, with statistical significance set at p < 0.05. The average number of points created on models for each measurement was 804.43 ± 19.39; however, deviation in some of the generated points could not be calculated, and those points were excluded from further calculations. From 94% to 99% of the measured absolute deviations were <1 mm. The mean absolute deviation between the skull and virtual model was 0.15 ± 0.11 mm, between the virtual and printed models was 0.15 ± 0.12 mm, and between the skull and printed models was 0.24 ± 0.21 mm. Using the optical scanner and specialized inspection software for measurements of accuracy of the created parts is recommended, as it allows one not only to measure 2-dimensional distances between anatomical points but also to perform more clinically suitable comparisons of whole surfaces. However, it requires specialized software and a very accurate scanner in order to be useful. Threshold-based, manually corrected segmentation of orbital walls performed with 3D Slicer software is accurate enough to be used for creating a virtual model of the orbit. The accuracy of the paper-based Mcor Matrix 300 3D printer is comparable to those of other commonly used 3-dimensional printers and allows one to create precise anatomical models for clinical use. The method of dividing the model into smaller parts and sticking them together seems to be quite accurate, although we recommend it only for creating small, solid models with as few parts as possible to minimize shift associated with gluing. Copyright © 2015 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.
Activity and task of the saveMLAK and aid for library
NASA Astrophysics Data System (ADS)
Okamoto, Makoto
We report the activities of saveMLAK, an organization dedicated to supporting museums, libraries, archives, and kominkans damaged by the Great East Japan Earthquake, focusing on the activities for libraries. saveMLAK provides a website using MediaWiki collaborative editing software for accumulating information regarding damage and support activities, offering information support, indirect support, and intermediary support. We also report the collaboration with Miyagi Prefectural Library based on the accumulated, shared information as an example of support for libraries in the disaster area. We describe the process of the activities of saveMLAK and problems emerging so far, and provide constructive criticism and proposals to other support activities for libraries. In conclusion, we suggest establishment of permanent organizations/functions to prepare for emergencies and to cope with disasters in the future.
Using Wmatrix to Explore Discourse of Economic Growth
ERIC Educational Resources Information Center
Hu, Chunyu
2015-01-01
Growth is a concept of particular interest for economic discourse. This paper sets out to explore a small corpus of economic growth, which consists of articles from "The Economist". The corpus software used in this study is a web-based tool Wmatrix, an automatic tagging software able to assign semantic field (domain) tags, and to permit…
ERIC Educational Resources Information Center
Franco, Horacio; Bratt, Harry; Rossier, Romain; Rao Gadde, Venkata; Shriberg, Elizabeth; Abrash, Victor; Precoda, Kristin
2010-01-01
SRI International's EduSpeak[R] system is a software development toolkit that enables developers of interactive language education software to use state-of-the-art speech recognition and pronunciation scoring technology. Automatic pronunciation scoring allows the computer to provide feedback on the overall quality of pronunciation and to point to…
Tailoring Software for Multiple Processor Systems
1982-10-01
resource management decisions. Despite the lack of programming support, the use of multiple processor systems has grown substantially. Software has...making resource management decisions. Specifically, programmers need not allocate specific hardware resources to individual program components...Instead, such allocation decisions are automatically made based on high-level resource directives stated by application programmers, where each directive
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-10-20
An automated drift-time extraction and collision cross section computation software tool for small-molecule analysis with ion mobility spectrometry-mass spectrometry (IMS-MS). The software automatically extracts drift times and computes the associated collision cross sections for small molecules analyzed using IMS-MS, based on a target list of expected ions provided by the user.
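The abstract names the computation but not the formula. For a uniform-field drift tube, the conventional route is drift time to mobility to CCS via the Mason-Schamp equation, sketched below; instrument-specific calibration (e.g., for travelling-wave IMS) is outside this sketch, and nothing here is claimed to be the tool's internal code.

```python
import math

KB = 1.380649e-23            # Boltzmann constant, J/K
E_CHARGE = 1.602176634e-19   # elementary charge, C
N0 = 2.6867801e25            # gas number density at 273.15 K, 760 Torr (m^-3)
DA_TO_KG = 1.66053906660e-27

def ccs_mason_schamp(drift_time_s, drift_length_m, e_field_v_per_m,
                     temp_k, pressure_torr, ion_mass_da, gas_mass_da,
                     charge=1):
    """Collision cross section (m^2) from a measured drift time."""
    # Mobility from arrival time, reduced to standard temperature/pressure
    k = drift_length_m / (drift_time_s * e_field_v_per_m)   # m^2 / (V s)
    k0 = k * (pressure_torr / 760.0) * (273.15 / temp_k)
    # Reduced mass of the ion / buffer-gas pair
    mu = (ion_mass_da * gas_mass_da) / (ion_mass_da + gas_mass_da) * DA_TO_KG
    # Mason-Schamp equation
    return (3.0 * charge * E_CHARGE) / (16.0 * N0) \
        * math.sqrt(2.0 * math.pi / (mu * KB * temp_k)) / k0
```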
The algorithm for automatic detection of the calibration object
NASA Astrophysics Data System (ADS)
Artem, Kruglov; Irina, Ugfeld
2017-06-01
The problem of automatic image calibration is considered in this paper. The most challenging task of automatic calibration is proper detection of the calibration object. Solving this problem required the application of digital image processing methods and algorithms such as morphology, filtering, edge detection, and shape approximation. The step-by-step development of the algorithm and its adaptation to the specific conditions of log cuts in the image background is presented. Testing of the automatic calibration module was carried out under the production conditions of a logging enterprise. In these tests, the calibration object was automatically isolated in 86.1% of cases on average, with no type I errors. The algorithm was implemented in the automatic calibration module within mobile software for log deck volume measurement.
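The filter-edges-contours-shape-test chain the abstract lists looks roughly like the OpenCV sketch below, assuming a circular calibration disc. The thresholds, minimum area, and circularity test are assumptions, not the authors' tuned values.

```python
import cv2
import numpy as np

def find_calibration_disc(image_path: str):
    """Locate a circular calibration object against a background of log
    cuts: filter -> edge detection -> morphology -> contour shape test."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    blur = cv2.medianBlur(img, 5)                  # suppress bark texture
    edges = cv2.Canny(blur, 50, 150)
    edges = cv2.morphologyEx(edges, cv2.MORPH_CLOSE,
                             np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    best = None
    for c in contours:
        area = cv2.contourArea(c)
        if area < 500:                             # ignore small blobs
            continue
        perim = cv2.arcLength(c, True)
        circularity = 4 * np.pi * area / (perim * perim)  # 1.0 for a circle
        if circularity > 0.8 and (best is None or area > cv2.contourArea(best)):
            best = c
    # Returns ((cx, cy), radius) of the best circular candidate, or None
    return cv2.minEnclosingCircle(best) if best is not None else None
```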
Payload accommodation and development planning tools - A Desktop Resource Leveling Model (DRLM)
NASA Technical Reports Server (NTRS)
Hilchey, John D.; Ledbetter, Bobby; Williams, Richard C.
1989-01-01
The Desktop Resource Leveling Model (DRLM) has been developed as a tool to rapidly structure and manipulate accommodation, schedule, and funding profiles for any kind of experiments, payloads, facilities, and flight systems or other project hardware. The model creates detailed databases describing 'end item' parameters, such as mass, volume, power requirements or costs and schedules for payload, subsystem, or flight system elements. It automatically spreads costs by calendar quarters and sums costs or accommodation parameters by total project, payload, facility, payload launch, or program phase. Final results can be saved or printed out, automatically documenting all assumptions, inputs, and defaults.
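The spread-and-sum behavior described above reduces to two small operations, sketched here. The flat (even) spreading rule and the field names are illustrative assumptions; DRLM's actual spreading profiles are not specified in the abstract.

```python
from collections import defaultdict

def spread_cost(total: float, start_q: int, end_q: int) -> dict:
    """Spread a cost evenly over calendar quarters start_q..end_q."""
    n = end_q - start_q + 1
    return {q: total / n for q in range(start_q, end_q + 1)}

def rollup(end_items: list) -> dict:
    """Sum per-quarter costs over all end items in a payload or project."""
    totals = defaultdict(float)
    for item in end_items:
        quarterly = spread_cost(item["cost"], item["start_q"], item["end_q"])
        for q, c in quarterly.items():
            totals[q] += c
    return dict(totals)

# Example: two hypothetical end items rolled up by quarter
print(rollup([{"cost": 400.0, "start_q": 1, "end_q": 4},
              {"cost": 150.0, "start_q": 3, "end_q": 5}]))
```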
Automatic system for ionization chamber current measurements.
Brancaccio, Franco; Dias, Mauro S; Koskinas, Marina F
2004-12-01
The present work describes an automatic system developed for current integration measurements at the Laboratório de Metrologia Nuclear of Instituto de Pesquisas Energéticas e Nucleares. This system includes software (graphical user interface and control) and a module connected to a microcomputer by means of a commercial data acquisition card. Measurements were performed to check the performance and to validate the proposed design.
Automatic Inference of Cryptographic Key Length Based on Analysis of Proof Tightness
2016-06-01
within an attack tree structure, then expand attack tree methodology to include cryptographic reductions. We then provide the algorithms for...maintaining and automatically reasoning about these expanded attack trees. We provide a software tool that utilizes machine-readable proof and attack metadata...and the attack tree methodology to provide rapid and precise answers regarding security parameters and effective security. This eliminates the need
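A toy version of automatic reasoning over such a tree is sketched below: leaves carry work factors in bits, OR nodes take the cheapest branch, and AND nodes add costs on the log scale. Modelling a reduction as an AND of the target's security and a tightness-loss term is this sketch's assumption, not the report's exact formulation.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    kind: str                        # "AND", "OR", or "LEAF"
    cost_bits: float = 0.0           # work factor for leaves, log2 scale
    children: list = field(default_factory=list)

def effective_security(n: Node) -> float:
    """Cheapest total attack cost implied by the tree, in bits of work."""
    if n.kind == "LEAF":
        return n.cost_bits
    costs = [effective_security(c) for c in n.children]
    return min(costs) if n.kind == "OR" else sum(costs)

# Either recover a 128-bit key directly, or break the proof target
# (100 bits) while paying a hypothetical 30-bit tightness loss.
tree = Node("OR", children=[
    Node("LEAF", 128.0),
    Node("AND", children=[Node("LEAF", 100.0), Node("LEAF", 30.0)]),
])
print(effective_security(tree))      # -> 128.0
```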
Quality Control of True Height Profiles Obtained Automatically from Digital Ionograms.
1982-05-01
Keywords: ionosphere, Digisonde, electron density profile, ionogram, autoscaling, ARTIST. …analysis technique currently used with the ionogram traces scaled automatically by the ARTIST software [Reinisch and Huang, 1983; Reinisch et al., 1984], and the generalized polynomial analysis technique POLAN [Titheridge, 1985], using the same ARTIST-identified ionogram traces. 2. To determine how
ERIC Educational Resources Information Center
Shih, Ching-Hsiang; Huang, Hsun-Chin; Liao, Yung-Kun; Shih, Ching-Tien; Chiang, Ming-Shan
2010-01-01
The latest researches adopted software technology to improve pointing performance; however, Drag-and-Drop (DnD) operation is also commonly used in modern GUI programming. This study evaluated whether two children with developmental disabilities would be able to improve their DnD performance, through an Automatic DnD Assistive Program (ADnDAP). At…
Super Strypi HWIL 6DOF (Hardware-In-Loop six-degree-of-freedom) Rev. 2175
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilkey, Jeff C.; Harl, Nathan R.; Kowalchuk, Scott A.
2016-02-23
The Super Strypi HWIL is a six-degree-of-freedom (6DOF) simulation for the Super Strypi launch vehicle. The simulation is used to test the NGC flight software, including the navigation software. Aerodynamic and propulsive forces, mass properties, and ACS (attitude control system) parameters are defined in input files. Output parameters are saved to a MATLAB .mat file.
Funding for Life: When to Spend the Acquisition Pot
2010-05-01
Private Military Sector Software Requirements for OA Spiral Development Strategy for Defense Acquisition Research The Software, Hardware...qb=p`elli= Capital Budgeting for the DoD Energy Saving Contracts/DoD Mobile Assets Financing DoD Budget via PPPs Lessons from Private Sector ...the endeavor can, in part, be related to the stability of the aims and contributory components. Economic growth has been driven by globalisation
NASA Astrophysics Data System (ADS)
Brambilla, Marco; Ceri, Stefano; Valle, Emanuele Della; Facca, Federico M.; Tziviskou, Christina
Although Semantic Web Services are expected to produce a revolution in the development of Web-based systems, very few enterprise-wide design experiences are available; one of the main reasons is the lack of sound Software Engineering methods and tools for the deployment of Semantic Web applications. In this chapter, we present an approach to software development for the Semantic Web based on classical Software Engineering methods (i.e., formal business process development, computer-aided and component-based software design, and automatic code generation) and on semantic methods and tools (i.e., ontology engineering, semantic service annotation and discovery).
Automatic Parameter Tuning for the Morpheus Vehicle Using Particle Swarm Optimization
NASA Technical Reports Server (NTRS)
Birge, B.
2013-01-01
A high fidelity simulation using a PC-based Trick framework has been developed for Johnson Space Center's Morpheus test bed flight vehicle. There is an iterative development loop of refining and testing the hardware, refining the software, comparing the software simulation to hardware performance, and adjusting either or both the hardware and the simulation to extract the best performance from the hardware as well as the most realistic representation of the hardware from the software. A Particle Swarm Optimization (PSO) based technique has been developed that increases the speed and accuracy of this iterative development cycle. Parameters in software can be automatically tuned to make the simulation match real-world subsystem data from test flights. Special considerations for scale, linearity, and discontinuities can be all but ignored with this technique, allowing fast turnaround both for simulation tune-up to match hardware changes and during the test and validation phase to help identify hardware issues. Software models with insufficient control authority to match hardware test data can be immediately identified, and using this technique requires very little to no specialized knowledge of optimization, freeing model developers to concentrate on spacecraft engineering. Integration of the PSO into the Morpheus development cycle will be discussed as well as a case study highlighting the tool's effectiveness.
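For readers unfamiliar with PSO, here is a plain global-best variant of the algorithm. In the tuning setting described above, the objective would be something like the RMS error between simulated output and flight telemetry; that wiring, and all the constants below, are assumptions for illustration, not NASA's implementation.

```python
import random

def pso(objective, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Global-best particle swarm optimization over box-constrained
    parameters; returns the best parameter vector and its objective value."""
    dim = len(bounds)
    xs = [[random.uniform(lo, hi) for lo, hi in bounds]
          for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]                    # per-particle best
    pbest_f = [objective(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]      # swarm-wide best
    for _ in range(iters):
        for i, x in enumerate(xs):
            for d in range(dim):
                vs[i][d] = (w * vs[i][d]
                            + c1 * random.random() * (pbest[i][d] - x[d])
                            + c2 * random.random() * (gbest[d] - x[d]))
                x[d] = min(max(x[d] + vs[i][d], bounds[d][0]), bounds[d][1])
            f = objective(x)
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = x[:], f
                if f < gbest_f:
                    gbest, gbest_f = x[:], f
    return gbest, gbest_f

# Toy usage: fit two parameters to minimise a quadratic "mismatch"
best, err = pso(lambda p: (p[0] - 3) ** 2 + (p[1] + 1) ** 2,
                [(-10, 10), (-10, 10)])
print(best, err)
```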
Fully automatic adjoints: a robust and efficient mechanism for generating adjoint ocean models
NASA Astrophysics Data System (ADS)
Ham, D. A.; Farrell, P. E.; Funke, S. W.; Rognes, M. E.
2012-04-01
The problem of generating and maintaining adjoint models is sufficiently difficult that typically only the most advanced and well-resourced community ocean models achieve it. There are two current technologies, each of which suffers from its own limitations. Algorithmic differentiation, also called automatic differentiation, is employed by models such as the MITGCM [2] and the Alfred Wegener Institute model FESOM [3]. This technique is very difficult to apply to existing code and requires a major initial investment to prepare the code for automatic adjoint generation. AD tools may also have difficulty with code employing modern software constructs such as derived data types. An alternative is to formulate the adjoint differential equation and to discretise this separately. This approach, known as the continuous adjoint and employed in ROMS [4], has the disadvantage that two different model code bases must be maintained and manually kept synchronised as the model develops. The discretisation of the continuous adjoint is not automatically consistent with that of the forward model, producing an additional source of error. The alternative presented here is to formulate the flow model in the high-level language UFL (Unified Form Language) and to automatically generate the model using the software of the FEniCS project. In this approach it is the high-level code specification which is differentiated, a task very similar to the formulation of the continuous adjoint [5]. However, since the forward and adjoint models are generated automatically, the difficulty of maintaining them vanishes and the software engineering process is therefore robust. The scheduling and execution of the adjoint model, including the application of an appropriate checkpointing strategy, is managed by libadjoint [1]. In contrast to the conventional algorithmic differentiation description of a model as a series of primitive mathematical operations, libadjoint employs a new abstraction of the simulation process as a sequence of discrete equations which are assembled and solved. It is the coupling of the respective abstractions employed by libadjoint and the FEniCS project which produces the adjoint model automatically, without further intervention from the model developer. This presentation will demonstrate this new technology through linear and non-linear shallow water test cases. The exceptionally simple model syntax will be highlighted and the correctness of the resulting adjoint simulations will be demonstrated using rigorous convergence tests.
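The kernel of the adjoint idea can be shown without any of the UFL/libadjoint machinery: for a discrete forward model A(m)u = b and functional J(u), one extra linear solve with A^T yields the gradient dJ/dm. A minimal numpy sketch with toy operators (not an ocean model):

    import numpy as np

    # Forward model A(m) u = b with A = K + m*M; functional J(u) = sum(u).
    n = 50
    K = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1))
    M = np.eye(n)
    b = np.ones(n)
    m = 0.3

    A = K + m * M
    u = np.linalg.solve(A, b)                # forward solve
    lam = np.linalg.solve(A.T, np.ones(n))   # adjoint solve, since dJ/du = 1
    dJdm = -lam @ (M @ u)                    # gradient dJ/dm = -lam^T (dA/dm) u

    eps = 1e-6                               # finite-difference check
    u2 = np.linalg.solve(K + (m + eps) * M, b)
    print(dJdm, (u2.sum() - u.sum()) / eps)  # the two values agree closely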
Develop Advanced Nonlinear Signal Analysis Topographical Mapping System
NASA Technical Reports Server (NTRS)
Jong, Jen-Yi
1997-01-01
During the development of the SSME, a hierarchy of advanced signal analysis techniques for mechanical signature analysis was developed by NASA and AI Signal Research Inc. (ASRI) to improve the safety and reliability of Space Shuttle operations. These techniques can process and identify intelligent information hidden in a measured signal that is often unidentifiable using conventional signal analysis methods. Currently, due to the highly interactive processing requirements and the volume of dynamic data involved, detailed diagnostic analysis is performed manually, which requires immense man-hours with extensive human involvement. To overcome this manual process, NASA implemented this program to develop an Advanced nonlinear signal Analysis Topographical Mapping System (ATMS) to provide automatic/unsupervised engine diagnostic capabilities. The ATMS utilizes a rule-based CLIPS expert system to supervise a hierarchy of diagnostic signature analysis techniques in the Advanced Signal Analysis Library (ASAL). ASAL performs automatic signal processing, archiving, and anomaly detection/identification tasks in order to provide an intelligent and fully automated engine diagnostic capability. The ATMS has been successfully developed under this contract. In summary, the program objectives to design, develop, test, and conduct performance evaluation for an automated engine diagnostic system have been successfully achieved. Software implementation of the entire ATMS system on MSFC's OISPS computer has been completed. The significance of the ATMS developed under this program lies in its fully automated coherence analysis capability for anomaly detection and identification, which can greatly enhance the power and reliability of engine diagnostic evaluation. The results have demonstrated that ATMS can significantly save time and man-hours in performing engine test/flight data analysis and performance evaluation of large volumes of dynamic test data.
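The coherence analysis at the heart of such diagnostics is easy to illustrate; the following sketch applies SciPy's magnitude-squared coherence to synthetic two-channel data (it is not the ATMS/ASAL code, and the 0.8 alert threshold is illustrative):

    import numpy as np
    from scipy.signal import coherence

    fs = 2048.0
    t = np.arange(0, 10, 1 / fs)
    rng = np.random.default_rng(1)
    common = np.sin(2 * np.pi * 120 * t)              # shared machinery tone
    ch_a = common + 0.5 * rng.standard_normal(t.size)
    ch_b = 0.8 * common + 0.5 * rng.standard_normal(t.size)

    f, Cxy = coherence(ch_a, ch_b, fs=fs, nperseg=1024)
    for fi, ci in zip(f, Cxy):
        if ci > 0.8:                                  # strongly coherent component
            print(f"coherent feature near {fi:.0f} Hz (coherence {ci:.2f})")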
Image processing in biodosimetry: A proposal of a generic free software platform.
Dumpelmann, Matthias; Cadena da Matta, Mariel; Pereira de Lemos Pinto, Marcela Maria; de Salazar E Fernandes, Thiago; Borges da Silva, Edvane; Amaral, Ademir
2015-08-01
The scoring of chromosome aberrations is the most reliable biological method for evaluating individual exposure to ionizing radiation. However, the microscopic analysis of human metaphase chromosomes generally employed to identify aberrations, mainly dicentrics (chromosomes with two centromeres), is a laborious task. The method is time consuming, and its application in biological dosimetry would be almost impossible in the case of a large-scale radiation incident. In this project, generic software was enhanced for automatic chromosome image processing from a framework originally developed for the European Union Framework V project Simbio for applications in the area of source localization from electroencephalographic signals. The platform's capability is demonstrated by a study comparing automatic segmentation strategies for chromosomes in microscopic images.
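To give a flavor of the kind of automatic segmentation being compared, here is a deliberately minimal sketch: threshold a synthetic metaphase image and count connected objects. Real dicentric scoring additionally requires centromere detection, which this omits:

    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(2)
    image = rng.normal(0.1, 0.02, (256, 256))   # mock background plane
    image[40:60, 30:120] += 0.8                 # two bright mock "chromosomes"
    image[150:170, 90:200] += 0.8

    mask = image > image.mean() + 3 * image.std()   # global threshold
    labels, n_objects = ndimage.label(mask)         # connected components
    areas = ndimage.sum(mask, labels, range(1, n_objects + 1))
    print(f"{n_objects} objects found, areas: {areas.astype(int)}")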
[Integrated Development of Full-automatic Fluorescence Analyzer].
Zhang, Mei; Lin, Zhibo; Yuan, Peng; Yao, Zhifeng; Hu, Yueming
2015-10-01
In view of the fact that medical inspection equipment sold on the domestic market is mainly imported from abroad and very expensive, we developed the fully automatic fluorescence analyzer presented in this paper in our center. The paper introduces in detail the hardware architecture (an FPGA/DSP motion-control card, a PC, and an STM32 embedded microprocessing unit), the software system based on C# multithreading, and the design and implementation of the communication between the two units. By simplifying the hardware structure, selecting hardware appropriately, and adopting object-oriented technology in the control-system software, we improved the precision and speed of the control system significantly. Finally, performance testing showed that the control system meets the needs of an automated fluorescence analyzer in functionality, performance, and cost.
Advanced Manufacturing of Superconducting Magnets
NASA Technical Reports Server (NTRS)
Senti, Mark W.
1996-01-01
The development of specialized materials, processes, and robotics technology allows for the rapid prototyping and manufacture of superconducting and normal magnets for magnetic suspension applications. Presented are highlights of the Direct Conductor Placement System (DCPS), which enables automatic design and assembly of three-dimensional coils and conductor patterns using LTS and HTS conductors. The system enables engineers to place conductors in complex patterns with greater efficiency and accuracy, and without the need for hard tooling. It may also allow researchers to create new types of coils and patterns that were not practical before the development of DCPS. The DCPS includes a custom-designed eight-axis robot, a patented end effector, CoilCAD(trademark) design software, RoboWire(trademark) control software, and automatic inspection.
Army Sustainment. Volume 43, Issue 3, May-June 2011
2011-06-01
data entry, which produces a decrease in errors and also an increase in efficiency because automatic reading saves time... Today's RFID... company, and even platoon, level... This push does not necessarily call for drastic adjustments to the personnel and equipment structure of the company... specialists. This increased capability would allow more streamlined processes for submitting
’When You Get a Job to Do, Do It.’ The Airpower Leadership of Lt Gen William H. Tunner
2008-01-01
controls; oxygen system; painting of placards and insignia; rigging and surface controls. Station No. 5: instruments; automatic pilot; electrical system... misinform the Germans and convince them that they alone had saved the city from Nazi control.4 While these actions focused on establishing a dominant
García-Martín, Ana; Lázaro-Rivera, Carla; Fernández-Golfín, Covadonga; Salido-Tahoces, Luisa; Moya-Mur, Jose-Luis; Jiménez-Nacher, Jose-Julio; Casas-Rojo, Eduardo; Aquila, Iolanda; González-Gómez, Ariana; Hernández-Antolín, Rosana; Zamorano, José Luis
2016-07-01
A specialized three-dimensional transoesophageal echocardiography (3D-TOE) reconstruction tool has recently been introduced; the system automatically configures a geometric model of the aortic root from the images obtained by 3D-TOE and performs quantitative analysis of these structures. The aim of this study was to compare the measurements of the aortic annulus (AA) obtained by the new model to those obtained by manual 3D-TOE and multidetector computed tomography (MDCT) in candidates for transcatheter aortic valve implantation (TAVI), and to assess the reproducibility of this new method. We included 31 patients who underwent TAVI. The AA diameters and area were evaluated by the manual 3D-TOE method and by the automatic software. We found an excellent correlation between the measurements obtained by both methods: intra-class correlation coefficient (ICC) 0.731 (0.508-0.862), r = 0.742 for the AA diameter, and ICC 0.723 (0.662-0.923), r = 0.723 for the AA area, with no significant differences regardless of the method used. Interobserver reproducibility was better for the automatic measurements than for the manual ones. In a subgroup of 10 patients, we also found an excellent correlation between the automatic measurements and those obtained by MDCT: ICC 0.941 (0.761-0.985), r = 0.901 for the AA diameter, and ICC 0.853 (0.409-0.964), r = 0.744 for the AA area. The new automatic 3D-TOE software allows modelling and quantifying the aortic root from 3D-TOE data with high reproducibility, and there is good correlation between the automated measurements and other validated 3D techniques. Our results support its use in clinical practice as an alternative to MDCT prior to TAVI. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2015. For permissions please email: journals.permissions@oup.com.
Automation of Flight Software Regression Testing
NASA Technical Reports Server (NTRS)
Tashakkor, Scott B.
2016-01-01
NASA is developing the Space Launch System (SLS) to be a heavy-lift launch vehicle supporting human and scientific exploration beyond Earth orbit. SLS will have a common core stage, an upper stage, and different permutations of boosters and fairings to perform various crewed or cargo missions. Marshall Space Flight Center (MSFC) is writing the Flight Software (FSW) that will operate the SLS launch vehicle. The FSW is developed in an incremental manner based on "Agile" software techniques. As the FSW is incrementally developed, the functionality of the code must be tested continually to ensure that the integrity of the software is maintained. Manually testing the functionality of an ever-growing set of requirements and features is not an efficient solution, so testing needs to be done automatically to remain comprehensive. To support test automation, a framework for a regression test harness has been developed and used on SLS FSW. The test harness provides a modular design approach that can compile or read in the required information specified by the developer of the test. The modularity provides independence between groups of tests and the ability to add and remove tests without disturbing others. This gives the SLS FSW team a time-saving capability that is essential to meeting SLS Program technical and programmatic requirements. During development of SLS FSW, this technique has proved to be a useful tool for ensuring that all requirements have been tested and that desired functionality is maintained as changes occur. It also provides a mechanism for developers to check the functionality of the code that they have developed. With this system, automation of regression testing is accomplished through a scheduling tool and/or commit hooks. Key advantages of this test harness include execution support for multiple independent test cases, the ability for developers to specify precisely what they are testing and how, the ability to add automation, and the ability of the harness and cases to be executed continually. This test concept is an approach that can be adapted to support other projects.
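The general shape of such a harness is easy to sketch. The fragment below (directory layout and naming conventions are assumptions for illustration, not the SLS FSW harness) runs each discovered test script and diffs its output against a stored baseline, so the whole suite can be driven by a scheduler or a commit hook:

    import pathlib
    import subprocess

    # Run every tests/test_*.py and compare stdout with tests/test_*.baseline.
    for script in sorted(pathlib.Path("tests").glob("test_*.py")):
        result = subprocess.run(["python", str(script)],
                                capture_output=True, text=True)
        baseline = script.with_suffix(".baseline")
        expected = baseline.read_text() if baseline.exists() else ""
        ok = result.returncode == 0 and result.stdout == expected
        print(f"{'PASS' if ok else 'FAIL'} {script.name}")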
NASA Astrophysics Data System (ADS)
Pozna, E.; Ramirez, A.; Mérand, A.; Mueller, A.; Abuter, R.; Frahm, R.; Morel, S.; Schmid, C.; Duc, T. Phan; Delplancke-Ströbele, F.
2014-07-01
The quality of data obtained by VLTI instruments may be refined by analyzing the continuous data supplied by the Reflective Memory Network (RMN). Based on five years of experience providing VLTI instruments (PACMAN, AMBER, MIDI) with RMN data, the procedure has been generalized to make synchronization with observations trouble-free. The present software interface not only saves months of effort for each instrument but also provides the benefits of software frameworks. Recent applications (GRAVITY, MATISSE) supply feedback for the software to evolve. The paper highlights the way common features have been identified so as to offer reusable code in due course.
Semiautomated analysis of small-animal PET data.
Kesner, Adam L; Dahlbom, Magnus; Huang, Sung-Cheng; Hsueh, Wei-Ann; Pio, Betty S; Czernin, Johannes; Kreissl, Michael; Wu, Hsiao-Ming; Silverman, Daniel H S
2006-07-01
The objective of the work reported here was to develop and test automated methods to calculate biodistribution of PET tracers using small-animal PET images. After developing software that uses visually distinguishable organs and other landmarks on a scan to semiautomatically coregister a digital mouse phantom with a small-animal PET scan, we elastically transformed the phantom to conform to those landmarks in 9 simulated scans and in 18 actual PET scans acquired of 9 mice. Tracer concentrations were automatically calculated in 22 regions of interest (ROIs) reflecting the whole body and 21 individual organs. To assess the accuracy of this approach, we compared the software-measured activities in the ROIs of simulated PET scans with the known activities, and we compared the software-measured activities in the ROIs of real PET scans both with manually established ROI activities in original scan data and with actual radioactivity content in immediately harvested tissues of imaged animals. PET/atlas coregistrations were successfully generated with minimal end-user input, allowing rapid quantification of 22 separate tissue ROIs. The simulated scan analysis found the method to be robust with respect to the overall size and shape of individual animal scans, with average activity values for all organs tested falling within the range of 98% +/- 3% of the organ activity measured in the unstretched phantom scan. Standardized uptake values (SUVs) measured from actual PET scans using this semiautomated method correlated reasonably well with radioactivity content measured in harvested organs (median r = 0.94) and compared favorably with conventional SUV correlations with harvested organ data (median r = 0.825). A semiautomated analytic approach involving coregistration of scan-derived images with atlas-type images can be used in small-animal whole-body radiotracer studies to estimate radioactivity concentrations in organs. This approach is rapid and less labor intensive than are traditional methods, without diminishing overall accuracy. Such techniques have the possibility of saving time, effort, and the number of animals needed for such assessments.
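For reference, the SUV normalization underlying such comparisons is a one-line formula; a sketch with illustrative mouse-scale numbers (not values from the study):

    # SUV = tissue concentration / (injected dose / body weight).
    def suv(roi_concentration_bq_per_ml, injected_dose_bq, body_weight_g):
        return roi_concentration_bq_per_ml / (injected_dose_bq / body_weight_g)

    # Mock mouse ROI: 45 kBq/mL uptake, 7.4 MBq injected, 25 g animal.
    print(round(suv(45_000.0, 7_400_000.0, 25.0), 3))   # -> 0.152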
Elliptic Curve Cryptography with Security System in Wireless Sensor Networks
NASA Astrophysics Data System (ADS)
Huang, Xu; Sharma, Dharmendra
2010-10-01
The rapid progress of wireless communications and embedded micro-electro-system technologies has made wireless sensor networks (WSN) very popular and even part of our daily life. WSN designs are generally application driven: a particular application's requirements will determine how the network behaves. The nature of WSNs has attracted increasing attention in recent years due to their linear scalability, small software footprint, low hardware implementation cost, low bandwidth requirement, and high device performance. Today's software applications are mainly characterized by their component-based structures, which are usually heterogeneous and distributed; this includes WSNs, which typically need to configure themselves automatically and support ad hoc routing. Agent technology provides a method for handling increasing software complexity and supporting rapid and accurate decision making. Building on our previous works [1, 2], this paper makes three contributions: (a) a fuzzy controller for a dynamic sliding-window size to improve the performance of running ECC; (b) the first presentation of a hidden generator point for protection from man-in-the-middle attacks; and (c) a first investigation of multi-agent techniques applied to key exchange. Security systems have been drawing great attention as cryptographic algorithms have gained popularity, owing to the properties that make them suitable for use in constrained environments such as mobile sensor information applications, where computing resources and power availability are limited. Elliptic curve cryptography (ECC) is one of the high-potential candidates for WSNs, requiring less computational power, communication bandwidth, and memory than other cryptosystems. To save pre-computing storage, there is a recent trend in sensor networks for the sensor group leaders, rather than the sensors themselves, to communicate with the end database, which highlights the need to prevent man-in-the-middle attacks. A hidden generator point design that offers good protection from man-in-the-middle (MinM) attacks in sensor networks with a multi-agent system is also discussed.
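To make the ECDH mechanics concrete, here is a toy sketch over the textbook curve y^2 = x^3 + 2x + 2 (mod 17) with generator G = (5, 1) of order 19. It is illustrative only: real WSN deployments use standardized curves and constant-time implementations, and the hidden-generator-point idea from the paper would additionally keep G secret between the parties:

    # Toy elliptic-curve Diffie-Hellman; P is the field prime, A the curve's a.
    P, A = 17, 2
    G = (5, 1)

    def add(p1, p2):                    # elliptic-curve point addition
        if p1 is None: return p2
        if p2 is None: return p1
        (x1, y1), (x2, y2) = p1, p2
        if x1 == x2 and (y1 + y2) % P == 0:
            return None                 # point at infinity
        if p1 == p2:
            s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P
        else:
            s = (y2 - y1) * pow(x2 - x1, -1, P) % P
        x3 = (s * s - x1 - x2) % P
        return (x3, (s * (x1 - x3) - y1) % P)

    def mul(k, p):                      # scalar multiplication, double-and-add
        out = None
        while k:
            if k & 1:
                out = add(out, p)
            p, k = add(p, p), k >> 1
        return out

    a, b = 3, 7                                 # the two private keys
    pub_a, pub_b = mul(a, G), mul(b, G)         # exchanged public points
    assert mul(a, pub_b) == mul(b, pub_a)       # both derive the same secret
    print("shared point:", mul(a, pub_b))       # -> (6, 3)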
MO-D-213-07: RadShield: Semi-Automated Calculation of Air Kerma Rate and Barrier Thickness
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeLorenzo, M; Wu, D; Rutel, I
2015-06-15
Purpose: To develop the first Java-based semi-automated calculation program intended to aid professional radiation shielding design. Air-kerma rate and barrier-thickness calculations are performed by implementing the NCRP Report 147 formalism in a Graphical User Interface (GUI). The ultimate aim of this newly created software package is to reduce errors and improve radiographic and fluoroscopic room designs relative to manual approaches. Methods: Floor plans are first imported as images into the RadShield software program. These plans serve as templates for drawing barriers, occupied regions and x-ray tube locations. Sub-GUIs allow the specification, for regions and equipment, of occupancy factors, design goals, number of patients, primary beam directions, source-to-patient distances and workload distributions. Once the user enters the above parameters, the program automatically calculates the air-kerma rate at sampled points beyond all barriers. For each sample point, a corresponding minimum barrier thickness is calculated to meet the design goal. RadShield allows control over preshielding, sample point location and material types. Results: A functional GUI package was developed and tested. Examination of sample walls and source distributions yields a maximum percent difference of less than 0.1% between hand-calculated air-kerma rates and RadShield. Conclusion: The initial results demonstrated that RadShield calculates air-kerma rates and required barrier thicknesses with reliable accuracy and can make radiation shielding design more efficient and accurate. This newly developed approach differs from conventional calculation methods in that it finds air-kerma rates and thickness requirements for many points outside the barriers, stores the information, and selects the largest value needed to comply with NCRP Report 147 design goals. Floor plans, parameters, designs and reports can be saved and accessed later for modification and recalculation. We have confirmed that this software accurately calculates air-kerma rates and required barrier thicknesses for diagnostic radiographic and fluoroscopic rooms.
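The thickness search can be illustrated by inverting the Archer transmission model B(x) = [(1 + b/a) e^(a g x) - b/a]^(-1/g) that is commonly used alongside the NCRP Report 147 formalism. In the sketch below, the fit parameters and weekly kerma numbers are placeholders for illustration, not published values and not RadShield internals:

    import math

    def required_thickness(B, a, b, g):
        """Barrier thickness giving transmission B for Archer parameters a, b, g."""
        return (1.0 / (a * g)) * math.log((B ** -g + b / a) / (1.0 + b / a))

    design_goal = 0.02    # mGy/week permitted at the occupied point (placeholder)
    K_unshielded = 5.0    # mGy/week unshielded, from workload/use/occupancy (placeholder)
    B = design_goal / K_unshielded
    # Placeholder lead-like fit parameters per mm: a=2.5, b=15.0, g=0.5.
    print(f"B = {B:.4f}, thickness ~ {required_thickness(B, 2.5, 15.0, 0.5):.2f} mm")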
TeraStitcher - A tool for fast automatic 3D-stitching of teravoxel-sized microscopy images
2012-01-01
Background: Further advances in modern microscopy are leading to teravoxel-sized tiled 3D images at high resolution, increasing the dimension of the stitching problem by at least two orders of magnitude. Existing software solutions do not seem adequate to address the additional requirements arising from these datasets, such as the minimization of memory usage and the need to process just a small portion of data. Results: We propose a free and fully automated 3D stitching tool designed to match the special requirements of teravoxel-sized tiled microscopy images and able to stitch them in a reasonable time even on workstations with limited resources. The tool was tested on teravoxel-sized whole mouse brain images with micrometer resolution and was also compared with state-of-the-art stitching tools on megavoxel-sized publicly available datasets. This comparison confirmed that the solutions we adopted are suited for stitching very large images and also perform well on datasets with different characteristics. Indeed, some of the algorithms embedded in other stitching tools could easily be integrated in our framework if they turned out to be more effective on other classes of images. To this purpose, we designed a software architecture which separates the strategies that use memory resources efficiently from the algorithms which may depend on the characteristics of the acquired images. Conclusions: TeraStitcher is a free tool that enables the stitching of teravoxel-sized tiled microscopy images even on workstations with relatively limited memory (<8 GB) and processing power. It exploits the knowledge of approximate tile positions and uses ad hoc strategies and algorithms designed for such very large datasets. The produced images can be saved into a multiresolution representation to be efficiently retrieved and processed. We provide TeraStitcher both as a standalone application and as a plugin of the free software Vaa3D. PMID:23181553
TICK: Transparent Incremental Checkpointing at Kernel Level
DOE Office of Scientific and Technical Information (OSTI.GOV)
Petrini, Fabrizio; Gioiosa, Roberto
2004-10-25
TICK is a software package implemented in Linux 2.6 that allows user processes to be saved and restored without any change to the user code or binary. With TICK, a process can be suspended by the Linux kernel upon receiving an interrupt and saved to a file. This file can later be thawed on another computer running Linux (potentially the same computer). TICK is implemented as a Linux kernel module for Linux version 2.6.5.
USAF Logistics Process Optimization Study for the Aircraft Asset Sustainment Process. Volume 1.
1998-12-31
solely to have a record that could be matched with the CMOS receipt data. (This problem is caused by DLA systems that currently do not populate CMOS with... unable to obtain passwords to the Depot D035 systems. Figure 16 shows daily savings as of 30 September 1998 (current time frame) and projects savings... Engineering, modeling, and systems/software development company; LAN: Local Area Network; LFA: Large Frame Aircraft; LMA: Logistics Management Agency; LMR
µADS-B Detect and Avoid Flight Tests on Phantom 4 Unmanned Aircraft System
NASA Technical Reports Server (NTRS)
Arteaga, Ricardo; Dandachy, Mike; Truong, Hong; Aruljothi, Arun; Vedantam, Mihir; Epperson, Kraettli; McCartney, Reed
2018-01-01
Researchers at the National Aeronautics and Space Administration Armstrong Flight Research Center in Edwards, California and Vigilant Aerospace Systems collaborated on the flight-test demonstration of an Automatic Dependent Surveillance-Broadcast based collision avoidance technology on a small unmanned aircraft system equipped with the uAvionix Automatic Dependent Surveillance-Broadcast transponder. The purpose of the testing was to demonstrate that National Aeronautics and Space Administration / Vigilant software and algorithms, commercialized as FlightHorizon UAS(TM), are compatible with uAvionix hardware systems and the DJI Phantom 4 small unmanned aircraft system. The testing and demonstrations were necessary for both parties to further develop and certify the technology in three key areas: flights beyond visual line of sight, collision avoidance, and autonomous operations. The National Aeronautics and Space Administration and Vigilant Aerospace Systems have developed and successfully flight-tested an Automatic Dependent Surveillance-Broadcast Detect and Avoid system on the Phantom 4 small unmanned aircraft system. The Automatic Dependent Surveillance-Broadcast Detect and Avoid system architecture is especially suited for small unmanned aircraft systems because it integrates: 1) miniaturized Automatic Dependent Surveillance-Broadcast hardware; 2) radio data-link communications; 3) software algorithms for real-time Automatic Dependent Surveillance-Broadcast data integration, conflict detection, and alerting; and 4) a synthetic vision display using a fully integrated National Aeronautics and Space Administration geobrowser for three-dimensional graphical representations for ownship and air-traffic situational awareness. The flight-test objectives were to evaluate the performance of Automatic Dependent Surveillance-Broadcast Detect and Avoid collision avoidance technology as installed on two small unmanned aircraft systems. In December 2016, four flight tests were conducted at Edwards Air Force Base. Researchers in the ground control station were able to verify the Automatic Dependent Surveillance-Broadcast target detection and collision avoidance resolutions on their displays.
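The core conflict-detection step in such systems reduces to a closest-point-of-approach (CPA) computation on ADS-B state vectors. A minimal sketch with illustrative states and alert thresholds (this is not FlightHorizon's logic):

    import numpy as np

    def cpa(p_own, v_own, p_intr, v_intr):
        """Time (s) and miss distance (m) at the closest point of approach."""
        dp, dv = p_intr - p_own, v_intr - v_own
        t = 0.0 if dv @ dv == 0 else max(0.0, -(dp @ dv) / (dv @ dv))
        return t, float(np.linalg.norm(dp + dv * t))

    p_own = np.array([0.0, 0.0, 100.0]);     v_own = np.array([20.0, 0.0, 0.0])
    p_intr = np.array([800.0, 60.0, 100.0]); v_intr = np.array([-20.0, 0.0, 0.0])
    t_cpa, d_cpa = cpa(p_own, v_own, p_intr, v_intr)
    if d_cpa < 150.0 and t_cpa < 60.0:        # assumed alert thresholds
        print(f"ALERT: CPA {d_cpa:.0f} m in {t_cpa:.1f} s")   # -> 60 m in 20.0 s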
JANIS: NEA JAva-based Nuclear Data Information System
NASA Astrophysics Data System (ADS)
Soppera, Nicolas; Bossant, Manuel; Cabellos, Oscar; Dupont, Emmeric; Díez, Carlos J.
2017-09-01
JANIS (JAva-based Nuclear Data Information System) software is developed by the OECD Nuclear Energy Agency (NEA) Data Bank to facilitate the visualization and manipulation of nuclear data, giving access to evaluated nuclear data libraries, such as ENDF, JEFF, JENDL and TENDL, and also to experimental nuclear data (EXFOR) and bibliographical references (CINDA). It is available as a standalone Java program, downloadable and distributed on DVD, and also as a web application on the NEA website. One of the main new features in JANIS is the scripting capability via the command line, which notably automates plot generation and permits automatic extraction of data from the JANIS database. Recent NEA software developments rely on these JANIS features to access nuclear data; for example, the Nuclear Data Sensitivity Tool (NDaST) makes use of covariance data in BOXER and COVERX formats, which are retrieved from the JANIS database. New features added in this version of the JANIS software are described throughout this paper with some examples.
Design and Implementation of a Modern Automatic Deformation Monitoring System
NASA Astrophysics Data System (ADS)
Engel, Philipp; Schweimler, Björn
2016-03-01
The deformation monitoring of structures and buildings is an important task in modern engineering surveying, ensuring the stability and reliability of supervised objects over long periods. Several commercial hardware and software solutions for realizing such monitoring measurements are available on the market. In addition to these, a research team at the University of Applied Sciences in Neubrandenburg (NUAS) is actively developing a software package for monitoring purposes in geodesy and geotechnics, distributed under an open-source licence and free of charge. The task of managing an open-source project is well known in computer science, but it is fairly new in a geodetic context. This paper contributes to that issue by detailing applications, frameworks, and interfaces for the design and implementation of open hardware and software solutions for sensor control, sensor networks, and data management in automatic deformation monitoring. It also discusses how the development effort for networked applications can be reduced by using free programming tools, cloud-computing technologies, and rapid prototyping methods.
SolTrack: an automatic video processing software for in situ interface tracking.
Griesser, S; Pierer, R; Reid, M; Dippenaar, R
2012-10-01
High-resolution in situ observation of solidification experiments has become a powerful technique for improving the fundamental understanding of the solidification of metals and alloys. In the present study, high-temperature laser-scanning confocal microscopy (HTLSCM) was utilized to observe and capture in situ solidification and phase transformations of alloys for subsequent post-processing and analysis. Until now, this analysis has been very time consuming, as frame-by-frame manual evaluation of propagating interfaces was used to determine the interface velocities. SolTrack was developed using the commercial software package MATLAB and is designed to automatically detect, locate and track propagating interfaces during solidification and phase transformations, as well as to calculate interfacial velocities. Different solidification phenomena were recorded to demonstrate the wide spectrum of applications of this software. A validation through comparison with manual evaluation is included, where the accuracy is shown to be very high. © 2012 The Authors. Journal of Microscopy © 2012 Royal Microscopical Society.
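The essence of the tracking step is compact enough to sketch: locate the interface in each frame as the point of steepest intensity change along a profile, then differentiate position over time. The frame rate, spatial calibration, and synthetic frames below are assumptions standing in for HTLSCM video:

    import numpy as np

    fs = 30.0          # frames per second (assumed)
    um_per_px = 0.5    # spatial calibration (assumed)
    x = np.arange(512)
    # Synthetic frames: a tanh-shaped interface advancing 3 px per frame.
    frames = [np.tanh((x - (100 + 3.0 * k)) / 5.0) for k in range(50)]

    pos = np.array([np.argmax(np.abs(np.gradient(f))) for f in frames], dtype=float)
    velocity = np.gradient(pos) * um_per_px * fs      # um/s, frame by frame
    print(f"mean interface velocity: {velocity.mean():.1f} um/s")   # ~45 um/s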
Automated software system for checking the structure and format of ACM SIG documents
NASA Astrophysics Data System (ADS)
Mirza, Arsalan Rahman; Sah, Melike
2017-04-01
Microsoft (MS) Office Word is one of the most commonly used software tools for creating documents. MS Word 2007 and above uses XML to represent the structure of MS Word documents. Metadata about the documents are automatically created using Office Open XML (OOXML) syntax. We develop a new framework, called ADFCS (Automated Document Format Checking System), that takes advantage of the OOXML metadata in order to extract semantic information from MS Office Word documents. In particular, we develop a new ontology for Association for Computing Machinery (ACM) Special Interest Group (SIG) documents, representing the structure and format of these documents in OWL (Web Ontology Language). The metadata are then extracted automatically into RDF (Resource Description Framework) according to this ontology using the developed software. Finally, we generate extensive rules in order to infer whether documents are formatted according to ACM SIG standards. This paper introduces the ACM SIG ontology, the metadata extraction process, the inference engine, the ADFCS online user interface, the system evaluation, and user study evaluations.
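Reading the OOXML structure that such checking relies on is straightforward, since a .docx file is a ZIP archive whose word/document.xml holds the paragraphs and their style names. A minimal Python sketch (the file name and the style listing are illustrative, not the ADFCS pipeline):

    import zipfile
    import xml.etree.ElementTree as ET

    W = "{http://schemas.openxmlformats.org/wordprocessingml/2006/main}"
    with zipfile.ZipFile("paper.docx") as docx:
        root = ET.fromstring(docx.read("word/document.xml"))

    for par in root.iter(f"{W}p"):                    # every paragraph
        style = par.find(f"{W}pPr/{W}pStyle")         # its named style, if any
        text = "".join(t.text or "" for t in par.iter(f"{W}t"))
        name = style.get(f"{W}val") if style is not None else "(default)"
        print(f"[{name}] {text[:60]}")                # style vs. content to check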
Gabadinho, José; Beteva, Antonia; Guijarro, Matias; Rey-Bakaikoa, Vicente; Spruce, Darren; Bowler, Matthew W.; Brockhauser, Sandor; Flot, David; Gordon, Elspeth J.; Hall, David R.; Lavault, Bernard; McCarthy, Andrew A.; McCarthy, Joanne; Mitchell, Edward; Monaco, Stéphanie; Mueller-Dieckmann, Christoph; Nurizzo, Didier; Ravelli, Raimond B. G.; Thibault, Xavier; Walsh, Martin A.; Leonard, Gordon A.; McSweeney, Sean M.
2010-01-01
The design and features of a beamline control software system for macromolecular crystallography (MX) experiments developed at the European Synchrotron Radiation Facility (ESRF) are described. This system, MxCuBE, allows users to easily and simply interact with beamline hardware components and provides automated routines for common tasks in the operation of a synchrotron beamline dedicated to experiments in MX. Additional functionality is provided through intuitive interfaces that enable the assessment of the diffraction characteristics of samples, experiment planning, automatic data collection and the on-line collection and analysis of X-ray emission spectra. The software can be run in a tandem client-server mode that allows for remote control and relevant experimental parameters and results are automatically logged in a relational database, ISPyB. MxCuBE is modular, flexible and extensible and is currently deployed on eight macromolecular crystallography beamlines at the ESRF. Additionally, the software is installed at MAX-lab beamline I911-3 and at BESSY beamline BL14.1. PMID:20724792
Automated real-time software development
NASA Technical Reports Server (NTRS)
Jones, Denise R.; Walker, Carrie K.; Turkovich, John J.
1993-01-01
A Computer-Aided Software Engineering (CASE) system has been developed at the Charles Stark Draper Laboratory (CSDL) under the direction of the NASA Langley Research Center. The CSDL CASE tool provides an automated method of generating source code and hard copy documentation from functional application engineering specifications. The goal is to significantly reduce the cost of developing and maintaining real-time scientific and engineering software while increasing system reliability. This paper describes CSDL CASE and discusses demonstrations that used the tool to automatically generate real-time application code.
Hu, Zhi-yu; Zhang, Lei; Ma, Wei-guang; Yan, Xiao-juan; Li, Zhi-xin; Zhang, Yong-zhi; Wang, Le; Dong, Lei; Yin, Wang-bao; Jia, Suo-tang
2012-03-01
Self-designed spectral-line identification software for LIBS is introduced. Integrated with LabVIEW, the software can smooth spectral lines and pick peaks, employing second-difference and threshold methods. The characteristic spectra of several elements are matched against the NIST database, realizing automatic spectral-line identification and qualitative analysis of the basic composition of a sample. The software analyzes spectra conveniently and rapidly and will be a useful tool for LIBS.
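The described second-difference peak picking is compact enough to sketch; the synthetic spectrum and both thresholds below are illustrative stand-ins for LIBS data:

    import numpy as np

    x = np.linspace(0, 100, 2000)
    rng = np.random.default_rng(3)
    spectrum = (np.exp(-((x - 30) ** 2) / 0.1)              # line at ~30
                + 0.6 * np.exp(-((x - 62) ** 2) / 0.1)      # weaker line at ~62
                + 0.01 * rng.standard_normal(x.size))       # noise floor

    smooth = np.convolve(spectrum, np.ones(5) / 5, mode="same")  # moving average
    d2 = np.diff(smooth, 2)                                      # second difference
    peaks = [i + 1 for i in range(1, d2.size - 1)
             if d2[i] < d2[i - 1] and d2[i] < d2[i + 1]      # local minimum of d2
             and d2[i] < -0.015 and smooth[i + 1] > 0.1]     # thresholds
    print("peaks near:", [round(x[i], 1) for i in peaks])    # ~[30.0, 62.0]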
Organization and use of a Software/Hardware Avionics Research Program (SHARP)
NASA Technical Reports Server (NTRS)
Karmarkar, J. S.; Kareemi, M. N.
1975-01-01
The organization and use of the Software/Hardware Avionics Research Program (SHARP), developed to duplicate the automatic portion of the STOLAND simulator system on a general-purpose computer system (i.e., an IBM 360), are described. The program's uses are: (1) to conduct comparative evaluation studies of current and proposed airborne and ground system concepts via single-run or Monte Carlo simulation techniques, and (2) to provide a software tool for efficient algorithm evaluation and development for the STOLAND avionics computer.
NASA Technical Reports Server (NTRS)
Orr, James K.; Peltier, Daryl
2010-01-01
This slide presentation reviews the avionics software system on board the Space Shuttle, with particular emphasis on quality and reliability. The Primary Avionics Software System (PASS) provides automatic and fly-by-wire control of critical Shuttle systems and executes in redundant computers. Charts show the number of Space Shuttle flights versus time, PASS's development history, and other data that attest to the reliability of the system's development. The reliability of the system is also compared to its predicted reliability.
Radiometer Calibration and Characterization (RCC) User's Manual: Windows Version 4.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andreas, Afshin M.; Wilcox, Stephen M.
2016-02-29
The Radiometer Calibration and Characterization (RCC) software is a data acquisition and data archival system for performing Broadband Outdoor Radiometer Calibrations (BORCAL). RCC provides a unique method of calibrating broadband atmospheric longwave and solar shortwave radiometers using techniques that reduce measurement uncertainty and better characterize a radiometer's response profile. The RCC software automatically monitors and controls many of the components that contribute to uncertainty in an instrument's responsivity. This is a user's manual and guide to the RCC software.
MuffinInfo: HTML5-Based Statistics Extractor from Next-Generation Sequencing Data.
Alic, Andy S; Blanquer, Ignacio
2016-09-01
Usually, the information known a priori about a newly sequenced organism is limited. Even resequencing the same organism can generate unpredictable output. We introduce MuffinInfo, a FastQ/Fasta/SAM information extractor implemented in HTML5 and capable of offering insights into next-generation sequencing (NGS) data. Our new tool can run on any software or hardware environment, from the command line or graphically, and in the browser or standalone. It presents information such as average read length, base distribution, quality-score distribution, k-mer histograms, and homopolymer analysis. MuffinInfo improves upon existing extractors by adding the ability to save and then reload the results obtained after a run as a navigable file (also supporting saving pictures of the charts), by supporting custom statistics implemented by the user, and by offering user-adjustable processing parameters, all in one piece of software. At the moment, the extractor works with all base-space technologies, such as Illumina, Roche, Ion Torrent, Pacific Biosciences, and Oxford Nanopore. Owing to HTML5, our software demonstrates the readiness of web technologies for the mildly intensive tasks encountered in bioinformatics.
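The per-read statistics involved are simple to compute; a minimal Python sketch (MuffinInfo itself is HTML5/JavaScript) over an assumed four-line-record FastQ file named reads.fastq:

    from statistics import mean

    lengths, gc, quals = [], 0, []
    with open("reads.fastq") as fq:
        while True:
            header = fq.readline()
            if not header:
                break
            seq = fq.readline().strip()
            fq.readline()                             # '+' separator line
            qual = fq.readline().strip()
            lengths.append(len(seq))
            gc += sum(base in "GCgc" for base in seq)
            quals.extend(ord(c) - 33 for c in qual)   # Phred+33 encoding

    print(f"reads: {len(lengths)}, avg length: {mean(lengths):.1f}")
    print(f"GC: {gc / sum(lengths):.1%}, mean quality: Q{mean(quals):.1f}")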
Integration of the Remote Agent for the NASA Deep Space One Autonomy Experiment
NASA Technical Reports Server (NTRS)
Dorais, Gregory A.; Bernard, Douglas E.; Gamble, Edward B., Jr.; Kanefsky, Bob; Kurien, James; Muscettola, Nicola; Nayak, P. Pandurang; Rajan, Kanna; Lau, Sonie (Technical Monitor)
1998-01-01
This paper describes the integration of the Remote Agent (RA), a spacecraft autonomy system scheduled to control the Deep Space 1 spacecraft during a flight experiment in 1999. The RA is a reusable, model-based autonomy system that is quite different from software typically used to control an aerospace system. We describe the integration challenges we faced, how we addressed them, and the lessons learned. We focus on those aspects of integrating the RA that were either easier or more difficult than integrating a more traditional large software application, because the RA is a model-based autonomous system. A number of characteristics of the RA made the integration process easier. One example is its model-based nature: most of the RA's behavior is not hard-coded in procedural program code. Instead, engineers specify high-level models of the spacecraft's components, from which the Remote Agent automatically derives correct system-wide behavior on the fly. This high-level, modular, and declarative software description allowed some interfaces between RA components, and between the RA and the flight software, to be automatically generated and tested for completeness against the Remote Agent's models. In addition, the Remote Agent's model-based diagnosis system automatically detects when the RA models are not consistent with the behavior of the spacecraft. In flight, this feature is used to diagnose failures in the spacecraft hardware. During integration, it proved valuable in finding problems in the spacecraft simulator or flight software. Furthermore, when modifications are made to the spacecraft hardware or flight software, the RA models are easily changed because they only capture a description of the spacecraft; one does not have to maintain procedural code that implements the correct behavior for every expected situation. On the other hand, several features of the RA made it more difficult to integrate than typical flight software. For example, the definition of correct behavior is more difficult to specify for a system that is expected to reason about and flexibly react to its environment than for a traditional flight software system. Consequently, whenever a change is made to the RA, it is more time consuming to determine whether the resulting behavior is correct. We conclude the paper with a discussion of future work on the Remote Agent as well as recommendations to ease the integration of similar autonomy projects.
Bio-inspired color sketch for eco-friendly printing
NASA Astrophysics Data System (ADS)
Safonov, Ilia V.; Tolstaya, Ekaterina V.; Rychagov, Michael N.; Lee, Hokeun; Kim, Sang Ho; Choi, Donchul
2012-01-01
Saving toner/ink is an important task in modern printing devices, with positive ecological and social impact. We propose a technique for converting print-job pictures to recognizable and pleasant color sketches. Drawing a "pencil sketch" from a photo belongs to a special area of image processing and computer graphics: non-photorealistic rendering. We describe a new approach for automatic sketch generation that creates well-recognizable sketches and partly preserves the colors of the initial picture. Our sketches contain significantly fewer color dots than the initial images, which helps to save toner/ink. Our bio-inspired approach is based on a sophisticated edge-detection technique for mask creation and on multiplication of the source image, with increased contrast, by this mask. To construct the mask we use DoG edge detection, obtained by blending the initial image with its blurred copy through an alpha channel created from a saliency map according to a pre-attentive human vision model. Measurement of the percentage of toner saved and a user study prove the effectiveness of the proposed technique for toner saving in an eco-friendly printing mode.
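The masking step is compact enough to sketch: a difference-of-Gaussians (DoG) edge mask is multiplied with a contrast-boosted image so that color survives only near edges. All parameters are illustrative, and the saliency-map alpha-blending stage is omitted:

    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(4)
    image = rng.random((128, 128))        # stand-in for one colour plane of a photo

    dog = gaussian_filter(image, 1.0) - gaussian_filter(image, 3.0)
    mask = (np.abs(dog) > 0.05).astype(float)            # keep strong edges only
    boosted = np.clip((image - 0.5) * 1.5 + 0.5, 0, 1)   # simple contrast increase
    sketch = boosted * mask                              # colour dots near edges only
    print(f"printed dots: {100 * (sketch > 0).mean():.1f}% of pixels")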
Automatization of hardware configuration for plasma diagnostic system
NASA Astrophysics Data System (ADS)
Wojenski, A.; Pozniak, K. T.; Kasprowicz, G.; Kolasinski, P.; Krawczyk, R. D.; Zabolotny, W.; Linczuk, P.; Chernyshova, M.; Czarski, T.; Malinowski, K.
2016-09-01
Soft X-ray plasma measurement systems are mostly multi-channel, high-performance systems. In the case of a modular construction, it is necessary to perform sophisticated system discovery in parallel with automatic system configuration. In this paper, the structure of the modular system designed for tokamak plasma soft X-ray measurements is described. The concept of system discovery and further automatic configuration is also presented. The FCS application (FMC/FPGA Configuration Software) is used to run sophisticated system setup with automatic verification of proper configuration. In order to provide flexibility for further system configurations (e.g., user setup), a common communication interface is also described. The approach presented here is related to the automatic system firmware building presented in previous papers. Modular construction and multichannel measurement are key requirements in terms of SXR diagnostics using GEM detectors.
Save Energy Now Assessments Results 2008 Summary Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wright, Anthony L; Martin, Michaela A; Nimbalkar, Sachin U
In October 2005, U.S. Department of Energy Secretary Bodman launched his Easy Ways to Save Energy campaign with a promise to provide energy assessments to 200 of the largest U.S. manufacturing plants. DOE's Industrial Technologies Program (ITP) responded to the Secretary's campaign with its Save Energy Now initiative, featuring a new and highly cost-effective form of energy savings assessment. The approach for these assessments drew heavily on the existing resources of ITP's technology delivery component. Over the years, ITP Technology Delivery has worked with industry partners to assemble a suite of respected software tools, proven assessment protocols, training curricula, certified energy experts, and strong partnerships for deployment. The Save Energy Now assessments conducted in calendar year 2006 focused on natural gas savings and targeted many of the nation's largest manufacturing plants - those that consume at least 1 TBtu of energy annually. The 2006 Save Energy Now assessments focused primarily on assessments of steam and process heating systems, which account for an estimated 74% of all natural gas use by U.S. manufacturing plants. Because of the success of the Save Energy Now assessments conducted in 2006 and 2007, the program was expanded and enhanced in two major ways in 2008: (1) a new goal was set to perform at least 260 assessments; and (2) the assessment focus was expanded to include pumping, compressed air, and fan systems in addition to steam and process heating. DOE ITP also has developed software tools to assess energy efficiency improvement opportunities in pumping, compressed air, and fan systems. The Save Energy Now assessments integrate a strong training component designed to teach industrial plant personnel how to use DOE's opportunity assessment software tools. This approach has the advantages of promoting strong buy-in of plant personnel for the assessment and its outcomes and preparing them better to independently replicate the assessment process at the company's other facilities. Another important element of the Save Energy Now assessment process is the follow-up process used to identify how many of the recommended savings opportunities from individual assessments have been implemented in the industrial plants. Plant personnel involved with the Save Energy Now assessments are contacted 6 months, 12 months, and 24 months after individual assessments are completed to determine implementation results. A total of 260 Save Energy Now assessments were successfully completed in calendar year 2008. This means that a total of 718 assessments were completed in 2006, 2007, and 2008. As of July 2009, we have received a total of 239 summary reports from the ESAs that were conducted in year 2008. Hence, at the time that this report was prepared, 680 final assessment reports were completed (200 from year 2006, 241 from year 2007, and 239 from year 2008). The total identified potential cost savings from these 680 assessments is $1.1 billion per year, including natural gas savings of about 98 TBtu per year. These results, if fully implemented, could reduce CO2 emissions by about 8.9 million metric tons annually. When this report was prepared, data on implementation of recommended energy and cost savings measures from 488 Save Energy Now assessments were available.
For these 488 plants, measures saving a total of $147 million per year have been implemented, measures that will save $169 million per year are in the process of being implemented, and plants are planning implementation of measures that will save another $239 million per year. The implemented recommendations are already achieving total CO2 reductions of about 1.8 million metric tons per year. This report provides a summary of the key results for the Save Energy Now assessments completed in 2008; details of the 6-month, 12-month, and 24-month implementation results obtained to date; and an evaluation of these implementation results. This report also summarizes key accomplishments, findings, and lessons learned from all the Save Energy Now assessments completed to date. A separate report (Wright et al. 2010) provides more detailed information on key results for all of the 2008 assessments of steam, process heating, pumping, compressed air, and fan systems. Two prior reports (Wright et al. 2007 and Wright et al. 2009) detail the results from the 2006 and 2007 assessments and discuss the major components of the assessment process and improvements in the process made in 2007.
Save Energy Now Assessments Results 2008 Detailed Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wright, Anthony L; Martin, Michaela A; Nimbalkar, Sachin U
Zheng, Rencheng; Yamabe, Shigeyuki; Nakano, Kimihiko; Suda, Yoshihiro
2015-01-01
Nowadays, insight into human-machine interaction is a critical topic given the large-scale development of intelligent vehicles. Biosignal analysis can provide a deeper understanding of driver behaviors that may indicate rationally practical use of automatic technology. This study therefore concentrates on biosignal analysis to quantitatively evaluate the mental stress of drivers during automatic driving of trucks, with the vehicles set a close gap distance apart to reduce air resistance and save energy. Using two wearable sensor systems, continuous measurement was realized for palmar perspiration and masseter electromyography, and a biosignal processing method was proposed to assess mental stress levels. In a driving simulator experiment, ten participants completed automatic driving with 4, 8, and 12 m gap distances from the preceding vehicle, and manual driving with about a 25 m gap distance as a reference. It was found that mental stress significantly increased when the gap distance decreased, and an abrupt increase in the mental stress of drivers was also observed accompanying a sudden change of the gap distance during automatic driving, corresponding to significantly higher ride discomfort according to subjective reports. PMID:25738768
Research-oriented image registry for multimodal image integration.
Tanaka, M; Sadato, N; Ishimori, Y; Yonekura, Y; Yamashita, Y; Komuro, H; Hayahsi, N; Ishii, Y
1998-01-01
To provide multimodal biomedical images automatically, we constructed a research-oriented image registry, the Data Delivery System (DDS), on the campus local area network. Machines that generate images (imagers: DSA, ultrasound, PET, MRI, SPECT and CT) were connected to the campus LAN. Once a patient is registered, all of the patient's images are automatically picked up by DDS as they are generated, transferred through the gateway server to the intermediate server, and copied into the directory of the user who registered the patient. DDS informs the user through e-mail that new data have been generated and transferred. The data format is automatically converted into the one chosen by the user. Data inactive for a certain period on the intermediate server are automatically archived onto the final and permanent data server, which is based on compact disks. As a soft link is automatically generated through this step, a user has access to all (old or new) image data of the patients of interest. As DDS runs with minimal maintenance, the cost and time for data transfer are significantly reduced. By making the complex process of data transfer and conversion invisible, DDS has made it easy for computer-naive researchers to concentrate on their biomedical interests.
Automating the Presentation of Computer Networks
2006-12-01
software to overlay operational state information. Other network management tools like Computer Associates Unicenter [6,7] generate internal network... and required manual placement assistance. A number of software libraries [20] offer a wealth of automatic layout algorithms and presentation...
NASA Technical Reports Server (NTRS)
Wolf, S. W. D.; Goodyer, M. J.
1982-01-01
Operation of the Transonic Self-Streamlining Wind Tunnel (TSWT) involved on-line data acquisition with automatic wall adjustment. A tunnel run consisted of streamlining the walls from known starting contours in iterative steps and acquiring model data. Each run performs what is described as a streamlining cycle. The associated software is presented.
Software-aided automatic laser optoporation and transfection of cells
Georg Breunig, Hans; Uchugonova, Aisada; Batista, Ana; König, Karsten
2015-01-01
Optoporation, the permeabilization of a cell membrane by laser pulses, has emerged as a powerful non-invasive and highly efficient technique to induce transfection of cells. However, the usual tedious manual targeting of individual cells significantly limits the addressable cell number. To overcome this limitation, we present an experimental setup with custom-made software control for computer-automated cell optoporation. The software evaluates the image contrast of cell contours, automatically designates cell locations for laser illumination, centres those locations in the laser focus, and executes the illumination. By software-controlled meandering of the sample stage, in principle all cells in a typical cell culture dish can be targeted without further user interaction. The automation allows for a significant increase in the number of treatable cells compared to a manual approach. For a laser illumination duration of 100 ms, 7-8 positions on different cells can be targeted every second inside the area of the microscope field of view. The experimental capabilities of the setup are illustrated in experiments with Chinese hamster ovary cells. Furthermore, the influence of laser power is discussed, with attention to post-treatment cell survival and optoporation-efficiency rates. PMID:26053047
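The targeting loop (contrast-based contour detection, centroid selection, stage motion, illumination) can be approximated with standard image-processing primitives. A minimal sketch with OpenCV, assuming an 8-bit grayscale camera frame; move_stage_to() and fire_laser() are hypothetical stand-ins for the hardware control, not part of the published setup:

    import cv2

    def find_cell_targets(frame):
        # Smooth, then binarize so that cell contours stand out.
        blur = cv2.GaussianBlur(frame, (5, 5), 0)
        _, binary = cv2.threshold(blur, 0, 255,
                                  cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        targets = []
        for c in contours:
            m = cv2.moments(c)
            if m["m00"] > 0:
                # Centroid of each detected cell contour.
                targets.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
        return targets

    # for x, y in find_cell_targets(frame):
    #     move_stage_to(x, y)   # hypothetical stage controller
    #     fire_laser(100)       # hypothetical 100 ms illumination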
Spreadsheets for Analyzing and Optimizing Space Missions
NASA Technical Reports Server (NTRS)
Some, Raphael R.; Agrawal, Anil K.; Czikmantory, Akos J.; Weisbin, Charles R.; Hua, Hook; Neff, Jon M.; Cowdin, Mark A.; Lewis, Brian S.; Iroz, Juana; Ross, Rick
2009-01-01
XCALIBR (XML Capability Analysis LIBRary) is a set of Extensible Markup Language (XML) database and spreadsheet-based analysis software tools designed to assist in technology-return-on-investment analysis and optimization of technology portfolios pertaining to outer-space missions. XCALIBR is also being examined for use in planning, tracking, and documentation of projects. An XCALIBR database contains information on mission requirements and technological capabilities, which are related by use of an XML taxonomy. XCALIBR incorporates a standardized interface for exporting data and analysis templates to an Excel spreadsheet. Unique features of XCALIBR include the following: It is inherently hierarchical by virtue of its XML basis. The XML taxonomy codifies a comprehensive data structure and data dictionary that includes performance metrics for spacecraft, sensors, and spacecraft systems other than sensors. The taxonomy contains >700 nodes representing all levels, from system through subsystem to individual parts. All entries are searchable and machine readable. There is an intuitive Web-based user interface. The software automatically matches technologies to mission requirements. The software automatically generates, and makes the required entries in, an Excel return-on-investment analysis software tool. The results of an analysis are presented in both tabular and graphical displays.
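The automatic matching of technologies to mission requirements can be pictured with a small XML example. The tag names, the metric, and the "stay under the cap" matching rule below are all invented for illustration; the real XCALIBR taxonomy has more than 700 nodes:

    import xml.etree.ElementTree as ET

    LIBRARY = (
        "<library>"
        "<capability id='sensor-A'><metric name='mass_kg'>2.0</metric></capability>"
        "<capability id='sensor-B'><metric name='mass_kg'>5.5</metric></capability>"
        "</library>"
    )

    def match(max_requirements, xml_text):
        # A capability matches if every required metric stays under its cap.
        root = ET.fromstring(xml_text)
        hits = []
        for cap in root.iter("capability"):
            metrics = {m.get("name"): float(m.text) for m in cap.iter("metric")}
            if all(metrics.get(name, float("inf")) <= limit
                   for name, limit in max_requirements.items()):
                hits.append(cap.get("id"))
        return hits

    print(match({"mass_kg": 3.0}, LIBRARY))   # -> ['sensor-A']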
Automatic mouse ultrasound detector (A-MUD): A new tool for processing rodent vocalizations.
Zala, Sarah M; Reitschmidt, Doris; Noll, Anton; Balazs, Peter; Penn, Dustin J
2017-01-01
House mice (Mus musculus) emit complex ultrasonic vocalizations (USVs) during social and sexual interactions, which have features similar to bird song (i.e., they are composed of several different types of syllables, uttered in succession over time to form a pattern of sequences). Manually processing complex vocalization data is time-consuming and potentially subjective, and therefore, we developed an algorithm that automatically detects mouse ultrasonic vocalizations (Automatic Mouse Ultrasound Detector or A-MUD). A-MUD is a script that runs on STx acoustic software (S_TOOLS-STx version 4.2.2), which is free for scientific use. This algorithm improved the efficiency of processing USV files, as it was 4-12 times faster than manual segmentation, depending upon the size of the file. We evaluated A-MUD error rates using manually segmented sound files as a 'gold standard' reference, and compared them to a commercially available program. A-MUD had lower error rates than the commercial software, as it detected significantly more correct positives, and fewer false positives and false negatives. The errors generated by A-MUD were mainly false negatives, rather than false positives. This study is the first to systematically compare error rates for automatic ultrasonic vocalization detection methods, and A-MUD and subsequent versions will be made available for the scientific community.
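Evaluating a detector against manually segmented "gold standard" files, as done here, reduces to matching detected time intervals against reference intervals and counting the leftovers. A minimal sketch of that bookkeeping; the 50% overlap criterion is a generic choice, not A-MUD's published matching rule:

    def evaluate(detected, reference, min_overlap=0.5):
        # detected, reference: lists of (start_s, end_s) intervals.
        def overlaps(d, r):
            inter = min(d[1], r[1]) - max(d[0], r[0])
            return inter > min_overlap * (r[1] - r[0])

        matched = set()
        tp = 0
        for d in detected:
            hit = next((i for i, r in enumerate(reference)
                        if i not in matched and overlaps(d, r)), None)
            if hit is not None:
                matched.add(hit)
                tp += 1
        fp = len(detected) - tp    # detections with no reference USV
        fn = len(reference) - tp   # reference USVs that were missed
        return tp, fp, fn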
DALMATIAN: An Algorithm for Automatic Cell Detection and Counting in 3D.
Shuvaev, Sergey A; Lazutkin, Alexander A; Kedrov, Alexander V; Anokhin, Konstantin V; Enikolopov, Grigori N; Koulakov, Alexei A
2017-01-01
Current 3D imaging methods, including optical projection tomography, light-sheet microscopy, block-face imaging, and serial two-photon tomography, enable visualization of large samples of biological tissue. Large volumes of data obtained at high resolution require development of automatic image processing techniques, such as algorithms for automatic cell detection or, more generally, point-like object detection. Current approaches to automated cell detection suffer from difficulties with particular cell types, cell populations of differing brightness, non-uniform staining, and overlapping cells. In this study, we present a set of algorithms for robust automatic cell detection in 3D. Our algorithms are suitable for, but not limited to, whole brain regions and individual brain sections. We used a watershed procedure to split regional maxima representing overlapping cells. We developed a bootstrap Gaussian fit procedure to evaluate the statistical significance of detected cells. We compared the cell detection quality of our algorithm and other software using 42 samples representing 6 staining and imaging techniques. The results provided by our algorithm matched manual expert quantification with signal-to-noise-dependent confidence, including samples with cells of differing brightness, non-uniform staining, and overlapping cells, for whole brain regions and individual tissue sections. Our algorithm provided the best cell detection quality among the tested free and commercial software.
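The splitting step described (regional maxima plus watershed) is a standard recipe that can be sketched with scikit-image. This omits the paper's bootstrap Gaussian significance test, which would follow as a per-label post-filter; the threshold and min_distance values are placeholders:

    import numpy as np
    from scipy import ndimage as ndi
    from skimage.feature import peak_local_max
    from skimage.segmentation import watershed

    def detect_cells(volume, threshold):
        # Binary foreground mask; works for 2D sections or 3D stacks.
        fg = volume > threshold
        # Distance-transform peaks sit near the centre of each cell.
        distance = ndi.distance_transform_edt(fg)
        coords = peak_local_max(distance, min_distance=3, labels=fg)
        seeds = np.zeros(distance.shape, dtype=bool)
        seeds[tuple(coords.T)] = True
        markers, n_cells = ndi.label(seeds)
        # Watershed splits touching cells along distance-transform valleys.
        return watershed(-distance, markers, mask=fg), n_cells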
Mrozek, Dariusz; Małysiak-Mrozek, Bożena; Siążnik, Artur
2013-03-01
Background Due to the growing number of biomedical entries in data repositories of the National Center for Biotechnology Information (NCBI), it is difficult to collect, manage and process all of these entries in one place by third-party software developers without significant investment in hardware and software infrastructure, its maintenance and administration. Web services allow development of software applications that integrate in one place the functionality and processing logic of distributed software components, without integrating the components themselves and without integrating the resources to which they have access. This is achieved by appropriate orchestration or choreography of available Web services and their shared functions. After the successful application of Web services in the business sector, this technology can now be used to build composite software tools that are oriented towards biomedical data processing. Results We have developed a new tool for efficient and dynamic data exploration in GenBank and other NCBI databases. A dedicated search GenBank system makes use of NCBI Web services and a package of Entrez Programming Utilities (eUtils) in order to provide extended searching capabilities in NCBI data repositories. In search GenBank users can use one of the three exploration paths: simple data searching based on the specified user’s query, advanced data searching based on the specified user’s query, and advanced data exploration with the use of macros. search GenBank orchestrates calls of particular tools available through the NCBI Web service providing requested functionality, while users interactively browse selected records in search GenBank and traverse between NCBI databases using available links. On the other hand, by building macros in the advanced data exploration mode, users create choreographies of eUtils calls, which can lead to the automatic discovery of related data in the specified databases. Conclusions search GenBank extends standard capabilities of the NCBI Entrez search engine in querying biomedical databases. The possibility of creating and saving macros in the search GenBank is a unique feature and has a great potential. The potential will further grow in the future with the increasing density of networks of relationships between data stored in particular databases. search GenBank is available for public use at http://sgb.biotools.pl/. PMID:23452691
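The eUtils choreography described above (search one database, then follow links into another) maps directly onto Biopython's Entrez wrappers, which is one convenient way to reproduce the pattern outside search GenBank. A minimal sketch; the query term is arbitrary and error handling is omitted:

    from Bio import Entrez

    Entrez.email = "you@example.org"   # NCBI requires a contact address

    # esearch: find nucleotide records matching a query.
    handle = Entrez.esearch(db="nucleotide",
                            term="BRCA1[Gene] AND human[Organism]", retmax=5)
    ids = Entrez.read(handle)["IdList"]

    # elink: follow links from those records into PubMed.
    handle = Entrez.elink(dbfrom="nucleotide", db="pubmed", id=",".join(ids))
    pmids = []
    for linkset in Entrez.read(handle):
        for linkdb in linkset.get("LinkSetDb", []):
            pmids += [link["Id"] for link in linkdb["Link"]]

    # efetch: retrieve the linked abstracts.
    handle = Entrez.efetch(db="pubmed", id=",".join(pmids[:3]),
                           rettype="abstract", retmode="text")
    print(handle.read())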
Evaluation of a De-Identification Process for Ocular Imaging
NASA Technical Reports Server (NTRS)
LaPelusa, Michael B.; Mason, Sara S.; Taiym, Wafa F.; Sargsyan, Ashot; Lee, Lesley R.; Wear, Mary L.; Van Baalen, Mary
2015-01-01
Medical privacy of NASA astronauts requires an organized and comprehensive approach when data are being made available outside NASA systems. A combination of factors, including the uniquely small patient population, the extensive medical testing done on these individuals, and the relative cultural popularity of the astronauts, puts them at a far greater risk of potential exposure of personal information than the general public. Therefore, care must be taken to ensure that the astronauts' identities are concealed. Magnetic Resonance Imaging (MRI) medical data is a recent source of interest to researchers concerned with the development of Visual Impairment due to Intracranial Pressure (VIIP) in the astronaut population. Each vision MRI scan of an astronaut includes 176 separate sagittal images that are saved as an "image series" for clinical use. In addition to the medical information these image sets provide, they also inherently contain a substantial amount of non-medical personally identifiable information (PII) such as name, date of birth, and date of exam. We have shown that an image set of this type can be rendered, using free software, to give an accurate representation of the patient's face. This currently restricts NASA from dispensing MRI data to researchers in a de-identified format. Automated software programs, such as the Brain Extraction Tool, are available to researchers who wish to de-identify MRI sagittal brain images by "erasing" identifying characteristics such as the nose and jaw on the image sets. However, this software is not useful to NASA for vision research because it removes the portion of the images around the eye orbits, which is the main area of interest to researchers studying the VIIP syndrome. The Lifetime Surveillance of Astronaut Health program has resolved this issue by developing a protocol to de-identify MRI sagittal brain images using Showcase Premier, a DICOM (Digital Imaging and Communications in Medicine) software package. The software allows manual editing of one image from a patient's image set to be automatically applied to the entire image series. This new approach would allow a new level of access to untapped medical imaging data relating to VIIP that can be utilized by researchers while protecting the privacy of the astronauts. In the next step toward finalizing this technique, NASA clinical radiology consultants will test the images to verify removal of all metadata and PII.
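The series-wide edit described (one manually chosen region applied to all 176 images) can be pictured with pydicom. The blanking rectangle, file layout, and uncompressed pixel data are assumptions for illustration; this is not the Showcase Premier workflow itself:

    import glob
    import pydicom

    def blank_face_region(series_dir, row_slice, col_slice):
        """Apply one manually chosen blanking rectangle to every image."""
        for path in sorted(glob.glob(f"{series_dir}/*.dcm")):
            ds = pydicom.dcmread(path)
            px = ds.pixel_array
            px[row_slice, col_slice] = 0      # erase identifying anatomy
            ds.PixelData = px.tobytes()
            ds.PatientName = "ANONYMOUS"      # scrub a PII header field too
            ds.save_as(path)

    # blank_face_region("mri_series", slice(0, 60), slice(100, 220))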
NASA Technical Reports Server (NTRS)
1994-01-01
MathSoft Plus 5.0 is a calculation software package for electrical engineers and computer scientists who need advanced math functionality. It incorporates SmartMath, an expert system that determines a strategy for solving difficult mathematical problems. SmartMath was the result of the integration into Mathcad of CLIPS, a NASA-developed shell for creating expert systems. By using CLIPS, MathSoft, Inc. was able to save the time and money involved in writing the original program.
CrossTalk: The Journal of Defense Software Engineering. Volume 19, Number 6
2006-06-01
improvement methods. The total volume of projects studied now exceeds 12,000...While performing quality consulting, Olson has helped organizations measurably improve quality and productivity, save millions of dollars in costs of...This article draws parallels between the outrageous events on the Jerry Springer Show and problems faced by process improvement programs. by Paul
World Wind Tools Reveal Environmental Change
NASA Technical Reports Server (NTRS)
2012-01-01
Originally developed under NASA's Learning Technologies program as a tool to engage and inspire students, World Wind software was released under the NASA Open Source Agreement license. Honolulu, Hawaii based Intelesense Technologies is one of the companies currently making use of the technology for environmental, public health, and other monitoring applications for nonprofit organizations and Government agencies. The company saved about $1 million in development costs by using the NASA software.
Pan, Jui-Wen; Tsai, Pei-Jung; Chang, Kao-Der; Chang, Yung-Yuan
2013-03-01
In this paper, we propose a method to analyze the light extraction efficiency (LEE) enhancement of a nanopatterned sapphire substrate (NPSS) light-emitting diode (LED) by comparing wave optics software with ray optics software. Finite-difference time-domain (FDTD) simulations represent the wave optics software and Light Tools (LTs) simulations represent the ray optics software. First, we find the trends of, and an optimal solution for, the LEE enhancement when 2D-FDTD simulations are used to save simulation time and computational memory. The rigorous coupled-wave analysis method is utilized to explain the trend obtained from the 2D-FDTD algorithm. The optimal solution is then applied in 3D-FDTD and LTs simulations. The results are similar, and the difference in LEE enhancement between the two simulations does not exceed 8.5% for the small LED chip area. More than 10^4 times less computational memory is used by the LTs simulation in comparison to the 3D-FDTD simulation. Moreover, LEE enhancement from the side of the LED can be obtained in the LTs simulation. An actual-size NPSS LED is simulated using LTs. The results show a more than 307% improvement in the total LEE enhancement of the NPSS LED with the optimal solution compared to the conventional LED.
New conversion factors between human and automatic readouts of the CDMAM phantom for CR systems
NASA Astrophysics Data System (ADS)
Hummel, Johann; Homolka, Peter; Osanna-Elliot, Angelika; Kaar, Marcus; Semtrus, Friedrich; Figl, Michael
2016-03-01
Mammography screening demands thorough image quality (IQ) assessment to guarantee screening success. The European protocol for the quality control of the physical and technical aspects of mammography screening (EPQCM) suggests a contrast-detail phantom such as the CDMAM phantom to evaluate IQ. Software for automatic evaluation is provided by EUREF. As human and automatic readouts differ systematically, conversion factors were published by the official reference organisation (EUREF). Because we experienced a significant difference in these factors for Computed Radiography (CR) systems, we developed an objectifying analysis software which presents the cells containing the gold disks with randomized thickness and rotation. This overcomes the problem of an inevitable learning effect in which observers know the positions of the disks in advance. Applying this software, 45 computed radiography (CR) systems were evaluated and the conversion factors between human and automatic readout determined. The resulting conversion factors were compared with the ones resulting from the two methods published by EUREF. We found our conversion factors to be substantially lower than those suggested by EUREF, in particular 1.21 compared to 1.42 (EUREF EU method) and 1.62 (EUREF UK method) for 0.1 mm, and 1.40 compared to 1.73 (EUREF EU) and 1.83 (EUREF UK) for 0.25 mm disc diameter, respectively. This can result in a dose increase of up to 90% using either of these factors to adjust patient dose in order to fulfill image quality requirements. This suggests the need for an agreement on their proper application and limits the validity of the assessment methods. Therefore, we want to stress the need for clear criteria for CR systems based on appropriate studies.
Kuretzki, Carlos Henrique; Campos, Antônio Carlos Ligocki; Malafaia, Osvaldo; Soares, Sandramara Scandelari Kusano de Paula; Tenório, Sérgio Bernardo; Timi, Jorge Rufino Ribas
2016-03-01
The use of information technology is widespread in healthcare. With regard to scientific research, SINPE(c) - Integrated Electronic Protocols - was created as a tool to support researchers, offering clinical data standardization. Until now, SINPE(c) lacked statistical tests obtained by automatic analysis. The aim was to add to SINPE(c) features for the automatic execution of the main statistical methods used in medicine. The study was divided into four topics: checking user interest in the implementation of the tests; surveying the frequency of their use in healthcare; carrying out the implementation; and validating the results with researchers and their protocols. It was applied to a group of users of this software working on their stricto sensu master's and doctoral theses in one postgraduate program in surgery. To assess the reliability of the statistics, the data obtained automatically by SINPE(c) were compared with those calculated manually by a statistician experienced with this type of study. There was interest in the use of automatic statistical tests, with good acceptance. The chi-square, Mann-Whitney, Fisher and Student's t tests were considered the tests most frequently used by participants in medical studies. These methods were implemented and thereafter approved as expected. The incorporation of automatic statistical analysis into SINPE(c) was shown to be reliable and equivalent to the manually performed analysis, validating its use as a tool for medical research.
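The four tests requested by the users are all available in SciPy, which gives a feel for what such automatic analysis computes (SINPE(c) is its own implementation; the data below are made up):

    from scipy import stats

    outcome_a = [12.1, 13.4, 11.8, 14.0, 12.9]   # made-up sample data
    outcome_b = [15.2, 14.8, 16.1, 15.5, 14.9]
    table = [[10, 20], [30, 25]]                 # made-up 2x2 contingency table

    chi2, p_chi2, dof, expected = stats.chi2_contingency(table)
    u_stat, p_mw = stats.mannwhitneyu(outcome_a, outcome_b)
    odds, p_fisher = stats.fisher_exact(table)
    t_stat, p_t = stats.ttest_ind(outcome_a, outcome_b)

    print(f"chi-square p={p_chi2:.3f}, Mann-Whitney p={p_mw:.3f}, "
          f"Fisher p={p_fisher:.3f}, t-test p={p_t:.3f}")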
Raineri, M; Traina, M; Rotolo, A; Candela, B; Lombardo, R M; Raineri, A A
1993-05-01
Thallium-201 scintigraphy is a widely used noninvasive procedure for the detection and prognostic assessment of patients with suspected or proven coronary artery disease. Thallium uptake can be evaluated by visual analysis or by quantitative interpretation. Quantitative scintigraphy enhances disease detection in individual coronary arteries and provides a more precise estimate of the amount of ischemic myocardium, distinguishing scar from hypoperfused tissue. Because of the large amount of data involved, the analysis, interpretation and comparison of thallium uptake can be very complex. We designed a computer-based system for the interpretation of quantitative thallium-201 scintigraphy uptake data. We used a database (DataEase 4.2, DataEase Italia). Our software has the following functions: data storage; calculation; conversion of numerical data into different definitions classifying myocardial perfusion; uptake data comparison; automatic conclusion; and comparison of different scintigrams for the same patient. Our software is made up of four sections: numeric analysis, descriptive analysis, automatic conclusion, and clinical remarks. We introduced into the computer system appropriate knowledge, "logical paths", that use "IF ... THEN" rules. The software executes these rules in order to analyze the myocardial regions in the 3 phases of scintigraphic analysis (stress, redistribution, re-injection) and in the 3 projections (LAO 45 degrees, LAT, ANT), applying our uptake cutoffs and finally producing the automatic conclusions. For these reasons, our computer-based system can be considered a true "expert system".
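The "IF ... THEN" style of such a system can be sketched in a few lines. The uptake scores, cutoffs, and conclusions below are invented placeholders, not the authors' clinical rules:

    # Each rule: (condition on stress/redistribution uptake, conclusion).
    RULES = [
        (lambda s, r: s < 50 and r >= 70, "reversible defect (ischemia)"),
        (lambda s, r: s < 50 and r < 50,  "fixed defect (scar)"),
        (lambda s, r: s >= 70,            "normal perfusion"),
    ]

    def conclude(stress_uptake, redistribution_uptake):
        for condition, conclusion in RULES:
            if condition(stress_uptake, redistribution_uptake):
                return conclusion
        return "indeterminate"

    print(conclude(42, 75))   # -> reversible defect (ischemia)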
Investigating energy-saving potentials in the cloud.
Lee, Da-Sheng
2014-02-20
Collecting webpage messages can serve as a sensor for investigating the energy-saving potential of buildings. Focusing on stores, a cloud sensor system is developed to collect data and determine their energy-saving potential. The owner of a store under investigation must register online and report the store address, area, and the customer ID number on the electric meter. The cloud sensor system automatically surveys the energy usage records by connecting to the power company website and calculating the energy use index (EUI) of the store. Other data include the chain store check, company capital, location price, and the influence of weather conditions on the store; even the exposure frequency of the store under investigation on the web may affect the energy usage collected online. After collecting data from numerous stores, a multi-dimensional data array is constructed to determine energy-saving potential by identifying stores with similar conditions. Similar conditions refer to analyzed results indicating that two stores have similar capital, business scale, weather conditions, and web exposure frequency. By calculating the EUI difference or pure technical efficiency of stores, the energy-saving potential is determined. In this study, a real case study is performed. An 8-dimensional (8D) data array is constructed by surveying web data related to 67 stores. This study then investigated the saving potential of 33 of the stores using site visits and employed the cloud sensor system to determine the saving potential. The case study results show good agreement between the data obtained by the site visits and the cloud investigation, with errors within 4.17%. Among the 33 samples, eight stores have low saving potentials of less than 5%. The developed cloud sensor successfully identifies them as having low saving potential, avoiding money wasted on site visits.
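The underlying comparison is simple arithmetic: compute each store's EUI, group stores with similar conditions, and read the saving potential off the EUI gap to the best comparable peer. A toy sketch of that logic; the field names and the 20% similarity tolerance are invented, not the paper's 8D method:

    def eui(annual_kwh, floor_area_m2):
        return annual_kwh / floor_area_m2          # kWh / m^2 / year

    def similar(a, b, tol=0.2):
        # Stores are comparable if capital and area agree within 20%.
        return (abs(a["capital"] - b["capital"]) <= tol * b["capital"]
                and abs(a["area"] - b["area"]) <= tol * b["area"])

    def saving_potential(store, peers):
        group = [p for p in peers if similar(store, p)]
        if not group:
            return None
        best = min(eui(p["kwh"], p["area"]) for p in group)
        own = eui(store["kwh"], store["area"])
        return max(0.0, (own - best) / own)        # fraction of energy saveable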
Generating Safety-Critical PLC Code From a High-Level Application Software Specification
NASA Technical Reports Server (NTRS)
2008-01-01
The benefits of automatic application-code generation are widely accepted within the software engineering community. These benefits include a raised abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at Kennedy Space Center recognized the need for PLC code generation while developing the new ground checkout and launch processing system, called the Launch Control System (LCS). Engineers developed a process and a prototype software tool that automatically translates a high-level representation or specification of application software into ladder logic that executes on a PLC. All the computer hardware in the LCS is planned to be commercial off-the-shelf (COTS), including industrial controllers or PLCs that are connected to the sensors and end items out in the field. Most of the software in LCS is also planned to be COTS, with only small adapter software modules that must be developed in order to interface between the various COTS software products. A domain-specific language (DSL) is a programming language designed to perform tasks and to solve problems in a particular domain, such as ground processing of launch vehicles. The LCS engineers created a DSL for developing test sequences of ground checkout and launch operations of future launch vehicle and spacecraft elements, and they are developing a tabular specification format that uses the DSL keywords and functions familiar to the ground and flight system users. The tabular specification format, or tabular spec, allows most ground and flight system users to document how the application software is intended to function and requires little or no software programming knowledge or experience. A small sample from a prototype tabular spec application is shown.
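To make the tabular-spec idea concrete, here is a toy translator from hypothetical table rows to IEC 61131-3 structured text. The column names, commands, device names, and output dialect are all assumptions for illustration; the LCS prototype targets ladder logic and a richer DSL:

    ROWS = [
        {"device": "LOX_VALVE_01", "command": "OPEN", "value": None},
        {"device": "TANK_PRESS_PSI", "command": "VERIFY_ABOVE", "value": 34.5},
        {"device": "LOX_VALVE_01", "command": "CLOSE", "value": None},
    ]

    def generate_structured_text(rows):
        out = []
        for r in rows:
            if r["command"] == "OPEN":
                out.append(f'{r["device"]} := TRUE;')
            elif r["command"] == "CLOSE":
                out.append(f'{r["device"]} := FALSE;')
            elif r["command"] == "VERIFY_ABOVE":
                out.append(f'IF {r["device"]} <= {r["value"]} THEN '
                           f'Abort := TRUE; END_IF;')
        return "\n".join(out)

    print(generate_structured_text(ROWS))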
NASA Technical Reports Server (NTRS)
Chu, Robert L.; Bayha, Tom D.; Davis, HU; Ingram, J. ED; Shukla, Jay G.
1992-01-01
Composite wing and fuselage structural design/manufacturing concepts have been developed and evaluated. Trade studies were performed to determine how well the concepts satisfy the program goals of 25 percent cost savings, 40 percent weight savings with aircraft resizing, and 50 percent part count reduction as compared to the aluminum Lockheed L-1011 baseline. The concepts, developed using emerging technologies such as large-scale resin transfer molding (RTM), automated tow placement (ATP), braiding, and out-of-autoclave and automated manufacturing processes for both thermoset and thermoplastic materials, were evaluated for possible application in the design concepts. Trade studies were used to determine which concepts carry forward into the detailed design development subtask.
CIS3/398: Implementation of a Web-Based Electronic Patient Record for Transplant Recipients
Fritsche, L; Lindemann, G; Schroeter, K; Schlaefer, A; Neumayer, H-H
1999-01-01
Introduction While the "Electronic patient record" (EPR) is a frequently quoted term in many areas of healthcare, only few working EPR systems are available so far. To justify their use, EPRs must be able to store and display all kinds of medical information in a reliable, secure, time-saving, user-friendly way at an affordable price. Fields with patients who are attended to by a large number of medical specialists over a prolonged period of time are best suited to demonstrate the potential benefits of an EPR. The aim of our project was to investigate the feasibility of an EPR based solely on "off-the-shelf" software and Internet technology in the field of organ transplantation. Methods The EPR system consists of three main elements: data-storage facilities, a Web server and a user interface. Data are stored either in a relational database (Sybase Adaptive 11.5, Sybase Inc., CA) or, in the case of pictures (JPEG) and files in application formats (e.g. Word documents), on a Windows NT 4.0 Server (Microsoft Corp., WA). The entire communication of all data is handled by a Web server (IIS 4.0, Microsoft) with an Active Server Pages extension. The database is accessed by ActiveX Data Objects via the ODBC interface. The only software required on the user's computer is Internet Explorer 4.01 (Microsoft); during the first use of the EPR, the ActiveX HTML Layout Control is automatically added. The user can access the EPR via Local or Wide Area Network or by dial-up connection. If the EPR is accessed from outside the firewall, all communication is encrypted (SSL 3.0, Netscape Comm. Corp., CA). The speed of the EPR system was tested with 50 repeated measurements of the duration of two key functions: 1) display of all lab results for a given day and patient and 2) automatic composition of a letter containing diagnoses, medication, notes and lab results. For the test a 233 MHz Pentium II processor with 10 Mbit/s Ethernet connection (ping time below 10 ms) over 2 hubs to the server (400 MHz Pentium II, 256 MB RAM) was used. Results So far the EPR system has been running for eight consecutive months and contains complete records of 673 transplant recipients with an average follow-up of 9.9 (SD: 4.9) years and a total of 1.1 million lab values. Instruction to enable new users to perform basic operations took less than two hours in all cases. The average duration of laboratory access was 0.9 (SD: 0.5) seconds; the automatic composition of a letter took 6.1 (SD: 2.4) seconds. Apart from the database and Windows NT, all other components are available for free. The development of the EPR system required less than two person-years. Conclusion Implementation of an Electronic patient record that meets the requirements of comprehensiveness, reliability, security, speed, user-friendliness and affordability using a combination of "off-the-shelf" software products can be feasible if the current state-of-the-art internet technology is applied.
NASA Technical Reports Server (NTRS)
Mitchell, Christine M.
1998-01-01
Historically, Command Management Systems (CMS) have been large, expensive, spacecraft-specific software systems that were costly to build, operate, and maintain. Current and emerging hardware, software, and user interface technologies may offer an opportunity to facilitate the initial formulation and design of a spacecraft-specific CMS as well as to develop a more generic CMS or a set of core components for CMS systems. Current MOC (mission operations center) hardware and software include Unix workstations, the C/C++ and Java programming languages, and X and Java window interface representations. This configuration provides the power and flexibility to support sophisticated systems and intelligent user interfaces that exploit state-of-the-art technologies in human-machine systems engineering, decision making, artificial intelligence, and software engineering. One of the goals of this research is to explore the extent to which technologies developed in the research laboratory can be productively applied in a complex system such as spacecraft command management. Initial examination of some of the issues in CMS design and operation suggests that application of technologies such as intelligent planning, case-based reasoning, design and analysis tools from a human-machine systems engineering point of view (e.g., operator and designer models) and human-computer interaction tools (e.g., graphics, visualization, and animation) may provide significant savings in the design, operation, and maintenance of a spacecraft-specific CMS as well as continuity for CMS design and development across spacecraft with varying needs. The savings in this case come from software reuse at all stages of the software engineering process.
Carrafiello, Gianpaolo; Ierardi, Anna Maria; Duka, Ejona; Radaelli, Alessandro; Floridi, Chiara; Bacuzzi, Alessandro; de Bucourt, Maximilian; De Marchi, Giuseppe
2016-04-01
This study was designed to evaluate the utility of dual phase cone beam computed tomography (DP-CBCT) and automatic vessel detection (AVD) software to guide transarterial embolization (TAE) of angiographically challenging arterial bleedings in emergency settings. Twenty patients with an arterial bleeding at computed tomography angiography and an inconclusive identification of the bleeding vessel at the initial 2D angiographic series were included. Accuracy of DP-CBCT and AVD software were defined as the ability to detect the bleeding site and the culprit arterial bleeder, respectively. Technical success was defined as the correct positioning of the microcatheter using AVD software. Clinical success was defined as successful embolization. The total volume of iodinated contrast medium and the overall procedure time were registered. The bleeding site was not detected by the initial angiogram in 20% of cases, while the impossibility of identifying the bleeding vessel was the reason for inclusion in the remaining cases. The bleeding site was detected by DP-CBCT in 19 of 20 (95%) patients; in one case CBCT-CT fusion was required. AVD software identified the culprit arterial branch in 18 of 20 (90%) cases. In two cases, vessel tracking required manual marking of the candidate arterial bleeder. Technical success was 95%. Successful embolization was achieved in all patients. The mean contrast volume injected per patient was 77.5 ml, and the mean overall procedural time was 50 min. The use of C-arm CBCT and AVD software during TAE of angiographically challenging arterial bleedings is feasible and may facilitate successful embolization. Staff training in CBCT imaging and software manipulation is necessary.
A methodology for testing fault-tolerant software
NASA Technical Reports Server (NTRS)
Andrews, D. M.; Mahmood, A.; Mccluskey, E. J.
1985-01-01
A methodology for testing fault-tolerant software is presented. There are problems associated with testing fault-tolerant software because many errors are masked or corrected by voters, limiters, or automatic channel synchronization. This methodology illustrates how the same strategies used for testing fault-tolerant hardware can be applied to testing fault-tolerant software. For example, one strategy used in testing fault-tolerant hardware is to disable the redundancy during testing. A similar testing strategy is proposed for software, namely, to move the major emphasis of testing earlier in the development cycle (before the redundancy is in place), thus reducing the possibility that undetected errors will be masked when limiters and voters are added.
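The masking problem is easy to see with a two-of-three voter. A toy sketch (the channel functions are invented): a fault in one channel never reaches the voted output, so a test of the full redundant path passes even though a channel is wrong, which is why the methodology tests channels before the redundancy is in place:

    def vote(a, b, c):
        # Two-of-three majority voter.
        return a if a == b or a == c else b

    def good_channel(x):
        return x * x

    def buggy_channel(x):
        return x * x + (1 if x == 3 else 0)   # fault only for x == 3

    # System-level test: the voter masks the single-channel fault.
    assert vote(buggy_channel(3), good_channel(3), good_channel(3)) == 9
    # Channel-level test (redundancy disabled) exposes it:
    assert buggy_channel(3) != 9
    print("voter masked the fault; channel test caught it")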
Liu, Haisong; Li, Jun; Pappas, Evangelos; Andrews, David; Evans, James; Werner-Wasik, Maria; Yu, Yan; Dicker, Adam; Shi, Wenyin
2016-09-08
An automatic brain-metastases planning (ABMP) software package has been installed in our institution. It is dedicated to treating multiple brain metastases with radiosurgery on linear accelerators (linacs) using a single setup isocenter with noncoplanar dynamic conformal arcs. This study validates the absolute dose and dose distribution calculated by ABMP. Three types of measurements were performed: (1) dual micro ion chambers were used with an acrylic phantom to measure the absolute dose; (2) a 3D cylindrical phantom with a dual diode array was used to evaluate 2D dose distribution and point dose for smaller targets; and (3) a 3D pseudo-in vivo patient-specific phantom filled with polymer gels was used to evaluate the accuracy of the 3D dose distribution and radiation delivery. Micro chamber measurement of two targets (volumes of 1.2 cc and 0.9 cc, respectively) showed that the percentage differences of the absolute dose at both targets were less than 1%. The averaged gamma index (GI) passing rate of five different plans measured with the diode array phantom was above 98%, using criteria of 3% dose difference, 1 mm distance to agreement (DTA), and a 10% low-dose threshold. 3D gel phantom measurements demonstrated a 3D displacement of nine targets of 0.7 ± 0.4 mm (range 0.2-1.1 mm). The averaged two-dimensional (2D) GI passing rate for regions of interest (ROIs) on axial slices encompassing each of the nine targets was above 98% (5% dose difference, 2 mm DTA, 10% low-dose threshold). The measured D95, the minimum dose that covers 95% of the target volume, of the nine targets was 0.7% less than the calculated D95. Three different types of dosimetric verification methods were used and proved that the dose calculation of the new automatic brain-metastases planning (ABMP) software was clinically acceptable. The 3D pseudo-in vivo patient-specific gel phantom test also served as an end-to-end test validating not only the dose calculation but also the treatment delivery accuracy. © 2016 The Authors.
Program Helps Decompose Complicated Design Problems
NASA Technical Reports Server (NTRS)
Rogers, James L., Jr.
1993-01-01
Time is saved by intelligent decomposition of a design problem into smaller, interrelated problems. DeMAID is a knowledge-based software system for ordering the sequence of modules and identifying a possible multilevel structure for a design problem. It displays the modules in an N x N matrix format. Although it requires an investment of time to generate and refine the list of modules for input, it saves a considerable amount of money and time in the total design process, particularly for new design problems in which the ordering of modules has not been defined. The program can also be used to examine an assembly-line process or the ordering of tasks and milestones.
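The core of such sequencing can be illustrated with a dependency-based ordering of modules (a plain topological sort; DeMAID's knowledge-based handling of feedback loops and multilevel structure goes well beyond this). The module names and dependencies are a hypothetical example:

    from graphlib import TopologicalSorter

    # module -> set of modules whose outputs it needs
    deps = {
        "aero":      {"geometry"},
        "structure": {"geometry", "aero"},
        "weights":   {"structure"},
        "geometry":  set(),
    }

    print(list(TopologicalSorter(deps).static_order()))
    # -> ['geometry', 'aero', 'structure', 'weights']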
DAISY: a new software tool to test global identifiability of biological and physiological systems.
Bellu, Giuseppina; Saccomani, Maria Pia; Audoly, Stefania; D'Angiò, Leontina
2007-10-01
A priori global identifiability is a structural property of biological and physiological models. It is considered a prerequisite for well-posed estimation, since it concerns the possibility of recovering uniquely the unknown model parameters from measured input-output data, under ideal conditions (noise-free observations and error-free model structure). Of course, determining if the parameters can be uniquely recovered from observed data is essential before investing resources, time and effort in performing actual biomedical experiments. Many interesting biological models are nonlinear, but identifiability analysis for nonlinear systems turns out to be a difficult mathematical problem. Different methods have been proposed in the literature to test identifiability of nonlinear models but, to the best of our knowledge, so far no software tools have been proposed for automatically checking identifiability of nonlinear models. In this paper, we describe a software tool implementing a differential algebra algorithm to perform parameter identifiability analysis for (linear and) nonlinear dynamic models described by polynomial or rational equations. Our goal is to provide the biological investigator a completely automatized software, requiring minimum prior knowledge of mathematical modelling and no in-depth understanding of the mathematical tools. The DAISY (Differential Algebra for Identifiability of SYstems) software will potentially be useful in biological modelling studies, especially in physiology and clinical medicine, where research experiments are particularly expensive and/or difficult to perform. Practical examples of use of the software tool DAISY are presented. DAISY is available at the web site http://www.dei.unipd.it/~pia/.
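A textbook illustration of the property DAISY checks (this example is generic, not taken from the paper): consider the one-compartment model

    \dot{x}(t) = -p_1 x(t), \qquad x(0) = x_0, \qquad y(t) = p_2 x(t),

whose output is y(t) = p_2 x_0 e^{-p_1 t}. The decay rate fixes p_1 uniquely, so p_1 is globally identifiable; but the data constrain only the product p_2 x_0, so if x_0 is unknown, p_2 is not identifiable: every pair (p_2, x_0) with the same product yields identical input-output data.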
NASA Astrophysics Data System (ADS)
Lelièvre, Peter G.; Grey, Melissa
2017-08-01
Quantitative morphometric analyses of form are widely used in palaeontology, especially for taxonomic and evolutionary research. These analyses can involve several measurements performed on hundreds or even thousands of samples. Performing measurements of size and shape on large assemblages of macro- or microfossil samples is generally infeasible or impossible with traditional instruments such as vernier calipers. Instead, digital image processing software is required to perform measurements via suitable digital images of samples. Many software packages exist for morphometric analyses, but little is available for the integral stage of data collection, particularly for the measurement of the outlines of samples. Some software exists to automatically detect the outline of a fossil sample from a digital image. However, automatic outline detection methods may perform inadequately when samples have incomplete outlines or images contain poor contrast between the sample and the staging background. Hence, a manual digitization approach may be the only option. We are not aware of any software packages designed specifically for efficient digital measurement of fossil assemblages with numerous samples, especially for the purposes of manual outline analysis. Throughout several previous studies, we have developed a new software tool, JMorph, which is custom-built for that task. JMorph provides the means to perform many different types of measurements, which we describe in this manuscript. We focus on JMorph's ability to rapidly and accurately digitize the outlines of fossils. JMorph is freely available from the authors.
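Once an outline is digitized as a closed polygon of (x, y) vertices, basic size measures follow from elementary formulas. A minimal sketch (plain shoelace area plus perimeter; JMorph's own measurement set is far broader):

    import math

    def area_and_perimeter(points):
        # points: list of (x, y) vertices of a closed outline.
        area = 0.0
        perim = 0.0
        n = len(points)
        for i in range(n):
            x1, y1 = points[i]
            x2, y2 = points[(i + 1) % n]
            area += x1 * y2 - x2 * y1          # shoelace term
            perim += math.hypot(x2 - x1, y2 - y1)
        return abs(area) / 2.0, perim

    print(area_and_perimeter([(0, 0), (4, 0), (4, 3), (0, 3)]))  # (12.0, 14.0)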
Constructing an Online Test Framework, Using the Example of a Sign Language Receptive Skills Test
ERIC Educational Resources Information Center
Haug, Tobias; Herman, Rosalind; Woll, Bencie
2015-01-01
This paper presents the features of an online test framework for a receptive skills test that has been adapted, based on a British template, into different sign languages. The online test includes features that meet the needs of the different sign language versions. Features such as usability of the test, automatic saving of scores, and score…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rees, Brian G.
These are slides from a presentation. The identiFINDER provides information on radiation levels. It can automatically identify isotopes in its library. It can save spectra for transfer to a computer, and has a 4-8 hour battery life. The following is covered: an overview, operating modes, getting started, finder mode, search, identification mode, dose & rate, warning & alarm, options (ultra LGH), options (identifinder2), and general procedure.
ERIC Educational Resources Information Center
Hartman, JudithAnn R.; Dahm, Donald J.; Nelson, Eric A.
2015-01-01
Studies in cognitive science have verified that working memory (where the brain solves problems) can manipulate nearly all elements of knowledge that can be recalled automatically from long-term memory, but only a few elements that have not previously been well memorized. Research in reading comprehension has found that "lecture notes with…
An Energy-Efficient Multi-Tier Architecture for Fall Detection Using Smartphones.
Guvensan, M Amac; Kansiz, A Oguz; Camgoz, N Cihan; Turkmen, H Irem; Yavuz, A Gokhan; Karsligil, M Elif
2017-06-23
Automatic detection of fall events is vital to providing fast medical assistance to the casualty, particularly when the injury causes loss of consciousness. Optimization of the energy consumption of mobile applications, especially those which run 24/7 in the background, is essential for longer use of smartphones. In order to improve energy efficiency without compromising fall detection performance, we propose a novel 3-tier architecture that combines simple thresholding methods with machine learning algorithms. The proposed method is implemented in a mobile application, called uSurvive, for Android smartphones. It runs as a background service, monitors the activities of a person in daily life, and automatically sends a notification to the appropriate authorities and/or user-defined contacts when it detects a fall. The performance of the proposed method was evaluated in terms of fall detection performance and energy consumption. Real-life performance tests conducted on two different models of smartphone demonstrate that our 3-tier architecture with feature reduction could save up to 62% of energy compared to machine-learning-only solutions. In addition to this energy saving, the hybrid method has 93% accuracy, which is superior to thresholding methods and better than machine-learning-only solutions.
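The energy logic of such a tiered design is that cheap checks run continuously and the expensive classifier wakes only on suspicion. A schematic sketch; the thresholds and the classifier are placeholders, not the uSurvive values:

    import math

    IMPACT_G = 2.5    # tier 1: possible-impact threshold (assumed value)
    STILL_G = 0.15    # tier 2: post-impact stillness band (assumed value)

    def magnitude(ax, ay, az):
        return math.sqrt(ax * ax + ay * ay + az * az)

    def fall_suspected(window):
        # window: list of (ax, ay, az) accelerometer samples in g.
        mags = [magnitude(*s) for s in window]
        impact = max(mags) > IMPACT_G
        still_after = abs(mags[-1] - 1.0) < STILL_G   # near resting gravity
        return impact and still_after

    def detect(window, classifier):
        # Tier 3 (the costly ML model) runs only when tiers 1-2 fire.
        return classifier(window) if fall_suspected(window) else False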
Nakamura, R; Sasaki, M; Oikawa, H; Harada, S; Tamakawa, Y
2000-03-01
To use an intranet technique to develop an information system that simultaneously supports both diagnostic reports and radiotherapy planning images. Using a file server as the gateway, a radiation oncology LAN was connected to an already operative RIS LAN. Dose-distribution images were saved in tagged-image-file format by way of a screen dump to the file server. X-ray simulator images and portal images were saved in encapsulated PostScript format on the file server and automatically converted to portable document format. The files on the file server were automatically registered to the Web server by the search engine and were available for searching and browsing using a Web browser. It took less than a minute to register planning images. For clients, searching and browsing a file took less than 3 seconds. Over 150,000 reports and 4,000 images from a six-month period were accessible. Because the intranet technique was used, construction and maintenance were completed without specialist expertise. Prompt access to essential information about radiotherapy has been made possible by this system. It promotes shared access to radiotherapy planning information, which may improve the quality of treatment.
MEMS product engineering: methodology and tools
NASA Astrophysics Data System (ADS)
Ortloff, Dirk; Popp, Jens; Schmidt, Thilo; Hahn, Kai; Mielke, Matthias; Brück, Rainer
2011-03-01
The development of MEMS comprises the structural design as well as the definition of an appropriate manufacturing process. Technology constraints have a considerable impact on the device design and vice versa. Product design and technology development are therefore concurrent tasks. Based on a comprehensive methodology, the authors introduce a software environment that links commercial design tools from both areas into a common design flow. In this paper emphasis is put on automatic, low-threshold data acquisition. The intention is to collect and categorize development data for further developments with minimum overhead and minimum disturbance of established business processes. As a first step, software tools that automatically extract data from spreadsheets or file systems and put them into context with existing information are presented. The developments are currently carried out in a European research project.
DEPEND - A design environment for prediction and evaluation of system dependability
NASA Technical Reports Server (NTRS)
Goswami, Kumar K.; Iyer, Ravishankar K.
1990-01-01
The development of DEPEND, an integrated simulation environment for the design and dependability analysis of fault-tolerant systems, is described. DEPEND models both hardware and software components at a functional level, and allows automatic failure injection to assess system performance and reliability. It relieves the user of the work needed to inject failures, maintain statistics, and output reports. The automatic failure injection scheme is geared toward evaluating a system under high stress (workload) conditions. The failures that are injected can affect both hardware and software components. To illustrate the capability of the simulator, a distributed system which employs a prediction-based, dynamic load-balancing heuristic is evaluated. Experiments were conducted to determine the impact of failures on system performance and to identify the failures to which the system is especially susceptible.
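Automatic failure injection of the kind DEPEND performs can be pictured as wrapping simulated components so that faults are introduced, and statistics maintained, without touching user code. A toy sketch; the fault model (a fixed per-call failure probability) is an assumption, not DEPEND's actual workload-dependent scheme:

    import random

    class FailureInjector:
        """Wrap a component; each call fails with probability p."""
        def __init__(self, component, p, stats):
            self.component, self.p, self.stats = component, p, stats

        def __call__(self, *args):
            if random.random() < self.p:
                self.stats["injected"] = self.stats.get("injected", 0) + 1
                raise RuntimeError("injected failure")
            return self.component(*args)

    stats = {}
    adder = FailureInjector(lambda a, b: a + b, p=0.1, stats=stats)
    ok = fails = 0
    for _ in range(1000):
        try:
            adder(1, 2)
            ok += 1
        except RuntimeError:
            fails += 1
    print(ok, fails, stats)   # the injector also maintains the statistics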
Automatic Nanodesign Using Evolutionary Techniques
NASA Technical Reports Server (NTRS)
Globus, Al; Saini, Subhash (Technical Monitor)
1998-01-01
Many problems associated with the development of nanotechnology require custom designed molecules. We use genetic graph software, a new development, to automatically evolve molecules of interest when only the requirements are known. Genetic graph software designs molecules, and potentially nanoelectronic circuits, given a fitness function that determines which of two molecules is better. A set of molecules, the first generation, is generated at random and then tested with the fitness function. Subsequent generations are created by randomly choosing two parent molecules with a bias towards high-scoring molecules, tearing each molecule in two at random, and mating parts from the mother and father to create two children. This procedure is repeated until a satisfactory molecule is found. An atom pair similarity test is currently used as the fitness function to evolve molecules similar to existing pharmaceuticals.
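The evolutionary loop described maps onto a generic genetic algorithm. A stripped-down sketch over strings rather than molecular graphs; the alphabet, target string, and position-match fitness are placeholders standing in for the atom-pair similarity test:

    import random

    TARGET = "CCOC(=O)C"   # placeholder goal standing in for a known molecule
    ALPHABET = "CNO()="

    def fitness(ind):
        # Placeholder similarity: count of matching positions.
        return sum(a == b for a, b in zip(ind, TARGET))

    def mate(mother, father):
        # Tear both parents at a random point and swap the halves.
        cut = random.randrange(1, len(TARGET))
        return mother[:cut] + father[cut:], father[:cut] + mother[cut:]

    pop = ["".join(random.choice(ALPHABET) for _ in TARGET) for _ in range(60)]
    for _ in range(500):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == len(TARGET):
            break
        parents = pop[: len(pop) // 2]        # bias toward high scorers
        children = []
        while len(children) < len(pop):
            children.extend(mate(*random.sample(parents, 2)))
        pop = children
    pop.sort(key=fitness, reverse=True)
    print(pop[0], fitness(pop[0]))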
Comparison of Acceleration Techniques for Selected Low-Level Bioinformatics Operations
Langenkämper, Daniel; Jakobi, Tobias; Feld, Dustin; Jelonek, Lukas; Goesmann, Alexander; Nattkemper, Tim W.
2016-01-01
In recent years, clock rates of modern processors have stagnated while the demand for computing power has continued to grow. This applies particularly to the fields of life sciences and bioinformatics, where new technologies keep on creating rapidly growing piles of raw data with increasing speed. The number of cores per processor increased in an attempt to compensate for slight increments of clock rates. This technological shift demands changes in software development, especially in the field of high performance computing, where parallelization techniques are gaining in importance due to the pressing issue of large datasets generated by, e.g., modern genomics. This paper presents an overview of state-of-the-art manual and automatic acceleration techniques and lists some applications employing these in different areas of sequence informatics. Furthermore, we provide examples for automatic acceleration of two use cases to show typical problems and gains of transforming a serial application to a parallel one. The paper should aid the reader in choosing a technique for the problem at hand. We compare four different state-of-the-art automatic acceleration approaches (OpenMP, PluTo-SICA, PPCG, and OpenACC). Their performance as well as their applicability for selected use cases is discussed. While optimizations targeting the CPU worked better in the complex k-mer use case, optimizers for Graphics Processing Units (GPUs) performed better in the matrix multiplication example. But performance is only superior at a certain problem size due to data migration overhead. We show that automatic code parallelization is feasible with current compiler software and yields significant increases in execution speed. Automatic optimizers for CPU are mature and usually no additional manual adjustment is required. In contrast, some automatic parallelizers targeting GPUs still lack maturity and are limited to simple statements and structures. PMID:26904094
Automated Transformation of CDISC ODM to OpenClinica.
Gessner, Sophia; Storck, Michael; Hegselmann, Stefan; Dugas, Martin; Soto-Rey, Iñaki
2017-01-01
Due to the increasing use of electronic data capture systems for clinical research, the interest in saving resources by automatically generating and reusing case report forms in clinical studies is growing. OpenClinica, an open-source electronic data capture system, enables the reuse of metadata via its own Excel import template, which hampers the reuse of metadata defined in other standard formats. One of these standard formats is the Operational Data Model for metadata, administrative and clinical data in clinical studies. This work suggests a mapping from the Operational Data Model to OpenClinica and describes the implementation of a converter that automatically generates OpenClinica-conformant case report forms from metadata in the Operational Data Model.
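The conversion direction is concrete: read ItemDefs from an ODM file and emit rows shaped like OpenClinica's Excel template. A minimal sketch with ElementTree; the output columns and type mapping are a simplified subset, not the paper's full converter:

    import xml.etree.ElementTree as ET

    ODM_NS = "{http://www.cdisc.org/ns/odm/v1.3}"

    def odm_items(path):
        """Yield (OID, name, ODM data type) for every ItemDef in an ODM file."""
        root = ET.parse(path).getroot()
        for item in root.iter(ODM_NS + "ItemDef"):
            yield item.get("OID"), item.get("Name"), item.get("DataType")

    # Map ODM data types onto (simplified) OpenClinica template types.
    OC_TYPE = {"text": "ST", "integer": "INT", "float": "REAL", "date": "DATE"}

    # for oid, name, dtype in odm_items("study_odm.xml"):   # hypothetical file
    #     print(oid, name, OC_TYPE.get(dtype, "ST"))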
Experimental analysis of drilling process in cortical bone.
Wang, Wendong; Shi, Yikai; Yang, Ning; Yuan, Xiaoqing
2014-02-01
Bone drilling is an essential part of orthopaedics, traumatology and bone biopsy. Prediction and control of drilling forces and torque are critical to the success of operations involving bone drilling. This paper studied the drilling force, torque and drilling process for automatic and manual drills penetrating bovine cortical bone. The tests were performed on a drilling system which is used to drill and to measure forces and torque during drilling. The effects of drilling speed, feed rate and drill bit diameter on force and torque were discussed separately. The experimental results were shown to be in accordance with the mathematical expressions introduced in this paper. The automatic drilling saved 30-60% of drilling time in the tested range and created less vibration compared to manual drilling. The deviation between maximum and average force was 5 N for automatic drilling but 25 N for manual drilling. In conclusion, the automatic method has significant advantages in controlling drilling force, torque and the drilling process in bone drilling. Copyright © 2013 IPEM. Published by Elsevier Ltd. All rights reserved.