Code of Federal Regulations, 2014 CFR
2014-01-01
... AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE REGULATIONS OF THE NATIONAL WEATHER SERVICE MODERNIZATION OF THE NATIONAL WEATHER SERVICE § 946.2 Definitions. Automate (or automation) means to replace employees performing surface observations at a field office with automated weather service observation...
Automation of surface observations program
NASA Technical Reports Server (NTRS)
Short, Steve E.
1988-01-01
At present, surface weather observing methods are still largely manual and labor intensive. Through the nationwide implementation of Automated Surface Observing Systems (ASOS), this situation can be improved. Two ASOS capability levels are planned. The first is a basic-level system which will automatically observe the weather parameters essential for aviation operations and will operate either with or without supplemental contributions by an observer. The second is a more fully automated, stand-alone system which will observe and report the full range of weather parameters and will operate primarily in the unattended mode. Approximately 250 systems are planned by the end of the decade. When deployed, these systems will generate the standard hourly and special long-line transmitted weather observations, as well as provide continuous weather information direct to airport users. Specific ASOS configurations will vary depending upon whether the operation is unattended, minimally attended, or fully attended. The major functions of ASOS are data collection, data processing, product distribution, and system control. The program phases of development, demonstration, production system acquisition, and operational implementation are described.
Michael J. Erickson; Joseph J. Charney; Brian A. Colle
2016-01-01
A fire weather index (FWI) is developed using wildfire occurrence data and Automated Surface Observing System weather observations within a subregion of the northeastern United States (NEUS) from 1999 to 2008. Average values of several meteorological variables, including near-surface temperature, relative humidity, dewpoint, wind speed, and cumulative daily...
Parks, Connie L; Monson, Keith L
2018-05-01
This study employed an automated facial recognition system as a means of objectively evaluating biometric correspondence between a ReFace facial approximation and the computed tomography (CT) derived ground truth skin surface of the same individual. High rates of biometric correspondence were observed, irrespective of rank class (Rk) or demographic cohort examined. Overall, 48% of the test subjects' ReFace approximation probes (n=96) were matched to his or her corresponding ground truth skin surface image at R1, a rank indicating a high degree of biometric correspondence and a potential positive identification. Identification rates improved with each successively broader rank class (R10=85%, R25=96%, and R50=99%), with 100% identification by R57. A sharp increase (39% mean increase) in identification rates was observed between R1 and R10 across most rank classes and demographic cohorts. In contrast, significantly lower (p<0.01) increases in identification rates were observed between R10 and R25 (8% mean increase) and R25 and R50 (3% mean increase). No significant (p>0.05) performance differences were observed across demographic cohorts or CT scan protocols. Performance measures observed in this research suggest that ReFace approximations are biometrically similar to the actual faces of the approximated individuals and, therefore, may have potential operational utility in contexts in which computerized approximations are utilized as probes in automated facial recognition systems. Copyright © 2018. Published by Elsevier B.V.
Automation of Endmember Pixel Selection in SEBAL/METRIC Model
NASA Astrophysics Data System (ADS)
Bhattarai, N.; Quackenbush, L. J.; Im, J.; Shaw, S. B.
2015-12-01
The commonly applied surface energy balance for land (SEBAL) and its variant, mapping evapotranspiration (ET) at high resolution with internalized calibration (METRIC), require manual selection of endmember (i.e., hot and cold) pixels to calibrate sensible heat flux. Current approaches for automating this process are based on statistical methods and do not appear to be robust under varying climate conditions and seasons. In this paper, we introduce a new approach based on simple machine learning tools and search algorithms that provides an automatic and time-efficient way of identifying endmember pixels for use in these models. The fully automated models were applied to over 100 cloud-free Landsat images, with each image covering several eddy covariance flux sites in Florida and Oklahoma. Observed land surface temperatures at automatically identified hot and cold pixels were within 0.5% of those from pixels manually identified by an experienced operator (coefficient of determination, R2, ≥ 0.92; Nash-Sutcliffe efficiency, NSE, ≥ 0.92; and root mean squared error, RMSE, ≤ 1.67 K). Daily ET estimates derived from the automated SEBAL and METRIC models were in good agreement with their manual counterparts (e.g., NSE ≥ 0.91 and RMSE ≤ 0.35 mm day^-1). Automated and manual pixel selection resulted in similar estimates of observed ET across all sites. The proposed approach should reduce time demands for applying SEBAL/METRIC models and allow for their more widespread and frequent use. This automation can also reduce potential bias that could be introduced by an inexperienced operator and extend the domain of the models to new users.
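The statistical flavor of endmember selection that the paper automates can be illustrated with a minimal sketch: pick the cold pixel from well-vegetated, cool candidates and the hot pixel from bare, warm candidates using NDVI and land surface temperature grids. The thresholds and the simple min/max criterion below are illustrative assumptions, not the authors' machine-learning method.

```python
# Hypothetical sketch of endmember (hot/cold) pixel selection from
# land surface temperature (LST) and NDVI grids, flattened to 1-D lists.
# Thresholds are illustrative, not from the paper.

def select_endmembers(lst, ndvi, veg_thresh=0.7, bare_thresh=0.2):
    """Return (cold_index, hot_index) into the flattened grids.

    Cold pixel: coolest among well-vegetated pixels (NDVI >= veg_thresh).
    Hot pixel: warmest among bare pixels (NDVI <= bare_thresh).
    """
    cold_candidates = [i for i, v in enumerate(ndvi) if v >= veg_thresh]
    hot_candidates = [i for i, v in enumerate(ndvi) if v <= bare_thresh]
    if not cold_candidates or not hot_candidates:
        raise ValueError("no candidate pixels for one of the endmembers")
    cold = min(cold_candidates, key=lambda i: lst[i])
    hot = max(hot_candidates, key=lambda i: lst[i])
    return cold, hot

# Toy four-pixel "grid"
lst = [305.0, 298.5, 312.2, 300.1]   # Kelvin
ndvi = [0.15, 0.80, 0.10, 0.75]
cold, hot = select_endmembers(lst, ndvi)
```

A real implementation would add cloud masking and the search/ML refinement the paper describes; the point here is only the candidate-filter-then-extremum structure.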
Roques, Christine; Al Mousa, Haifaa; Duse, Adriano; Gallagher, Rose; Koburger, Torsten; Lingaas, Egil; Petrosillo, Nicola; Škrlin, Jasenka
2015-01-01
Healthcare-associated infections have serious implications for both patients and hospitals. Environmental surface contamination is the key to transmission of nosocomial pathogens. Routine manual cleaning and disinfection eliminates visible soil and reduces environmental bioburden and risk of transmission, but may not address some surface contamination. Automated area decontamination technologies achieve more consistent and pervasive disinfection than manual methods, but it is challenging to demonstrate their efficacy within a randomized trial of the multiple interventions required to reduce healthcare-associated infection rates. Until data from multicenter observational studies are available, automated area decontamination technologies should be an adjunct to manual cleaning and disinfection within a total, multi-layered system and risk-based approach designed to control environmental pathogens and promote patient safety.
1980-07-01
Excerpt (table-of-contents fragment): Section 3.2, Experimental Determination of the Dependence of Rayleigh Wave Amplitude on Properties of the Source Material; 3.3.2, Surface Wave Observations; 3.3.3, Surface Wave Dependence on Source Material Properties. The report deals with various aspects of the problem of estimating yield from single-station recordings of surface waves.
Near-surface remote sensing of spatial and temporal variation in canopy phenology
Andrew D. Richardson; Bobby H. Braswell; David Y. Hollinger; Julian P. Jenkins; Scott V. Ollinger
2009-01-01
There is a need to document how plant phenology is responding to global change factors, particularly warming trends. "Near-surface" remote sensing, using radiometric instruments or imaging sensors, has great potential to improve phenological monitoring because automated observations can be made at high temporal frequency. Here we build on previous work and...
Automated classification of articular cartilage surfaces based on surface texture.
Stachowiak, G P; Stachowiak, G W; Podsiadlo, P
2006-11-01
In this study the automated classification system previously developed by the authors was used to classify articular cartilage surfaces with different degrees of wear. This automated system classifies surfaces based on their texture. Plug samples of sheep cartilage (pins) were run on stainless steel discs under various conditions using a pin-on-disc tribometer. Testing conditions were specifically designed to produce different severities of cartilage damage due to wear. Environmental scanning electron microscope (ESEM) images of cartilage surfaces, which formed a database for pattern recognition analysis, were acquired. The ESEM images of cartilage were divided into five groups (classes), each class representing different wear conditions or wear severity. Each class was first examined and assessed visually. Next, the automated classification system (pattern recognition) was applied to all classes. The results of the automated surface texture classification were compared to those based on visual assessment of surface morphology. It was shown that the texture-based automated classification system was an efficient and accurate method of distinguishing between various cartilage surfaces generated under different wear conditions. It appears that the texture-based classification method has potential to become a useful tool in medical diagnostics.
Study of living single cells in culture: automated recognition of cell behavior.
Bodin, P; Papin, S; Meyer, C; Travo, P
1988-07-01
An automated system capable of analyzing the behavior, in real time, of single living cells in culture, in a noninvasive and nondestructive way, has been developed. A large number of cell positions in single culture dishes were recorded using a computer-controlled, robotized microscope. During subsequent observations, binary images obtained from video image analysis of the microscope visual field allowed the identification of the recorded cells. These cells could be revisited automatically every few minutes. Long-term studies of the behavior of cells make possible the analysis of cellular locomotory and mitotic activities as well as determination of cell shape (chosen from a defined library) for several hours or days in a fully automated way, with observations spaced up to 30 minutes apart. Short-term studies of the behavior of cells permit the study, in a semiautomatic way, of acute effects of drugs (5 to 15 minutes) on changes in surface area and length of cells.
NASA Astrophysics Data System (ADS)
Jin, Rui; Kang, Jian
2017-04-01
Wireless Sensor Networks are recognized as one of the most important near-surface components of GEOSS (Global Earth Observation System of Systems), given the flourishing development of low-cost, robust, and integrated data loggers and sensors. A nested eco-hydrological wireless sensor network (EHWSN) was installed in the upper and middle reaches of the Heihe River Basin and has operated to obtain multi-scale observations of soil moisture, soil temperature, and land surface temperature from 2012 to the present. The spatial distribution of the EHWSN was optimally designed based on geostatistical theory, with the aim of capturing the spatial variations and temporal dynamics of soil moisture and soil temperature, and of producing ground truth at grid scale for validating related remote sensing products and model simulations over the heterogeneous land surface. In terms of upscaling research, we have developed a set of methods to aggregate multi-point WSN observations to grid scale (~1 km), including regression kriging estimation to utilize multi-source remote sensing auxiliary information, block kriging with homogeneous measurement errors, and a Bayesian-based upscaling algorithm that utilizes MODIS-derived apparent thermal inertia. All the EHWSN observations are organized as datasets and freely published at http://westdc.westgis.ac.cn/hiwater. The EHWSN integrates distributed observation nodes into an automated, intelligent, and remotely controllable network that provides superior integrated, standardized, and automated observation capabilities for hydrological and ecological process research at the basin scale.
Shen, Simon; Syal, Karan; Tao, Nongjian; Wang, Shaopeng
2015-12-01
We present a Single-Cell Motion Characterization System (SiCMoCS) to automatically extract bacterial cell morphological features from microscope images and use those features to automatically classify cell motion for rod-shaped motile bacterial cells. In some imaging-based studies, bacterial cells need to be attached to the surface for time-lapse observation of cellular processes such as cell membrane-protein interactions and membrane elasticity. These studies often generate large volumes of images. Extracting accurate bacterial cell morphology features from these images is critical for quantitative assessment. Using SiCMoCS, we demonstrated simultaneous and automated motion tracking and classification of hundreds of individual cells in an image sequence of several hundred frames. This is a significant improvement over traditional manual and semi-automated approaches to segmenting bacterial cells based on empirical thresholds, and a first attempt to automatically classify bacterial motion types for motile rod-shaped bacterial cells, which enables rapid and quantitative analysis of various types of bacterial motion.
Wavelength modulated surface enhanced (resonance) Raman scattering for background-free detection.
Praveen, Bavishna B; Steuwe, Christian; Mazilu, Michael; Dholakia, Kishan; Mahajan, Sumeet
2013-05-21
Spectra in surface-enhanced Raman scattering (SERS) are always accompanied by a continuum emission called the 'background' which complicates analysis and is especially problematic for quantification and automation. Here, we implement a wavelength modulation technique to eliminate the background in SERS and its resonant version, surface-enhanced resonance Raman scattering (SERRS). This is demonstrated on various nanostructured substrates used for SER(R)S. An enhancement in the signal to noise ratio for the Raman bands of the probe molecules is also observed. This technique helps to improve the analytical ability of SERS by alleviating the problem due to the accompanying background and thus making observations substrate independent.
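The modulation principle described above can be illustrated numerically: Raman bands track the excitation wavelength while the broad SERS background does not, so differencing spectra acquired at two excitation wavelengths cancels the background and leaves a derivative-shaped band signature. The Gaussian band and linear background below are invented purely for the demonstration.

```python
import math

# Toy wavelength-modulation demo: a static background plus one Raman
# band that shifts (in pixel units) with the excitation wavelength.
def spectrum(pixels, band_center, bg_slope=0.02, bg_offset=5.0):
    """Linear background + one Gaussian Raman band at band_center."""
    return [bg_offset + bg_slope * x
            + math.exp(-((x - band_center) ** 2) / (2 * 4.0 ** 2))
            for x in pixels]

pixels = range(100)
s1 = spectrum(pixels, band_center=50.0)   # excitation wavelength 1
s2 = spectrum(pixels, band_center=52.0)   # slightly shifted excitation
diff = [a - b for a, b in zip(s1, s2)]

# The static background cancels exactly in the difference; far from the
# band the residue is ~0, while near the band a bipolar signature remains.
background_residue = diff[0]
```

In practice the modulated spectra are combined over many wavelength steps rather than a single pair, but the cancellation of the wavelength-independent background is the same.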
Measurements of Martian dust devil winds with HiRISE
Choi, D.S.; Dundas, C.M.
2011-01-01
We report wind measurements within Martian dust devils observed in plan view from the High Resolution Imaging Science Experiment (HiRISE) orbiting Mars. The central color swath of the HiRISE instrument has three separate charge-coupled devices (CCDs) and color filters that observe the surface in rapid cadence. Active features, such as dust devils, appear in motion when observed by this region of the instrument. Our image animations reveal clear circulatory motion within dust devils that is separate from their translational motion across the Martian surface. Both manual and automated tracking of dust devil clouds reveal tangential winds that approach 20-30 m s^-1 in some cases. These winds are sufficient to induce a ~1% decrease in atmospheric pressure within the dust devil core relative to ambient, facilitating dust lifting by reducing the threshold wind speed for particle elevation. Finally, radial velocity profiles constructed from our automated measurements test the Rankine vortex model for dust devil structure. Our profiles successfully reveal the solid body rotation component in the interior, but fail to conclusively illuminate the profile in the outer regions of the vortex. One profile provides evidence for a velocity decrease as a function of r^-1/2, instead of r^-1, suggestive of surface friction effects. However, other profiles do not support this observation, or do not contain enough measurements to produce meaningful insights. Copyright 2011 by the American Geophysical Union.
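The Rankine vortex model tested above, and the link between tangential wind and core pressure deficit, can be sketched as follows. The Martian near-surface density and ambient pressure values below are rough illustrative assumptions, not figures from the paper.

```python
# Rankine vortex: solid-body rotation inside the core (v ~ r),
# potential-vortex decay outside (v ~ 1/r).
def rankine_speed(r, r_core, v_max):
    """Tangential wind speed at radius r for a Rankine vortex."""
    if r <= r_core:
        return v_max * r / r_core
    return v_max * r_core / r

def core_pressure_drop(v_max, rho):
    """Central pressure deficit for a Rankine vortex in cyclostrophic
    balance: rho * v_max**2 (half contributed by each region)."""
    return rho * v_max ** 2

v_max = 20.0       # m/s, lower end of the observed tangential winds
rho_mars = 0.02    # kg/m^3, rough near-surface Martian air density
p_ambient = 610.0  # Pa, rough ambient surface pressure

dp = core_pressure_drop(v_max, rho_mars)   # pressure deficit in Pa
fraction = dp / p_ambient                  # roughly a ~1% deficit
```

With these assumed values a 20 m/s vortex yields a deficit on the order of 1% of ambient pressure, consistent in magnitude with the abstract's statement.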
Yet another method for triangulation and contouring for automated cartography
NASA Technical Reports Server (NTRS)
De Floriani, L.; Falcidieno, B.; Nagy, G.; Pienovi, C.
1982-01-01
An algorithm is presented for hierarchical subdivision of a set of three-dimensional surface observations. The data structure used for obtaining the desired triangulation is also singularly appropriate for extracting contours. Some examples are presented, and the results obtained are compared with those given by Delaunay triangulation. The data points selected by the algorithm provide a better approximation to the desired surface than do randomly selected points.
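The selection criterion behind such error-driven subdivision can be illustrated in one dimension, a deliberate simplification of the paper's 2-D hierarchical triangulation: repeatedly insert the observation worst fit by the current piecewise-linear approximation.

```python
# 1-D analogue of error-driven point selection for surface approximation.
# The paper's method subdivides triangles; this sketch only shows the
# "insert the worst-fit point" criterion on a piecewise-linear curve.

def interp(knots, x):
    """Piecewise-linear interpolation through sorted (x, z) knots."""
    for (x0, z0), (x1, z1) in zip(knots, knots[1:]):
        if x0 <= x <= x1:
            t = (x - x0) / (x1 - x0)
            return z0 + t * (z1 - z0)
    raise ValueError("x outside knot range")

def greedy_select(points, n_keep):
    """Start from the endpoints, then repeatedly add the point with the
    largest error against the current approximation."""
    pts = sorted(points)
    knots = [pts[0], pts[-1]]
    while len(knots) < n_keep:
        worst = max((p for p in pts if p not in knots),
                    key=lambda p: abs(p[1] - interp(knots, p[0])))
        knots.append(worst)
        knots.sort()
    return knots

points = [(0, 0.0), (1, 0.1), (2, 2.0), (3, 0.1), (4, 0.0)]
selected = greedy_select(points, n_keep=3)   # keeps the sharp peak first
```

As in the paper's comparison with Delaunay triangulation of random points, points chosen this way concentrate where the surface departs most from the current approximation.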
NASA Technical Reports Server (NTRS)
Chitre, S. R.
1978-01-01
The paper presents an experimentally developed surface macro-structuring process suitable for high-volume production of silicon solar cells. The process lends itself easily to automation for high throughput to meet low-cost solar array goals. The tetrahedron structure observed is 0.5-12 microns high. The surface has minimal pitting, with virtually no or very few undeveloped areas across the surface. This process has been developed for (100)-oriented as-cut silicon. Chemi-etched, hydrophobic, and lapped surfaces were successfully texturized. A cost analysis per SAMICS is presented.
Satellite-derived potential evapotranspiration for distributed hydrologic runoff modeling
NASA Astrophysics Data System (ADS)
Spies, R. R.; Franz, K. J.; Bowman, A.; Hogue, T. S.; Kim, J.
2012-12-01
Distributed models can incorporate spatially variable data, especially high-resolution forcing inputs such as precipitation, temperature, and evapotranspiration, in hydrologic modeling. Use of distributed hydrologic models for operational streamflow prediction has been partially hindered by a lack of readily available, spatially explicit input observations. Potential evapotranspiration (PET), for example, is currently accounted for through PET input grids that are based on monthly climatological values. The goal of this study is to assess the use of satellite-based PET estimates that represent the temporal and spatial variability, as input to the National Weather Service (NWS) Hydrology Laboratory Research Distributed Hydrologic Model (HL-RDHM). Daily PET grids are generated for six watersheds in the upper Mississippi River basin using a method that applies only MODIS satellite-based observations and the Priestley-Taylor formula (MODIS-PET). The use of MODIS-PET grids will be tested against the use of the current climatological PET grids for simulating basin discharge. Gridded surface temperature forcing data are derived by applying the inverse distance weighting spatial prediction method to point-based station observations from the Automated Surface Observing System (ASOS) and Automated Weather Observing System (AWOS). Precipitation data are obtained from the Climate Prediction Center's (CPC) Climatology-Calibrated Precipitation Analysis (CCPA). A priori gridded parameters for the Sacramento Soil Moisture Accounting Model (SAC-SMA), Snow-17 model, and routing model are initially obtained from the Office of Hydrologic Development and further calibrated using an automated approach. The potential of the MODIS-PET to be used in an operational distributed modeling system will be assessed, with the long-term goal of promoting research-to-operations transfers and advancing the science of hydrologic forecasting.
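The Priestley-Taylor formula named above can be sketched as follows. The constants and the Tetens-style saturation-slope approximation are standard textbook values, not necessarily those used by the MODIS-PET method itself.

```python
import math

# Hedged sketch of the Priestley-Taylor PET formula:
#   PET = alpha * (Delta / (Delta + gamma)) * (Rn - G) / lambda
# with Delta the slope of the saturation vapour pressure curve.

def sat_vapour_slope(t_c):
    """Slope of saturation vapour pressure curve, kPa/degC (Tetens-based)."""
    es = 0.6108 * math.exp(17.27 * t_c / (t_c + 237.3))
    return 4098.0 * es / (t_c + 237.3) ** 2

def priestley_taylor_pet(t_c, net_rad_mj, ground_flux_mj=0.0,
                         alpha=1.26, gamma=0.066):
    """Daily PET in mm/day from air temperature (degC) and net radiation /
    ground heat flux in MJ m^-2 day^-1; gamma is the psychrometric
    constant in kPa/degC."""
    delta = sat_vapour_slope(t_c)
    lam = 2.45  # latent heat of vaporization, MJ/kg
    return alpha * (delta / (delta + gamma)) * (net_rad_mj - ground_flux_mj) / lam

# A warm summer day: ~5.7 mm/day with these inputs
pet = priestley_taylor_pet(t_c=25.0, net_rad_mj=15.0)
```

In a MODIS-driven setting, the temperature and net-radiation inputs would come from satellite retrievals on each grid cell rather than scalar values.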
NASA Astrophysics Data System (ADS)
Smith, Shawn; Bourassa, Mark
2014-05-01
The development of a new surface flux dataset based on underway meteorological observations from research vessels will be presented. The research vessel data center at the Florida State University routinely acquires, quality controls, and distributes underway surface meteorological and oceanographic observations from over 30 oceanographic vessels. These activities are coordinated by the Shipboard Automated Meteorological and Oceanographic System (SAMOS) initiative in partnership with the Rolling Deck to Repository (R2R) project. Recently, the SAMOS data center has used these underway observations to produce bulk flux estimates for each vessel along individual cruise tracks. A description of this new flux product, along with the underlying data quality control procedures applied to SAMOS observations, will be provided. Research vessels provide underway observations at high-temporal frequency (1 min. sampling interval) that include navigational (position, course, heading, and speed), meteorological (air temperature, humidity, wind, surface pressure, radiation, rainfall), and oceanographic (surface sea temperature and salinity) samples. Vessels recruited to the SAMOS initiative collect a high concentration of data within the U.S. continental shelf and also frequently operate well outside routine shipping lanes, capturing observations in extreme ocean environments (Southern, Arctic, South Atlantic, and South Pacific oceans). These observations are atypical for their spatial and temporal sampling, making them very useful for many applications including validation of numerical models and satellite retrievals, as well as local assessments of natural variability. Individual SAMOS observations undergo routine automated quality control and select vessels receive detailed visual data quality inspection. The result is a quality-flagged data set that is ideal for calculating turbulent flux estimates. 
We will describe the bulk flux algorithms that have been applied to the observations and the choices of constants that are used. Analysis of the preliminary SAMOS flux products will be presented, including spatial and temporal coverage for each derived parameter. The unique quality and sampling locations of research vessel observations, and their independence from many models and products, make them ideal for validation studies. The strengths and limitations of research observations for flux validation studies will be discussed. The authors welcome a discussion with the flux community regarding expansion of the SAMOS program to include additional international vessels, thus facilitating an expansion of this research vessel-based flux product.
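The structure of bulk aerodynamic flux algorithms of the kind applied to the SAMOS observations can be sketched as below. The constant transfer coefficients are placeholders (operational schemes such as COARE vary them with atmospheric stability and wind speed), so this is illustrative only.

```python
# Illustrative bulk aerodynamic formulas for turbulent air-sea fluxes.
RHO_AIR = 1.2    # kg/m^3, near-surface air density
CP_AIR = 1004.0  # J/(kg K), specific heat of air
LV = 2.5e6       # J/kg, latent heat of vaporization

def bulk_fluxes(u, t_sea, t_air, q_sea, q_air,
                cd=1.2e-3, ch=1.1e-3, ce=1.1e-3):
    """Return (wind stress N/m^2, sensible heat W/m^2, latent heat W/m^2)
    from wind speed u (m/s), temperatures (K), and specific humidities
    (kg/kg). cd/ch/ce are constant placeholder transfer coefficients."""
    tau = RHO_AIR * cd * u ** 2
    sensible = RHO_AIR * CP_AIR * ch * u * (t_sea - t_air)
    latent = RHO_AIR * LV * ce * u * (q_sea - q_air)
    return tau, sensible, latent

# 8 m/s wind, sea 1 K warmer than air, modest humidity gradient
tau, h, le = bulk_fluxes(u=8.0, t_sea=288.0, t_air=287.0,
                         q_sea=0.011, q_air=0.009)
```

The 1-minute SAMOS sampling is what makes such estimates feasible along individual cruise tracks; the quality flags mentioned above determine which samples enter the calculation.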
NASA Astrophysics Data System (ADS)
Bailey, Monika E.; Isaac, George A.; Gultepe, Ismail; Heckman, Ivan; Reid, Janti
2014-01-01
An automated short-range forecasting system, adaptive blending of observations and model (ABOM), was tested in real time during the 2010 Vancouver Olympic and Paralympic Winter Games in British Columbia. Data at 1-min time resolution were available from a newly established, dense network of surface observation stations. Climatological data were not available at these new stations. This, combined with output from new high-resolution numerical models, provided a unique and exciting setting to test nowcasting systems in mountainous terrain during winter weather conditions. The ABOM method blends extrapolations in time of recent local observations with numerical weather prediction (NWP) model forecasts to generate short-range point forecasts of surface variables out to 6 h. The relative weights of the model forecast and the observation extrapolation are based on performance over recent history. The average performance of ABOM nowcasts during February and March 2010 was evaluated using standard scores and thresholds important for Olympic events. Significant improvements over the model forecasts alone were obtained for continuous variables such as temperature, relative humidity, and wind speed. The small improvements to forecasts of variables such as visibility and ceiling, which are subject to discontinuous changes, are attributed to the persistence component of ABOM.
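The ABOM weighting idea, blending by recent performance, can be sketched as an inverse-mean-squared-error weighted average of the observation extrapolation and the model forecast. The exact operational weighting and trend-extrapolation scheme may differ; this only illustrates the principle.

```python
# Sketch of performance-weighted blending: each source is weighted by the
# inverse of its mean squared error over recent history.

def inverse_mse_weights(obs_errors, model_errors):
    """Normalized weights proportional to 1/MSE of each source."""
    mse_obs = sum(e * e for e in obs_errors) / len(obs_errors)
    mse_mod = sum(e * e for e in model_errors) / len(model_errors)
    w_obs, w_mod = 1.0 / mse_obs, 1.0 / mse_mod
    total = w_obs + w_mod
    return w_obs / total, w_mod / total

def blend(obs_extrapolation, model_forecast, w_obs, w_mod):
    """Weighted point nowcast from the two sources."""
    return w_obs * obs_extrapolation + w_mod * model_forecast

# Persistence has been twice as accurate recently (RMSE 0.5 K vs 1.0 K),
# so its inverse-MSE weight is four times the model's.
w_obs, w_mod = inverse_mse_weights([0.5, -0.5], [1.0, -1.0])
nowcast = blend(obs_extrapolation=2.0, model_forecast=3.0,
                w_obs=w_obs, w_mod=w_mod)
```

As lead time grows, the extrapolation's recent-history errors grow and its weight decays toward the model, which matches the intent of blending toward NWP at longer ranges.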
Automated CAD design for sculptured airfoil surfaces
NASA Astrophysics Data System (ADS)
Murphy, S. D.; Yeagley, S. R.
1990-11-01
The design of tightly tolerated sculptured surfaces such as those for airfoils requires a significant design effort in order to machine the tools to create these surfaces. Because of the quantity of numerical data required to describe the airfoil surfaces, a CAD approach is required. Although this approach will result in productivity gains, much larger gains can be achieved by automating the design process. This paper discusses an application which resulted in an eightfold improvement in productivity by automating the design process on the CAD system.
Buscombe, Daniel D.; Rubin, David M.
2012-01-01
In this, the second of a pair of papers on the structure of well-sorted natural granular material (sediment), new methods are described for automated measurements from images of sediment of: 1) particle-size standard deviation (arithmetic sorting) with and without apparent void fraction; and 2) mean particle size in material with void fraction. A variety of simulations of granular material are used for testing purposes, in addition to images of natural sediment. Simulations are also used to establish that the effects on automated particle sizing of grains visible through the interstices of the grains at the very surface of a granular material continue to a depth of approximately 4 grain diameters and that this is independent of mean particle size. Ensemble root-mean squared error between observed and estimated arithmetic sorting coefficients for 262 images of natural silts, sands and gravels (drawn from 8 populations) is 31%, which reduces to 27% if adjusted for bias (slope correction between observed and estimated values). These methods allow non-intrusive and fully automated measurements of surfaces of unconsolidated granular material. With no tunable parameters or empirically derived coefficients, they should be broadly universal in appropriate applications. However, empirical corrections may need to be applied for the most accurate results. Finally, analytical formulas are derived for the one-step pore-particle transition probability matrix, estimated from the image's autocorrelogram, from which void fraction of a section of granular material can be estimated directly. This model gives excellent predictions of bulk void fraction yet imperfect predictions of pore-particle transitions.
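The pore-particle transition idea can be sketched by treating a 1-D binary transect as a two-state Markov chain. The paper estimates the transition matrix analytically from the image's autocorrelogram; the direct pair-counting below is a stand-in used only to illustrate how void fraction falls out of the matrix.

```python
# Two-state Markov sketch: 0 = pore, 1 = particle along a 1-D transect.

def transition_matrix(seq):
    """One-step transition probabilities estimated by pair counting."""
    counts = [[0, 0], [0, 0]]
    for a, b in zip(seq, seq[1:]):
        counts[a][b] += 1
    return [[c / sum(row) if sum(row) else 0.0 for c in row]
            for row in counts]

def void_fraction(p):
    """Stationary pore probability of the two-state chain:
    pi_0 = P(1->0) / (P(0->1) + P(1->0))."""
    return p[1][0] / (p[0][1] + p[1][0])

seq = [0, 0, 1, 1, 1, 0, 1, 1, 0, 0]
p = transition_matrix(seq)        # [[0.5, 0.5], [0.4, 0.6]]
phi = void_fraction(p)            # stationary estimate of void fraction
```

The stationary-distribution estimate (4/9 here) is close to the empirical pore fraction of the transect (0.4), illustrating why the transition matrix suffices to recover bulk void fraction.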
Toward Automated Intraocular Laser Surgery Using a Handheld Micromanipulator
Yang, Sungwook; MacLachlan, Robert A.; Riviere, Cameron N.
2014-01-01
This paper presents a technique for automated intraocular laser surgery using a handheld micromanipulator known as Micron. The novel handheld manipulator enables automated scanning of a laser probe within a cylinder 4 mm long and 4 mm in diameter. For the automation, the surface of the retina is reconstructed using a stereomicroscope, and preplanned targets are then placed on the surface. The laser probe is precisely located on each target via visual servoing of the aiming beam, while maintaining a specific distance above the surface. In addition, the system is capable of tracking the surface of the eye in order to compensate for any eye movement introduced during the operation. We compared the performance of the automated scanning using various control thresholds in order to find the most effective threshold in terms of accuracy and speed. Given the selected threshold, we conducted handheld operation above a fixed target surface. The average error and execution time are reduced by 63.6% and 28.5%, respectively, compared to the unaided trials. Finally, automated laser photocoagulation was also demonstrated in an eye phantom, including compensation for eye movement. PMID:25893135
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryan Curtis; Margaret Torn
Data generated from an observational platform (Tram) consisting of 68 meters of elevated track 1 to 1.5 meters above the surface and an automated cart carrying a suite of radiation and remote sensing instruments (see below table). The Tram is in the footprint of NGEE Arctic/AmeriFlux tower at the Barrow Environmental Observatory, Barrow, Alaska.
Ground-based automated radiometric calibration system in Baotou site, China
NASA Astrophysics Data System (ADS)
Wang, Ning; Li, Chuanrong; Ma, Lingling; Liu, Yaokai; Meng, Fanrong; Zhao, Yongguang; Pang, Bo; Qian, Yonggang; Li, Wei; Tang, Lingli; Wang, Dongjin
2017-10-01
Post-launch vicarious calibration is an important method that not only can be used to evaluate onboard calibrators but also allows for traceable knowledge of absolute accuracy, although it has the drawback of low-frequency data collection owing to the expense of personnel and equipment. To overcome these problems, the CEOS Working Group on Calibration and Validation (WGCV) Infrared Visible Optical Sensors (IVOS) subgroup has proposed the Automated Radiative Calibration Network (RadCalNet) project. The Baotou site is one of the four demonstration sites of RadCalNet. A distinguishing characteristic of the Baotou site is its combination of various natural scenes and artificial targets. At each artificial target and the desert site, an automated spectrum measurement instrument has been developed to obtain the surface-reflected radiance spectra every 2 minutes with a spectral resolution of 2 nm. The aerosol optical thickness and column water vapour content are measured by an automatic sun photometer. To meet the requirements of RadCalNet, a surface reflectance spectrum retrieval method is used to generate the standard input files, with the support of surface and atmospheric measurements. The top-of-atmosphere reflectance spectra are then derived from the input files. The results for the demonstration satellites, including Landsat 8 and Sentinel-2A, show good agreement between observed and calculated values.
SciBox, an end-to-end automated science planning and commanding system
NASA Astrophysics Data System (ADS)
Choo, Teck H.; Murchie, Scott L.; Bedini, Peter D.; Steele, R. Josh; Skura, Joseph P.; Nguyen, Lillian; Nair, Hari; Lucks, Michael; Berman, Alice F.; McGovern, James A.; Turner, F. Scott
2014-01-01
SciBox is a new technology for planning and commanding science operations for Earth-orbital and planetary space missions. It has been incrementally developed since 2001 and demonstrated on several spaceflight projects. The technology has matured to the point that it is now being used to plan and command all orbital science operations for the MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) mission to Mercury. SciBox encompasses the derivation of observing sequences from science objectives, the scheduling of those sequences, the generation of spacecraft and instrument commands, and the validation of those commands prior to uploading to the spacecraft. Although the process is automated, science and observing requirements are incorporated at each step by a series of rules and parameters to optimize observing opportunities, which are tested and validated through simulation and review. Except for limited special operations and tests, there is no manual scheduling of observations or construction of command sequences. SciBox reduces the lead time for operations planning by shortening the time-consuming coordination process, reduces cost by automating the labor-intensive processes of human-in-the-loop adjudication of observing priorities, reduces operations risk by systematically checking constraints, and maximizes science return by fully evaluating the trade space of observing opportunities to meet MESSENGER science priorities within spacecraft recorder, downlink, scheduling, and orbital-geometry constraints.
Airport Surface Traffic Automation Study.
1988-05-09
the use of Artificial Intelligence * technology in enroute ATC can be applied directly to the surface control problem. 7.6 Development Approach The next...problems in airport surface control. If artificial intelligence provides useful results for airborne automation, the same techniques should prove useful
Automated detection of solar eruptions
NASA Astrophysics Data System (ADS)
Hurlburt, N.
2015-12-01
Observation of the solar atmosphere reveals a wide range of motions, from small-scale jets and spicules to global-scale coronal mass ejections (CMEs). Identifying and characterizing these motions are essential to advancing our understanding of the drivers of space weather. Both automated and visual identifications are currently used in identifying CMEs. To date, eruptions near the solar surface, which may be precursors to CMEs, have been identified primarily by visual inspection. Here we report on Eruption Patrol (EP): a software module that is designed to automatically identify eruptions from data collected by the Atmospheric Imaging Assembly on the Solar Dynamics Observatory (SDO/AIA). We describe the method underlying the module and compare its results to previous identifications found in the Heliophysics Event Knowledgebase. EP identifies eruption events that are consistent with those found by human annotations, but in a significantly more consistent and quantitative manner. Eruptions are found to be distributed within 15 Mm of the solar surface. They possess peak speeds ranging from 4 to 100 km/s and display a power-law probability distribution over that range. These characteristics are consistent with previous observations of prominences.
A connectionist-geostatistical approach for classification of deformation types in ice surfaces
NASA Astrophysics Data System (ADS)
Goetz-Weiss, L. R.; Herzfeld, U. C.; Hale, R. G.; Hunke, E. C.; Bobeck, J.
2014-12-01
Deformation is a class of highly non-linear geophysical processes from which one can infer other geophysical variables in a dynamical system. For example, in an ice-dynamic model, deformation is related to velocity, basal sliding, surface elevation changes, and the stress field at the surface as well as internal to a glacier. While many of these variables cannot be observed, deformation state can be an observable variable, because deformation in glaciers (once a viscosity threshold is exceeded) manifests itself in crevasses. Given the amount of information that can be inferred from observing surface deformation, an automated method for classifying surface imagery becomes increasingly desirable. In this paper a neural network is used to recognize classes of crevasse types over the Bering Bagley Glacier System (BBGS) during a surge (2011-2013-?). A surge is a spatially and temporally highly variable and rapid acceleration of a glacier. Therefore, many different crevasse types occur in a short time frame and in close proximity, and these crevasse fields hold information on the geophysical processes of the surge. The connectionist-geostatistical approach uses directional experimental (discrete) variograms to parameterize images into a form that the neural network can recognize. Recognizing that each surge wave results in different crevasse types and that environmental conditions affect their appearance in imagery, we have developed semi-automated pre-training software to adapt the neural network to changing conditions. The method is applied to airborne and satellite imagery to classify surge crevasses from the BBGS surge. This method works well for classifying spatially repetitive images such as the crevasses over Bering Glacier. We expand the network to less repetitive images in order to analyze imagery collected over Arctic sea ice and to assess the percentage of deformed ice for model calibration.
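The directional experimental variogram used to parameterize imagery can be sketched in a few lines of NumPy. This is a minimal illustration of the general formula gamma(h) = 0.5 * mean[(z(x+h) - z(x))^2] along one pixel direction; the striped test image is synthetic, not glacier data.

```python
import numpy as np

def directional_variogram(img, lags, direction=(0, 1)):
    """Experimental semivariogram gamma(h) = 0.5 * mean[(z(x+h) - z(x))^2]
    along one pixel direction (dy, dx), with dy, dx >= 0."""
    dy, dx = direction
    gamma = []
    for h in lags:
        zh = img[dy * h:, dx * h:]              # values shifted by lag h
        z0 = img[: zh.shape[0], : zh.shape[1]]  # matching unshifted values
        gamma.append(0.5 * np.mean((zh - z0) ** 2))
    return np.array(gamma)

# striped test pattern: full variance at lag 1 across the stripes, none at lag 2
stripes = np.tile(np.array([0.0, 1.0]), (8, 8))
gamma = directional_variogram(stripes, lags=[1, 2], direction=(0, 1))
```

Variograms computed along several directions give a compact, orientation-sensitive signature of a crevasse texture that a neural network can take as input.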
Stereo imaging with spaceborne radars
NASA Technical Reports Server (NTRS)
Leberl, F.; Kobrick, M.
1983-01-01
Stereo viewing is a valuable tool in photointerpretation and is used for the quantitative reconstruction of the three dimensional shape of a topographical surface. Stereo viewing refers to a visual perception of space by presenting an overlapping image pair to an observer so that a three dimensional model is formed in the brain. Some of the observer's function is performed by machine correlation of the overlapping images - so-called automated stereo correlation. The direct perception of space with two eyes is often called natural binocular vision; techniques of generating three dimensional models of the surface from two sets of monocular image measurements are the topic of stereology.
Automated position control of a surface array relative to a liquid microjunction surface sampler
Van Berkel, Gary J.; Kertesz, Vilmos; Ford, Michael James
2007-11-13
A system and method utilizes an image analysis approach for controlling the probe-to-surface distance of a liquid junction-based surface sampling system for use with mass spectrometric detection. Such an approach enables hands-free formation of the liquid microjunction used to sample solution composition from the surface and, as necessary, re-optimization of the microjunction thickness during a surface scan to achieve a fully automated surface sampling system.
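The re-optimization loop described above is, in essence, closed-loop feedback: the image analysis measures the microjunction thickness and the probe height is adjusted to hold it at a set point. A one-step proportional-control sketch is shown below; the function name, units, and gain are invented for illustration and are not from the patent.

```python
def adjust_probe_height(z_now_um, thickness_um, setpoint_um, gain=0.5):
    """One proportional-control step: move the probe to drive the measured
    microjunction thickness toward its set point (names and units invented)."""
    error = setpoint_um - thickness_um
    return z_now_um + gain * error

# measured thickness 40 um, set point 50 um -> probe moves 5 um to compensate
new_z = adjust_probe_height(100.0, 40.0, 50.0)
```

In a real system this step would run once per image-analysis cycle during the surface scan.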
NASA Astrophysics Data System (ADS)
Gilson, Gaëlle F.; Jiskoot, Hester; Cassano, John J.; Gultepe, Ismail; James, Timothy D.
2018-05-01
An automated method to classify Arctic fog into distinct thermodynamic profiles using historic in-situ surface and upper-air observations is presented. This classification is applied to low-resolution Integrated Global Radiosonde Archive (IGRA) soundings and high-resolution Arctic Summer Cloud Ocean Study (ASCOS) soundings in low- and high-Arctic coastal and pack-ice environments. Results allow investigation of fog macrophysical properties and processes in coastal East Greenland during melt seasons 1980-2012. Integrated with fog observations from three synoptic weather stations, 422 IGRA soundings are classified into six fog thermodynamic types based on surface saturation ratio, type of temperature inversion, fog-top height relative to inversion-base height, and stability using the virtual potential temperature gradient. Between 65% and 80% of fog observations occur with a low-level inversion, and statically neutral or unstable surface layers occur frequently. Thermodynamic classification is sensitive to the assigned dew-point depression threshold, but categorization is robust. Despite differences in the vertical resolution of radiosonde observations, IGRA and ASCOS soundings yield the same six fog classes, with fog-class distribution varying with latitude and environmental conditions. High-Arctic fog frequently resides within an elevated inversion layer, whereas low-Arctic fog is more often restricted to the mixed layer. Using supplementary time-lapse images, ASCOS microwave radiometer retrievals and airmass back-trajectories, we hypothesize that the thermodynamic classes represent different stages of advection fog formation, development, and dissipation, including stratus-base lowering and fog lifting. This automated extraction of thermodynamic boundary-layer and inversion structure can be applied to radiosonde observations worldwide to better evaluate fog conditions that affect transportation and lead to improvements in numerical models.
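A classification of this kind reduces to a small set of decision rules over sounding-derived quantities. The sketch below is only in the spirit of the scheme: the thresholds and class names are illustrative assumptions, not the study's actual six classes.

```python
def classify_fog_profile(saturation_ratio, inversion_base_m, fog_top_m,
                         dthetav_dz):
    """Toy decision rules over sounding-derived quantities; the thresholds
    and class names are illustrative assumptions, not the paper's scheme."""
    if saturation_ratio < 0.95:              # surface layer not near saturation
        return "no surface fog"
    if inversion_base_m is None:             # no low-level inversion detected
        return "fog without low-level inversion"
    if fog_top_m > inversion_base_m:         # fog penetrates the inversion
        return "fog extending into the inversion layer"
    if dthetav_dz > 0.0:                     # statically stable surface layer
        return "stable fog below the inversion"
    return "well-mixed fog below the inversion"
```

Each input (surface saturation ratio, inversion-base height, fog-top height, virtual potential temperature gradient) corresponds to one of the classification criteria named in the abstract.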
NASA Astrophysics Data System (ADS)
Smith, S. R.; Rolph, J.; Briggs, K.; Elya, J. L.; Bourassa, M. A.
2016-02-01
The authors will describe the successes and lessons learned from the Shipboard Automated Meteorological and Oceanographic System (SAMOS) initiative. Over the past decade, SAMOS has acquired, quality controlled, and distributed underway surface meteorological and oceanographic observations from nearly 40 oceanographic research vessels. Research vessels provide underway observations at high-temporal frequency (1-minute sampling interval) that include navigational (position, course, heading, and speed), meteorological (air temperature, humidity, wind, surface pressure, radiation, rainfall), and oceanographic (surface sea temperature and salinity) samples. Vessels recruited to the SAMOS initiative collect a high concentration of data within the U.S. continental shelf, around Hawaii and the islands of the tropical Pacific, and frequently operate well outside routine shipping lanes, capturing observations in extreme ocean environments (Southern, Arctic, South Atlantic, and South Pacific oceans) desired by the air-sea exchange, modeling, and satellite remote sensing communities. The presentation will highlight the data stewardship practices of the SAMOS initiative. Activities include routine automated and visual data quality evaluation, feedback to vessel technicians and operators regarding instrumentation errors, best practices for instrument siting and exposure on research vessels, and professional development activities for research vessel technicians. Best practices for data, metadata, and quality evaluation will be presented. We will discuss ongoing efforts to expand data services to enhance interoperability between marine data centers. Data access and archival protocols will also be presented, including how these data may be referenced and accessed via NCEI.
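Automated quality evaluation of 1-minute underway data typically begins with plausible-range checks per variable. A minimal sketch follows; the limits and variable names are illustrative assumptions, not SAMOS's actual QC tables.

```python
# hypothetical plausible-range limits (illustrative; not SAMOS's actual limits)
LIMITS = {
    "air_temp_C":    (-40.0, 50.0),
    "pressure_hPa":  (870.0, 1085.0),
    "wind_speed_ms": (0.0, 75.0),
    "sst_C":         (-2.0, 40.0),
}

def qc_flags(record):
    """Flag each known variable 'pass' if inside its plausible range,
    otherwise 'range_fail'."""
    return {name: ("pass" if lo <= record[name] <= hi else "range_fail")
            for name, (lo, hi) in LIMITS.items() if name in record}

# one 1-minute sample with an implausible surface pressure
flags = qc_flags({"air_temp_C": 21.3, "pressure_hPa": 560.0, "sst_C": 18.2})
```

In practice such automated flags are combined with visual inspection and feedback to vessel technicians, as the abstract describes.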
The Surface Ocean CO2 Atlas: Stewarding Underway Carbon Data from Collection to Archival
NASA Astrophysics Data System (ADS)
O'Brien, K.; Smith, K. M.; Pfeil, B.; Landa, C.; Bakker, D. C. E.; Olsen, A.; Jones, S.; Shrestha, B.; Kozyr, A.; Manke, A. B.; Schweitzer, R.; Burger, E. F.
2016-02-01
The Surface Ocean CO2 Atlas (SOCAT, www.socat.info) is a quality-controlled, global surface ocean carbon dioxide (CO2) data set gathered from research vessels, SOOP, and buoys. To the degree feasible, SOCAT is comprehensive; it draws together and applies uniform QC procedures to all such observations made across the international community. The first version of SOCAT (version 1.5) was publicly released in September 2011 (Bakker et al., 2011) with 6.3 million observations. This was followed by the release of SOCAT version 2, expanded to over 10 million observations, in June 2013 (Bakker et al., 2013). Most recently, in September 2015, SOCAT version 3 was released, containing over 14 million observations spanning almost 60 years. The process of assembling, quality controlling, and publishing versions 1.5 and 2 of SOCAT required an unsustainable level of manual effort. To ease the burden on data managers and data providers, the SOCAT community agreed to embark on an automated data ingestion process that would create a streamlined workflow to improve data stewardship from ingestion to quality control and from publishing to archival. To that end, for version 3 and beyond, the SOCAT automation team created a framework that is based upon standards and conventions yet allows scientists to work in the data formats they feel most comfortable with (i.e., CSV files). This automated workflow provides several advantages: 1) data ingestion into uniform, standards-based file formats; 2) ease of data integration into a standard quality control system; 3) data ingestion and quality control that can be performed in parallel; and 4) a uniform method of archiving carbon data and generating digital object identifiers (DOIs). In this presentation, we will discuss and demonstrate the SOCAT data ingestion dashboard and the quality control system.
We will also discuss the standards, conventions, and tools that were leveraged to create a workflow that allows scientists to work in their own formats, yet provides a framework for creating high quality data products on an annual basis, while meeting or exceeding data requirements for access, documentation and archival.
Automated Inspection And Precise Grinding Of Gears
NASA Technical Reports Server (NTRS)
Frint, Harold; Glasow, Warren
1995-01-01
Method of precise grinding of spiral bevel gears involves automated inspection of gear-tooth surfaces followed by adjustments of machine-tool settings to minimize differences between actual and nominal surfaces. Similar to method described in "Computerized Inspection of Gear-Tooth Surfaces" (LEW-15736). Yields gears of higher quality, with significant reduction in manufacturing and inspection time.
NASA Astrophysics Data System (ADS)
Zakharov, A. V.
1988-07-01
Aspects of the Soviet mission to Phobos are examined, including the objectives of the mission, the spacecraft, experiments, and landers. Past Mars research and unanswered questions concerning Mars and its satellites are discussed. The spacecraft is expected to reach Mars in early 1989 and to observe the planet from two orbits, coming as close as 500 km from the surface, before moving into a third path close to Phobos. After studying the Phobos terrain from above, the craft will jettison one or two small long-duration automated landers, which will perform surface experiments, including work on celestial mechanics, the history of the Phobos orbit, surface composition, and mechanical properties. In addition to studying Phobos and Mars, the craft will examine the interplanetary medium, make observations of the Sun, and possibly study Deimos.
Automated Telerobotic Inspection Of Surfaces
NASA Technical Reports Server (NTRS)
Balaram, J.; Prasad, K. Venkatesh
1996-01-01
Method of automated telerobotic inspection of surfaces undergoing development. Apparatus implementing method includes video camera that scans over surfaces to be inspected, in manner of mine detector. Images of surfaces compared with reference images to detect flaws. Developed for inspecting external structures of Space Station Freedom for damage from micrometeorites and debris from prior artificial satellites. On Earth, applied to inspection for damage, missing parts, contamination, and/or corrosion on interior surfaces of pipes or exterior surfaces of bridges, towers, aircraft, and ships.
Precision Departure Release Capability (PDRC) Overview and Results: NASA to FAA Research Transition
NASA Technical Reports Server (NTRS)
Engelland, Shawn; Davis, Tom.
2013-01-01
NASA researchers developed the Precision Departure Release Capability (PDRC) concept to improve the tactical departure scheduling process. The PDRC system comprises: 1) a surface automation system that computes ready time predictions and departure runway assignments, 2) an en route scheduling automation tool that uses this information to estimate ascent trajectories to the merge point and computes release times, and 3) an interface that provides two-way communication between the two systems. To minimize technology transfer issues and facilitate its adoption by TMCs and Frontline Managers (FLM), NASA developed the PDRC prototype using the Surface Decision Support System (SDSS) as the tower surface automation tool, a research version of the FAA TMA (RTMA) as the en route automation tool, and a digital interface between the two DSTs to facilitate coordination.
NASA Astrophysics Data System (ADS)
Yang, Jian; He, Yuhong
2017-02-01
Quantifying impervious surfaces in urban and suburban areas is a key step toward a sustainable urban planning and management strategy. With the availability of fine-scale remote sensing imagery, automated mapping of impervious surfaces has attracted growing attention. However, the vast majority of existing studies have selected pixel-based and object-based methods for impervious surface mapping, with few adopting sub-pixel analysis of high spatial resolution imagery. This research makes use of a vegetation-bright impervious-dark impervious linear spectral mixture model to characterize urban and suburban surface components. A WorldView-3 image acquired on May 9th, 2015 is analyzed for its potential in automated unmixing of meaningful surface materials for two urban subsets and one suburban subset in Toronto, ON, Canada. Given the wide distribution of shadows in urban areas, the linear spectral unmixing is implemented in non-shadowed and shadowed areas separately for the two urban subsets. The results indicate that the accuracy of impervious surface mapping in suburban areas reaches up to 86.99%, much higher than the accuracies in urban areas (80.03% and 79.67%). Despite its merits in mapping accuracy and automation, the application of our proposed vegetation-bright impervious-dark impervious model to map impervious surfaces is limited due to the absence of a soil component. To further extend the operational transferability of our proposed method, especially for areas where plenty of bare soils exist during urbanization or reclamation, it is still of great necessity to mask out bare soils by automated classification prior to the implementation of linear spectral unmixing.
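Linear spectral unmixing solves each pixel's reflectance as a fractional combination of endmember spectra. A minimal least-squares sketch with a sum-to-one constraint is shown below; the three-endmember structure mirrors the vegetation-bright impervious-dark impervious model, but the endmember spectra and the mixed pixel are invented for illustration.

```python
import numpy as np

# hypothetical endmember spectra (rows: 4 bands; columns: vegetation,
# bright impervious, dark impervious) -- the numbers are invented
E = np.array([[0.04, 0.30, 0.06],
              [0.08, 0.35, 0.08],
              [0.45, 0.40, 0.10],
              [0.25, 0.45, 0.12]])

def unmix(pixel, endmembers, w=1000.0):
    """Linear spectral unmixing by least squares, with the sum-to-one
    constraint appended as a heavily weighted extra equation."""
    A = np.vstack([endmembers, w * np.ones(endmembers.shape[1])])
    b = np.append(pixel, w)
    fractions, *_ = np.linalg.lstsq(A, b, rcond=None)
    return fractions

# a synthetic mixed pixel: 60% vegetation, 30% bright, 10% dark impervious
pixel = E @ np.array([0.6, 0.3, 0.1])
fractions = unmix(pixel, E)
```

A production implementation would also enforce non-negativity of the fractions; the weighted-equation trick here only enforces that they sum to one.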
Application of a simple cerebellar model to geologic surface mapping
Hagens, A.; Doveton, J.H.
1991-01-01
Neurophysiological research into the structure and function of the cerebellum has inspired computational models that simulate information processing associated with coordination and motor movement. The cerebellar model arithmetic computer (CMAC) has a design structure which makes it readily applicable as an automated mapping device that "senses" a surface, based on a sample of discrete observations of surface elevation. The model operates as an iterative learning process, where cell weights are continuously modified by feedback to improve surface representation. The storage requirements are substantially less than those of a conventional memory allocation, and the model is extended easily to mapping in multidimensional space, where the memory savings are even greater. ?? 1991.
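The CMAC's core mechanism is easy to sketch: overlapping tilings map an input coordinate to a handful of weight cells, the prediction is the mean of the active weights, and feedback nudges those weights toward each observed elevation. The code below is a minimal one-dimensional illustration under that description, not Hagens and Doveton's implementation.

```python
import numpy as np

class CMAC:
    """Minimal 1-D CMAC sketch: several offset tilings map an input to one
    weight cell each; the prediction is the mean of the active weights, and
    training nudges them toward the observed elevation (LMS-style)."""
    def __init__(self, n_tilings=8, tile_width=1.0, n_cells=512, lr=0.2):
        self.n_tilings, self.tile_width, self.lr = n_tilings, tile_width, lr
        self.weights = np.zeros((n_tilings, n_cells))
        self.n_cells = n_cells

    def _cells(self, x):
        # each tiling is shifted by a fraction of the tile width
        return [int((x + t * self.tile_width / self.n_tilings)
                    // self.tile_width) % self.n_cells
                for t in range(self.n_tilings)]

    def predict(self, x):
        return float(np.mean([self.weights[t, c]
                              for t, c in enumerate(self._cells(x))]))

    def train(self, x, z):
        err = z - self.predict(x)          # feedback improves the surface fit
        for t, c in enumerate(self._cells(x)):
            self.weights[t, c] += self.lr * err

# learn a toy elevation profile z(x) = x from scattered observations
model = CMAC()
xs = np.linspace(0.0, 5.0, 21)
for _ in range(200):
    for x in xs:
        model.train(x, x)
```

The hashing of inputs into a fixed cell table is what gives the CMAC its modest storage requirement relative to a dense grid, and the scheme extends to multidimensional inputs by tiling each dimension.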
NASA Astrophysics Data System (ADS)
Chisholm, Bret J.; Webster, Dean C.; Bennett, James C.; Berry, Missy; Christianson, David; Kim, Jongsoo; Mayo, Bret; Gubbins, Nathan
2007-07-01
An automated, high-throughput adhesion workflow that enables pseudobarnacle adhesion and coating/substrate adhesion to be measured on coating patches arranged in an array format on 4 × 8 in.² panels was developed. The adhesion workflow consists of the following process steps: (1) application of an adhesive to the coating array; (2) insertion of panels into a clamping device; (3) insertion of aluminum studs into the clamping device and onto coating surfaces, aligned with the adhesive; (4) curing of the adhesive; and (5) automated removal of the aluminum studs. Validation experiments comparing data generated using the automated, high-throughput workflow to data obtained using conventional, manual methods showed that the automated system allows for accurate ranking of relative coating adhesion performance.
Verdaguer, Paula; Gris, Oscar; Casaroli-Marano, Ricardo P; Elies, Daniel; Muñoz-Gutierrez, Gerardo; Güell, Jose L
2015-08-01
To describe a case of hydrophilic intraocular lens (IOL) opacification based on IOL analysis after Descemet stripping automated endothelial keratoplasty. A 60-year-old woman underwent uneventful phacoemulsification with implantation of a hydrophilic IOL (Akreos-Adapt; Bausch & Lomb) in both eyes. Because of postoperative corneal decompensation in the right eye, 2 Descemet stripping automated endothelial keratoplasty operations were performed within 1 year. After the second procedure, the graft was not well attached, requiring an intracameral injection of air on day 3. After 1 year, opacification was observed on the superior 2/3 of the anterior surface of the IOL, along with a significant decrease in visual acuity. The IOL was explanted 6 months after the opacification. Environmental scanning electron microscopy followed by x-ray microanalysis revealed an organic biofilm on the surface of the IOL. To our knowledge, this is the first reported case in which the material deposited on the lens is organic rather than calcific.
U.S. National / Naval Ice Center (NIC) Support to Naval and Maritime Operations
2011-06-20
States and Canadian governments. • International Arctic Buoy Programme (IABP): global participants working together to maintain a network of... • Modeling, surface observations, satellite, air reconnaissance, data fusion, derived data, automation, direct data dissemination. TODAY'S CHALLENGES: ...and AUVs • Improve modeling and forecasting capabilities (OTSR/WEAX) • More trained ice analysts, ice pilots, and Arctic marine weather forecasters
Characteristics of lightning and wildland fire ignition in the Pacific Northwest.
Miriam L. Rorig; Sue A. Ferguson
1999-01-01
Lightning is the primary cause of fire in the forested regions of the Pacific Northwest, especially when it occurs without significant precipitation at the surface. Using thunderstorm occurrence and precipitation observations for the period 1948–77, along with automated lightning strike data for the period 1986–96, it was possible to classify convective days as either...
Contamination analyses of technology mirror assembly optical surfaces
NASA Technical Reports Server (NTRS)
Germani, Mark S.
1991-01-01
Automated electron microprobe analyses were performed on tape lift samples from the Technology Mirror Assembly (TMA) optical surfaces. Details of the analyses are given, and the contamination of the mirror surfaces is discussed. Based on the automated analyses of the tape lifts from the TMA surfaces and the control blank, we can conclude that the particles identified on the actual samples were not a result of contamination due to the handling or sampling process itself and that the particles reflect the actual contamination on the surface of the mirror.
History of surface weather observations in the United States
NASA Astrophysics Data System (ADS)
Fiebrich, Christopher A.
2009-04-01
In this paper, the history of surface weather observations in the United States is reviewed. Local weather observations were first documented in the 17th century along the East Coast. For many years, the progression of a weather observation from an initial reading to dissemination remained a slow and laborious process. The number of observers remained small and unorganized until agencies including the Surgeon General, Army, and General Land Office began to request regular observations at satellite locations in the 1800s. The Smithsonian was responsible for first organizing a large "network" of volunteer weather observers across the nation. These observers became the foundation for today's Cooperative Observer network. As applications of weather data continued to grow and users required the data with an ever-decreasing latency, automated weather networks saw rapid growth in the latter part of the 20th century. Today, the number of weather observations across the U.S. totals in the tens of thousands, due largely to privately-owned weather networks and amateur weather observers who submit observations over the internet.
NASA Astrophysics Data System (ADS)
Hopp, T.; Zapf, M.; Ruiter, N. V.
2014-03-01
An essential processing step for comparison of Ultrasound Computer Tomography images to other modalities, as well as for use in further image processing, is to segment the breast from the background. In this work we present a (semi-)automated 3D segmentation method which is based on the detection of the breast boundary in coronal slice images and a subsequent surface fitting. The method was evaluated using a software phantom and in-vivo data. The fully automatically processed phantom results showed that a segmentation of approx. 10% of the slices of a dataset is sufficient to recover the overall breast shape. Application to 16 in-vivo datasets was performed successfully using semi-automated processing, i.e. using a graphical user interface for manual corrections of the automated breast boundary detection. The processing time for the segmentation of an in-vivo dataset could be reduced significantly, by a factor of four, compared to a fully manual segmentation. Comparison to manually segmented images identified a smoother surface for the semi-automated segmentation, with an average of 11% differing voxels and an average surface deviation of 2 mm. Limitations of the edge detection may be overcome by future updates of the KIT USCT system, allowing fully automated use of our segmentation approach.
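The "percentage of differing voxels" comparison used above is a simple voxel-wise disagreement between two binary masks. A minimal sketch with invented toy volumes:

```python
import numpy as np

def voxel_disagreement_percent(mask_a, mask_b):
    """Percentage of voxels labelled differently by two binary segmentations."""
    return 100.0 * float(np.mean(mask_a != mask_b))

# toy volumes: the second segmentation is one slice thicker along one axis
a = np.zeros((10, 10, 10), dtype=bool); a[2:8, 2:8, 2:8] = True
b = np.zeros_like(a);                   b[2:8, 2:8, 2:9] = True
disagreement = voxel_disagreement_percent(a, b)
```

Surface-deviation metrics like the paper's 2 mm figure additionally require extracting and comparing the two boundary surfaces, which is omitted here.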
Automated Reasoning CICT Program/Intelligent Systems Project ATAC-PRT Review
NASA Technical Reports Server (NTRS)
Morris, Robert; Smith, Ben
2003-01-01
An overview is presented of the Automated Reasoning CICT Program/Intelligent Systems project. Automated reasoning technology will help NASA missions by increasing the amount of science achieved, ensuring safety of spacecraft and surface explorers, and by enabling more robust mission operations.
Automated optimization of water-water interaction parameters for a coarse-grained model.
Fogarty, Joseph C; Chiu, See-Wing; Kirby, Peter; Jakobsson, Eric; Pandit, Sagar A
2014-02-13
We have developed an automated parameter optimization software framework (ParOpt) that implements the Nelder-Mead simplex algorithm and applied it to a coarse-grained polarizable water model. The model employs a tabulated, modified Morse potential with decoupled short- and long-range interactions incorporating four water molecules per interaction site. Polarizability is introduced by the addition of a harmonic angle term defined among three charged points within each bead. The target function for parameter optimization was based on the experimental density, surface tension, electric field permittivity, and diffusion coefficient. The model was validated by comparison of statistical quantities with experimental observation. We found very good performance of the optimization procedure and good agreement of the model with experiment.
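The Nelder-Mead simplex method at the heart of such a framework is compact enough to sketch directly. The minimizer below is a standard minimal implementation (reflect/expand/contract/shrink), and the target function is a toy relative-squared-error over two invented "model properties"; neither is ParOpt's actual code.

```python
import numpy as np

def nelder_mead(f, x0, step=0.1, tol=1e-10, max_iter=500):
    """Minimal Nelder-Mead simplex minimizer (reflect/expand/contract/shrink)."""
    n = len(x0)
    simplex = [np.asarray(x0, dtype=float)]
    for i in range(n):                          # initial simplex around x0
        v = np.array(x0, dtype=float); v[i] += step
        simplex.append(v)
    fvals = [f(v) for v in simplex]
    for _ in range(max_iter):
        order = np.argsort(fvals)               # sort vertices best -> worst
        simplex = [simplex[i] for i in order]
        fvals = [fvals[i] for i in order]
        if abs(fvals[-1] - fvals[0]) < tol:
            break
        centroid = np.mean(simplex[:-1], axis=0)
        xr = centroid + (centroid - simplex[-1]); fr = f(xr)     # reflection
        if fvals[0] <= fr < fvals[-2]:
            simplex[-1], fvals[-1] = xr, fr
        elif fr < fvals[0]:
            xe = centroid + 2.0 * (centroid - simplex[-1]); fe = f(xe)  # expand
            simplex[-1], fvals[-1] = (xe, fe) if fe < fr else (xr, fr)
        else:
            xc = centroid + 0.5 * (simplex[-1] - centroid); fc = f(xc)  # contract
            if fc < fvals[-1]:
                simplex[-1], fvals[-1] = xc, fc
            else:                                                       # shrink
                simplex = [simplex[0]] + [simplex[0] + 0.5 * (v - simplex[0])
                                          for v in simplex[1:]]
                fvals = [fvals[0]] + [f(v) for v in simplex[1:]]
    return simplex[int(np.argmin(fvals))]

# toy target: relative squared error between "model" and "experimental" values
targets = {"density": 0.997, "surface_tension": 72.0}
def model_properties(p):
    a, b = p
    return {"density": a, "surface_tension": 70.0 + b}
def target_fn(p):
    pred = model_properties(p)
    return sum(((pred[k] - targets[k]) / targets[k]) ** 2 for k in targets)

best = nelder_mead(target_fn, [0.5, 0.0])
```

Nelder-Mead is attractive for force-field tuning precisely because it needs no gradients of the target function, which here would involve re-running simulations.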
Automated surface quality inspection with ARGOS: a case study
NASA Astrophysics Data System (ADS)
Kiefhaber, Daniel; Etzold, Fabian; Warken, Arno F.; Asfour, Jean-Michel
2017-06-01
The commercial availability of automated inspection systems for optical surfaces specified according to ISO 10110-7 promises unsupervised, automated quality control with reproducible results. In this study, the classification results of the ARGOS inspection system are compared to the decisions of well-trained inspectors based on manual-visual inspection. The two are found to agree in 93.6% of the studied cases. Exemplary cases with differing results are examined and shown to be partly caused by shortcomings of the ISO 10110-7 standard, which was written for the industry-standard manual-visual inspection. Applying it to high-resolution images of the whole surface from objective machine vision systems poses a few challenges, which are discussed.
Satellite freeze forecast system: Executive summary
NASA Technical Reports Server (NTRS)
Martsolf, J. D. (Principal Investigator)
1983-01-01
A satellite-based temperature monitoring and prediction system consisting of a computer-controlled acquisition, processing, and display system and the ten automated weather stations called by that computer was developed and transferred to the National Weather Service. This satellite freeze forecast system (SFFS) acquires satellite data from either of two sources and surface data from 10 sites, displays the observed data in the form of color-coded thermal maps and in tables of automated weather station temperatures, computes predicted thermal maps when requested and displays such maps either automatically or manually, archives the data acquired, and makes comparisons with historical data. Except for the last function, SFFS handles these tasks in a highly automated fashion if the user so directs. The predicted thermal maps are the result of two models: one a physical energy budget of the soil-atmosphere interface, and the other a statistical relationship between the sites at which the physical model predicts temperatures and each of the pixels of the satellite thermal map.
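The statistical second step — relating the physical model's site predictions to each satellite pixel — can be sketched as a per-pixel least-squares fit against historical pairs. The pairings and numbers below are invented for illustration; the actual SFFS relationship may be more elaborate.

```python
import numpy as np

# invented historical pairs for one pixel: physical-model site prediction vs.
# satellite pixel temperature at the same hour (degrees C)
site_pred = np.array([2.0, 0.5, -1.0, 3.5, 1.0])
pixel_obs = np.array([2.6, 1.2, -0.2, 4.0, 1.7])

slope, intercept = np.polyfit(site_pred, pixel_obs, 1)  # pixel ~ a + b * site

def predict_pixel(site_temperature_c):
    """Map a physical-model site temperature onto this satellite pixel."""
    return intercept + slope * site_temperature_c
```

Repeating the fit for every pixel of the thermal map turns a handful of site forecasts into a full predicted map.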
Body surface detection method for photoacoustic image data using cloth-simulation technique
NASA Astrophysics Data System (ADS)
Sekiguchi, H.; Yoshikawa, A.; Matsumoto, Y.; Asao, Y.; Yagi, T.; Togashi, K.; Toi, M.
2018-02-01
Photoacoustic tomography (PAT) is a novel modality that can visualize blood vessels without contrast agents. It clearly shows blood vessels near the body surface; however, these vessels obstruct the observation of deeper blood vessels. Because the depth range of each vessel is determined by its distance from the body surface, the vessels can be separated if the position of the skin is known. However, skin tissue, which does not contain hemoglobin, does not appear in PAT results; therefore, manual estimation is required. As this task is very labor-intensive, its automation is highly desirable. We therefore developed a method to estimate the body surface using the cloth-simulation technique, a method commonly used to create computer graphics (CG) animations that has not yet been employed for medical image processing. In cloth simulations, the virtual cloth is represented by a two-dimensional array of mass nodes connected to each other by springs. Once the cloth is released from a position away from the body, each node begins to move downwards under the effect of gravity, spring, and other forces; some of the nodes hit the superficial vessels and stop. The cloth position in the stationary state represents the body surface. The body surface estimation, which required approximately 1 h with the manual method, is automated and takes only approximately 10 s with the proposed method. The proposed method could facilitate the practical use of PAT.
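The cloth-drop idea can be illustrated with a heavily simplified relaxation: each node falls under gravity, spring forces pull it toward the mean of its neighbours, and it is stopped wherever it meets a superficial vessel. This sketch is a toy under those assumptions, not the paper's physics engine, and the vessel height map is invented.

```python
import numpy as np

def estimate_surface(vessel_top, n_iter=200, gravity=1.0, stiffness=0.5):
    """Drop a virtual cloth (a 2-D grid of mass nodes) onto a height map of
    the superficial vessels; the resting cloth approximates the body surface."""
    z = np.full(vessel_top.shape, vessel_top.max() + 10.0)   # start above body
    for _ in range(n_iter):
        z = z - gravity                        # gravity pulls every node down
        padded = np.pad(z, 1, mode="edge")     # springs pull each node toward
        neighbours = (padded[:-2, 1:-1] + padded[2:, 1:-1] +  # the mean of its
                      padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0  # 4 neighbours
        z = z + stiffness * (neighbours - z)
        z = np.maximum(z, vessel_top)          # nodes stop at the vessels
    return z

# toy example: two isolated superficial vessels on a flat background
vessel_top = np.zeros((20, 20))
vessel_top[5, 5] = 8.0
vessel_top[12, 14] = 6.0
surface = estimate_surface(vessel_top)
```

The resting heights drape over the vessel peaks and sag between them, which is exactly the behaviour that makes the stationary cloth a plausible skin estimate.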
Quantifying mesoscale eddies in the Lofoten Basin
NASA Astrophysics Data System (ADS)
Raj, R. P.; Johannessen, J. A.; Eldevik, T.; Nilsen, J. E. Ø.; Halo, I.
2016-07-01
The Lofoten Basin is the most eddy-rich region in the Norwegian Sea. In this paper, the characteristics of these eddies are investigated from a comprehensive database of nearly two decades of satellite altimeter data (1995-2013) together with Argo profiling floats and surface drifter data. An automated method identified 1695/1666 individual anticyclonic/cyclonic eddies in the Lofoten Basin from more than 10,000 altimeter-based eddy observations. The eddies are found to be predominantly generated and residing locally. The spatial distributions of lifetime, occurrence, generation sites, size, intensity, and drift of the eddies are studied in detail. The anticyclonic eddies in the Lofoten Basin are the most long-lived eddies (>60 days), especially in the western part of the basin. We reveal two hotspots of eddy occurrence on either side of the Lofoten Basin. Furthermore, we infer a cyclonic drift of eddies in the western Lofoten Basin. Barotropic energy conversion rates reveal energy transfer from the slope current to the eddies during winter. An automated colocation of surface drifters trapped inside the altimeter-based eddies is used to corroborate the orbital speed of the anticyclonic and cyclonic eddies. Moreover, the vertical structure of the altimeter-based eddies is examined using colocated Argo profiling-float profiles. The combination of altimetry, Argo floats, and surface drifter data is therefore considered to be a promising observation-based approach for further studies of the role of eddies in transport of heat and biomass from the slope current to the Lofoten Basin.
NASA Astrophysics Data System (ADS)
Campbell, B. D.; Higgins, S. R.
2008-12-01
Developing a method for bridging the gap between macroscopic and microscopic measurements of reaction kinetics at the mineral-water interface has important implications in geological and chemical fields. Investigating these reactions on the nanometer scale with SPM is often limited by image analysis and data extraction due to the large quantity of data usually obtained in SPM experiments. Here we present a computer algorithm for automated analysis of mineral-water interface reactions. This algorithm automates the analysis of sequential SPM images by identifying the kinetically active surface sites (i.e., step edges), and by tracking the displacement of these sites from image to image. The step edge positions in each image are readily identified and tracked through time by a standard edge detection algorithm followed by statistical analysis on the Hough Transform of the edge-mapped image. By quantifying this displacement as a function of time, the rate of step edge displacement is determined. Furthermore, the total edge length, also determined from analysis of the Hough Transform, combined with the computed step speed, yields the surface area normalized rate of the reaction. The algorithm was applied to a study of the spiral growth of the calcite(104) surface from supersaturated solutions, yielding results almost 20 times faster than performing this analysis by hand, with results being statistically similar for both analysis methods. This advance in analysis of kinetic data from SPM images will facilitate the building of experimental databases on the microscopic kinetics of mineral-water interface reactions.
NASA Astrophysics Data System (ADS)
Yang, Xiucheng; Chen, Li
2017-04-01
Urban surface water is characterized by complex surface conditions and small water bodies, and mapping it is currently a challenging task. Moderate-resolution remote sensing satellites provide effective means of monitoring surface water. This study conducts an exploratory evaluation of the performance of the newly available Sentinel-2A multispectral instrument (MSI) imagery for detecting urban surface water. An automatic framework that integrates pixel-level threshold adjustment and object-oriented segmentation is proposed. Based on the automated workflow, different combinations of visible, near-infrared, and short-wave infrared bands in the Sentinel-2 image, via different water indices, are first compared. Results show that the object-level modified normalized difference water index (MNDWI, with band 11) and the automated water extraction index are feasible for urban surface water mapping from Sentinel-2 MSI imagery. Moreover, comparative results are obtained using the optimal MNDWI from Sentinel-2 and Landsat 8 images, respectively. Sentinel-2 MSI achieves a kappa coefficient of 0.92, compared with 0.83 from the Landsat 8 operational land imager.
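The pixel-level index underlying the comparison is MNDWI = (green − SWIR) / (green + SWIR). A minimal sketch, using the conventional zero threshold rather than the paper's automatically adjusted one, and pairing Sentinel-2 band 3 (green) with band 11 (SWIR) as in the abstract:

```python
import numpy as np

def mndwi_water_mask(green, swir, threshold=0.0):
    """Pixel-level water mask from the modified normalized difference
    water index, MNDWI = (green - swir) / (green + swir).

    threshold=0.0 is the conventional default, not the paper's
    automatically adjusted value; the object-oriented segmentation
    stage is not reproduced here.
    """
    green = green.astype(np.float64)
    swir = swir.astype(np.float64)
    denom = green + swir
    with np.errstate(divide="ignore", invalid="ignore"):
        mndwi = np.where(denom != 0, (green - swir) / denom, 0.0)
    return mndwi > threshold
```

Water pixels reflect more in the green band than in the SWIR band, so MNDWI is positive over water and typically negative over built-up surfaces.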
NASA Astrophysics Data System (ADS)
Crawford, Ben; Grimmond, Sue; Kent, Christoph; Gabey, Andrew; Ward, Helen; Sun, Ting; Morrison, William
2017-04-01
Remotely sensed data from satellites have potential to enable high-resolution, automated calculation of urban surface energy balance terms and inform decisions about urban adaptations to environmental change. However, aerodynamic resistance methods to estimate sensible heat flux (QH) in cities using satellite-derived observations of surface temperature are difficult in part due to spatial and temporal variability of the thermal aerodynamic resistance term (rah). In this work, we extend an empirical function to estimate rah using observational data from several cities with a broad range of surface vegetation land cover properties. We then use this function to calculate spatially and temporally variable rah in London based on high-resolution (100 m) land cover datasets and in situ meteorological observations. In order to calculate high-resolution QH based on satellite-observed land surface temperatures, we also develop and employ novel methods to i) apply source area-weighted averaging of surface and meteorological variables across the study spatial domain, ii) calculate spatially variable, high-resolution meteorological variables (wind speed, friction velocity, and Obukhov length), iii) incorporate spatially interpolated urban air temperatures from a distributed sensor network, and iv) apply a modified Monte Carlo approach to assess uncertainties with our results, methods, and input variables. Modeled QH using the aerodynamic resistance method is then compared to in situ observations in central London from a unique network of scintillometers and eddy-covariance measurements.
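In its bulk form, the aerodynamic resistance method referenced above reduces to QH = ρ·cp·(Ts − Ta)/rah. A one-line sketch follows; the constant air density and heat capacity are simplifying assumptions, and the paper's spatially variable rah, source-area weighting, and Monte Carlo uncertainty analysis are not reproduced.

```python
def sensible_heat_flux(t_surface_k, t_air_k, r_ah,
                       rho=1.2, cp=1005.0):
    """Bulk aerodynamic-resistance estimate of sensible heat flux,
    QH = rho * cp * (Ts - Ta) / r_ah, in W m^-2.

    t_surface_k, t_air_k : surface and air temperature (K)
    r_ah : thermal aerodynamic resistance (s m^-1)
    rho, cp : typical near-surface air density (kg m^-3) and
              specific heat (J kg^-1 K^-1)
    """
    return rho * cp * (t_surface_k - t_air_k) / r_ah
```

The difficulty the abstract highlights is that rah varies strongly in space and time, which is why the authors fit an empirical function for it rather than using a constant.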
A comparison of automated crater detection methods
NASA Astrophysics Data System (ADS)
Bandeira, L.; Barreira, C.; Pina, P.; Saraiva, J.
2008-09-01
This work presents early results of a comparison between some common methodologies for automated crater detection. The three procedures considered were applied to images of the surface of Mars, illustrating some pros and cons of their use. We aim to establish the clear advantages of using this type of method in the study of planetary surfaces.
Automated search of control points in surface-based morphometry.
Canna, Antonietta; Russo, Andrea G; Ponticorvo, Sara; Manara, Renzo; Pepino, Alessandro; Sansone, Mario; Di Salle, Francesco; Esposito, Fabrizio
2018-04-16
Cortical surface-based morphometry is based on a semi-automated analysis of structural MRI images. In FreeSurfer, a widespread tool for surface-based analyses, a visual check of gray-white matter borders is followed by the manual placement of control points to drive the topological correction (editing) of segmented data. A novel algorithm combining radial sampling and machine learning is presented for the automated control point search (ACPS). Four data sets with 3 T MRI structural images were used for ACPS validation, including raw data acquired twice in 36 healthy subjects and both raw and FreeSurfer-preprocessed data of 125 healthy subjects from public databases. The unedited data from a subgroup of subjects were submitted to manual control point search and editing. The ACPS algorithm was trained on manual control points and tested on new (unseen) unedited data. Cortical thickness (CT) and fractal dimensionality (FD) were estimated in three data sets by reconstructing surfaces from both unedited and edited data, and the effects of editing were compared between manual and automated editing and versus no editing. The ACPS-based editing improved the surface reconstructions similarly to manual editing. Compared to no editing, ACPS-based and manual editing significantly reduced CT and FD in consistent regions across different data sets. Despite the extra processing of control point driven reconstructions, CT and FD estimates were highly reproducible in almost all cortical regions, although some problematic regions (e.g. entorhinal cortex) may benefit from different editing. The use of control points improves the surface reconstruction, and the ACPS algorithm can automate their search, reducing the burden of manual editing.
The evaluation of ASOS for the Kennedy Space Center's Shuttle Landing Facility
NASA Technical Reports Server (NTRS)
Yersavich, Ann; Wheeler, Mark; Taylor, Gregory; Schumann, Robin; Manobianco, John
1994-01-01
This report documents the Applied Meteorology Unit's (AMU) evaluation of the effectiveness and utility of the Automated Surface Observing System (ASOS) in terms of spaceflight operations and user requirements. In particular, the evaluation determines which of the Shuttle Landing Facility (SLF) observation requirements can be satisfied by ASOS. This report also includes a summary of ASOS' background, current configuration and specifications, system performance, and the possible concepts of operations for use of ASOS at the SLF. This evaluation stems from a desire by the Air Force to determine if ASOS units could be used to reduce the cost of SLF meteorological observations.
Kertesz, Vilmos; Paranthaman, Nithya; Moench, Paul; ...
2014-10-01
The aim of this paper was to evaluate the analytical performance of a fully automated droplet-based surface-sampling system for determining the distribution of the drugs acetaminophen and terfenadine, and their metabolites, in rat thin tissue sections. The rank order of acetaminophen concentration observed in tissues was stomach > small intestine > liver, while the concentrations of its glucuronide and sulfate metabolites were greatest in the liver and small intestine. Terfenadine was most concentrated in the liver and kidney, while its major metabolite, fexofenadine, was found in the liver and small intestine. In conclusion, the spatial distributions of both drugs and their respective metabolites observed in this work were consistent with previous studies using radiolabeled drugs.
Building Airport Surface HITL Simulation Capability
NASA Technical Reports Server (NTRS)
Chinn, Fay Cherie
2016-01-01
FutureFlight Central (FFC) is a high-fidelity, real-time simulator designed to study surface operations and automation. As an air traffic control tower simulator, FFC allows stakeholders such as the FAA, controllers, pilots, airports, and airlines to develop and test advanced surface and terminal-area concepts and automation, including NextGen and beyond automation concepts and tools. These technologies will address the safety, capacity, and environmental issues facing the National Airspace System. FFC also has extensive video streaming capabilities, which, combined with its 3-D database capability, make the facility ideal for any research needing an immersive virtual or video environment. FutureFlight Central allows human-in-the-loop testing, which accommodates human interactions and errors, giving a more complete picture than fast-time simulations. This presentation describes FFC's capabilities and the components necessary to build an airport surface human-in-the-loop simulation capability.
An automated high throughput tribometer for adhesion, wear, and friction measurements
NASA Astrophysics Data System (ADS)
Kalihari, Vivek; Timpe, Shannon J.; McCarty, Lyle; Ninke, Matthew; Whitehead, Jim
2013-03-01
Understanding the origin and correlation of different surface properties under a multitude of operating conditions is critical in tribology. The diversity of tribological properties, and the lack of a single instrument to measure them all, make it difficult to compare and correlate properties, particularly in light of the wide range of interfaces commonly investigated. In the current work, a novel automated tribometer has been designed and validated, providing a unique experimental platform capable of high-throughput adhesion, wear, kinetic friction, and static friction measurements. The innovative design aspects that allow for a variety of probes, sample surfaces, and testing conditions are discussed. Critical components of the instrument and their design criteria are described, along with examples of data collection schemes. A case study is presented with multiple surface measurements performed on a set of characteristic substrates. Adhesion, wear, kinetic friction, and static friction are analyzed and compared across surfaces, highlighting the comprehensive nature of the surface data that can be generated using the automated high-throughput tribometer.
NASA Astrophysics Data System (ADS)
Pardyjak, E.
2014-12-01
The MATERHORN (Mountain Terrain Atmospheric Modeling and Observation) Program is a multiuniversity, multidisciplinary research initiative designed to improve numerical weather prediction in complex terrain and to better understand the physics of complex terrain flow phenomena across a wide range of scales. As part of MATERHORN, field campaigns were conducted at Dugway, UT, USA in Autumn 2012 and Spring 2013. A subset of the campaigns included dense observations along the East Slope of Granite Peak (40.096° N, 113.253° W), as well as additional observations on the opposing west-facing slope. East Slope observations included five multi-sonic-anemometer eddy covariance towers (two with full energy budget stations), eleven small energy budget stations, fifteen automated weather stations, a distributed temperature sensing (DTS) system, hot-film anemometry, infrared camera surface temperature observations, and up to three Doppler lidars. West Slope operations were less intense, with three main towers, two of which included sonic anemometry and one of which included full surface energy balance observations. For this presentation, our analysis will focus on characterizing and contrasting the response of mean wind circulations and thermodynamic variables, as well as turbulence quantities, during the evening transitions on both the East Slope and the West Slope, when solar irradiation differences between the slope surfaces are extremely large.
Computer-automated ABCD versus dermatologists with different degrees of experience in dermoscopy.
Piccolo, Domenico; Crisman, Giuliana; Schoinas, Spyridon; Altamura, Davide; Peris, Ketty
2014-01-01
Dermoscopy is a very useful and non-invasive technique for in vivo observation and preoperative diagnosis of pigmented skin lesions (PSLs), inasmuch as it enables analysis of surface and subsurface structures that are not discernible to the naked eye. The authors used the ABCD rule of dermoscopy to test the accuracy of melanoma diagnosis with respect to a panel of 165 PSLs, and the intra- and inter-observer diagnostic agreement obtained between three dermatologists with different degrees of experience, one general practitioner, and a DDA for computer-assisted diagnosis (Nevuscreen®, Arkè s.a.s., Avezzano, Italy). 165 PSLs from 165 patients were selected. Histopathological examination revealed 132 benign melanocytic skin lesions and 33 melanomas. The kappa statistic, sensitivity, specificity, and positive and negative predictive values were calculated to measure agreement between all the human observers and in comparison with the automated DDA. Our results revealed poor reproducibility of the semi-quantitative algorithm devised by Stolz et al., independently of the observers' experience in dermoscopy. Nevuscreen® (Arkè s.a.s., Avezzano, Italy) proved to be 'user friendly' to all observers, thus enabling a more critical evaluation of each lesion and representing a helpful tool for clinicians without significant experience in dermoscopy, helping them achieve more accurate diagnosis of PSLs.
van Pelt, Roy; Nguyen, Huy; ter Haar Romeny, Bart; Vilanova, Anna
2012-03-01
Quantitative analysis of vascular blood flow, acquired by phase-contrast MRI, requires accurate segmentation of the vessel lumen. In clinical practice, 2D-cine velocity-encoded slices are inspected, and the lumen is segmented manually. However, segmentation of time-resolved volumetric blood-flow measurements is a tedious and time-consuming task requiring automation. Automated segmentation of large thoracic arteries, based solely on the 3D-cine phase-contrast MRI (PC-MRI) blood-flow data, was performed. An active surface model, which is fast and topologically stable, was used. The active surface model requires an initial surface approximating the desired segmentation. A method to generate this surface was developed based on a voxel-wise temporal maximum of blood-flow velocities. The active surface model balances forces based on the surface structure and image features derived from the blood-flow data. The segmentation results were validated using volunteer studies, including time-resolved 3D and 2D blood-flow data. The segmented surface was intersected with a velocity-encoded PC-MRI slice, resulting in a cross-sectional contour of the lumen. These cross-sections were compared to reference contours that were manually delineated on high-resolution 2D-cine slices. The automated approach closely approximates the manual blood-flow segmentations, with error distances on the order of the voxel size. The initial surface provides a close approximation of the desired luminal geometry. This improves the convergence time of the active surface and facilitates parametrization. An active surface approach for vessel lumen segmentation was developed, suitable for quantitative analysis of 3D-cine PC-MRI blood-flow data. As opposed to prior thresholding and level-set approaches, the active surface model is topologically stable. A method to generate an initial approximate surface was developed, and various features that influence the segmentation model were evaluated.
The active surface segmentation results were shown to closely approximate manual segmentations.
Velodyne HDL-64E lidar for unmanned surface vehicle obstacle detection
NASA Astrophysics Data System (ADS)
Halterman, Ryan; Bruch, Michael
2010-04-01
The Velodyne HDL-64E is a 64-laser 3D (360° × 26.8°) scanning LIDAR. It was designed to fill the perception needs of DARPA Urban Challenge vehicles and, as such, was principally intended for ground use. This paper presents the performance of the HDL-64E as it relates to the marine environment for unmanned surface vehicle (USV) obstacle detection and avoidance. We describe the sensor's capacity for discerning relevant objects at sea, both through subjective observations of the raw data and through a rudimentary automated obstacle detection algorithm. We also discuss some of the complications that have arisen with the sensor.
The role of haemorrhage and exudate detection in automated grading of diabetic retinopathy.
Fleming, Alan D; Goatman, Keith A; Philip, Sam; Williams, Graeme J; Prescott, Gordon J; Scotland, Graham S; McNamee, Paul; Leese, Graham P; Wykes, William N; Sharp, Peter F; Olson, John A
2010-06-01
Automated grading has the potential to improve the efficiency of diabetic retinopathy screening services. While disease/no disease grading can be performed using only microaneurysm detection and image-quality assessment, automated recognition of other types of lesions may be advantageous. This study investigated whether inclusion of automated recognition of exudates and haemorrhages improves the detection of observable/referable diabetic retinopathy. Images from 1253 patients with observable/referable retinopathy and 6333 patients with non-referable retinopathy were obtained from three grading centres. All images were reference-graded, and automated disease/no disease assessments were made based on microaneurysm detection and combined microaneurysm, exudate and haemorrhage detection. Introduction of algorithms for exudates and haemorrhages resulted in a statistically significant increase in the sensitivity for detection of observable/referable retinopathy from 94.9% (95% CI 93.5 to 96.0) to 96.6% (95.4 to 97.4) without affecting manual grading workload. Automated detection of exudates and haemorrhages improved the detection of observable/referable retinopathy.
Walsh, Kristin E; Chui, Michelle Anne; Kieser, Mara A; Williams, Staci M; Sutter, Susan L; Sutter, John G
2011-01-01
To explore community pharmacy technician workflow change after implementation of an automated robotic prescription-filling device. At an independent community pharmacy in rural Mayville, WI, pharmacy technicians were observed before and 3 months after installation of an automated robotic prescription-filling device. The main outcome measures were sequences and timing of technician workflow steps, workflow interruptions, automation surprises, and workarounds. Of the 77 and 80 observations made before and 3 months after robot installation, respectively, 17 different workflow sequences were observed before installation and 38 after installation. Average prescription filling time was reduced by 40 seconds per prescription with use of the robot. Workflow interruptions per observation increased from 1.49 to 1.79 (P = 0.11), and workarounds increased from 10% to 36% after robot use. Although automated prescription-filling devices can increase efficiency, workflow interruptions and workarounds may negate that efficiency. Assessing changes in workflow and sequencing of tasks that may result from the use of automation can help uncover opportunities for workflow policy and procedure redesign.
DOT National Transportation Integrated Search
2015-12-01
Automated pavement performance data collection is a method that uses advanced technology to collect detailed road surface distress information at traffic speed. Agencies are driven to use automated survey techniques to enhance or replace their cu...
Partial Automated Alignment and Integration System
NASA Technical Reports Server (NTRS)
Kelley, Gary Wayne (Inventor)
2014-01-01
The present invention is a Partial Automated Alignment and Integration System (PAAIS) used to automate the alignment and integration of space vehicle components. A PAAIS includes ground support apparatuses, a track assembly with a plurality of energy-emitting components and an energy-receiving component containing a plurality of energy-receiving surfaces. Communication components and processors allow communication and feedback through PAAIS.
van 't Klooster, Ronald; de Koning, Patrick J H; Dehnavi, Reza Alizadeh; Tamsma, Jouke T; de Roos, Albert; Reiber, Johan H C; van der Geest, Rob J
2012-01-01
To develop and validate an automated segmentation technique for the detection of the lumen and outer wall boundaries in MR vessel wall studies of the common carotid artery. A new segmentation method was developed using a three-dimensional (3D) deformable vessel model requiring only one single user interaction by combining 3D MR angiography (MRA) and 2D vessel wall images. This vessel model is a 3D cylindrical Non-Uniform Rational B-Spline (NURBS) surface which can be deformed to fit the underlying image data. Image data of 45 subjects was used to validate the method by comparing manual and automatic segmentations. Vessel wall thickness and volume measurements obtained by both methods were compared. Substantial agreement was observed between manual and automatic segmentation; over 85% of the vessel wall contours were segmented successfully. The interclass correlation was 0.690 for the vessel wall thickness and 0.793 for the vessel wall volume. Compared with manual image analysis, the automated method demonstrated improved interobserver agreement and inter-scan reproducibility. Additionally, the proposed automated image analysis approach was substantially faster. This new automated method can reduce analysis time and enhance reproducibility of the quantification of vessel wall dimensions in clinical studies.
Benefits estimation framework for automated vehicle operations.
DOT National Transportation Integrated Search
2015-08-01
Automated vehicles have the potential to bring about transformative safety, mobility, energy, and environmental benefits to the surface transportation system. They are also being introduced into a complex transportation system, where second-order imp...
Understanding Is Key: An Analysis of Factors Pertaining to Trust in a Real-World Automation System.
Balfe, Nora; Sharples, Sarah; Wilson, John R
2018-06-01
This paper aims to explore the role of factors pertaining to trust in real-world automation systems through the application of observational methods in a case study from the railway sector. Trust in automation is widely acknowledged as an important mediator of automation use, but the majority of the research on automation trust is based on laboratory work. In contrast, this work explored trust in a real-world setting. Experienced rail operators in four signaling centers were observed for 90 min, and their activities were coded into five mutually exclusive categories. Their observed activities were analyzed in relation to their reported trust levels, collected via a questionnaire. The results showed clear differences in activity, even when circumstances on the workstations were very similar, and significant differences in some trust dimensions were found between groups exhibiting different levels of intervention and time not involved with signaling. Although the empirical, lab-based studies in the literature have consistently found that reliability and competence of the automation are the most important aspects of trust development, understanding of the automation emerged as the strongest dimension in this study. The implications are that development and maintenance of trust in real-world, safety-critical automation systems may be distinct from artificial laboratory automation. The findings have important implications for emerging automation concepts in diverse industries including highly automated vehicles and Internet of things.
Automated scoring of regional lung perfusion in children from contrast enhanced 3D MRI
NASA Astrophysics Data System (ADS)
Heimann, Tobias; Eichinger, Monika; Bauman, Grzegorz; Bischoff, Arved; Puderbach, Michael; Meinzer, Hans-Peter
2012-03-01
MRI perfusion images give information about regional lung function and can be used to detect pulmonary pathologies in cystic fibrosis (CF) children. However, manual assessment of the percentage of pathologic tissue in defined lung subvolumes features large inter- and intra-observer variation, making it difficult to determine disease progression consistently. We present an automated method to calculate a regional score for this purpose. First, lungs are located based on thresholding and morphological operations. Second, statistical shape models of left and right children's lungs are initialized at the determined locations and used to precisely segment morphological images. Segmentation results are transferred to perfusion maps and employed as masks to calculate perfusion statistics. An automated threshold to determine pathologic tissue is calculated and used to determine accurate regional scores. We evaluated the method on 10 MRI images and achieved an average surface distance of less than 1.5 mm compared to manual reference segmentations. Pathologic tissue was detected correctly in 9 cases. The approach seems suitable for detecting early signs of CF and monitoring response to therapy.
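The abstract does not specify how the automated threshold separating pathologic from healthy tissue is computed; one standard choice for such a task, Otsu's method, can be sketched as follows (the function name and binning are illustrative).

```python
import numpy as np

def otsu_threshold(values, n_bins=256):
    """Otsu's method: pick the threshold that maximizes between-class
    variance of a 1-D sample (e.g., perfusion values inside a lung
    mask). A common automatic thresholding criterion, shown here as
    an assumption; the paper may use a different one."""
    hist, edges = np.histogram(values, bins=n_bins)
    p = hist.astype(np.float64) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2.0
    w0 = np.cumsum(p)            # cumulative class-0 probability
    mu = np.cumsum(p * centers)  # cumulative mean
    mu_t = mu[-1]                # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * w0 - mu) ** 2 / (w0 * (1.0 - w0))
    sigma_b[~np.isfinite(sigma_b)] = 0.0  # degenerate splits
    return centers[np.argmax(sigma_b)]
```

Voxels below the returned threshold would then be counted as poorly perfused within each lung subvolume to form the regional score.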
Development of an automated processing system for potential fishing zone forecast
NASA Astrophysics Data System (ADS)
Ardianto, R.; Setiawan, A.; Hidayat, J. J.; Zaky, A. R.
2017-01-01
The Institute for Marine Research and Observation (IMRO) of the Ministry of Marine Affairs and Fisheries, Republic of Indonesia (MMAF) has developed a potential fishing zone (PFZ) forecast based on satellite data, called Peta Prakiraan Daerah Penangkapan Ikan (PPDPI). Since 2005, IMRO has disseminated daily PPDPI maps for fisheries marine ports and 3-day averages for national areas. The accuracy of the PFZ determination and the map processing time depend largely on the experience of the operators creating them. This paper presents our research in developing an automated processing system for PPDPI in order to increase accuracy and shorten processing time. PFZs are identified by combining MODIS sea surface temperature (SST) and chlorophyll-a (CHL) data to detect the presence of upwelling, thermal fronts, and enhanced biological productivity, the integration of which generally represents a PFZ. The whole process, involving data download, map geoprocessing, and layout, is carried out automatically by Python and ArcPy. The results showed that the automated processing system can reduce dependence on the operator in determining the PFZ and speed up processing time.
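A toy version of the SST/CHL combination step might look as follows; the gradient and chlorophyll thresholds are placeholders, not IMRO's operational values, and the real workflow adds data download and ArcPy map layout around this core.

```python
import numpy as np

def potential_fishing_zone(sst, chl, sst_grad_min=0.5, chl_min=0.2):
    """Flag grid cells as a potential fishing zone where a thermal
    front (strong SST gradient, degC per cell) coincides with
    elevated chlorophyll-a (mg m^-3). Thresholds are illustrative."""
    gy, gx = np.gradient(sst)           # SST gradient components
    sst_grad = np.hypot(gx, gy)         # gradient magnitude
    return (sst_grad >= sst_grad_min) & (chl >= chl_min)
```

Cells where both criteria hold mark the front-plus-productivity coincidence that the abstract describes as generally representing a PFZ.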
Tucker, Colin; McHugh, Theresa A.; Howell, Armin; Gill, Richard; Weber, Bettina; Belnap, Jayne; Grote, Ed; Reed, Sasha C.
2017-01-01
Carbon cycling associated with biological soil crusts, which occupy interspaces between vascular plants in drylands globally, may be an important part of the coupled climate-carbon cycle of the Earth system. A major challenge to understanding CO2 fluxes in these systems is that much of the biotic and biogeochemical activity occurs in the upper few mm of the soil surface layer (i.e., the ‘mantle of fertility’), which exhibits highly dynamic and difficult-to-measure temperature and moisture fluctuations. Here, we report a multi-sensor approach to simultaneously measuring temperature and moisture of this biocrust surface layer (0–2 mm), and the deeper soil profile, concurrent with automated measurement of surface soil CO2 effluxes. Our results illuminate robust relationships between biocrust water content and field CO2 pulses that have previously been difficult to detect and explain. All observed CO2 pulses over the measurement period corresponded to surface wetting events, including when the wetting events did not penetrate into the soil below the biocrust layer (0–2 mm). The variability of temperature and moisture of the biocrust surface layer was much greater than even in the 0–5 cm layer of the soil beneath the biocrust, or deeper in the soil profile. We therefore suggest that coupling surface measurements of biocrust moisture and temperature to automated CO2 flux measurements may greatly improve our understanding of the climatic sensitivity of carbon cycling in biocrusted interspaces in our study region, and that this method may be globally relevant and applicable.
NASA Astrophysics Data System (ADS)
Steposhina, S. V.; Fedonin, O. N.
2018-03-01
Relationships have been developed that make it possible to automate the force calculation in surface plastic deformation (SPD) processing and thus shorten the time required for technological preparation of production.
Assessing emissions impacts of automated vehicles
DOT National Transportation Integrated Search
2016-06-20
With their potential for transforming surface transportation, understanding the impacts and benefits of automated vehicles (AVs) with regards to safety, mobility, energy and the environment is a necessary first step for informing policy to aid the su...
Wengert, G J; Helbich, T H; Woitek, R; Kapetas, P; Clauser, P; Baltzer, P A; Vogl, W-D; Weber, M; Meyer-Baese, A; Pinker, Katja
2016-11-01
To evaluate the inter-/intra-observer agreement of BI-RADS-based subjective visual estimation of the amount of fibroglandular tissue (FGT) with magnetic resonance imaging (MRI), and to investigate whether FGT assessment benefits from an automated, observer-independent, quantitative MRI measurement by comparing both approaches. Eighty women with no imaging abnormalities (BI-RADS 1 and 2) were included in this institutional review board (IRB)-approved prospective study. All women underwent un-enhanced breast MRI. Four radiologists independently assessed FGT with MRI by subjective visual estimation according to BI-RADS. Automated observer-independent quantitative measurement of FGT with MRI was performed using a previously described measurement system. Inter-/intra-observer agreements of qualitative and quantitative FGT measurements were assessed using Cohen's kappa (k). Inexperienced readers achieved moderate inter-/intra-observer agreement and experienced readers a substantial inter- and perfect intra-observer agreement for subjective visual estimation of FGT. Practice and experience reduced observer-dependency. Automated observer-independent quantitative measurement of FGT was successfully performed and revealed only fair to moderate agreement (k = 0.209-0.497) with subjective visual estimations of FGT. Subjective visual estimation of FGT with MRI shows moderate intra-/inter-observer agreement, which can be improved by practice and experience. Automated observer-independent quantitative measurements of FGT are necessary to allow a standardized risk evaluation. • Subjective FGT estimation with MRI shows moderate intra-/inter-observer agreement in inexperienced readers. • Inter-observer agreement can be improved by practice and experience. • Automated observer-independent quantitative measurements can provide reliable and standardized assessment of FGT with MRI.
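The agreement statistic used throughout this abstract, Cohen's kappa, corrects raw observer agreement for agreement expected by chance. A minimal sketch for two raters with categorical labels:

```python
import numpy as np

def cohens_kappa(rater1, rater2, n_categories):
    """Cohen's kappa for two raters assigning labels 0..n-1:
    chance-corrected agreement (po - pe) / (1 - pe)."""
    r1 = np.asarray(rater1, dtype=int)
    r2 = np.asarray(rater2, dtype=int)
    conf = np.zeros((n_categories, n_categories))
    np.add.at(conf, (r1, r2), 1)      # confusion matrix
    n = conf.sum()
    po = np.trace(conf) / n           # observed agreement
    # Chance agreement from the product of the marginal distributions.
    pe = (conf.sum(axis=0) * conf.sum(axis=1)).sum() / n ** 2
    return (po - pe) / (1.0 - pe)
```

Kappa near 0 indicates chance-level agreement and 1 indicates perfect agreement; the 0.209-0.497 range quoted above therefore corresponds to the "fair to moderate" label used in the abstract.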
NASA Astrophysics Data System (ADS)
Fegyveresi, John M.; Alley, Richard B.; Muto, Atsuhiro; Orsi, Anaïs J.; Spencer, Matthew K.
2018-01-01
Observations at the West Antarctic Ice Sheet (WAIS) Divide site show that near-surface snow is strongly altered by weather-related processes such as strong winds and temperature fluctuations, producing features that are recognizable in the deep ice core. Prominent glazed
surface crusts develop frequently at the site during summer seasons. Surface, snow pit, and ice core observations made in this study during summer field seasons from 2008-2009 to 2012-2013, supplemented by automated weather station (AWS) data with short- and longwave radiation sensors, revealed that such crusts formed during relatively low-wind, low-humidity, clear-sky periods with intense daytime sunshine. After formation, such glazed surfaces typically developed cracks in a polygonal pattern likely from thermal contraction at night. Cracking was commonest when several clear days occurred in succession and was generally followed by surface hoar growth; vapor escaping through the cracks during sunny days may have contributed to the high humidity that favored nighttime formation of surface hoar. Temperature and radiation observations show that daytime solar heating often warmed the near-surface snow above the air temperature, contributing to upward mass transfer, favoring crust formation from below, and then surface hoar formation. A simple surface energy calculation supports this observation. Subsequent examination of the WDC06A deep ice core revealed that crusts are preserved through the bubbly ice, and some occur in snow accumulated during winters, although not as commonly as in summertime deposits. Although no one has been on site to observe crust formation during winter, it may be favored by greater wintertime wind packing from stronger peak winds, high temperatures and steep temperature gradients from rapid midwinter warmings reaching as high as -15 °C, and perhaps longer intervals of surface stability. Time variations in crust occurrence in the core may provide paleoclimatic information, although additional studies are required. Discontinuity and cracking of crusts likely explain why crusts do not produce significant anomalies in other paleoclimatic records.
Walsh, Kristin E.; Chui, Michelle Anne; Kieser, Mara A.; Williams, Staci M.; Sutter, Susan L.; Sutter, John G.
2012-01-01
Objective To explore community pharmacy technician workflow change after implementation of an automated robotic prescription-filling device. Methods At an independent community pharmacy in rural Mayville, WI, pharmacy technicians were observed before and 3 months after installation of an automated robotic prescription-filling device. The main outcome measures were sequences and timing of technician workflow steps, workflow interruptions, automation surprises, and workarounds. Results Of the 77 and 80 observations made before and 3 months after robot installation, respectively, 17 different workflow sequences were observed before installation and 38 after installation. Average prescription filling time was reduced by 40 seconds per prescription with use of the robot. Workflow interruptions per observation increased from 1.49 to 1.79 (P = 0.11), and workarounds increased from 10% to 36% after robot use. Conclusion Although automated prescription-filling devices can increase efficiency, workflow interruptions and workarounds may negate that efficiency. Assessing changes in workflow and sequencing of tasks that may result from the use of automation can help uncover opportunities for workflow policy and procedure redesign. PMID:21896459
Understanding Is Key: An Analysis of Factors Pertaining to Trust in a Real-World Automation System
Balfe, Nora; Sharples, Sarah; Wilson, John R.
2018-01-01
Objective: This paper aims to explore the role of factors pertaining to trust in real-world automation systems through the application of observational methods in a case study from the railway sector. Background: Trust in automation is widely acknowledged as an important mediator of automation use, but the majority of the research on automation trust is based on laboratory work. In contrast, this work explored trust in a real-world setting. Method: Experienced rail operators in four signaling centers were observed for 90 min, and their activities were coded into five mutually exclusive categories. Their observed activities were analyzed in relation to their reported trust levels, collected via a questionnaire. Results: The results showed clear differences in activity, even when circumstances on the workstations were very similar, and significant differences in some trust dimensions were found between groups exhibiting different levels of intervention and time not involved with signaling. Conclusion: Although the empirical, lab-based studies in the literature have consistently found that reliability and competence of the automation are the most important aspects of trust development, understanding of the automation emerged as the strongest dimension in this study. The implications are that development and maintenance of trust in real-world, safety-critical automation systems may be distinct from artificial laboratory automation. Application: The findings have important implications for emerging automation concepts in diverse industries including highly automated vehicles and Internet of things. PMID:29613815
Benefits Estimation Model for Automated Vehicle Operations: Phase 2 Final Report
DOT National Transportation Integrated Search
2018-01-01
Automated vehicles have the potential to bring about transformative safety, mobility, energy, and environmental benefits to the surface transportation system. They are also being introduced into a complex transportation system, where second-order imp...
Conceptual design of an aircraft automated coating removal system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, J.E.; Draper, J.V.; Pin, F.G.
1996-05-01
Paint stripping of the U.S. Air Force's large transport aircraft is currently a labor-intensive, manual process. Significant reductions in costs, personnel and turnaround time can be accomplished by the judicious use of automation in some process tasks. This paper presents the conceptual design of a coating removal system for the tail surfaces of the C-5 plane. Emphasis is placed on the technology selection to optimize human-automation synergy with respect to overall costs, throughput, quality, safety, and reliability. Trade-offs between field-proven vs. research-requiring technologies, and between expected gain vs. cost and complexity, have led to a conceptual design which is semi-autonomous (relying on the human for task specification and disturbance handling) yet incorporates sensor-based automation (for sweep path generation and tracking, surface following, stripping quality control and tape/breach handling).
NASA Astrophysics Data System (ADS)
Gold, R. D.; Reitman, N. G.; Briggs, R. W.; Barnhart, W. D.; Hayes, G. P.
2014-12-01
The 24 September 2013 Mw7.7 Balochistan, Pakistan earthquake ruptured a ~200 km-long stretch of the Hoshab fault in southern Pakistan. We remotely measured the coseismic surface deformation field using high-resolution (0.5 m) pre- and post-event satellite imagery. We measured ~300 near-field (0-10 m from fault) laterally offset piercing points (streams, terrace risers, roads, etc.) and find peak left-lateral offsets of ~12-15 m. We characterized the far-field (0-10 km from fault) displacement field using manual (~250 measurements) and automated image cross-correlation methods (e.g., pixel tracking) and find peak displacement values of ~16 m, which commonly exceed the on-fault displacement magnitudes. Our preliminary observations suggest the following: (1) coseismic surface displacement typically increases with distance away from the surface trace of the fault (e.g., highest displacement values in the far field), (2) for certain locations along the fault rupture, as little as 50% of the coseismic displacement field occurred in the near-field; and (3) the magnitudes of individual displacements are inversely correlated to the width of the surface rupture zone (e.g., largest displacements where the fault zone is narrowest). This analysis highlights the importance of identifying field study sites spanning fault sections with narrow deformation zones in order to capture the entire deformation field. For regions of distributed deformation, these results would predict that geologic slip rate studies underestimate a fault's complete slip rate.
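Pixel tracking of the kind used here estimates displacement by finding the lag that maximizes cross-correlation between pre- and post-event image windows. A 1-D toy sketch of that search (real implementations correlate 2-D windows and refine to sub-pixel precision; the signals below are synthetic):

```python
def best_shift(ref, moved, max_lag):
    """Return the integer lag maximizing the mean product of overlapping samples."""
    best, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        pairs = [(ref[i], moved[i + lag])
                 for i in range(len(ref)) if 0 <= i + lag < len(moved)]
        if not pairs:
            continue
        score = sum(a * b for a, b in pairs) / len(pairs)
        best_score, best = max((best_score, best), (score, lag))
    return best

# A bright feature displaced right by two samples between acquisitions
ref   = [0, 1, 4, 9, 4, 1, 0, 0, 0, 0]
moved = [0, 0, 0, 1, 4, 9, 4, 1, 0, 0]
print(best_shift(ref, moved, max_lag=4))  # 2
```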
Liya Thomas; R. Edward Thomas
2011-01-01
We have developed an automated defect detection system and a state-of-the-art Graphic User Interface (GUI) for hardwood logs. The algorithm identifies defects at least 0.5 inch high and at least 3 inches in diameter on barked hardwood log and stem surfaces. To summarize defect features and to build a knowledge base, hundreds of defects were measured, photographed, and...
Data-Driven Surface Traversability Analysis for Mars 2020 Landing Site Selection
NASA Technical Reports Server (NTRS)
Ono, Masahiro; Rothrock, Brandon; Almeida, Eduardo; Ansar, Adnan; Otero, Richard; Huertas, Andres; Heverly, Matthew
2015-01-01
The objective of this paper is three-fold: 1) to describe the engineering challenges in the surface mobility of the Mars 2020 Rover mission that are considered in the landing site selection process, 2) to introduce new automated traversability analysis capabilities, and 3) to present the preliminary analysis results for top candidate landing sites. The analysis capabilities presented in this paper include automated terrain classification, automated rock detection, digital elevation model (DEM) generation, and multi-ROI (region of interest) route planning. These analysis capabilities make it possible to fully utilize the vast volume of high-resolution orbiter imagery, quantitatively evaluate surface mobility requirements for each candidate site, and remove subjectivity from the comparison between sites in terms of engineering considerations. The analysis results supported the discussion in the Second Landing Site Workshop held in August 2015, which resulted in selecting eight candidate sites that will be considered in the third workshop.
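One building block of such traversability analysis is a slope map derived from the DEM: cells steeper than the rover's limit are flagged untraversable. A simplified central-difference sketch; the grid values and the 25° threshold are illustrative, not mission parameters:

```python
import math

def slope_deg(dem, cell):
    """Maximum-gradient slope in degrees at each interior cell of a DEM grid."""
    rows, cols = len(dem), len(dem[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            dzdx = (dem[r][c + 1] - dem[r][c - 1]) / (2 * cell)   # east-west gradient
            dzdy = (dem[r + 1][c] - dem[r - 1][c]) / (2 * cell)   # north-south gradient
            out[r][c] = math.degrees(math.atan(math.hypot(dzdx, dzdy)))
    return out

# Tiny synthetic DEM (elevations in meters, 1 m cells)
dem = [[0, 0, 0, 0],
       [0, 1, 2, 0],
       [0, 1, 2, 0],
       [0, 0, 0, 0]]
slopes = slope_deg(dem, cell=1.0)
traversable = [[s <= 25.0 for s in row] for row in slopes]
```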
Measurement of surface microtopography
NASA Technical Reports Server (NTRS)
Wall, S. D.; Farr, T. G.; Muller, J.-P.; Lewis, P.; Leberl, F. W.
1991-01-01
Acquisition of ground truth data for use in microwave interaction modeling requires measurement of surface roughness sampled at intervals comparable to a fraction of the microwave wavelength and extensive enough to adequately represent the statistics of a surface unit. Sub-centimetric measurement accuracy is thus required over large areas, and existing techniques are usually inadequate. A technique is discussed for acquiring the necessary photogrammetric data using twin film cameras mounted on a helicopter. In an attempt to eliminate tedious data reduction, an automated technique was applied to the helicopter photographs, and results were compared to those produced by conventional stereogrammetry. Derived root-mean-square (RMS) roughness for the same stereo-pair was 7.5 cm for the automated technique versus 6.5 cm for the manual method. The principal source of error is probably vegetation in the scene, which affects the automated technique but is ignored by a human operator.
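RMS roughness of the kind quoted (7.5 cm vs. 6.5 cm) is the root-mean-square deviation of surface heights about their mean. A minimal sketch on a hypothetical elevation profile:

```python
import math

def rms_roughness(heights):
    """Root-mean-square deviation of surface heights about their mean."""
    mean = sum(heights) / len(heights)
    return math.sqrt(sum((h - mean) ** 2 for h in heights) / len(heights))

# Hypothetical elevation profile (cm) sampled along a transect
profile = [0.0, 1.2, -0.8, 2.1, -1.5, 0.4]
print(round(rms_roughness(profile), 2))
```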
A personal rapid transit/airport automated people mover comparison.
DOT National Transportation Integrated Search
2008-01-01
Airport automated people movers (AAPM) typically consist of driverless trains with up to about four cars, each capable of carrying 20 to 100 passengers who are mostly standing. They have been successfully used for surface transportation in airport...
Automated protocols for spaceborne sub-meter resolution "Big Data" products for Earth Science
NASA Astrophysics Data System (ADS)
Neigh, C. S. R.; Carroll, M.; Montesano, P.; Slayback, D. A.; Wooten, M.; Lyapustin, A.; Shean, D. E.; Alexandrov, O.; Macander, M. J.; Tucker, C. J.
2017-12-01
The volume of available remotely sensed data has grown to exceed petabytes per year, while the costs of data, storage systems and compute power have all dropped exponentially. This has opened the door for "Big Data" processing systems with high-end computing (HEC) such as the Google Earth Engine, NASA Earth Exchange (NEX), and NASA Center for Climate Simulation (NCCS). At the same time, commercial very high-resolution (VHR) satellites have grown into a constellation with global repeat coverage that can support existing NASA Earth observing missions with stereo and super-spectral capabilities. Through agreements with the National Geospatial-Intelligence Agency, NASA Goddard Space Flight Center is acquiring petabytes of global sub-meter to 4 meter resolution imagery from the WorldView-1/2/3, QuickBird-2, GeoEye-1 and IKONOS-2 satellites. These data come at no direct cost and are valuable for enhancing Earth observation research that supports US government interests. We are currently developing automated protocols for generating VHR products to support NASA's Earth observing missions. These include two primary foci: 1) on-demand VHR 1/2° ortho mosaics - process VHR to surface reflectance, orthorectify and co-register multi-temporal 2 m multispectral imagery compiled as user-defined regional mosaics. This will provide an easy-access dataset to investigate biodiversity, tree canopy closure, surface water fraction, and cropped area for smallholder agriculture; and 2) on-demand VHR digital elevation models (DEMs) - process stereo VHR to extract VHR DEMs with the NASA Ames Stereo Pipeline. This will benefit Earth surface studies of the cryosphere (glacier mass balance, flow rates and snow depth), hydrology (lake/water body levels, landslides, subsidence) and biosphere (forest structure, canopy height/cover), among others. Recent examples of products used in NASA Earth Science projects will be provided.
This HEC capability could help surmount prior spatial-temporal limitations while providing broad benefits to Earth Science.
NASA Technical Reports Server (NTRS)
Engelland, Shawn A.; Capps, Alan
2011-01-01
Current aircraft departure release times are based on manual estimates of aircraft takeoff times. Uncertainty in takeoff time estimates may result in missed opportunities to merge into constrained en route streams and lead to lost throughput. However, technology exists to improve takeoff time estimates by using the aircraft surface trajectory predictions that enable air traffic control tower (ATCT) decision support tools. NASA's Precision Departure Release Capability (PDRC) is designed to use automated surface trajectory-based takeoff time estimates to improve en route tactical departure scheduling. This is accomplished by integrating an ATCT decision support tool with an en route tactical departure scheduling decision support tool. The PDRC concept and prototype software have been developed, and an initial test was completed at air traffic control facilities in Dallas/Fort Worth. This paper describes the PDRC operational concept, system design, and initial observations.
Awad, Joseph; Owrangi, Amir; Villemaire, Lauren; O'Riordan, Elaine; Parraga, Grace; Fenster, Aaron
2012-02-01
Manual segmentation of lung tumors is observer dependent and time-consuming but an important component of radiology and radiation oncology workflow. The objective of this study was to generate an automated lung tumor measurement tool for segmentation of pulmonary metastatic tumors from x-ray computed tomography (CT) images to improve reproducibility and decrease the time required to segment tumor boundaries. The authors developed an automated lung tumor segmentation algorithm for volumetric image analysis of chest CT images using shape constrained Otsu multithresholding (SCOMT) and sparse field active surface (SFAS) algorithms. The observer was required to select the tumor center and the SCOMT algorithm subsequently created an initial surface that was deformed using level set SFAS to minimize the total energy consisting of mean separation, edge, partial volume, rolling, distribution, background, shape, volume, smoothness, and curvature energies. The proposed segmentation algorithm was compared to manual segmentation whereby 21 tumors were evaluated using one-dimensional (1D) response evaluation criteria in solid tumors (RECIST), two-dimensional (2D) World Health Organization (WHO), and 3D volume measurements. Linear regression goodness-of-fit measures (r² = 0.63, p < 0.0001; r² = 0.87, p < 0.0001; and r² = 0.96, p < 0.0001), and Pearson correlation coefficients (r = 0.79, p < 0.0001; r = 0.93, p < 0.0001; and r = 0.98, p < 0.0001) for 1D, 2D, and 3D measurements, respectively, showed significant correlations between manual and algorithm results. Intra-observer intraclass correlation coefficients (ICC) demonstrated high reproducibility for algorithm (0.989-0.995, 0.996-0.997, and 0.999-0.999) and manual measurements (0.975-0.993, 0.985-0.993, and 0.980-0.992) for 1D, 2D, and 3D measurements, respectively.
The intra-observer coefficient of variation (CV%) was low for algorithm (3.09%-4.67%, 4.85%-5.84%, and 5.65%-5.88%) and manual observers (4.20%-6.61%, 8.14%-9.57%, and 14.57%-21.61%) for 1D, 2D, and 3D measurements, respectively. The authors developed an automated segmentation algorithm requiring only that the operator select the tumor to measure pulmonary metastatic tumors in 1D, 2D, and 3D. Algorithm and manual measurements were significantly correlated. Since the algorithm segmentation involves selection of a single seed point, it resulted in reduced intra-observer variability and decreased time for making the measurements.
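The reproducibility figure reported here, the coefficient of variation (CV%), is simply the sample standard deviation of repeated measurements expressed as a percentage of their mean. A minimal sketch; the repeated diameters are hypothetical, not study data:

```python
import statistics

def cv_percent(repeats):
    """Intra-observer coefficient of variation, in percent."""
    return 100.0 * statistics.stdev(repeats) / statistics.mean(repeats)

# Hypothetical repeated 1D (RECIST) diameters of one tumor, in mm
print(round(cv_percent([20.1, 19.4, 20.5]), 2))  # 2.78
```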
Automated delay estimation at signalized intersections : phase I concept and algorithm development.
DOT National Transportation Integrated Search
2011-07-01
Currently there are several methods to measure the performance of surface streets, but their capabilities in dynamically estimating vehicle delay are limited. The objective of this research is to develop a method to automate traffic delay estimation ...
Application of automation and robotics to lunar surface human exploration operations
NASA Technical Reports Server (NTRS)
Woodcock, Gordon R.; Sherwood, Brent; Buddington, Patricia A.; Bares, Leona C.; Folsom, Rolfe; Mah, Robert; Lousma, Jack
1990-01-01
Major results of a study applying automation and robotics to lunar surface base buildup and operations concepts are reported. The study developed a reference base scenario with specific goals, equipment concepts, robot concepts, activity schedules and buildup manifests. It examined crew roles, contingency cases and system reliability, and proposed a set of technologies appropriate and necessary for effective lunar operations. This paper refers readers to four companion papers for quantitative details where appropriate.
Chest wall segmentation in automated 3D breast ultrasound scans.
Tan, Tao; Platel, Bram; Mann, Ritse M; Huisman, Henkjan; Karssemeijer, Nico
2013-12-01
In this paper, we present an automatic method to segment the chest wall in automated 3D breast ultrasound images. Determining the location of the chest wall in automated 3D breast ultrasound images is necessary in computer-aided detection systems to remove automatically detected cancer candidates beyond the chest wall and it can be of great help for inter- and intra-modal image registration. We show that the visible part of the chest wall in an automated 3D breast ultrasound image can be accurately modeled by a cylinder. We fit the surface of our cylinder model to a set of automatically detected rib-surface points. The detection of the rib-surface points is done by a classifier using features representing local image intensity patterns and presence of rib shadows. Due to attenuation of the ultrasound signal, a clear shadow is visible behind the ribs. Evaluation of our segmentation method is done by computing the distance of manually annotated rib points to the surface of the automatically detected chest wall. We examined the performance on images obtained with the two most common 3D breast ultrasound devices in the market. In a dataset of 142 images, the average mean distance of the annotated points to the segmented chest wall was 5.59 ± 3.08 mm. Copyright © 2012 Elsevier B.V. All rights reserved.
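Fitting a cylinder to rib-surface points reduces, within each image cross-section, to fitting a circle to detected points. A dependency-free least-squares (Kåsa) sketch, with a hand-rolled 3×3 solver; the sample points are synthetic, not ultrasound detections:

```python
def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 linear system."""
    M = [row[:] + [rhs] for row, rhs in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for k in range(col, 4):
                M[r][k] -= f * M[col][k]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][k] * x[k] for k in range(r + 1, 3))) / M[r][r]
    return x

def fit_circle(pts):
    """Kasa fit: model x^2 + y^2 = 2ax + 2by + c, so r = sqrt(c + a^2 + b^2)."""
    n = float(len(pts))
    sxx = sxy = syy = sx = sy = sdx = sdy = sd = 0.0
    for x, y in pts:
        d = x * x + y * y
        sxx += x * x; sxy += x * y; syy += y * y
        sx += x; sy += y
        sdx += d * x; sdy += d * y; sd += d
    a, b, c = solve3([[2 * sxx, 2 * sxy, sx],
                      [2 * sxy, 2 * syy, sy],
                      [2 * sx, 2 * sy, n]], [sdx, sdy, sd])
    return (a, b), (c + a * a + b * b) ** 0.5

# Synthetic rib-surface points lying on a circle centered at (2, 3), radius 5
pts = [(7, 3), (2, 8), (-3, 3), (2, -2), (5, 7), (6, 6)]
(cx, cy), r = fit_circle(pts)
```

Stacking such per-slice fits (or fitting all 3-D points at once) yields the cylinder axis and radius used to model the chest wall.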
In-situ thermography of automated fiber placement parts
NASA Astrophysics Data System (ADS)
Gregory, Elizabeth D.; Juarez, Peter D.
2018-04-01
Automated fiber placement (AFP) provides precision and repeatable manufacturing of both simple and complex geometry composite parts. However, AFP also introduces the possibility for unique flaws such as overlapping tows, gaps between tows, tow twists, lack of layer adhesion and foreign object debris. These types of flaws can all result in a significant loss of performance in the final part. The current inspection method for these flaws is a costly and time intensive visual inspection of each ply layer. This work describes some initial efforts to incorporate thermal inspection on the AFP head and analysis of the data to identify the previously mentioned flaws. Previous bench-top laboratory experiments demonstrated that laps, gaps, and twists were identified from a thermal image. The AFP head uses an on-board lamp to preheat the surface of the part during layup to increase ply consolidation. The preheated surface is used as a thermal source to observe the state of the new material after compaction. We will present data collected with the Integrated Structural Assembly of Advanced Composites (ISAAC) AFP machine at Langley Research Center showing that changes to the temperature profile are sufficient for identifying all types of flaws.
Manta Matcher: automated photographic identification of manta rays using keypoint features.
Town, Christopher; Marshall, Andrea; Sethasathien, Nutthaporn
2013-07-01
For species which bear unique markings, such as natural spot patterning, field work has become increasingly more reliant on visual identification to recognize and catalog particular specimens or to monitor individuals within populations. While many species of interest exhibit characteristic markings that in principle allow individuals to be identified from photographs, scientists are often faced with the task of matching observations against databases of hundreds or thousands of images. We present a novel technique for automated identification of manta rays (Manta alfredi and Manta birostris) by means of a pattern-matching algorithm applied to images of their ventral surface area. Automated visual identification has recently been developed for several species. However, such methods are typically limited to animals that can be photographed above water, or whose markings exhibit high contrast and appear in regular constellations. While manta rays bear natural patterning across their ventral surface, these patterns vary greatly in their size, shape, contrast, and spatial distribution. Our method is the first to have proven successful at achieving high matching accuracies on a large corpus of manta ray images taken under challenging underwater conditions. Our method is based on automated extraction and matching of keypoint features using the Scale-Invariant Feature Transform (SIFT) algorithm. In order to cope with the considerable variation in quality of underwater photographs, we also incorporate preprocessing and image enhancement steps. Furthermore, we use a novel pattern-matching approach that results in better accuracy than the standard SIFT approach and other alternative methods. We present quantitative evaluation results on a data set of 720 images of manta rays taken under widely different conditions. We describe a novel automated pattern representation and matching method that can be used to identify individual manta rays from photographs. 
The method has been incorporated into a website (mantamatcher.org) which will serve as a global resource for ecological and conservation research. It will allow researchers to manage and track sightings data to establish important life-history parameters as well as determine other ecological data such as abundance, range, movement patterns, and structure of manta ray populations across the world.
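SIFT-style matching, as used above, pairs each query keypoint descriptor with its nearest gallery descriptor and accepts the pair only when that distance is well below the second-nearest (Lowe's ratio test), which rejects ambiguous matches. A dependency-free sketch with toy 2-D descriptors (real SIFT descriptors are 128-dimensional, and Manta Matcher's pattern-matching refinements are not reproduced here):

```python
def match_descriptors(query, gallery, ratio=0.8):
    """Nearest-neighbor descriptor matching with Lowe's ratio test."""
    def dist(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5
    matches = []
    for qi, q in enumerate(query):
        ds = sorted((dist(q, g), gi) for gi, g in enumerate(gallery))
        # Accept only if the best match is clearly better than the runner-up
        if len(ds) > 1 and ds[0][0] < ratio * ds[1][0]:
            matches.append((qi, ds[0][1]))
    return matches

print(match_descriptors([(0, 0), (5, 5)], [(0.1, 0), (5, 5.1), (9, 9)]))
```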
Watershed identification of polygonal patterns in noisy SAR images.
Moreels, Pierre; Smrekar, Suzanne E
2003-01-01
This paper describes a new approach to pattern recognition in synthetic aperture radar (SAR) images. A visual analysis of the images provided by NASA's Magellan mission to Venus has revealed a number of zones showing polygonal-shaped faults on the surface of the planet. The goal of the paper is to provide a method to automate the identification of such zones. The high level of noise in SAR images and its multiplicative nature make automated image analysis difficult and conventional edge detectors, like those based on gradient images, inefficient. We present a scheme based on an improved watershed algorithm and a two-scale analysis. The method extracts potential edges in the SAR image, analyzes the patterns obtained, and decides whether or not the image contains a "polygon area". This scheme can also be applied to other SAR or visual images, for instance in observation of Mars and Jupiter's satellite Europa.
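The multiplicative noise noted above is commonly handled homomorphically: a log transform converts speckle into additive noise that ordinary smoothing can suppress, and the result is exponentiated back (equivalent to a local geometric mean). A 1-D sketch of the idea, not the paper's watershed scheme; window size and pixel values are illustrative:

```python
import math

def homomorphic_smooth(row, k=3):
    """Suppress multiplicative speckle: log -> moving average -> exp."""
    logs = [math.log(v) for v in row]
    half = k // 2
    smoothed = []
    for i in range(len(logs)):
        window = logs[max(0, i - half): i + half + 1]
        smoothed.append(sum(window) / len(window))
    return [math.exp(v) for v in smoothed]

# Speckled scan line over a constant-backscatter surface (illustrative values)
print(homomorphic_smooth([4.2, 3.7, 4.5, 3.9, 4.1]))
```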
Autonomous Exploration for Gathering Increased Science
NASA Technical Reports Server (NTRS)
Bornstein, Benjamin J.; Castano, Rebecca; Estlin, Tara A.; Gaines, Daniel M.; Anderson, Robert C.; Thompson, David R.; DeGranville, Charles K.; Chien, Steve A.; Tang, Benyang; Burl, Michael C.;
2010-01-01
The Autonomous Exploration for Gathering Increased Science System (AEGIS) provides automated targeting for remote sensing instruments on the Mars Exploration Rover (MER) mission, which at the time of this reporting has had two rovers exploring the surface of Mars (see figure). Currently, targets for rover remote-sensing instruments must be selected manually based on imagery already on the ground with the operations team. AEGIS enables the rover flight software to analyze imagery onboard in order to autonomously select and sequence targeted remote-sensing observations in an opportunistic fashion. In particular, this technology will be used to automatically acquire sub-framed, high-resolution, targeted images taken with the MER panoramic cameras. This software provides: 1) Automatic detection of terrain features in rover camera images, 2) Feature extraction for detected terrain targets, 3) Prioritization of terrain targets based on a scientist target feature set, and 4) Automated re-targeting of rover remote-sensing instruments at the highest priority target.
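The prioritization step (3 above) can be as simple as ranking detected targets by a weighted sum of their extracted features. A sketch of that idea; the feature names and weights are hypothetical, not the AEGIS scientist-defined target feature set:

```python
def prioritize_targets(targets, weights):
    """Rank detected targets best-first by a weighted sum of feature values."""
    def score(t):
        return sum(weights.get(k, 0.0) * v for k, v in t["features"].items())
    return sorted(targets, key=score, reverse=True)

# Hypothetical terrain targets with extracted features
targets = [
    {"name": "rock_a", "features": {"size": 1.0, "albedo": 0.5}},
    {"name": "rock_b", "features": {"size": 2.0, "albedo": 0.1}},
]
ranked = prioritize_targets(targets, {"size": 1.0, "albedo": 2.0})
print([t["name"] for t in ranked])
```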
An Automated Method for Landmark Identification and Finite-Element Modeling of the Lumbar Spine.
Campbell, Julius Quinn; Petrella, Anthony J
2015-11-01
The purpose of this study was to develop a method for the automated creation of finite-element models of the lumbar spine. Custom scripts were written to extract bone landmarks of lumbar vertebrae and assemble L1-L5 finite-element models. End-plate borders, ligament attachment points, and facet surfaces were identified. Landmarks were identified to maintain correspondence between meshes for later use in statistical shape modeling. 90 lumbar vertebrae were processed, creating 18 subject-specific finite-element models. Finite-element model surfaces and ligament attachment points were reproduced within 1e-5 mm of the bone surface, including the critical contact surfaces of the facets. Element quality exceeded specifications in 97% of elements for the 18 models created. The current method is capable of producing subject-specific finite-element models of the lumbar spine with good accuracy, quality, and robustness. The automated methods developed represent an advancement in the state of the art of subject-specific lumbar spine modeling, to a scale not possible with prior manual and semiautomated methods.
Automated 3D closed surface segmentation: application to vertebral body segmentation in CT images.
Liu, Shuang; Xie, Yiting; Reeves, Anthony P
2016-05-01
A fully automated segmentation algorithm, progressive surface resolution (PSR), is presented in this paper to determine the closed surface of approximately convex blob-like structures that are common in biomedical imaging. The PSR algorithm was applied to the cortical surface segmentation of 460 vertebral bodies on 46 low-dose chest CT images, which can be potentially used for automated bone mineral density measurement and compression fracture detection. The target surface is realized by a closed triangular mesh, which thereby guarantees the enclosure. The surface vertices of the triangular mesh representation are constrained along radial trajectories that are uniformly distributed in 3D angle space. The segmentation is accomplished by determining for each radial trajectory the location of its intersection with the target surface. The surface is first initialized based on an input high confidence boundary image and then resolved progressively based on a dynamic attraction map in an order of decreasing degree of evidence regarding the target surface location. In the visual evaluation, the algorithm achieved acceptable segmentation for 99.35% of vertebral bodies. Quantitative evaluation was performed on 46 vertebral bodies and achieved an overall mean Dice coefficient of 0.939 (max 0.957, min 0.906, standard deviation 0.011) using manual annotations as the ground truth. Both visual and quantitative evaluations demonstrate encouraging performance of the PSR algorithm. This novel surface resolution strategy provides uniform angular resolution for the segmented surface with computation complexity and runtime that are linearly constrained by the total number of vertices of the triangular mesh representation.
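A closed surface parameterized this way stores one vertex per radial trajectory, with trajectories spread uniformly in angle; each vertex is just a radius along its ray from a center point. A toy sketch of generating such a sampling for a star-shaped surface, not the PSR algorithm itself; the ray counts and constant-radius surface are illustrative:

```python
import math

def radial_surface(center, radius_of, n_theta=8, n_phi=4):
    """One vertex per radial ray, rays uniform in spherical angle.
    radius_of(direction) supplies the surface distance along each ray."""
    cx, cy, cz = center
    pts = []
    for i in range(n_theta):
        theta = 2 * math.pi * i / n_theta
        for j in range(1, n_phi):                 # interior rings; poles skipped
            phi = math.pi * j / n_phi
            d = (math.sin(phi) * math.cos(theta),
                 math.sin(phi) * math.sin(theta),
                 math.cos(phi))
            r = radius_of(d)
            pts.append((cx + r * d[0], cy + r * d[1], cz + r * d[2]))
    return pts

# A sphere of radius 2 about (1, 2, 3) as the simplest closed surface
verts = radial_surface((1.0, 2.0, 3.0), lambda d: 2.0)
```

Because the vertex count is fixed by the angular sampling, refining the surface only ever updates radii, which is what keeps runtime linear in the number of mesh vertices.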
Automation of Space Inventory Management
NASA Technical Reports Server (NTRS)
Fink, Patrick W.; Ngo, Phong; Wagner, Raymond; Barton, Richard; Gifford, Kevin
2009-01-01
This viewgraph presentation describes the utilization of automated space-based inventory management through handheld RFID readers and BioNet Middleware. The contents include: 1) Space-Based Inventory Management; 2) Real-Time RFID Location and Tracking; 3) Surface Acoustic Wave (SAW) RFID; and 4) BioNet Middleware.
DOT National Transportation Integrated Search
2018-01-07
Connected and automated vehicles (CAV) are poised to transform surface transportation systems in the United States. Near-term CAV technologies like cooperative adaptive cruise control (CACC) have the potential to deliver energy efficiency and air qua...
NASA Astrophysics Data System (ADS)
Vasuki, Yathunanthan; Holden, Eun-Jung; Kovesi, Peter; Micklethwaite, Steven
2014-08-01
Recent advances in data acquisition technologies, such as Unmanned Aerial Vehicles (UAVs), have led to a growing interest in capturing high-resolution rock surface images. However, due to the large volumes of data that can be captured in a short flight, efficient analysis of this data brings new challenges, especially the time it takes to digitise maps and extract orientation data. We outline a semi-automated method that allows efficient mapping of geological faults using photogrammetric data of rock surfaces, which was generated from aerial photographs collected by a UAV. Our method harnesses advanced automated image analysis techniques and human data interaction to rapidly map structures and then calculate their dip and dip directions. Geological structures (faults, joints and fractures) are first detected from the primary photographic dataset and the equivalent three dimensional (3D) structures are then identified within a 3D surface model generated by structure from motion (SfM). From this information the location, dip and dip direction of the geological structures are calculated. A structure map generated by our semi-automated method obtained a recall rate of 79.8% when compared against a fault map produced using expert manual digitising and interpretation methods. The semi-automated structure map was produced in 10 min whereas the manual method took approximately 7 h. In addition, the dip and dip direction calculation, using our automated method, shows a mean±standard error of 1.9°±2.2° and 4.4°±2.6° respectively with field measurements. This shows the potential of using our semi-automated method for accurate and efficient mapping of geological structures, particularly from remote, inaccessible or hazardous sites.
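The dip and dip direction of each mapped structure can be derived from the plane fitted to its 3D trace in the surface model. An illustrative sketch (not the authors' implementation) using three non-collinear points on a plane, assuming x = east, y = north, z = up:

```python
import numpy as np

def dip_and_dip_direction(p1, p2, p3):
    """Dip (degrees from horizontal) and dip direction (degrees clockwise
    from north) of the plane through three points; x=east, y=north, z=up."""
    v1 = np.asarray(p2, float) - np.asarray(p1, float)
    v2 = np.asarray(p3, float) - np.asarray(p1, float)
    n = np.cross(v1, v2)
    if n[2] < 0:                      # orient the normal upward
        n = -n
    nx, ny, nz = n / np.linalg.norm(n)
    dip = np.degrees(np.arccos(nz))   # angle between plane and horizontal
    # Azimuth of the upward normal's horizontal projection = dip direction.
    dip_direction = np.degrees(np.arctan2(nx, ny)) % 360.0
    return dip, dip_direction

# A plane dipping 45 degrees toward the east (azimuth 090):
dip, ddir = dip_and_dip_direction((0, 0, 0), (0, 1, 0), (1, 0, -1))
print(round(dip, 1), round(ddir, 1))  # → 45.0 90.0
```

In practice the plane would be least-squares fitted to many digitised points along the structure rather than to three, but the dip/dip-direction conversion from the plane normal is the same.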
NASA Technical Reports Server (NTRS)
Zimmerman, W. F.
1994-01-01
The continued high cost and risk of placing astronauts in space has placed considerable burden on NASA to cut costs and consider other means of achieving mission goals both effectively and safely. Additionally, future science missions which might place a tremendous burden on Shuttle availability, or require extended vehicle duty cycles on the Lunar surface and Mars surface, might preclude the presence of astronauts altogether.
Automated surface photometry for the Coma Cluster galaxies: The catalog
NASA Technical Reports Server (NTRS)
Doi, M.; Fukugita, M.; Okamura, S.; Tarusawa, K.
1995-01-01
A homogeneous photometry catalog is presented for 450 galaxies with B(sub 25.5) less than or equal to 16 mag located in the 9.8 deg x 9.8 deg region centered on the Coma Cluster. The catalog is based on photographic photometry using automated surface photometry software for data reduction, applied to B-band Schmidt plates. The catalog provides accurate positions, isophotal and total magnitudes, major and minor axes, and a few other photometric parameters, including rudimentary morphology (early or late type).
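An isophotal magnitude of the kind tabulated here sums the flux of all pixels brighter than the limiting isophote (B = 25.5 mag/arcsec² in this catalog) and converts the total to a magnitude. A minimal sketch, in which the zero point and pixel area are hypothetical values for illustration, not parameters of the survey:

```python
import numpy as np

ZP = 30.0        # hypothetical photometric zero point (mag)
PIX_AREA = 1.0   # hypothetical pixel area (arcsec^2)

def isophotal_magnitude(image, mu_limit=25.5):
    """Isophotal magnitude: total flux of all pixels brighter than the
    limiting surface brightness mu_limit (mag per arcsec^2)."""
    # Flux per pixel corresponding to the limiting isophote.
    f_limit = PIX_AREA * 10.0 ** (-0.4 * (mu_limit - ZP))
    flux = image[image >= f_limit].sum()
    return ZP - 2.5 * np.log10(flux)

# Four bright pixels (flux 100 each) lie above the isophote; the faint
# pixel (flux 10) falls below the cut and is excluded.
img = np.zeros((5, 5))
img[2, 1:3] = 100.0
img[3, 1:3] = 100.0
img[0, 0] = 10.0
m_iso = isophotal_magnitude(img)
```

Total magnitudes additionally extrapolate the galaxy's light profile beyond the isophote, which is why they are listed separately in the catalog.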
NASA Astrophysics Data System (ADS)
Hoon, Stephen R.; Felde, Vincent J. M. N. L.; Drahorad, Sylvie L.; Felix-Henningsen, Peter
2015-04-01
Soil penetrometers are used routinely to determine the shear strength of soils and deformable sediments, both at the surface and throughout a depth profile, in disciplines as diverse as soil science, agriculture, geoengineering and alpine avalanche-safety (e.g. Grunwald et al. 2001, Van Herwijnen et al. 2009). Generically, penetrometers comprise two principal components: an advancing probe, and a transducer; the latter to measure the pressure or force required to cause the probe to penetrate or advance through the soil or sediment. The force transducer employed to determine the pressure can range, for example, from a simple mechanical spring gauge to an automatically data-logged electronic transducer. Automated computer control of the penetrometer step size and probe advance rate enables precise measurements to be made down to a resolution of tens of microns (e.g. the automated electronic micropenetrometer (EMP) described by Drahorad 2012). Here we discuss the determination, modelling and interpretation of biologically crusted dryland soil sub-surface structures using automated micropenetrometry. We outline a model enabling the interpretation of depth-dependent penetration resistance (PR) profiles and their spatial differentials using the model equations σ(z) = σ_c0 + Σ_{n=1}^{N} [σ_n(z) + a_n z + b_n z²] and dσ/dz = Σ_{n=1}^{N} [dσ_n(z)/dz + Fr_n(z)], where σ_c0 and σ_n are the plastic deformation stresses for the surface and the nth soil structure (e.g. soil crust, layer, horizon or void) respectively, and Fr_n(z)dz is the frictional work done per unit volume by sliding the penetrometer rod an incremental distance dz through the nth layer. Both σ_n(z) and Fr_n(z) are related to soil structure. They determine the form of σ(z) measured by the EMP transducer.
We have applied this method to both artificial calibration soils in the laboratory, and in-situ field studies. In particular, we discuss the nature and detection of surface and buried (fossil) subsurface Biological Soil Crusts (BSCs), voids, macroscopic particles and compositional layers. The strength of surface BSCs and the occurrence of buried BSCs and layers has been detected at sub-millimetre scales to depths of 40 mm. Our measurements and field observations of PR show the importance of morphological layering to overall BSC functions (Felde et al. 2015). We also discuss the effect of penetrometer shaft and probe-tip profiles upon the theoretical and experimental curves, EMP resolution and reproducibility, demonstrating how the model enables voids, buried biological soil crusts, exotic particles, soil horizons and layers to be distinguished one from another. This represents a potentially important contribution to advancing understanding of the relationship between BSCs and dryland soil structure. References: Drahorad S.L., Felix-Henningsen P. (2012) An electronic micropenetrometer (EMP) for field measurements of biological soil crust stability, J. Plant Nutr. Soil Sci., 175, 519-520. Felde V.J.M.N.L., Drahorad S.L., Felix-Henningsen P., Hoon S.R. (2015) Ongoing oversanding induces biological soil crust layering - a new approach for BSC structure elucidation determined from high resolution penetration resistance data (submitted). Grunwald S., Rooney D.J., McSweeney K., Lowery B. (2001) Development of pedotransfer functions for a profile cone penetrometer, Geoderma, 100, 25-47. Van Herwijnen A., Bellaire S., Schweizer J. (2009) Comparison of micro-structural snowpack parameters derived from penetration resistance measurements with fracture character observations from compression tests, Cold Regions Sci. & Technol., 59, 193-201.
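The layered PR model can be sketched numerically: each layer contributes a plastic deformation stress plus linear and quadratic depth terms, while pores appear as intervals of zero deformation stress. The layer parameters below are hypothetical, and the friction term Fr_n(z) is omitted for brevity:

```python
import numpy as np

def penetration_resistance(z_mm):
    """Model PR sigma(z) = sigma_c0 + sum over layers containing z of
    [sigma_n + a_n*z + b_n*z^2]; pores contribute zero stress.
    All layer parameters here are hypothetical illustration values."""
    sigma_c0 = 50.0                       # surface deformation stress (kPa)
    layers = [                            # (z_top, z_bot, sigma_n, a_n, b_n)
        (0.0, 2.0, 120.0, 5.0, 0.0),      # biological soil crust
        (2.0, 10.0, 30.0, 2.0, 0.1),      # underlying sand
    ]
    pores = [(5.0, 6.0)]                  # void between 5 and 6 mm depth
    if any(top <= z_mm < bot for top, bot in pores):
        return 0.0                        # pore: zero deformation stress
    sigma = sigma_c0
    for top, bot, s_n, a_n, b_n in layers:
        if top <= z_mm < bot:
            sigma += s_n + a_n * z_mm + b_n * z_mm ** 2
    return sigma

# A synthetic depth profile: the crust, the sand, and the pore are all
# visible as distinct segments of sigma(z).
profile = [penetration_resistance(z) for z in np.arange(0.0, 10.0, 0.5)]
```

In this toy profile the pore is distinguishable from a mere change of layer because sigma drops to zero rather than to a different nonzero baseline, which is the discrimination the model equations provide.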
Tests of Spectral Cloud Classification Using DMSP Fine Mode Satellite Data.
1980-06-02
processing techniques of potential value. Fourier spectral analysis was identified as the most promising technique to upgrade automated processing of...these measurements on the Earth's surface is 0.3 n mi. 3. Pickett, R.M., and Blackman, E.S. (1976) Automated Processing of Satellite Imagery Data at Air...and Pickett, R.M. (1977) Automated Processing of Satellite Imagery Data at the Air Force Global Weather Central: Demonstrations of Spectral Analysis
NASA Astrophysics Data System (ADS)
Smith, S. R.; Lopez, N.; Bourassa, M. A.; Rolph, J.; Briggs, K.
2012-12-01
The research vessel data center at the Florida State University routinely acquires, quality controls, and distributes underway surface meteorological and oceanographic observations from vessels. The activities of the center are coordinated by the Shipboard Automated Meteorological and Oceanographic System (SAMOS) initiative in partnership with the Rolling Deck to Repository (R2R) project. The data center evaluates the quality of the observations, collects essential metadata, provides data quality feedback to vessel operators, and ensures long-term data preservation at the National Oceanographic Data Center. A description of the SAMOS data stewardship protocols will be provided, including dynamic web tools that ensure users can select the highest quality observations from over 30 vessels presently recruited to the SAMOS initiative. Research vessels provide underway observations at high temporal frequency (1-min sampling interval) that include navigational (position, course, heading, and speed), meteorological (air temperature, humidity, wind, surface pressure, radiation, rainfall), and oceanographic (sea surface temperature and salinity) samples. Recruited vessels collect a high concentration of data within the U.S. continental shelf and also frequently operate well outside routine shipping lanes, capturing observations in extreme ocean environments (Southern Ocean, Arctic, South Atlantic and Pacific). The unique quality and sampling locations of research vessel observations, and their independence from many models and products (RV data are rarely distributed via normal marine weather reports), make them ideal for validation studies. We will present comparisons between research vessel observations and model estimates of the sea surface temperature and salinity in the Gulf of Mexico. The analysis reveals an underestimation of the freshwater input to the Gulf from rivers, resulting in an overestimation of near-coastal salinity in the model.
Additional comparisons between surface atmospheric products derived from satellite observations and the underway research vessel observations will be shown. The strengths and limitations of research observations for validation studies will be highlighted through these case studies.
NASA Astrophysics Data System (ADS)
Jones, M.; Longenecker, H. E., III
2017-12-01
The 2017 hurricane season brought the unprecedented landfall of three Category 4 hurricanes (Harvey, Irma and Maria). FEMA is responsible for coordinating the federal response and recovery efforts for large disasters such as these. FEMA depends on timely and accurate depth grids to estimate hazard exposure, model damage assessments, plan flight paths for imagery acquisition, and prioritize response efforts. To produce riverine or coastal depth grids based on observed flooding, the methodology requires peak crest water levels at stream gauges, tide gauges, high water marks, and best-available elevation data. Because peak crest data are not available until the apex of a flooding event, and high water marks may take field teams several weeks to collect after a large-scale event, final observed depth grids are not available to FEMA until several days after a flood has begun to subside. Within the last decade NOAA's National Weather Service (NWS) has implemented the Advanced Hydrologic Prediction Service (AHPS), a web-based suite of accurate forecast products that provide hydrograph forecasts at over 3,500 stream gauge locations across the United States. These forecasts have been newly implemented into an automated depth grid script tool, using predicted instead of observed water levels, allowing FEMA access to flood hazard information up to 3 days prior to a flooding event. Water depths are calculated from the AHPS predicted flood stages and are interpolated at 100-m spacing along NHD hydrolines within the basin of interest. A water surface elevation raster is generated from these water depths using an Inverse Distance Weighted interpolation. Then, elevation (USGS NED 30m) is subtracted from the water surface elevation raster so that the remaining values represent the depth of predicted flooding above the ground surface.
This automated process requires minimal user input and produced forecasted depth grids that were comparable to post-event observed depth grids and remote sensing-derived flood extents for the 2017 hurricane season. These newly available forecasted models were used for pre-event response planning and early estimated hazard exposure counts, allowing FEMA to plan for and stand up operations several days sooner than previously possible.
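The core depth-grid computation described here can be sketched as two steps: interpolate a water surface elevation raster from scattered stage points by inverse distance weighting, then subtract the ground elevation and keep only positive depths. The stage points and DEM below are hypothetical, and this is an illustration of the method, not FEMA's script:

```python
import numpy as np

def idw_surface(points, values, grid_x, grid_y, power=2.0):
    """Inverse-distance-weighted water surface elevation on a grid from
    scattered stage points (x, y) with water surface elevations `values`."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    num = np.zeros_like(gx, dtype=float)
    den = np.zeros_like(gx, dtype=float)
    for (px, py), v in zip(points, values):
        d = np.hypot(gx - px, gy - py)
        w = 1.0 / np.maximum(d, 1e-9) ** power   # avoid divide-by-zero
        num += w * v
        den += w
    return num / den

# Hypothetical stage points along a reach, over a flat DEM at 10 m.
points = [(0.0, 0.0), (100.0, 0.0)]
stages = [12.0, 11.0]                  # forecast water surface elevations (m)
x = np.linspace(0.0, 100.0, 11)
y = np.linspace(0.0, 10.0, 2)
dem = np.full((2, 11), 10.0)

wse = idw_surface(points, stages, x, y)
depth = np.maximum(wse - dem, 0.0)     # flood depth above the ground surface
```

Cells where the interpolated water surface falls below the terrain end up with zero depth, which is what restricts the grid to the predicted inundated area.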
Archuleta, Christy-Ann M.; Gonzales, Sophia L.; Maltby, David R.
2012-01-01
The U.S. Geological Survey (USGS), in cooperation with the Texas Commission on Environmental Quality, developed computer scripts and applications to automate the delineation of watershed boundaries and compute watershed characteristics for more than 3,000 surface-water-quality monitoring stations in Texas that were active during 2010. Microsoft Visual Basic applications were developed using ArcGIS ArcObjects to format the source input data required to delineate watershed boundaries. Several automated scripts and tools were developed or used to calculate watershed characteristics using Python, Microsoft Visual Basic, and the RivEX tool. Automated methods were augmented by the use of manual methods, including those done using ArcMap software. Watershed boundaries delineated for the monitoring stations are limited to the extent of the Subbasin boundaries in the USGS Watershed Boundary Dataset, which may not include the total watershed boundary from the monitoring station to the headwaters.
Assessment of Automated Analyses of Cell Migration on Flat and Nanostructured Surfaces
Grădinaru, Cristian; Łopacińska, Joanna M.; Huth, Johannes; Kestler, Hans A.; Flyvbjerg, Henrik; Mølhave, Kristian
2012-01-01
Motility studies of cells often rely on computer software that analyzes time-lapse recorded movies and establishes cell trajectories fully automatically. This raises the question of reproducibility of results, since different programs could yield significantly different results of such automated analysis. The fact that the segmentation routines of such programs are often challenged by nanostructured surfaces makes the question more pertinent. Here we illustrate how it is possible to track cells on bright field microscopy images with image analysis routines implemented in an open-source cell tracking program, PACT (Program for Automated Cell Tracking). We compare the automated motility analysis of three cell tracking programs, PACT, Autozell, and TLA, using the same movies as input for all three programs. We find that different programs track overlapping, but different, subsets of cells due to different segmentation methods. Unfortunately, population averages based on such different cell populations differ significantly in some cases. Thus, results obtained with one software package are not necessarily reproducible by other software. PMID:24688640
Automated inspection of solder joints for surface mount technology
NASA Technical Reports Server (NTRS)
Savage, Robert M.; Park, Hyun Soo; Fan, Mark S.
1993-01-01
Researchers at NASA/GSFC evaluated various automated inspection system (AIS) technologies using test boards with known defects in surface mount solder joints. These boards were complex and included almost every type of surface mount device typical of critical assemblies used for space flight applications. The technologies evaluated were X-ray radiography, X-ray laminography, ultrasonic imaging, optical imaging, laser imaging, and infrared inspection. Vendors representative of the different technologies inspected the test boards with their particular machines. The results of the evaluation showed limitations of AIS; none of the AIS technologies evaluated proved to meet all of the inspection criteria for use in high-reliability applications. It was found that certain inspection systems could supplement, but not replace, manual inspection for low-volume, high-reliability surface mount solder joints.
Developing an Automated Method for Detection of Operationally Relevant Ocean Fronts and Eddies
NASA Astrophysics Data System (ADS)
Rogers-Cotrone, J. D.; Cadden, D. D. H.; Rivera, P.; Wynn, L. L.
2016-02-01
Since the early 1990s, the U.S. Navy has utilized an observation-based process for identification of frontal systems and eddies. These Ocean Feature Assessments (OFA) rely on trained analysts to identify and position ocean features using satellite-observed sea surface temperatures. Meanwhile, as enhancements and expansion of the Navy's HYbrid Coordinate Ocean Model (HYCOM) and Regional Navy Coastal Ocean Model (RNCOM) domains have proceeded, the Naval Oceanographic Office (NAVO) has provided Tactical Oceanographic Feature Assessments (TOFA) that are based on data-validated model output but also rely on analyst identification of significant features. A recently completed project has migrated OFA production to the ArcGIS-based Acoustic Reach-back Cell Ocean Analysis Suite (ARCOAS), enabling use of additional observational datasets and significantly decreasing production time; however, it has highlighted inconsistencies inherent to this analyst-based identification process. Current efforts are focused on development of an automated method for detecting operationally significant fronts and eddies that integrates model output and observational data on a global scale. Previous attempts to employ techniques from the scientific community have been unable to meet the production tempo at NAVO. Thus, a system that incorporates existing techniques (Marr-Hildreth, Okubo-Weiss, etc.) with internally developed feature identification methods (from model-derived physical and acoustic properties) is required. Ongoing expansions to the ARCOAS toolset have shown promising early results.
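Of the techniques named above, the Okubo-Weiss criterion flags eddy cores as regions where relative vorticity dominates strain. A minimal sketch on gridded velocity fields (illustrative, not the NAVO implementation):

```python
import numpy as np

def okubo_weiss(u, v, dx=1.0, dy=1.0):
    """Okubo-Weiss parameter W = s_n^2 + s_s^2 - omega^2.
    W < 0 where rotation dominates strain (candidate eddy cores)."""
    du_dy, du_dx = np.gradient(u, dy, dx)   # axis 0 is y, axis 1 is x
    dv_dy, dv_dx = np.gradient(v, dy, dx)
    s_n = du_dx - dv_dy                     # normal strain
    s_s = dv_dx + du_dy                     # shear strain
    omega = dv_dx - du_dy                   # relative vorticity
    return s_n**2 + s_s**2 - omega**2

# Solid-body rotation (u = -y, v = x) is pure rotation, so W < 0 everywhere.
y, x = np.mgrid[-5:6, -5:6].astype(float)
w = okubo_weiss(-y, x)
print((w < 0).all())  # → True
```

Operational use would apply a threshold such as W < -0.2 times the standard deviation of W over the domain, then require the flagged region to exceed a minimum size before declaring an eddy.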
Futamura, Megumi; Sugama, Junko; Okuwa, Mayumi; Sanada, Hiromi; Tabata, Keiko
2008-12-01
This study objectively evaluated the degree of comfort in bedridden older adults using an air-cell mattress with an automated turning mechanism. The sample included 10 bedridden women with verbal communication difficulties. The high frequency (HF) components of heart rate variability, which reflect parasympathetic nervous activity, were compared for the manual and automated turning periods. No significant differences in the HF component were observed in 5 of the participants. Significant increases in the HF component associated with automated turning were observed in 3 participants; however, the two participants with the lowest body mass index values exhibited a significant reduction in the HF component during the automated turning period. The results revealed that comfort might not be disturbed during the automated turning period.
Automated Passive Capillary Lysimeters for Estimating Water Drainage in the Vadose Zone
NASA Astrophysics Data System (ADS)
Jabro, J.; Evans, R.
2009-04-01
In this study, we demonstrated and evaluated the performance and accuracy of automated PCAP lysimeters that we designed for in-situ continuous measurement and estimation of drainage water below the rootzone of a sugarbeet-potato-barley rotation under two irrigation frequencies. Twelve automated PCAPs with sampling surfaces 31 cm wide by 91 cm long, and 87 cm in height, were placed 90 cm below the soil surface in a Lihen sandy loam. Our state-of-the-art design incorporated Bluetooth wireless technology to enable an automated datalogger to transmit drainage water data every 15 minutes to a remote host, and had a greater efficiency than other types of lysimeters. It also offered a significantly larger coverage area (2700 cm2) than similarly designed vadose zone lysimeters. The cumulative manually extracted drainage water was compared with the cumulative volume of drainage water recorded by the datalogger from the tipping bucket using several statistical methods. Our results indicated that our automated PCAPs are accurate and provide a convenient means for estimating water drainage in the vadose zone without the need for costly and time-consuming support systems.
NASA Astrophysics Data System (ADS)
Miller, Steven D.; Bankert, Richard L.; Solbrig, Jeremy E.; Forsythe, John M.; Noh, Yoo-Jeong; Grasso, Lewis D.
2017-12-01
This paper describes a Dynamic Enhancement Background Reduction Algorithm (DEBRA) applicable to multispectral satellite imaging radiometers. DEBRA uses ancillary information about the clear-sky background to reduce false detections of atmospheric parameters in complex scenes. Applied here to the detection of lofted dust, DEBRA enlists a surface emissivity database coupled with a climatological database of surface temperature to approximate the clear-sky equivalent signal for selected infrared-based multispectral dust detection tests. This background allows for suppression of false alarms caused by land surface features while retaining some ability to detect dust above those problematic surfaces. The algorithm is applicable to both day and nighttime observations and enables weighted combinations of dust detection tests. The results are provided quantitatively, as a detection confidence factor [0, 1], but are also readily visualized as enhanced imagery. Utilizing the DEBRA confidence factor as a scaling factor in false color red/green/blue imagery enables depiction of the targeted parameter in the context of the local meteorology and topography. In this way, the method holds utility to both automated clients and human analysts alike. Examples of DEBRA performance from notable dust storms and comparisons against other detection methods and independent observations are presented.
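The two operations described here, weighted combination of detection tests into a confidence factor in [0, 1] and scaling of false color imagery by that factor, can be sketched as follows (an illustration of the idea, not the operational DEBRA code):

```python
import numpy as np

def combine_tests(tests, weights):
    """Weighted combination of per-pixel dust-test scores (each in [0, 1])
    into a single detection confidence factor in [0, 1]."""
    tests = np.asarray(tests, dtype=float)        # shape (n_tests, H, W)
    w = np.asarray(weights, dtype=float)
    conf = np.tensordot(w, tests, axes=1) / w.sum()
    return np.clip(conf, 0.0, 1.0)

def confidence_enhanced_rgb(rgb, confidence):
    """Scale an RGB image by the per-pixel confidence factor so that the
    targeted parameter stands out against the clear-sky background."""
    return np.clip(rgb * confidence[..., np.newaxis], 0.0, 1.0)

# Two hypothetical 2x2 test-score fields, equally weighted.
conf = combine_tests([[[1.0, 0.0], [0.5, 0.0]],
                      [[1.0, 1.0], [0.0, 0.0]]], [1.0, 1.0])
enhanced = confidence_enhanced_rgb(np.ones((2, 2, 3)), conf)
```

Because the output is a scalar confidence per pixel, the same product serves automated clients (threshold the factor) and human analysts (view the scaled imagery).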
NASA Astrophysics Data System (ADS)
Steiner, N.; McDonald, K. C.; Dinardo, S. J.; Miller, C. E.
2015-12-01
Arctic permafrost soils contain a vast amount of organic carbon that will be released into the atmosphere as carbon dioxide or methane when thawed. Surface to air greenhouse gas fluxes are largely dependent on such surface controls as the frozen/thawed state of the snow and soil. Satellite remote sensing is an important means to create continuous mapping of surface properties. Advances in the ability to determine soil and snow freeze/thaw timings from microwave frequency observations improves upon our ability to predict the response of carbon gas emission to warming through synthesis with in-situ observation, such as the 2012-2015 Carbon in Arctic Reservoir Vulnerability Experiment (CARVE). Surface freeze/thaw or snowmelt timings are often derived using a constant or spatially/temporally variable threshold applied to time-series observations. Alternately, time-series singularity classifiers aim to detect discontinuous changes, or "edges", in time-series data similar to those that occur from the large contrast in dielectric constant during the freezing or thaw of soil or snow. We use multi-scale analysis of continuous wavelet transform spectral gradient brightness temperatures from various channel combinations of passive microwave radiometers, Advanced Microwave Scanning Radiometer (AMSR-E, AMSR2) and Special Sensor Microwave Imager (SSM/I F17) gridded at a 10 km posting with resolution proportional to the observational footprint. Channel combinations presented here aim to illustrate and differentiate timings of "edges" from transitions in surface water related to various landscape components (e.g. snow-melt, soil-thaw). To support an understanding of the physical basis of observed "edges" we compare satellite measurements with simple radiative transfer microwave-emission modeling of the snow, soil and vegetation using in-situ observations from the SNOw TELemetry (SNOTEL) automated weather stations. 
Results of freeze/thaw and snow-melt timings and trends are reported for Alaska and the North-West Canadian Arctic for the period 2002 to 2015.
NASA Astrophysics Data System (ADS)
Goguen, Jay D.; Bauer, James M.
2017-10-01
The reflectivity of solar system surfaces ‘spikes’ sharply when the Sun is less than 1 degree from directly behind the observer. The Galileo spacecraft measured the reflectivity of part of Europa’s surface to increase by as much as a factor of 8 as the observer moves from 5 degrees to the exact backscattering direction! One mechanism explains this spike as coherent light scattering that occurs only close to this unique retro-reflection geometry. Due to the tight linear alignment of the target, observer and Sun required to measure the peak brightness of the spike, accurate and complete measurements of the amplitude and decay of the spike exist for only a few targets. We used the unique capabilities of the automated Las Cumbres Observatory global telescope network (LCO) to systematically measure this extreme opposition surge for 60+ asteroids sampling a variety of taxonomic classes in the Bus/DeMeo taxonomy.Each asteroid was observed in the SDSS r’ and g’ filters during the ~8 hour interval when it passes within ~0.1 deg of the point opposite the Sun on the sky. Supporting observations of each asteroid with LCO collected over ~50 days measure asteroid rotation and phase angle brightness changes to enable accurate characterization of the retro-reflection spike. This data set vastly increases the number and variety of the surfaces characterized at such small phase angles compared to existing asteroid data. We examine how the spike characteristics vary with surface composition, albedo, and wavelength providing new constraints on physical models of this ubiquitous yet poorly understood phenomenon.Analysis and modeling of these measurements will advance our understanding of the physical mechanism responsible for this enhanced retro-reflection thereby improving our ability to characterize these surfaces from remote observations. 
The ability to infer surface physical properties from remote sensing data is a key capability for future asteroid missions, manned exploration, impact hazard assessment, and fundamental asteroid science.
Advantages and challenges in automated apatite fission track counting
NASA Astrophysics Data System (ADS)
Enkelmann, E.; Ehlers, T. A.
2012-04-01
Fission track thermochronometer data are often a core element of modern tectonic and denudation studies. Soon after the development of the fission track methods, interest emerged in developing an automated counting procedure to replace the time-consuming labor of counting fission tracks under the microscope. Automated track counting became feasible in recent years with increasing improvements in computer software and hardware. One such example, used in this study, is the commercial automated fission track counting procedure from Autoscan Systems Pty that has been highlighted through several venues. We conducted experiments designed to reliably and consistently test the ability of this fully automated counting system to recognize fission tracks in apatite and a muscovite external detector. Fission tracks were analyzed in samples with a step-wise increase in sample complexity. The first set of experiments used a large (mm-size) slice of Durango apatite cut parallel to the prism plane. Second, samples with 80-200 μm apatite grains of Fish Canyon Tuff were analyzed. This second sample set is characterized by complexities often found in apatites in different rock types. In addition to the automated counting procedure, the same samples were also analyzed using conventional counting procedures. We found for all samples that the fully automated fission track counting procedure using the Autoscan System yields a larger scatter in the measured fission track densities compared to conventional (manual) track counting. This scatter typically resulted from the false identification of tracks due to surface and mineralogical defects, regardless of the image filtering procedure used. Large differences between track densities analyzed with the automated counting persisted between different grains analyzed in one sample as well as between different samples.
As a result of these differences a manual correction of the fully automated fission track counts is necessary for each individual surface area and grain counted. This manual correction procedure significantly increases (up to four times) the time required to analyze a sample with the automated counting procedure compared to the conventional approach.
Automated digital magnetofluidics
NASA Astrophysics Data System (ADS)
Schneider, J.; Garcia, A. A.; Marquez, M.
2008-08-01
Drops can be moved in complex patterns on superhydrophobic surfaces using a reconfigured computer-controlled x-y metrology stage with a high degree of accuracy, flexibility, and reconfigurability. The stage employs a DMC-4030 controller which has a RISC-based, clock multiplying processor with DSP functions, accepting encoder inputs up to 22 MHz, provides servo update rates as high as 32 kHz, and processes commands at rates as fast as 40 milliseconds. A 6.35 mm diameter cylindrical NdFeB magnet is translated by the stage causing water drops to move by the action of induced magnetization of coated iron microspheres that remain in the drop and are attracted to the rare earth magnet through digital magnetofluidics. Water drops are easily moved in complex patterns in automated digital magnetofluidics at an average speed of 2.8 cm/s over a superhydrophobic polyethylene surface created by solvent casting. With additional components, some potential uses for this automated microfluidic system include characterization of superhydrophobic surfaces, water quality analysis, and medical diagnostics.
DOT National Transportation Integrated Search
1997-03-06
This report responds to a request for information on the current goals and future direction of the Department of Transportations (DOT) Automated Highway System program. This program seeks to increase the capacity of the nations highways and to ...
E-healthcare at an experimental welfare techno house in Japan.
Tamura, Toshiyo; Kawarada, Atsushi; Nambu, Masayuki; Tsukada, Akira; Sasaki, Kazuo; Yamakoshi, Ken-Ichi
2007-01-01
An automated monitoring system for home health care has been designed for an experimental house in Japan called the Welfare Techno House (WTH). Automated electrocardiogram (ECG) measurements can be taken while in bed, in the bathtub, and on the toilet, without the subject's awareness, and without using body surface electrodes. In order to evaluate this automated health monitoring system, overnight measurements were performed to monitor health status during the daily lives of both young and elderly subjects.
NWR (National Weather Service) voice synthesis project, phase 1
NASA Astrophysics Data System (ADS)
Sampson, G. W.
1986-01-01
The purpose of the NOAA Weather Radio (NWR) Voice Synthesis Project is to provide a demonstration of current voice synthesis technology. Phase 1 of this project is presented, providing complete automation of an hourly surface aviation observation for broadcast over NWR. In examining the products currently available on the market, it was decided that synthetic voice technology did not offer the high-quality speech required for broadcast over the NWR. Therefore, the system presented uses phrase-concatenation technology to achieve a very high quality, versatile voice synthesis system.
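Phrase concatenation of this kind maps each field of the observation to pre-recorded clips that are played back in sequence. A minimal sketch of that mapping, in which the clip names, fields, and phrase inventory are all hypothetical:

```python
# Sketch of phrase concatenation for an hourly surface observation:
# each report field maps to pre-recorded phrase clips played in order.
# Clip names and the phrase inventory here are hypothetical.

PHRASES = {
    "sky_clear": "sky_clear.wav",
    "temp": "temperature.wav",
    "wind": "wind.wav",
}

def digits_to_phrases(value):
    """Spell a numeric value digit by digit from per-digit clips."""
    return [f"digit_{d}.wav" for d in str(value)]

def build_broadcast(observation):
    """Assemble the ordered clip list for one hourly observation."""
    clips = [PHRASES["sky_clear"]] if observation["sky"] == "clear" else []
    clips += [PHRASES["temp"], *digits_to_phrases(observation["temp_f"])]
    clips += [PHRASES["wind"], *digits_to_phrases(observation["wind_kt"])]
    return clips

playlist = build_broadcast({"sky": "clear", "temp_f": 72, "wind_kt": 10})
```

Because every clip is a studio recording, the concatenated broadcast keeps full recorded-voice quality at the cost of a fixed phrase inventory, which is the trade-off against the synthetic voices rejected in the evaluation.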
NASA Astrophysics Data System (ADS)
Niggemann, F.; Appel, F.; Bach, H.; de la Mar, J.; Schirpke, B.; Dutting, K.; Rucker, G.; Leimbach, D.
2015-04-01
To address the challenges of effective data handling faced by Small and Medium-sized Enterprises (SMEs), a cloud-based infrastructure for accessing and processing Earth Observation (EO) data has been developed within the project APPS4GMES (www.apps4gmes.de). To provide homogeneous multi-mission data access, an Input Data Portal (IDP) has been implemented on this infrastructure. The IDP consists of an Open Geospatial Consortium (OGC)-conformant catalogue, a consolidation module for format conversion, and an OGC-conformant ordering framework. A Metadata Harvester gathers metadata from various EO sources with differing standards, transforms it to the OGC-conformant Earth Observation Product standard, and inserts it into the catalogue. The services implemented on the cloud infrastructure can access the IDP to search and order the harvested datasets. The project partners have realised different land-surface services using the IDP and cloud infrastructure; results include customer-ready products as well as pre-products (e.g. atmospherically corrected EO data) that serve as a basis for other services. Within the IDP, automated access to ESA's Sentinel-1 Scientific Data Hub has been implemented, so that searching and downloading of the SAR data can be performed automatically. With the Sentinel-1 Toolbox and the project's own software, processing of the datasets for further use, for example Vista's snow monitoring, which delivers input for the flood forecast services, can also be automated. For performance tests of the cloud environment, a sophisticated model-based atmospheric correction and pre-classification service has been implemented; the tests comprised automated, synchronised processing of one entire Landsat 8 (LS-8) coverage of Germany and performance comparisons with standard desktop systems.
Results of these tests, showing a performance improvement by a factor of six, proved the high flexibility and computing power of the cloud environment. To make full use of the cloud's capabilities, automated upscaling of the hardware resources has been implemented. Together with the IDP infrastructure, fast and automated processing of various satellite sources into market-ready products can be realised, so that growing customer needs and numbers can be satisfied without loss of accuracy or quality.
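The Metadata Harvester's task of mapping heterogeneous source metadata onto one catalogue schema can be sketched as below. The source field names and the minimal target schema are illustrative assumptions, not the actual OGC EO Product mapping:

```python
def normalise(record, source):
    """Map a source-specific metadata record onto a minimal common schema,
    as a harvester might do before catalogue insertion.
    Field names here are invented for illustration."""
    if source == "sentinel":
        return {"id": record["uuid"],
                "sensed": record["beginposition"],
                "mission": "Sentinel-1"}
    if source == "landsat":
        return {"id": record["scene_id"],
                "sensed": record["acq_date"],
                "mission": "Landsat-8"}
    raise ValueError("unknown source: " + source)

rec = normalise({"uuid": "a1", "beginposition": "2015-04-01"}, "sentinel")
```

A production harvester would instead emit full OGC EO Product documents and handle many more fields; the point is the per-source normalisation step.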
1991-09-01
A2352344 Layup Cover Sheets/Inspect. A2352345 Perform Automated Tape Laying Operations/Inspect: the tape is laid in 3-12 inch strips along the surface of the bond mold. The NC program is
Space exploration: The interstellar goal and Titan demonstration
NASA Technical Reports Server (NTRS)
1982-01-01
Automated interstellar space exploration is reviewed. The Titan demonstration mission is discussed. Remote sensing and automated modeling are considered. Nuclear electric propulsion, a main orbiting spacecraft, a lander/rover, subsatellites, atmospheric probes, powered air vehicles, and a surface science network comprise the mission component concepts. Machine intelligence in space exploration is discussed.
Automated road segment creation process : a report on research sponsored by SaferSim.
DOT National Transportation Integrated Search
2016-08-01
This report provides a summary of a set of tools that can be used to automate the process of generating roadway surfaces from alignment and texture information. The tools developed were created in Python 3.x and rely on the availability of two da...
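A core step in generating roadway surfaces from alignment information is offsetting the centreline to form lane or shoulder edges. Since the report's tools were written in Python, a minimal pure-Python sketch of such an offset follows; the function and its parameters are illustrative, not the report's actual API:

```python
import math

def offset_polyline(points, offset):
    """Offset a 2-D alignment polyline perpendicular to each segment:
    a simplified version of deriving a road edge from a centreline."""
    out = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        length = math.hypot(dx, dy)
        nx, ny = -dy / length, dx / length  # left-hand unit normal
        out.append((x0 + nx * offset, y0 + ny * offset))
    # carry the last vertex using the final segment's normal
    out.append((x1 + nx * offset, y1 + ny * offset))
    return out

edge = offset_polyline([(0, 0), (10, 0)], 3.5)  # a 3.5 m lane edge
```

Real alignment tools must also handle corner joins and curved (arc/spiral) segments, which this straight-segment sketch omits.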
Automated decontamination of surface-adherent prions.
Schmitt, A; Westner, I M; Reznicek, L; Michels, W; Mitteregger, G; Kretzschmar, H A
2010-09-01
At present there is no routinely available decontamination procedure in washer-disinfectors that allows reliable inactivation and/or elimination of prions present on reusable surgical instruments. This means it is not possible to provide assurance of preventing iatrogenic transmission of prion diseases. Effective prion decontamination procedures are needed that can be integrated into the usual routine of reprocessing surgical instruments. This article reports on the evaluation of an automated process designed to decontaminate prions in washer-disinfectors, using a quantitative, highly sensitive in vivo assay for surface-adherent 22L prions. The automated process showed great advantages when compared with conventional alkaline cleaning. Moreover, the new process was as effective as autoclaving at 134 °C for 2 h and left no detectable prion infectivity, even for heavily contaminated surfaces. This indicates a reduction of surface-adherent prion infectivity of >7 log units. Due to its compatibility with even delicate surgical instruments, the process can be integrated into the large-scale reprocessing of instruments in a central sterile supply department. The system could potentially make an important contribution to the prevention of iatrogenic transmission of prions.
Automated quantification of surface water inundation in wetlands using optical satellite imagery
DeVries, Ben; Huang, Chengquan; Lang, Megan W.; Jones, John W.; Huang, Wenli; Creed, Irena F.; Carroll, Mark L.
2017-01-01
We present a fully automated and scalable algorithm for quantifying surface water inundation in wetlands. Requiring no external training data, our algorithm estimates sub-pixel water fraction (SWF) over large areas and long time periods using Landsat data. We tested our SWF algorithm over three wetland sites across North America, including the Prairie Pothole Region, the Delmarva Peninsula and the Everglades, representing a gradient of inundation and vegetation conditions. We estimated SWF at 30-m resolution with accuracies ranging from a normalized root-mean-square-error of 0.11 to 0.19 when compared with various high-resolution ground and airborne datasets. SWF estimates were more sensitive to subtle inundated features compared to previously published surface water datasets, accurately depicting water bodies, large heterogeneously inundated surfaces, narrow water courses and canopy-covered water features. Despite this enhanced sensitivity, several sources of errors affected SWF estimates, including emergent or floating vegetation and forest canopies, shadows from topographic features, urban structures and unmasked clouds. The automated algorithm described in this article allows for the production of high temporal resolution wetland inundation data products to support a broad range of applications.
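The accuracy figures above are normalized root-mean-square errors. A minimal sketch of that metric, assuming normalization by the range of the reference data (other normalization conventions, such as by the mean, also exist):

```python
def nrmse(pred, ref):
    """Normalised root-mean-square error between predicted and reference
    values, normalised by the reference range (one common convention)."""
    n = len(pred)
    rmse = (sum((p - r) ** 2 for p, r in zip(pred, ref)) / n) ** 0.5
    return rmse / (max(ref) - min(ref))

# Toy sub-pixel water fractions: predicted vs. reference, both in [0, 1].
err = nrmse([0.1, 0.5, 0.9], [0.0, 0.5, 1.0])
```

With values already on a 0-1 fraction scale, range normalization leaves the RMSE essentially unchanged; it matters more when comparing across sites with different dynamic ranges.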
Establishment and discontinuance criteria for automated weather observing systems (AWOS)
DOT National Transportation Integrated Search
1983-05-01
This report develops establishment and discontinuance criteria for automated weather observing systems (AWOS) for publication in FAA Order 7031.2B, Airway Planning Standard Number One. Airway Planning Standard Number One contains the policy and...
NASA Technical Reports Server (NTRS)
Wampler, E. J.
1972-01-01
Description and evaluation of the remotely operated Cassegrain focus of the Lick Observatory 120-inch telescope. Experience with this instrument has revealed that an automated system can profoundly change the observer's approach to his work. This makes it difficult to evaluate the 'advantage' of an automated telescope over a conventional instrument. Some of the problems arising with automation in astronomy are discussed.
Endoscope reprocessing methods: a prospective study on the impact of human factors and automation.
Ofstead, Cori L; Wetzler, Harry P; Snyder, Alycea K; Horton, Rebecca A
2010-01-01
The main cause of endoscopy-associated infections is failure to adhere to reprocessing guidelines. More information about factors impacting compliance is needed to support the development of effective interventions. The purpose of this multisite, observational study was to evaluate reprocessing practices, employee perceptions, and occupational health issues. Data were collected using interviews, surveys, and direct observation. Written reprocessing policies and procedures were in place at all five sites, and employees affirmed the importance of most recommended steps. Nevertheless, observers documented gaps in guideline adherence: only 1.4% of endoscopes were reprocessed in accordance with guidelines using manual cleaning methods with automated high-level disinfection, versus 75.4% of those reprocessed using an automated endoscope cleaner and reprocessor. The majority of employees reported health problems (i.e., pain, decreased flexibility, numbness, or tingling). Physical discomfort was associated with time spent reprocessing (p = .041) and diminished after installation of automated endoscope cleaners and reprocessors (p = .001). Enhanced training and accountability, combined with increased automation, may ensure guideline adherence and patient safety while improving employee satisfaction and health.
NASA Astrophysics Data System (ADS)
Grinyok, A.; Boychuk, I.; Perelygin, D.; Dantsevich, I.
2018-03-01
A complex method for the simulation and production design of open-rotor propellers was studied. An end-to-end diagram was proposed for evaluating, designing, and experimentally testing the optimal geometry of the propeller surface, for generating the machine control path, and for simulating the force conditions in the cutting zone and their relationship with the machining accuracy, which is defined by the elastic deformation of the propeller. The simulation data enabled the combined automated path control of the cutting tool.
Remote surface inspection system
NASA Astrophysics Data System (ADS)
Hayati, S.; Balaram, J.; Seraji, H.; Kim, W. S.; Tso, K.; Prasad, V.
1993-02-01
This paper reports on an on-going research and development effort in remote surface inspection of space platforms such as the Space Station Freedom (SSF). It describes the space environment and identifies the types of damage for which to search. This paper provides an overview of the Remote Surface Inspection System that was developed to conduct proof-of-concept demonstrations and to perform experiments in a laboratory environment. Specifically, the paper describes three technology areas: (1) manipulator control for sensor placement; (2) automated non-contact inspection to detect and classify flaws; and (3) an operator interface to command the system interactively and receive raw or processed sensor data. Initial findings for the automated and human visual inspection tests are reported.
CT liver volumetry using geodesic active contour segmentation with a level-set algorithm
NASA Astrophysics Data System (ADS)
Suzuki, Kenji; Epstein, Mark L.; Kohlbrenner, Ryan; Obajuluwa, Ademola; Xu, Jianwu; Hori, Masatoshi; Baron, Richard
2010-03-01
Automatic liver segmentation on CT images is challenging because the liver often abuts other organs of similar density. Our purpose was to develop an accurate automated liver segmentation scheme for measuring liver volumes. We developed an automated volumetry scheme for the liver in CT based on a five-step schema. First, an anisotropic smoothing filter was applied to portal-venous-phase CT images to remove noise while preserving the liver structure, followed by an edge enhancer to enhance the liver boundary. Using the boundary-enhanced image as a speed function, a fast-marching algorithm generated an initial surface that roughly estimated the liver shape. A geodesic-active-contour segmentation algorithm coupled with level-set contour evolution refined the initial surface to fit the liver boundary more precisely. The liver volume was calculated based on the refined liver surface. Hepatic CT scans of eighteen prospective liver donors were obtained under a liver transplant protocol with a multi-detector CT system. Automated liver volumes were compared with those manually traced by a radiologist, used as the "gold standard." The mean liver volume obtained with our scheme was 1,520 cc, whereas the mean manual volume was 1,486 cc, with a mean absolute difference of 104 cc (7.0%). CT liver volumetry based on the automated scheme agreed excellently with gold-standard manual volumetry (intra-class correlation coefficient 0.95) with no statistically significant difference (p(F<=f)=0.32), and required substantially less completion time. Our automated scheme provides an efficient and accurate way of measuring liver volumes.
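The final volumetry step, converting a refined segmentation into a volume in cc, reduces to counting labelled voxels and scaling by the voxel volume. A toy sketch (the tiny mask and 1 mm voxel size are illustrative, not study data):

```python
def liver_volume_cc(mask, voxel_mm3):
    """Volume of a binary segmentation: voxel count times voxel volume.
    `mask` is a nested list of 0/1 labels (slices x rows x columns)."""
    voxels = sum(sum(row) for slice_ in mask for row in slice_)
    return voxels * voxel_mm3 / 1000.0  # mm^3 -> cc

# Two 2x2 slices with half the voxels labelled liver, 1 mm isotropic voxels.
mask = [[[1, 0], [1, 0]], [[0, 1], [0, 1]]]
vol = liver_volume_cc(mask, 1.0)
```

In practice the mask comes from the level-set surface rasterised onto the CT grid, and the voxel volume from the scan's slice spacing and pixel size.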
Ali, Abdulbaset; Hu, Bing; Ramahi, Omar
2015-05-15
This work presents a real-life experiment implementing an artificial intelligence model for detecting sub-millimeter cracks in metallic surfaces, using a dataset obtained from a waveguide sensor loaded with metamaterial elements. Crack detection using microwave sensors is typically based on human observation of change in the sensor's signal (pattern) depicted on a high-resolution screen of the test equipment. However, as demonstrated in this work, implementing artificial intelligence to classify cracked from non-cracked surfaces has appreciable impact in terms of sensing sensitivity, cost, and automation. Furthermore, applying artificial intelligence for post-processing data collected from microwave sensors is a cornerstone for handheld test equipment that can outperform rack equipment with large screens and sophisticated plotting features. The proposed method was tested on a metallic plate with different cracks, and the experimental results showed good crack classification accuracy rates.
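The abstract does not specify the model used, so as a stand-in illustration of classifying cracked versus non-cracked surfaces from sensor-derived features, here is a minimal nearest-centroid sketch; the feature values and labels are invented for illustration:

```python
def train_centroids(samples):
    """Compute the mean feature vector per class label: a minimal stand-in
    for the paper's (unspecified) artificial-intelligence classifier."""
    sums, counts = {}, {}
    for features, label in samples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, f in enumerate(features):
            acc[i] += f
        counts[label] = counts.get(label, 0) + 1
    return {lab: [v / counts[lab] for v in vec] for lab, vec in sums.items()}

def classify(centroids, features):
    """Assign the label of the nearest centroid (squared Euclidean distance)."""
    return min(centroids,
               key=lambda lab: sum((a - b) ** 2
                                   for a, b in zip(centroids[lab], features)))

cents = train_centroids([([0.1, 0.2], "intact"),
                         ([0.9, 0.8], "cracked"),
                         ([0.2, 0.1], "intact")])
label = classify(cents, [0.85, 0.9])
```

Real features would be derived from the waveguide sensor's signal (e.g. resonance shifts), and a real system would use a trained model with validation; this only shows the classify-from-features shape of the problem.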
Reaction Mechanisms on Multiwell Potential Energy Surfaces in Combustion (and Atmospheric) Chemistry
Osborn, David L.
2017-03-15
Chemical reactions occurring on a potential energy surface with multiple wells are ubiquitous in low temperature combustion and the oxidation of volatile organic compounds in earth’s atmosphere. The rich variety of structural isomerizations that compete with collisional stabilization make characterizing such complex-forming reactions challenging. This review describes recent experimental and theoretical advances that deliver increasingly complete views of their reaction mechanisms. New methods for creating reactive intermediates coupled with multiplexed measurements provide many experimental observables simultaneously. Automated methods to explore potential energy surfaces can uncover hidden reactive pathways, while master equation methods enable a holistic treatment of both sequential and well-skipping pathways. Our ability to probe and understand nonequilibrium effects and reaction sequences is increasing. These advances provide the fundamental science base for predictive models of combustion and the atmosphere that are crucial to address global challenges.
Ali, Abdulbaset; Hu, Bing; Ramahi, Omar M.
2015-01-01
This work presents a real-life experiment implementing an artificial intelligence model for detecting sub-millimeter cracks in metallic surfaces on a dataset obtained from a waveguide sensor loaded with metamaterial elements. Crack detection using microwave sensors is typically based on human observation of change in the sensor's signal (pattern) depicted on a high-resolution screen of the test equipment. However, as demonstrated in this work, implementing artificial intelligence to classify cracked from non-cracked surfaces has appreciable impacts in terms of sensing sensitivity, cost, and automation. Furthermore, applying artificial intelligence for post-processing the data collected from microwave sensors is a cornerstone for handheld test equipment that can outperform rack equipment with large screens and sophisticated plotting features. The proposed method was tested on a metallic plate with different cracks, and the experimental results showed good crack classification accuracy rates. PMID:25988871
Apparatus and method for automated monitoring of airborne bacterial spores
NASA Technical Reports Server (NTRS)
Ponce, Adrian (Inventor)
2009-01-01
An apparatus and method for automated monitoring of airborne bacterial spores. The apparatus comprises an air sampler, a surface for capturing airborne spores, a thermal lysis unit to release DPA from bacterial spores, a source of lanthanide ions, and a spectrometer for excitation and detection of the characteristic fluorescence of the aromatic molecules in bacterial spores complexed with lanthanide ions. In the method, computer-programmed steps automate the apparatus for the monitoring of airborne bacterial spores.
Array Automated Assembly Task Low Cost Silicon Solar Array Project, Phase 2
NASA Technical Reports Server (NTRS)
Rhee, S. S.; Jones, G. T.; Allison, K. L.
1978-01-01
Progress in the development of solar cells and module process steps for low-cost solar arrays is reported. Specific topics covered include: (1) a system to automatically measure solar cell electrical performance parameters; (2) automation of wafer surface preparation, printing, and plating; (3) laser inspection of mechanical defects of solar cells; and (4) a silicon antireflection coating system. Two solar cell process steps, laser trimming and holing automation and spray-on dopant junction formation, are described.
AN AUTOMATED SYSTEM FOR PRODUCING UNIFORM SURFACE DEPOSITS OF DRY PARTICLES
A laboratory system has been constructed that uniformly deposits dry particles onto any type of test surface. Devised as a quality assurance tool for the purpose of evaluating surface sampling methods for lead, it also may be used to generate test surfaces for any contaminant ...
Grid generation on trimmed Bezier and NURBS quilted surfaces
NASA Technical Reports Server (NTRS)
Woan, Chung-Jin; Clever, Willard C.; Tam, Clement K.
1995-01-01
This paper presents some recently added capabilities of RAGGS, the Rockwell Automated Grid Generation System. Included are trimmed-surface handling and display, and structured and unstructured grid generation on trimmed Bezier and NURBS (non-uniform rational B-spline) quilted surfaces. Samples are given to demonstrate the new capabilities.
NASA Astrophysics Data System (ADS)
Muller, Jan-Peter; Tao, Yu; Sidiropoulos, Panagiotis; Gwinner, Klaus; Willner, Konrad; Fanara, Lida; Waehlisch, Marita; van Gasselt, Stephan; Walter, Sebastian; Steikert, Ralf; Schreiner, Bjoern; Ivanov, Anton; Cantini, Federico; Wardlaw, Jessica; Morley, Jeremy; Sprinks, James; Giordano, Michele; Marsh, Stuart; Kim, Jungrack; Houghton, Robert; Bamford, Steven
2016-06-01
Understanding planetary atmosphere-surface exchange and extra-terrestrial surface formation processes within our Solar System is one of the fundamental goals of planetary science research. There has been a revolution in planetary surface observations over the last 15 years, especially in 3D imaging of surface shape. This has led to the ability to overlay image data and derived information from different epochs, back in time to the mid 1970s, to examine changes through time, such as the recent discovery of mass movement, tracking inter-year seasonal changes and looking for occurrences of fresh craters. Within the EU FP-7 iMars project, we have developed a fully automated multi-resolution DTM processing chain, called the Coregistration ASP-Gotcha Optimised (CASP-GO), based on the open-source NASA Ames Stereo Pipeline (ASP) [Tao et al., this conference], which is being applied to the production of planetwide DTMs and ORIs (OrthoRectified Images) from CTX and HiRISE. Alongside the production of individual strip CTX & HiRISE DTMs & ORIs, DLR [Gwinner et al., 2015] have processed HRSC mosaics of ORIs and DTMs for complete areas in a consistent manner using photogrammetric bundle block adjustment techniques. A novel automated co-registration and orthorectification chain has been developed by [Sidiropoulos & Muller, this conference]. Using the HRSC map products (both mosaics and orbital strips) as a map-base, it is being applied to many of the 400,000 level-1 EDR images taken by the four NASA orbital cameras back to 1976: the Viking Orbiter camera (VO), Mars Orbiter Camera (MOC), Context Camera (CTX) and the High Resolution Imaging Science Experiment (HiRISE). A webGIS has been developed [van Gasselt et al., this conference] for displaying this time sequence of imagery and will be demonstrated with an example from one of the HRSC quadrangle map-sheets.
Automated quality control techniques [Sidiropoulos & Muller, 2015] are applied to screen for suitable images, and these are extended to detect temporal changes in surface features such as mass movements, streaks, spiders, impact craters, CO2 geysers and Swiss Cheese terrain. For result verification, these data-mining techniques are then employed in a citizen science project within the Zooniverse family. Examples of data mining and its verification will be presented.
Péharpré, D; Cliquet, F; Sagné, E; Renders, C; Costy, F; Aubert, M
1999-07-01
The rapid fluorescent focus inhibition test (RFFIT) and the fluorescent antibody virus neutralization test (FAVNT) are both diagnostic tests for determining levels of rabies neutralizing antibodies. An automated method for determining fluorescence has been implemented to reduce the work time required for fluorescent visual microscopic observations. The automated method offers several advantages over conventional visual observation, such as the ability to rapidly test many samples. The antibody titers obtained with automated techniques were similar to those obtained with both the RFFIT (n = 165, r = 0.93, P < 0.001) and the FAVNT (n = 52, r = 0.99, P < 0.001).
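Assuming the quoted r values are Pearson correlation coefficients, the agreement statistic between automated and visual titres can be computed as below (the sample titre values are invented):

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences,
    the kind of r used above to compare automated and visual titres."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Perfectly proportional toy titres give r = 1.0.
r = pearson_r([1, 2, 3, 4], [2, 4, 6, 8])
```

The study's r = 0.93 and 0.99 over 165 and 52 samples would come from exactly this computation applied to the paired titre readings.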
NASA Astrophysics Data System (ADS)
Antony, Bhavna J.; Abràmoff, Michael D.; Lee, Kyungmoo; Sonkova, Pavlina; Gupta, Priya; Kwon, Young; Niemeijer, Meindert; Hu, Zhihong; Garvin, Mona K.
2010-03-01
Optical coherence tomography (OCT), being a noninvasive imaging modality, has begun to find vast use in the diagnosis and management of ocular diseases such as glaucoma, where the retinal nerve fiber layer (RNFL) has been known to thin. Furthermore, the recent availability of the considerably larger volumetric data with spectral-domain OCT has increased the need for new processing techniques. In this paper, we present an automated 3-D graph-theoretic approach for the segmentation of 7 surfaces (6 layers) of the retina from 3-D spectral-domain OCT images centered on the optic nerve head (ONH). The multiple surfaces are detected simultaneously through the computation of a minimum-cost closed set in a vertex-weighted graph constructed using edge/regional information, and subject to a priori determined varying surface interaction and smoothness constraints. The method also addresses the challenges posed by presence of the large blood vessels and the optic disc. The algorithm was compared to the average manual tracings of two observers on a total of 15 volumetric scans, and the border positioning error was found to be 7.25 +/- 1.08 μm and 8.94 +/- 3.76 μm for the normal and glaucomatous eyes, respectively. The RNFL thickness was also computed for 26 normal and 70 glaucomatous scans where the glaucomatous eyes showed a significant thinning (p < 0.01, mean thickness 73.7 +/- 32.7 μm in normal eyes versus 60.4 +/- 25.2 μm in glaucomatous eyes).
Distribution automation applications of fiber optics
NASA Technical Reports Server (NTRS)
Kirkham, Harold; Johnston, A.; Friend, H.
1989-01-01
Motivations for interest and research in distribution automation are discussed. The communication requirements of distribution automation are examined and shown to exceed the capabilities of power line carrier, radio, and telephone systems. A fiber optic based communication system is described that is co-located with the distribution system and that could satisfy the data rate and reliability requirements. A cost comparison shows that it could be constructed at a cost that is similar to that of a power line carrier system. The requirements for fiber optic sensors for distribution automation are discussed. The design of a data link suitable for optically-powered electronic sensing is presented. Empirical results are given. A modeling technique that was used to understand the reflections of guided light from a variety of surfaces is described. An optical position-indicator design is discussed. Systems aspects of distribution automation are discussed, in particular, the lack of interface, communications, and data standards. The economics of distribution automation are examined.
Determining Tooth Occlusal Surface Relief Indicator by Means of Automated 3d Shape Analysis
NASA Astrophysics Data System (ADS)
Gaboutchian, A. V.; Knyaz, V. A.
2017-05-01
Determining the occlusal surface relief indicator plays an important role in odontometric tooth shape analysis. Analysis of surface relief indicator parameters provides valuable information about the closure of dental arches (occlusion) and lifetime changes in tooth structure. Such data are relevant for dentistry and anthropology applications. Descriptive techniques commonly used for surface relief evaluation have limited precision and therefore do not support reliable conclusions about the structure and functioning of teeth. Parametric techniques developed for such applications require special facilities and are time-consuming, which limits their adoption and accessibility. Nevertheless, the use of 3D models obtained by photogrammetric techniques attains the required measurement accuracy and has potential for process automation. We introduce new approaches for determining the tooth occlusal surface relief indicator and provide data on the efficiency of different indicators in evaluating natural attrition.
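One plausible form of a surface relief indicator is the ratio of true (triangulated) surface area to projected area over a height grid; the paper's exact indicator definitions are not reproduced here, so the sketch below is an assumption-laden illustration of the idea:

```python
import math

def tri_area(p, q, r):
    """Area of a 3-D triangle via half the cross-product magnitude."""
    u = [q[k] - p[k] for k in range(3)]
    v = [r[k] - p[k] for k in range(3)]
    cx = u[1] * v[2] - u[2] * v[1]
    cy = u[2] * v[0] - u[0] * v[2]
    cz = u[0] * v[1] - u[1] * v[0]
    return 0.5 * math.sqrt(cx * cx + cy * cy + cz * cz)

def relief_index(z, dx=1.0):
    """Ratio of triangulated 3-D surface area to flat projected area for a
    height grid z (rows x cols of heights, grid spacing dx).
    A flat surface gives 1.0; more relief gives larger values."""
    rows, cols = len(z), len(z[0])
    area3d = 0.0
    for i in range(rows - 1):
        for j in range(cols - 1):
            # Split each grid cell into two triangles and sum their areas.
            a = (0.0, 0.0, z[i][j])
            b = (dx, 0.0, z[i][j + 1])
            c = (0.0, dx, z[i + 1][j])
            d = (dx, dx, z[i + 1][j + 1])
            area3d += tri_area(a, b, c) + tri_area(b, d, c)
    flat = (rows - 1) * (cols - 1) * dx * dx
    return area3d / flat

flat_ratio = relief_index([[0, 0], [0, 0]])  # flat grid -> ratio 1.0
```

A photogrammetric 3D model would supply a much denser height grid sampled over the occlusal surface; worn (attrited) teeth would show the index drifting toward 1.0.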
Prototyping an automated lumber processing system
Powsiri Klinkhachorn; Ravi Kothari; Henry A. Huber; Charles W. McMillin; K. Mukherjee; V. Barnekov
1993-01-01
The Automated Lumber Processing System (ALPS) is a multi-disciplinary continuing effort directed toward increasing the yield obtained from hardwood lumber boards during their remanufacture into secondary products (furniture, etc.). ALPS proposes a nondestructive vision system to scan a board for its dimensions and the location and expanse of surface defects on...
NASA Astrophysics Data System (ADS)
Murray, Sophie A.; Guerra, Jordan A.; Zucca, Pietro; Park, Sung-Hong; Carley, Eoin P.; Gallagher, Peter T.; Vilmer, Nicole; Bothmer, Volker
2018-04-01
Coronal mass ejections (CMEs) and other solar eruptive phenomena can be physically linked by combining data from a multitude of ground-based and space-based instruments alongside models; however, this can be challenging for automated operational systems. The EU Framework Programme 7 HELCATS project provides catalogues of CME observations and properties from the Heliospheric Imagers on board the two NASA/STEREO spacecraft in order to track the evolution of CMEs in the inner heliosphere. From the main HICAT catalogue of over 2,000 CME detections, an automated algorithm has been developed to connect the CMEs observed by STEREO to any corresponding solar flares and active-region (AR) sources on the solar surface. CME kinematic properties, such as speed and angular width, are compared with AR magnetic field properties, such as magnetic flux, area, and neutral line characteristics. The resulting LOWCAT catalogue is also compared to the extensive AR property database created by the EU Horizon 2020 FLARECAST project, which provides more complex magnetic field parameters derived from vector magnetograms. Initial statistical analysis has been undertaken on the new data to provide insight into the link between flare and CME events, and the characteristics of eruptive ARs. Warning thresholds determined from analysis of the evolution of these parameters are shown to be a useful output for operational space weather purposes. Parameters of particular interest for further analysis include total unsigned flux, vertical current, and current helicity. The automated method developed to create the LOWCAT catalogue may also be useful for future efforts to develop operational CME forecasting.
Automation of immunohistochemical evaluation in breast cancer using image analysis
Prasad, Keerthana; Tiwari, Avani; Ilanthodi, Sandhya; Prabhu, Gopalakrishna; Pai, Muktha
2011-01-01
AIM: To automate breast cancer diagnosis and to study inter-observer and intra-observer variations in manual evaluations. METHODS: Breast tissue specimens from sixty cases were stained separately for estrogen receptor (ER), progesterone receptor (PR) and human epidermal growth factor receptor-2 (HER-2/neu). All cases were assessed by manual grading as well as image analysis. The manual grading was performed by an experienced expert pathologist. To study inter-observer variations, we obtained readings from a second pathologist, from a different laboratory, with somewhat less experience than the first observer. To study intra-observer variations, we also took a second reading from the second observer. Image analysis was carried out using in-house developed software (TissueQuant). A comparison of the results from image analysis and manual scoring of ER, PR and HER-2/neu was also carried out. RESULTS: The performance of the automated analysis of ER, PR and HER-2/neu expression was compared with the manual evaluations and was found to correlate well with them. The inter-observer variations were measured using the Spearman correlation coefficient r and a 95% confidence interval: for ER expression, r = 0.53; for PR expression, r = 0.63; and for HER-2/neu expression, r = 0.68. Intra-observer variations were measured similarly: for ER, PR and HER-2/neu expressions, r = 0.46, 0.66 and 0.70, respectively. CONCLUSION: The automation of breast cancer diagnosis from immunohistochemically stained specimens is very useful for providing objective and repeatable evaluations. PMID:21611095
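The observer agreement above is quantified with Spearman's rank correlation. A minimal implementation that assumes no tied scores (real grading data would need tie-corrected ranks; the sample scores are invented):

```python
def spearman_r(x, y):
    """Spearman rank correlation via the classic 1 - 6*sum(d^2)/(n(n^2-1))
    formula; valid only when neither sequence contains ties."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Monotonically related toy scores from two observers give r = 1.
r = spearman_r([10, 20, 30, 40], [1, 3, 5, 9])
```

Spearman is the natural choice here because immunohistochemistry scores are ordinal grades, not interval measurements.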
Surface and allied studies in silicon solar cells
NASA Technical Reports Server (NTRS)
Lindholm, F. A.
1983-01-01
Two main results are presented. The first is a simple method that determines the minority-carrier lifetime and the effective surface recombination velocity of the quasi-neutral base of silicon solar cells. The method requires the observation of only a single transient, and is amenable to automation for in-process monitoring in manufacturing. This method, called short-circuit current decay, avoids distortion in the observed transient, and consequent inaccuracies, arising from the presence of mobile holes and electrons stored in the p/n junction space-charge region at the initial instant of the transient. The second main result is a formulation of the relevant boundary-value problems that resembles that used in linear two-port network theory. This formulation enables comparisons to be made among various contending methods for measuring material parameters of p/n junction devices, and enables the option of putting the time-domain description of the transient studies in the form of an infinite series, although closed-form solutions are also possible.
NASA Astrophysics Data System (ADS)
Fegyveresi, J. M.; Alley, R. B.; Muto, A.; Spencer, M. K.; Orsi, A. J.
2014-12-01
Observations at the WAIS Divide site show that near-surface snow is strongly altered by weather-related processes, producing features that are recognizable in the ice core. Prominent reflective "glazed" surface crusts develop frequently during the summer. Observations during austral summers 2008-09 through 2012-13, supplemented by Automated Weather Station data with insolation sensors, documented formation of such crusts during relatively low-wind, low-humidity, clear-sky periods with intense daytime sunshine. After formation, such glazed surfaces typically developed cracks in a polygonal pattern with few-meter spacing, likely from thermal contraction at night. Cracking was commonest when several clear days occurred in succession, and was generally followed by surface hoar growth. Temperature and radiation observations showed that solar heating often warmed the near-surface snow above the air temperature, contributing to mass transfer favoring crust formation. Subsequent investigation of the WDC06A deep ice core revealed that preserved surface crusts appear in the core at an average rate of ~4.3 ± 2 yr⁻¹ over the past 5500 years. They are about 40% more common in layers deposited during summers than during winters. The total summertime crust frequency also covaried with site temperature, with more present during warmer periods. We hypothesize that the mechanism for glaze formation producing single-grain-thick, very-low-porosity thin crusts (i.e. "glazes") involves additional in-filling of open pores. The thermal conductivity of ice greatly exceeds that of air, so heat transport in firn is primarily conductive. Because heat flow is primarily through the grain structure, for a temperature inversion (colder upper surface) beneath a growing thin crust at the upper surface, pores will be colder than interconnected grains, favoring mass transport into those pores.
Transport may occur by vapor, surface, or volume diffusion, although vapor diffusion and surface transport in pre-melted films are likely to dominate. On-site wintertime observations have not been made, but crust formation during winter may be favored by greater wind-packing, large meteorologically forced temperature changes reaching as high as -15 °C in midwinter, and perhaps longer intervals of surface stability.
Quality Assessment of Landsat Surface Reflectance Products Using MODIS Data
NASA Technical Reports Server (NTRS)
Feng, Min; Huang, Chengquan; Channan, Saurabh; Vermote, Eric; Masek, Jeffrey G.; Townshend, John R.
2012-01-01
Surface reflectance adjusted for atmospheric effects is a primary input for land cover change detection and for developing many higher level surface geophysical parameters. With the development of automated atmospheric correction algorithms, it is now feasible to produce large quantities of surface reflectance products using Landsat images. Validation of these products requires in situ measurements, which either do not exist or are difficult to obtain for most Landsat images. The surface reflectance products derived using data acquired by the Moderate Resolution Imaging Spectroradiometer (MODIS), however, have been validated more comprehensively. Because the MODIS on the Terra platform and the Landsat 7 are only half an hour apart following the same orbit, and each of the 6 Landsat spectral bands overlaps with a MODIS band, good agreements between MODIS and Landsat surface reflectance values can be considered indicators of the reliability of the Landsat products, while disagreements may suggest potential quality problems that need to be further investigated. Here we develop a system called Landsat-MODIS Consistency Checking System (LMCCS). This system automatically matches Landsat data with MODIS observations acquired on the same date over the same locations and uses them to calculate a set of agreement metrics. To maximize its portability, Java and open-source libraries were used in developing this system, and object-oriented programming (OOP) principles were followed to make it more flexible for future expansion. As a highly automated system designed to run as a stand-alone package or as a component of other Landsat data processing systems, this system can be used to assess the quality of essentially every Landsat surface reflectance image where spatially and temporally matching MODIS data are available. 
The effectiveness of this system was demonstrated by using it to assess preliminary surface reflectance products derived from the Global Land Survey (GLS) Landsat images for the 2000 epoch. As surface reflectance likely will be a standard product for future Landsat missions, the approach developed in this study can be adapted as an operational quality assessment system for those missions.
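The core of such a consistency check is computing agreement metrics between spatially and temporally matched Landsat and MODIS surface-reflectance values. A minimal sketch with two common metrics, mean bias and root-mean-square difference, over toy reflectance pairs (LMCCS itself is a Java system with its own metric set):

```python
# Matched surface-reflectance pairs for one band (illustrative values)
landsat = [0.05, 0.12, 0.30, 0.22, 0.08]
modis   = [0.06, 0.11, 0.28, 0.24, 0.09]

n = len(landsat)
# Mean bias: systematic offset of Landsat relative to MODIS
bias = sum(l - m for l, m in zip(landsat, modis)) / n
# RMSD: overall magnitude of disagreement
rmsd = (sum((l - m) ** 2 for l, m in zip(landsat, modis)) / n) ** 0.5
```

A large bias or RMSD for a scene flags a potential atmospheric-correction problem worth closer inspection.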
NASA Technical Reports Server (NTRS)
Hook, Simon J.
2008-01-01
The presentation includes an introduction, Lake Tahoe site layout and measurements, Salton Sea site layout and measurements, field instrument calibration and cross-calibrations, data reduction methodology and error budgets, and example results for MODIS. Summary and conclusions are: 1) The Lake Tahoe CA/NV automated validation site was established in 1999 to assess the radiometric accuracy of satellite and airborne mid- and thermal-infrared data and products. Water surface temperatures range from 4-25 °C. 2) The Salton Sea CA automated validation site was established in 2008 to broaden the range of available water surface temperatures and atmospheric water vapor test cases. Water surface temperatures range from 15-35 °C. 3) The sites provide all information necessary for validation every 2 minutes (bulk temperature, skin temperature, air temperature, wind speed, wind direction, net radiation, relative humidity). 4) The sites have been used to validate mid- and thermal-infrared data and products from ASTER, AATSR, ATSR2, MODIS-Terra, MODIS-Aqua, Landsat 5, Landsat 7, MTI, TES, MASTER, and MAS. 5) Approximately 10 years of data are available to help validate AVHRR.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kertesz, Vilmos; Van Berkel, Gary J
A fully automated liquid extraction-based surface sampling system utilizing a commercially available autosampler coupled to high performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS) detection is reported. Discrete spots selected for droplet-based sampling and automated sample queue generation for both the autosampler and MS were enabled by using in-house developed software. In addition, co-registration of spatially resolved sampling position and HPLC-MS information to generate heatmaps of compounds monitored for subsequent data analysis was also available in the software. The system was evaluated with whole-body thin tissue sections from a propranolol-dosed rat. The hands-free operation of the system was demonstrated by creating heatmaps of the parent drug and its hydroxypropranolol glucuronide metabolites with 1 mm resolution in the areas of interest. The sample throughput was approximately 5 min/sample, defined by the time needed for chromatographic separation. The spatial distributions of both the drug and its metabolites were consistent with previous studies employing other liquid extraction-based surface sampling methodologies.
Strategies Toward Automation of Overset Structured Surface Grid Generation
NASA Technical Reports Server (NTRS)
Chan, William M.
2017-01-01
An outline of a strategy for automation of overset structured surface grid generation on complex geometries is described. The starting point of the process consists of an unstructured surface triangulation representation of the geometry derived from a native CAD, STEP, or IGES definition, and a set of discretized surface curves that captures all geometric features of interest. The procedure for surface grid generation is decomposed into an algebraic meshing step, a hyperbolic meshing step, and a gap-filling step. This paper will focus primarily on the high-level plan with details on the algebraic step. The algorithmic procedure for the algebraic step involves analyzing the topology of the network of surface curves, distributing grid points appropriately on these curves, identifying domains bounded by four curves that can be meshed algebraically, concatenating the resulting grids into fewer patches, and extending appropriate boundaries of the concatenated grids to provide proper overlap. Results are presented for grids created on various aerospace vehicle components.
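The algebraic step described above, meshing a domain bounded by four curves, is classically done with a bilinear Coons patch (transfinite interpolation). A minimal sketch of that construction; the boundary curves and grid size are illustrative, and the production tool would work on discretized surface curves rather than analytic ones:

```python
def coons(c_b, c_t, c_l, c_r, nu, nv):
    """Bilinear Coons patch over four boundary curves, each a function
    t in [0, 1] -> (x, y). Curve endpoints must meet at the corners."""
    def pt(u, v):
        b, t, l, r = c_b(u), c_t(u), c_l(v), c_r(v)
        c00, c10, c01, c11 = c_b(0), c_b(1), c_t(0), c_t(1)
        # Sum of the two ruled surfaces minus the bilinear corner term
        return tuple(
            (1 - v) * b[k] + v * t[k] + (1 - u) * l[k] + u * r[k]
            - ((1 - u) * (1 - v) * c00[k] + u * (1 - v) * c10[k]
               + (1 - u) * v * c01[k] + u * v * c11[k])
            for k in range(2))
    return [[pt(i / (nu - 1), j / (nv - 1)) for j in range(nv)]
            for i in range(nu)]

# Trivial check: the unit square reproduces a uniform Cartesian grid
grid = coons(lambda u: (u, 0.0), lambda u: (u, 1.0),
             lambda v: (0.0, v), lambda v: (1.0, v), 5, 5)
```

For curved boundaries the same formula yields a smooth structured grid interpolating all four edges, which is why it suits the fully automatic step of the pipeline.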
L, Frère; I, Paul-Pont; J, Moreau; P, Soudant; C, Lambert; A, Huvet; E, Rinnert
2016-12-15
Every step of microplastic analysis (collection, extraction and characterization) is time-consuming, representing an obstacle to the implementation of large scale monitoring. This study proposes a semi-automated Raman micro-spectroscopy method coupled to static image analysis that allows the screening of a large quantity of microplastic in a time-effective way with minimal machine operator intervention. The method was validated using 103 particles collected at the sea surface spiked with 7 standard plastics: morphological and chemical characterization of particles was performed in <3h. The method was then applied to a larger environmental sample (n=962 particles). The identification rate was 75% and significantly decreased as a function of particle size. Microplastics represented 71% of the identified particles and significant size differences were observed: polystyrene was mainly found in the 2-5mm range (59%), polyethylene in the 1-2mm range (40%) and polypropylene in the 0.335-1mm range (42%). Copyright © 2016 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zolnierczuk, Piotr A; Vacaliuc, Bogdan; Sundaram, Madhan
The Liquids Reflectometer instrument installed at the Spallation Neutron Source (SNS) enables observations of chemical kinetics, solid-state reactions and phase transitions of thin-film materials at both solid and liquid surfaces. Effective measurement of these behaviors requires each sample to be calibrated dynamically using the neutron beam and the data acquisition system in a feedback loop. Since the SNS is an intense neutron source, the time needed to perform the measurement can be the same as the alignment process, leading to a labor-intensive operation that is exhausting to users. An update to the instrument control system, completed in March 2013, implemented the key features of automated sample alignment and robot-driven sample management, allowing for unattended operation over extended periods, lasting as long as 20 hours. We present a case study of the effort, detailing the mechanical, electrical and software modifications that were made as well as the lessons learned during the integration, verification and testing process.
Ebert, Lars Christian; Ptacek, Wolfgang; Breitbeck, Robert; Fürst, Martin; Kronreif, Gernot; Martinez, Rosa Maria; Thali, Michael; Flach, Patricia M
2014-06-01
In this paper we present the second prototype of a robotic system to be used in forensic medicine. The system is capable of performing automated surface documentation using photogrammetry, optical surface scanning and image-guided, post-mortem needle placement for tissue sampling, liquid sampling, or the placement of guide wires. The upgraded system includes workflow optimizations, an automatic tool-change mechanism, a new software module for trajectory planning and a fully automatic computed tomography-data-set registration algorithm. We tested the placement accuracy of the system by using a needle phantom with radiopaque markers as targets. The system is routinely used for surface documentation and resulted in 24 surface documentations over the course of 11 months. We performed accuracy tests for needle placement using a biopsy phantom, and the Virtobot placed introducer needles with an accuracy of 1.4 mm (±0.9 mm). The second prototype of the Virtobot system is an upgrade of the first prototype but mainly focuses on streamlining the workflow and increasing the level of automation and also has an easier user interface. These upgrades make the Virtobot a potentially valuable tool for case documentation in a scalpel-free setting that uses purely imaging techniques and minimally invasive procedures and is the next step toward the future of virtual autopsy.
DAME: planetary-prototype drilling automation.
Glass, B; Cannon, H; Branson, M; Hanagud, S; Paulsen, G
2008-06-01
We describe results from the Drilling Automation for Mars Exploration (DAME) project, including those of the summer 2006 tests from an Arctic analog site. The drill hardware is a hardened, evolved version of the Advanced Deep Drill by Honeybee Robotics. DAME has developed diagnostic and executive software for hands-off surface operations of the evolved version of this drill. The DAME drill automation tested from 2004 through 2006 included adaptively controlled drilling operations and the downhole diagnosis of drilling faults. It also included dynamic recovery capabilities when unexpected failures or drilling conditions were discovered. DAME has developed and tested drill automation software and hardware under stressful operating conditions during its Arctic field testing campaigns at a Mars analog site.
Remote surface inspection system. [of large space platforms
NASA Technical Reports Server (NTRS)
Hayati, Samad; Balaram, J.; Seraji, Homayoun; Kim, Won S.; Tso, Kam S.
1993-01-01
This paper reports on an on-going research and development effort in remote surface inspection of space platforms such as the Space Station Freedom (SSF). It describes the space environment and identifies the types of damage for which to search. This paper provides an overview of the Remote Surface Inspection System that was developed to conduct proof-of-concept demonstrations and to perform experiments in a laboratory environment. Specifically, the paper describes three technology areas: (1) manipulator control for sensor placement; (2) automated non-contact inspection to detect and classify flaws; and (3) an operator interface to command the system interactively and receive raw or processed sensor data. Initial findings for the automated and human visual inspection tests are reported.
Shimizu, Tsutomu; Yamaguchi, Takefumi; Satake, Yoshiyuki; Shimazaki, Jun
2015-03-01
The aim of this study was to investigate topographic "hot spots" on the anterior corneal surface before Descemet stripping automated endothelial keratoplasty (DSAEK) and their effects on postoperative visual acuity and hyperopic shift. Twenty-seven eyes of 27 patients with bullous keratopathy who underwent DSAEK were studied. We defined a hot spot as a focal area with relatively high refractive power on the anterior corneal surface in eyes with bullous keratopathy. Refractive spherical equivalent, keratometric value, and corneal topography were retrospectively evaluated using anterior segment optical coherence tomography (AS-OCT). Hot spots were identified in 11 eyes (42.3%) before DSAEK and disappeared in 9 of these eyes (81.8%) at 6 months after DSAEK. AS-OCT revealed focal epithelial thickening in the same areas as the hot spots. There was no significant difference in the postoperative visual acuity between eyes with and without hot spots (P > 0.05). The keratometric value of the anterior corneal surface significantly flattened from 45.7 ± 2.7 diopters (D) before DSAEK to 44.2 ± 2.7 D 1 month after DSAEK in eyes with hot spots (P = 0.01), whereas in eyes without hot spots, there were no significant differences in the keratometric values before and after DSAEK. At 6 months, the refractive change was +1.1 ± 1.3 D in eyes with hot spots and -0.2 ± 0.6 D in eyes without hot spots (P = 0.034). In eyes with focal epithelial thickening, topographic hot spots on the anterior corneal surface were observed using AS-OCT. The hot spots disappeared after DSAEK and had no influence on the postoperative visual acuity.
Automated Mapping and Characterization of RSL from HiRISE data with MAARSL
NASA Astrophysics Data System (ADS)
Bue, Brian; Wagstaff, Kiri; Stillman, David
2017-10-01
Recurring slope lineae (RSL) are narrow (0.5-5 m) low-albedo features on Mars that recur, fade, and incrementally lengthen on steep slopes throughout the year. Determining the processes that generate RSL requires detailed analysis of high-resolution orbital images to measure RSL surface properties and seasonal variation. However, conducting this analysis manually is labor intensive, time consuming, and infeasible given the large number of relevant sites. This abstract describes the Mapping and Automated Analysis of RSL (MAARSL) system, which we designed to aid large-scale analysis of seasonal RSL properties. MAARSL takes an ordered sequence of high spatial resolution, orthorectified, and coregistered orbital image data (e.g., MRO HiRISE images) and a corresponding Digital Terrain Model (DTM) as input and performs three primary functions: (1) detect and delineate candidate RSL in each image, (2) compute statistics of surface morphology and observed radiance for each candidate, and (3) measure temporal variation between candidates in adjacent images. The main challenge in automatic image-based RSL detection is discriminating true RSL from other low-albedo regions such as shadows or changes in surface materials. To discriminate RSL from shadows, MAARSL constructs a linear illumination model for each image based on the DTM and the position and orientation of the instrument at image acquisition time. We filter out any low-albedo regions that appear to be shadows via a least-squares fit between the modeled illumination and the observed intensity in each image. False detections occur in areas where the 1 m/pixel HiRISE DTM poorly captures the variability of terrain observed in the 0.25 m/pixel HiRISE images. To remove these spurious detections, we developed an interactive machine learning graphical interface that uses expert input to filter and validate the RSL candidates.
This tool yielded 636 candidates from a well-studied sequence of 18 HiRISE images of Garni crater in Valles Marineris with minimal manual effort. We describe our analysis of RSL candidates at Garni crater and Coprates Montes and ongoing studies of other regions where RSL occur.
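The shadow filter described above can be sketched as a one-variable linear least-squares fit of observed intensity against modeled illumination (cosine of solar incidence from the DTM), with regions much darker than the fit predicts kept as RSL candidates. The numbers and the -0.1 residual threshold below are invented for illustration; MAARSL's actual model and thresholds differ:

```python
# Modeled illumination (cos of incidence angle) and observed mean
# intensity for five hypothetical low-albedo regions
cosi = [0.90, 0.80, 0.10, 0.85, 0.05]
obs  = [0.88, 0.79, 0.12, 0.45, 0.06]

n = len(cosi)
mx, my = sum(cosi) / n, sum(obs) / n
# Ordinary least squares: obs ~ a * cosi + b
a = (sum((x - mx) * (y - my) for x, y in zip(cosi, obs))
     / sum((x - mx) ** 2 for x in cosi))
b = my - a * mx

# Regions well explained by illumination (including shadows) have small
# residuals; regions far darker than predicted are RSL candidates.
residual = [y - (a * x + b) for x, y in zip(cosi, obs)]
candidates = [i for i, r in enumerate(residual) if r < -0.1]
```

Here region 3 is brightly lit in the model yet dark in the image, so it survives the filter, while the genuinely shadowed regions (low cosi, low intensity) are discarded.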
NASA Astrophysics Data System (ADS)
Gilson, Gaëlle; Jiskoot, Hester
2017-04-01
Arctic sea fog has not been extensively studied despite its environmental impacts, such as on traffic safety and on glacier ablation in coastal Arctic regions. Understanding fog processes can improve nowcasting of these impacts in remote regions where few observational data exist. To understand fog's physical, macrophysical and radiative properties, it is important to establish an accurate Arctic fog climatology. Our previous study suggested that fog peaks in July over East Greenland and is associated with sea ice break-up and a sea breeze with wind speeds between 1-4 m/s. The goal of this study is to understand Arctic coastal fog macrophysical properties and quantify its vertical extent. Radiosonde profiles were extracted from the Integrated Global Radiosonde Archive (IGRA) between 1980-2012, coincident with manual and automated fog observations at three synoptic weather stations along the coast of East Greenland. A new method using air mass saturation ratio and thermodynamic stability was developed to derive fog top height from IGRA radiosonde profiles. Soundings were classified into nine categories, based on surface and low-level saturation ratio, inversion type, and the fog top height relative to the inversion base. Results show that Arctic coastal fog mainly occurs under thermodynamically stable conditions characterized by deep and strong low-level inversions. Fog thickness is commonly about 100-400 m, often reaching the top of the boundary layer. Fog top height is greater at northern stations, where daily fog duration is also longer and fog often persists throughout the day. Fog thickness is likely correlated with sea ice concentration during sea ice break-up. Overall, it is hypothesized that our sounding classes represent development or dissipation stages of advection fog, or stratus-lowering and fog-lifting processes.
With a new automated method, it is planned to retrieve fog height from IGRA data over Arctic terrain around the entire North Atlantic region. These results will serve as a basis for the incorporation of fog and temperature inversions into glacier surface energy balance models and can aid in improving the parameterization of fog for nowcasting methods for aviation applications.
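A stripped-down version of a saturation-ratio fog-top retrieval: walk up the sounding from the surface and take the fog top as the last level whose saturation ratio stays above a threshold. The profile values and the 0.95 threshold are illustrative; the paper's method additionally uses inversion type and thermodynamic stability:

```python
# Hypothetical sounding: heights (m above surface) and saturation ratio
heights = [0, 50, 120, 250, 400, 700]
sat     = [0.99, 0.98, 0.97, 0.96, 0.80, 0.60]

def fog_top(heights, sat, threshold=0.95):
    """Height of the top of a surface-based saturated layer, or None
    if the surface itself is not near saturation (no fog)."""
    if sat[0] < threshold:
        return None
    top = heights[0]
    for h, s in zip(heights, sat):
        if s < threshold:
            break
        top = h
    return top

top_m = fog_top(heights, sat)
```

The retrieved top would then be compared against the inversion base to classify the sounding into one of the nine categories.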
Process Development for Automated Solar Cell and Module Production. Task 4: Automated Array Assembly
NASA Technical Reports Server (NTRS)
1979-01-01
A baseline sequence for the manufacture of solar cell modules was specified. Starting with silicon wafers, the process goes through damage etching, texture etching, junction formation, plasma edge etch, aluminum back surface field formation, and screen printed metallization to produce finished solar cells. The cells were then series connected on a ribbon and bonded into a finished glass tedlar module. A number of steps required additional developmental effort to verify technical and economic feasibility. These steps include texture etching, plasma edge etch, aluminum back surface field formation, array layup and interconnect, and module edge sealing and framing.
A Graphical Operator Interface for a Telerobotic Inspection System
NASA Technical Reports Server (NTRS)
Kim, W. S.; Tso, K. S.; Hayati, S.
1993-01-01
Operator interface has recently emerged as an important element for efficient and safe operatorinteractions with the telerobotic system. Recent advances in graphical user interface (GUI) andgraphics/video merging technologies enable development of more efficient, flexible operatorinterfaces. This paper describes an advanced graphical operator interface newly developed for aremote surface inspection system at Jet Propulsion Laboratory. The interface has been designed sothat remote surface inspection can be performed by a single operator with an integrated robot controland image inspection capability. It supports three inspection strategies of teleoperated human visual inspection, human visual inspection with automated scanning, and machine-vision-based automated inspection.
An operator interface design for a telerobotic inspection system
NASA Technical Reports Server (NTRS)
Kim, Won S.; Tso, Kam S.; Hayati, Samad
1993-01-01
The operator interface has recently emerged as an important element for efficient and safe interactions between human operators and telerobotics. Advances in graphical user interface and graphics technologies enable us to produce very efficient operator interface designs. This paper describes an efficient graphical operator interface design newly developed for remote surface inspection at NASA-JPL. The interface, designed so that remote surface inspection can be performed by a single operator with an integrated robot control and image inspection capability, supports three inspection strategies of teleoperated human visual inspection, human visual inspection with automated scanning, and machine-vision-based automated inspection.
Efficacy of an automated ultraviolet C device in a shared hospital bathroom.
Cooper, Jesse; Bryce, Elizabeth; Astrakianakis, George; Stefanovic, Aleksandra; Bartlett, Karen
2016-12-01
Toilet flushing can contribute to disease transmission by generating aerosolized bacteria and viruses that can land on nearby surfaces or follow air currents. Aerobic and anaerobic bacterial bioaerosol loads, and bacterial counts on 2 surfaces in a bathroom with a permanently installed, automated ultraviolet C (UVC) irradiation device, were significantly lower than in a comparable bathroom without the UVC device. Permanently installed UVC lights may be a useful supplementary decontamination tool in shared patient bathrooms. Copyright © 2016 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
Automatic aneurysm neck detection using surface Voronoi diagrams.
Cárdenes, Rubén; Pozo, José María; Bogunovic, Hrvoje; Larrabide, Ignacio; Frangi, Alejandro F
2011-10-01
A new automatic approach for saccular intracranial aneurysm isolation is proposed in this work. Due to the inter- and intra-observer variability in manual delineation of the aneurysm neck, a definition based on a minimum cost path around the aneurysm sac is proposed that copes with this variability and is able to make consistent measurements along different data sets, as well as to automate and speedup the analysis of cerebral aneurysms. The method is based on the computation of a minimal path along a scalar field obtained on the vessel surface, to find the aneurysm neck in a robust and fast manner. The computation of the scalar field on the surface is obtained using a fast marching approach with a speed function based on the exponential of the distance from the centerline bifurcation between the aneurysm dome and the parent vessels. In order to assure a correct topology of the aneurysm sac, the neck computation is constrained to a region defined by a surface Voronoi diagram obtained from the branches of the vessel centerline. We validate this method comparing our results in 26 real cases with manual aneurysm isolation obtained using a cut-plane, and also with results obtained using manual delineations from three different observers by comparing typical morphological measures. © 2011 IEEE
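The neck computation above is a minimal-cost path over a scalar field defined on the vessel surface. The path machinery can be illustrated with Dijkstra's algorithm on a small edge-weighted graph standing in for the surface mesh; the graph and weights below are invented, and the actual method uses fast marching on a triangulated surface constrained by the Voronoi diagram:

```python
import heapq

def dijkstra(adj, src, dst):
    """Minimum-cost path on an adjacency map {node: [(nbr, weight)]}."""
    dist = {src: 0.0}
    prev = {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in adj[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    # Reconstruct the path by walking predecessors back to the source
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[dst]

adj = {0: [(1, 1.0), (2, 4.0)],
       1: [(0, 1.0), (2, 1.5)],
       2: [(0, 4.0), (1, 1.5)]}
path, cost = dijkstra(adj, 0, 2)
```

In the aneurysm setting the edge weights would come from the exponential-distance scalar field, so the cheapest loop naturally hugs the neck between dome and parent vessel.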
A Case Study of Reverse Engineering Integrated in an Automated Design Process
NASA Astrophysics Data System (ADS)
Pescaru, R.; Kyratsis, P.; Oancea, G.
2016-11-01
This paper presents a design methodology which automates the generation of curves extracted from point clouds obtained by digitizing physical objects. The methodology is illustrated on a consumer product, a footwear-type product with a complex shape and many curves. The final result is the automated generation of wrapping curves, surfaces and solids according to the characteristics of the customer's foot and the preferences for the chosen model, which leads to the development of customized products.
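Automated curve generation of this kind ultimately evaluates parametric curves fitted to digitized points. As a minimal stand-in, here is de Casteljau evaluation of a Bezier curve from control points; the control points are invented, and a production pipeline would fit B-splines to the full scanned point cloud inside a CAD kernel:

```python
def de_casteljau(ctrl, t):
    """Evaluate a Bezier curve at parameter t by repeated linear
    interpolation of the control polygon."""
    pts = [tuple(p) for p in ctrl]
    while len(pts) > 1:
        pts = [tuple((1 - t) * a + t * b for a, b in zip(p, q))
               for p, q in zip(pts, pts[1:])]
    return pts[0]

# Invented 2D control points for one wrapping curve
ctrl = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]
mid = de_casteljau(ctrl, 0.5)
```

Sampling t over [0, 1] yields the polyline that downstream surface and solid generation would consume.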
Identifying and locating surface defects in wood: Part of an automated lumber processing system
Richard W. Conners; Charles W. McMillin; Kingyao Lin; Ramon E. Vasquez-Espinosa
1983-01-01
Continued increases in the cost of materials and labor make it imperative for furniture manufacturers to control costs by improved yield and increased productivity. This paper describes an Automated Lumber Processing System (ALPS) that employs computer tomography, optical scanning technology, the calculation of an optimum cutting strategy, and a computer-driven laser...
Machine vision for various manipulation tasks
NASA Astrophysics Data System (ADS)
Domae, Yukiyasu
2017-03-01
Bin-picking, re-grasping, pick-and-place, kitting, and similar manipulation tasks are common in factory and warehouse automation. The main problem in automating them is that the target objects (items/parts) have various shapes, weights and surface materials. In my talk, I will show the latest machine vision systems and algorithms that address this problem.
NASA Technical Reports Server (NTRS)
Schroeder, Lyle C.; Bailey, M. C.; Mitchell, John L.
1992-01-01
Methods for increasing the electromagnetic (EM) performance of reflectors with rough surfaces were tested and evaluated. First, one quadrant of the 15-meter hoop-column antenna was retrofitted with computer-driven and controlled motors to allow automated adjustment of the reflector surface. The surface errors, measured with metric photogrammetry, were used in a previously verified computer code to calculate control motor adjustments. With this system, a rough antenna surface (rms of approximately 0.180 inch) was corrected in two iterations to approximately the structural surface smoothness limit of 0.060 inch rms. The antenna pattern and gain improved significantly as a result of these surface adjustments. The EM performance was evaluated with a computer program for distorted reflector antennas which had been previously verified with experimental data. Next, the effects of the surface distortions were compensated for in computer simulations by superimposing excitation from an array feed to maximize antenna performance relative to an undistorted reflector. Results showed that a 61-element array could produce EM performance improvements equal to surface adjustments. When both mechanical surface adjustment and feed compensation techniques were applied, the equivalent operating frequency increased from approximately 6 to 18 GHz.
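The figures of merit quoted above are rms surface errors (0.180 inch reduced to 0.060 inch). A short sketch of the rms computation together with the standard Ruze approximation relating rms error to gain loss; the Ruze formula is the conventional mapping for distorted reflectors, not something stated in the abstract, and the panel error values are hypothetical:

```python
import math

def rms(errors):
    """Root-mean-square of a list of surface errors."""
    return (sum(e * e for e in errors) / len(errors)) ** 0.5

def ruze_loss_db(rms_inches, freq_ghz):
    """Gain loss (dB, negative) from the Ruze approximation:
    G/G0 = exp(-(4*pi*eps/lambda)^2)."""
    lam_m = 0.3 / freq_ghz          # wavelength in metres (c = 3e8 m/s)
    eps_m = rms_inches * 0.0254     # rms error in metres
    return 10 * math.log10(math.exp(-(4 * math.pi * eps_m / lam_m) ** 2))

# Hypothetical panel errors before and after motor adjustment (inches)
before = [0.25, -0.15, 0.20, -0.10, 0.12]
after  = [0.07, -0.05, 0.06, -0.04, 0.05]
```

The Ruze term shows why tightening the rms raises the usable operating frequency: the same surface costs far more gain at 18 GHz than at 6 GHz.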
Reduced Microvascular Density in Omental Biopsies of Children with Chronic Kidney Disease
Grabe, Niels; Lahrmann, Bernd; Nasser, Hamoud; Freise, Christian; Schneider, Axel; Lingnau, Anja; Degenhardt, Petra; Ranchin, Bruno; Sallay, Peter; Cerkauskiene, Rimante; Malina, Michal; Ariceta, Gema; Schmitt, Claus Peter; Querfeld, Uwe
2016-01-01
Background Endothelial dysfunction is an early manifestation of cardiovascular disease (CVD) and consistently observed in patients with chronic kidney disease (CKD). We hypothesized that CKD is associated with systemic damage to the microcirculation, preceding macrovascular pathology. To assess the degree of “uremic microangiopathy”, we have measured microvascular density in biopsies of the omentum of children with CKD. Patients and Methods Omental tissue was collected from 32 healthy children (0–18 years) undergoing elective abdominal surgery and from 23 age-matched cases with stage 5 CKD at the time of catheter insertion for initiation of peritoneal dialysis. Biopsies were analyzed by independent observers using either a manual or an automated imaging system for the assessment of microvascular density. Quantitative immunohistochemistry was performed for markers of autophagy and apoptosis, and for the abundance of the angiogenesis-regulating proteins VEGF-A, VEGF-R2, Angpt1 and Angpt2. Results Microvascular density was significantly reduced in uremic children compared to healthy controls, both by manual imaging with a digital microscope (median surface area 0.61% vs. 0.95%, p<0.0021) and by automated quantification (total microvascular surface area 0.89% vs. 1.17%, p = 0.01). Density measured by manual imaging was significantly associated with age, height, weight and body surface area in CKD patients and healthy controls. In multivariate analysis, age and serum creatinine level were the only independent, significant predictors of microvascular density (r2 = 0.73). There was no immunohistochemical evidence for apoptosis or autophagy. Quantitative staining showed similar expression levels of the angiogenesis regulators VEGF-A, VEGF-receptor 2 and Angpt1 (p = 0.11), but Angpt2 was significantly lower in CKD children (p = 0.01).
Conclusions Microvascular density is profoundly reduced in omental biopsies of children with stage 5 CKD and associated with diminished Angpt2 signaling. Microvascular rarefaction could be an early systemic manifestation of CKD-induced cardiovascular disease. PMID:27846250
[Automated analyser of organ cultured corneal endothelial mosaic].
Gain, P; Thuret, G; Chiquet, C; Gavet, Y; Turc, P H; Théillère, C; Acquart, S; Le Petit, J C; Maugery, J; Campos, L
2002-05-01
Until now, organ-cultured corneal endothelial mosaics have been assessed in France by cell counting with a calibrated graticule or by drawing cells on a computerized image. The former method is unsatisfactory because it lacks an objective evaluation of cell surface and hexagonality and requires an experienced technician. The latter is time-consuming and requires careful attention. We aimed to build an efficient, fast and easy-to-use automated digital analyzer of video images of the corneal endothelium. The hardware included a Pentium III® 800 MHz PC with 256 MB RAM, a Data Translation 3155 acquisition card, a Sony SC 75 CE CCD camera, and a 22-inch screen. Special functions for automated cell boundary determination consisted of plug-in programs included in the ImageTool software. Calibration was performed using a calibrated micrometer. Cell densities of 40 organ-cultured corneas measured by both manual and automated counting were compared using parametric tests (Student's t test for paired variables and the Pearson correlation coefficient). All steps were considered more ergonomic: endothelial image capture, image selection, thresholding of multiple areas of interest, automated cell count, automated detection of errors in cell boundary drawing, and presentation of the results in an HTML file including the number of counted cells, cell density, coefficient of variation of cell area, cell surface histogram and cell hexagonality. The device was efficient: the overall process lasted 7 minutes on average and did not require an experienced technician. The correlation between cell densities obtained with the two methods was high (r=+0.84, p<0.001). Manual counting underestimated density compared with the automated method (2191±322 vs. 2273±457 cells/mm², p=0.046). Our automated endothelial cell analyzer is efficient and gives reliable results quickly and easily. A multicentric validation would allow cell counts to be standardized among cornea banks in our country.
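The metrics this analyzer reports follow from elementary formulas. As a minimal sketch (the function name and sample areas are illustrative, not the device's code), cell density and the coefficient of variation of cell area can be derived from a list of individual cell areas:

```python
import statistics

def endothelial_stats(cell_areas_um2):
    """Standard mosaic metrics from individual cell areas (µm²):
    cell density (cells/mm²) from the mean cell area, and the
    coefficient of variation of cell area. Generic formulas for
    illustration only."""
    mean_area = statistics.mean(cell_areas_um2)
    density = 1e6 / mean_area                       # 1 mm² = 1e6 µm²
    cv = statistics.pstdev(cell_areas_um2) / mean_area
    return density, cv
```

For example, a mosaic whose cells average 500 µm² corresponds to 2000 cells/mm².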
Automated Discovery and Modeling of Sequential Patterns Preceding Events of Interest
NASA Technical Reports Server (NTRS)
Rohloff, Kurt
2010-01-01
The integration of emerging data manipulation technologies has enabled a paradigm shift in practitioners' abilities to understand and anticipate events of interest in complex systems. Example events of interest include outbreaks of socio-political violence in nation-states. Rather than relying on human-centric modeling efforts that are limited by the availability of subject-matter experts (SMEs), automated data processing technologies have enabled the development of innovative automated complex system modeling and predictive analysis technologies. We introduce one such emerging modeling technology: the sequential pattern methodology. We have applied the sequential pattern methodology to automatically identify patterns of observed behavior that precede outbreaks of socio-political violence such as riots, rebellions and coups in nation-states. The sequential pattern methodology is a groundbreaking approach to automated complex system model discovery because it generates easily interpretable patterns from direct observations of sampled factor data, yielding a deeper understanding of societal behaviors while remaining tolerant of observation noise and missing data. The discovered patterns are simple to interpret and mimic human identification of trends in temporal data. Discovered patterns also provide an automated forecasting ability: we discuss an example of using discovered patterns coupled with a rich data environment to forecast various types of socio-political violence in nation-states.
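The precursor-pattern idea can be illustrated with a toy miner that counts the fixed-length windows of observations immediately preceding each event; everything here (function name, window length, symbolic observations) is an illustrative assumption, not the paper's methodology:

```python
from collections import Counter

def precursor_patterns(sequence, event_positions, window=3):
    """Toy mining of patterns preceding events of interest: collect
    the window of observations just before each event and count how
    often each precursor sequence recurs. Generic illustration only."""
    counts = Counter()
    for pos in event_positions:
        if pos >= window:  # need a full window before the event
            counts[tuple(sequence[pos - window:pos])] += 1
    return counts
```

A recurring high-count precursor sequence is then a candidate "pattern preceding the event" usable for forecasting.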
Dust emission from wet and dry playas in the Mojave Desert, USA
Reynolds, R.L.; Yount, J.C.; Reheis, M.; Goldstein, H.; Chavez, P.; Fulton, R.; Whitney, J.; Fuller, C.; Forester, R.M.
2007-01-01
The interactions between playa hydrology and playa-surface sediments are important factors that control the type and amount of dust emitted from playas as a result of wind erosion. The production of evaporite minerals during evaporative loss of near-surface ground water results in both the creation and maintenance of several centimeters or more of loose sediment on and near the surfaces of wet playas. Observations that characterize the texture, mineralogic composition and hardness of playa surfaces at Franklin Lake, Soda Lake and West Cronese Lake playas in the Mojave Desert (California), along with imaging of dust emission using automated digital photography, indicate that these kinds of surface sediment are highly susceptible to dust emission. The surfaces of wet playas are dynamic - surface texture and sediment availability to wind erosion change rapidly, primarily in response to fluctuations in water-table depth, rainfall and rates of evaporation. In contrast, dry playas are characterized by ground water at depth. Consequently, dry playas commonly have hard surfaces that produce little or no dust if undisturbed except for transient silt and clay deposited on surfaces by wind and water. Although not the dominant type of global dust, salt-rich dusts from wet playas may be important with respect to radiative properties of dust plumes, atmospheric chemistry, windborne nutrients and human health.
Development and application of an automated precision solar radiometer
NASA Astrophysics Data System (ADS)
Qiu, Gang-gang; Li, Xin; Zhang, Quan; Zheng, Xiao-bing; Yan, Jing
2016-10-01
Automated field vicarious calibration is a growing trend for satellite remote sensors; it requires a solar radiometer that can automatically measure reliable data over long periods, whatever the weather conditions, and transfer the measurements to the user's office. An automated precision solar radiometer has been developed for measuring the solar spectral irradiance received at the Earth's surface. The instrument consists of 8 parallel, separate silicon-photodiode-based channels with narrow band-pass filters spanning the visible to near-IR regions. Each channel has a 2.0° full-angle field of view (FOV). The detectors and filters are temperature-stabilized at 30 ± 0.2 °C using a thermal energy converter. The instrument is pointed toward the sun by an auto-tracking system that actively tracks the sun to within ±0.1°. It collects data automatically and communicates with the user terminal through BDS (China's BeiDou Navigation Satellite System), while redundantly recording data, including working state and errors, in internal memory. The solar radiometer is automated in the sense that it requires no supervision throughout the whole working process: it calculates start and stop times each day to match sunrise and sunset, and stops working once precipitation begins. Calibrated via Langley curves and compared with simultaneous CE318 observations, the difference in Aerosol Optical Depth (AOD) is within 5%. The solar radiometer has run in harsh weather conditions in the Gobi Desert at Dunhuang and obtained AODs nearly continuously for eight months. This paper presents the instrument design analysis, atmospheric optical depth retrievals, and experimental results.
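The Langley calibration mentioned above fits ln(signal) against airmass, since Beer-Lambert extinction gives ln(V) = ln(V0) − m·τ over a stable, clear half-day. A minimal sketch of the fit (the function name is ours; real retrievals also subtract Rayleigh and gas contributions before quoting aerosol optical depth):

```python
import numpy as np

def langley_aod(airmass, signal):
    """Fit ln(signal) = ln(V0) - m * tau over a Langley interval.
    Returns the extrapolated top-of-atmosphere signal V0 and the
    total optical depth tau. Illustrative sketch only."""
    slope, intercept = np.polyfit(airmass, np.log(signal), 1)
    return float(np.exp(intercept)), float(-slope)
```

The intercept at zero airmass calibrates the instrument; the slope of each clear morning or afternoon yields that period's optical depth.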
Hydrometer calibration by hydrostatic weighing with automated liquid surface positioning
NASA Astrophysics Data System (ADS)
Aguilera, Jesus; Wright, John D.; Bean, Vern E.
2008-01-01
We describe an automated apparatus for calibrating hydrometers by hydrostatic weighing (Cuckow's method) in tridecane, a liquid of known, stable density with a relatively low surface tension and contact angle against glass. The apparatus uses a laser light sheet and a laser power meter to position the tridecane surface at the hydrometer scale mark to be calibrated, with an uncertainty of 0.08 mm. The calibration results have an expanded uncertainty (with a coverage factor of 2) of 100 parts in 10⁶ or less of the liquid density. We validated the apparatus by comparisons using water, toluene, tridecane and trichloroethylene, and found agreement within 40 parts in 10⁶ or less. The new calibration method is consistent with earlier, manual calibrations performed by NIST. When customers use calibrated hydrometers, they may encounter uncertainties of 370 parts in 10⁶ or larger due to surface tension, contact angle and temperature effects.
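At its core, Cuckow's method is buoyancy arithmetic. The simplified sketch below (air buoyancy, surface tension and thermal corrections omitted; the tridecane density used later is a nominal placeholder) shows how the density a scale mark truly represents follows from two weighings:

```python
def density_at_mark(m_air_g, m_immersed_g, rho_ref):
    """Cuckow's method, simplified: weigh the hydrometer in air and
    while held at a scale mark in a reference liquid of density
    rho_ref (g/cm^3). Buoyancy gives the volume below the mark; a
    freely floating hydrometer displaces its own weight, so the
    density the mark truly indicates is mass / volume."""
    v_mark = (m_air_g - m_immersed_g) / rho_ref   # cm^3 below the mark
    return m_air_g / v_mark                        # g/cm^3 at the mark
```

For example, a 50 g hydrometer whose apparent mass held at a mark in a 0.756 g/cm³ liquid is 4.64 g has 60 cm³ below that mark, so the mark corresponds to 50/60 ≈ 0.833 g/cm³.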
Piccinelli, Marina; Faber, Tracy L; Arepalli, Chesnal D; Appia, Vikram; Vinten-Johansen, Jakob; Schmarkey, Susan L; Folks, Russell D; Garcia, Ernest V; Yezzi, Anthony
2014-02-01
Accurate alignment between cardiac CT angiographic studies (CTA) and nuclear perfusion images is crucial for improved diagnosis of coronary artery disease. This study evaluated in an animal model the accuracy of a CTA fully automated biventricular segmentation algorithm, a necessary step for automatic and thus efficient PET/CT alignment. Twelve pigs with acute infarcts were imaged using Rb-82 PET and 64-slice CTA. Post-mortem myocardium mass measurements were obtained. Endocardial and epicardial myocardial boundaries were manually and automatically detected on the CTA and both segmentations used to perform PET/CT alignment. To assess the segmentation performance, image-based myocardial masses were compared to experimental data; the hand-traced profiles were used as a reference standard to assess the global and slice-by-slice robustness of the automated algorithm in extracting myocardium, LV, and RV. Mean distances between the automated and the manual 3D segmented surfaces were computed. Finally, differences in rotations and translations between the manual and automatic surfaces were estimated post-PET/CT alignment. The largest, smallest, and median distances between interactive and automatic surfaces averaged 1.2 ± 2.1, 0.2 ± 1.6, and 0.7 ± 1.9 mm. The average angular and translational differences in CT/PET alignments were 0.4°, -0.6°, and -2.3° about x, y, and z axes, and 1.8, -2.1, and 2.0 mm in x, y, and z directions. Our automatic myocardial boundary detection algorithm creates surfaces from CTA that are similar in accuracy and provide similar alignments with PET as those obtained from interactive tracing. Specific difficulties in a reliable segmentation of the apex and base regions will require further improvements in the automated technique.
Tool to assess contents of ARM surface meteorology network netCDF files
DOE Office of Scientific and Technical Information (OSTI.GOV)
Staudt, A.; Kwan, T.; Tichler, J.
The Atmospheric Radiation Measurement (ARM) Program, supported by the US Department of Energy, is a major program of atmospheric measurement and modeling designed to improve the understanding of processes and properties that affect atmospheric radiation, with a particular focus on the influence of clouds and the role of cloud radiative feedback in the climate system. The ARM Program will use three highly instrumented primary measurement sites. Deployment of instrumentation at the first site, located in the Southern Great Plains of the United States, began in May of 1992. The first phase of deployment at the second site in the Tropical Western Pacific is scheduled for late in 1995. The third site will be in the North Slope of Alaska and adjacent Arctic Ocean. To meet the scientific objectives of ARM, observations from the ARM sites are combined with data from other sources; these are called external data. Among these external data sets are surface meteorological observations from the Oklahoma Mesonet, a Kansas automated weather network, the Wind Profiler Demonstration Network (WPDN), and the National Weather Service (NWS) surface stations. Before combining these data with the Surface Meteorological Observations Station (SMOS) ARM data, it was necessary to assess the contents and quality of both the ARM and the external data sets. Since these data sets had previously been converted to netCDF format for use by the ARM Science Team, a tool was written to assess the contents of the netCDF files.
CellSegm - a MATLAB toolbox for high-throughput 3D cell segmentation
Hodneland, Erlend; Kögel, Tanja; Frei, Dominik Michael; Gerdes, Hans-Hermann; Lundervold, Arvid
2013-01-01
The application of fluorescence microscopy in cell biology often generates a huge amount of imaging data. Automated whole cell segmentation of such data enables the detection and analysis of individual cells, where a manual delineation is often time consuming, or practically not feasible. Furthermore, compared to manual analysis, automation normally has a higher degree of reproducibility. CellSegm, the software presented in this work, is a MATLAB-based command line software toolbox providing an automated whole cell segmentation of images showing surface stained cells, acquired by fluorescence microscopy. It has options for both fully automated and semi-automated cell segmentation. Major algorithmic steps are: (i) smoothing, (ii) Hessian-based ridge enhancement, (iii) marker-controlled watershed segmentation, and (iv) feature-based classification of cell candidates. Using a wide selection of image recordings and code snippets, we demonstrate that CellSegm has the ability to detect various types of surface stained cells in 3D. After detection and outlining of individual cells, the cell candidates can be subject to software-based analysis, specified and programmed by the end-user, or they can be analyzed by other software tools. A segmentation of tissue samples with appropriate characteristics is also shown to be resolvable in CellSegm. The command-line interface of CellSegm facilitates scripting of the separate tools, all implemented in MATLAB, offering a high degree of flexibility and tailored workflows for the end-user. The modularity and scripting capabilities of CellSegm enable automated workflows and quantitative analysis of microscopic data, suited for high-throughput image-based screening. PMID:23938087
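CellSegm itself is a MATLAB toolbox; as a rough, SciPy-only illustration of the four-step idea, the toy below segments a synthetic 2-D image of two membrane-stained ("surface stained") cells. The ridge-enhancement step is approximated by a plain threshold, and all parameter values are arbitrary assumptions, not CellSegm's code:

```python
import numpy as np
from scipy import ndimage as ndi

# Synthetic image: two cells whose membranes appear as bright rings
img = np.zeros((100, 100))
yy, xx = np.mgrid[0:100, 0:100]
for cy, cx in [(30, 30), (70, 70)]:
    r = np.hypot(yy - cy, xx - cx)
    img += np.exp(-((r - 12.0) ** 2) / 8.0)   # ring of radius 12 = membrane

smooth = ndi.gaussian_filter(img, sigma=1.5)   # (i) smoothing
membrane = smooth > 0.3                        # (ii) crude stand-in for ridge enhancement
labels, n = ndi.label(~membrane)               # (iii) separate interiors from background
# (iv) naive "classification": cell candidates are regions not touching the border
cells = [l for l in range(1, n + 1)
         if not (labels == l)[[0, -1], :].any()
         and not (labels == l)[:, [0, -1]].any()]
```

Each surviving label is a cell candidate that downstream, user-programmed analysis could then quantify.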
Automated pulmonary lobar ventilation measurements using volume-matched thoracic CT and MRI
NASA Astrophysics Data System (ADS)
Guo, F.; Svenningsen, S.; Bluemke, E.; Rajchl, M.; Yuan, J.; Fenster, A.; Parraga, G.
2015-03-01
Objectives: To develop and evaluate an automated registration and segmentation pipeline for regional lobar pulmonary structure-function measurements, using volume-matched thoracic CT and MRI in order to guide therapy. Methods: Ten subjects underwent pulmonary function tests and volume-matched ¹H and ³He MRI and thoracic CT during a single 2-hr visit. CT was registered to ¹H MRI using an affine method that incorporated block-matching, followed by a deformable step using free-form deformation. The resultant deformation field was used to deform the associated CT lobe mask that was generated using commercial software. ³He-¹H image registration used the same two-step registration method, and ³He ventilation was segmented using hierarchical k-means clustering. Whole lung and lobar ³He ventilation and ventilation defect percent (VDP) were generated by mapping ventilation defects to CT-defined whole lung and lobe volumes. Target CT-³He registration accuracy was evaluated using region-, surface distance-, and volume-based metrics. Automated whole lung and lobar VDP was compared with semi-automated and manual results using paired t-tests. Results: The proposed pipeline yielded regional spatial agreement of 88.0 ± 0.9% and surface distance error of 3.9 ± 0.5 mm. Automated and manual whole lung and lobar ventilation and VDP were not significantly different, and they were significantly correlated (r = 0.77, p < 0.0001). Conclusion: The proposed automated pipeline can be used to generate regional pulmonary structural-functional maps with high accuracy and robustness, providing an important tool for image-guided pulmonary interventions.
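The ventilation clustering step can be sketched generically. The 1-D k-means below, with quantile seeding and the lowest-intensity cluster taken as "defect", is an illustrative stand-in for the paper's hierarchical k-means; cluster count and seeding are assumptions, not the authors' parameters:

```python
import numpy as np

def ventilation_defect_percent(he3, lung_mask, n_clusters=4):
    """Cluster 3He signal intensities inside the lung mask with a
    simple 1-D k-means; voxels in the lowest-intensity cluster are
    counted as ventilation defect. Illustrative sketch only."""
    vals = he3[lung_mask].astype(float)
    centers = np.quantile(vals, np.linspace(0.1, 0.9, n_clusters))  # seed centers
    for _ in range(100):
        labels = np.argmin(np.abs(vals[:, None] - centers[None, :]), axis=1)
        new = np.array([vals[labels == k].mean() if np.any(labels == k) else centers[k]
                        for k in range(n_clusters)])
        if np.allclose(new, centers):
            break
        centers = new
    defect = int(np.argmin(centers))
    return 100.0 * float(np.mean(labels == defect))
```

Restricting the mask to a single CT-defined lobe instead of the whole lung yields the corresponding lobar VDP.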
NASA Astrophysics Data System (ADS)
Marke, T.; Crewell, S.; Loehnert, U.; Rascher, U.; Schween, J. H.
2015-12-01
This study aims at identifying spatial and temporal patterns of surface-atmosphere exchange parameters from highly-resolved and long-term observations. For this purpose, a combination of continuous ground-based measurements and dedicated aircraft campaigns using state-of-the-art remote sensing instrumentation at the Jülich Observatory for Cloud Evolution (JOYCE) is available. JOYCE provides a constantly growing multi-year data set for detailed insight into boundary layer processes and patterns related to surface conditions since 2011. The JOYCE site is embedded in a rural environment with different crop types. The availability of a scanning microwave radiometer and cloud radar is a unique component of JOYCE. The hemispheric scans of the ground-based radiometer allow the identification and quantification of horizontal gradients in water vapor and liquid water path measurements. How these gradients are connected to near-surface fluxes and the topography depending on the mean wind flow and surface fluxes is investigated by exploring the long-term data set. Additionally, situations with strong coupling to the surface can be identified by observing the atmospheric turbulence and stability within the boundary layer, using different lidar systems. Furthermore, the influence of thin liquid water clouds, which are typical for the boundary layer development, on the radiation field and the interaction with the vegetation is examined. Applying a synergistic statistical retrieval approach, using passive microwave and infrared observations, shows an improvement in retrieving thin liquid cloud microphysical properties. The role of vegetation is assessed by exploiting the time series of the sun-induced chlorophyll fluorescence (SIF) signal measured at the ground level using automated measurements. For selected case studies, a comparison to maps of hyperspectral reflectance and SIF obtained from an airborne high-resolution imaging spectrometer is realized.
Automated real-time detection of defects during machining of ceramics
Ellingson, W.A.; Sun, J.
1997-11-18
Apparatus for the automated real-time detection and classification of defects during the machining of ceramic components employs an elastic optical scattering technique using polarized laser light. A ceramic specimen is continuously moved while being machined. Polarized laser light is directed onto the ceramic specimen surface at a fixed position just aft of the machining tool for examination of the newly machined surface. Any foreign material near the location of the laser light on the ceramic specimen is cleared by an air blast. As the specimen is moved, its surface is continuously scanned by the polarized laser light beam to provide a two-dimensional image presented in real-time on a video display unit, with the motion of the ceramic specimen synchronized with the data acquisition speed. By storing known "feature masks" representing various surface and sub-surface defects and comparing measured defects with the stored feature masks, detected defects may be automatically characterized. Using multiple detectors, various types of defects may be detected and classified. 14 figs.
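The stored "feature mask" comparison can be illustrated generically as normalized-correlation template matching; the function name and the masks below are our own illustrative choices, not the patented implementation:

```python
import numpy as np

def classify_defect(patch, feature_masks):
    """Compare a detected scatter-image patch against stored feature
    masks by normalized correlation and return the best-matching
    defect label. Generic sketch of mask matching only."""
    def ncc(a, b):
        a = (a - a.mean()) / (a.std() + 1e-12)
        b = (b - b.mean()) / (b.std() + 1e-12)
        return float((a * b).mean())
    return max(feature_masks, key=lambda name: ncc(patch, feature_masks[name]))
```

With one mask per known defect signature, the highest-correlation label characterizes the detected defect.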
An Automated Sample Processing System for Planetary Exploration
NASA Technical Reports Server (NTRS)
Soto, Juancarlos; Lasnik, James; Roark, Shane; Beegle, Luther
2012-01-01
An Automated Sample Processing System (ASPS) for wet chemistry processing of organic materials on the surface of Mars has been jointly developed by Ball Aerospace and the Jet Propulsion Laboratory. The mechanism has been built and tested to demonstrate Technology Readiness Level (TRL) 4. This paper describes the function of the system, the mechanism design, lessons learned, and several challenges that were overcome.
NASA Technical Reports Server (NTRS)
Thornhill, J. W.
1977-01-01
The development of a process for fabricating 2 x 4 cm back surface field silicon solar cells having screen-printed wraparound contacts is described. This process was specifically designed to be amenable to incorporation into an automated nonvacuum production line. Techniques were developed to permit the use of screen printing for producing improved back surface field structures, wraparound dielectric layers, and wraparound contacts. The optimized process sequence was then used to produce 1852 finished cells. Tests indicated an average conversion efficiency of 11% at AM0 and 28 C, with an average degradation of maximum power output of 1.5% after boiling water immersion or thermal shock cycling. Contact adherence was satisfactory after these tests, as well as after long term storage at high temperature and high humidity.
Automated segmentation of intraretinal layers from macular optical coherence tomography images
NASA Astrophysics Data System (ADS)
Haeker, Mona; Sonka, Milan; Kardon, Randy; Shah, Vinay A.; Wu, Xiaodong; Abràmoff, Michael D.
2007-03-01
Commercially-available optical coherence tomography (OCT) systems (e.g., Stratus OCT-3) only segment and provide thickness measurements for the total retina on scans of the macula. Since each intraretinal layer may be affected differently by disease, it is desirable to quantify the properties of each layer separately. Thus, we have developed an automated segmentation approach for the separation of the retina on (anisotropic) 3-D macular OCT scans into five layers. Each macular series consisted of six linear radial scans centered at the fovea. Repeated series (up to six, when available) were acquired for each eye and were first registered and averaged together, resulting in a composite image for each angular location. The six surfaces defining the five layers were then found on each 3-D composite image series by transforming the segmentation task into that of finding a minimum-cost closed set in a geometric graph constructed from edge/regional information and a priori-determined surface smoothness and interaction constraints. The method was applied to the macular OCT scans of 12 patients with unilateral anterior ischemic optic neuropathy (corresponding to 24 3-D composite image series). The boundaries were independently defined by two human experts on one raw scan of each eye. Using the average of the experts' tracings as a reference standard resulted in an overall mean unsigned border positioning error of 6.7 +/- 4.0 μm, with five of the six surfaces showing significantly lower mean errors than those computed between the two observers (p < 0.05, pixel size of 50 × 2 μm).
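The full multi-surface minimum-cost closed-set search is beyond a short snippet, but the core idea of a smoothness-constrained minimum-cost surface can be shown for a single surface in a 2-D cost image using dynamic programming. This is an illustrative simplification of the authors' graph formulation, not their algorithm:

```python
import numpy as np

def extract_surface(cost, max_shift=1):
    """Pick one row per column of a 2-D cost image, minimizing total
    cost subject to the smoothness constraint
    |row(x+1) - row(x)| <= max_shift. Illustrative single-surface
    simplification of the minimum-cost closed-set approach."""
    rows, cols = cost.shape
    dp = cost[:, 0].astype(float).copy()          # best cost ending at each row
    back = np.zeros((rows, cols), dtype=int)      # backpointers
    for x in range(1, cols):
        new = np.empty(rows)
        for z in range(rows):
            lo, hi = max(0, z - max_shift), min(rows, z + max_shift + 1)
            j = lo + int(np.argmin(dp[lo:hi]))    # best feasible predecessor
            new[z] = cost[z, x] + dp[j]
            back[z, x] = j
        dp = new
    z = int(np.argmin(dp))
    surf = [z]
    for x in range(cols - 1, 0, -1):              # trace the optimal path back
        z = int(back[z, x])
        surf.append(z)
    return surf[::-1]
```

In the paper's setting the cost image encodes edge/regional information, and several interacting surfaces are found simultaneously rather than one at a time.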
An automated skin segmentation of Breasts in Dynamic Contrast-Enhanced Magnetic Resonance Imaging.
Lee, Chia-Yen; Chang, Tzu-Fang; Chang, Nai-Yun; Chang, Yeun-Chung
2018-04-18
Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is used to diagnose breast disease. Obtaining anatomical information from DCE-MRI requires the skin be manually removed so that blood vessels and tumors can be clearly observed by physicians and radiologists; this requires considerable manpower and time. We develop an automated skin segmentation algorithm that removes the surface skin rapidly and correctly. A rough skin area is first segmented by the active contour model and then analyzed in segments, using the continuity of the skin thickness, to improve accuracy. Blood vessels and mammary glands are retained, remedying the active contour model's tendency to remove some blood vessels. After three-dimensional reconstruction, the DCE-MRIs without the skin reveal internal anatomical information for clinical applications. The Dice coefficients of the 3D reconstructed images using the proposed algorithm and the active contour model alone were 93.2% and 61.4%, respectively. Automated skin segmentation was about 165 times faster than manual removal. Texture measures at the tumor position with and without the skin were compared by paired t-tests, all yielding p < 0.05, which suggests the proposed algorithm enhances the observability of tumors at the 0.05 significance level.
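The segment-wise use of skin-thickness continuity can be illustrated with a toy post-processing step that replaces locally inconsistent thickness estimates with the neighborhood median; the window size and tolerance are assumptions, not the authors' parameters:

```python
import numpy as np

def enforce_thickness_continuity(thickness, window=5, max_dev=2.0):
    """Given a per-column skin-thickness profile (pixels), replace any
    value deviating from its local median by more than max_dev with
    that median, enforcing a continuous skin band. Illustrative
    sketch of the thickness-continuity idea only."""
    t = np.asarray(thickness, float)
    out = t.copy()
    half = window // 2
    for i in range(len(t)):
        lo, hi = max(0, i - half), min(len(t), i + half + 1)
        med = np.median(t[lo:hi])
        if abs(t[i] - med) > max_dev:
            out[i] = med
    return out
```

Outlier columns, where an active contour leaked into a vessel or gland, are pulled back toward the locally consistent skin thickness.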
Highly automated driving, secondary task performance, and driver state.
Merat, Natasha; Jamson, A Hamish; Lai, Frank C H; Carsten, Oliver
2012-10-01
A driving simulator study compared the effect of changes in workload on performance in manual and highly automated driving. Changes in driver state were also observed by examining variations in blink patterns. With the addition of a greater number of advanced driver assistance systems in vehicles, the driver's role is likely to alter in the future from an operator in manual driving to a supervisor of highly automated cars. Understanding the implications of such advancements on drivers and road safety is important. A total of 50 participants were recruited for this study and drove the simulator in both manual and highly automated mode. As well as comparing the effect of adjustments in driving-related workload on performance, the effect of a secondary Twenty Questions Task was also investigated. In the absence of the secondary task, drivers' response to critical incidents was similar in manual and highly automated driving conditions. The worst performance was observed when drivers were required to regain control of driving in the automated mode while distracted by the secondary task. Blink frequency patterns were more consistent for manual than automated driving but were generally suppressed during conditions of high workload. Highly automated driving did not have a deleterious effect on driver performance, when attention was not diverted to the distracting secondary task. As the number of systems implemented in cars increases, an understanding of the implications of such automation on drivers' situation awareness, workload, and ability to remain engaged with the driving task is important.
A Successful Automated Search for Crouching Giants
NASA Astrophysics Data System (ADS)
Cabanela, J. E.; Dickey, J. M.
2000-12-01
Much effort has been expended during the last two decades on the search for Low Surface Brightness galaxies (LSBs), the galaxies Disney called "Crouching Giants," which may be a dominant mass repository in the universe. The difficulty in gathering information on a significant population of LSBs lies in the time-consuming nature of identifying LSB candidates. To date, all survey-based searches for LSBs have involved manual inspections of plate-based material or optical CCD observations. We have conducted the first successful automated search for HI-rich galaxies (including LSBs) using the Minnesota Automated Plate Scanner (APS) Catalog of the POSS I. We identified HI-rich candidates by selecting galaxies located on the "blue edge" of an O-E vs. E color-magnitude diagram from the APS Catalog. Subsequent 21-cm observations on the upgraded Arecibo 305 m dish showed that over 50% of our observed candidates were HI-rich, with M_HI/L_B ranging from 0.1 to 4.8 (in solar units). These M_HI/L_B values are comparable to those of LSB candidates selected by manual means. Comparison of our candidate galaxies with known LSB galaxies shows that they have similar bivariate brightness distributions as well as other optical properties. Furthermore, examination of existing LSB catalogs shows that over 65% of LSBs are located on the "blue edge," whereas only 10% of the general APS galaxy population has O-E values this low. Known LSB galaxies on the O-E "blue edge" include several LSBs with red B-V colors from O'Neil, Bothun, and Schombert (2000), indicating our bandpasses are critical in the segregation of these LSB candidates from the general population. We have determined the physical basis for the success of these simple search criteria, which is tied to the low current star formation rate of LSBs. The details of the search algorithm and guidelines on how to apply it to other existing surveys, such as the SDSS, will be provided.
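The selection criterion itself reduces to a one-line color cut. In the sketch below the O-E cut value is a placeholder and the catalog records are hypothetical; the survey's actual criterion is defined on the full O-E vs. E color-magnitude diagram:

```python
def blue_edge_candidates(catalog, oe_cut=0.9):
    """Select HI-rich/LSB candidates as galaxies on the 'blue edge',
    i.e. with O-E color below a cut. The cut value is a placeholder
    for illustration; each record is assumed to carry O and E
    plate magnitudes."""
    return [g for g in catalog if g["O"] - g["E"] < oe_cut]
```

Candidates passing the cut would then be scheduled for 21-cm follow-up.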
Improvements in AVHRR Daytime Cloud Detection Over the ARM NSA Site
NASA Technical Reports Server (NTRS)
Chakrapani, V.; Spangenberg, D. A.; Doelling, D. R.; Minnis, P.; Trepte, Q. Z.; Arduini, R. F.
2001-01-01
Clouds play an important role in the radiation budget over Arctic and Antarctic. Because of limited surface observing capabilities, it is necessary to detect clouds over large areas using satellite imagery. At low and mid-latitudes, satellite-observed visible (VIS; 0.65 micrometers) and infrared (IR; 11 micrometers) radiance data are used to derive cloud fraction, temperature, and optical depth. However, the extreme variability in the VIS surface albedo makes the detection of clouds from satellite a difficult process in polar regions. The IR data often show that the surface is nearly the same temperature or even colder than clouds, further complicating cloud detection. Also, the boundary layer can have large areas of haze, thin fog, or diamond dust that are not seen in standard satellite imagery. Other spectral radiances measured by satellite imagers provide additional information that can be used to more accurately discriminate clouds from snow and ice. Most techniques currently use a fixed reflectance or temperature threshold to decide between clouds and clear snow. Using a subjective approach, Minnis et al. (2001) found that the clear snow radiance signatures vary as a function of viewing and illumination conditions as well as snow condition. To routinely process satellite imagery over polar regions with an automated algorithm, it is necessary to account for this angular variability and the change in the background reflectance as snow melts, vegetation grows over land, and melt ponds form on pack ice. This paper documents the initial satellite-based cloud product over the Atmospheric Radiation Measurement (ARM) North Slope of Alaska (NSA) site at Barrow for use by the modeling community. Cloud amount and height are determined subjectively using an adaptation of the methodology of Minnis et al. (2001) and the radiation fields arc determined following the methods of Doelling et al. 
(2001) as applied to data taken during the Surface Heat and Energy Budget of the Arctic (SHEBA). The procedures and data produced in this empirically based analysis will also facilitate the development of the automated algorithm for future processing of satellite data over the ARM NSA domain. Results are presented for May, June, and July 1998. ARM surface data are used to partially validate the results taken directly over the ARM site.
Dallabernardina, Pietro; Ruprecht, Colin; Smith, Peter J; Hahn, Michael G; Urbanowicz, Breeanna R; Pfrengle, Fabian
2017-12-06
We report the automated glycan assembly of oligosaccharides related to the plant cell wall hemicellulosic polysaccharide xyloglucan. The synthesis of galactosylated xyloglucan oligosaccharides was enabled by introducing p-methoxybenzyl (PMB) as a temporary protecting group for automated glycan assembly. The generated oligosaccharides were printed as microarrays, and the binding of a collection of xyloglucan-directed monoclonal antibodies (mAbs) to the oligosaccharides was assessed. We also demonstrated that the printed glycans can be further enzymatically modified while appended to the microarray surface by Arabidopsis thaliana xyloglucan xylosyltransferase 2 (AtXXT2).
Design automation for complex CMOS/SOS LSI hybrid substrates
NASA Technical Reports Server (NTRS)
Ramondetta, P. W.; Smiley, J. W.
1976-01-01
A design-automation approach used to develop thick-film hybrid packages is described. The hybrid packages produced combine thick-film and silicon-on-sapphire (SOS) large-scale integration (LSI) technologies to bring the on-chip performance level of SOS to the subsystem level. Packing densities are improved by a factor of eight over ceramic dual in-line packaging; interchip wiring capacitance is low. Due to significant time savings, the design-automation approach presented can be expected to yield a 3:1 reduction in cost over the use of manual methods for the initial design of a hybrid.
Lightning Jump Algorithm Development for the GOES-R Geostationary Lightning Mapper
NASA Technical Reports Server (NTRS)
Schultz, E.; Schultz, C.; Chronis, T.; Stough, S.; Carey, L.; Calhoun, K.; Ortega, K.; Stano, G.; Cecil, D.; Bateman, M.
2014-01-01
Current work on the lightning jump algorithm to be used in GOES-R Geostationary Lightning Mapper (GLM)'s data stream is multifaceted due to the intricate interplay between the storm tracking, GLM proxy data, and the performance of the lightning jump itself. This work outlines the progress of the last year, where analysis and performance of the lightning jump algorithm with automated storm tracking and GLM proxy data were assessed using over 700 storms from North Alabama. The cases analyzed coincide with previous semi-objective work performed using total lightning mapping array (LMA) measurements in Schultz et al. (2011). Analysis shows that key components of the algorithm (flash rate and sigma thresholds) have the greatest influence on the performance of the algorithm when validating using severe storm reports. Automated objective analysis using the GLM proxy data has shown probability of detection (POD) values around 60% with false alarm rates (FAR) around 73% using similar methodology to Schultz et al. (2011). However, when applying verification methods similar to those employed by the National Weather Service, POD values increase slightly (69%) and FAR values decrease (63%). The relationship between storm tracking and lightning jump has also been tested in a real-time framework at NSSL. This system includes fully automated tracking by radar alone, real-time LMA and radar observations and the lightning jump. Results indicate that the POD is strong at 65%. However, the FAR is significantly higher than in Schultz et al. (2011) (50-80% depending on various tracking/lightning jump parameters) when using storm reports for verification. Given known issues with Storm Data, the performance of the real-time jump algorithm is also being tested with high density radar and surface observations from the NSSL Severe Hazards Analysis & Verification Experiment (SHAVE).
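The POD and FAR figures quoted above follow the standard 2 × 2 warning-verification contingency table (hits, misses, false alarms). The function names below are illustrative, not taken from the cited work; a minimal sketch:

```python
def pod(hits, misses):
    """Probability of detection: fraction of observed events that were warned on."""
    return hits / (hits + misses)

def far(hits, false_alarms):
    """False alarm ratio: fraction of warnings not matched by an observed event."""
    return false_alarms / (hits + false_alarms)

# Example: 65 verified jumps and 35 missed events gives POD = 0.65;
# 65 verified jumps against 65 unverified warnings gives FAR = 0.50.
```

Whether FAR is computed against storm reports or against NWS-style verification changes both inputs, which is why the abstract reports different POD/FAR pairs for the same storm set.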
Burt, Stephen
2016-09-28
A wide range of surface and near-surface meteorological observations were made at the University of Reading's Atmospheric Observatory in central southern England (latitude 51.441° N, longitude 0.938° W, altitude 66 m above mean sea level) during the deep partial eclipse on the morning of 20 March 2015. Observations of temperature, humidity, radiation, wind speed and direction, and atmospheric pressure were made by computerized logging equipment at 1 Hz, supplemented by an automated cloud base recorder sampling at 1 min intervals and a high-resolution (approx. 10 m vertical interval) atmospheric sounding by radiosonde launched from the same location during the eclipse. Sources and details of each instrumental measurement are described briefly, followed by a summary of observed and derived measurements by meteorological parameter. Atmospheric boundary layer responses to the solar eclipse were muted owing to the heavily overcast conditions which prevailed at the observing location, but instrumental records of the event documented a large (approx. 80%) reduction in global solar radiation, a fall in air temperature of around 0.6°C, a decrease in cloud base height, and a slight increase in atmospheric stability during the eclipse. Changes in surface atmospheric moisture content and barometric pressure were largely insignificant during the event. This article is part of the themed issue 'Atmospheric effects of solar eclipses stimulated by the 2015 UK eclipse'. © 2016 The Author(s).
Agile Science Operations: A New Approach for Primitive Exploration Bodies
NASA Technical Reports Server (NTRS)
Chien, Steve A.; Thompson, David R.; Castillo-Rogez, Julie C.; Doyle, Richard; Estlin, Tara; Mclaren, David
2012-01-01
Primitive body exploration missions such as potential Comet Surface Sample Return or Trojan Tour and Rendezvous would challenge traditional operations practices. Earth-based observations would provide only basic understanding before arrival and many science goals would be defined during the initial rendezvous. It could be necessary to revise trajectories and observation plans to quickly characterize the target for safe, effective observations. Detection of outgassing activity and monitoring of comet surface activity are even more time constrained, with events occurring faster than round-trip light time. "Agile science operations" address these challenges with contingency plans that recognize the intrinsic uncertainty in the operating environment and science objectives. Planning for multiple alternatives can significantly improve the time required to repair and validate spacecraft command sequences. When appropriate, time-critical decisions can be automated and shifted to the spacecraft for immediate access to instrument data. Mirrored planning systems on both sides of the light-time gap permit transfer of authority back and forth as needed. We survey relevant science objectives, identifying time bottlenecks and the techniques that could be used to speed missions' reaction to new science data. Finally, we discuss the results of a trade study simulating agile observations during flyby and comet rendezvous scenarios. These experiments quantify instrument coverage of key surface features as a function of planning turnaround time. Careful application of agile operations techniques can play a significant role in realizing the Decadal Survey plan for primitive body exploration.
ICOADS: A Foundational Database with a new Release
NASA Astrophysics Data System (ADS)
Angel, W.; Freeman, E.; Woodruff, S. D.; Worley, S. J.; Brohan, P.; Dumenil-Gates, L.; Kent, E. C.; Smith, S. R.
2016-02-01
The International Comprehensive Ocean-Atmosphere Data Set (ICOADS) offers surface marine data spanning the past three centuries and is the world's largest collection of marine surface in situ observations with approximately 300 million unique records from 1662 to the present in a common International Maritime Meteorological Archive (IMMA) format. Simple gridded monthly summary products (including netCDF) for 2° latitude x 2° longitude boxes back to 1800 and 1° x 1° boxes since 1960 are computed for each month. ICOADS observations made available in the IMMA format are taken primarily from ships (merchant, ocean research, fishing, navy, etc.) and moored and drifting buoys. Each report contains individual observations of meteorological and oceanographic variables, such as sea surface and air temperatures, winds, pressure, humidity, wet bulb, dew point, ocean waves and cloudiness. A monthly summary for an area box includes ten statistics (e.g. mean, median, standard deviation, etc.) for 22 observed and computed variables (e.g. sea surface and air temperature, wind, pressure, humidity, cloudiness, etc.). ICOADS is the most complete and heterogeneous collection of surface marine data in existence. A major new historical update, Release 3.0 (R3.0), now in production (with availability anticipated in mid-2016) will contain a variety of important updates. These updates will include unique IDs (UIDs), new IMMA attachments, ICOADS Value-Added Database (IVAD), and numerous new or improved historical and contemporary data sources. UIDs are assigned to each individual marine report, which will greatly facilitate interaction between users and data developers, and affords record traceability. A new Near-Surface Oceanographic (Nocn) attachment has been developed to include oceanographic profile elements, such as sea surface salinity, sea surface temperatures, and their associated measurement depths. 
Additionally, IVAD allows a feedback mechanism of data adjustments which can be stored within each IMMA report. R3.0 includes near-surface ocean profile measurements from sources such as the World Ocean Database (WOD), Shipboard Automated Meteorological and Oceanographic System (SAMOS), as well as many others. An in-depth look at the improvements and the data inputs planned for R3.0 will be further discussed.
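The 2° × 2° monthly summary products described above amount to binning individual marine reports into lat/lon boxes and computing per-box statistics. The sketch below is a generic illustration under assumed conventions (box indexing, a subset of the ten statistics), not the actual ICOADS processing code:

```python
import math
import statistics
from collections import defaultdict

def box_index(lat, lon, size=2.0):
    """Map a position to a size-degree latitude/longitude box (ICOADS-style)."""
    return (math.floor(lat / size), math.floor((lon % 360) / size))

def monthly_summary(obs, size=2.0):
    """obs: iterable of (lat, lon, value) reports for one month and one variable.
    Returns {box: (mean, median, stdev, count)} -- a subset of the ten
    statistics ICOADS computes per box."""
    boxes = defaultdict(list)
    for lat, lon, v in obs:
        boxes[box_index(lat, lon, size)].append(v)
    return {b: (statistics.fmean(vals),
                statistics.median(vals),
                statistics.stdev(vals) if len(vals) > 1 else 0.0,
                len(vals))
            for b, vals in boxes.items()}
```

Two ship reports near 10-11° N, 20-21° E fall into the same 2° box and are averaged together, while a report at 50° N, 100° E lands in its own box.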
USDA-ARS?s Scientific Manuscript database
Biologically active compounds originating from agricultural, residential, and industrial sources have been detected in surface waters, which have invoked concern of their potential ecological and human health effects. Automated and grab surface water samples, passive water samples - Polar Organic Co...
Earth Observations from the International Space Station: Benefits for Humanity
NASA Technical Reports Server (NTRS)
Stefanov, William L.
2015-01-01
The International Space Station (ISS) is a unique terrestrial remote sensing platform for observation of the Earth's land surface, oceans, and atmosphere. Unlike automated remote-sensing platforms, it has a human crew; is equipped with both internal and externally-mounted active and passive remote sensing instruments; and has an inclined, low-Earth orbit that provides variable views and lighting (day and night) over 95 percent of the inhabited surface of the Earth. As such, it provides a useful complement to autonomous, sun-synchronous sensor systems in higher altitude polar orbits. Beginning in May 2012, NASA ISS sensor systems have been available to respond to requests for data through the International Charter, Space and Major Disasters, also known as the "International Disaster Charter" or IDC. Data from digital handheld cameras, multispectral, and hyperspectral imaging systems have been acquired in response to IDC activations and delivered to requesting agencies through the United States Geological Survey. The characteristics of the ISS for Earth observation will be presented, including past, current, and planned NASA, International Partner, and commercial remote sensing systems. The role and capabilities of the ISS for humanitarian benefit, specifically collection of remotely sensed disaster response data, will be discussed.
Automated registration of tail bleeding in rats.
Johansen, Peter B; Henriksen, Lars; Andresen, Per R; Lauritzen, Brian; Jensen, Kåre L; Juhl, Trine N; Tranholm, Mikael
2008-05-01
An automated system for registration of tail bleeding in rats using a camera and a user-designed PC-based software program has been developed. The live and processed images are displayed on the screen and are exported together with a text file for later statistical processing of the data allowing calculation of e.g. number of bleeding episodes, bleeding times and bleeding areas. Proof-of-principle was achieved when the camera captured the blood stream after infusion of rat whole blood into saline. Suitability was assessed by recording of bleeding profiles in heparin-treated rats, demonstrating that the system was able to capture on/off bleedings and that the data transfer and analysis were conducted successfully. Then, bleeding profiles were visually recorded by two independent observers simultaneously with the automated recordings after tail transection in untreated rats. Linear relationships were found in the number of bleedings, demonstrating, however, a statistically significant difference in the recording of bleeding episodes between observers. Also, the bleeding time was longer for visual compared to automated recording. No correlation was found between blood loss and bleeding time in untreated rats, but in heparinized rats a correlation was suggested. Finally, the blood loss correlated with the automated recording of bleeding area. In conclusion, the automated system has proven suitable for replacing visual recordings of tail bleedings in rats. Inter-observer differences can be eliminated, monotonous repetitive work avoided, and a higher through-put of animals in less time achieved. The automated system will lead to an increased understanding of the nature of bleeding following tail transection in different rodent models.
Boers, A M; Marquering, H A; Jochem, J J; Besselink, N J; Berkhemer, O A; van der Lugt, A; Beenen, L F; Majoie, C B
2013-08-01
Cerebral infarct volume (CIV) as observed in follow-up CT is an important radiologic outcome measure of the effectiveness of treatment of patients with acute ischemic stroke. However, manual measurement of CIV is time-consuming and operator-dependent. The purpose of this study was to develop and evaluate a robust automated measurement of the CIV. The CIV in early follow-up CT images of 34 consecutive patients with acute ischemic stroke was segmented with an automated intensity-based region-growing algorithm, which includes partial volume effect correction near the skull, midline determination, and ventricle and hemorrhage exclusion. Two observers manually delineated the CIV. Interobserver variability of the manual assessments and the accuracy of the automated method were evaluated by using the Pearson correlation, Bland-Altman analysis, and Dice coefficients. The accuracy was defined as the correlation with the manual assessment as a reference standard. The Pearson correlation for the automated method compared with the reference standard was similar to the manual correlation (R = 0.98). The accuracy of the automated method was excellent, with a mean difference of 0.5 mL and limits of agreement of -38.0 to 39.1 mL, which were more consistent than the interobserver variability of the 2 observers (-40.9 to 44.1 mL). However, the Dice coefficients were higher for the manual delineation. The automated method showed a strong correlation and accuracy with the manual reference measurement. This approach has the potential to become the standard in assessing the infarct volume as a secondary outcome measure for evaluating the effectiveness of treatment.
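The Dice coefficient used above to compare manual and automated delineations measures the overlap of two binary segmentations, 2|A∩B| / (|A| + |B|). A minimal sketch (the flat 0/1 mask representation is an assumption for illustration):

```python
def dice(mask_a, mask_b):
    """Dice coefficient between two binary masks given as flat 0/1 sequences
    of equal length: 2*|A intersect B| / (|A| + |B|)."""
    inter = sum(1 for a, b in zip(mask_a, mask_b) if a and b)
    total = sum(mask_a) + sum(mask_b)
    # Convention: two empty masks agree perfectly.
    return 2.0 * inter / total if total else 1.0
```

Identical masks score 1.0 and disjoint masks score 0.0, which is why Dice complements volume-based measures like Bland-Altman limits: two segmentations can agree on volume while overlapping poorly.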
Automatic detection of surface changes on Mars - a status report
NASA Astrophysics Data System (ADS)
Sidiropoulos, Panagiotis; Muller, Jan-Peter
2016-10-01
Orbiter missions have acquired approximately 500,000 high-resolution visible images of the Martian surface, covering an area approximately 6 times larger than the overall area of Mars. This data abundance allows the scientific community to examine the Martian surface thoroughly and potentially make exciting new discoveries. However, the increased data volume, as well as its complexity, generates problems at the data processing stages, which are mainly related to a number of unresolved issues that batch-mode planetary data processing presents. As a matter of fact, the scientific community is currently struggling to scale the common paradigm ("one-at-a-time" processing of incoming products by expert scientists) to tackle the large volumes of input data. Moreover, expert scientists are more or less forced to use complex software in order to extract input information for their research from raw data, even though they are not data scientists themselves. Our work within the STFC and EU FP7 i-Mars projects aims at developing automated software that will process all of the acquired data, leaving domain expert planetary scientists to focus on their final analysis and interpretation. Moreover, after completing the development of a fully automated pipeline that co-registers high-resolution NASA images to the ESA/DLR HRSC baseline, our main goal has shifted to the automated detection of surface changes on Mars. In particular, we are developing a pipeline that takes multi-instrument image pairs as input and processes them automatically in order to identify changes that are correlated with dynamic phenomena on the Martian surface. 
The pipeline has currently been tested in anger on 8,000 co-registered images, and by the time of DPS/EPSC we expect to have processed many tens of thousands of image pairs, producing a set of change detection results, a subset of which will be shown in the presentation. The research leading to these results has received funding from the STFC MSSL Consolidated Grant under "Planetary Surface Data Mining" ST/K000977/1 and partial support from the European Union's Seventh Framework Programme (FP7/2007-2013) under iMars grant agreement number 607379.
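At its simplest, change detection on a co-registered image pair reduces to thresholding a pixel-wise difference. The sketch below is a generic illustration of that core step, not the iMars pipeline itself, which must additionally handle calibration, illumination, shadows, and viewing-geometry differences:

```python
def change_mask(img_a, img_b, threshold=0.1):
    """Flag pixels whose absolute brightness difference between two
    co-registered images (nested lists of floats) exceeds a threshold."""
    return [[abs(a - b) > threshold for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(img_a, img_b)]
```

In practice the threshold would be tuned per instrument pair, since cross-instrument pairs (e.g. different NASA cameras co-registered to the HRSC baseline) differ in radiometry even where the surface is unchanged.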
FRAMES Metadata Reporting Templates for Ecohydrological Observations, version 1.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Christianson, Danielle; Varadharajan, Charuleka; Christoffersen, Brad
FRAMES is a set of Excel metadata files and package-level descriptive metadata that are designed to facilitate and improve capture of desired metadata for ecohydrological observations. The metadata are bundled with data files into a data package and submitted to a data repository (e.g. the NGEE Tropics Data Repository) via a web form. FRAMES standardizes reporting of diverse ecohydrological and biogeochemical data for synthesis across a range of spatiotemporal scales and incorporates many best data science practices. This version of FRAMES supports observations for primarily automated measurements collected by permanently located sensors, including sap flow (tree water use), leaf surface temperature, soil water content, dendrometry (stem diameter growth increment), and solar radiation. Version 1.1 extends the controlled vocabulary and incorporates functionality to facilitate programmatic use of data and FRAMES metadata (R code available at the NGEE Tropics Data Repository).
Kooistra, Lammert; Bergsma, Aldo; Chuma, Beatus; de Bruin, Sytze
2009-01-01
This paper describes the development of a sensor web based approach which combines earth observation and in situ sensor data to derive typical information offered by a dynamic web mapping service (WMS). A prototype has been developed which provides daily maps of vegetation productivity for the Netherlands with a spatial resolution of 250 m. Daily available MODIS surface reflectance products and meteorological parameters obtained through a Sensor Observation Service (SOS) were used as input for a vegetation productivity model. This paper presents the vegetation productivity model, the sensor data sources and the implementation of the automated processing facility. Finally, an evaluation is made of the opportunities and limitations of sensor web based approaches for the development of web services which combine both satellite and in situ sensor sources. PMID:22574019
Transparent Conveyor of Dielectric Liquids or Particles
NASA Technical Reports Server (NTRS)
Calle, Carlos I.; Mantovani, James G.
2009-01-01
The concept of a transparent conveyor of small loose dielectric particles or small amounts of dielectric liquids has emerged as an outgrowth of an effort to develop efficient, reliable means of automated removal of dust from solar cells and from windows of optical instruments. This concept is based on the previously reported concept of an electrodynamic screen, according to which a grid-like electric field is established on and near a surface and is moved along the surface perpendicularly to the grid lines. The resulting electrodynamic forces on loose dielectric particles or dielectric liquid drops in the vicinity would move the particles or drops along the surface. In the original dust-removal application, dust particles would thus be swept out of the affected window area. Other potential applications may occur in nanotechnology -- for example, involving mixing of two or more fluids and/or nanoscale particles under optical illumination and/or optical observation.
Automation of temperature control for large-array microwave surface applicators.
Zhou, L; Fessenden, P
1993-01-01
An adaptive temperature control system has been developed for the microstrip antenna array applicators used for large-area superficial hyperthermia. A recursive algorithm which allows rapid power updating even for large antenna arrays and accounts for coupling between neighbouring antennas has been developed, based on a first-order difference equation model. Surface temperatures from the centre of each antenna element are the primary feedback information. Also used are temperatures from additional surface probes placed within the treatment field to protect locations vulnerable to excessive temperatures. In addition, temperatures at depth are observed by mappers and utilized to restrain power to reduce treatment-related complications. Experiments on a tissue-equivalent phantom capable of dynamic differential cooling have successfully verified this temperature control system. The results with the 25-element (5 x 5) antenna array have demonstrated that during dynamic water cooling changes and other experimentally simulated disturbances, the controlled temperatures converge to desired temperature patterns with a precision close to the resolution of the thermometry system (0.1°C).
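The flavor of such a recursive power update can be illustrated in a much-simplified scalar form that ignores the inter-antenna coupling the actual algorithm models: nudge each antenna's power in proportion to its temperature error, clipped to a safe range. The gain and power limit below are invented for the example, not values from the cited work:

```python
def power_update(power, temp, setpoint, gain=0.5, p_max=30.0):
    """One recursive control step for a single antenna: adjust power in
    proportion to the temperature error, clipped to [0, p_max] (the clip
    doubles as a crude anti-windup / safety limit)."""
    p = power + gain * (setpoint - temp)
    return min(max(p, 0.0), p_max)
```

Driving a toy first-order tissue model T[k+1] = 0.9*T[k] + 0.2*P[k] with this update pulls the surface temperature from 37°C to a 43°C hyperthermia setpoint and holds it there.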
Chen, Weiqi; Wang, Lifei; Van Berkel, Gary J; Kertesz, Vilmos; Gan, Jinping
2016-03-25
Herein, quantitation aspects of a fully automated autosampler/HPLC-MS/MS system applied for unattended droplet-based surface sampling of repaglinide dosed thin tissue sections with subsequent HPLC separation and mass spectrometric analysis of parent drug and various drug metabolites were studied. Major organs (brain, lung, liver, kidney and muscle) from whole-body thin tissue sections and corresponding organ homogenates prepared from repaglinide dosed mice were sampled by surface sampling and by bulk extraction, respectively, and analyzed by HPLC-MS/MS. A semi-quantitative agreement between data obtained by surface sampling and that by employing organ homogenate extraction was observed. Drug concentrations obtained by the two methods followed the same patterns for post-dose time points (0.25, 0.5, 1 and 2 h). Drug amounts determined in the specific tissues were typically higher when analyzing extracts from the organ homogenates. In addition, relative comparison of the levels of individual metabolites between the two analytical methods also revealed good semi-quantitative agreement. Copyright © 2015 Elsevier B.V. All rights reserved.
Statistically Comparing the Performance of Multiple Automated Raters across Multiple Items
ERIC Educational Resources Information Center
Kieftenbeld, Vincent; Boyer, Michelle
2017-01-01
Automated scoring systems are typically evaluated by comparing the performance of a single automated rater item-by-item to human raters. This presents a challenge when the performance of multiple raters needs to be compared across multiple items. Rankings could depend on specifics of the ranking procedure; observed differences could be due to…
Comparability of automated human induced pluripotent stem cell culture: a pilot study.
Archibald, Peter R T; Chandra, Amit; Thomas, Dave; Chose, Olivier; Massouridès, Emmanuelle; Laâbi, Yacine; Williams, David J
2016-12-01
Consistent and robust manufacturing is essential for the translation of cell therapies, and the utilisation of automation throughout the manufacturing process may allow for improvements in quality control, scalability, reproducibility and economics of the process. The aim of this study was to measure and establish the comparability between alternative process steps for the culture of hiPSCs. Consequently, the effects of manual centrifugation and automated non-centrifugation process steps, performed using TAP Biosystems' CompacT SelecT automated cell culture platform, upon the culture of a human induced pluripotent stem cell (hiPSC) line (VAX001024c07) were compared. This study has demonstrated that comparable morphologies and cell diameters were observed in hiPSCs cultured using either manual or automated process steps. However, non-centrifugation hiPSC populations exhibited greater cell yields, greater aggregate rates, increased pluripotency marker expression, and decreased differentiation marker expression compared to centrifugation hiPSCs. A trend for decreased variability in cell yield was also observed after the utilisation of the automated process step. This study also highlights the detrimental effect of the cryopreservation and thawing processes upon the growth and characteristics of hiPSC cultures, and demonstrates that automated hiPSC manufacturing protocols can be successfully transferred between independent laboratories.
Apparatus for automated testing of biological specimens
Layne, Scott P.; Beugelsdijk, Tony J.
1999-01-01
An apparatus for performing automated testing of infectious biological specimens is disclosed. The apparatus comprises a process controller for translating user commands into test instrument suite commands, and a test instrument suite comprising a means to treat the specimen to manifest an observable result, and a detector for measuring the observable result to generate specimen test results.
ERIC Educational Resources Information Center
Yu, Eunjeong; Moon, Kwangsu; Oah, Shezeen; Lee, Yohaeng
2013-01-01
This study evaluated the effectiveness of an automated observation and feedback system in improving safe sitting postures. Participants were four office workers. The dependent variables were the percentages of time participants spent in five safe body positions during experimental sessions. We used a multiple-baseline design counterbalanced across…
Automated spectral classification and the GAIA project
NASA Technical Reports Server (NTRS)
Lasala, Jerry; Kurtz, Michael J.
1995-01-01
Two dimensional spectral types for each of the stars observed in the global astrometric interferometer for astrophysics (GAIA) mission would provide additional information for the galactic structure and stellar evolution studies, as well as helping in the identification of unusual objects and populations. The classification of the large quantity of generated spectra requires that automated techniques be implemented. Approaches for the automatic classification are reviewed, and a metric-distance method is discussed. In tests, the metric-distance method produced spectral types with mean errors comparable to those of human classifiers working at similar resolution. Data and equipment requirements for an automated classification survey are discussed. A program of auxiliary observations is proposed to yield spectral types and radial velocities for the GAIA-observed stars.
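A metric-distance classifier of the kind discussed assigns each observed spectrum the type of the nearest template under some distance measure. The sketch below uses summed squared flux differences and made-up template values purely for illustration; it is not the cited method's actual metric or template library:

```python
def classify(spectrum, templates):
    """Assign the spectral type whose template (equal-length flux vector)
    minimizes the summed squared flux difference to the observed spectrum."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda t: dist(spectrum, templates[t]))
```

With real data, the spectra would first be normalized and resampled onto a common wavelength grid so the distance compares like with like.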
The Automation-by-Expertise-by-Training Interaction.
Strauch, Barry
2017-03-01
I introduce the automation-by-expertise-by-training interaction in automated systems and discuss its influence on operator performance. Transportation accidents that, across a 30-year interval, demonstrated identical automation-related operator errors suggest a need to reexamine traditional views of automation. I review accident investigation reports, regulator studies, and literature on human-computer interaction, expertise, and training, and discuss how failing to attend to the interaction of automation, expertise level, and training has enabled operators to commit identical automation-related errors. Automated systems continue to provide capabilities exceeding operators' needs for effective system operation and provide interfaces that can hinder, rather than enhance, operator automation-related situation awareness. Because of limitations in time and resources, training programs do not provide operators the expertise needed to effectively operate these automated systems, requiring them to obtain the expertise ad hoc during system operations. As a result, many do not acquire the necessary automation-related system expertise. Integrating automation with expected operator expertise levels, and within training programs that provide operators the necessary automation expertise, can reduce opportunities for automation-related operator errors. Research to address the automation-by-expertise-by-training interaction is needed. However, such research must meet challenges inherent to examining realistic sociotechnical system automation features with representative samples of operators, perhaps by using observational and ethnographic research. Research in this domain should improve the integration of design and training and, it is hoped, enhance operator performance.
Automated drug dispensing system reduces medication errors in an intensive care setting.
Chapuis, Claire; Roustit, Matthieu; Bal, Gaëlle; Schwebel, Carole; Pansu, Pascal; David-Tchouda, Sandra; Foroni, Luc; Calop, Jean; Timsit, Jean-François; Allenet, Benoît; Bosson, Jean-Luc; Bedouch, Pierrick
2010-12-01
We aimed to assess the impact of an automated dispensing system on the incidence of medication errors related to picking, preparation, and administration of drugs in a medical intensive care unit. We also evaluated the clinical significance of such errors and user satisfaction. Preintervention and postintervention study involving a control and an intervention medical intensive care unit. Two medical intensive care units in the same department of a 2,000-bed university hospital. Adult medical intensive care patients. After a 2-month observation period, we implemented an automated dispensing system in one of the units (study unit) chosen randomly, with the other unit being the control. The overall error rate was expressed as a percentage of total opportunities for error. The severity of errors was classified according to National Coordinating Council for Medication Error Reporting and Prevention categories by an expert committee. User satisfaction was assessed through self-administered questionnaires completed by nurses. A total of 1,476 medications for 115 patients were observed. After automated dispensing system implementation, we observed a reduced percentage of total opportunities for error in the study unit compared to the control unit (13.5% and 18.6%, respectively; p<.05); however, no significant difference was observed before automated dispensing system implementation (20.4% and 19.3%, respectively; not significant). Before-and-after comparisons in the study unit also showed a significantly reduced percentage of total opportunities for error (20.4% and 13.5%; p<.01). An analysis of detailed opportunities for error showed a significant impact of the automated dispensing system in reducing preparation errors (p<.05). Most errors caused no harm (National Coordinating Council for Medication Error Reporting and Prevention category C). The automated dispensing system did not reduce errors causing harm.
Finally, the mean score for working conditions improved from 1.0 ± 0.8 to 2.5 ± 0.8 on the four-point Likert scale. The implementation of an automated dispensing system reduced overall medication errors related to picking, preparation, and administration of drugs in the intensive care unit. Furthermore, most nurses favored the new drug dispensing organization.
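The error-rate metric reported in this study (errors as a percentage of total opportunities for error) is a simple ratio; the counts below are invented for illustration, not taken from the paper:

```python
def error_rate_percent(errors_observed, opportunities):
    # Overall error rate expressed as a percentage of total opportunities for error
    return 100.0 * errors_observed / opportunities
```

For example, 27 observed errors over 200 opportunities gives a 13.5% rate, the same scale on which the study reports its before/after comparisons.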
An automated A-value measurement tool for accurate cochlear duct length estimation.
Iyaniwura, John E; Elfarnawany, Mai; Ladak, Hanif M; Agrawal, Sumit K
2018-01-22
There has been renewed interest in the cochlear duct length (CDL) for preoperative cochlear implant electrode selection and postoperative generation of patient-specific frequency maps. The CDL can be estimated by measuring the A-value, which is defined as the length between the round window and the furthest point on the basal turn. Unfortunately, there is significant intra- and inter-observer variability when these measurements are made clinically. The objective of this study was to develop an automated A-value measurement algorithm to improve accuracy and eliminate observer variability. Clinical and micro-CT images of 20 cadaveric cochleae specimens were acquired. The micro-CT of one sample was chosen as the atlas, and A-value fiducials were placed onto that image. Image registration (rigid affine and non-rigid B-spline) was applied between the atlas and the 19 remaining clinical CT images. The registration transform was applied to the A-value fiducials, and the A-value was then automatically calculated for each specimen. High resolution micro-CT images of the same 19 specimens were used to measure the gold standard A-values for comparison against the manual and automated methods. The registration algorithm had excellent qualitative overlap between the atlas and target images. The automated method eliminated the observer variability and the systematic underestimation by experts. Manual measurement of the A-value on clinical CT had a mean error of 9.5 ± 4.3% compared to micro-CT, and this improved to an error of 2.7 ± 2.1% using the automated algorithm. Both the automated and manual methods correlated significantly with the gold standard micro-CT A-values (r = 0.70, p < 0.01 and r = 0.69, p < 0.01, respectively). An automated A-value measurement tool using atlas-based registration methods was successfully developed and validated. The automated method eliminated the observer variability and improved accuracy as compared to manual measurements by experts. 
This open-source tool has the potential to benefit cochlear implant recipients in the future.
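The atlas-based measurement can be pictured in two steps: map the atlas fiducials through the registration transform, then take the distance between the mapped points. The helper names and the plain affine form below are illustrative assumptions (the study also applied non-rigid B-spline registration, which this sketch omits):

```python
import math

def apply_affine(point, matrix, translation):
    # Map a 3-D fiducial through an affine registration transform: p' = M p + t
    return tuple(sum(matrix[i][j] * point[j] for j in range(3)) + translation[i]
                 for i in range(3))

def a_value(round_window, basal_turn_point):
    # A-value: straight-line distance from the round window to the furthest
    # point on the basal turn
    return math.dist(round_window, basal_turn_point)
```

With the two transformed fiducials in the target image's coordinate frame, the A-value follows directly from the distance computation.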
Automated Eddy Current Inspection on Space Shuttle Hardware
NASA Technical Reports Server (NTRS)
Hartmann, John; Felker, Jeremy
2007-01-01
Over the lifetime of the Space Shuttle program, metal parts used for the Reusable Solid Rocket Motors (RSRMs) have been nondestructively inspected for cracks and surface-breaking discontinuities using magnetic particle (steel) and penetrant methods. Although these inspections adequately screened for critical-sized cracks in most regions of the hardware, it became apparent after detection of several sub-critical flaws that the processes were very dependent on operator attentiveness and training. Throughout the 1990s, eddy current inspections were added to areas that had either limited visual access or were more fracture critical. In the late 1990s, a project was initiated to upgrade NDE inspections with the overall objective of improving inspection reliability and control. An automated eddy current inspection system was installed in 2001. A figure shows one of the inspection bays with the robotic axis of the system highlighted. The system was programmed to inspect the various case, nozzle, and igniter metal components that make up an RSRM, both steel and aluminum. For the past few years, the automated inspection system has been a part of the baseline inspection process for steel components. Although the majority of the RSRM metal part inventory is free of detectable surface flaws, a few small, sub-critical manufacturing defects have been detected with the automated system. This paper will summarize the benefits that have been realized with the current automated eddy current system, as well as the flaws that have been detected.
sFIDA automation yields sub-femtomolar limit of detection for Aβ aggregates in body fluids.
Herrmann, Yvonne; Kulawik, Andreas; Kühbach, Katja; Hülsemann, Maren; Peters, Luriano; Bujnicki, Tuyen; Kravchenko, Kateryna; Linnartz, Christina; Willbold, Johannes; Zafiu, Christian; Bannach, Oliver; Willbold, Dieter
2017-03-01
Alzheimer's disease (AD) is a neurodegenerative disorder with, as yet, no therapeutic options and only limited diagnostic options. Reliable biomarker-based AD diagnostics are of utmost importance for the development and application of therapeutic substances. We have previously introduced a platform technology designated 'sFIDA' for the quantitation of amyloid β peptide (Aβ) aggregates as an AD biomarker. In this study we implemented the sFIDA assay on an automated platform to enhance robustness and performance of the assay. In sFIDA (surface-based fluorescence intensity distribution analysis) Aβ species are immobilized by a capture antibody to a glass surface. Aβ aggregates are then multiply loaded with fluorescent antibodies and quantitated by high resolution fluorescence microscopy. As a model system for Aβ aggregates, we used Aβ-conjugated silica nanoparticles (Aβ-SiNaPs) diluted in PBS buffer and cerebrospinal fluid, respectively. Automation of the assay was realized on a liquid handling system in combination with a microplate washer. The automation of the sFIDA assay results in improved intra-assay precision, linearity and sensitivity in comparison to the manual application, and achieved a limit of detection in the sub-femtomolar range. Automation improves the precision and sensitivity of the sFIDA assay, which is a prerequisite for high-throughput measurements and future application of the technology in routine AD diagnostics. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
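A common convention for a limit of detection, which may or may not match the one used for sFIDA, is the mean of blank measurements plus three standard deviations; a sketch under that assumption:

```python
import statistics

def limit_of_detection(blank_readouts):
    # Conventional LOD: mean of blank (buffer-only) readouts plus 3 standard deviations
    return statistics.mean(blank_readouts) + 3 * statistics.stdev(blank_readouts)
```

Any sample whose readout exceeds this threshold would be counted as a detection; tighter blank variability (as reported for the automated assay) directly lowers the threshold.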
Automated extraction of single H atoms with STM: tip state dependency
NASA Astrophysics Data System (ADS)
Møller, Morten; Jarvis, Samuel P.; Guérinet, Laurent; Sharp, Peter; Woolley, Richard; Rahe, Philipp; Moriarty, Philip
2017-02-01
The atomistic structure of the tip apex plays a crucial role in performing reliable atomic-scale surface and adsorbate manipulation using scanning probe techniques. We have developed an automated extraction routine for controlled removal of single hydrogen atoms from the H:Si(100) surface. The set of atomic extraction protocols detect a variety of desorption events during scanning tunneling microscope (STM)-induced modification of the hydrogen-passivated surface. The influence of the tip state on the probability for hydrogen removal was examined by comparing the desorption efficiency for various classifications of STM topographs (rows, dimers, atoms, etc). We find that dimer-row-resolving tip apices extract hydrogen atoms most readily and reliably (and with least spurious desorption), while tip states which provide atomic resolution counter-intuitively have a lower probability for single H atom removal.
Open-source algorithm for detecting sea ice surface features in high-resolution optical imagery
NASA Astrophysics Data System (ADS)
Wright, Nicholas C.; Polashenski, Chris M.
2018-04-01
Snow, ice, and melt ponds cover the surface of the Arctic Ocean in fractions that change throughout the seasons. These surfaces control albedo and exert tremendous influence over the energy balance in the Arctic. Increasingly available meter- to decimeter-scale resolution optical imagery captures the evolution of the ice and ocean surface state visually, but methods for quantifying coverage of key surface types from raw imagery are not yet well established. Here we present an open-source system designed to provide a standardized, automated, and reproducible technique for processing optical imagery of sea ice. The method classifies surface coverage into three main categories: snow and bare ice, melt ponds and submerged ice, and open water. The method is demonstrated on imagery from four sensor platforms and on imagery spanning from spring thaw to fall freeze-up. Tests show the classification accuracy of this method typically exceeds 96 %. To facilitate scientific use, we evaluate the minimum observation area required for reporting a representative sample of surface coverage. We provide an open-source distribution of this algorithm and associated training datasets and suggest the community consider this a step towards standardizing optical sea ice imagery processing. We hope to encourage future collaborative efforts to improve the code base and to analyze large datasets of optical sea ice imagery.
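A toy version of per-pixel classification into the three surface categories might look like the sketch below; the thresholds and feature names are invented for illustration and are not the decision rules of the published algorithm:

```python
from collections import Counter

def classify_pixel(brightness, blueness):
    # Invented threshold rules: bright pixels -> snow/bare ice,
    # darker bluish pixels -> melt pond, the rest -> open water
    if brightness > 0.7:
        return "snow_and_bare_ice"
    if blueness > 0.5:
        return "melt_pond"
    return "open_water"

def surface_fractions(pixels):
    # Fraction of the image covered by each surface class
    counts = Counter(classify_pixel(b, bl) for b, bl in pixels)
    return {cls: n / len(pixels) for cls, n in counts.items()}
```

The per-class fractions are the quantities of interest for albedo and energy-balance studies; the paper's evaluation of minimum observation area asks how many such pixels are needed before these fractions become representative.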
Annual Greenland Accumulation Rates (2009-2012) from Airborne Snow Radar
NASA Technical Reports Server (NTRS)
Koenig, Lora S.; Ivanoff, Alvaro; Alexander, Patrick M.; MacGregor, Joseph A.; Fettweis, Xavier; Panzer, Ben; Paden, John D.; Forster, Richard R.; Das, Indrani; McConnell, Joseph R.;
2016-01-01
Contemporary climate warming over the Arctic is accelerating mass loss from the Greenland Ice Sheet through increasing surface melt, emphasizing the need to closely monitor its surface mass balance in order to improve sea-level rise predictions. Snow accumulation is the largest component of the ice sheet's surface mass balance, but in situ observations thereof are inherently sparse and models are difficult to evaluate at large scales. Here, we quantify recent Greenland accumulation rates using ultra-wideband (2-6.5 gigahertz) airborne snow radar data collected as part of NASA's Operation IceBridge between 2009 and 2012. We use a semi-automated method to trace the observed radiostratigraphy and then derive annual net accumulation rates for 2009-2012. The uncertainty in these radar-derived accumulation rates is on average 14 percent. A comparison of the radar-derived accumulation rates and contemporaneous ice cores shows that snow radar captures both the annual and long-term mean accumulation rate accurately. A comparison with outputs from a regional climate model (MAR, Modèle Atmosphérique Régional, for Greenland and vicinity) shows that this model matches radar-derived accumulation rates in the ice sheet interior but produces higher values over southeastern Greenland. Our results demonstrate that snow radar can efficiently and accurately map patterns of snow accumulation across an ice sheet and that it is valuable for evaluating the accuracy of surface mass balance models.
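Conceptually, once annual layers have been traced in the radargram, accumulation follows from the depth difference between successive layers scaled by density. The densities below are assumed round numbers for illustration, not the study's firn-density values:

```python
def annual_accumulation_we(layer_depths_m, snow_density=350.0, water_density=1000.0):
    # Water-equivalent accumulation (m w.e. per year) between successive
    # annual radar layers, assuming a constant snow density
    return [(deeper - shallower) * snow_density / water_density
            for shallower, deeper in zip(layer_depths_m, layer_depths_m[1:])]
```

Real processing would use a depth-varying density profile and radar travel-time-to-depth conversion, but the layer-difference structure is the same.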
VizieR Online Data Catalog: HD 50138 short-term variability (Borges Fernandes+, 2012)
NASA Astrophysics Data System (ADS)
Borges Fernandes, M.; Kraus, M.; Nickeler, D. H.; De Cat, P.; Lampens, P.; Pereira, C. B.; Oksala, M. E.
2012-10-01
A total of 72 files related to the optical spectra taken within 8 observing nights with the HERMES spectrograph, covering the photospheric HeI 4026Å and SiII 4128Å and 4131Å lines, the circumstellar [OI] 6364Å line, and the lines formed in the upper layers of the stellar atmosphere or very close to the stellar surface: SiII 6347Å and 6371Å, HeI 6678Å, H10, H9 and Halpha lines. The data were reduced by an automated data reduction pipeline and corrected for cosmic rays and heliocentric velocity (but not telluric-corrected). (2 data files).
Brief communication: Landslide motion from cross correlation of UAV-derived morphological attributes
NASA Astrophysics Data System (ADS)
Peppa, Maria V.; Mills, Jon P.; Moore, Phil; Miller, Pauline E.; Chambers, Jonathan E.
2017-12-01
Unmanned aerial vehicles (UAVs) can provide observations of high spatio-temporal resolution to enable operational landslide monitoring. In this research, the construction of digital elevation models (DEMs) and orthomosaics from UAV imagery is achieved using structure-from-motion (SfM) photogrammetric procedures. The study examines the additional value that the morphological attribute of openness, amongst others, can provide to surface deformation analysis. Image-cross-correlation functions and DEM subtraction techniques are applied to the SfM outputs. Through the proposed integrated analysis, the automated quantification of a landslide's motion over time is demonstrated, with implications for the wider interpretation of landslide kinematics via UAV surveys.
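The image-cross-correlation step amounts to finding the displacement that best aligns two epochs of a morphological attribute. A one-dimensional integer-shift sketch follows (the real analysis is two-dimensional and sub-pixel; the function name is an invention for illustration):

```python
def best_shift(reference, later, max_shift):
    # Integer shift of `later` that maximizes the dot-product correlation
    # with `reference`; positive shift means motion toward higher indices
    def score(s):
        return sum(reference[i] * later[i + s]
                   for i in range(len(reference)) if 0 <= i + s < len(later))
    return max(range(-max_shift, max_shift + 1), key=score)
```

Applied window by window across an attribute raster (e.g. openness) from two survey dates, such shifts form the displacement field that quantifies landslide motion.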
Investigating How Contact Angle Affects the Interaction between Water and a Hydrophobic Surface
NASA Astrophysics Data System (ADS)
Poynor, Adele; Neidig, Caitlyn
2012-02-01
By definition hydrophobic substances hate water. What happens when water is forced into contact with a hydrophobic surface? One theory is that an ultra-thin low-density region forms near the surface. Contact angle is a measure of how hydrophobic a surface is. We have employed an automated home-built Surface Plasmon Resonance (SPR) apparatus to investigate the effect of varying the contact angle on the depletion layer
On Feature Extraction from Large Scale Linear LiDAR Data
NASA Astrophysics Data System (ADS)
Acharjee, Partha Pratim
Airborne light detection and ranging (LiDAR) can generate co-registered elevation and intensity maps over large terrain. The co-registered 3D map and intensity information can be used efficiently for different feature extraction applications. In this dissertation, we developed two algorithms for feature extraction and demonstrated uses of the extracted features in practical applications. One of the developed algorithms can map still and flowing waterbody features, and the other can extract building features and estimate solar potential on rooftops and facades. Remote sensing capabilities, distinguishing characteristics of laser returns from water surfaces, and specific data collection procedures give LiDAR data an edge in this application domain. Furthermore, water surface mapping solutions must work on extremely large datasets, from a thousand square miles to hundreds of thousands of square miles. National and state-wide map generation and updating, and hydro-flattening of LiDAR data for many other applications, are two leading needs of water surface mapping. These call for as much automation as possible. Researchers have developed many semi-automated algorithms using multiple semi-automated tools and human interventions. This work describes a consolidated algorithm and toolbox developed for large-scale, automated water surface mapping. Geometric features such as the flatness of the water surface and the higher elevation change at the water-land interface, and optical properties such as dropouts caused by specular reflection and bimodal intensity distributions, were some of the linear LiDAR features exploited for water surface mapping. Large-scale data handling capabilities are incorporated by automated and intelligent windowing, by resolving boundary issues, and by integrating all results into a single output. The whole algorithm is developed as an ArcGIS toolbox using Python libraries.
Testing and validation are performed on large datasets to determine the effectiveness of the toolbox, and results are presented. Significant power demand is located in urban areas, where, theoretically, a large amount of building surface area is also available for solar panel installation. Therefore, property owners and power generation companies can benefit from a citywide solar potential map, which can provide the estimated annual solar energy available at a given location. An efficient solar potential measurement is a prerequisite for an effective solar energy system in an urban area. In addition, the solar potential calculation from rooftops and building facades could open up a wide variety of options for solar panel installations. However, complex urban scenes make it hard to estimate the solar potential, partly because of shadows cast by the buildings. LiDAR-based 3D city models could well be the right technology for solar potential mapping. However, most current LiDAR-based local solar potential assessment algorithms mainly address rooftop potential calculation, whereas building facades can contribute a significant amount of viable surface area for solar panel installation. In this paper, we introduce a new algorithm to calculate the solar potential of both rooftops and building facades. The solar potential received by the rooftops and facades over the year is also investigated in the test area.
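Among the geometric cues listed for water mapping, the flatness test is the simplest to sketch: a window of returns is a water candidate when its elevation spread is tiny. The tolerance below is an assumed value, not the toolbox's:

```python
import statistics

def looks_like_water(window_elevations_m, flatness_tol_m=0.05):
    # Water surfaces return near-constant elevations within a local window,
    # so a small standard deviation marks a water candidate
    return statistics.pstdev(window_elevations_m) < flatness_tol_m
```

The full algorithm combines this with the other cues (elevation jump at the water-land interface, specular dropouts, bimodal intensity) before labeling a window as water.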
Kaminsky, Jan; Rodt, Thomas; Gharabaghi, Alireza; Forster, Jan; Brand, Gerd; Samii, Madjid
2005-06-01
The FE-modeling of complex anatomical structures has not been solved satisfactorily so far. Voxel-based as opposed to contour-based algorithms allow an automated mesh generation based on the image data. Nonetheless, their geometric precision is limited. We developed an automated mesh generator that combines the advantages of voxel-based generation with an improved representation of the geometry by displacement of nodes on the object surface. Models of an artificial 3D pipe section and a skull base were generated with different mesh densities using the newly developed geometric, unsmoothed and smoothed voxel generators. Compared to the analytic calculation of the 3D pipe section model, the normalized RMS error of the surface stress was 0.173-0.647 for the unsmoothed voxel models, 0.111-0.616 for the smoothed voxel models with small volume error, and 0.126-0.273 for the geometric models. The highest element-energy error, as a criterion for mesh quality, was 2.61×10^-2 N mm, 2.46×10^-2 N mm and 1.81×10^-2 N mm for the unsmoothed, smoothed and geometric voxel models, respectively. The geometric model of the 3D skull base resulted in the lowest element-energy error and volume error. This algorithm also allowed the best representation of anatomical details. The presented geometric mesh generator is universally applicable and allows automated and accurate modeling by combining the advantages of the voxel technique and of improved surface modeling.
NASA Astrophysics Data System (ADS)
Hess, M. R.; Petrovic, V.; Kuester, F.
2017-08-01
Digital documentation of cultural heritage structures is increasingly common through the application of different imaging techniques. Many works have focused on the application of laser scanning and photogrammetry techniques for the acquisition of three-dimensional (3D) geometry detailing cultural heritage sites and structures. With an abundance of these 3D data assets, there must be a digital environment where these data can be visualized and analyzed. Presented here is a feedback-driven visualization framework that seamlessly enables interactive exploration and manipulation of massive point cloud data. The focus of this work is on the classification of different building materials with the goal of building more accurate as-built information models of historical structures. User-defined functions have been tested within the interactive point cloud visualization framework to evaluate automated and semi-automated classification of 3D point data. These functions include decisions based on observed color, laser intensity, normal vector or local surface geometry. Multiple case studies are presented here to demonstrate the flexibility and utility of the presented point cloud visualization framework to achieve classification objectives.
Homogenisation of the strain distribution in stretch formed parts to improve part properties
NASA Astrophysics Data System (ADS)
Schmitz, Roman; Winkelmann, Mike; Bailly, David; Hirt, Gerhard
2018-05-01
Inhomogeneous strain and sheet thickness distributions can be observed in complex sheet metal parts manufactured by stretch forming. In the literature, this problem is solved by flexible clampings adapted to the part geometry. In this paper, an approach which does not rely on extensive tooling is presented. The strain distribution in the sheet is influenced by means of hole patterns. Holes are introduced into the sheet area between clamping and part, next to areas where high strains are expected. When deforming the sheet, high strains are shifted out of the part area. In a local area around the holes, high strains concentrate perpendicular to the drawing direction. Thus, high strains in the part area are reduced and the strain distribution is homogenised. To verify this approach, an FE-model of a stretch forming process of a conical part is implemented in LS-Dyna. The model is validated by corresponding experiments. In the first step, the positioning of the holes is applied manually based on the numerically determined strain distribution and experience. In order to automate the positioning of the holes, an optimisation method is applied in a second step. The presented approach implemented in LS-OPT uses the response surface method to identify the positioning and radius of the holes homogenising the strain in a defined area of the sheet. Due to the nonlinear increase of computational complexity with an increasing number of holes, the maximum number of holes is set to three. With both the manual and the automated method, hole patterns were found which allow for a relative reduction of maximum strains and for a homogenisation of the strain distribution. Comparing the manual and automated positioning of holes, the pattern determined by automated optimisation shows better results in terms of homogenising the strain distribution.
Automated synovium segmentation in doppler ultrasound images for rheumatoid arthritis assessment
NASA Astrophysics Data System (ADS)
Yeung, Pak-Hei; Tan, York-Kiat; Xu, Shuoyu
2018-02-01
We need better clinical tools to improve monitoring of synovitis, synovial inflammation in the joints, in rheumatoid arthritis (RA) assessment. Given its economical, safe and fast characteristics, ultrasound (US), especially Doppler ultrasound, is frequently used. However, manual scoring of synovitis in US images is subjective and prone to observer variations. In this study, we propose a new and robust method for automated synovium segmentation in the commonly affected joints, i.e. metacarpophalangeal (MCP) and metatarsophalangeal (MTP) joints, which would facilitate automation in quantitative RA assessment. The bone contour in the US image is first detected based on a modified dynamic programming method, incorporating angular information for detecting curved bone surfaces and using image fuzzification to identify missing bone structure. K-means clustering is then performed to initialize potential synovium areas by utilizing the identified bone contour as boundary reference. After excluding invalid candidate regions, the final segmented synovium is identified by reconnecting remaining candidate regions using level set evolution. 15 MCP and 15 MTP US images were analyzed in this study. For each image, segmentations by our proposed method as well as two sets of annotations performed by an experienced clinician at different time-points were acquired. Dice's coefficient is 0.77 ± 0.12 between the two sets of annotations. Similar Dice's coefficients are achieved between automated segmentation and either the first set of annotations (0.76 ± 0.12) or the second set of annotations (0.75 ± 0.11), with no significant difference (P = 0.77). These results verify that the accuracy of segmentation by our proposed method and by the clinician is comparable. Therefore, reliable synovium identification can be made by our proposed method.
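The agreement measure used in this study, Dice's coefficient, is straightforward to compute from binary segmentation masks; a minimal sketch:

```python
def dice_coefficient(mask_a, mask_b):
    # Dice = 2 |A ∩ B| / (|A| + |B|) for binary masks flattened to 0/1 sequences
    intersection = sum(a * b for a, b in zip(mask_a, mask_b))
    total = sum(mask_a) + sum(mask_b)
    return 2.0 * intersection / total if total else 1.0
```

A value of 1.0 means perfect overlap; the reported 0.75-0.77 range indicates the automated masks overlap the clinician's about as well as the clinician's two annotation sessions overlap each other.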
Peripheral refractive correction and automated perimetric profiles.
Wild, J M; Wood, J M; Crews, S J
1988-06-01
The effect of peripheral refractive error correction on the automated perimetric sensitivity profile was investigated in a sample of 10 clinically normal, experienced observers. Peripheral refractive error was determined at eccentricities of 0 degrees, 20 degrees and 40 degrees along the temporal meridian of the right eye using the Canon Autoref R-1, an infra-red automated refractor, under the parametric conditions of the Octopus automated perimeter. Perimetric sensitivity was then measured at these eccentricities (stimulus sizes 0 and III) with and without the appropriate peripheral refractive correction using the Octopus 201 automated perimeter. Within the measurement limits of the experimental procedures employed, perimetric sensitivity was not influenced by peripheral refractive correction.
Airplane Mesh Development with Grid Density Studies
NASA Technical Reports Server (NTRS)
Cliff, Susan E.; Baker, Timothy J.; Thomas, Scott D.; Lawrence, Scott L.; Rimlinger, Mark J.
1999-01-01
Automatic Grid Generation Wish List: Geometry handling, including CAD clean-up and mesh generation, remains a major bottleneck in the application of CFD methods. There is a pressing need for greater automation in several aspects of the geometry preparation in order to reduce set-up time and eliminate user intervention as much as possible. Starting from the CAD representation of a configuration, there may be holes or overlapping surfaces which require an intensive effort to establish cleanly abutting surface patches, and collections of many patches may need to be combined for more efficient use of the geometrical representation. Obtaining an accurate and suitable body-conforming grid with an adequate distribution of points throughout the flow field, for the flow conditions of interest, is often the most time-consuming task for complex CFD applications. There is a need for a clean, unambiguous definition of the CAD geometry. Ideally this would be carried out automatically by smart CAD clean-up software. One could also define a standard piece-wise smooth surface representation suitable for use by computational methods and then create software to translate between the various CAD descriptions and the standard representation. Surface meshing remains a time-consuming, user-intensive procedure. There is a need for automated surface meshing, requiring only minimal user intervention to define the overall density of mesh points. The surface mesher should produce well-shaped elements (triangles or quadrilaterals) whose size is determined initially according to the surface curvature with a minimum size for flat pieces, and later refined by the user in other regions if necessary. Present techniques for volume meshing all require some degree of user intervention. There is a need for fully automated and reliable volume mesh generation. In addition, it should be possible to create both surface and volume meshes that meet guaranteed measures of mesh quality (e.g. minimum and maximum angle, stretching ratios, etc.).
Garson, Christopher D; Li, Bing; Acton, Scott T; Hossack, John A
2008-06-01
The active surface technique using gradient vector flow allows semi-automated segmentation of ventricular borders. The accuracy of the algorithm depends on the optimal selection of several key parameters. We investigated the use of conservation of myocardial volume for quantitative assessment of each of these parameters using synthetic and in vivo data. We predicted that for a given set of model parameters, strong conservation of volume would correlate with accurate segmentation. The metric was most useful when applied to the gradient vector field weighting and temporal step-size parameters, but less effective in guiding an optimal choice of the active surface tension and rigidity parameters.
Sonsmann, F K; Strunk, M; Gediga, K; John, C; Schliemann, S; Seyfarth, F; Elsner, P; Diepgen, T L; Kutz, G; John, S M
2014-05-01
To date, there are no legally binding requirements concerning product testing in cosmetics. This leads to various manufacturer-specific test methods and absent transparent information on skin cleansing products. A standardized in vivo test procedure for assessment of cleansing efficacy and corresponding barrier impairment by the cleaning process is needed, especially in the occupational context where repeated hand washing procedures may be performed at short intervals. For the standardization of the cleansing procedure, an Automated Cleansing Device (ACiD) was designed and evaluated. Different smooth washing surfaces of the equipment for ACiD (incl. goat hair, felt, felt covered with nitrile caps) were evaluated regarding their skin compatibility. ACiD allows an automated, fully standardized skin washing procedure. Felt covered with nitrile as washing surface of the rotating washing units leads to a homogenous cleansing result and does not cause detectable skin irritation, neither clinically nor as assessed by skin bioengineering methods (transepidermal water loss, chromametry). Automated Cleansing Device may be useful for standardized evaluation of the cleansing effectiveness and parallel assessment of the corresponding irritancy potential of industrial skin cleansers. This will allow objectifying efficacy and safety of industrial skin cleansers, thus enabling market transparency and facilitating rational choice of products. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Using All-Sky Imaging to Improve Telescope Scheduling (Abstract)
NASA Astrophysics Data System (ADS)
Cole, G. M.
2017-12-01
(Abstract only) Automated scheduling makes it possible for a small telescope to observe a large number of targets in a single night. But when used in areas which have less-than-perfect sky conditions such automation can lead to large numbers of observations of clouds and haze. This paper describes the development of a "sky-aware" telescope automation system that integrates the data flow from an SBIG AllSky340c camera with an enhanced dispatch scheduler to make optimum use of the available observing conditions for two highly instrumented backyard telescopes. Using the minute-by-minute time series image stream and a self-maintained reference database, the software maintains a file of sky brightness, transparency, stability, and forecasted visibility at several hundred grid positions. The scheduling software uses this information in real time to exclude targets obscured by clouds and select the best observing task, taking into account the requirements and limits of each instrument.
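The core dispatch step described above reduces to excluding targets whose sky-grid cell is obscured and then selecting the best remaining task. A minimal sketch, assuming a toy representation of the sky file (the real system tracks brightness, transparency, stability, and forecasted visibility at several hundred grid positions):

```python
def pick_target(targets, sky, min_transparency=0.7):
    """Sky-aware dispatch step: drop targets whose grid cell is
    obscured, then take the highest-priority remaining target.

    targets: list of dicts with 'name', 'grid', 'priority' (hypothetical layout)
    sky: dict mapping grid position -> transparency in [0, 1]
    """
    visible = [t for t in targets
               if sky.get(t['grid'], 0.0) >= min_transparency]
    if not visible:
        return None  # nothing observable right now; idle or wait
    return max(visible, key=lambda t: t['priority'])
```

A real scheduler would fold instrument limits and task requirements into the scoring function rather than using a single priority number.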
Automation and Robotics for Human Mars Exploration (AROMA)
NASA Technical Reports Server (NTRS)
Hofmann, Peter; von Richter, Andreas
2003-01-01
Automation and Robotics (A&R) systems are a key technology for Mars exploration. All over the world, initiatives in this field aim at developing new A&R systems and technologies for planetary surface exploration. From December 2000 to February 2002, Kayser-Threde GmbH, Munich, Germany led a study called AROMA (Automation and Robotics for Human Mars Exploration) under ESA contract in order to define a reference architecture of A&R elements in support of a human Mars exploration program. One of the goals of this effort is to initiate new developments and to maintain the competitiveness of European industry within this field. ©2003 Published by Elsevier Science Ltd.
Automated Tow Placement Processing and Characterization of Composites
NASA Technical Reports Server (NTRS)
Prabhakaran, R.
2004-01-01
One of the initial objectives of the project was automated tow placement (ATP), in which a robot is used to place a collimated band of pre-impregnated ribbons, or a wide preconsolidated tape, onto a tool surface. It was proposed to utilize the Automated Tow Placement machine that was already available and to fabricate carbon fiber reinforced PEEK (polyether-ether-ketone) matrix composites. After initial experiments with the fabrication of flat plates, composite cylinders were to be fabricated. Specimens from the fabricated parts were to be tested for mechanical characterization. A second objective was to conduct various types of tests for characterizing composite specimens cured by different fabrication processes.
Using Automation to Improve Surface Irrigation Management
USDA-ARS?s Scientific Manuscript database
In the Lower Mississippi Water Resource Area (WRA 08), also called the Mid-South, 2 million ha of cropland (80% of the irrigated farmland) employ surface irrigation, almost equally divided between furrow (52%) and controlled flooding (48%). Because Mid-South farmers experience less-than-optimal surf...
Surface plasmon resonance (SPR) detection of Staphylococcal Enterotoxin A in food samples
USDA-ARS?s Scientific Manuscript database
An automated and rapid method for detection of staphylococcal enterotoxins (SE) is needed. A sandwich assay was developed using a surface plasmon resonance (SPR) biosensor for detection of staphylococcal enterotoxin A (SEA) at subpicomolar concentration. Assay conditions were optimized for capturing...
Automated Infrared Inspection Of Jet Engine Turbine Blades
NASA Astrophysics Data System (ADS)
Bantel, T.; Bowman, D.; Halase, J.; Kenue, S.; Krisher, R.; Sippel, T.
1986-03-01
The detection of blocked surface cooling holes in hollow jet engine turbine blades and vanes during either manufacture or overhaul can be crucial to the integrity and longevity of the parts when in service. A fully automated infrared inspection system is being established under a tri-service Manufacturing Technology (ManTech) contract administered by the Air Force to inspect these surface cooling holes for blockages. The method consists of viewing the surface holes of the blade with a scanning infrared radiometer while heated air is flushed through the blade. As the airfoil heats up, the resultant infrared images are written directly into computer memory, where image analysis is performed. The computer then determines whether or not the holes are open from the inner plenum to the exterior surface and ultimately makes an accept/reject decision based on previously programmed criteria. A semiautomatic version has already been implemented and is more cost effective and more reliable than the previous manual inspection methods.
Tripathi, Ashish; Emmons, Erik D; Wilcox, Phillip G; Guicheteau, Jason A; Emge, Darren K; Christesen, Steven D; Fountain, Augustus W
2011-06-01
We have previously demonstrated the use of wide-field Raman chemical imaging (RCI) to detect and identify the presence of trace explosives in contaminated fingerprints. In this current work we demonstrate the detection of trace explosives in contaminated fingerprints on strongly Raman scattering surfaces such as plastics and painted metals using an automated background subtraction routine. We demonstrate the use of partial least squares subtraction to minimize the interfering surface spectral signatures, allowing the detection and identification of explosive materials in the corrected Raman images. The resulting analyses are then visually superimposed on the corresponding bright field images to physically locate traces of explosives. Additionally, we attempt to address the question of whether a complete RCI of a fingerprint is required for trace explosive detection or whether a simple non-imaging Raman spectrum is sufficient. This investigation further demonstrates the ability to nondestructively identify explosives on fingerprints present on commonly found surfaces such that the fingerprint remains intact for further biometric analysis.
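The background-subtraction step can be illustrated with an ordinary least-squares projection onto reference surface spectra and removal of the fitted contribution. The paper uses partial least squares, so this numpy-only version is a simplified stand-in, and the function and variable names are hypothetical:

```python
import numpy as np

def subtract_background(spectrum, references):
    """Remove substrate contributions from a Raman spectrum by
    least-squares projection onto reference surface spectra
    (a simplification of the partial least squares subtraction).

    spectrum:   1-D array of intensities over Raman shift channels
    references: one reference spectrum, or a stack of them (rows)
    """
    R = np.atleast_2d(references).T          # (n_channels, n_refs)
    coeffs, *_ = np.linalg.lstsq(R, spectrum, rcond=None)
    return spectrum - R @ coeffs             # residual: analyte signature
```

Applied pixel-by-pixel across a Raman chemical image, the residual spectra are then matched against an explosives library.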
Automated Detection of Small Bodies by Space Based Observation
NASA Astrophysics Data System (ADS)
Bidstrup, P. R.; Grillmayer, G.; Andersen, A. C.; Haack, H.; Jorgensen, J. L.
The number of known comets and asteroids increases every year. To date, approximately 250,000 of the largest minor planets, as they are usually referred to, have been catalogued. These discoveries are due to Earth-based observation, which has intensified over the previous decades; additionally, larger telescopes and arrays of telescopes are being used to explore our Solar System. It is believed that all near-Earth and Main-Belt asteroids with diameters above 10 to 30 km have been discovered, leaving these groups of objects observationally complete. However, the cataloguing of smaller bodies is incomplete, as only a very small fraction of the expected number has been discovered. It is estimated that approximately 10^10 main belt asteroids in the size range 1 m to 1 km are too faint to be observed using Earth-based telescopes. In order to observe these small bodies, a space-based search must be initiated to remove atmospheric disturbances and to minimize the distance to the asteroids, thereby reducing the required camera integration times. A new method of space-based detection of moving non-stellar objects is currently being developed utilising the Advanced Stellar Compass (ASC), built for spacecraft attitude determination on the Ørsted satellite by the Technical University of Denmark. The ASC serves as a backbone technology in the project, as it is capable of fully automated distinction between known and unknown celestial objects. By only processing objects of particular interest, i.e. moving objects, it will be possible to discover small bodies with a minimum of ground control, with the ultimate ambition of a fully automated space search probe. Currently, the ASC is being mounted on the Flying Laptop satellite of the Institute of Space Systems, Universität Stuttgart. After launch into a low Earth polar orbit in 2008, it will test the detection method with ASC equipment that already has significant in-flight experience.
A future use of ASC-based automated detection of small bodies is currently at a preliminary stage, known as the Bering project - a deep space survey to the asteroid Main-Belt. With a successful detection method, the Bering mission is expected to discover approximately 6 new small objects per day and will thus, over the course of a few years, discover 5,000-10,000 new sub-kilometer asteroids. Discovery of new small bodies can: 1) provide further links between groups of meteorites; 2) constrain the cratering rate at planetary surfaces and thus allow significantly improved cratering ages for terrains on Mars and other planets; and 3) help determine the processes that transfer small asteroids from orbits in the asteroid Main-Belt to the inner Solar System.
Whole surface image reconstruction for machine vision inspection of fruit
NASA Astrophysics Data System (ADS)
Reese, D. Y.; Lefcourt, A. M.; Kim, M. S.; Lo, Y. M.
2007-09-01
Automated imaging systems offer the potential to inspect the quality and safety of fruits and vegetables consumed by the public. Current automated inspection systems allow fruit such as apples to be sorted for quality issues including color and size by looking at a portion of the surface of each fruit. However, to inspect for defects and contamination, the whole surface of each fruit must be imaged. The goal of this project was to develop an effective and economical method for whole surface imaging of apples using mirrors and a single camera. Challenges include mapping the concave stem and calyx regions. To allow the entire surface of an apple to be imaged, apples were suspended or rolled above the mirrors using two parallel music wires. A camera above the apples captured 90 images per sec (640 by 480 pixels). Single or multiple flat or concave mirrors were mounted around the apple in various configurations to maximize surface imaging. Data suggest that the use of two flat mirrors provides inadequate coverage of a fruit but using two parabolic concave mirrors allows the entire surface to be mapped. Parabolic concave mirrors magnify images, which results in greater pixel resolution and reduced distortion. This result suggests that a single camera with two parabolic concave mirrors can be a cost-effective method for whole surface imaging.
Stacked endoplasmic reticulum sheets are connected by helicoidal membrane motifs
Terasaki, Mark; Shemesh, Tom; Kasthuri, Narayanan; Klemm, Robin W.; Schalek, Richard; Hayworth, Kenneth J.; Hand, Arthur R.; Yankova, Maya; Huber, Greg; Lichtman, Jeff W.; Rapoport, Tom A.; Kozlov, Michael M.
2013-01-01
The endoplasmic reticulum (ER) often forms stacked membrane sheets, an arrangement that is likely required to accommodate a maximum of membrane-bound polysomes for secretory protein synthesis. How sheets are stacked is unknown. Here, we used novel staining and automated ultra-thin sectioning electron microscopy methods to analyze stacked ER sheets in neuronal cells and secretory salivary gland cells of mice. Our results show that stacked ER sheets form a continuous membrane system in which the sheets are connected by twisted membrane surfaces with helical edges of left- or right-handedness. The three-dimensional structure of tightly stacked ER sheets resembles a parking garage, in which the different levels are connected by helicoidal ramps. A theoretical model explains the experimental observations and indicates that the structure corresponds to a minimum of elastic energy of sheet edges and surfaces. The structure allows the dense packing of ER sheets in the restricted space of a cell. PMID:23870120
NASA Technical Reports Server (NTRS)
MCKissick, Burnell T. (Technical Monitor); Plassman, Gerald E.; Mall, Gerald H.; Quagliano, John R.
2005-01-01
Linear multivariable regression models for predicting day and night Eddy Dissipation Rate (EDR) from available meteorological data sources are defined and validated. Model definition is based on a combination of 1997-2000 Dallas/Fort Worth (DFW) data sources, EDR from Aircraft Vortex Spacing System (AVOSS) deployment data, and regression variables primarily from corresponding Automated Surface Observation System (ASOS) data. Model validation is accomplished through EDR predictions on a similar combination of 1994-1995 Memphis (MEM) AVOSS and ASOS data. Model forms include an intercept plus a single term of fixed optimal power for each of these regression variables: 30-minute forward averaged mean and variance of near-surface wind speed and temperature, variance of wind direction, and a discrete cloud cover metric. Distinct day and night models, regressing on EDR and the natural log of EDR respectively, yield the best performance and avoid model discontinuity over day/night data boundaries.
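The stated model form (an intercept plus one term per regression variable, with the night model regressing on ln(EDR)) can be sketched with an ordinary least-squares fit. The names and array shapes below are illustrative assumptions, not the report's code, and the variables are assumed to be pre-raised to their fixed optimal powers:

```python
import numpy as np

def fit_edr_model(X, edr, night=False):
    """Fit an intercept-plus-terms linear model. The night model
    regresses on ln(EDR), as in the report; the day model on EDR.

    X: (n_obs, n_vars) regression variables, already raised to
       their fixed optimal powers."""
    A = np.column_stack([np.ones(len(X)), X])
    y = np.log(edr) if night else np.asarray(edr, dtype=float)
    coefs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coefs

def predict_edr(coefs, X, night=False):
    """Predict EDR, exponentiating the night model's ln(EDR) output."""
    yhat = np.column_stack([np.ones(len(X)), X]) @ coefs
    return np.exp(yhat) if night else yhat
```

Validation in the report's sense amounts to fitting on the DFW combination and calling predict_edr on the MEM data.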
Automated oil spill detection with multispectral imagery
NASA Astrophysics Data System (ADS)
Bradford, Brian N.; Sanchez-Reyes, Pedro J.
2011-06-01
In this publication we present an automated detection method for ocean surface oil, like that which existed in the Gulf of Mexico as a result of the April 20, 2010 Deepwater Horizon drilling rig explosion. Regions of surface oil in airborne imagery are isolated using red, green, and blue bands from multispectral data sets. The oil shape isolation procedure involves a series of image processing functions to draw out the visual phenomenological features of the surface oil. These functions include selective color band combinations, contrast enhancement and histogram warping. An image segmentation process then separates out contiguous regions of oil to provide a raster mask to an analyst. We automate the detection algorithm to allow large volumes of data to be processed in a short time period, which can provide timely oil coverage statistics to response crews. Geo-referenced and mosaicked data sets enable the largest identified oil regions to be mapped to exact geographic coordinates. In our simulation, multispectral imagery came from multiple sources including first-hand data collected from the Gulf. Results of the simulation show the oil spill coverage area as a raster mask, along with histogram statistics of the oil pixels. A rough square footage estimate of the coverage is reported if the image ground sample distance is available.
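The isolation chain (band combination, contrast enhancement, thresholding, then a coverage estimate from the ground sample distance) might be sketched as follows. The specific band combination and threshold here are hypothetical stand-ins, not the authors' values:

```python
import numpy as np

def oil_mask(rgb, thresh=0.6):
    """Toy version of the oil-isolation chain: a band combination that
    emphasizes oil sheen, a percentile contrast stretch, and a
    threshold. rgb: float array (h, w, 3) with values in [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    feature = r - 0.5 * (g + b)               # hypothetical band combination
    lo, hi = np.percentile(feature, [2, 98])  # histogram-based stretch limits
    stretched = np.clip((feature - lo) / (hi - lo + 1e-9), 0.0, 1.0)
    return stretched > thresh                 # raster mask of candidate oil

def coverage_sq_ft(mask, gsd_ft):
    """Rough square-footage estimate from pixel count and ground
    sample distance (feet per pixel)."""
    return mask.sum() * gsd_ft ** 2
```

The full pipeline would additionally segment the mask into contiguous regions and map the largest ones to geographic coordinates via the mosaic's georeferencing.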
Tumor Burden Analysis on Computed Tomography by Automated Liver and Tumor Segmentation
Linguraru, Marius George; Richbourg, William J.; Liu, Jianfei; Watt, Jeremy M.; Pamulapati, Vivek; Wang, Shijun; Summers, Ronald M.
2013-01-01
The paper presents the automated computation of hepatic tumor burden from abdominal CT images of diseased populations, including images with inconsistent enhancement. The automated segmentation of livers is addressed first. A novel three-dimensional (3D) affine invariant shape parameterization is employed to compare local shape across organs. By generating a regular sampling of the organ's surface, this parameterization can be effectively used to compare features of a set of closed 3D surfaces point-to-point, while avoiding common problems with the parameterization of concave surfaces. From an initial segmentation of the livers, the areas of atypical local shape are determined using training sets. A geodesic active contour locally corrects the segmentations of the livers in abnormal images. Graph cuts segment the hepatic tumors using shape and enhancement constraints. Liver segmentation errors are reduced significantly and all tumors are detected. Finally, support vector machines and feature selection are employed to reduce the number of false tumor detections. A tumor detection true positive fraction of 100% is achieved at 2.3 false positives/case, and the tumor burden is estimated with 0.9% error. Results from the test data demonstrate the method's robustness in analyzing livers from difficult clinical cases, allowing the temporal monitoring of patients with hepatic cancer. PMID:22893379
Brown, Treva T.; LeJeune, Zorabel M.; Liu, Kai; Hardin, Sean; Li, Jie-Ren; Rupnik, Kresimir; Garno, Jayne C.
2010-01-01
Controllers for scanning probe instruments can be programmed for automated lithography to generate desired surface arrangements of nanopatterns of organic thin films, such as n-alkanethiol self-assembled monolayers (SAMs). In this report, atomic force microscopy (AFM) methods of lithography known as nanoshaving and nanografting are used to write nanopatterns within organic thin films. Commercial instruments provide software to control the length, direction, speed, and applied force of the scanning motion of the tip. For nanoshaving, higher forces are applied to an AFM tip to selectively remove regions of the matrix monolayer, exposing bare areas of the gold substrate. Nanografting is accomplished by force-induced displacement of molecules of a matrix SAM, followed immediately by the surface self-assembly of n-alkanethiol molecules from solution. Advancements in AFM automation enable rapid protocols for nanolithography, which can be accomplished within the tight time restraints of undergraduate laboratories. Example experiments with scanning probe lithography (SPL) will be described in this report that were accomplished by undergraduate students during laboratory course activities and research internships in the chemistry department of Louisiana State University. Students were introduced to principles of surface analysis and gained “hands-on” experience with nanoscale chemistry. PMID:21483651
NASA Technical Reports Server (NTRS)
Thompson, David S.; Soni, Bharat K.
2001-01-01
An integrated geometry/grid/simulation software package, ICEG2D, is being developed to automate computational fluid dynamics (CFD) simulations for single- and multi-element airfoils with ice accretions. The current version, ICEG2D (v2.0), was designed to automatically perform four primary functions: (1) generate a grid-ready surface definition based on the geometrical characteristics of the iced airfoil surface, (2) generate high-quality structured and generalized grids starting from a defined surface definition, (3) generate the input and restart files needed to run the structured grid CFD solver NPARC or the generalized grid CFD solver HYBFL2D, and (4) using the flow solutions, generate solution-adaptive grids. ICEG2D (v2.0) can be operated either in a batch mode using a script file or in an interactive mode by entering directives from a command line within a Unix shell. This report summarizes activities completed in the first two years of a three-year research and development program to address automation issues related to CFD simulations for airfoils with ice accretions. As well as describing the technology employed in the software, this document serves as a user's manual providing installation and operating instructions. An evaluation of the software is also presented.
Automated lattice data generation
NASA Astrophysics Data System (ADS)
Ayyar, Venkitesh; Hackett, Daniel C.; Jay, William I.; Neil, Ethan T.
2018-03-01
The process of generating ensembles of gauge configurations (and measuring various observables over them) can be tedious and error-prone when done "by hand". In practice, most of this procedure can be automated with the use of a workflow manager. We discuss how this automation can be accomplished using Taxi, a minimal Python-based workflow manager built for generating lattice data. We present a case study demonstrating this technology.
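The heart of such a workflow manager is a dispatch loop that repeatedly runs any task whose dependencies have completed. A minimal sketch in that spirit (this is not Taxi's actual API; the class and function names are assumptions):

```python
class Task:
    """A named unit of work with an action and named dependencies."""
    def __init__(self, name, action, requires=()):
        self.name, self.action, self.requires = name, action, tuple(requires)

def run_workflow(tasks):
    """Dispatch loop: run every task whose dependencies are done,
    until all tasks complete or no further progress is possible."""
    done, results = set(), {}
    pending = {t.name: t for t in tasks}
    while pending:
        ready = [t for t in pending.values()
                 if all(r in done for r in t.requires)]
        if not ready:
            raise RuntimeError("circular or unsatisfiable dependencies")
        for t in ready:
            results[t.name] = t.action()
            done.add(t.name)
            del pending[t.name]
    return results
```

For lattice work, the "generate" task would produce a gauge configuration and each "measure" task would depend on it, so measurements never run before their ensemble exists.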
Compact, Automated, Frequency-Agile Microspectrofluorimeter
NASA Technical Reports Server (NTRS)
Fernandez, Salvador M.; Guignon, Ernest F.
1995-01-01
Compact, reliable, rugged, automated cell-culture and frequency-agile microspectrofluorimetric apparatus developed to perform experiments involving photometric imaging observations of single live cells. In original application, apparatus operates mostly unattended aboard spacecraft; potential terrestrial applications include automated or semiautomated diagnosis of pathological tissues in clinical laboratories, biomedical instrumentation, monitoring of biological process streams, and portable instrumentation for testing biological conditions in various environments. Offers obvious advantages over present laboratory instrumentation.
Automated surface inspection for steel products using computer vision approach.
Xi, Jiaqi; Shentu, Lifeng; Hu, Jikang; Li, Mian
2017-01-10
Surface inspection is a critical step in ensuring product quality in the steel-making industry. In order to relieve inspectors of laborious work and improve the consistency of inspection, much effort has been dedicated to automated inspection using computer vision approaches over the past decades. However, due to non-uniform illumination conditions and the similarity between surface textures and defects, present methods are usually applicable only to very specific cases. In this paper, a new framework for surface inspection is proposed to overcome these limitations. By investigating the image formation process, a quantitative model characterizing the impact of illumination on image quality is developed, based on which the non-uniform brightness in the image can be effectively removed. Then a simple classifier is designed to identify the defects among the surface textures. The significance of this approach lies in its robustness to illumination changes and its wide applicability to different inspection scenarios. The proposed approach has been successfully applied to the real-time surface inspection of round billets in real manufacturing. Implemented on a conventional industrial PC, the algorithm proceeds at 12.5 frames per second with a successful detection rate of over 90% for turned and skinned billets.
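A common, generic way to remove slowly varying brightness is to divide the image by a heavily smoothed copy of itself, so that only local texture and defects remain. This is a stand-in for the paper's physics-based illumination model, sketched here with a plain box filter:

```python
import numpy as np

def flatten_illumination(img, k=15):
    """Remove non-uniform brightness by dividing the image by a
    heavily smoothed (k x k box-filtered) copy. A generic flat-field
    correction, not the paper's quantitative illumination model."""
    pad = k // 2
    padded = np.pad(img.astype(float), pad, mode='edge')
    # sliding k x k windows -> per-pixel local mean (estimated illumination)
    windows = np.lib.stride_tricks.sliding_window_view(padded, (k, k))
    background = windows.mean(axis=(-1, -2))
    return img / (background + 1e-9)
```

A defect classifier would then threshold or learn on the flattened image, where a uniformly lit surface maps to values near 1.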
The Automation of Nowcast Model Assessment Processes
2016-09-01
that will automate real-time WRE-N model simulations, collect and quality-control check weather observations for assimilation and verification, and ... domains centered near White Sands Missile Range, New Mexico, where the Meteorological Sensor Array (MSA) will be located. The MSA will provide ... observations and performing quality-control checks for the pre-forecast data assimilation period. 2. Run the WRE-N model to generate model forecast data
Boshkovikj, Veselin; Fluke, Christopher J; Crawford, Russell J; Ivanova, Elena P
2014-02-28
There has been a growing interest in understanding the ways in which bacteria interact with nano-structured surfaces. As a result, there is a need for innovative approaches to enable researchers to visualize the biological processes taking place, despite the fact that it is not possible to directly observe these processes. We present a novel approach for the three-dimensional visualization of bacterial interactions with nano-structured surfaces using the software package Autodesk Maya. Our approach comprises a semi-automated stage, where actual surface topographic parameters, obtained using an atomic force microscope, are imported into Maya via a custom Python script, followed by a 'creative stage', where the bacterial cells and their interactions with the surfaces are visualized using available experimental data. The 'Dynamics' and 'nDynamics' capabilities of the Maya software allowed the construction and visualization of plausible interaction scenarios. This capability provides a practical aid to knowledge discovery, assists in the dissemination of research results, and provides an opportunity for an improved public understanding. We validated our approach by graphically depicting the interactions between the two bacteria being used for modeling purposes, Staphylococcus aureus and Pseudomonas aeruginosa, with different titanium substrate surfaces that are routinely used in the production of biomedical devices.
Nerandzic, Michelle M; Cadnum, Jennifer L; Pultz, Michael J; Donskey, Curtis J
2010-07-08
Environmental surfaces play an important role in transmission of healthcare-associated pathogens. There is a need for new disinfection methods that are effective against Clostridium difficile spores, but also safe, rapid, and automated. The Tru-D Rapid Room Disinfection device is a mobile, fully-automated room decontamination technology that utilizes ultraviolet-C irradiation to kill pathogens. We examined the efficacy of environmental disinfection using the Tru-D device in the laboratory and in rooms of hospitalized patients. Cultures for C. difficile, methicillin-resistant Staphylococcus aureus (MRSA), and vancomycin-resistant Enterococcus (VRE) were collected from commonly touched surfaces before and after use of Tru-D. On inoculated surfaces, application of Tru-D at a reflected dose of 22,000 microWs/cm2 for approximately 45 minutes consistently reduced recovery of C. difficile spores and MRSA by >2-3 log10 colony forming units (CFU)/cm2 and of VRE by >3-4 log10 CFU/cm2. Similar killing of MRSA and VRE was achieved in approximately 20 minutes at a reflected dose of 12,000 microWs/cm2, but killing of C. difficile spores was reduced. Disinfection of hospital rooms with Tru-D reduced the frequency of positive MRSA and VRE cultures by 93% and of C. difficile cultures by 80%. After routine hospital cleaning of the rooms of MRSA carriers, 18% of sites under the edges of bedside tables (i.e., a frequently touched site not easily amenable to manual application of disinfectant) were contaminated with MRSA, versus 0% after Tru-D (P < 0.001). The system required <5 minutes to set up and did not require continuous monitoring. The Tru-D Rapid Room Disinfection device is a novel, automated, and efficient environmental disinfection technology that significantly reduces C. difficile, VRE and MRSA contamination on commonly touched hospital surfaces.
Total Column Greenhouse Gas Monitoring in Central Munich: Automation and Measurements
NASA Astrophysics Data System (ADS)
Chen, Jia; Heinle, Ludwig; Paetzold, Johannes C.; Le, Long
2016-04-01
It is challenging to use in-situ surface measurements of CO2 and CH4 to derive emission fluxes in urban regions. Surface concentrations typically have high variance due to the influence of nearby sources, and they are strongly modulated by mesoscale transport phenomena that are difficult to simulate in atmospheric models. The integrated amount of a tracer through the whole atmosphere is a direct measure of the mass loading of the atmosphere given by emissions. Column measurements are insensitive to vertical redistribution of tracer mass, e.g. due to growth of the planetary boundary layer, and are also less influenced by nearby point sources, whose emissions are concentrated in a thin layer near the surface. Column observations are more compatible with the scale of atmospheric models and hence provide stronger constraints for inverse modeling. In Munich we are aiming at establishing a regional sensor network with differential column measurements, i.e. total column measurements of CO2 and CH4 inside and outside of the city. The inner-city station is equipped with a compact solar-tracking Fourier transform spectrometer (Bruker EM27/SUN) on the campus of Technische Universität München, and our measurements started in Aug. 2015. Measurements spanning the seasons will be shown, as well as preliminary emission studies using these observations. To deploy the compact spectrometers for stationary monitoring of urban emissions, an automatic protection and control system is mandatory and a challenging task. It will allow solar measurements whenever the sun is out and reliable protection of the instrument when it starts to rain. We have developed a simplified and highly reliable concept for the enclosure, aiming for a fully automated data collection station without the need for local human interaction. Furthermore, we are validating and combining the OCO-2 satellite-based measurements with our ground-based measurements.
For this purpose, we have developed a software tool that permits spatial, temporal and quality-based filtering and selection of data from the OCO-2 database. We observed inconsistencies between nadir and glint measurements near Munich on consecutive days with similar weather conditions in August 2015. To visualize our regional sensor network, we have developed software to generate KML files, which enables us to display and browse the results from our measurement site, OCO-2 measurements, and future satellite tracks.
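A minimal sketch of the kind of spatial, temporal and quality filtering such a selection tool performs; the record fields (`lat`, `lon`, `time`, `quality_flag`, `xco2`) and thresholds are illustrative assumptions, not the actual OCO-2 product schema:

```python
# Sketch of sounding selection by bounding box, time window, and quality flag.
# Field names and values are assumptions for illustration only.
from datetime import datetime

def filter_soundings(records, lat_range, lon_range, t_range, max_flag=0):
    """Keep soundings inside a lat/lon box and time window with good quality."""
    lat0, lat1 = lat_range
    lon0, lon1 = lon_range
    t0, t1 = t_range
    return [r for r in records
            if lat0 <= r["lat"] <= lat1
            and lon0 <= r["lon"] <= lon1
            and t0 <= r["time"] <= t1
            and r["quality_flag"] <= max_flag]

soundings = [
    {"lat": 48.15, "lon": 11.57, "time": datetime(2015, 8, 12), "quality_flag": 0, "xco2": 399.1},
    {"lat": 48.40, "lon": 11.80, "time": datetime(2015, 8, 12), "quality_flag": 1, "xco2": 401.5},
    {"lat": 52.52, "lon": 13.40, "time": datetime(2015, 8, 12), "quality_flag": 0, "xco2": 398.7},
]
near_munich = filter_soundings(soundings, (47.5, 48.8), (10.8, 12.3),
                               (datetime(2015, 8, 1), datetime(2015, 8, 31)))
print(len(near_munich))  # 1 (only the first record survives all three filters)
```

The same predicate structure extends naturally to nadir/glint mode selection by adding one more field test.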
What's a Manager to Do about Office Automation?
ERIC Educational Resources Information Center
Sherron, Gene
1984-01-01
Some observations about office technology in higher education are presented. University of Maryland plans concerning its approach to office automation are discussed. Seventeen features considered "mandatories" for any system that might be acquired are identified. (Author/MLW)
NASA Technical Reports Server (NTRS)
Doggett, William R.; Roithmayr, Carlos M.; Dorsey, John T.; Jones, Thomas C.; Shen, Haijun; Seywald, Hans; King, Bruce D.; Mikulas, Martin M., Jr.
2009-01-01
Devices for lifting, translating and precisely placing payloads are critical for efficient Earth-based construction operations. Both recent and past studies have demonstrated that devices with similar functionality will be needed to support lunar outpost operations. Although several designs have been developed for Earth-based applications, these devices lack unique design characteristics necessary for transport to and use on the harsh lunar surface. These design characteristics include: a) lightweight components, b) compact packaging for launch, c) automated deployment, d) simple in-field reconfiguration and repair, and e) support for tele-operated or automated operations. Also, because the cost to transport mass to the lunar surface is very high, the number of devices that can be dedicated to surface operations will be limited. Thus, in contrast to Earth-based construction, where many single-purpose devices dominate a construction site, a lunar outpost will require a limited number of versatile devices that provide operational benefit from initial construction through sustained operations. The first-generation test-bed of a new high-performance device, the Lunar Surface Manipulation System (LSMS), has been designed, built and field tested. The LSMS has many unique features resulting in a mass-efficient solution to payload handling on the lunar surface. Typically, the LSMS device mass is estimated at approximately 3% of the mass of the heaviest payload lifted at the tip, or 1.8% of the mass of the heaviest payload lifted at the elbow or mid-span of the boom for a high-performance variant incorporating advanced structural components. Initial operational capabilities of the LSMS were successfully demonstrated during field tests at Moses Lake, Washington using a tele-operated approach. Joint angle sensors have been developed for the LSMS to improve operator situational awareness.
These same sensors provide the necessary information to support fully automated operations, greatly expanding the operational versatility of the LSMS. This paper develops the equations describing the forward and inverse relations between LSMS joint angles and the Cartesian coordinates of the LSMS tip. These equations allow a variety of schemes to be used to maneuver the LSMS and to optimize those maneuvers. One such scheme is described in detail that eliminates undesirable swinging of the payload at the conclusion of a maneuver, even when the payload is suspended from a passive rigid link. Swinging is undesirable when performing precision maneuvers, such as aligning an object for mating or positioning a camera. Use of the equations described here enables automated control of the LSMS, greatly improving its operational versatility.
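As an illustration of such forward/inverse relations between joint angles and tip coordinates, here is a planar two-link sketch; the real LSMS has more joints and full 3D geometry, and the link lengths below are assumed values, not LSMS dimensions:

```python
# Planar two-link kinematics sketch: forward map (angles -> tip) and one
# inverse solution (tip -> angles). Link lengths L1, L2 are illustrative.
from math import cos, sin, acos, atan2, pi

L1, L2 = 3.0, 2.0  # boom segment lengths (assumed)

def forward(q1, q2):
    """Tip position from shoulder and elbow angles (radians)."""
    x = L1 * cos(q1) + L2 * cos(q1 + q2)
    y = L1 * sin(q1) + L2 * sin(q1 + q2)
    return x, y

def inverse(x, y):
    """One solution for the joint angles reaching tip position (x, y)."""
    c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    q2 = acos(max(-1.0, min(1.0, c2)))          # clamp against roundoff
    q1 = atan2(y, x) - atan2(L2 * sin(q2), L1 + L2 * cos(q2))
    return q1, q2

x, y = forward(pi / 4, -pi / 6)
q1, q2 = inverse(x, y)
x2, y2 = forward(q1, q2)
print(abs(x2 - x) < 1e-9 and abs(y2 - y) < 1e-9)  # True: round trip closes
```

The inverse map has two branches (elbow up/down); a trajectory scheme like the anti-swing maneuver described above would select the branch continuously along the path.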
Note: Automated optical focusing on encapsulated devices for scanning light stimulation systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bitzer, L. A.; Benson, N., E-mail: niels.benson@uni-due.de; Schmechel, R.
Recently, a scanning light stimulation system with an automated, adaptive focus correction during the measurement was introduced. Here, its application on encapsulated devices is discussed. This includes the changes an encapsulating optical medium introduces to the focusing process as well as to the subsequent light stimulation measurement. Further, the focusing method is modified to compensate for the influence of refraction and to maintain a minimum beam diameter on the sample surface.
The KUT meteor radar: An educational low cost meteor observation system by radio forward scattering
NASA Astrophysics Data System (ADS)
Madkour, W.; Yamamoto, M.
2016-01-01
The Kochi University of Technology (KUT) meteor radar is an educational low-cost observation system built at Kochi, Japan by successive graduate students since 2004. The system takes advantage of the continuous VHF-band beacon signal emitted from Fukui National College of Technology (FNCT) for scientific usage all over Japan by receiving the forward-scattered signals. The system uses the classical forward-scattering setup similar to that described by the International Meteor Organization (IMO), gradually developed from the most basic single-antenna setup to the multi-site meteor path determination setup. The primary objective is to automate the observation of meteor parameters continuously, to provide amounts of data sufficient for statistical analysis. The developed software system automates the observation of astronomical meteor parameters such as meteor direction, velocity and trajectory. In addition, automated counting of meteor echoes and their durations is used to observe mesospheric ozone concentration by analyzing the duration distribution of different meteor showers. The meteor parameters observed and the methodology used for each are briefly summarized.
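The echo-counting step can be sketched as thresholding a received-power time series: an echo is a run of consecutive samples above a detection threshold, and its duration is the run length times the sampling interval. The threshold, sampling interval and signal values below are illustrative, not the KUT system's actual parameters:

```python
# Sketch of automated meteor-echo counting and duration measurement from a
# received-power series. Threshold and sample data are assumptions.
def count_echoes(power, threshold, dt):
    """Return a list of echo durations (seconds) found in `power`."""
    durations, run = [], 0
    for p in power:
        if p > threshold:
            run += 1
        elif run:
            durations.append(run * dt)
            run = 0
    if run:
        durations.append(run * dt)
    return durations

signal = [0.1, 0.2, 3.0, 4.1, 3.5, 0.2, 0.1, 5.0, 5.2, 0.1]
print(count_echoes(signal, 1.0, 0.01))  # two echoes, ~0.03 s and ~0.02 s
```

A histogram of such durations over a shower is the duration distribution used in the ozone analysis mentioned above.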
Semi-automated Image Processing for Preclinical Bioluminescent Imaging.
Slavine, Nikolai V; McColl, Roderick W
Bioluminescent imaging is a valuable noninvasive technique for investigating tumor dynamics and specific biological molecular events in living animals to better understand the effects of human disease in animal models. The purpose of this study was to develop and test automated methods for bioluminescence image processing, from data acquisition to obtaining 3D images. To optimize this procedure, a semi-automated image processing approach within a multi-modality image-handling environment was developed. To identify a bioluminescent source location and strength we used the light flux detected on the surface of the imaged object by CCD cameras. For phantom calibration tests and object surface reconstruction we used the MLEM algorithm. For internal bioluminescent sources we used the diffusion approximation, balancing the internal and external intensities on the boundary of the medium; having determined a first-order approximation for the photon fluence, we subsequently applied a novel iterative deconvolution method to obtain the final reconstruction result. We find that the reconstruction techniques successfully used the depth-dependent light transport approach and semi-automated image processing to provide a realistic 3D model of the lung tumor. Our image processing software can optimize and decrease the time required for volumetric imaging and quantitative assessment. The data obtained from light phantom and lung mouse tumor images demonstrate the utility of the image reconstruction algorithms and the semi-automated approach to bioluminescent image processing. We suggest that the developed image processing approach can be applied to preclinical imaging studies: characterizing tumor growth, identifying metastases, and potentially determining the effectiveness of cancer treatment.
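The MLEM update used for calibration-style reconstructions can be sketched on a toy problem: iteratively rescale the source estimate so its forward projection matches the measured data. The 2×2 system matrix below is an illustration, not the paper's optical model:

```python
# Minimal MLEM (maximum-likelihood expectation-maximization) sketch:
# x <- x * A^T(y / Ax) / A^T(1), elementwise, starting from a flat estimate.
def mlem(A, y, n_iter=200):
    """A: list of detector rows; y: measured counts; returns source estimate."""
    m, n = len(A), len(A[0])
    x = [1.0] * n                                           # flat initial guess
    sens = [sum(A[i][j] for i in range(m)) for j in range(n)]  # column sums
    for _ in range(n_iter):
        proj = [sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
        ratio = [y[i] / p if p > 0 else 0.0 for i, p in enumerate(proj)]
        x = [x[j] * sum(A[i][j] * ratio[i] for i in range(m)) / sens[j]
             for j in range(n)]
    return x

A = [[1.0, 0.5], [0.2, 1.0]]              # toy detection probabilities
true_x = [2.0, 3.0]
y = [sum(A[i][j] * true_x[j] for j in range(2)) for i in range(2)]  # noiseless
x = mlem(A, y)
print([round(v, 2) for v in x])           # [2.0, 3.0]
```

For noiseless, consistent data the iteration converges to the true nonnegative source; with real CCD data it converges to the maximum-likelihood estimate instead.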
CIS-lunar space infrastructure lunar technologies: Executive summary
NASA Technical Reports Server (NTRS)
Faller, W.; Hoehn, A.; Johnson, S.; Moos, P.; Wiltberger, N.
1989-01-01
Technologies necessary for the creation of a cis-Lunar infrastructure, namely: (1) automation and robotics; (2) life support systems; (3) fluid management; (4) propulsion; and (5) rotating technologies, are explored. The technological focal point is on the development of automated and robotic systems for the implementation of a Lunar Oasis produced by Automation and Robotics (LOAR). Under direction from the NASA Office of Exploration, automation and robotics were extensively utilized as an initiating stage in the return to the Moon. A pair of autonomous rovers, modular in design and built from interchangeable and specialized components, is proposed. Utilizing a buddy system, these rovers will be able to support each other and to enhance their individual capabilities. One rover primarily explores and maps while the second rover tests the feasibility of various materials-processing techniques. The automated missions emphasize availability and potential uses of Lunar resources, and the deployment and operations of the LOAR program. An experimental bio-volume is put into place as the precursor to a Lunar environmentally controlled life support system. The bio-volume will determine the reproduction, growth and production characteristics of various life forms housed on the Lunar surface. Physicochemical regenerative technologies and stored resources will be used to buffer biological disturbances of the bio-volume environment. The in situ Lunar resources will be both tested and used within this bio-volume. Second phase development on the Lunar surface calls for manned operations. Repairs and re-configuration of the initial framework will ensue. An autonomously-initiated manned Lunar oasis can become an essential component of the United States space program.
An automated quasi-continuous capillary refill timing device
Blaxter, L L; Morris, D E; Crowe, J A; Henry, C; Hill, S; Sharkey, D; Vyas, H; Hayes-Gill, B R
2016-01-01
Capillary refill time (CRT) is a simple means of cardiovascular assessment which is widely used in clinical care. Currently, CRT is measured through manual assessment of the time taken for skin tone to return to normal colour following blanching of the skin surface. There is evidence to suggest that manually assessed CRT is subject to bias from ambient light conditions, a lack of standardisation of both blanching time and manually applied pressure, subjectivity in judging the return to normal colour, and variability in the manual assessment of time. We present a novel automated system for CRT measurement, incorporating three components: a non-invasive adhesive sensor incorporating a pneumatic actuator, a diffuse multi-wavelength reflectance measurement device, and a temperature sensor; a battery-operated datalogger unit containing a self-contained pneumatic supply; and PC-based data analysis software for the extraction of refill time, patient skin surface temperature, and sensor signal quality. Through standardisation of the test, it is hoped that some of the shortcomings of manual CRT can be overcome. In addition, an automated system will facilitate easier integration of CRT into electronic record keeping and clinical monitoring or scoring systems, as well as reducing demands on clinicians. A summary analysis of volunteer (n = 30) automated CRT datasets is presented, from 15 healthy adults and 15 healthy children (aged from 5 to 15 years), as their arms were cooled from ambient temperature to 5°C. A more detailed analysis of two typical datasets is also presented, demonstrating that the response of automated CRT to cooling matches that of previously published studies. PMID:26642080
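One plausible form of the refill-time extraction is timing from pressure release until the reflectance signal recovers to within a tolerance of its pre-blanch baseline. The trace, tolerance and sampling interval below are synthetic illustrations, not data or parameters from the device described above:

```python
# Sketch of automated refill-time extraction from a reflectance trace.
# Trace values, release index, and tolerance are illustrative assumptions.
def refill_time(trace, release_idx, baseline, dt, tol=0.05):
    """Seconds from pressure release until trace recovers to baseline*(1-tol)."""
    target = baseline * (1.0 - tol)
    for i in range(release_idx, len(trace)):
        if trace[i] >= target:
            return (i - release_idx) * dt
    return None  # no recovery observed within the record

# synthetic reflectance: baseline 1.0, blanched to 0.4, gradual return
trace = [1.0, 1.0, 0.4, 0.5, 0.65, 0.8, 0.9, 0.97, 0.99, 1.0]
print(refill_time(trace, release_idx=2, baseline=1.0, dt=0.25))  # 1.25
```

Timing against an objective recovery criterion like this is what removes the observer's subjective judgement of "return to normal colour".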
Huang, Jianyan; Maram, Jyotsna; Tepelus, Tudor C; Modak, Cristina; Marion, Ken; Sadda, SriniVas R; Chopra, Vikas; Lee, Olivia L
2017-08-07
To determine the reliability of corneal endothelial cell density (ECD) obtained by automated specular microscopy versus that of validated manual methods, and the factors that predict such reliability. Sharp central images from 94 control and 106 glaucomatous eyes were captured with a Konan specular microscope NSP-9900. All images were analyzed by trained graders using Konan CellChek Software, employing the fully- and semi-automated methods as well as the Center Method. Images with low cell count (input cell number <100) and/or guttata were compared with the Center and Flex-Center Methods. ECDs were compared and absolute error was used to assess variation. The effect on ECD of age, cell count, cell size, and cell size variation was evaluated. No significant difference was observed between the Center and Flex-Center Methods in corneas with guttata (p=0.48) or low ECD (p=0.11). No difference (p=0.32) was observed in the ECD of normal controls <40 yrs old between the fully-automated method and the manual Center Method. However, in older controls and glaucomatous eyes, ECD was overestimated by the fully-automated method (p=0.034) and the semi-automated method (p=0.025) as compared to the manual method. Our findings show that automated analysis significantly overestimates ECD in eyes with high polymegathism and/or large cell size, compared to the manual method. Therefore, we discourage reliance upon the fully-automated method alone to perform specular microscopy analysis, particularly if an accurate ECD value is imperative. Copyright © 2017. Published by Elsevier España, S.L.U.
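For reference, the quantity being compared across methods reduces to cells counted per unit frame area; the frame area and count below are illustrative, not Konan CellChek parameters:

```python
# Sketch of the endothelial cell density calculation underlying the comparison.
# The counted-cell number and analysis-frame area are assumed example values.
def cell_density(n_cells, frame_area_mm2):
    """Endothelial cell density in cells per square millimetre."""
    return n_cells / frame_area_mm2

# e.g. 150 cells traced in a 0.06 mm^2 analysis frame
print(round(cell_density(150, 0.06)))  # 2500
```

Systematic over-counting of cells (or under-estimation of traced area) by an automated grader inflates this ratio, which is the overestimation effect reported above.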
Real-Time Ocean Prediction System for the East Coast of India
NASA Astrophysics Data System (ADS)
Warrior, H. V.
2016-02-01
The primary objective of the research work reported in this abstract was to develop a Real-time Environmental Model for Ocean Dispersion and Impact (as part of an already in-place Decision Support System) for the purpose of radiological safety for the area along the Kalpakkam (east Indian) coast. This system combines real-time ocean observations with numerical models of ocean processes to provide hindcasts, nowcasts and forecasts of currents, tides and waves. In this work we present the development of an Automated Coupled Atmospheric-Ocean Model (we call it IIT-CAOM) used to forecast the sea surface currents, sea surface temperature (SST) and salinity of the Bay of Bengal region under the influence of transient and unsteady atmospheric conditions. The method couples an atmosphere model and an ocean model: WRF for the atmospheric simulations and POM for the ocean counterpart, at a 3 km × 3 km resolution. The coupled model uses GFS (Global Forecast System) data or FNL (Final Analyses) data as initial conditions for jump-starting the atmospheric model. The atmospheric model is run first, extracting air temperature, wind speed and relative humidity. The heat flux subroutine computes the net heat flux using the above-mentioned parameters. The net heat flux is fed to the ocean model by simply adding the net heat flux subroutine to the ocean model code without changing the model's original structure. The online forecast of the IIT-CAOM is currently available on the web. The whole system has been automated and runs without further manual support. The IIT-CAOM simulations have been carried out for the Kalpakkam region, which is located on the east coast of India, about 70 km south of Chennai in Tamil Nadu State, and a three-day forecast of sea surface currents, sea surface temperature (SST) and salinity has been obtained.
Array automated assembly, phase 2
NASA Technical Reports Server (NTRS)
Taylor, W. E.
1978-01-01
An analysis was made of cost tradeoffs for shaping modified square wafers from cylindrical crystals. Tests were conducted of the effectiveness of texture etching for removal of surface damage on sawed wafers. A single step texturing etch appeared adequate for removal of surface damage on wafers cut with multiple blade reciprocating slurry saws.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-30
... Automation, Inc. (``Amistar'') of San Marcos, California; Techno Soft Systemnics, Inc. (``Techno Soft'') of... the claim terms ``test,'' ``match score surface,'' and ``gradient direction,'' all of his infringement... complainants' proposed construction for the claim terms ``test,'' ``match score surface,'' and ``gradient...
Observed Thermal Impacts of Wind Farms Over Northern Illinois.
Slawsky, Lauren M; Zhou, Liming; Baidya Roy, Somnath; Xia, Geng; Vuille, Mathias; Harris, Ronald A
2015-06-25
This paper assesses the impacts of three wind farms in northern Illinois using land surface temperature (LST) data from the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments onboard the Terra and Aqua satellites for the period 2003-2013. Changes in LST between two periods (before and after construction of the wind turbines) and between wind farm pixels and nearby non-wind-farm pixels are quantified. An areal mean increase in LST of 0.18-0.39 °C is observed at nighttime over the wind farms, with the geographic distribution of this warming effect generally spatially coupled with the layout of the wind turbines (referred to as the spatial coupling), while there is no apparent impact on daytime LST. The nighttime LST warming effect varies with season, with the strongest warming in the winter months of December-February and the tightest spatial coupling in the summer months of June-August. Analysis of seasonal variations in wind speed and direction from weather balloon sounding data and Automated Surface Observing System hourly observations from nearby stations suggests that stronger winds correspond to seasons with greater warming and larger downwind impacts. The early morning soundings in Illinois are representative of the nighttime boundary layer and exhibit strong temperature inversions across all seasons. The strong and relatively shallow inversion in summer leaves warm air readily available to be mixed down and spatially well coupled with the turbines. Although the warming effect is strongest in winter, the spatial coupling is more erratic and spread out than in summer. These results suggest that the observed warming signal at nighttime is likely due to the net downward transport of heat from warmer air aloft to the surface, caused by the turbulent mixing in the wakes of the spinning turbine rotor blades.
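The before/after, wind-farm/control comparison described above is in effect a difference-in-differences estimate; a minimal sketch with illustrative (not observed) LST values:

```python
# Difference-in-differences sketch: post-minus-pre LST change at wind-farm
# pixels minus the same change at nearby control pixels. Values are made up.
def mean(xs):
    return sum(xs) / len(xs)

def lst_impact(wf_pre, wf_post, ctrl_pre, ctrl_post):
    """Warming attributable to the wind farm, in the units of the inputs."""
    return (mean(wf_post) - mean(wf_pre)) - (mean(ctrl_post) - mean(ctrl_pre))

wf_pre    = [14.2, 14.5, 14.1]   # nighttime LST (deg C) before construction
wf_post   = [14.8, 15.0, 14.7]   # after construction
ctrl_pre  = [14.3, 14.4, 14.2]   # non-wind-farm pixels, same periods
ctrl_post = [14.5, 14.6, 14.4]
print(round(lst_impact(wf_pre, wf_post, ctrl_pre, ctrl_post), 2))  # 0.37
```

Subtracting the control-pixel change removes regional warming or interannual variability common to both pixel groups, isolating the turbine effect.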
NASA Astrophysics Data System (ADS)
Oda, Masahiro; Kitasaka, Takayuki; Furukawa, Kazuhiro; Watanabe, Osamu; Ando, Takafumi; Goto, Hidemi; Mori, Kensaku
2011-03-01
The purpose of this paper is to present a new method to detect ulcers, one of the symptoms of Crohn's disease, from CT images. Crohn's disease is an inflammatory disease of the digestive tract that commonly affects the small intestine. An optical or a capsule endoscope is used for small intestine examinations. However, these endoscopes cannot pass through intestinal stenosis parts in some cases. A CT image based diagnosis allows a physician to observe the whole intestine even if intestinal stenosis exists. However, because of the complicated shape of the small and large intestines, understanding the shapes of the intestines and the lesion positions is difficult in CT image based diagnosis. A computer-aided diagnosis system for Crohn's disease with automated lesion detection is required for efficient diagnosis. We propose an automated method to detect ulcers from CT images. Longitudinal ulcers roughen the surface of the small and large intestinal wall. The rough surface consists of a combination of convex and concave parts on the intestinal wall. We detect convex and concave parts on the intestinal wall with blob and inverse-blob structure enhancement filters. Many convex and concave parts concentrate on the roughened regions. We introduce a roughness value to differentiate convex and concave parts concentrated on the roughened regions from the others on the intestinal wall. The roughness value effectively reduces false positives in ulcer detection. Experimental results showed that the proposed method can detect convex and concave parts on the ulcers.
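The roughness idea can be sketched as a local count of blob/inverse-blob detections, so that isolated responses score low while dense mixtures of bumps and pits score high. The 1D wall profile and window size below are illustrative simplifications of the 3D method:

```python
# Sketch of a roughness value: count filter detections (+1 convex, -1 concave,
# 0 none) inside a sliding window along the wall. Profile data are made up.
def roughness(detections, window=2):
    """Per-point count of nonzero detections within +/- `window` positions."""
    n = len(detections)
    return [sum(1 for j in range(max(0, i - window), min(n, i + window + 1))
                if detections[j] != 0)
            for i in range(n)]

wall = [0, 0, 1, -1, 1, -1, 0, 0, 0, 1, 0, 0]   # ulcerated stretch at 2-5
print(roughness(wall))  # peaks over the alternating convex/concave stretch
```

Thresholding this value keeps the clustered detections over the ulcer and discards the isolated response at index 9, which is the false-positive suppression described above.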
Inspection of imprint lithography patterns for semiconductor and patterned media
NASA Astrophysics Data System (ADS)
Resnick, Douglas J.; Haase, Gaddi; Singh, Lovejeet; Curran, David; Schmid, Gerard M.; Luo, Kang; Brooks, Cindy; Selinidis, Kosta; Fretwell, John; Sreenivasan, S. V.
2010-03-01
Imprint lithography has been shown to be an effective technique for replication of nano-scale features. Acceptance of imprint lithography for manufacturing will require demonstration that it can attain defect levels commensurate with the requirements of cost-effective device production. This work summarizes the results of defect inspections of semiconductor masks, wafers and hard disks patterned using Jet and Flash Imprint Lithography (J-FIL(TM)). Inspections were performed with optical and e-beam based automated inspection tools. For the semiconductor market, a test mask was designed which included dense features (with half pitches ranging between 32 nm and 48 nm) containing an extensive array of programmed defects. For this work, both e-beam inspection and optical inspection were used to detect both random defects and the programmed defects. Analytical SEMs were then used to review the defects detected by the inspection. Defect trends over the course of many wafers were observed with another test mask using a KLA-Tencor 2132 optical inspection tool. The primary source of defects over 2000 imprints was particle related. For the hard drive market, it is important to understand the defectivity of both the template and the imprinted disk. This work presents a methodology for automated pattern inspection and defect classification for imprint-patterned media. Candela CS20 and 6120 tools from KLA-Tencor map the optical properties of the disk surface, producing high-resolution grayscale images of surface reflectivity, scattered light, phase shift, etc. Defects that have been identified in this manner are further characterized according to their morphology.
Gao, Shan; van 't Klooster, Ronald; Brandts, Anne; Roes, Stijntje D; Alizadeh Dehnavi, Reza; de Roos, Albert; Westenberg, Jos J M; van der Geest, Rob J
2017-01-01
To develop and evaluate a method that can fully automatically identify the vessel wall boundaries and quantify the wall thickness for both the common carotid artery (CCA) and the descending aorta (DAO) from axial magnetic resonance (MR) images. 3T MRI data acquired with a T1-weighted gradient-echo black-blood imaging sequence from carotid (39 subjects) and aorta (39 subjects) were used to develop and test the algorithm. The vessel wall segmentation was achieved by respectively fitting a 3D cylindrical B-spline surface to the boundaries of the lumen and the outer wall. The tube fitting was based on edge detection performed on the signal intensity (SI) profile along the surface normal. To achieve a fully automated process, a Hough Transform (HT) was developed to estimate the lumen centerline and radii for the target vessel. Using the outputs of the HT, a tube model for lumen segmentation was initialized and deformed to fit the image data. Finally, the lumen segmentation was dilated to initiate the adaptation procedure for the outer wall tube. The algorithm was validated by determining: 1) its performance against manual tracing; 2) its interscan reproducibility in quantifying vessel wall thickness (VWT); 3) its capability of detecting VWT differences in hypertensive patients compared with healthy controls. Statistical analysis including Bland-Altman analysis, t-tests, and sample size calculation was performed for the purpose of algorithm evaluation. The mean distance between the manual and automatically detected lumen/outer wall contours was 0.00 ± 0.23/0.09 ± 0.21 mm for the CCA and 0.12 ± 0.24/0.14 ± 0.35 mm for the DAO. No significant difference was observed between the interscan VWT assessments using automated segmentation for both the CCA (P = 0.19) and the DAO (P = 0.94). Both manual and automated segmentation detected significantly higher carotid (P = 0.016 and P = 0.005) and aortic (P < 0.001 and P = 0.021) wall thickness in the hypertensive patients.
A reliable and reproducible pipeline for fully automatic vessel wall quantification was developed and validated on healthy volunteers as well as patients with increased vessel wall thickness. This method holds promise for helping in efficient image interpretation for large-scale cohort studies. Level of Evidence: 4. J. Magn. Reson. Imaging 2017;45:215-228. © 2016 International Society for Magnetic Resonance in Medicine.
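The Hough-transform initialization step can be sketched as edge points voting for candidate circle parameters, with the accumulator peak giving an initial lumen centre and radius. The grid resolution, angular sampling, and synthetic boundary points below are illustrative, not the paper's implementation:

```python
# Circular Hough transform sketch: each edge point votes for the (cx, cy, r)
# circles it could lie on; the accumulator peak is the best candidate.
from math import cos, sin, pi
from collections import Counter

def hough_circle(points, radii, n_angles=36):
    votes = Counter()
    for (x, y) in points:
        for r in radii:
            for k in range(n_angles):
                a = 2 * pi * k / n_angles
                # centre candidate lying distance r from the edge point
                votes[(round(x - r * cos(a)), round(y - r * sin(a)), r)] += 1
    return votes.most_common(1)[0][0]   # best (cx, cy, r)

# synthetic lumen boundary: circle of radius 5 centred at (10, 12)
edge = [(10 + 5 * cos(2 * pi * k / 24), 12 + 5 * sin(2 * pi * k / 24))
        for k in range(24)]
best = hough_circle(edge, radii=[4, 5, 6])
print(best)  # peak near (10, 12, 5)
```

In the 3D pipeline this estimate, repeated per slice, seeds the tube model that is then deformed to fit the image data.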
Automated Figuring and Polishing of Replication Mandrels for X-Ray Telescopes
NASA Technical Reports Server (NTRS)
Krebs, Carolyn (Technical Monitor); Content, David; Fleetwood, Charles; Wright, Geraldine; Arsenovic, Petar; Collela, David; Kolos, Linette
2003-01-01
In support of the Constellation-X mission, the Optics Branch at Goddard Space Flight Center is developing technology for precision figuring and polishing of mandrels used to produce replicated mirrors for X-ray telescopes. Employing a specially built machine controlled in two axes by a computer, we are doing automated polishing/figuring of 15 cm long, 20 cm diameter cylindrical, conical and Wolter mandrels. A battery of tests allows us to fully characterize all important aspects of the mandrels, including surface figure and finish, mid-frequency errors, diameters and cone angle. Parts are currently being produced with surface roughnesses at the 0.5 nm RMS level, and half-power-diameter slope error less than 2 arcseconds.
Printing quality control automation
NASA Astrophysics Data System (ADS)
Trapeznikova, O. V.
2018-04-01
One of the most important problems in standardizing the offset printing process is controlling the print quality rating and automating that control. To solve this problem, software has been developed that takes into account the specifics of the printing system components and their behavior during the printing process. To characterize the distribution of ink on the printed substrate, the deviation of the ink layer thickness on the sheet from the nominal surface is suggested as a metric. Constructing surface projections of the color gamut bodies from the geometric data allows the color reproduction gamut of printing systems to be visualized across brightness ranges and specific color sectors, which provides a qualitative comparison of systems by their reproduction of individual colors over varying ranges of brightness.
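The suggested deviation metric can be sketched as a per-patch difference between measured and nominal ink-layer thickness, summarised per sheet; the patch values below are illustrative:

```python
# Sketch of an ink-layer deviation summary: per-patch (measured - nominal)
# differences reduced to a mean deviation and a worst-case magnitude.
def thickness_deviation(measured, nominal):
    """Return (mean deviation, max absolute deviation) over control patches."""
    dev = [m - n for m, n in zip(measured, nominal)]
    return sum(dev) / len(dev), max(abs(d) for d in dev)

measured = [1.02, 0.98, 1.05, 1.00]   # micrometres at control patches (assumed)
nominal  = [1.00, 1.00, 1.00, 1.00]
mean_dev, max_abs = thickness_deviation(measured, nominal)
print(round(mean_dev, 4), round(max_abs, 4))  # 0.0125 0.05
```

A sheet passes the automated check when both statistics stay inside tolerances fixed by the standardization procedure.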
Semi-Automated Identification of Rocks in Images
NASA Technical Reports Server (NTRS)
Bornstein, Benjamin; Castano, Andres; Anderson, Robert
2006-01-01
Rock Identification Toolkit Suite is a computer program that assists users in identifying and characterizing rocks shown in images returned by the Mars Exploration Rover mission. Included in the program are components for automated finding of rocks, interactive adjustment of the outlines of rocks, active contouring of rocks, and automated analysis of shapes in two dimensions. The program assists users in evaluating the surface properties of rocks and soil and reports basic properties of rocks. The program requires either the Mac OS X operating system running on a G4 (or more capable) processor or a Linux operating system running on a Pentium (or more capable) processor, plus at least 128 MB of random-access memory.
Automated mixed traffic vehicle design AMTV 2
NASA Technical Reports Server (NTRS)
Johnston, A. R.; Marks, R. A.; Cassell, P. L.
1982-01-01
The design of an improved and enclosed Automated Mixed Traffic Transit (AMTT) vehicle is described. AMTT is an innovative concept for low-speed tram-type transit in which suitable vehicles are equipped with sensors and controls to permit them to operate in an automated mode on existing road or walkway surfaces. The vehicle chassis and body design are presented in terms of sketches and photographs. The functional design of the sensing and control system is presented, and modifications which could be made to the baseline design for improved performance, in particular to incorporate a 20-mph capability, are also discussed. The vehicle system is described at the block-diagram level of detail. Specifications and parameter values are given where available.
Portouli, Evangelia; Nathanael, Dimitris; Marmaras, Nicolas
2014-01-01
Social interactions with other road users are an essential component of the driving activity and may prove critical in view of future automation systems; yet up to now they have received only limited attention in the scientific literature. In this paper, it is argued that drivers base their anticipations about the traffic scene to a large extent on observations of the social behaviour of other 'animate human-vehicles'. It is further argued that in cases of uncertainty, drivers seek to establish a mutual situational awareness through deliberate communicative interactions. A linguistic model is proposed for modelling these communicative interactions. Empirical evidence from on-road observations and analysis of concurrent running commentary by 25 experienced drivers supports the proposed model. It is suggested that the integration of a social-interactions layer based on illocutionary acts in future driving support and automation systems will improve their performance towards matching human drivers' expectations. Practitioner Summary: Interactions between drivers on the road may play a significant role in traffic coordination. On-road observations and running commentaries are presented as empirical evidence to support a model of such interactions; incorporation of drivers' interactions in future driving support and automation systems may improve their performance towards matching drivers' expectations.
Comparison of landmark-based and automatic methods for cortical surface registration
Pantazis, Dimitrios; Joshi, Anand; Jiang, Jintao; Shattuck, David; Bernstein, Lynne E.; Damasio, Hanna; Leahy, Richard M.
2009-01-01
Group analysis of structure or function in cerebral cortex typically involves as a first step the alignment of the cortices. A surface based approach to this problem treats the cortex as a convoluted surface and coregisters across subjects so that cortical landmarks or features are aligned. This registration can be performed using curves representing sulcal fundi and gyral crowns to constrain the mapping. Alternatively, registration can be based on the alignment of curvature metrics computed over the entire cortical surface. The former approach typically involves some degree of user interaction in defining the sulcal and gyral landmarks while the latter methods can be completely automated. Here we introduce a cortical delineation protocol consisting of 26 consistent landmarks spanning the entire cortical surface. We then compare the performance of a landmark-based registration method that uses this protocol with that of two automatic methods implemented in the software packages FreeSurfer and BrainVoyager. We compare performance in terms of discrepancy maps between the different methods, the accuracy with which regions of interest are aligned, and the ability of the automated methods to correctly align standard cortical landmarks. Our results show similar performance for ROIs in the perisylvian region for the landmark-based method and FreeSurfer. However, the discrepancy maps showed larger variability between methods in occipital and frontal cortex and also that automated methods often produce misalignment of standard cortical landmarks. Consequently, selection of the registration approach should consider the importance of accurate sulcal alignment for the specific task for which coregistration is being performed. When automatic methods are used, the users should ensure that sulci in regions of interest in their studies are adequately aligned before proceeding with subsequent analysis. PMID:19796696
NASA Technical Reports Server (NTRS)
Vaughan, R. Greg; Hook, Simon J.
2006-01-01
ASTER thermal infrared data over Mount St. Helens were used to characterize its thermal behavior from Jun 2000 to Feb 2006. Prior to the Oct 2004 eruption, the average crater temperature varied seasonally between -12 and 6 C. After the eruption, the maximum single-pixel temperature increased from 10 C (Oct 2004) to 96 C (Aug 2005), then decreased through Feb 2006. The initial increase in temperature was correlated with dome morphology and growth rate; the subsequent decrease was interpreted to reflect both seasonal trends and a decreased growth rate/increased cooling rate, possibly suggesting a significant change in the volcanic system. A single-pixel ASTER thermal anomaly first appeared on Oct 1, 2004, eleven hours after the first eruption and 10 days before new lava was exposed at the surface. By contrast, an automated algorithm for detecting thermal anomalies in MODIS data did not trigger an alert until Dec 18. However, a single-pixel thermal anomaly first appeared in MODIS channel 23 (4 um) on Oct 13, 12 days after the first eruption and 2 days after lava was exposed. The earlier thermal anomaly detected with ASTER data is attributed to its higher spatial resolution (90 m) compared with MODIS (1 km), and the earlier visual observation of anomalous pixels compared to the automated detection method suggests that local spatial statistics and background radiance data could improve automated detection methods.
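The closing suggestion, that local spatial statistics could improve automated hot-spot detection, can be sketched as a per-pixel z-score against a local background window. This is a generic illustration, not the operational MODIS alerting algorithm; the window size and threshold are assumptions.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def thermal_anomalies(bt, win=5, z_thresh=3.0):
    """Flag pixels whose brightness temperature `bt` stands out from the
    local background: z = (pixel - window mean) / window std."""
    pad = win // 2
    padded = np.pad(bt, pad, mode="edge")      # replicate edges for the border
    windows = sliding_window_view(padded, (win, win))
    mu = windows.mean(axis=(-2, -1))           # local background mean
    sd = windows.std(axis=(-2, -1)) + 1e-9     # avoid divide-by-zero
    return (bt - mu) / sd > z_thresh
```

On a uniform scene with a single hot pixel, only that pixel exceeds the z-score threshold; its immediate neighbors are suppressed because the hot pixel inflates their background statistics.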
Tang, Xiaoying; Luo, Yuan; Chen, Zhibin; Huang, Nianwei; Johnson, Hans J.; Paulsen, Jane S.; Miller, Michael I.
2018-01-01
In this paper, we present a fully-automated subcortical and ventricular shape generation pipeline that acts on structural magnetic resonance images (MRIs) of the human brain. Principally, the proposed pipeline consists of three steps: (1) automated structure segmentation using the diffeomorphic multi-atlas likelihood-fusion algorithm; (2) study-specific shape template creation based on the Delaunay triangulation; (3) deformation-based shape filtering using the large deformation diffeomorphic metric mapping for surfaces. The proposed pipeline is shown to provide high accuracy, sufficient smoothness, and accurate anatomical topology. Two datasets focused upon Huntington's disease (HD) were used for evaluating the performance of the proposed pipeline. The first of these contains a total of 16 MRI scans, each with a gold standard available, on which the proposed pipeline's outputs were observed to be highly accurate and smooth when compared with the gold standard. Visual examinations and outlier analyses on the second dataset, which contains a total of 1,445 MRI scans, revealed 100% success rates for the putamen, the thalamus, the globus pallidus, the amygdala, and the lateral ventricle in both hemispheres and rates no smaller than 97% for the bilateral hippocampus and caudate. Another independent dataset, consisting of 15 atlas images and 20 testing images, was also used to quantitatively evaluate the proposed pipeline, with high accuracy having been obtained. In short, the proposed pipeline is herein demonstrated to be effective, both quantitatively and qualitatively, using a large collection of MRI scans. PMID:29867332
Surface-enhanced Raman scattering (SERS) dosimeter and probe
Vo-Dinh, Tuan
1995-01-01
A dosimeter and probe for measuring exposure to chemical and biological compounds is disclosed. The dosimeter or probe includes a collector which may be analyzed by surface-enhanced Raman spectroscopy. The collector comprises a surface-enhanced Raman scattering-active material having a coating applied thereto to improve the adsorption properties of the collector. The collector may also be used in automated sequential devices and in probe-array devices.
A method for modeling contact dynamics for automated capture mechanisms
NASA Technical Reports Server (NTRS)
Williams, Philip J.
1991-01-01
Logicon Control Dynamics develops contact dynamics models for space-based docking and berthing vehicles. The models compute contact forces for the physical contact between mating capture mechanism surfaces. Realistic simulation requires that the proportionality constants used for calculating contact forces approximate the surface stiffness of the contacting bodies. For rigid metallic bodies these constants become quite large, so small penetrations of surface boundaries can produce large contact forces.
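A minimal penalty-formulation sketch of such a contact model follows; it is a generic illustration, not the Logicon implementation, and the stiffness value used in the comment is an assumption chosen only to show the scale of forces for rigid metallic bodies.

```python
def contact_force(penetration_m, stiffness_n_per_m,
                  damping_ns_per_m=0.0, penetration_rate_mps=0.0):
    """Penalty-based normal contact force: zero while the surfaces are
    separated, k*d + c*d_dot once the boundaries interpenetrate (d > 0)."""
    if penetration_m <= 0.0:
        return 0.0
    return (stiffness_n_per_m * penetration_m
            + damping_ns_per_m * penetration_rate_mps)

# A stiff metallic interface: with k = 1e8 N/m, even a 0.1 mm penetration
# produces a 10 kN contact force, illustrating the abstract's point.
force = contact_force(1e-4, 1e8)
```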
Automated acoustic matrix deposition for MALDI sample preparation.
Aerni, Hans-Rudolf; Cornett, Dale S; Caprioli, Richard M
2006-02-01
Novel high-throughput sample preparation strategies for MALDI imaging mass spectrometry (IMS) and profiling are presented. An acoustic reagent multispotter was developed to provide improved reproducibility for depositing matrix onto a sample surface, such as a tissue section. The unique design of the acoustic droplet ejector and its optimization for depositing matrix solution are discussed. Since it does not contain a capillary or nozzle for fluid ejection, issues with clogging of these orifices are avoided. Automated matrix deposition provides better control of conditions affecting protein extraction and matrix crystallization, with the ability to deposit matrix accurately onto small surface features. For tissue sections, matrix spots of 180-200 microm in diameter were obtained, and a procedure is described for generating coordinate files readable by a mass spectrometer to permit automated profile acquisition. Mass spectral quality and reproducibility were found to be better than those obtained with manual pipet spotting. The instrument can also deposit matrix spots in a dense array pattern so that, after analysis in a mass spectrometer, two-dimensional ion images may be constructed. Example ion images from a mouse brain are presented.
Isolation of circulating tumor cells from pancreatic cancer by automated filtration
Brychta, Nora; Drosch, Michael; Driemel, Christiane; Fischer, Johannes C.; Neves, Rui P.; Esposito, Irene; Knoefel, Wolfram; Möhlendick, Birte; Hille, Claudia; Stresemann, Antje; Krahn, Thomas; Kassack, Matthias U.; Stoecklein, Nikolas H.; von Ahsen, Oliver
2017-01-01
It is now widely recognized that the isolation of circulating tumor cells based on cell surface markers might be hindered by variability in their protein expression. Especially in pancreatic cancer, isolation based only on EpCAM expression has produced very diverse results. Methods that are independent of surface markers, and therefore independent of phenotypical changes in the circulating cells, might increase CTC recovery in pancreatic cancer as well. We compared an EpCAM-dependent (IsoFlux) and a size-dependent (automated Siemens Healthineers filtration device) isolation method for the enrichment of pancreatic cancer CTCs. The recovery rate of the filtration-based approach is dramatically superior to that of the EpCAM-dependent approach, especially for cells with low EpCAM expression (filtration: 52%; EpCAM-dependent: 1%). As storage and shipment of clinical samples are important for centralized analyses, we also evaluated the use of frozen diagnostic leukapheresis (DLA) as a source for isolating CTCs and subsequent genetic analyses such as KRAS mutation detection. Using frozen DLA samples of pancreatic cancer patients, we detected CTCs in 42% of the samples by automated filtration. PMID:29156783
Predicting Flows of Rarefied Gases
NASA Technical Reports Server (NTRS)
LeBeau, Gerald J.; Wilmoth, Richard G.
2005-01-01
DSMC Analysis Code (DAC) is a flexible, highly automated, easy-to-use computer program for predicting flows of rarefied gases -- especially flows of upper-atmospheric, propulsion, and vented gases impinging on spacecraft surfaces. DAC implements the direct simulation Monte Carlo (DSMC) method, which is widely recognized as standard for simulating flows at densities so low that the continuum-based equations of computational fluid dynamics are invalid. DAC enables users to model complex surface shapes and boundary conditions quickly and easily. The discretization of a flow field into computational grids is automated, thereby relieving the user of a traditionally time-consuming task while ensuring (1) appropriate refinement of grids throughout the computational domain, (2) determination of optimal settings for temporal discretization and other simulation parameters, and (3) satisfaction of the fundamental constraints of the method. In so doing, DAC ensures an accurate and efficient simulation. In addition, DAC can utilize parallel processing to reduce computation time. The domain decomposition needed for parallel processing is completely automated, and the software employs a dynamic load-balancing mechanism to ensure optimal parallel efficiency throughout the simulation.
NASA Astrophysics Data System (ADS)
Nemani, R. R.; Votava, P.; Golden, K.; Hashimoto, H.; Jolly, M.; White, M.; Running, S.; Coughlan, J.
2003-12-01
The latest generation of NASA Earth Observing System satellites has brought a new dimension to continuous monitoring of the living part of the Earth System, the Biosphere. EOS data can now provide weekly global measures of vegetation productivity and ocean chlorophyll, and many related biophysical factors such as land cover changes or snowmelt rates. However, the information with the highest economic value would be forecasts of impending biosphere conditions that would allow advanced decision-making to mitigate dangers or exploit positive trends. We have developed a software system called the Terrestrial Observation and Prediction System (TOPS) to facilitate rapid analysis of ecosystem states/functions by integrating EOS data with ecosystem models, surface weather observations, and weather/climate forecasts. Land products from MODIS (Moderate Resolution Imaging Spectroradiometer), including land cover, albedo, snow, surface temperature, and leaf area index, are ingested into TOPS for parameterization of models and for verifying model outputs such as snow cover and vegetation phenology. TOPS is programmed to gather data from observing networks such as USDA soil moisture networks, AMERIFLUX, and SNOTEL to further enhance model predictions. Key technologies enabling TOPS implementation include the ability to understand and process heterogeneous, distributed data sets, automated planning and execution of ecosystem models, and causation analysis for understanding model outputs. Current TOPS implementations at local (vineyard) to global scales (global net primary production) can be found at http://www.ntsg.umt.edu/tops.
Automation Hooks Architecture for Flexible Test Orchestration - Concept Development and Validation
NASA Technical Reports Server (NTRS)
Lansdowne, C. A.; Maclean, John R.; Winton, Chris; McCartney, Pat
2011-01-01
The Automation Hooks Architecture Trade Study for Flexible Test Orchestration sought a standardized, data-driven alternative to conventional automated test programming interfaces. The study recommended composing the interface using multicast DNS service discovery (mDNS/DNS-SD), Representational State Transfer (RESTful) Web services, and Automatic Test Markup Language (ATML). We describe additional efforts to rapidly mature the Automation Hooks Architecture candidate interface definition by validating it in a broad spectrum of applications. These activities have allowed us to further refine our concepts and provide observations directed toward objectives of economy, scalability, versatility, performance, severability, maintainability, scriptability, and others.
NASA Astrophysics Data System (ADS)
Karsten, L. R.; Gochis, D.; Dugger, A. L.; McCreight, J. L.; Barlage, M. J.; Fall, G. M.; Olheiser, C.
2017-12-01
Since version 1.0 of the National Water Model (NWM) went operational in Summer 2016, several upgrades to the model have occurred to improve hydrologic prediction for the continental United States. Version 1.1 of the NWM (Spring 2017) includes upgrades to parameter datasets impacting land surface hydrologic processes. These parameter datasets were upgraded using an automated calibration workflow that utilizes the Dynamically Dimensioned Search (DDS) algorithm to adjust parameter values using observed streamflow. As such, these upgrades to parameter values took advantage of various observations collected for snow analysis. In particular, these included in-situ SNOTEL observations in the Western US, volunteer in-situ observations across the entire US, gamma-derived snow water equivalent (SWE) observations courtesy of the NWS NOAA Corps program, gridded snow depth and SWE products from the Jet Propulsion Laboratory (JPL) Airborne Snow Observatory (ASO), gridded remotely sensed satellite-based snow products (MODIS, AMSR2, VIIRS, ATMS), and gridded SWE from the NWS Snow Data Assimilation System (SNODAS). This study explores the use of these observations to quantify NWM error and improvements from version 1.0 to version 1.1, along with subsequent work since then. In addition, this study explores the use of snow observations within the automated calibration workflow. Gridded parameter fields impacting the accumulation and ablation of snow states in the NWM were adjusted and calibrated using gridded remotely sensed snow states, SNODAS products, and in-situ snow observations. This calibration adjustment took place over various ecological regions in snow-dominated parts of the US for a retrospective period chosen to capture a variety of climatological conditions. Specifically, the latest calibrated parameters impacting streamflow were held constant and only parameters impacting snow physics were tuned using snow observations and analysis.
The adjusted parameter datasets were then used to force the model over an independent period for analysis against both snow and streamflow observations to see if improvements took place. The goal of this work is to further improve snow physics in the NWM, along with identifying areas where further work will take place in the future, such as data assimilation or further forcing improvements.
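The Dynamically Dimensioned Search at the heart of such automated calibration workflows can be sketched as follows. This is a toy illustration against a quadratic objective; the operational NWM workflow wraps a distributed hydrologic model and streamflow/snow error metrics, and the neighborhood size `r`, bounds, and iteration count here are assumptions.

```python
import math
import random

def dds(objective, bounds, max_iter=1000, r=0.2, seed=1):
    """Dynamically Dimensioned Search: a greedy, single-solution search
    that perturbs fewer and fewer dimensions as iterations progress,
    shifting from global exploration to local refinement."""
    rng = random.Random(seed)
    lo = [b[0] for b in bounds]
    hi = [b[1] for b in bounds]
    best = [rng.uniform(l, h) for l, h in zip(lo, hi)]
    best_f = objective(best)
    for i in range(1, max_iter + 1):
        # Probability of perturbing each dimension decays with iteration.
        p = 1.0 - math.log(i) / math.log(max_iter)
        dims = [d for d in range(len(bounds)) if rng.random() < p]
        if not dims:
            dims = [rng.randrange(len(bounds))]   # always perturb at least one
        cand = list(best)
        for d in dims:
            cand[d] += rng.gauss(0.0, r * (hi[d] - lo[d]))
            if cand[d] < lo[d]:                   # reflect at the lower bound
                cand[d] = lo[d] + (lo[d] - cand[d])
            if cand[d] > hi[d]:                   # reflect at the upper bound
                cand[d] = hi[d] - (cand[d] - hi[d])
            cand[d] = min(max(cand[d], lo[d]), hi[d])  # clamp any leftover
        f = objective(cand)
        if f < best_f:                            # greedy acceptance
            best, best_f = cand, f
    return best, best_f
```

In a calibration setting, `objective` would run the model with the candidate parameter set and return an error statistic against observed streamflow or snow states.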
Automation of Coordinated Planning Between Observatories: The Visual Observation Layout Tool (VOLT)
NASA Technical Reports Server (NTRS)
Maks, Lori; Koratkar, Anuradha; Kerbel, Uri; Pell, Vince
2002-01-01
Fulfilling the promise of the era of great observatories, NASA now has more than three space-based astronomical telescopes operating in different wavebands. This situation provides astronomers with the unique opportunity of simultaneously observing a target in multiple wavebands with these observatories. Currently, scheduling multiple observatories simultaneously for coordinated observations is highly inefficient. Coordinated observations require painstaking manual collaboration among the observatory staff at each observatory. Because coordinated observations are time-consuming and expensive to schedule, observatories often limit the number that can be conducted. In order to exploit new paradigms for observatory operation, the Advanced Architectures and Automation Branch of NASA's Goddard Space Flight Center has developed a tool called the Visual Observation Layout Tool (VOLT). The main objective of VOLT is to provide a visual tool to automate the planning of coordinated observations by multiple astronomical observatories. Four of NASA's space-based astronomical observatories - the Hubble Space Telescope (HST), Far Ultraviolet Spectroscopic Explorer (FUSE), Rossi X-ray Timing Explorer (RXTE), and Chandra - are enthusiastically pursuing the use of VOLT. This paper will focus on the purpose for developing VOLT, as well as the lessons learned during the infusion of VOLT into the planning and scheduling operations of these observatories.
Driving Performance After Self-Regulated Control Transitions in Highly Automated Vehicles.
Eriksson, Alexander; Stanton, Neville A
2017-12-01
This study aims to explore whether driver-paced, noncritical transitions of control may counteract some of the aftereffects observed in the contemporary literature, resulting in higher levels of vehicle control. Research into control transitions in highly automated driving has focused on urgent scenarios where drivers are given a relatively short time span to respond to a request to resume manual control, resulting in seemingly scrambled control when manual control is resumed. Twenty-six drivers drove two scenarios with an automated driving feature activated. Drivers were asked to read a newspaper or monitor the system and to relinquish or resume control from the automation when prompted by vehicle systems. Driving performance, in terms of lane positioning and steering behavior, was assessed for 20 seconds after control was resumed to capture the resulting level of control. It was found that lane positioning was virtually unaffected for the duration of the 20-second time span in both automated conditions compared to the manual baseline when drivers resumed manual control; however, significant increases in the standard deviation of steering input were found for both automated conditions compared to baseline. No significant differences were found between the two automated conditions. The results indicate that when drivers self-paced the transfer back to manual control they exhibited fewer of the detrimental effects observed in system-paced conditions. It was shown that self-paced transitions could reduce the risk of accidents near the edge of the operational design domain. Vehicle manufacturers must consider these benefits when designing contemporary systems.
NASA Astrophysics Data System (ADS)
Shulski, Martha D.; Seeley, Mark W.
2004-11-01
Models were utilized to determine the snow accumulation season (SAS) and to quantify windblown snow for the purpose of snowdrift control for locations in Minnesota. The models require mean monthly temperature, snowfall, density of snow, and wind frequency distribution statistics. Temperature and precipitation data were obtained from local cooperative observing sites, and wind data came from Automated Surface Observing System (ASOS)/Automated Weather Observing System (AWOS) sites in the region. The temperature-based algorithm used to define the SAS reveals a geographic variability in the starting and ending dates of the season, which is determined by latitude and elevation. Mean seasonal snowfall shows a geographic distribution that is affected by topography and proximity to Lake Superior. Mean snowfall density also exhibits variability, with lower-density snow events displaced to higher-latitude positions. Seasonal wind frequencies show a strong bimodal distribution with peaks from the northwest and southeast vector directions, with an exception for locations in close proximity to the Lake Superior shoreline. In addition, for western and south-central Minnesota there is a considerably higher frequency of wind speeds above the mean snow transport threshold of 7 m/s. As such, this area is more conducive to higher potential snow transport totals. Snow relocation coefficients in this area are in the range of 0.4-0.9, and, according to the empirical models used in this analysis, this range implies that actual snow transport is 40%-90% of the total potential in south-central and western areas of the state.
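The two quantities at the end of the abstract, the frequency of winds above the 7 m/s transport threshold and actual transport as a relocation coefficient times potential transport, combine as in this small sketch; the wind speeds and potential-transport figure below are made-up inputs for illustration.

```python
def transport_summary(wind_speeds_mps, potential_transport, relocation_coeff,
                      threshold_mps=7.0):
    """Return (fraction of wind observations above the snow-transport
    threshold, actual transport = relocation coefficient * potential)."""
    n_above = sum(1 for w in wind_speeds_mps if w > threshold_mps)
    frac_above = n_above / len(wind_speeds_mps)
    return frac_above, relocation_coeff * potential_transport
```

With a relocation coefficient of 0.65, inside the reported 0.4-0.9 range, actual transport is 65% of the potential total.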
Rapid SAW Sensor Development Tools
NASA Technical Reports Server (NTRS)
Wilson, William C.; Atkinson, Gary M.
2007-01-01
The lack of integrated design tools for Surface Acoustic Wave (SAW) devices has led us to develop tools for the design, modeling, analysis, and automatic layout generation of SAW devices. These tools enable rapid development of wireless SAW sensors. The tools developed have been designed to integrate into existing Electronic Design Automation (EDA) tools to take advantage of existing 3D modeling, and Finite Element Analysis (FEA). This paper presents the SAW design, modeling, analysis, and automated layout generation tools.
AFTI/F16 Automated Maneuvering Attack System Test Reports/Special Technologies and Outlook.
1986-07-11
Acronyms: A-A, Air-to-Air; A-S, Air-to-Surface; AFTI, Advanced Fighter Technology Integration; AGL, Above-Ground-Level; AMAS, Automated Maneuvering Attack System; AMUX, Avionics Multiplex Data Bus; MUX, Multiplex Data Bus. SYSTEM DESIGN: Design requirements for the AFTI/F-16 Automated Maneuvering Attack System are driven by realistic air combat scenarios.
Race, Caitlin M.; Kwon, Lydia E.; Foreman, Myles T.; ...
2017-11-24
Here, we report on the implementation of an automated platform for detecting the presence of an antibody biomarker for human papillomavirus-associated oropharyngeal cancer from a single droplet of serum, in which a nanostructured photonic crystal surface is used to amplify the output of a fluorescence-linked immunosorbent assay. The platform is comprised of a microfluidic cartridge with integrated photonic crystal chips that interfaces with an assay instrument that automates the introduction of reagents, wash steps, and surface drying. Upon assay completion, the cartridge interfaces with a custom laser-scanning instrument that couples light into the photonic crystal at the optimal resonance condition for fluorescence enhancement. The instrument is used to measure the fluorescence intensity values of microarray spots corresponding to the biomarkers of interest, in addition to several experimental controls that verify correct functioning of the assay protocol. In this work, we report both dose-response characterization of the system using anti-E7 antibody introduced at known concentrations into serum and characterization of a set of clinical samples from which results were compared with a conventional enzyme-linked immunosorbent assay (ELISA) performed in microplate format. Finally, the demonstrated capability represents a simple, rapid, automated, and high-sensitivity method for multiplexed detection of protein biomarkers from a low-volume test sample.
J plots: a new method for characterizing structures in the interstellar medium
NASA Astrophysics Data System (ADS)
Jaffa, S. E.; Whitworth, A. P.; Clarke, S. D.; Howard, A. D. P.
2018-06-01
Large-scale surveys have brought about a revolution in astronomy. To analyse the resulting wealth of data, we need automated tools to identify, classify, and quantify the important underlying structures. We present here a method for classifying and quantifying a pixelated structure, based on its principal moments of inertia. The method enables us to automatically detect, and objectively compare, centrally condensed cores, elongated filaments, and hollow rings. We illustrate the method by applying it to (i) observations of surface density from Hi-GAL, and (ii) simulations of filament growth in a turbulent medium. We limit the discussion here to 2D data; in a future paper, we will extend the method to 3D data.
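The moment-based classification can be sketched as below, following the published definition J_i = (I_i - I0)/(I_i + I0), where I1 >= I2 are the principal second moments of the pixel mass distribution and I0 = AM/4pi is the moment of a uniform disc of the same area A and mass M; the handling of pixel weights here is a simplification.

```python
import numpy as np

def j_moments(mask, weights=None):
    """Compute the (J1, J2) shape parameters of a pixelated structure
    from its principal moments of inertia."""
    ys, xs = np.nonzero(mask)
    m = np.ones(xs.size) if weights is None else weights[ys, xs]
    M = m.sum()                          # total mass
    A = xs.size                          # area in pixels
    xc, yc = (m * xs).sum() / M, (m * ys).sum() / M
    dx, dy = xs - xc, ys - yc
    # 2x2 inertia tensor about the centre of mass
    Ixx = (m * dy**2).sum()
    Iyy = (m * dx**2).sum()
    Ixy = -(m * dx * dy).sum()
    I2, I1 = np.sort(np.linalg.eigvalsh(np.array([[Ixx, Ixy], [Ixy, Iyy]])))
    I0 = A * M / (4 * np.pi)             # uniform disc of same area and mass
    return (I1 - I0) / (I1 + I0), (I2 - I0) / (I2 + I0)
```

A uniform disc gives J1 and J2 near 0, an elongated filament gives J1 > 0 with J2 < 0, a centrally condensed core pushes both negative, and a hollow ring pushes both positive.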
Satellite Calibration With LED Detectors at Mud Lake
NASA Technical Reports Server (NTRS)
Hiller, Jonathan D.
2005-01-01
Earth-monitoring instruments in orbit must be routinely calibrated in order to accurately analyze the data obtained. By comparing radiometric measurements taken on the ground in conjunction with a satellite overpass, calibration curves are derived for an orbiting instrument. A permanent, automated facility is planned for Mud Lake, Nevada (a large, homogeneous, dry lakebed) for this purpose. Because some orbiting instruments have low resolution (250 meters per pixel), inexpensive radiometers using LEDs as sensors are being developed to be arrayed widely over the lakebed. LEDs are ideal because they are inexpensive, reliable, and sense over a narrow bandwidth. By obtaining and averaging widespread data, errors are reduced and long-term surface changes can be more accurately observed.
Human Systems Integration and Automation Issues in Small Unmanned Aerial Vehicles
NASA Technical Reports Server (NTRS)
McCauley, Michael E.; Matsangas, Panagiotis
2004-01-01
The goal of this report is to identify Human Systems Integration (HSI) and automation issues that contribute to improved effectiveness and efficiency in the operation of U.S. military Small Unmanned Aerial Vehicles (SUAVs). HSI issues relevant to SUAV operations are reviewed and observations from field trials are summarized. Short-term improvements are suggested, research issues are identified, and an overview is provided of automation technologies applicable to future SUAV design.
Automating data analysis during the inspection of boiler tubes using line scanning thermography
NASA Astrophysics Data System (ADS)
Ley, Obdulia; Momeni, Sepand; Ostroff, Jason; Godinez, Valery
2012-05-01
Failures in boiler waterwalls can occur when a relatively small amount of corrosion and loss of metal has been experienced. This study presents our efforts toward applying Line Scanning Thermography (LST) to the analysis of thinning in boiler waterwall tubing. LST utilizes a line heat source to thermally excite the surface to be inspected and an infrared detector to record the transient surface temperature increase observed due to the presence of voids, thinning, or other defects. In waterwall boiler tubes, the defects that can be detected using LST include corrosion pitting, hydrogen damage, and wall thinning produced by inadequate burner heating or problems with the water chemistry. In this paper we discuss how the LST technique is implemented to determine thickness from the surface temperature data, and we describe our efforts toward developing a semiautomatic analysis tool to shorten the time between scanning, reporting, and implementing repairs. We compare the density of data produced by the common techniques used to assess wall thickness with that produced by LST.
UAS Reports (UREPs): Enabling Exchange of Observation Data Between UAS Operations
NASA Technical Reports Server (NTRS)
Rios, Joseph; Smith, David; Smith, Irene
2017-01-01
As the volume of small unmanned aircraft systems (UAS) operations increases, the lack of weather products to support these operations becomes more problematic. One early solution to obtaining more information about weather conditions is to allow operators to share their observations and measurements with other airspace users. This is analogous to the AIREP and PIREP reporting systems in traditional aviation wherein pilots report weather phenomena they have observed or experienced to provide better situational awareness to other pilots. Given the automated nature of the small (under 55 lbs.) UAS platforms and operations, automated reporting of relevant information should also be supported. To promote automated exchange of these data, a well-defined data schema needs to be established along with the mechanisms for sending and retrieving the data. This paper examines this concept and offers an initial definition of the necessary elements to allow for immediate implementation and use.
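A well-defined data schema of the kind the paper calls for might be modeled as a dataclass that serializes cleanly for exchange; every field name here is a hypothetical illustration, not the schema the authors define.

```python
import json
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class UasReport:
    """Hypothetical UREP-style weather observation record
    (field names are illustrative, not the proposed standard)."""
    operator_id: str
    timestamp_utc: str                      # ISO 8601, e.g. "2017-06-01T12:00:00Z"
    lat_deg: float
    lon_deg: float
    alt_m_agl: float                        # altitude above ground level, meters
    wind_speed_mps: Optional[float] = None  # optional sensed quantities
    wind_dir_deg: Optional[float] = None
    temperature_c: Optional[float] = None

    def to_json(self) -> str:
        """Serialize for automated exchange between operators."""
        return json.dumps(asdict(self))
```

Optional fields let a platform report only what its sensors actually measured, mirroring the partial nature of PIREP-style observations.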
Interpreting diel hysteresis between soil respiration and temperature
C. Phillips; N. Nickerson; D. Risk; B.J. Bond
2011-01-01
Increasing use of automated soil respiration chambers in recent years has demonstrated complex diel relationships between soil respiration and temperature that are not apparent from less frequent measurements. Soil surface flux is often lagged from soil temperature by several hours, which results in semielliptical hysteresis loops when surface flux is plotted as a...
40 CFR 63.4482 - What parts of my plant does this subpart cover?
Code of Federal Regulations, 2010 CFR
2010-07-01
... CATEGORIES National Emission Standards for Hazardous Air Pollutants for Surface Coating of Plastic Parts and... paragraphs (b)(1) through (4) of this section that are used for surface coating of plastic parts and products... automated equipment and containers used for conveying waste materials generated by a coating operation. (c...
A MODIS-based automated flood monitoring system for southeast asia
NASA Astrophysics Data System (ADS)
Ahamed, A.; Bolten, J. D.
2017-09-01
Flood disasters in Southeast Asia result in significant loss of life and economic damage. Remote sensing information systems designed to spatially and temporally monitor floods can help governments and international agencies formulate effective disaster response strategies during a flood and ultimately alleviate impacts to population, infrastructure, and agriculture. Recent destructive flood events in the Lower Mekong River Basin occurred in 2000, 2011, 2013, and 2016 (http://ffw.mrcmekong.org/historical_rec.htm, April 24, 2017). The large spatial distribution of flooded areas and the lack of proper gauge data in the region make accurate monitoring and assessment of flood impacts difficult. Here, we discuss the utility of applying satellite-based Earth observations for improving flood inundation monitoring over the flood-prone Lower Mekong River Basin. We present a methodology for determining near real-time surface water extent associated with current and historic flood events by training surface water classifiers from 8-day, 250-m Moderate-resolution Imaging Spectroradiometer (MODIS) data spanning the length of the MODIS satellite record. The Normalized Difference Vegetation Index (NDVI) signature of permanent water bodies (MOD44W; Carroll et al., 2009) is used to train surface water classifiers, which are then applied to a time period of interest. From this, an operational nowcast flood detection component is produced using twice-daily imagery acquired at 3-h latency, with image compositing routines to minimize cloud cover. Case studies and accuracy assessments against radar-based observations for historic flood events are presented. The customizable system has been transferred to regional organizations, and near real-time derived surface water products are made available through a web interface platform.
Results highlight the potential of near real-time observation and impact assessment systems to serve as effective decision support tools for governments, international agencies, and disaster responders.
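The core idea above — use the NDVI signature of known permanent water to train a classifier, then apply it scene-wide — can be sketched in a few lines. This is a simplified illustration under the assumption of a plain NDVI percentile threshold, not the authors' operational classifier.

```python
# Minimal sketch (not the authors' code) of training an NDVI water classifier
# from a permanent-water mask, in the spirit of the MOD44W-based approach:
# water is dark in the near-infrared, so low NDVI flags inundation.
import numpy as np

def train_water_threshold(ndvi, water_mask, pct=95):
    """Choose an NDVI cutoff as the pct-th percentile over known water pixels."""
    return np.percentile(ndvi[water_mask], pct)

def classify_water(ndvi, threshold):
    """Pixels at or below the trained NDVI cutoff are labelled water."""
    return ndvi <= threshold

# Toy scene: ~30% permanent water (NDVI near -0.1), the rest land (NDVI ~0.6).
rng = np.random.default_rng(0)
ndvi = np.where(rng.random((50, 50)) < 0.3,
                rng.normal(-0.1, 0.05, (50, 50)),
                rng.normal(0.6, 0.05, (50, 50)))
water_mask = ndvi < 0.0          # stand-in for a MOD44W-style permanent mask
thr = train_water_threshold(ndvi, water_mask)
flood_map = classify_water(ndvi, thr)
```

In the operational system this per-scene classification would then feed the compositing and nowcast steps described in the abstract.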
GIS-based automated management of highway surface crack inspection system
NASA Astrophysics Data System (ADS)
Chung, Hung-Chi; Shinozuka, Masanobu; Soeller, Tony; Girardello, Roberto
2004-07-01
An automated in-situ road surface distress surveying and management system, AMPIS, has been developed on the basis of video images within the framework of GIS software. Video image processing techniques are introduced to acquire, process and analyze the road surface images obtained from a moving vehicle. ArcGIS platform is used to integrate the routines of image processing and spatial analysis in handling the full-scale metropolitan highway surface distress detection and data fusion/management. This makes it possible to present user-friendly interfaces in GIS and to provide efficient visualizations of surveyed results not only for the use of transportation engineers to manage road surveying documentations, data acquisition, analysis and management, but also for financial officials to plan maintenance and repair programs and further evaluate the socio-economic impacts of highway degradation and deterioration. A review performed in this study on fundamental principle of Pavement Management System (PMS) and its implementation indicates that the proposed approach of using GIS concept and its tools for PMS application will reshape PMS into a new information technology-based system that can provide convenient and efficient pavement inspection and management.
An Automated Road Roughness Detection from Mobile Laser Scanning Data
NASA Astrophysics Data System (ADS)
Kumar, P.; Angelats, E.
2017-05-01
Rough roads influence the safety of the road users as accident rate increases with increasing unevenness of the road surface. Road roughness regions are required to be efficiently detected and located in order to ensure their maintenance. Mobile Laser Scanning (MLS) systems provide a rapid and cost-effective alternative by providing accurate and dense point cloud data along route corridor. In this paper, an automated algorithm is presented for detecting road roughness from MLS data. The presented algorithm is based on interpolating smooth intensity raster surface from LiDAR point cloud data using point thinning process. The interpolated surface is further processed using morphological and multi-level Otsu thresholding operations to identify candidate road roughness regions. The candidate regions are finally filtered based on spatial density and standard deviation of elevation criteria to detect the roughness along the road surface. The test results of road roughness detection algorithm on two road sections are presented. The developed approach can be used to provide comprehensive information to road authorities in order to schedule maintenance and ensure maximum safety conditions for road users.
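The multi-level Otsu step in the pipeline above picks intensity cut points that best separate the raster into classes. A brute-force two-threshold version is sketched below as an illustration of the idea; it is not the authors' implementation, and real pipelines typically use an optimised histogram-based routine.

```python
# Illustrative sketch of two-level (three-class) Otsu thresholding on an
# intensity raster: exhaustively choose two cut points minimising the total
# within-class variance, then keep the darkest class as candidate roughness.
# NOT the paper's implementation; a toy exhaustive search for clarity.
import numpy as np

def two_level_otsu(values, levels=64):
    """Return (t1, t2) minimising total within-class variance over 3 classes."""
    edges = np.linspace(values.min(), values.max(), levels)
    best, best_cost = (edges[1], edges[2]), np.inf
    for i in range(1, levels - 1):
        for j in range(i + 1, levels - 1):
            t1, t2 = edges[i], edges[j]
            cost = 0.0
            for cls in (values[values <= t1],
                        values[(values > t1) & (values <= t2)],
                        values[values > t2]):
                if cls.size:
                    cost += cls.size * cls.var()
            if cost < best_cost:
                best_cost, best = cost, (t1, t2)
    return best

# Toy raster with three well-separated intensity populations.
rng = np.random.default_rng(2)
raster = np.concatenate([rng.normal(0.1, 0.02, 300),   # dark = rough candidates
                         rng.normal(0.5, 0.02, 300),
                         rng.normal(0.9, 0.02, 300)])
t1, t2 = two_level_otsu(raster)
candidates = raster <= t1
```

The candidate pixels would then be filtered by the spatial-density and elevation-standard-deviation criteria the abstract describes.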
Development and validation of satellite-based estimates of surface visibility
NASA Astrophysics Data System (ADS)
Brunner, J.; Pierce, R. B.; Lenzen, A.
2016-02-01
A satellite-based surface visibility retrieval has been developed using Moderate Resolution Imaging Spectroradiometer (MODIS) measurements as a proxy for Advanced Baseline Imager (ABI) data from the next generation of Geostationary Operational Environmental Satellites (GOES-R). The retrieval uses a multiple linear regression approach to relate satellite aerosol optical depth, fog/low cloud probability and thickness retrievals, and meteorological variables from numerical weather prediction forecasts to National Weather Service Automated Surface Observing System (ASOS) surface visibility measurements. Validation using independent ASOS measurements shows that the GOES-R ABI surface visibility retrieval (V) has an overall success rate of 64.5 % for classifying clear (V ≥ 30 km), moderate (10 km ≤ V < 30 km), low (2 km ≤ V < 10 km), and poor (V < 2 km) visibilities and shows the most skill during June through September, when Heidke skill scores are between 0.2 and 0.4. We demonstrate that the aerosol (clear-sky) component of the GOES-R ABI visibility retrieval can be used to augment measurements from the United States Environmental Protection Agency (EPA) and National Park Service (NPS) Interagency Monitoring of Protected Visual Environments (IMPROVE) network and provide useful information to the regional planning offices responsible for developing mitigation strategies required under the EPA's Regional Haze Rule, particularly during regional haze events associated with smoke from wildfires.
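The four-class binning and the Heidke skill score used in the validation above are straightforward to express in code. The bin edges come from the abstract; the confusion-matrix values below are invented purely to demonstrate the computation.

```python
# Sketch of the 4-class visibility binning from the validation above, plus the
# Heidke skill score computed from a confusion matrix of retrieved vs.
# ASOS-observed classes. The example matrix is invented, not the paper's data.
import numpy as np

def visibility_class(v_km):
    """Bin visibility (km) into poor / low / moderate / clear per the paper."""
    if v_km < 2:
        return "poor"
    if v_km < 10:
        return "low"
    if v_km < 30:
        return "moderate"
    return "clear"

def heidke_skill_score(confusion):
    """HSS = (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = confusion.sum()
    p_obs = np.trace(confusion) / n
    p_chance = (confusion.sum(axis=0) * confusion.sum(axis=1)).sum() / n**2
    return (p_obs - p_chance) / (1 - p_chance)

# Toy 2x2 example: 80 of 100 cases classified correctly.
conf = np.array([[45, 10],
                 [10, 35]])
print(round(heidke_skill_score(conf), 3))  # → 0.596
```

A score of 0 means no skill beyond chance and 1 means perfect classification, which is the sense in which the paper's June–September scores of 0.2–0.4 indicate modest skill.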
Using deep neural networks to augment NIF post-shot analysis
NASA Astrophysics Data System (ADS)
Humbird, Kelli; Peterson, Luc; McClarren, Ryan; Field, John; Gaffney, Jim; Kruse, Michael; Nora, Ryan; Spears, Brian
2017-10-01
Post-shot analysis of National Ignition Facility (NIF) experiments is the process of determining which simulation inputs yield results consistent with experimental observations. This analysis is typically accomplished by running suites of manually adjusted simulations, or Monte Carlo sampling surrogate models that approximate the response surfaces of the physics code. These approaches are expensive and often find simulations that match only a small subset of observables simultaneously. We demonstrate an alternative method for performing post-shot analysis using inverse models, which map directly from experimental observables to simulation inputs with quantified uncertainties. The models are created using a novel machine learning algorithm which automates the construction and initialization of deep neural networks to optimize predictive accuracy. We show how these neural networks, trained on large databases of post-shot simulations, can rigorously quantify the agreement between simulation and experiment. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
Texas Automated Buoy System 1995-2005 and Beyond
NASA Astrophysics Data System (ADS)
Guinasso, N. L.; Bender, L. C.; Walpert, J. N.; Lee, L. L.; Campbell, L.; Hetland, R. D.; Howard, M. K.; Martin, R. D.
2005-05-01
TABS was established in 1995 to provide data for assessing oil spill movement along the Texas coast for the Texas General Land Office Oil Spill Prevention and Response Program. A system of nine automated buoys provides wind and current data in near real time. Two of these buoys are supported by the Flower Garden Banks Joint Industry Program. A TABS web site provides a public interface to view and download the data, and a real-time data analysis web page presents a wide variety of useful data products derived from the field measurements. Integration efforts now underway include transfer of buoy data to the National Data Buoy Center for quality control and incorporation into the Global Telecommunications Stream. The TGLO ocean circulation nowcast/forecast modeling system has been in continuous operation since 1998. Two models, POM and ROMS, are used to produce forecasts of near-surface wind-driven currents up to 48 hours into the future. Both models are driven with wind fields obtained from the NAM (formerly Eta) forecast models operated by NOAA NCEP. Wind and current fields are displayed on websites in both static and animated forms and are updated four times per day. Under funding from the SURA/SCOOP program we are: (1) revamping the system to conform with the evolving Data Management and Communications (DMAC) framework adopted by the NSF Orion and OCEAN.US IOOS programs, (2) producing model-data comparisons, and (3) integrating the wind and current fields into the GNOME oil trajectory model used by NOAA/Hazmat. Academic research is planned to assimilate near real-time observations from TABS buoys and some 30-40 ADCP instruments scheduled to be mounted on offshore oil platforms in early 2005. The Texas Automated Buoy System (TABS) and its associated modeling efforts provide a reliable source of accurate, up-to-date information on currents along the Texas coast.
As the nation embarks on the development of an Integrated Ocean Observing System (IOOS), TABS will be an active participant as a foundational regional component to the national backbone of ocean observations.
Temporally-resolved Study of Atmosphere-lake Net CO2 Exchange at Lochaber Lake, Nova Scotia, Canada
NASA Astrophysics Data System (ADS)
Spafford, L. A.; Risk, D. A.
2016-12-01
Lakes are carbon gateways with immense processing capacity, acting as either sinks or sources for CO2. As climate change exacerbates weather extremes, carbon stored within permafrost and soils is liberated to water systems, altering aquatic carbon budgets and light availability for photosynthesis. The functional response of lakes to climate change is uncertain, and continuous data on lake respiration and its drivers are lacking. This study used a novel technique to make high-frequency measurements of CO2 exchange over a growing season, quantifying the net flux of carbon at a small, deep, oligotrophic lake in eastern Nova Scotia, Canada, and examining the influence of environmental forcings. We installed 3 floating Forced Diffusion dynamic membrane chambers on the lake, coupled to a valving multiplexer and a single Vaisala GMP 343 CO2 analyzer. This low-power system sampled lake-atmosphere CO2 exchange at several points from shore every hour for over 100 days in the growing season. At the same frequency we also collected automated measurements of wind velocity, photosynthetically active radiation (PAR), dissolved CO2, and air and water temperature. Manual measurement campaigns measured chlorophyll `a', DOC, surface methane (CH4), and CO2 flux by manual static floating chamber to confirm the automated measurements. The lake was a net source of carbon, on average emitting 0.038 µmol CO2/m2/s, or 4.967 g CO2/s over the entire lake, but we observed significant temporal variation across diel cycles and with changing weather. Approximately 48 hours after every rain event, we observed an increase in littoral CO2 release by the lake. Wind speed, air temperature, and distance from shore were also drivers of variation, as the littoral zone tended to release less CO2 during the course of our study. This work shows the variable influence of environmental drivers on lake carbon flux, as well as the utility of low-power automated chambers for observing aquatic net CO2 exchange.
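The two flux figures quoted above are mutually consistent, which a quick unit conversion confirms. The lake area below is inferred from those two numbers, not a figure stated in the abstract.

```python
# Quick consistency check of the reported fluxes: converting the mean per-area
# flux (0.038 µmol CO2 m^-2 s^-1) into the whole-lake rate (4.967 g CO2 s^-1)
# implies a lake surface area of roughly 3 km^2 (an inference from the two
# quoted numbers, not a value given in the abstract).
MOLAR_MASS_CO2 = 44.01                        # g/mol
flux_umol = 0.038                             # µmol CO2 m^-2 s^-1
flux_g = flux_umol * 1e-6 * MOLAR_MASS_CO2    # g CO2 m^-2 s^-1
lake_total = 4.967                            # g CO2 s^-1 over the whole lake
implied_area_km2 = lake_total / flux_g / 1e6  # m^2 → km^2
print(f"{implied_area_km2:.2f} km^2")         # → 2.97 km^2
```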
Observation of sea-ice dynamics using synthetic aperture radar images: Automated analysis
NASA Technical Reports Server (NTRS)
Vesecky, John F.; Samadani, Ramin; Smith, Martha P.; Daida, Jason M.; Bracewell, Ronald N.
1988-01-01
The European Space Agency's ERS-1 satellite, as well as others planned to follow, is expected to carry synthetic-aperture radars (SARs) over the polar regions beginning in 1989. A key component in utilization of these SAR data is an automated scheme for extracting the sea-ice velocity field from a time sequence of SAR images of the same geographical region. Two techniques for automated sea-ice tracking, image pyramid area correlation (hierarchical correlation) and feature tracking, are described. Each technique is applied to a pair of Seasat SAR sea-ice images. The results compare well with each other and with manually tracked estimates of the ice velocity. The advantages and disadvantages of these automated methods are pointed out. Using these ice velocity field estimates it is possible to construct one sea-ice image from the other member of the pair. Comparing the reconstructed image with the observed image, errors in the estimated velocity field can be recognized and a useful probable error display created automatically to accompany ice velocity estimates. It is suggested that this error display may be useful in segmenting the sea ice observed into regions that move as rigid plates of significant ice velocity shear and distortion.
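The area-correlation idea at the heart of the automated tracker — match a block of the first image against shifted windows of the second and take the offset with the highest correlation — can be shown with a toy example. This sketch omits the paper's pyramid (hierarchical) refinement and feature tracking.

```python
# Toy sketch of area-correlation ice tracking: for a block in image 1, find
# the offset in image 2 whose normalised cross-correlation is highest.
# The hierarchical/pyramid refinement and feature tracking of the paper are
# omitted for brevity; this is a single-level exhaustive search.
import numpy as np

def best_offset(block, search, max_shift):
    """Exhaustively correlate `block` against shifted windows of `search`."""
    h, w = block.shape
    b = (block - block.mean()) / (block.std() + 1e-12)
    best, best_score = (0, 0), -np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            win = search[max_shift + dy:max_shift + dy + h,
                         max_shift + dx:max_shift + dx + w]
            wn = (win - win.mean()) / (win.std() + 1e-12)
            score = (b * wn).mean()      # normalised cross-correlation
            if score > best_score:
                best_score, best = score, (dy, dx)
    return best

# Synthetic pair: shift a random ice texture by (2, -1) and recover the motion.
rng = np.random.default_rng(1)
scene = rng.random((40, 40))
m = 4                                    # search radius in pixels
block = scene[10:26, 10:26]              # 16x16 block from "image 1"
shifted = np.roll(np.roll(scene, 2, axis=0), -1, axis=1)   # "image 2"
search = shifted[10 - m:26 + m, 10 - m:26 + m]
print(best_offset(block, search, m))
```

Repeating this for a grid of blocks yields the ice velocity field; reconstructing one image from the other with that field then exposes the error regions the paper proposes to display.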
Validation of an automated fluorescein method for determining bromide in water
Fishman, M. J.; Schroder, L.J.; Friedman, L.C.
1985-01-01
Surface, atmospheric precipitation and deionized water samples were spiked with µg l-1 concentrations of bromide, and the solutions stored in polyethylene and polytetrafluoroethylene bottles. Bromide was determined periodically for 30 days. Automated fluorescein and ion chromatography methods were used to determine bromide in these prepared samples. Analysis of the data by the paired t-test indicates that the two methods are not significantly different at a probability of 95% for samples containing from 0.015 to 0.5 mg l-1 of bromide. The correlation coefficient for the same sets of paired data is 0.9987. Recovery data, except for the surface water samples to which 0.005 mg l-1 of bromide was added, range from 89 to 112%. There appears to be no loss of bromide from solution in either type of container.
Lupidi, Marco; Coscas, Florence; Cagini, Carlo; Fiore, Tito; Spaccini, Elisa; Fruttini, Daniela; Coscas, Gabriel
2016-09-01
To describe a new automated quantitative technique for displaying and analyzing macular vascular perfusion using optical coherence tomography angiography (OCT-A) and to determine a normative data set, which might be used as a reference in identifying progressive changes due to different retinal vascular diseases. Reliability study. A retrospective review of 47 eyes of 47 consecutive healthy subjects imaged with a spectral-domain OCT-A device was performed in a single institution. Full-spectrum amplitude-decorrelation angiography generated OCT angiograms of the retinal superficial and deep capillary plexuses. Fully automated, custom-built software was used to provide quantitative data on the foveal avascular zone (FAZ) features and the total vascular and avascular surfaces. A comparative analysis between central macular thickness (and volume) and FAZ metrics was performed. Repeatability and reproducibility were also assessed in order to establish the feasibility and reliability of the method. The comparative analysis between the superficial capillary plexus and the deep capillary plexus revealed a statistically significant difference (P < .05) in FAZ perimeter, surface, and major axis, and no statistically significant difference (P > .05) in total vascular and avascular surfaces. A linear correlation was demonstrated between central macular thickness (and volume) and the FAZ surface. Coefficients of repeatability and reproducibility were less than 0.4, demonstrating high intraobserver repeatability and interobserver reproducibility for all the examined data. A quantitative approach to retinal vascular perfusion, which is visible on Spectralis OCT angiography, may offer an objective and reliable method for monitoring disease progression in several retinal vascular diseases. Copyright © 2016 Elsevier Inc. All rights reserved.
Using Automated Scores of Student Essays to Support Teacher Guidance in Classroom Inquiry
NASA Astrophysics Data System (ADS)
Gerard, Libby F.; Linn, Marcia C.
2016-02-01
Computer scoring of student written essays about an inquiry topic can be used to diagnose student progress both to alert teachers to struggling students and to generate automated guidance. We identify promising ways for teachers to add value to automated guidance to improve student learning. Three teachers from two schools and their 386 students participated. We draw on evidence from student progress, observations of how teachers interact with students, and reactions of teachers. The findings suggest that alerts for teachers prompted rich teacher-student conversations about energy in photosynthesis. In one school, the combination of the automated guidance plus teacher guidance was more effective for student science learning than two rounds of personalized, automated guidance. In the other school, both approaches resulted in equal learning gains. These findings suggest optimal combinations of automated guidance and teacher guidance to support students to revise explanations during inquiry and build integrated understanding of science.
De Tobel, J; Radesh, P; Vandermeulen, D; Thevissen, P W
2017-12-01
Automated methods to evaluate growth of hand and wrist bones on radiographs and magnetic resonance imaging have been developed. They can be applied to estimate age in children and subadults. Automated methods require the software to (1) recognise the region of interest in the image(s), (2) evaluate the degree of development and (3) correlate this to the age of the subject based on a reference population. For age estimation based on third molars, an automated method for step (1) has been presented for 3D magnetic resonance imaging and is currently being optimised (Unterpirker et al. 2015). The aim of this study was to develop an automated method for step (2) based on lower third molars on panoramic radiographs. A modified Demirjian staging technique including ten developmental stages was developed. Twenty panoramic radiographs per stage per gender were retrospectively selected for FDI element 38. Two observers decided on the stages in consensus. When necessary, a third observer acted as a referee to establish the reference stage for the considered third molar. This set of radiographs was used as training data for machine learning algorithms for automated staging. First, image contrast settings were optimised to evaluate the third molar of interest, and a rectangular bounding box was placed around it in a standardised way using Adobe Photoshop CC 2017 software. This bounding box indicated the region of interest for the next step. Second, several machine learning algorithms available in MATLAB R2017a software were applied for automated stage recognition. Third, the classification performance was evaluated in a 5-fold cross-validation scenario, using different validation metrics (accuracy, Rank-N recognition rate, mean absolute difference, linear kappa coefficient). Transfer learning, as a type of deep convolutional neural network approach, outperformed all other tested approaches.
Mean accuracy equalled 0.51, mean absolute difference was 0.6 stages and mean linearly weighted kappa was 0.82. The overall performance of the presented automated pilot technique to stage lower third molar development on panoramic radiographs was similar to staging by human observers. It will be further optimised in future research, since it represents a necessary step to achieve a fully automated dental age estimation method, which to date is not available.
NASA Astrophysics Data System (ADS)
Sopaheluwakan, Ardhasena; Fajariana, Yuaning; Satyaningsih, Ratna; Aprilina, Kharisma; Astuti Nuraini, Tri; Ummiyatul Badriyah, Imelda; Lukita Sari, Dyah; Haryoko, Urip
2017-04-01
Inhomogeneities are often found in long records of climate data. These can occur for various reasons, such as relocation of the observation site, changes in observation method, and the transition to automated instruments. Changes to these automated systems are inevitable, and they are taking place worldwide in many National Meteorological Services. However, this shift in observational practice must be done cautiously, and a sufficient period of parallel observation of co-located manual and automated systems should take place, as suggested by the World Meteorological Organization. With a sufficient parallel observation period, biases between the two systems can be analyzed. In this study we analyze the biases from a yearlong parallel observation of manual and automatic weather stations at 30 locations in Indonesia. The sites span approximately 45 degrees of longitude from east to west, covering different climate characteristics and geographical settings. We study measurements taken by both systems for temperature and rainfall. We found that the biases between the two systems vary from place to place and depend more on the setting of the instrument than on climatic and geographical factors. For instance, daytime observations from the automatic weather stations are consistently higher than the manual observations, while night-time observations from the automatic weather stations are lower than the manual observations.
Surface-enhanced Raman scattering (SERS) dosimeter and probe
Vo-Dinh, T.
1995-03-21
A dosimeter and probe for measuring exposure to chemical and biological compounds is disclosed. The dosimeter or probe includes a collector which may be analyzed by surface-enhanced Raman spectroscopy. The collector comprises a surface-enhanced Raman scattering-active material with a coating applied to improve its adsorption properties. The collector may also be used in automated sequential devices and in probe array devices. 10 figures.
Manufacture of a human mesenchymal stem cell population using an automated cell culture platform.
Thomas, Robert James; Chandra, Amit; Liu, Yang; Hourd, Paul C; Conway, Paul P; Williams, David J
2007-09-01
Tissue engineering and regenerative medicine are rapidly developing fields that use cells or cell-based constructs as therapeutic products for a wide range of clinical applications. Efforts to commercialise these therapies are driving a need for capable, scaleable, manufacturing technologies to ensure therapies are able to meet regulatory requirements and are economically viable at industrial scale production. We report the first automated expansion of a human bone marrow derived mesenchymal stem cell population (hMSCs) using a fully automated cell culture platform. Differences in cell population growth profile, attributed to key methodological differences, were observed between the automated protocol and a benchmark manual protocol. However, qualitatively similar cell output, assessed by cell morphology and the expression of typical hMSC markers, was obtained from both systems. Furthermore, the critical importance of minor process variation, e.g. the effect of cell seeding density on characteristics such as population growth kinetics and cell phenotype, was observed irrespective of protocol type. This work highlights the importance of careful process design in therapeutic cell manufacture and demonstrates the potential of automated culture for future optimisation and scale up studies required for the translation of regenerative medicine products from the laboratory to the clinic.
Larrabide, Ignacio; Cruz Villa-Uriol, Maria; Cárdenes, Rubén; Pozo, Jose Maria; Macho, Juan; San Roman, Luis; Blasco, Jordi; Vivas, Elio; Marzo, Alberto; Hose, D Rod; Frangi, Alejandro F
2011-05-01
Morphological descriptors are practical and essential biomarkers for diagnosis and treatment selection in intracranial aneurysm management according to the current guidelines in use. Nevertheless, relatively little work has been dedicated to improving the three-dimensional quantification of aneurysmal morphology, to automating the analysis, and hence to reducing the inherent intra- and interobserver variability of manual analysis. In this paper we propose a methodology for the automated isolation and morphological quantification of saccular intracranial aneurysms based on a 3D representation of the vascular anatomy. This methodology is based on the analysis of the topology of the vasculature's skeleton and the subsequent application of concepts from deformable cylinders. These are expanded inside the parent vessel to identify different regions and discriminate the aneurysm sac from the parent vessel wall. The method renders as output the surface representation of the isolated aneurysm sac, which can then be quantified automatically. The proposed method provides the means for identifying the aneurysm neck in a deterministic way. The results obtained by the method were assessed in two ways: they were compared to manual measurements obtained by three independent clinicians, as normally done during diagnosis, and to automated measurements from manually isolated aneurysms by three independent operators, nonclinicians, experts in vascular image analysis. All the measurements were obtained using in-house tools. The results were qualitatively and quantitatively compared for a set of saccular intracranial aneurysms (n = 26). Measurements performed on a synthetic phantom showed that the automated measurements obtained from manually isolated aneurysms were the most accurate. The differences between the measurements obtained by the clinicians and the manually isolated sacs were statistically significant (neck width: p < 0.001, sac height: p = 0.002).
When comparing clinicians' measurements to automatically isolated sacs, only the differences for the neck width were significant (neck width: p < 0.001, sac height: p = 0.95). However, correlation and agreement between the measurements obtained from manually and automatically isolated aneurysms were found (neck width: p = 0.43, sac height: p = 0.95). The proposed method allows the automated isolation of intracranial aneurysms, eliminating interobserver variability. On average, the computational cost of the automated method (2 min 36 s) was similar to the time required by a manual operator (measurement by clinicians: 2 min 51 s, manual isolation: 2 min 21 s) but eliminates human interaction. The automated measurements are independent of the viewing angle, eliminating any bias or difference between observer criteria. Finally, the qualitative assessment of the results showed acceptable agreement between manually and automatically isolated aneurysms.
First On-Site Data Analysis System for Subaru/Suprime-Cam
NASA Astrophysics Data System (ADS)
Furusawa, Hisanori; Okura, Yuki; Mineo, Sogo; Takata, Tadafumi; Nakata, Fumiaki; Tanaka, Manobu; Katayama, Nobuhiko; Itoh, Ryosuke; Yasuda, Naoki; Miyazaki, Satoshi; Komiyama, Yutaka; Utsumi, Yousuke; Uchida, Tomohisa; Aihara, Hiroaki
2011-03-01
We developed an automated on-site quick-analysis system for mosaic CCD data from Suprime-Cam, a wide-field camera mounted at the prime focus of the Subaru Telescope, Mauna Kea, Hawaii. The first version of the data-analysis system was constructed and has begun operating in general observations. This system is a new observing-support function at the Subaru Telescope that provides the Subaru user community with automated on-site data evaluation, aiming to improve observers' productivity, especially in large imaging surveys. The new system assists data-evaluation tasks by continuously monitoring the characteristics of every data frame during observations. The evaluation results and the data frames processed by this system are also useful for reducing the data-processing time in a full analysis after an observation. The primary analysis functions implemented in the system comprise automated real-time analysis for data evaluation and on-demand analysis, executed upon request, including mosaicing analysis and flat-making analysis. In data evaluation, which is controlled by the organizing software, the database keeps track of the analysis histories as well as the evaluated values of data frames, including seeing and sky background levels; it also helps in the selection of frames for mosaicing and flat-making analysis. We examined the system performance and confirmed an improvement in data-processing time by a factor of 9 with the aid of distributed parallel data processing and in-memory data processing, which makes the automated data evaluation effective.
A Novel ImageJ Macro for Automated Cell Death Quantitation in the Retina
Maidana, Daniel E.; Tsoka, Pavlina; Tian, Bo; Dib, Bernard; Matsumoto, Hidetaka; Kataoka, Keiko; Lin, Haijiang; Miller, Joan W.; Vavvas, Demetrios G.
2015-01-01
Purpose TUNEL assay is widely used to evaluate cell death. Quantification of TUNEL-positive (TUNEL+) cells in tissue sections is usually performed manually, ideally by two masked observers. This process is time consuming, prone to measurement errors, and not entirely reproducible. In this paper, we describe an automated quantification approach to address these difficulties. Methods We developed an ImageJ macro to quantitate cell death by TUNEL assay in retinal cross-section images. The script was coded using IJ1 programming language. To validate this tool, we selected a dataset of TUNEL assay digital images, calculated layer area and cell count manually (done by two observers), and compared measurements between observers and macro results. Results The automated macro segmented outer nuclear layer (ONL) and inner nuclear layer (INL) successfully. Automated TUNEL+ cell counts were in-between counts of inexperienced and experienced observers. The intraobserver coefficient of variation (COV) ranged from 13.09% to 25.20%. The COV between both observers was 51.11 ± 25.83% for the ONL and 56.07 ± 24.03% for the INL. Comparing observers' results with macro results, COV was 23.37 ± 15.97% for the ONL and 23.44 ± 18.56% for the INL. Conclusions We developed and validated an ImageJ macro that can be used as an accurate and precise quantitative tool for retina researchers to achieve repeatable, unbiased, fast, and accurate cell death quantitation. We believe that this standardized measurement tool could be advantageous to compare results across different research groups, as it is freely available as open source. PMID:26469755
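The coefficient of variation used throughout the validation is simply the sample standard deviation expressed as a percentage of the mean; a minimal sketch with made-up repeated counts:

```python
from statistics import mean, stdev

def cov_percent(values):
    """Coefficient of variation: sample standard deviation as a
    percentage of the mean, as used to compare observers and macro."""
    return stdev(values) / mean(values) * 100

# Hypothetical repeated TUNEL+ cell counts for one retinal section
counts = [10, 12, 14]
cov = cov_percent(counts)
```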
3D Imaging and Automated Ice Bottom Tracking of Canadian Arctic Archipelago Ice Sounding Data
NASA Astrophysics Data System (ADS)
Paden, J. D.; Xu, M.; Sprick, J.; Athinarapu, S.; Crandall, D.; Burgess, D. O.; Sharp, M. J.; Fox, G. C.; Leuschen, C.; Stumpf, T. M.
2016-12-01
The basal topography of the Canadian Arctic Archipelago ice caps is unknown for a number of the glaciers which drain the ice caps. The basal topography is needed for calculating present sea level contribution using the surface mass balance and discharge method and to understand future sea level contributions using ice flow model studies. During the NASA Operation IceBridge 2014 arctic campaign, the Multichannel Coherent Radar Depth Sounder (MCoRDS) used a three transmit beam setting (left beam, nadir beam, right beam) to illuminate a wide swath across the ice glacier in a single pass during three flights over the archipelago. In post processing we have used a combination of 3D imaging methods to produce images for each of the three beams which are then merged to produce a single digitally formed wide swath beam. Because of the high volume of data produced by 3D imaging, manual tracking of the ice bottom is impractical on a large scale. To solve this problem, we propose an automated technique for extracting ice bottom surfaces by viewing the task as an inference problem on a probabilistic graphical model. We first estimate layer boundaries to generate a seed surface, and then incorporate additional sources of evidence, such as ice masks, surface digital elevation models, and feedback from human users, to refine the surface in a discrete energy minimization formulation. We investigate the performance of the imaging and tracking algorithms using flight crossovers since crossing lines should produce consistent maps of the terrain beneath the ice surface and compare manually tracked "ground truth" to the automated tracking algorithms. We found the swath width at the nominal flight altitude of 1000 m to be approximately 3 km. Since many of the glaciers in the archipelago are narrower than this, the radar imaging, in these instances, was able to measure the full glacier cavity in a single pass.
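The full tracker is an inference problem on a probabilistic graphical model; its core, a minimum-energy surface through a per-column cost image with a smoothness penalty, can be sketched as a Viterbi-style dynamic program. The toy cost image and unit smoothness penalty below are invented for illustration.

```python
def track_layer(cost, smooth=1.0):
    """Minimum-energy path (one row index per column), where energy is
    the summed pixel cost plus `smooth` times each inter-column jump."""
    nrows, ncols = len(cost), len(cost[0])
    best = [cost[r][0] for r in range(nrows)]  # energy of paths ending at row r
    back = []
    for c in range(1, ncols):
        prev, best = best, [0.0] * nrows
        ptr = [0] * nrows
        for r in range(nrows):
            e, p = min((prev[p] + smooth * abs(r - p), p) for p in range(nrows))
            best[r] = e + cost[r][c]
            ptr[r] = p
        back.append(ptr)
    r = min(range(nrows), key=lambda i: best[i])  # best final row
    path = [r]
    for ptr in reversed(back):                    # backtrack
        r = ptr[r]
        path.append(r)
    return path[::-1]

# Toy cost image: low cost (bright radar return) marks the ice-bottom interface
cost = [
    [9, 9, 9, 9],
    [0, 0, 9, 9],
    [9, 9, 0, 0],
]
surface = track_layer(cost)
```

The real formulation adds the extra evidence terms (ice masks, surface DEMs, human feedback) to the energy being minimized.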
Waller, John S.; Doctor, Daniel H.; Terziotti, Silvia
2015-01-01
Closed depressions on the land surface can be identified by ‘filling’ a digital elevation model (DEM) and subtracting the filled model from the original DEM. However, automated methods suffer from artificial ‘dams’ where surface streams cross under bridges and through culverts. Removal of these false depressions from an elevation model is difficult due to the lack of bridge and culvert inventories; thus, another method is needed to breach these artificial dams. Here, we present a semi-automated workflow and toolbox to remove falsely detected closed depressions created by artificial dams in a DEM. The approach finds the intersections between transportation routes (e.g., roads) and streams, and then lowers the elevation surface across the roads to stream level allowing flow to be routed under the road. Once the surface is corrected to match the approximate location of the National Hydrologic Dataset stream lines, the procedure is repeated with sequentially smaller flow accumulation thresholds in order to generate stream lines with less contributing area within the watershed. Through multiple iterations, artificial depressions that may arise due to ephemeral flow paths can also be removed. Preliminary results reveal that this new technique provides significant improvements for flow routing across a DEM and minimizes artifacts within the elevation surface. Slight changes in the stream flow lines generally improve the quality of flow routes; however some artificial dams may persist. Problematic areas include extensive road ditches, particularly along divided highways, and where surface flow crosses beneath road intersections. Limitations do exist, and the results partially depend on the quality of data being input. Of 166 manually identified culverts from a previous study by Doctor and Young in 2013, 125 are within 25 m of culverts identified by this tool. After three iterations, 1,735 culverts were identified and cataloged. 
The result is a reconditioned elevation dataset, which retains the karst topography for further analysis, and a culvert catalog.
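The fill-and-subtract step, and the effect of breaching an artificial dam at a culvert, can be illustrated on a one-dimensional elevation profile (the values are arbitrary, not data from the study).

```python
def fill_depressions(profile):
    """Fill each cell up to the lower of the highest barriers on its
    left and right (1D analogue of DEM depression filling)."""
    n = len(profile)
    left = profile[:]                       # running max from the left
    for i in range(1, n):
        left[i] = max(left[i - 1], profile[i])
    right = profile[:]                      # running max from the right
    for i in range(n - 2, -1, -1):
        right[i] = max(right[i + 1], profile[i])
    return [max(profile[i], min(left[i], right[i])) for i in range(n)]

# Stream profile with an artificial dam (a road crossing) at index 3
profile = [5, 4, 3, 9, 2, 1]
filled = fill_depressions(profile)
depth = [f - p for f, p in zip(filled, profile)]   # spurious depression depths

# "Breaching" the dam: lower the road cell to stream level and refill
breached = profile[:]
breached[3] = 3
depth_after = [f - p for f, p in zip(fill_depressions(breached), breached)]
```

Before breaching, filling ponds water behind the dam (a false closed depression); after lowering the crossing to stream level, the profile drains and no depression remains.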
Azim, Syed; Juergens, Craig; Hines, John; McLaws, Mary-Louise
2016-07-01
Human auditing and collating hand hygiene compliance data take hundreds of hours. We report on 24/7 overt observations to establish the adjusted average daily hand hygiene opportunities (HHOs) used as the denominator in an automated surveillance that reports daily compliance rates. Overt 24/7 automated surveillance collected HHOs in medical and surgical wards. Accredited auditors observed health care workers' interactions between patients and patient zones to collect the total number of HHOs, indications, and compliance and noncompliance. Automated surveillance captured compliance (ie, events) via low power radio connected to alcohol-based handrub (ABHR) dispensers. Events were divided by HHOs, adjusted for the daily patient-to-nurse ratio, to establish daily rates. Human auditors collected 21,450 HHOs during the 24/7 observation period, averaging 1,532 unadjusted HHOs per day. This was 4.4 times larger than the minimum ward sample required for accreditation. The average adjusted HHOs for ABHR alone were 63 HHOs per patient day on the medical ward and 40 HHOs per patient day on the surgical ward. From July 1, 2014, to July 31, 2015, the automated surveillance system collected 889,968 events. Automated surveillance collects 4 times the amount of data on each ward per day as a human auditor usually collects for a quarterly compliance report. Crown Copyright © 2016. Published by Elsevier Inc. All rights reserved.
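The daily-rate arithmetic (dispenser events divided by adjusted expected opportunities) can be sketched as follows; the 63 HHOs per patient-day comes from the abstract, while the patient count, adjustment factor, and event count are hypothetical.

```python
def daily_compliance(events, hho_per_patient_day, patients, ratio_adjust=1.0):
    """Automated-surveillance compliance rate: ABHR dispenser events
    divided by the adjusted expected hand hygiene opportunities."""
    expected = hho_per_patient_day * patients * ratio_adjust
    return events / expected

# Medical ward: 63 ABHR HHOs per patient-day (from the study), with a
# hypothetical 20 patients and 900 dispenser events recorded that day
rate = daily_compliance(events=900, hho_per_patient_day=63, patients=20)
```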
Procedure for Automated Eddy Current Crack Detection in Thin Titanium Plates
NASA Technical Reports Server (NTRS)
Wincheski, Russell A.
2012-01-01
This procedure provides the detailed instructions for conducting Eddy Current (EC) inspections of thin (5-30 mils) titanium membranes with thickness and material properties typical of the development of Ultra-Lightweight diaphragm Tanks Technology (ULTT). The inspection focuses on the detection of part-through, surface breaking fatigue cracks with depths between approximately 0.002" and 0.007" and aspect ratios (a/c) of 0.2-1.0 using an automated eddy current scanning and image processing technique.
Expert system issues in automated, autonomous space vehicle rendezvous
NASA Technical Reports Server (NTRS)
Goodwin, Mary Ann; Bochsler, Daniel C.
1987-01-01
The problems involved in automated autonomous rendezvous are briefly reviewed, and the Rendezvous Expert (RENEX) expert system is discussed with reference to its goals, the approach used, and its knowledge structure and contents. RENEX has been developed to support streamlining operations for the Space Shuttle and Space Station program and to aid definition of mission requirements for the autonomous portions of rendezvous for the Mars Surface Sample Return and Comet Nucleus Sample Return unmanned missions. The experience with RENEX to date and recommendations for further development are presented.
NASA Astrophysics Data System (ADS)
Khansari, Maziyar M.; O'Neill, William; Penn, Richard; Blair, Norman P.; Chau, Felix; Shahidi, Mahnaz
2017-03-01
The conjunctiva is a densely vascularized tissue of the eye that provides an opportunity for imaging of human microcirculation. In the current study, automated fine structure analysis of conjunctival microvasculature images was performed to discriminate stages of diabetic retinopathy (DR). The study population consisted of one group of nondiabetic control subjects (NC) and 3 groups of diabetic subjects, with no clinical DR (NDR), non-proliferative DR (NPDR), or proliferative DR (PDR). Ordinary least square regression and Fisher linear discriminant analyses were performed to automatically discriminate images between group pairs of subjects. Human observers who were masked to the grouping of subjects performed image discrimination between group pairs. Over 80% and 70% of images of subjects with clinical and non-clinical DR were correctly discriminated by the automated method, respectively. The discrimination rates of the automated method were higher than human observers. The fine structure analysis of conjunctival microvasculature images provided discrimination of DR stages and can be potentially useful for DR screening and monitoring.
Is partially automated driving a bad idea? Observations from an on-road study.
Banks, Victoria A; Eriksson, Alexander; O'Donoghue, Jim; Stanton, Neville A
2018-04-01
The automation of longitudinal and lateral control has enabled drivers to become "hands and feet free" but they are required to remain in an active monitoring state with a requirement to resume manual control if required. This represents the single largest allocation of system function problem with vehicle automation as the literature suggests that humans are notoriously inefficient at completing prolonged monitoring tasks. To further explore whether partially automated driving solutions can appropriately support the driver in completing their new monitoring role, video observations were collected as part of an on-road study using a Tesla Model S being operated in Autopilot mode. A thematic analysis of video data suggests that drivers are not being properly supported in adhering to their new monitoring responsibilities and instead demonstrate behaviour indicative of complacency and over-trust. These attributes may encourage drivers to take more risks whilst out on the road. Copyright © 2017 Elsevier Ltd. All rights reserved.
Flip the tip: an automated, high quality, cost-effective patch clamp screen.
Lepple-Wienhues, Albrecht; Ferlinz, Klaus; Seeger, Achim; Schäfer, Arvid
2003-01-01
The race for creating an automated patch clamp has begun. Here, we present a novel technology to produce true gigaseals and whole cell preparations at a high rate. Suspended cells are flushed toward the tip of glass micropipettes. Seal, whole-cell break-in, and pipette/liquid handling are fully automated. Extremely stable seals and access resistance guarantee high recording quality. Data obtained from different cell types sealed inside pipettes show long-term stability, voltage clamp and seal quality, as well as block by compounds in the pM range. A flexible array of independent electrode positions minimizes consumables consumption at maximal throughput. Pulled micropipettes guarantee a proven gigaseal substrate with ultra clean and smooth surface at low cost.
Computer aided fixture design - A case based approach
NASA Astrophysics Data System (ADS)
Tanji, Shekhar; Raiker, Saiesh; Mathew, Arun Tom
2017-11-01
Automated fixture design plays an important role in process planning and in the integration of CAD and CAM. An automated fixture setup design system is developed in which, once fixturing surfaces and points are described, modular fixture components are automatically selected to generate fixture units and are placed into position while satisfying assembly conditions. In the past, various knowledge-based systems have been developed to implement CAFD in practice. In this paper, to obtain an acceptable automated machining fixture design, a case-based reasoning method with a developed retrieval system is proposed. The Visual Basic (VB) programming language is used, integrated with the SolidWorks API (Application Programming Interface) module, for a better retrieval procedure that reduces computational time. These properties are incorporated in numerical simulation to determine the best fit for practical use.
NASA Astrophysics Data System (ADS)
Buck, J. J. H.; Phillips, A.; Lorenzo, A.; Kokkinaki, A.; Hearn, M.; Gardner, T.; Thorne, K.
2017-12-01
The National Oceanography Centre (NOC) operates a fleet of approximately 36 autonomous marine platforms including submarine gliders, autonomous underwater vehicles, and autonomous surface vehicles. Each platform effectively has the capability to observe the ocean and collect data akin to a small research vessel. This is creating a growth in data volumes and complexity while the amount of resource available to manage data remains static. The OceanIds Command and Control (C2) project aims to solve these issues by fully automating data archival, processing and dissemination. The data architecture being implemented jointly by NOC and the Scottish Association for Marine Science (SAMS) includes a single Application Programming Interface (API) gateway to handle authentication, forwarding and delivery of both metadata and data. Technicians and principal investigators will enter expedition data prior to deployment of vehicles, enabling automated data processing when vehicles are deployed. The system will support automated metadata acquisition from platforms as this technology moves towards operational implementation. The metadata exposure to the web builds on a prototype developed by the European Commission supported SenseOCEAN project and is via open standards including World Wide Web Consortium (W3C) RDF/XML and the use of the Semantic Sensor Network ontology and Open Geospatial Consortium (OGC) SensorML standard. Data will be delivered in the marine domain Everyone's Glider Observatory (EGO) format and OGC Observations and Measurements. Additional formats will be served by implementation of endpoints such as the NOAA ERDDAP tool. This standardised data delivery via the API gateway enables timely near-real-time data to be served to Oceanids users, BODC users, operational users and big data systems.
The use of open standards will also enable web interfaces to be rapidly built on the API gateway and delivery to European research infrastructures that include aligned reference models for data infrastructure.
Issues and Concerns in Robotic Drilling
NASA Technical Reports Server (NTRS)
Glass, Brian
2003-01-01
Exploration of the Martian subsurface will be essential in the search for life and water, given the desiccated and highly oxidized conditions on the surface. Discovery of these, at least in non-fossil form, is unlikely without drilling or other physical access to the subsurface. Hence subsurface access will be critical for both future in-situ science and Mars sample return. Drilling applications present many new challenges for diagnosis and control technology. Traditionally, diagnosis has concentrated on determining the internal state of a system, and detecting failures of system components. In the case of drilling applications, an additional challenge is to diagnose the interactions between the drill and its environment. This is necessary because particular observations of the drilling operation may be consistent with a number of possible problems, including faults in the equipment, but also changes in the material being drilled (for example, from rock to ice). The diagnosis of a particular observation may also depend on knowledge of geological formations previously encountered during drilling, and different remedial actions may be required for each diagnosis. Current 2009 Mars mission scenarios call for no more than 33 sols to be spent drilling. Yet they also call for a baseline of two 2m-deep holes in each of three target areas, for a total of six drilling operations. Using current levels of automation, it is estimated that 15-16 sols would be required to drill each hole. As a result, either the drilling part of the mission plan will need to be severely downscoped to no more than two holes total, or on-board automation and robotics must be increased in order to reduce the number of sols required per hole by removing ground control from the drilling control loop. This lecture will discuss salient issues and concerns of robotic drilling automation as compared with other applications, and implementation constraints.
Bonekamp, S; Ghosh, P; Crawford, S; Solga, S F; Horska, A; Brancati, F L; Diehl, A M; Smith, S; Clark, J M
2008-01-01
To examine five available software packages for the assessment of abdominal adipose tissue with magnetic resonance imaging, compare their features and assess the reliability of measurement results. Feature evaluation and test-retest reliability of softwares (NIHImage, SliceOmatic, Analyze, HippoFat and EasyVision) used in manual, semi-automated or automated segmentation of abdominal adipose tissue. A random sample of 15 obese adults with type 2 diabetes. Axial T1-weighted spin echo images centered at vertebral bodies of L2-L3 were acquired at 1.5 T. Five software packages were evaluated (NIHImage, SliceOmatic, Analyze, HippoFat and EasyVision), comparing manual, semi-automated and automated segmentation approaches. Images were segmented into cross-sectional area (CSA), and the areas of visceral (VAT) and subcutaneous adipose tissue (SAT). Ease of learning and use and the design of the graphical user interface (GUI) were rated. Intra-observer accuracy and agreement between the software packages were calculated using intra-class correlation. Intra-class correlation coefficient was used to obtain test-retest reliability. Three of the five evaluated programs offered a semi-automated technique to segment the images based on histogram values or a user-defined threshold. One software package allowed manual delineation only. One fully automated program demonstrated the drawbacks of uncritical automated processing. The semi-automated approaches reduced variability and measurement error, and improved reproducibility. There was no significant difference in the intra-observer agreement in SAT and CSA. The VAT measurements showed significantly lower test-retest reliability. There were some differences between the software packages in qualitative aspects, such as user friendliness. Four out of five packages provided essentially the same results with respect to the inter- and intra-rater reproducibility. 
Our results using SliceOmatic, Analyze or NIHImage were comparable and could be used interchangeably. Newly developed fully automated approaches should be compared to one of the examined software packages.
Chin, Chai Fung; Ler, Lian Wee; Choong, Yee Siew; Ong, Eugene Boon Beng; Ismail, Asma; Tye, Gee Jun; Lim, Theam Soon
2016-01-01
Antibody phage display panning involves the enrichment of antibodies against specific targets by affinity. In recent years, several new methods for panning have been introduced to accommodate the growing application of antibody phage display. The present work is concerned with the application of streptavidin mass spectrometry immunoassay (MSIA™) Disposable Automation Research Tips (D.A.R.T's®) for antibody phage display. The system was initially designed to isolate antigens by affinity selection for mass spectrometry analysis. The streptavidin MSIA™ D.A.R.T's® system allows for easy attachment of biotinylated target antigens on the solid surface for presentation to the phage library. As proof-of-concept, a domain antibody library was passed through the tips attached with the Hemolysin E antigen. After binding and washing, the bound phages were eluted via standard acid dissociation and the phages were rescued for subsequent panning rounds. Polyclonal enrichment was observed for three rounds of panning with five monoclonal domain antibodies identified. The proposed method allows for a convenient, rapid and semi-automated alternative to conventional antibody panning strategies. Copyright © 2015 Elsevier B.V. All rights reserved.
Automated test-site radiometer for vicarious calibration
NASA Astrophysics Data System (ADS)
Li, Xin; Yin, Ya-peng; Liu, En-chao; Zhang, Yan-na; Xun, Li-na; Wei, Wei; Zhang, Zhi-peng; Qiu, Gang-gang; Zhang, Quan; Zheng, Xiao-bing
2014-11-01
In order to realize unmanned vicarious calibration, the Automated Test-site Radiometer (ATR) was developed for surface reflectance measurements. ATR samples the spectrum from 400 nm to 1600 nm with 8 interference filters coupled with silicon and InGaAs detectors. The field of view of each channel is 10°, with parallel optical axes. One SWIR channel lies in the center and the other seven VNIR channels are on a circle of 4.8 cm diameter, which guarantees that each channel views nearly the same section of ground. The optical head as a whole is temperature controlled using a TE cooler for greater stability and lower noise. ATR is powered by a solar panel and transmits its data through a BDS (China's BeiDou Navigation Satellite System) terminal for long-term measurements without personnel on site. ATR was deployed at the Dunhuang test site, viewing a ground field of about 30 cm diameter, for multi-spectral reflectance measurements. Other instruments at the site include a Cimel sunphotometer and a diffuser-to-globe irradiance meter for atmospheric observations. The methodology for band-averaged reflectance retrieval and the hyperspectral reflectance fitting process are described. The hyperspectral reflectance and atmospheric parameters are then put into the 6S code to predict the TOA radiance, which is compared with the MODIS radiance.
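Band-averaged reflectance retrieval amounts to weighting the measured spectral reflectance by the channel's relative spectral response; a minimal sketch, where the passband samples and response values are invented for illustration (equal wavelength steps assumed, so the step width cancels):

```python
def band_averaged(reflectance, response):
    """Band-averaged reflectance: response-weighted mean of the
    spectral reflectance over the channel's passband."""
    num = sum(r * s for r, s in zip(reflectance, response))
    return num / sum(response)

# Hypothetical channel sampled at equal wavelength steps near 650 nm
refl = [0.20, 0.22, 0.25, 0.24, 0.21]   # surface spectral reflectance
resp = [0.1, 0.6, 1.0, 0.6, 0.1]        # relative spectral response
rho = band_averaged(refl, resp)
```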
NASA Astrophysics Data System (ADS)
Satyanto, K. S.; Abang, Z. E.; Arif, C.; Yanuar, J. P. M.
2018-05-01
An automatic water management system for agricultural land was developed based on a mini PC as the controller to manage irrigation and drainage. The system was integrated with a perforated pipe network installed below the soil surface to enable water to flow in and out through the network, so that the water table of the land can be set at a certain level. The system was operated using a solar power electricity supply to power water level and soil moisture sensors, a Raspberry Pi controller, and a motorized valve actuator. This study aims to implement the system to control the water level on a soybean production field, and further to observe the water footprint and carbon footprint contributions of the soybean production process with the automated system applied. The water level of the field could be controlled at around 19 cm from the base. The crop water requirement was calculated using the Penman-Monteith approach; with a soybean productivity of 3.57 t/ha, the total water footprint of soybean production was 872.01 m3/t. The carbon footprint was calculated for the use of the solar power electric supply system, and the emission during soybean production was estimated to be equivalent to 1.85 kg of CO2.
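The water footprint figure follows from dividing seasonal water use per hectare by yield; using the abstract's 3.57 t/ha yield, the reported 872.01 m3/t implies a water use of roughly 3113 m3/ha (back-calculated here purely for illustration).

```python
def water_footprint(water_use_m3_per_ha, yield_t_per_ha):
    """Water footprint of production: water consumed per tonne of crop."""
    return water_use_m3_per_ha / yield_t_per_ha

# Back-calculated seasonal water use (m3/ha) and yield from the abstract
wf = water_footprint(3113.08, 3.57)   # m3 per tonne of soybean
```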
NASA Astrophysics Data System (ADS)
Nelson, B. R.; Prat, O. P.; Stevens, S. E.; Seo, D. J.; Zhang, J.; Howard, K.
2014-12-01
The processing of radar-only precipitation via the reanalysis from the National Mosaic and Multi-Sensor QPE (NMQ/Q2) based on the WSR-88D Next-generation Radar (NEXRAD) network over Continental United States (CONUS) is nearly completed for the period covering from 2001 to 2012. Reanalysis data are available at 1-km and 5-minute resolution. An important step in generating the best possible precipitation data is to assess the bias in the radar-only product. In this work, we use data from a combination of rain gauge networks to assess the bias in the NMQ reanalysis. Rain gauge networks such as the Hydrometeorological Automated Data System (HADS), the Automated Surface Observing Systems (ASOS), the Climate Reference Network (CRN), and the Global Historical Climatology Network Daily (GHCN-D) are combined for use in the assessment. These rain gauge networks vary in spatial density and temporal resolution. The challenge hence is to optimally utilize them to assess the bias at the finest resolution possible. For initial assessment, we propose to subset the CONUS data in climatologically representative domains, and perform bias assessment using information in the Q2 dataset on precipitation type and phase.
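A common first-pass bias assessment against collocated gauges is a simple multiplicative bias; the accumulation values below are invented, not NMQ or gauge-network data.

```python
def multiplicative_bias(gauge_mm, radar_mm):
    """Ratio of total gauge to total radar accumulation at collocated
    points; values > 1 suggest the radar-only product underestimates."""
    return sum(gauge_mm) / sum(radar_mm)

# Hypothetical daily accumulations at collocated gauge/radar pixels (mm)
gauge = [12.0, 5.0, 0.0, 30.0, 8.0]
radar = [10.0, 4.0, 0.5, 27.0, 8.5]
bias = multiplicative_bias(gauge, radar)
```

In practice this would be computed per climatological subdomain and per precipitation type/phase, as the abstract proposes.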
Differential tinnitus-related neuroplastic alterations of cortical thickness and surface area.
Meyer, Martin; Neff, Patrick; Liem, Franziskus; Kleinjung, Tobias; Weidt, Steffi; Langguth, Berthold; Schecklmann, Martin
2016-12-01
Structural neuroimaging techniques have been used to identify cortical and subcortical regions constituting the neuroarchitecture of tinnitus. One recent investigation used voxel-based morphometry (VBM) to analyze a sample of tinnitus patients (TI, n = 257) (Schecklmann et al., 2013). A negative relationship between individual distress and cortical volume (CV) in bilateral auditory regions was observed. However, CV has meanwhile been identified as a neuroanatomical measurement that confounds genetically distinct neuroanatomical traits, namely cortical thickness (CT) and cortical surface area (CSA). We performed a re-analysis of the identical sample using the automated FreeSurfer surface-based morphometry (SBM) approach (Fischl, 2012). First, we replicated the negative correlation between tinnitus distress and bilateral supratemporal gray matter volume. Second, we observed a negative correlation for CSA in the left periauditory cortex and anterior insula. Furthermore, we noted a positive correlation between tinnitus duration and CT in the left periauditory cortex as well as a negative correlation in the subcallosal anterior cingulate, a region collated to the serotonergic circuit and germane to inhibitory functions. In short, the results elucidate differential neuroanatomical alterations of CSA and CT for the two independent tinnitus-related psychological traits distress and duration. Beyond this, the study provides further evidence for the distinction and specific susceptibility of CSA and CT within the context of neuroplasticity of the human brain. Copyright © 2016 Elsevier B.V. All rights reserved.
Modeling the atomistic growth behavior of gold nanoparticles in solution
NASA Astrophysics Data System (ADS)
Turner, C. Heath; Lei, Yu; Bao, Yuping
2016-04-01
The properties of gold nanoparticles strongly depend on their three-dimensional atomic structure, leading to an increased emphasis on controlling and predicting nanoparticle structural evolution during the synthesis process. In order to provide this atomistic-level insight and establish a link to the experimentally-observed growth behavior, a kinetic Monte Carlo simulation (KMC) approach is developed for capturing Au nanoparticle growth characteristics. The advantage of this approach is that, compared to traditional molecular dynamics simulations, the atomistic nanoparticle structural evolution can be tracked on time scales that approach the actual experiments. This has enabled several different comparisons against experimental benchmarks, and it has helped transition the KMC simulations from a hypothetical toy model into a more experimentally-relevant test-bed. The model is initially parameterized by performing a series of automated comparisons of Au nanoparticle growth curves versus the experimental observations, and then the refined model allows for detailed structural analysis of the nanoparticle growth behavior. Although the Au nanoparticles are roughly spherical, the maximum/minimum dimensions deviate from the average by approximately 12.5%, which is consistent with the corresponding experiments. Also, a surface texture analysis highlights the changes in the surface structure as a function of time. While the nanoparticles show similar surface structures throughout the growth process, there can be some significant differences during the initial growth at different synthesis conditions.
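A rejection-free (Gillespie-type) KMC step of the kind underlying such growth models can be sketched as follows; the site types, rates, and attachment-only event set are placeholders, not the parameterization fitted in the study.

```python
import random

def kmc_grow(n_steps, rates, seed=0):
    """Rejection-free KMC: each step selects an attachment event with
    probability proportional to its rate, then advances the clock by an
    exponentially distributed waiting time. Returns (counts per site
    type, elapsed simulated time)."""
    rng = random.Random(seed)
    total = sum(rates.values())
    counts = {site: 0 for site in rates}
    t = 0.0
    for _ in range(n_steps):
        u = rng.random() * total
        acc = 0.0
        for site, rate in rates.items():   # pick event by cumulative rate
            acc += rate
            if u <= acc:
                counts[site] += 1
                break
        t += rng.expovariate(total)        # Gillespie time increment
    return counts, t

# Placeholder site-dependent attachment rates (terrace, edge, kink)
rates = {"terrace": 0.1, "edge": 1.0, "kink": 5.0}
counts, t = kmc_grow(1000, rates)
```

Because the clock advances by physically meaningful waiting times rather than fixed MD timesteps, this style of simulation can reach experiment-scale growth times.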
Earth Observations taken by the Expedition 23 Crew
2010-05-04
ISS023-E-032397 (4 May 2010) --- The Gulf of Mexico oil spill is featured in this image photographed by an Expedition 23 crew member on the International Space Station. On April 20, 2010 the oil rig Deepwater Horizon suffered an explosion and sank two days later. Shortly thereafter oil began leaking into the Gulf of Mexico from ruptured pipes as safety cutoff mechanisms failed to operate. Automated nadir-viewing orbital NASA sensors have been tracking the growth of the oil spill as it has spread towards the northern Gulf Coast. This detailed photograph provides a different viewing perspective on the ongoing event. The image is oblique, meaning that it was taken with a sideways viewing angle from the space station, rather than the "straight down" or nadir view typical of automated satellite sensors. The view is towards the west; the ISS was located over the eastern edge of the Gulf of Mexico when the image was taken. The Mississippi River Delta and nearby Louisiana coast (top) appear dark in the sunglint that illuminates most of the image. This phenomenon is caused by sunlight reflecting off the water surface, much like a mirror, directly back towards the astronaut observer onboard the orbital complex. The sunglint improves the identification of the oil spill (colored dark to light gray), which is creating a different water texture, and therefore a contrast, between the smooth and rougher water of the reflective ocean surface (colored silver to white). Wind and water current patterns have modified the oil spill's original shape into streamers and elongated masses. Efforts are ongoing to contain the spill and protect fragile coastal ecosystems and habitats such as the Chandeleur Islands (right center). Other features visible in the image include a solid field of low cloud cover at the lower left corner of the image. A part of one of the ISS solar arrays is visible at lower right. Wave patterns at lower right are most likely caused by tidal effects.
A Liquid-Handling Robot for Automated Attachment of Biomolecules to Microbeads.
Enten, Aaron; Yang, Yujia; Ye, Zihan; Chu, Ryan; Van, Tam; Rothschild, Ben; Gonzalez, Francisco; Sulchek, Todd
2016-08-01
Diagnostics, drug delivery, and other biomedical industries rely on cross-linking ligands to microbead surfaces. Microbead functionalization requires multiple steps of liquid exchange, incubation, and mixing, which are laborious and time intensive. Although automated systems exist, they are expensive and cumbersome, limiting their routine use in biomedical laboratories. We present a small, bench-top robotic system that automates microparticle functionalization and streamlines sample preparation. The robot uses a programmable microcontroller to regulate liquid exchange, incubation, and mixing functions. Filters with a pore diameter smaller than the minimum bead diameter are used to prevent bead loss during liquid exchange. The robot uses three liquid reagents and processes up to 10⁷ microbeads per batch. The effectiveness of microbead functionalization was compared with a manual covalent coupling process and evaluated via flow cytometry and fluorescent imaging. The mean percentages of successfully functionalized beads were 91% and 92% for the robot and manual methods, respectively, with less than 5% bead loss. Although the two methods share similar qualities, the automated approach required approximately 10 min of active labor, compared with 3 h for the manual approach. These results suggest that a low-cost, automated microbead functionalization system can streamline sample preparation with minimal operator intervention. © 2015 Society for Laboratory Automation and Screening.
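The exchange-mix-incubate cycle the robot automates can be sketched as a simple protocol runner. Step names, cycle counts, and the reagent labels below are illustrative assumptions, not the robot's actual firmware:

```python
def run_functionalization(reagents, incubation_min=30, mix_cycles=3):
    """Return the ordered step log for functionalizing one microbead batch."""
    log = []
    for reagent in reagents:
        log.append(f"exchange:{reagent}")      # drain through filter, add reagent
        log.extend(["mix"] * mix_cycles)       # agitate to resuspend beads
        log.append(f"incubate:{incubation_min}min")
    return log

# Three reagents, matching the abstract's "three liquid reagents" (labels assumed).
protocol = run_functionalization(["coupling buffer", "ligand", "wash buffer"])
```

Because the filter pore size is smaller than the minimum bead diameter, the exchange step can drain liquid without the bead-loss bookkeeping a manual protocol would require.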
Using WorldView-2 Imagery to Track Flooding in Thailand in a Multi-Asset Sensorweb
NASA Technical Reports Server (NTRS)
McLaren, David; Doubleday, Joshua; Chien, Steve
2012-01-01
For the flooding seasons of 2011-2012, multiple space assets were used in a "sensorweb" to track major flooding in Thailand. WorldView-2 multispectral data was used in this effort and provided extremely high spatial resolution (2 m/pixel) multispectral (8 bands spanning 0.45-1.05 micrometers) data from which mostly automated workflows derived surface water extent and volumetric water information for use by a range of NGOs and national authorities. We first describe how WorldView-2 and its data were integrated into the overall flood tracking sensorweb. We next describe the Support Vector Machine learning techniques that were used to derive surface water extent classifiers. Then we describe the fusion of surface water extent and digital elevation model (DEM) data to derive volumetric water calculations. Finally, we discuss key future work, such as speeding up the workflows and automating the data registration process (the only portion of the workflow requiring human input).
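The water-classification step can be sketched with an off-the-shelf SVM. The synthetic 8-band "pixels" below merely mimic water's dark near-infrared signature; they are illustrative stand-ins, not WorldView-2 training data:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Synthetic 8-band reflectances: water is dark in the NIR bands (7-8),
# while land is substantially brighter there.
water = rng.normal(0.05, 0.02, (200, 8))
water[:, 6:] = rng.normal(0.02, 0.01, (200, 2))
land = rng.normal(0.20, 0.05, (200, 8))
land[:, 6:] = rng.normal(0.40, 0.05, (200, 2))
X = np.vstack([water, land])
y = np.array([1] * 200 + [0] * 200)          # 1 = water, 0 = not water

clf = SVC(kernel="rbf").fit(X, y)

# "Scene": 10 water-like pixels followed by 10 land-like pixels.
scene = np.vstack([water[:10] + rng.normal(0, 0.005, (10, 8)),
                   land[:10] + rng.normal(0, 0.005, (10, 8))])
water_mask = clf.predict(scene)              # per-pixel water/non-water labels
```

In an operational workflow the predicted mask would then be fused with a DEM to estimate water volume, as the abstract describes.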
Computational efficiency for the surface renewal method
NASA Astrophysics Data System (ADS)
Kelley, Jason; Higgins, Chad
2018-04-01
Measuring surface fluxes using the surface renewal (SR) method requires programmatic algorithms for tabulation, algebraic calculation, and data quality control. A number of different methods have been published describing automated calibration of SR parameters. Because the SR method utilizes high-frequency (10 Hz+) measurements, some steps in the flux calculation are computationally expensive, especially when automating SR to perform many iterations of these calculations. Several new algorithms were written that perform the required calculations more efficiently and rapidly; these were tested for sensitivity to the length of the flux averaging period, the ability to measure over a large range of lag timescales, and overall computational efficiency. The algorithms use signal processing techniques and algebraic simplifications, demonstrating that simple modifications can dramatically improve computational efficiency. The results complement efforts by other authors to standardize a robust and accurate computational SR method. The increased computation speed grants flexibility in implementing the SR method, opening new avenues for SR to be used in research, in applied monitoring, and in novel field deployments.
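One such algebraic/signal-processing speedup can be illustrated for the second-order structure function used in SR analysis: computed naively over L lags it costs O(N·L), but a zero-padded FFT autocorrelation plus cumulative sums reduces it to O(N log N). This is a generic sketch of the technique, not the authors' published code:

```python
import numpy as np

def structure_function_fft(x, max_lag):
    """Second-order structure function S2(k) = mean((x[t] - x[t-k])**2)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Zero-padded FFT autocorrelation: acov[k] = sum_t x[t] * x[t+k].
    f = np.fft.rfft(x, 2 * n)
    acov = np.fft.irfft(f * np.conj(f))[: max_lag + 1]
    sq = np.cumsum(x ** 2)
    k = np.arange(max_lag + 1)
    head = sq[-1] - np.concatenate(([0.0], sq[:max_lag]))  # sum of x[k:]**2
    tail = sq[n - 1 - k]                                   # sum of x[:n-k]**2
    return (head + tail - 2 * acov) / (n - k)

# Demo on a synthetic 10 Hz temperature-like trace (illustrative, not field data).
t = np.arange(0, 60, 0.1)
temp = np.sin(0.5 * t) + 0.05 * np.cos(7.0 * t)
s2 = structure_function_fft(temp, max_lag=20)
```

The same decomposition (expanding the squared difference into two power sums and a cross term) is what makes scanning many candidate lag timescales cheap.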
Automation & robotics for future Mars exploration
NASA Astrophysics Data System (ADS)
Schulte, W.; von Richter, A.; Bertrand, R.
2003-04-01
Automation and Robotics (A&R) are currently considered a key technology for Mars exploration. Initiatives in this field aim at developing new A&R systems and technologies for planetary surface exploration. Kayser-Threde led the study AROMA (Automation & Robotics for Human Mars Exploration) under ESA contract in order to define a reference architecture of A&R elements in support of a human Mars exploration program. One of the goals was to define new developments and to maintain the competitiveness of European industry within this field. We present a summary of the A&R study with respect to a particular system: the Autonomous Research Island (ARI). In the Mars exploration scenario, a robotic outpost system initially lands at pre-selected sites in order to search for life forms and water and to analyze the surface, geology, and atmosphere. A&R systems, i.e. rovers and autonomous instrument packages, perform a number of missions with scientific and technology development objectives on the surface of Mars as part of preparations for a human exploration mission. In the Robotic Outpost Phase, ARI is conceived as an automated lander which can perform in-situ analysis. It consists of a service module and a micro-rover system for local investigations. Such a system is already under investigation and development in other TRP activities. The micro-rover system provides local mobility for in-situ scientific investigations at a given landing or deployment site. In the long run, ARI would also support human Mars missions. An astronaut crew would travel larger distances in a pressurized rover on Mars. Whenever interesting features on the surface are identified, the crew would interrupt the travel and perform local investigations. In order to save crew time, ARI could be deployed by the astronauts to perform time-consuming investigations such as in-situ geochemical analysis of rocks and soil.
Later, the crew could recover the research island for refurbishment and deployment at another site. In the frame of near-term Mars exploration, a dedicated exobiology mission is envisaged. Scientific and technical studies for a facility to detect evidence of past or present life have been carried out under ESA contract. Mars soil and rock samples are to be analyzed for their morphology and organic and inorganic composition using a suite of scientific instruments. Robotic devices, e.g. for the acquisition, handling, and onboard processing of Mars sample material retrieved from different locations, and surface mobility are important elements in a fully automated mission. The necessary robotic elements have been identified in past studies. Their realization can partly be based on heritage from existing space hardware, but will require dedicated development effort.
Downscaling of Remotely Sensed Land Surface Temperature with multi-sensor based products
NASA Astrophysics Data System (ADS)
Jeong, J.; Baik, J.; Choi, M.
2016-12-01
Remotely sensed satellite data provides a bird's eye view, which allows us to understand the spatiotemporal behavior of hydrologic variables at global scale. In particular, a geostationary satellite continuously observing a specific region is useful for monitoring fluctuations of hydrologic variables as well as meteorological factors. However, problems remain regarding spatial resolution: whether fine-scale land cover can be represented at the spatial resolution of the satellite sensor, especially in areas of complex topography. To solve these problems, many researchers have tried to establish relationships among various hydrological factors and to combine images from multiple sensors to downscale land surface products. One such geostationary satellite, the Communication, Ocean and Meteorological Satellite (COMS), carries the Meteorological Imager (MI) and the Geostationary Ocean Color Imager (GOCI). MI, performing the meteorological mission, produces Rainfall Intensity (RI), Land Surface Temperature (LST), and other products every 15 minutes. Even though it has high temporal resolution, the low spatial resolution of MI data is treated as a major problem in many studies. This study suggests a methodology to downscale the 4 km LST datasets derived from MI to a finer resolution (500 m) by using GOCI datasets over Northeast Asia. The Normalized Difference Vegetation Index (NDVI), a variable with a well-established relationship to LST, is chosen to estimate LST at the finer resolution. NDVI and LST pixels are separated according to land cover, provided by the MODerate resolution Imaging Spectroradiometer (MODIS), to achieve a more accurate relationship. The downscaled LST is compared with LST observed by the Automated Synoptic Observing System (ASOS) to assess its accuracy. The downscaled LST results of this study, coupled with the advantages of a geostationary platform, can be applied to observe hydrologic processes efficiently.
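The NDVI-based downscaling step can be sketched in the spirit of regression sharpening (e.g., TsHARP-style methods): fit LST against NDVI at the coarse scale, apply the fit to fine-scale NDVI, and add back the coarse-scale residual so coarse-pixel means are preserved. The linear model and arrays below are illustrative assumptions, not the study's exact procedure:

```python
import numpy as np

def downscale_lst(lst_coarse, ndvi_coarse, ndvi_fine, scale):
    """Sharpen a coarse LST grid using a fine NDVI grid (scale x finer)."""
    # Fit LST = a * NDVI + b at the coarse scale.
    a, b = np.polyfit(ndvi_coarse.ravel(), lst_coarse.ravel(), 1)
    # Residual: what NDVI alone fails to explain at the coarse scale.
    residual = lst_coarse - (a * ndvi_coarse + b)
    # Broadcast each coarse residual over its block of fine pixels.
    residual_fine = np.kron(residual, np.ones((scale, scale)))
    return a * ndvi_fine + b + residual_fine
```

Adding the residual back guarantees that block-averaging the 500 m product reproduces the original 4 km LST, which is the usual consistency requirement for thermal sharpening.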
New approaches to observation and modeling of fast-moving glaciers and ice streams
NASA Astrophysics Data System (ADS)
Herzfeld, U. C.; Trantow, T.; Markle, M. J.; Medley, G.; Markus, T.; Neumann, T.
2016-12-01
In this paper, we will give an overview of several new approaches to remote-sensing observations and analysis and to modeling of fast glacier flow. The approaches will be applied in case studies of different types of fast-moving glaciers: (1) The Bering-Bagley Glacier System, Alaska (a surge-type glacier system), (2) Jakobshavn Isbræ, Greenland (a tide-water terminating fjord glacier and outlet of the Greenland Inland Ice), and (3) Icelandic Ice Caps (manifestations of the interaction of volcanic and glaciologic processes). On the observational side, we will compare the capabilities of lidar and radar altimeters, including ICESat's Geoscience Laser Altimeter System (GLAS), CryoSat-2's Synthetic Aperture Interferometric Radar Altimeter (SIRAL) and the future ICESat-2 Advanced Topographic Laser Altimeter System (ATLAS), especially regarding retrieval of surface heights over crevassed regions, as is typical of spatial and temporal acceleration. Properties that can be expected from ICESat-2 ATLAS data will be illustrated based on analyses of data from ICESat-2 simulator instruments: the Slope Imaging Multi-polarization Photon-counting Lidar (SIMPL) and the Multiple Altimeter Beam Experimental Lidar (MABEL). Information from altimeter data will be augmented by an automated surface classification based on image data, which includes satellite imagery such as LANDSAT and WorldView as well as airborne video imagery of ice surfaces. Numerical experiments using Elmer/Ice will be employed to link parameters derived in observations to physical processes during the surge of the Bering-Bagley Glacier System. This allows identification of processes that can be explained in an existing framework and processes that may require new concepts for glacier evolution. Topics include zonation of surge progression in a complex glacier system and crevassing as an indication, storage of glacial water, influence of basal topography and the role of friction laws.
Automated Database Schema Design Using Mined Data Dependencies.
ERIC Educational Resources Information Center
Wong, S. K. M.; Butz, C. J.; Xiang, Y.
1998-01-01
Describes a bottom-up procedure for discovering multivalued dependencies in observed data without knowing a priori the relationships among the attributes. The proposed algorithm is an application of a technique designed for learning conditional independencies in probabilistic reasoning; a prototype system for automated database schema design has…
NASA Technical Reports Server (NTRS)
Moser, D. E.
2017-01-01
Most meteoroids are broken up by Earth's atmosphere before they reach the ground. The Moon, however, has little-to-no atmosphere to prevent meteoroids from impacting the lunar surface. Upon impact they excavate a crater and generate a plume of debris. A flash of light at the moment of impact can also be seen. Meteoroids striking the Moon create an impact flash observable by telescopes here on Earth. NASA observers use telescopes at the Automated Lunar and Meteor Observatory (ALaMO) to routinely monitor the Moon for impact flashes each month when the lunar phase is right. Flashes recorded by two telescopes simultaneously rule out false signals from cosmic rays and satellites. Over 400 impact flashes have been observed by NASA since 2005. This map shows the location of each flash. No observations are made near the poles or center line. On average, one impact is observed every two hours. The brightest and longest-lasting impact flash was observed in Mare Imbrium on March 17, 2013. The imaging satellite Lunar Reconnaissance Orbiter, in orbit around the Moon, discovered the fresh crater created by this impact. The crater is 60 across and was caused by a meteoroid 9 inches in diameter likely traveling at a speed of 57,000 mph!
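The two-telescope coincidence test can be sketched as a simple time-window match: a candidate flash survives only if both instruments recorded it at (nearly) the same time. The timestamps and the 0.1 s window below are illustrative assumptions:

```python
def coincident_flashes(times_a, times_b, window=0.1):
    """Return events from telescope A that telescope B confirms (seconds)."""
    confirmed = []
    for t in times_a:
        # A cosmic-ray hit or satellite glint is local to one camera, so it
        # will have no partner event within the window on the other telescope.
        if any(abs(t - u) <= window for u in times_b):
            confirmed.append(t)
    return confirmed

events = coincident_flashes([10.00, 42.50, 97.31], [10.03, 97.30, 120.00])
```

Here the 42.50 s candidate is rejected because only one telescope saw it.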
Trajectory-based change detection for automated characterization of forest disturbance dynamics
Robert E. Kennedy; Warren B. Cohen; Todd A. Schroeder
2007-01-01
Satellite sensors are well suited to monitoring changes on the Earth's surface through provision of consistent and repeatable measurements at a spatial scale appropriate for many processes causing change on the land surface. Here, we describe and test a new conceptual approach to change detection of forests using a dense temporal stack of Landsat Thematic Mapper (...
Enhancement of surface definition and gridding in the EAGLE code
NASA Technical Reports Server (NTRS)
Thompson, Joe F.
1991-01-01
Algorithms for smoothing of curves and surfaces for the EAGLE grid generation program are presented. The method uses an existing automated technique which detects undesirable geometric characteristics by using a local fairness criterion. The geometry entity is then smoothed by repeated removal and insertion of spline knots in the vicinity of the geometric irregularity. The smoothing algorithm is formulated for use with curves in Beta spline form and tensor product B-spline surfaces.
NDT of railway components using induction thermography
NASA Astrophysics Data System (ADS)
Netzelmann, U.; Walle, G.; Ehlen, A.; Lugin, S.; Finckbohner, M.; Bessert, S.
2016-02-01
Induction or eddy current thermography is used to detect surface cracks in ferritic steel. The technique is applied to detect surface cracks in rails from a moving test car. Cracks were detected at a train speed between 2 and 15 km/h. An automated demonstrator system for testing railway wheels after production is described. While the wheel is rotated, a robot guides the detection unit consisting of inductor and infrared camera over the surface.
Toward Expanding Tremor Observations in the Northern San Andreas Fault System in the 1990s
NASA Astrophysics Data System (ADS)
Damiao, L. G.; Dreger, D. S.; Nadeau, R. M.; Taira, T.; Guilhem, A.; Luna, B.; Zhang, H.
2015-12-01
The connection between tremor activity and active fault processes continues to expand our understanding of deep fault zone properties and deformation, the tectonic process, and the relationship of tremor to the occurrence of larger earthquakes. Compared to tremors in subduction zones, known tremor signals in California are ~5 to ~10 times smaller in amplitude and duration. These characteristics, in addition to scarce geographic coverage, lack of continuous data (e.g., before mid-2001 at Parkfield), and absence of instrumentation sensitive enough to monitor these events, have stifled tremor detection. The continuous monitoring of these events over a relatively short time period in limited locations may lead to a parochial view of the tremor phenomena and their relationship to fault, tectonic, and earthquake processes. To help overcome this, we have embarked on a project to expand the geographic and temporal scope of tremor observation along the Northern SAF system using available continuous seismic recordings from a broad array of hundreds of surface seismic stations from multiple seismic networks. Available data for most of these stations also extend back into the mid-1990s. Processing and analysis of tremor signal from this large and low signal-to-noise dataset requires a heavily automated, data-science type approach and specialized techniques for identifying and extracting reliable data. We report here on the automated, envelope-based methodology we have developed. Finally, we compare our catalog results with pre-existing tremor catalogs in the Parkfield area.
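An envelope-based detection pass of the kind described can be sketched as rectify-and-smooth followed by thresholding. The synthetic trace, window length, and threshold below are illustrative, not the catalog's tuned parameters:

```python
import numpy as np

def envelope(trace, smooth_n=50):
    """Moving-RMS amplitude envelope of a seismic trace."""
    kernel = np.ones(smooth_n) / smooth_n
    return np.sqrt(np.convolve(trace ** 2, kernel, mode="same"))

def detect_windows(env, threshold):
    """Sample indices where the envelope exceeds threshold (candidate tremor)."""
    return np.flatnonzero(env > threshold)

# Synthetic 100 Hz trace: weak harmonic "background" plus a 4 Hz tremor-like
# burst between t = 20 s and t = 30 s (all values illustrative).
fs = 100.0
t = np.arange(0, 60, 1 / fs)
background = 0.1 * np.sin(2 * np.pi * 13.7 * t)
burst = np.where((t > 20) & (t < 30), np.sin(2 * np.pi * 4.0 * t), 0.0)
env = envelope(background + burst)
hits = detect_windows(env, threshold=0.3)
```

A real pipeline would band-pass first and cross-check envelopes across stations, which is what makes a low signal-to-noise, hundreds-of-stations dataset tractable.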
NASA Technical Reports Server (NTRS)
Johnston, Patrick H.; Juarez, Peter D.
2017-01-01
Automated tow placement has become a widely used fabrication technique, especially for large aerospace structures. Robotic heads lay down strips (tows) of preimpregnated fiber along programmed paths. The intention is to lay adjacent tows abutted to one another, but sometimes a gap is left between a tow and the previously-placed tow. If a tow gap exists, it fills with resin during cure, forming a fiber-free volume. In immersion ultrasonic pulse-echo measurements of a cured laminate, the gap can be observed to produce a noticeable echo, without significantly attenuating the back-wall reflection of the laminate. To understand this behavior, we considered a one dimensional model of the composite laminate, with a thin layer having the ultrasonic sound speed and density of neat resin, sandwiched between two layers of material having the sound speed and density of fiber-reinforced composite and surrounded on both sides by water. Neglecting attenuation, we considered the transmission and reflection coefficients of each interface, as well as that of the thin resin layer. Using the initial water/composite reflection as a reference, we computed the relative magnitude of the back surface/water reflection in the presence and in the absence of a resin-only layer, as well as the relative magnitude of the reflection arising from a thin resin layer in composite. While the one-dimensional model did not fully match the measurements, it did qualitatively explain the observed behavior.
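The one-dimensional model can be sketched with the classic three-medium thin-layer reflection formula at normal incidence. The impedance and sound-speed values below are rough textbook-style assumptions for resin and carbon-fiber composite, not the paper's measured properties:

```python
import numpy as np

def interface_r(z1, z2):
    """Pressure reflection coefficient at a z1 -> z2 interface."""
    return (z2 - z1) / (z2 + z1)

def thin_layer_r(z_host, z_layer, c_layer, thickness, freq):
    """Reflection from a thin layer of impedance z_layer embedded in a host."""
    r12 = interface_r(z_host, z_layer)
    r21 = interface_r(z_layer, z_host)
    phase = np.exp(2j * np.pi * freq * 2 * thickness / c_layer)  # round trip
    return (r12 + r21 * phase) / (1 + r12 * r21 * phase)

# Assumed values: resin ~2.5 MRayl at 2500 m/s, composite ~4.5 MRayl.
Z_COMPOSITE, Z_RESIN, C_RESIN = 4.5e6, 2.5e6, 2500.0
r_gap = thin_layer_r(Z_COMPOSITE, Z_RESIN, C_RESIN, 50e-6, 5e6)
```

The formula reproduces the qualitative behavior in the abstract: a vanishingly thin resin layer reflects nothing, while a finite-thickness gap produces a noticeable but partial echo, leaving most energy to reach the back wall.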
NASA Astrophysics Data System (ADS)
Prat, O. P.; Nelson, B. R.; Stevens, S. E.; Nickl, E.; Seo, D. J.; Kim, B.; Zhang, J.; Qi, Y.
2015-12-01
The processing of radar-only precipitation via the reanalysis of the National Mosaic and Multi-Sensor Quantitative Precipitation Estimation (NMQ/Q2) system, based on the WSR-88D Next-Generation Radar (NEXRAD) network over the Continental United States (CONUS), is complete for the period 2002 to 2011. While this constitutes a unique opportunity to study precipitation processes at higher resolution than conventionally possible (1 km, 5 min), the long-term radar-only product needs to be merged with in-situ information in order to be suitable for hydrological, meteorological, and climatological applications. The radar-gauge merging is performed using rain gauge information at daily (Global Historical Climatology Network-Daily: GHCN-D), hourly (Hydrometeorological Automated Data System: HADS), and 5-min (Automated Surface Observing Systems: ASOS; Climate Reference Network: CRN) resolution. The challenges related to incorporating networks of differing resolution and quality to generate long-term, large-scale gridded estimates of precipitation are enormous. In that perspective, we are implementing techniques for merging the rain gauge datasets and the radar-only estimates, such as Inverse Distance Weighting (IDW), Simple Kriging (SK), Ordinary Kriging (OK), and Conditional Bias-Penalized Kriging (CBPK). An evaluation of the different radar-gauge merging techniques is presented, and we provide an estimate of uncertainty for the gridded estimates. In addition, comparisons with a suite of lower resolution QPEs derived from ground-based radar measurements (Stage IV) are provided in order to give a detailed picture of the improvements and remaining challenges.
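Of the merging techniques listed, IDW is the simplest to sketch: interpolate gauge-minus-radar differences to the analysis points and correct the radar field with them. The gauge locations, values, and power parameter below are illustrative:

```python
import numpy as np

def idw(points, values, query, power=2.0, eps=1e-12):
    """Inverse-distance-weighted interpolation of scattered values."""
    d = np.linalg.norm(query[:, None, :] - points[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)          # eps avoids division by zero
    return (w * values).sum(axis=1) / w.sum(axis=1)

# Hypothetical gauges (km coordinates) and gauge-minus-radar biases (mm).
gauges = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
bias = np.array([1.0, -0.5, 0.2])
grid = np.array([[0.0, 0.0], [5.0, 5.0]])     # two analysis points

correction = idw(gauges, bias, grid)
radar = np.array([12.0, 8.0])                 # radar-only estimates (mm)
merged = radar + correction
```

Kriging variants replace the fixed 1/d² weights with weights derived from a fitted variogram, which is what also yields the per-gridpoint uncertainty estimate mentioned in the abstract.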
Haralambieva, Iana H.; Ovsyannikova, Inna G.; O’Byrne, Megan; Pankratz, V. Shane; Jacobson, Robert M.; Poland, Gregory A.
2011-01-01
The measurement of measles-specific neutralizing antibodies, directed against the surface measles virus hemagglutinin and fusion proteins, is considered the gold standard in measles serology. We assessed functional measles-specific neutralizing antibody levels in a racially diverse cohort of 763 young healthy adolescents after receipt of two doses of measles-mumps-rubella vaccine, by the use of an automated plaque reduction microneutralization (PRMN) assay, and evaluated their relevance to protective antibody levels, as well as their associations with demographic and clinical variables. We also concurrently assessed measles-specific IFNγ Elispot responses and their relation to the observed antibody concentrations. The geometric mean titer for our cohort was 832 mIU/mL (95% CIs: 776; 891). Sixty-eight subjects (8.9%) had antibody concentrations of less than the protective threshold of 210 mIU/mL (corresponding to PRMN titer of 120; suggesting protection against symptomatic disease), and 177 subjects (23.2%) demonstrated persisting antibody concentrations above 1,841 mIU/mL (corresponding to PRMN titer of 1,052; suggesting total protection against viral infection), 7.4 years after vaccination, in the absence of wild-type virus boosting. The mean measles-specific IFNγ Elispot response for our cohort was 46 (95% CIs: 43; 49) IFNγ-positive spots per 200,000 cells with no relation of cellular immunity measures to the observed antibody concentrations. No significant associations between antibody titers and demographic and clinical variables, including gender and race, were observed in our study. In conclusion, in a large observational study of measles immunity, we used an automated high-throughput measles virus-specific neutralization assay to measure humoral immunity, and concurrently determined measles-specific cellular immunity to aid the assessment of potential susceptibility to measles in vaccinated populations. PMID:21539880
NASA Technical Reports Server (NTRS)
Dorsey, John T.; Jones, Thomas C.; Doggett, William R.; Roithmayr, Carlos M.; King, Bruce D.; Mikulas, Martin M.
2009-01-01
The objective of this paper is to describe and summarize the results of the development efforts for the Lunar Surface Manipulation System (LSMS) with respect to increasing its performance, operational versatility, and automation. Three primary areas of development are covered: the expansion of the operational envelope and versatility of the current LSMS test-bed, the design of a second generation LSMS, and the development of automation and remote control capability. The first generation LSMS, which has been designed, built, and tested in both lab and field settings, is shown to have increased range of motion and operational versatility. Features such as a fork lift mode, side grappling of payloads, digging and positioning of lunar regolith, and a variety of special end effectors are described. LSMS operational viability depends on being able to reposition its base from an initial position on the lander to a mobility chassis or fixed locations around the lunar outpost. Preliminary concepts are presented for the second generation LSMS design, which will perform this self-offload capability. Incorporating design improvements, the second generation will have longer reach and three times the payload capability, yet it will have approximately equivalent mass to the first generation. Lastly, this paper covers improvements being made to the control system of the LSMS test-bed, which is currently operated using joint velocity control with visual cues. These improvements include joint angle sensors, inverse kinematics, and automated controls.
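As a generic illustration of the inverse-kinematics capability mentioned (not the LSMS geometry or flight code), here is the closed-form solution for a planar two-link arm: given a tip target and link lengths, solve the two joint angles that joint-angle sensors would then be driven to:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Elbow-down joint angles reaching (x, y); raises if out of reach."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    t2 = math.acos(c2)                                   # elbow angle
    t1 = math.atan2(y, x) - math.atan2(l2 * math.sin(t2),
                                       l1 + l2 * math.cos(t2))
    return t1, t2

def forward(t1, t2, l1, l2):
    """Forward kinematics, used to verify an IK solution."""
    return (l1 * math.cos(t1) + l2 * math.cos(t1 + t2),
            l1 * math.sin(t1) + l2 * math.sin(t1 + t2))

angles = two_link_ik(1.2, 0.8, 1.0, 1.0)   # hypothetical 1 m links
```

Replacing visual-cue joint velocity commands with this kind of tip-position control is the usual payoff of adding joint sensors plus an IK layer.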
Driver Vigilance in Automated Vehicles: Hazard Detection Failures Are a Matter of Time.
Greenlee, Eric T; DeLucia, Patricia R; Newton, David C
2018-06-01
The primary aim of the current study was to determine whether monitoring the roadway for hazards during automated driving results in a vigilance decrement. Although automated vehicles are relatively novel, the nature of human-automation interaction within them has the classic hallmarks of a vigilance task. Drivers must maintain attention for prolonged periods of time to detect and respond to rare and unpredictable events, for example, roadway hazards that automation may be ill equipped to detect. Given the similarity with traditional vigilance tasks, we predicted that drivers of a simulated automated vehicle would demonstrate a vigilance decrement in hazard detection performance. Participants "drove" a simulated automated vehicle for 40 minutes. During that time, their task was to monitor the roadway for roadway hazards. As predicted, hazard detection rate declined precipitously, and reaction times slowed as the drive progressed. Further, subjective ratings of workload and task-related stress indicated that sustained monitoring is demanding and distressing and it is a challenge to maintain task engagement. Monitoring the roadway for potential hazards during automated driving results in workload, stress, and performance decrements similar to those observed in traditional vigilance tasks. To the degree that vigilance is required of automated vehicle drivers, performance errors and associated safety risks are likely to occur as a function of time on task. Vigilance should be a focal safety concern in the development of vehicle automation.
Brosnahan, Michael L; Ralston, David K; Fischer, Alexis D; Solow, Andrew R; Anderson, Donald M
2017-11-01
New resting cyst production is crucial for the survival of many microbial eukaryotes including phytoplankton that cause harmful algal blooms. Production in situ has previously been estimated through sediment trap deployments, but here was instead assessed through estimation of the total number of planktonic cells and new resting cysts produced by a localized, inshore bloom of Alexandrium catenella, a dinoflagellate that is a globally important cause of paralytic shellfish poisoning. Our approach utilizes high frequency, automated water monitoring, weekly observation of new cyst production, and pre- and post-bloom spatial surveys of total resting cyst abundance. Through this approach, new cyst recruitment within the study area was shown to account for at least 10.9% ± 2.6% (SE) of the bloom's decline, ~5× greater than reported from comparable, sediment trap based studies. The observed distribution and timing of new cyst recruitment indicate that: (1) planozygotes, the immediate precursor to cysts in the life cycle, migrate nearer to the water surface than other planktonic stages and (2) encystment occurs after planozygote settlement on bottom sediments. Near surface localization by planozygotes explains the ephemerality of red surface water discoloration by A. catenella blooms, and also enhances the dispersal of new cysts. Following settlement, bioturbation and perhaps active swimming promote sediment infiltration by planozygotes, reducing the extent of cyst redistribution between blooms. The concerted nature of bloom sexual induction, especially in the context of an observed upper limit to A. catenella bloom intensities and heightened susceptibility of planozygotes to the parasite Amoebophrya, is also discussed.
New Information Technologies: Some Observations on What Is in Store for Libraries.
ERIC Educational Resources Information Center
Black, John B.
This outline of new technological developments and their applications in the library and information world considers innovations in three areas: automation, telecommunications, and the publishing industry. There is mention of the growth of online systems, minicomputers, microcomputers, and word processing; the falling costs of automation; the…
Ji, Yong Woo; Lee, Jeihoon; Lee, Hun; Seo, Kyoung Yul; Kim, Eung Kweon; Kim, Tae-Im
2017-02-01
To investigate automated values from an advanced corneal topographer with a built-in real keratometer, color camera, and ocular surface interferometer for the evaluation of non-Sjögren dry eye syndrome (NSDES) with meibomian gland dysfunction (MGD). Sixty-four patients (64 eyes) diagnosed with NSDES with MGD were enrolled. All eyes were evaluated using the Ocular Surface Disease Index (OSDI), fluorescence staining score, tear film breakup time (TBUT), Schirmer test, and MGD grade. Noninvasive Keratograph average tear film breakup time (NIKBUTav), tear meniscus height (TMHk), meibomian gland (MG) dropout grade, and lipid layer thickness (LLT) using interferometry were measured. Among automated indexes, NIKBUTav (mean 7.68 ± 4.07 s) and the MG dropout grade (mean 1.0 ± 0.5) significantly correlated with the OSDI (mean 40.6 ± 22.9) (r = -0.337, P = 0.006; and r = 0.201, P = 0.023, respectively), as did all conventional indicators, except the Schirmer score (mean 9.1 ± 5.9 mm). TMHk (mean 0.21 ± 0.18 mm) had significant correlation with the Schirmer score, the staining score (mean 1.2 ± 0.7), TBUT (mean 3.8 ± 1.8 s), and NIKBUTav (r = 0.298, P = 0.007; r = -0.268, P = 0.016; r = 0.459, P < 0.001; and r = 0.439, P < 0.001, respectively), but not any MGD indicator, even the MG dropout grade. NIKBUTav showed significant correlations with all clinical parameters and other automated values, except the Schirmer score and LLT (mean 83.94 ± 20.82 nm) (all |r| ≥ 0.25 and P < 0.01). The MG dropout grade highly correlated with all indexes except TMHk (all |r| ≥ 0.25 and P < 0.05). LLT was significantly associated with TBUT, MGD grade (mean 2.0 ± 0.7), and MG dropout grade (r = 0.219, P = 0.047; r = -0.221, P = 0.039; and r = 0.433, P < 0.001, respectively), although it was not related to patient symptoms.
Automated noninvasive measurements using an advanced corneal topographer and LLT measured with an ocular surface interferometer can be alternatives to conventional methods to evaluate tear conditions on the ocular surface; the former device can provide information about conformational MG changes in NSDES with MGD.
NASA Technical Reports Server (NTRS)
Flanigan, Lee A.; Tamir, David; Weeks, Jack L.; Mcclure, Sidney R.; Kimbrough, Andrew G.
1994-01-01
This paper wrestles with the on-orbit operational challenges introduced by the proposed Space Construction, Repair, and Maintenance (SCRAM) tool kit for Extra-Vehicular Activity (EVA). SCRAM undertakes a new challenging series of on-orbit tasks in support of the near-term Hubble Space Telescope, Extended Duration Orbiter, Long Duration Orbiter, Space Station Freedom, other orbital platforms, and even the future manned Lunar/Mars missions. These new EVA tasks involve welding, brazing, cutting, coating, heat-treating, and cleaning operations. Anticipated near-term EVA-SCRAM applications include construction of fluid lines and structural members, repair of punctures by orbital debris, refurbishment of surfaces eroded by atomic oxygen, and cleaning of optical, solar panel, and high emissivity radiator surfaces which have been degraded by contaminants. Future EVA-SCRAM applications are also examined, involving mass production tasks automated with robotics and artificial intelligence, for construction of large truss, aerobrake, and reactor shadow shield structures. Realistically achieving EVA-SCRAM is examined by addressing manual, teleoperated, semi-automated, and fully-automated operation modes. The operational challenges posed by EVA-SCRAM tasks are reviewed with respect to capabilities of existing and upcoming EVA systems, such as the Extravehicular Mobility Unit, the Shuttle Remote Manipulating System, the Dexterous End Effector, and the Servicing Aid Tool.
Automated segmentation of oral mucosa from wide-field OCT images (Conference Presentation)
NASA Astrophysics Data System (ADS)
Goldan, Ryan N.; Lee, Anthony M. D.; Cahill, Lucas; Liu, Kelly; MacAulay, Calum; Poh, Catherine F.; Lane, Pierre
2016-03-01
Optical Coherence Tomography (OCT) can discriminate morphological tissue features important for oral cancer detection such as the presence or absence of basement membrane and epithelial thickness. We previously reported an OCT system employing a rotary-pullback catheter capable of in vivo, rapid, wide-field (up to 90 × 2.5 mm²) imaging in the oral cavity. Due to the size and complexity of these OCT data sets, rapid automated image processing software that immediately displays important tissue features is required to facilitate prompt bedside clinical decisions. We present an automated segmentation algorithm capable of detecting the epithelial surface and basement membrane in 3D OCT images of the oral cavity. The algorithm was trained using volumetric OCT data acquired in vivo from a variety of tissue types and histology-confirmed pathologies spanning normal through cancer (8 sites, 21 patients). The algorithm was validated using a second dataset of similar size and tissue diversity. We demonstrate application of the algorithm to an entire OCT volume to map epithelial thickness, and detection of the basement membrane, over the tissue surface. These maps may be clinically useful for delineating pre-surgical tumor margins, or for biopsy site guidance.
Automated breast segmentation in ultrasound computer tomography SAFT images
NASA Astrophysics Data System (ADS)
Hopp, T.; You, W.; Zapf, M.; Tan, W. Y.; Gemmeke, H.; Ruiter, N. V.
2017-03-01
Ultrasound Computer Tomography (USCT) is a promising new imaging system for breast cancer diagnosis. An essential step before further processing is to remove the water background from the reconstructed images. In this paper we present a fully-automated image segmentation method based on three-dimensional active contours. The active contour method is extended by applying gradient vector flow and encoding the USCT aperture characteristics as additional weighting terms. A surface detection algorithm based on a ray model is developed to initialize the active contour, which is iteratively deformed to capture the breast outline in USCT reflection images. The evaluation with synthetic data showed that the method is able to cope with noisy images, and is not influenced by the position of the breast and the presence of scattering objects within the breast. The proposed method was applied to 14 in-vivo images resulting in an average surface deviation from a manual segmentation of 2.7 mm. We conclude that automated segmentation of USCT reflection images is feasible and produces results comparable to a manual segmentation. By applying the proposed method, reproducible segmentation results can be obtained without manual interaction by an expert.
Benefits of automated surface decontamination of a radioiodine ward.
Westcott, Eliza; Broadhurst, Alicia; Crossley, Steven; Lee, Lloyd; Phan, Xuyen; Scharli, Rainer; Xu, Yan
2012-02-01
A floor-washing robot has been acquired to assist physicists with decontamination of radioiodine therapy ward rooms after discharge of the patient at Sir Charles Gairdner Hospital. The effectiveness of the robot in decontaminating the ward has been evaluated. A controlled experiment was performed by deliberately contaminating a polyvinyl chloride flooring offcut with 131I, followed by automated decontamination with the robot. The extent of fixed and removable contamination was assessed before and after decontamination by two methods: (1) direct Geiger-Mueller counting and (2) beta-counting wipe tests. Surface contamination was also assessed in situ on the ward by Geiger-Mueller counting and wipe testing. Contamination maps confirmed that contamination was removed rather than spread around by the robot. Wipe testing revealed that the robot was successful in clearing approximately 60-80% of removable contamination. The robotic floor-washing device was considered suitable to provide effective automated decontamination of the radioiodine ward. In addition, the robot affords other benefits: the time spent by the physicists decontaminating the room is greatly reduced, offering financial and occupational safety and health benefits. The robot has also found utility in other decontamination applications in the healthcare environment.
A Simple Method for Automated Equilibration Detection in Molecular Simulations.
Chodera, John D
2016-04-12
Molecular simulations intended to compute equilibrium properties are often initiated from configurations that are highly atypical of equilibrium samples, a practice which can generate a distinct initial transient in mechanical observables computed from the simulation trajectory. Traditional practice in simulation data analysis recommends this initial portion be discarded to equilibration, but no simple, general, and automated procedure for this process exists. Here, we suggest a conceptually simple automated procedure that does not make strict assumptions about the distribution of the observable of interest in which the equilibration time is chosen to maximize the number of effectively uncorrelated samples in the production timespan used to compute equilibrium averages. We present a simple Python reference implementation of this procedure and demonstrate its utility on typical molecular simulation data.
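The selection rule described above (discard an initial span t0, then keep the t0 that maximizes the number of effectively uncorrelated samples N_eff = (n - t0)/g, where g is the statistical inefficiency of the production region) can be sketched in a few lines of Python. This is an illustrative re-implementation, not the paper's reference code, and the simple windowed-autocorrelation estimator of g is an assumption:

```python
def statistical_inefficiency(x):
    """g = 1 + 2 * sum of positive lagged autocorrelations (triangular
    window), truncated at the first non-positive term."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    if var == 0.0:
        return 1.0
    g = 1.0
    for t in range(1, n - 1):
        c = sum((x[i] - mean) * (x[i + t] - mean)
                for i in range(n - t)) / ((n - t) * var)
        if c <= 0.0:
            break  # truncate the sum once correlations die out
        g += 2.0 * c * (1.0 - t / n)
    return max(1.0, g)

def detect_equilibration(x):
    """Choose the equilibration time t0 that maximizes the number of
    effectively uncorrelated samples N_eff = (n - t0) / g in x[t0:]."""
    n = len(x)
    best_t0, best_neff = 0, 0.0
    for t0 in range(n - 2):
        neff = (n - t0) / statistical_inefficiency(x[t0:])
        if neff > best_neff:
            best_t0, best_neff = t0, neff
    return best_t0, best_neff
```

On a trajectory with a strong initial transient followed by stationary noise, the maximizer lands at the end of the transient, which is the behavior the abstract describes.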
Advanced Earth Observation System Instrumentation Study (AEOSIS)
NASA Technical Reports Server (NTRS)
Var, R. E.
1976-01-01
The feasibility, practicality, and cost are investigated for establishing a national system or grid of artificial landmarks suitable for automated (near real time) recognition in the multispectral scanner imagery data from an earth observation satellite (EOS). The intended use of such landmarks, for orbit determination and improved mapping accuracy, is reviewed. The desirability of using xenon searchlight landmarks for this purpose is explored theoretically and by means of experimental results obtained with LANDSAT 1 and LANDSAT 2. These results are used, in conjunction with the demonstrated efficiency of an automated detection scheme, to determine the size and cost of a xenon searchlight that would be suitable for an EOS Searchlight Landmark Station (SLS), and to facilitate the development of a conceptual design for an automated and environmentally protected EOS SLS.
A simple method for automated equilibration detection in molecular simulations
Chodera, John D.
2016-01-01
Molecular simulations intended to compute equilibrium properties are often initiated from configurations that are highly atypical of equilibrium samples, a practice which can generate a distinct initial transient in mechanical observables computed from the simulation trajectory. Traditional practice in simulation data analysis recommends this initial portion be discarded to equilibration, but no simple, general, and automated procedure for this process exists. Here, we suggest a conceptually simple automated procedure that does not make strict assumptions about the distribution of the observable of interest, in which the equilibration time is chosen to maximize the number of effectively uncorrelated samples in the production timespan used to compute equilibrium averages. We present a simple Python reference implementation of this procedure, and demonstrate its utility on typical molecular simulation data. PMID:26771390
NASA Astrophysics Data System (ADS)
Gold, Ryan; Reitman, Nadine; Briggs, Richard; Barnhart, William; Hayes, Gavin
2015-04-01
The 24 September 2013 Mw7.7 Balochistan, Pakistan earthquake ruptured a ~200 km-long stretch of the 60° ± 15° northwest-dipping Hoshab fault in southern Pakistan. The earthquake is notable because it produced the second-largest lateral surface displacement observed for a continental strike-slip earthquake. Surface displacements and geodetic and teleseismic inversions indicate that peak slip occurred within the upper 0-3 km of the crust. To explore along-strike and fault-perpendicular surface deformation patterns, we remotely mapped the surface trace of the rupture and measured its surface deformation using high-resolution (0.5 m) pre- and post-event satellite imagery. Post-event images were collected 7-114 days following the earthquake, so our analysis captures the sum of both the coseismic and post-seismic (e.g., afterslip) deformation. We document peak left-lateral offset of ~15 m using 289 near-field (±10 m from fault) laterally offset piercing points, such as streams, terrace risers, and roads. We characterize off-fault deformation by measuring the medium- (±200 m from fault) and far-field (±10 km from fault) displacement using manual (242 measurements) and automated image cross-correlation methods. Off-fault peak lateral displacement values (medium- and far-field) are ~16 m and commonly exceed the on-fault displacement magnitudes. Our observations suggest that coseismic surface displacement typically increases with distance away from the surface trace of the fault; however, the majority of surface displacement is within 100 m of the primary fault trace and is most localized on sections of the rupture exhibiting narrow (<5 m) zones of observable surface deformation. Furthermore, the near-field displacement measurements account for, on average, only 73% of the total coseismic displacement field, and the pattern is highly heterogeneous. This analysis highlights the importance of identifying paleoseismic field study sites (e.g., trenches) that span fault sections with narrow deformation zones in order to capture the full deformation field. Our results imply that hazard analyses based on geologically-determined fault slip rates (e.g., near-field) should consider the significant and heterogeneous mismatch we document between on- and off-fault coseismic deformation.
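The automated image cross-correlation used here to recover far-field displacements can be illustrated in one dimension: slide one intensity profile past the other and keep the integer offset that maximizes the normalized cross-correlation. A minimal pure-Python sketch (illustrative only; the study correlated 2-D satellite image patches, and all names below are ours):

```python
def best_shift(ref, moved, max_shift):
    """Return the integer displacement d (in samples) such that
    moved[i + d] ~ ref[i], chosen to maximize the normalized
    cross-correlation over d in [-max_shift, max_shift]."""
    def ncc(a, b):
        # Normalized cross-correlation of two equal-length sequences.
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        da = sum((x - ma) ** 2 for x in a) ** 0.5
        db = sum((y - mb) ** 2 for y in b) ** 0.5
        return num / (da * db) if da > 0 and db > 0 else 0.0
    best_d, best_score = 0, -2.0
    for d in range(-max_shift, max_shift + 1):
        # Overlapping windows of ref and moved for this trial offset.
        if d >= 0:
            a, b = ref[:len(ref) - d], moved[d:]
        else:
            a, b = ref[-d:], moved[:len(moved) + d]
        score = ncc(a, b)
        if score > best_score:
            best_d, best_score = d, score
    return best_d
```

In practice the same search is run on image tiles in both axes, and sub-pixel precision is obtained by interpolating the correlation peak.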
Tools for Coordinated Planning Between Observatories
NASA Technical Reports Server (NTRS)
Jones, Jeremy; Fishman, Mark; Grella, Vince; Kerbel, Uri; Maks, Lori; Misra, Dharitri; Pell, Vince; Powers, Edward I. (Technical Monitor)
2001-01-01
With the realization of NASA's era of great observatories, there are now more than three space-based telescopes operating in different wavebands. This situation provides astronomers with a unique opportunity to simultaneously observe with multiple observatories. Yet scheduling multiple observatories simultaneously is highly inefficient when compared to observations using a single observatory. Thus, programs using multiple observatories are limited not due to scientific restrictions, but due to operational inefficiencies. At present, multi-observatory programs are conducted by submitting observing proposals separately to each concerned observatory. To assure that the proposed observations can be scheduled, each observatory's staff has to check that the observations are valid and meet all the constraints for their own observatory; in addition, they have to verify that the observations satisfy the constraints of the other observatories. Thus, coordinated observations require painstaking manual collaboration among the observatory staff at each observatory. Due to the lack of automated tools for coordinated observations, this process is time-consuming, error-prone, and the outcome of the requests is not certain until the very end. To increase observatory operations efficiency, such manpower-intensive processes need to undergo re-engineering. To overcome this critical deficiency, Goddard Space Flight Center's Advanced Architectures and Automation Branch is developing a prototype effort called the Visual Observation Layout Tool (VOLT). The main objective of the VOLT project is to provide visual tools to help automate the planning of coordinated observations by multiple astronomical observatories, as well as to increase the scheduling probability of all observations.
S3D: An interactive surface grid generation tool
NASA Technical Reports Server (NTRS)
Luh, Raymond Ching-Chung; Pierce, Lawrence E.; Yip, David
1992-01-01
S3D, an interactive software tool for surface grid generation, is described. S3D provides the means with which a geometry definition based either on a discretized curve set or a rectangular set can be quickly processed towards the generation of a surface grid for computational fluid dynamics (CFD) applications. This is made possible as a result of implementing commonly encountered surface gridding tasks in an environment with a highly efficient and user friendly graphical interface. Some of the more advanced features of S3D include surface-surface intersections, optimized surface domain decomposition and recomposition, and automated propagation of edge distributions to surrounding grids.
Stacked endoplasmic reticulum sheets are connected by helicoidal membrane motifs.
Terasaki, Mark; Shemesh, Tom; Kasthuri, Narayanan; Klemm, Robin W; Schalek, Richard; Hayworth, Kenneth J; Hand, Arthur R; Yankova, Maya; Huber, Greg; Lichtman, Jeff W; Rapoport, Tom A; Kozlov, Michael M
2013-07-18
The endoplasmic reticulum (ER) often forms stacked membrane sheets, an arrangement that is likely required to accommodate a maximum of membrane-bound polysomes for secretory protein synthesis. How sheets are stacked is unknown. Here, we used improved staining and automated ultrathin sectioning electron microscopy methods to analyze stacked ER sheets in neuronal cells and secretory salivary gland cells of mice. Our results show that stacked ER sheets form a continuous membrane system in which the sheets are connected by twisted membrane surfaces with helical edges of left- or right-handedness. The three-dimensional structure of tightly stacked ER sheets resembles a parking garage, in which the different levels are connected by helicoidal ramps. A theoretical model explains the experimental observations and indicates that the structure corresponds to a minimum of elastic energy of sheet edges and surfaces. The structure allows the dense packing of ER sheets in the restricted space of a cell. Copyright © 2013 Elsevier Inc. All rights reserved.
Coplen, T.B.; Neiman, P.J.; White, A.B.; Landwehr, J.M.; Ralph, F.M.; Dettinger, M.D.
2008-01-01
With a new automated precipitation collector we measured a remarkable decrease of 51‰ in the hydrogen isotope ratio (δ2H) of precipitation over a 60-minute period during the landfall of an extratropical cyclone along the California coast on 21 March 2005. The rapid drop in δ2H occurred as precipitation generation transitioned from a shallow to a much deeper cloud layer, in accord with synoptic-scale ascent and deep "seeder-feeder" precipitation. Such unexpected δ2H variations can substantially impact widely used isotope-hydrograph methods. From extreme δ2H values of -26 and -78‰, we calculate precipitation temperatures of 9.7 and -4.2°C using an adiabatic condensation isotope model, in good agreement with temperatures estimated from surface observations and radar data. This model indicates that 60 percent of the moisture was precipitated during ascent as temperature decreased from 15°C at the ocean surface to -4°C above the measurement site.
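The adiabatic condensation model above is closely related to Rayleigh distillation, in which the isotope ratio of the remaining vapor evolves as R/R0 = f**(alpha - 1) for vapor fraction f and liquid-vapor fractionation factor alpha. As a hedged sketch, assuming a constant alpha = 1.1 (purely illustrative; in the paper's model the fractionation varies with temperature along the ascent):

```python
def remaining_vapor_fraction(delta0, delta, alpha):
    """Invert the Rayleigh relation R/R0 = f**(alpha - 1) for f, the
    fraction of the initial vapor still airborne, from per-mil delta
    values. A constant alpha cancels out of the precipitation/vapor
    conversion, so precipitation deltas can be used directly."""
    ratio = (delta / 1000.0 + 1.0) / (delta0 / 1000.0 + 1.0)
    return ratio ** (1.0 / (alpha - 1.0))

# Observed extremes: precipitation delta-2H falling from -26 to -78 per mil.
f = remaining_vapor_fraction(-26.0, -78.0, alpha=1.1)
print(f"vapor remaining: {f:.2f}, precipitated: {1.0 - f:.2f}")
```

With this constant-alpha assumption roughly 40% of the vapor rains out, less than the 60 percent the paper obtains from its full temperature-dependent model; the sketch only illustrates the inversion step.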
NASA Astrophysics Data System (ADS)
Ryan, J. C.; Hubbard, A.; Irvine-Fynn, T. D.; Doyle, S. H.; Cook, J. M.; Stibal, M.; Box, J. E.
2017-06-01
Calibration and validation of satellite-derived ice sheet albedo data require high-quality, in situ measurements commonly acquired by up and down facing pyranometers mounted on automated weather stations (AWS). However, direct comparison between ground and satellite-derived albedo can only be justified when the measured surface is homogeneous at the length-scale of both satellite pixel and in situ footprint. Here we use digital imagery acquired by an unmanned aerial vehicle to evaluate point-to-pixel albedo comparisons across the western, ablating margin of the Greenland Ice Sheet. Our results reveal that in situ measurements overestimate albedo by up to 0.10 at the end of the melt season because the ground footprints of AWS-mounted pyranometers are insufficient to capture the spatial heterogeneity of the ice surface as it progressively ablates and darkens. Statistical analysis of 21 AWS across the entire Greenland Ice Sheet reveals that almost half suffer from this bias, including some AWS located within the wet snow zone.
NASA Astrophysics Data System (ADS)
Ryan, J.; Hubbard, A., II; Irvine-Fynn, T. D.; Doyle, S. H.; Cook, J.; Stibal, M.; Smith, L. C.; Box, J. E.
2017-12-01
Calibration and validation of satellite-derived ice sheet albedo data require high-quality, in situ measurements commonly acquired by up and down facing pyranometers mounted on automated weather stations (AWS). However, direct comparison between ground and satellite-derived albedo can only be justified when the measured surface is homogeneous at the length-scale of both satellite pixel and in situ footprint. We used digital imagery acquired by an unmanned aerial vehicle to evaluate point-to-pixel albedo comparisons across the western, ablating margin of the Greenland Ice Sheet. Our results reveal that in situ measurements overestimate albedo by up to 0.10 at the end of the melt season because the ground footprints of AWS-mounted pyranometers are insufficient to capture the spatial heterogeneity of the ice surface as it progressively ablates and darkens. Statistical analysis of 21 AWS across the entire Greenland Ice Sheet reveals that almost half suffer from this bias, including some AWS located within the wet snow zone.
A volumetric pulmonary CT segmentation method with applications in emphysema assessment
NASA Astrophysics Data System (ADS)
Silva, José Silvestre; Silva, Augusto; Santos, Beatriz S.
2006-03-01
A segmentation method is a mandatory pre-processing step in many automated or semi-automated analysis tasks such as region identification and densitometric analysis, or even for 3D visualization purposes. In this work we present a fully automated volumetric pulmonary segmentation algorithm based on intensity discrimination and morphologic procedures. Our method first identifies the trachea as well as primary bronchi and then the pulmonary region is identified by applying a threshold and morphologic operations. When both lungs are in contact, additional procedures are performed to obtain two separated lung volumes. To evaluate the performance of the method, we compared contours extracted from 3D lung surfaces with reference contours, using several figures of merit. Results show that the worst case generally occurs at the middle sections of high resolution CT exams, due to the presence of aerial and vascular structures. Nevertheless, the average error is inferior to the average error associated with radiologist inter-observer variability, which suggests that our method produces lung contours similar to those drawn by radiologists. The information created by our segmentation algorithm is used by an identification and representation method in pulmonary emphysema that also classifies emphysema according to its severity degree. Two clinically proved thresholds are applied which identify regions with severe emphysema, and with highly severe emphysema. Based on this thresholding strategy, an application for volumetric emphysema assessment was developed offering new display paradigms concerning the visualization of classification results. This framework is easily extendable to accommodate other classifiers, namely those related to texture-based segmentation, as is often the case with interstitial diseases.
The Objective Identification and Quantification of Interstitial Lung Abnormalities in Smokers.
Ash, Samuel Y; Harmouche, Rola; Ross, James C; Diaz, Alejandro A; Hunninghake, Gary M; Putman, Rachel K; Onieva, Jorge; Martinez, Fernando J; Choi, Augustine M; Lynch, David A; Hatabu, Hiroto; Rosas, Ivan O; Estepar, Raul San Jose; Washko, George R
2017-08-01
Previous investigation suggests that visually detected interstitial changes in the lung parenchyma of smokers are highly clinically relevant and predict outcomes, including death. Visual subjective analysis to detect these changes is time-consuming, insensitive to subtle changes, and requires training to enhance reproducibility. Objective detection of such changes could provide a method of disease identification without these limitations. The goal of this study was to develop and test a fully automated image processing tool to objectively identify radiographic features associated with interstitial abnormalities in the computed tomography scans of a large cohort of smokers. An automated tool that uses local histogram analysis combined with distance from the pleural surface was used to detect radiographic features consistent with interstitial lung abnormalities in computed tomography scans from 2257 individuals from the Genetic Epidemiology of COPD study, a longitudinal observational study of smokers. The sensitivity and specificity of this tool was determined based on its ability to detect the visually identified presence of these abnormalities. The tool had a sensitivity of 87.8% and a specificity of 57.5% for the detection of interstitial lung abnormalities, with a c-statistic of 0.82, and was 100% sensitive and 56.7% specific for the detection of the visual subtype of interstitial abnormalities called fibrotic parenchymal abnormalities, with a c-statistic of 0.89. In smokers, a fully automated image processing tool is able to identify those individuals who have interstitial lung abnormalities with moderate sensitivity and specificity. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
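The sensitivity, specificity, and c-statistic quoted above are straightforward to compute from paired visual labels and tool outputs. A small self-contained sketch (variable names are ours, and the toy data below is invented for illustration):

```python
def sens_spec(truth, pred):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    tp = sum(1 for t, p in zip(truth, pred) if t and p)
    fn = sum(1 for t, p in zip(truth, pred) if t and not p)
    tn = sum(1 for t, p in zip(truth, pred) if not t and not p)
    fp = sum(1 for t, p in zip(truth, pred) if not t and p)
    return tp / (tp + fn), tn / (tn + fp)

def c_statistic(truth, score):
    """AUC as the probability that a randomly chosen positive case scores
    higher than a randomly chosen negative case (ties count one half)."""
    pos = [s for t, s in zip(truth, score) if t]
    neg = [s for t, s in zip(truth, score) if not t]
    wins = sum(1.0 if p > q else 0.5 if p == q else 0.0
               for p in pos for q in neg)
    return wins / (len(pos) * len(neg))
```

Thresholding a continuous tool score trades sensitivity against specificity, which is why the abstract reports both alongside the threshold-free c-statistic.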
Objective automated quantification of fluorescence signal in histological sections of rat lens.
Talebizadeh, Nooshin; Hagström, Nanna Zhou; Yu, Zhaohua; Kronschläger, Martin; Söderberg, Per; Wählby, Carolina
2017-08-01
Visual quantification and classification of fluorescent signals is the gold standard in microscopy. The purpose of this study was to develop an automated method to delineate cells and to quantify expression of fluorescent signal of biomarkers in each nucleus and cytoplasm of lens epithelial cells in a histological section. A region of interest representing the lens epithelium was manually demarcated in each input image. Thereafter, individual cell nuclei within the region of interest were automatically delineated based on watershed segmentation and thresholding with an algorithm developed in Matlab™. Fluorescence signal was quantified within nuclei, cytoplasms and juxtaposed backgrounds. The classification of cells as labelled or not labelled was based on comparison of the fluorescence signal within cells with local background. The classification rule was thereafter optimized as compared with visual classification of a limited dataset. The performance of the automated classification was evaluated by asking 11 independent blinded observers to classify all cells (n = 395) in one lens image. Time consumed by the automatic algorithm and visual classification of cells was recorded. On an average, 77% of the cells were correctly classified as compared with the majority vote of the visual observers. The average agreement among visual observers was 83%. However, variation among visual observers was high, and agreement between two visual observers was as low as 71% in the worst case. Automated classification was on average 10 times faster than visual scoring. The presented method enables objective and fast detection of lens epithelial cells and quantification of expression of fluorescent signal with an accuracy comparable with the variability among visual observers. © 2017 International Society for Advancement of Cytometry.
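The classification rule described, comparing each cell's fluorescence with its juxtaposed local background, can be sketched as a simple threshold test. The cutoff k below is an illustrative stand-in for the study's optimized rule, and the pixel lists are invented:

```python
def is_labelled(cell_pixels, background_pixels, k=2.0):
    """Classify a cell as labelled when its mean signal exceeds the local
    background mean by more than k background standard deviations."""
    n = len(background_pixels)
    bg_mean = sum(background_pixels) / n
    bg_std = (sum((v - bg_mean) ** 2 for v in background_pixels) / n) ** 0.5
    cell_mean = sum(cell_pixels) / len(cell_pixels)
    return cell_mean > bg_mean + k * bg_std
```

Using the local background rather than a global threshold makes the rule robust to staining and illumination gradients across the section, which is presumably why the authors compare against juxtaposed background.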
Automated Long - Term Scheduling for the SOFIA Airborne Observatory
NASA Technical Reports Server (NTRS)
Civeit, Thomas
2013-01-01
The NASA Stratospheric Observatory for Infrared Astronomy (SOFIA) is a joint US/German project to develop and operate a gyro-stabilized 2.5-meter telescope in a Boeing 747SP. SOFIA's first science observations were made in December 2010. During 2011, SOFIA accomplished 30 flights in the "Early Science" program as well as a deployment to Germany. The new observing period, known as Cycle 1, is scheduled to begin in 2012. It includes 46 science flights grouped in four multi-week observing campaigns spread through a 13-month span. Automation of the flight scheduling process poses a major challenge for SOFIA mission operations: first, because it is needed to mitigate SOFIA's relatively high cost per unit observing time compared to space-borne missions; second, because automated scheduling techniques available for ground-based and space-based telescopes are inappropriate for an airborne observatory. Although serious attempts have been made in the past to solve part of the problem, until recently mission operations staff was still manually scheduling flights. We present in this paper a new automated solution for generating SOFIA long-term schedules that will be used in operations from the Cycle 1 observing period. We describe the constraints that should be satisfied to solve the SOFIA scheduling problem in the context of real operations. We establish key formulas required to efficiently calculate the aircraft course over ground when evaluating flight schedules. We describe the foundations of the SOFIA long-term scheduler, the constraint representation, and the random-search-based algorithm that generates observation and instrument schedules. Finally, we report on how the new long-term scheduler has been used in operations to date.
Automated data processing architecture for the Gemini Planet Imager Exoplanet Survey
NASA Astrophysics Data System (ADS)
Wang, Jason J.; Perrin, Marshall D.; Savransky, Dmitry; Arriaga, Pauline; Chilcote, Jeffrey K.; De Rosa, Robert J.; Millar-Blanchaer, Maxwell A.; Marois, Christian; Rameau, Julien; Wolff, Schuyler G.; Shapiro, Jacob; Ruffio, Jean-Baptiste; Maire, Jérôme; Marchis, Franck; Graham, James R.; Macintosh, Bruce; Ammons, S. Mark; Bailey, Vanessa P.; Barman, Travis S.; Bruzzone, Sebastian; Bulger, Joanna; Cotten, Tara; Doyon, René; Duchêne, Gaspard; Fitzgerald, Michael P.; Follette, Katherine B.; Goodsell, Stephen; Greenbaum, Alexandra Z.; Hibon, Pascale; Hung, Li-Wei; Ingraham, Patrick; Kalas, Paul; Konopacky, Quinn M.; Larkin, James E.; Marley, Mark S.; Metchev, Stanimir; Nielsen, Eric L.; Oppenheimer, Rebecca; Palmer, David W.; Patience, Jennifer; Poyneer, Lisa A.; Pueyo, Laurent; Rajan, Abhijith; Rantakyrö, Fredrik T.; Schneider, Adam C.; Sivaramakrishnan, Anand; Song, Inseok; Soummer, Remi; Thomas, Sandrine; Wallace, J. Kent; Ward-Duong, Kimberly; Wiktorowicz, Sloane J.
2018-01-01
The Gemini Planet Imager Exoplanet Survey (GPIES) is a multiyear direct imaging survey of 600 stars to discover and characterize young Jovian exoplanets and their environments. We have developed an automated data architecture to process and index all data related to the survey uniformly. An automated and flexible data processing framework, which we term the Data Cruncher, combines multiple data reduction pipelines (DRPs) together to process all spectroscopic, polarimetric, and calibration data taken with GPIES. With no human intervention, fully reduced and calibrated data products are available less than an hour after the data are taken to expedite follow up on potential objects of interest. The Data Cruncher can run on a supercomputer to reprocess all GPIES data in a single day as improvements are made to our DRPs. A backend MySQL database indexes all files, which are synced to the cloud, and a front-end web server allows for easy browsing of all files associated with GPIES. To help observers, quicklook displays show reduced data as they are processed in real time, and chatbots on Slack post observing information as well as reduced data products. Together, the GPIES automated data processing architecture reduces our workload, provides real-time data reduction, optimizes our observing strategy, and maintains a homogeneously reduced dataset to study planet occurrence and instrument performance.
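The file-indexing backend described above can be imagined in miniature with Python's stdlib sqlite3 standing in for the MySQL database. The table layout and column names here are invented for illustration, not GPIES's actual schema:

```python
import sqlite3

def make_index(conn):
    # One row per file; `stage` tracks how far reduction has progressed.
    conn.execute("""CREATE TABLE IF NOT EXISTS files (
        path TEXT PRIMARY KEY, obs_date TEXT, mode TEXT, stage TEXT)""")

def register(conn, path, obs_date, mode, stage):
    # Re-registering a path simply updates its reduction stage, so the
    # index stays consistent when data are reprocessed.
    conn.execute("INSERT OR REPLACE INTO files VALUES (?, ?, ?, ?)",
                 (path, obs_date, mode, stage))

def pending(conn):
    # Files a Data-Cruncher-style worker would still need to reduce.
    return [r[0] for r in conn.execute(
        "SELECT path FROM files WHERE stage = 'raw' ORDER BY obs_date")]
```

A worker loop that pulls from `pending()`, runs the appropriate reduction pipeline, and calls `register(..., stage='reduced')` captures the spirit of the automated architecture: the database, not a human, decides what gets processed next.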
Discrimination between smiling faces: Human observers vs. automated face analysis.
Del Líbano, Mario; Calvo, Manuel G; Fernández-Martín, Andrés; Recio, Guillermo
2018-05-11
This study investigated (a) how prototypical happy faces (with happy eyes and a smile) can be discriminated from blended expressions with a smile but non-happy eyes, depending on type and intensity of the eye expression; and (b) how smile discrimination differs for human perceivers versus automated face analysis, depending on affective valence and morphological facial features. Human observers categorized faces as happy or non-happy, or rated their valence. Automated analysis (FACET software) computed seven expressions (including joy/happiness) and 20 facial action units (AUs). Physical properties (low-level image statistics and visual saliency) of the face stimuli were controlled. Results revealed, first, that some blended expressions (especially, with angry eyes) had lower discrimination thresholds (i.e., they were identified as "non-happy" at lower non-happy eye intensities) than others (especially, with neutral eyes). Second, discrimination sensitivity was better for human perceivers than for automated FACET analysis. As an additional finding, affective valence predicted human discrimination performance, whereas morphological AUs predicted FACET discrimination. FACET can be a valid tool for categorizing prototypical expressions, but is currently more limited than human observers for discrimination of blended expressions. Configural processing facilitates detection of (in)congruences across regions, and thus detection of non-genuine smiling faces (due to non-happy eyes). Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Intelligent Systems Project: Results, Accomplishments and Impact on Science Missions.
NASA Astrophysics Data System (ADS)
Coughlan, J. C.
2005-12-01
The Intelligent Systems Project was responsible for much of NASA's programmatic investment in artificial intelligence and advanced information technologies. IS has completed three major project milestones which demonstrated increased capabilities in autonomy, human centered computing, and intelligent data understanding. Autonomy involves the ability of a robot to place an instrument on a remote surface with a single command cycle; human centered computing supported a collaborative, mission centric data and planning system for the Mars Exploration Rovers; and data understanding has produced key components of a terrestrial satellite observation system with automated modeling and data analysis capabilities. This paper summarizes the technology demonstrations and metrics which quantify and summarize these new technologies which are now available for future NASA missions.
Vortex lattice prediction of subsonic aerodynamics of hypersonic vehicle concepts
NASA Technical Reports Server (NTRS)
Pittman, J. L.; Dillon, J. L.
1977-01-01
The vortex lattice method introduced by Lamar and Gloss (1975) was applied to the prediction of subsonic aerodynamic characteristics of hypersonic body-wing configurations. The reliability of the method was assessed through comparison of the calculated and observed aerodynamic performances of two National Hypersonic Flight Research Facility craft at Mach 0.2. The investigation indicated that a vortex lattice model involving 120 or more panel elements can give good results for the lift and induced drag coefficients of the craft, as well as for the pitching moment at angles of attack below 10 to 15 deg. Automated processes for calculating the local slopes of mean-camber surfaces may also render the method suitable for use in preliminary design phases.
NASA Intelligent Systems Project: Results, Accomplishments and Impact on Science Missions
NASA Technical Reports Server (NTRS)
Coughlan, Joseph C.
2005-01-01
The Intelligent Systems Project was responsible for much of NASA's programmatic investment in artificial intelligence and advanced information technologies. IS completed three major project milestones that demonstrated increased capabilities in autonomy, human-centered computing, and intelligent data understanding. Autonomy involves the ability of a robot to place an instrument on a remote surface with a single command cycle. Human-centered computing supported a collaborative, mission-centric data and planning system for the Mars Exploration Rovers, and intelligent data understanding produced key components of a terrestrial satellite observation system with automated modeling and data analysis capabilities. This paper summarizes the technology demonstrations and metrics that quantify these new technologies, which are now available for future NASA missions.
Gold, Ryan D.; Reitman, Nadine G.; Briggs, Richard; Barnhart, William; Hayes, Gavin; Wilson, Earl M.
2015-01-01
The 24 September 2013 Mw7.7 Balochistan, Pakistan earthquake ruptured a ~ 200 km-long stretch of the Hoshab fault in southern Pakistan and produced the second-largest lateral surface displacement observed for a continental strike-slip earthquake. We remotely measured surface deformation associated with this event using high-resolution (0.5 m) pre- and post-event satellite optical imagery. We document left-lateral, near-field, on-fault offsets (< 10 m from fault) using 309 laterally offset piercing points, such as streams, terrace risers, and roads. Peak near-field displacement is 13.6 +2.5/−3.4 m. We characterize off-fault deformation by measuring medium- (< 350 m from fault) and far-field (> 350 m from fault) displacement using manual (259 measurements) and automated image cross-correlation methods, respectively. Off-fault peak lateral displacement values are ~ 15 m and exceed on-fault displacement magnitudes for ~ 85% of the rupture length. Our observations suggest that for this rupture, coseismic surface displacement typically increases with distance away from the surface trace of the fault; however, nearly 100% of total surface displacement occurs within a few hundred meters of the primary fault trace. Furthermore, off-fault displacement accounts for, on average, 28% of the total displacement but exhibits a highly heterogeneous along-strike pattern. The best agreement between near-field and far-field displacements generally corresponds to the narrowest fault zone widths. Our analysis demonstrates significant and heterogeneous mismatches between on- and off-fault coseismic deformation, and we conclude that this phenomenon should be considered in hazard models based on geologically determined on-fault slip rates.
NASA Astrophysics Data System (ADS)
Gold, Ryan D.; Reitman, Nadine G.; Briggs, Richard W.; Barnhart, William D.; Hayes, Gavin P.; Wilson, Earl
2015-10-01
The 24 September 2013 Mw7.7 Balochistan, Pakistan earthquake ruptured a ~ 200 km-long stretch of the Hoshab fault in southern Pakistan and produced the second-largest lateral surface displacement observed for a continental strike-slip earthquake. We remotely measured surface deformation associated with this event using high-resolution (0.5 m) pre- and post-event satellite optical imagery. We document left-lateral, near-field, on-fault offsets (< 10 m from fault) using 309 laterally offset piercing points, such as streams, terrace risers, and roads. Peak near-field displacement is 13.6 +2.5/-3.4 m. We characterize off-fault deformation by measuring medium- (< 350 m from fault) and far-field (> 350 m from fault) displacement using manual (259 measurements) and automated image cross-correlation methods, respectively. Off-fault peak lateral displacement values are ~ 15 m and exceed on-fault displacement magnitudes for ~ 85% of the rupture length. Our observations suggest that for this rupture, coseismic surface displacement typically increases with distance away from the surface trace of the fault; however, nearly 100% of total surface displacement occurs within a few hundred meters of the primary fault trace. Furthermore, off-fault displacement accounts for, on average, 28% of the total displacement but exhibits a highly heterogeneous along-strike pattern. The best agreement between near-field and far-field displacements generally corresponds to the narrowest fault zone widths. Our analysis demonstrates significant and heterogeneous mismatches between on- and off-fault coseismic deformation, and we conclude that this phenomenon should be considered in hazard models based on geologically determined on-fault slip rates.
NASA Astrophysics Data System (ADS)
Warren, K.; Eppes, M.-C.; Swami, S.; Garbini, J.; Putkonen, J.
2013-11-01
The rates and processes that lead to non-tectonic rock fracture on Earth's surface are widely debated but poorly understood. Few, if any, studies have made the direct observations of rock fracturing under natural conditions that are necessary to directly address this problem. An instrumentation design that enables concurrent high spatial and temporal monitoring resolution of (1) diurnal environmental conditions of a natural boulder and its surroundings in addition to (2) the fracturing of that boulder under natural full-sun exposure is described herein. The surface of a fluvially transported granite boulder was instrumented with (1) six acoustic emission (AE) sensors that record micro-crack associated, elastic wave-generated activity within the three-dimensional space of the boulder, (2) eight rectangular rosette foil strain gages to measure surface strain, (3) eight thermocouples to measure surface temperature, and (4) one surface moisture sensor. Additionally, a soil moisture probe and a full weather station that measures ambient temperature, relative humidity, wind speed, wind direction, barometric pressure, insolation, and precipitation were installed adjacent to the test boulder. AE activity was continuously monitored by one logger while all other variables were acquired by a separate logger every 60 s. The protocols associated with the instrumentation, data acquisition, and analysis are discussed in detail. During the first four months, the deployed boulder experienced almost 12 000 AE events, the majority of which occur in the afternoon when temperatures are decreasing. This paper presents preliminary data that illustrates data validity and typical patterns and behaviors observed. This system offers the potential to (1) obtain an unprecedented record of the natural conditions under which rocks fracture and (2) decipher the mechanical processes that lead to rock fracture at a variety of temporal scales under a range of natural conditions.
NASA Astrophysics Data System (ADS)
Warren, K.; Eppes, M.-C.; Swami, S.; Garbini, J.; Putkonen, J.
2013-07-01
The rates and processes that lead to non-tectonic rock fracture on the Earth's surface are widely debated but poorly understood. Few, if any, studies have made the direct observations of rock fracturing under natural conditions that are necessary to directly address this problem. An instrumentation design that enables concurrent high spatial and temporal monitoring resolution of (1) diurnal environmental conditions of a natural boulder and its surroundings in addition to (2) the fracturing of that boulder under natural full-sun exposure is described herein. The surface of a fluvially transported granite boulder was instrumented with (1) six acoustic emission (AE) sensors that record micro-crack associated, elastic wave-generated activity within the three-dimensional space of the boulder, (2) eight rectangular rosette foil strain gages to measure surface strain, (3) eight thermocouples to measure surface temperature, and (4) one surface moisture sensor. Additionally, a soil moisture probe and a full weather station that measures ambient temperature, relative humidity, wind speed, wind direction, barometric pressure, insolation, and precipitation were installed adjacent to the test boulder. AE activity was continuously monitored by one logger while all other variables were acquired by a separate logger every 60 s. The protocols associated with the instrumentation, data acquisition, and analyses are discussed in detail. During the first four months, the deployed boulder experienced almost 12 000 AE events, the majority of which occur in the afternoon when temperatures are decreasing. This paper presents preliminary data that illustrates data validity and typical patterns and behaviors observed. This system offers the potential to (1) obtain an unprecedented record of the natural conditions under which rocks fracture and (2) decipher the mechanical processes that lead to rock fracture at a variety of temporal scales under a range of natural conditions.
A prototype for automation of land-cover products from Landsat Surface Reflectance Data Records
NASA Astrophysics Data System (ADS)
Rover, J.; Goldhaber, M. B.; Steinwand, D.; Nelson, K.; Coan, M.; Wylie, B. K.; Dahal, D.; Wika, S.; Quenzer, R.
2014-12-01
Landsat data records of surface reflectance provide a three-decade history of land surface processes. Due to the vast number of these archived records, development of innovative approaches for automated data mining and information retrieval was necessary. Recently, we created a prototype utilizing open source software libraries for automatically generating annual Anderson Level 1 land cover maps and information products from data acquired by the Landsat Mission for the years 1984 to 2013. The automated prototype was applied to two target areas in northwestern and east-central North Dakota, USA. The approach required the National Land Cover Database (NLCD) and two user-input target acquisition year-days. The Landsat archive was mined for scenes acquired within a 100-day window surrounding these target dates, and then cloud-free pixels were chosen closest to the specified target acquisition dates. The selected pixels were then composited before completing an unsupervised classification using the NLCD. Pixels unchanged in pairs of the NLCD were used for training decision tree models in an iterative process refined with model confidence measures. The decision tree models were applied to the Landsat composites to generate a yearly land cover map and related information products. Results for the target areas captured changes associated with the recent expansion of oil shale production and agriculture driven by economics and policy, such as the increase in biofuel production and reduction in the Conservation Reserve Program. Changes in agriculture, grasslands, and surface water reflect the local hydrological conditions that occurred during the 29-year span. Future enhancements considered for this prototype include a web-based client, ancillary spatial datasets, trends and clustering algorithms, and the forecasting of future land cover.
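The per-pixel compositing step described above (within the 100-day window, pick the cloud-free observation closest to the user-supplied target date) can be sketched as follows. This is an illustrative reconstruction, not the prototype's published code; the function name and array layout are assumptions.

```python
import numpy as np

def composite_nearest(stack, acq_doys, cloud_masks, target_doy):
    """Per-pixel composite: choose the cloud-free observation whose
    acquisition day-of-year is nearest the target date.

    stack:       (n_scenes, rows, cols) surface reflectance, one band
    acq_doys:    (n_scenes,) acquisition day-of-year per scene
    cloud_masks: (n_scenes, rows, cols) True where the pixel is cloudy
    """
    # Temporal distance of each scene from the target, expanded per pixel
    dist = np.abs(np.asarray(acq_doys, dtype=float) - target_doy)
    dist3d = np.broadcast_to(dist[:, None, None], stack.shape).copy()
    dist3d[cloud_masks] = np.inf          # never select a cloudy observation
    best = np.argmin(dist3d, axis=0)      # winning scene index per pixel
    rows, cols = np.indices(best.shape)
    out = stack[best, rows, cols].astype(float)
    out[np.all(cloud_masks, axis=0)] = np.nan  # no clear view in the window
    return out
```

In the prototype this selection would feed the subsequent compositing and unsupervised classification against the NLCD.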
Integrated microfluidic probe station.
Perrault, C M; Qasaimeh, M A; Brastaviceanu, T; Anderson, K; Kabakibo, Y; Juncker, D
2010-11-01
The microfluidic probe (MFP) consists of a flat, blunt tip with two apertures for the injection and reaspiration of a microjet into a solution--thus hydrodynamically confining the microjet--and is operated atop an inverted microscope that enables live imaging. By scanning across a surface, the microjet can be used for surface processing with the capability of both depositing and removing material; as it operates under immersed conditions, sensitive biological materials and living cells can be processed. During scanning, the MFP is kept immobile and centered over the objective of the inverted microscope, a few micrometers above a substrate that is displaced by moving the microscope stage and that is flushed continuously with the microjet. For consistent and reproducible surface processing, the gap between the MFP and the substrate, the MFP's alignment, the scanning speed, the injection and aspiration flow rates, and the image capture all need to be controlled and synchronized. Here, we present an automated MFP station that integrates all of these functionalities and automates the key operational parameters. A custom software program is used to control an independent motorized Z stage for adjusting the gap, a motorized microscope stage for scanning the substrate, up to 16 syringe pumps for injecting and aspirating fluids, and an inverted fluorescence microscope equipped with a charge-coupled device camera. The parallelism between the MFP and the substrate is adjusted using a manual goniometer at the beginning of the experiment. The alignment of the injection and aspiration apertures along the scanning axis is performed using a newly designed MFP screw holder. We illustrate the integrated MFP station by the programmed, automated patterning of fluorescently labeled biotin on a streptavidin-coated surface.
NASA Astrophysics Data System (ADS)
Iles, E. J.; McCallum, L.; Lovell, J. E. J.; McCallum, J. N.
2018-02-01
As we move into the next era of geodetic VLBI, the scheduling process is one focus for improvement in terms of increased flexibility and the ability to react to changing conditions. A range of simulations were conducted to ascertain the impact of scheduling on geodetic results such as Earth Orientation Parameters (EOPs) and station coordinates. The potential capabilities of new automated scheduling modes were also simulated, using the so-called 'dynamic scheduling' technique. The primary aim was to improve efficiency for both cost and time without losing geodetic precision, particularly to maximise the use of the Australian AuScope VLBI array. We show that short breaks in observation will not significantly degrade the results of a typical 24 h experiment, whereas simply shortening observing time degrades precision exponentially. We also confirm the new automated, dynamic scheduling mode is capable of producing the same standard of result as a traditional schedule, with close to real-time flexibility. Further, it is possible to use the dynamic scheduler to augment the 3-station Australian AuScope array and thereby attain EOPs of the current global precision with only intermittent contributions from 2 additional stations. We thus confirm that automated, dynamic scheduling bears great potential for flexibility and automation in line with aims for future continuous VLBI operations.
Hadjiiski, Lubomir; Liu, Jordan; Chan, Heang-Ping; Zhou, Chuan; Wei, Jun; Chughtai, Aamer; Kuriakose, Jean; Agarwal, Prachi; Kazerooni, Ella
2016-01-01
The detection of stenotic plaques strongly depends on the quality of the coronary arterial tree imaged with coronary CT angiography (cCTA). However, it is time consuming for the radiologist to select the best-quality vessels from the multiple-phase cCTA for interpretation in clinical practice. We are developing an automated method for selection of the best-quality vessels from coronary arterial trees in multiple-phase cCTA to facilitate radiologist's reading or computerized analysis. Our automated method consists of vessel segmentation, vessel registration, corresponding vessel branch matching, vessel quality measure (VQM) estimation, and automatic selection of best branches based on VQM. For every branch, the VQM was calculated as the average radial gradient. An observer preference study was conducted to visually compare the quality of the selected vessels. 167 corresponding branch pairs were evaluated by two radiologists. The agreement between the first radiologist and the automated selection was 76% with kappa of 0.49. The agreement between the second radiologist and the automated selection was also 76% with kappa of 0.45. The agreement between the two radiologists was 81% with kappa of 0.57. The observer preference study demonstrated the feasibility of the proposed automated method for the selection of the best-quality vessels from multiple cCTA phases.
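The agreement figures reported above (e.g., 76% agreement with kappa of 0.49) pair raw percent agreement with a chance-corrected statistic; Cohen's kappa is the standard choice for two raters. A minimal sketch of the statistic, assuming two raters labeling the same branch pairs (not the authors' analysis code):

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two raters over the same items, e.g. the
    'automated pick' vs. a radiologist's pick for each branch pair."""
    a, b = list(a), list(b)
    assert len(a) == len(b) and len(a) > 0
    n = len(a)
    labels = sorted(set(a) | set(b))
    # Observed agreement: fraction of items the raters label identically
    p_o = sum(x == y for x, y in zip(a, b)) / n
    # Chance agreement: product of each rater's marginal label frequencies
    p_e = sum((a.count(l) / n) * (b.count(l) / n) for l in labels)
    return (p_o - p_e) / (1 - p_e)
```

Kappa near 0.5, as reported for both radiologists versus the automated selection, is conventionally read as moderate agreement beyond chance.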
Durso, Francis T; Stearman, Eric J; Morrow, Daniel G; Mosier, Kathleen L; Fischer, Ute; Pop, Vlad L; Feigh, Karen M
2015-05-01
We attempted to understand the latent structure underlying the systems pilots use to operate in situations involving human-automation interaction (HAI). HAI is an important characteristic of many modern work situations. Of course, the cognitive subsystems are not immediately apparent by observing a functioning system, but correlations between variables may reveal important relations. The current report examined pilot judgments of 11 HAI dimensions (e.g., Workload, Task Management, Stress/Nervousness, Monitoring Automation, and Cross-Checking Automation) across 48 scenarios that required airline pilots to interact with automation on the flight deck. We found three major clusters of the dimensions identifying subsystems on the flight deck: a workload subsystem, a management subsystem, and an awareness subsystem. Relationships characterized by simple correlations cohered in ways that suggested underlying subsystems consistent with those that had previously been theorized. Understanding the relationship among dimensions affecting HAI is an important aspect in determining how a new piece of automation designed to affect one dimension will affect other dimensions as well.
NASA Astrophysics Data System (ADS)
Reddy, V.; Le Corre, L.; Nathues, A.; Hall, I.; Gutierrez-Marques, P.; Hoffmann, M.
2011-10-01
The Dawn mission will rendezvous with asteroid (4) Vesta in July 2011. We have developed a set of equations for extracting mean pyroxene chemistry (Ferrosilite and Wollastonite) for classifying terrains on Vesta by using the Dawn Framing Camera (FC) multi-color bands. The Automated Spectral System (ASS) utilizes pseudo-Band I minima to estimate the mean pyroxene chemistry of diogenites and basaltic eucrites. The mean pyroxene chemistries of cumulate eucrites and howardites overlap each other on the pyroxene quadrilateral and hence are harder to distinguish. We expect our ASS to carry the bulk of the terrain classification and mineralogy workload utilizing these equations and complement the work of DawnKey (Le Corre et al., 2011, DPS/EPSC 2011). The system will also provide surface mineral chemistry layers that can be used for mapping Vesta's surface.
Peng, Chen; Frommlet, Alexandra; Perez, Manuel; Cobas, Carlos; Blechschmidt, Anke; Dominguez, Santiago; Lingel, Andreas
2016-04-14
NMR binding assays are routinely applied in hit finding and validation during early stages of drug discovery, particularly for fragment-based lead generation. To this end, compound libraries are screened by ligand-observed NMR experiments such as STD, T1ρ, and CPMG to identify molecules interacting with a target. The analysis of a high number of complex spectra is performed largely manually and therefore represents a limiting step in hit generation campaigns. Here we report a novel integrated computational procedure that processes and analyzes ligand-observed proton and fluorine NMR binding data in a fully automated fashion. A performance evaluation comparing automated and manual analysis results on ¹⁹F- and ¹H-detected data sets shows that the program delivers robust, high-confidence hit lists in a fraction of the time needed for manual analysis and greatly facilitates visual inspection of the associated NMR spectra. These features enable considerably higher throughput, the assessment of larger libraries, and shorter turn-around times.
A Comparison of Rule-Based, K-Nearest Neighbor, and Neural Net Classifiers for Automated
Tai-Hoon Cho; Richard W. Conners; Philip A. Araman
1991-01-01
Over the last few years the authors have been involved in research aimed at developing a machine vision system for locating and identifying surface defects on materials. The particular problem being studied involves locating surface defects on hardwood lumber in a species independent manner. Obviously, the accurate location and identification of defects is of paramount...
Automated life-detection experiments for the Viking mission to Mars
NASA Technical Reports Server (NTRS)
Klein, H. P.
1974-01-01
As part of the Viking mission to Mars in 1975, an automated set of instruments is being built to test for the presence of metabolizing organisms on that planet. Three separate modules are combined in this instrument so that samples of the Martian surface can be subjected to a broad array of experimental conditions so as to measure biological activity. The first, the Pyrolytic Release Module, will expose surface samples to a mixture of ¹⁴CO and ¹⁴CO₂ in the presence of Martian atmosphere and a light source that simulates the Martian visible spectrum. The assay system is designed to determine the extent of assimilation of CO or CO2 into organic compounds. The Gas Exchange Module will incubate surface samples in a humidified CO2 atmosphere. At specified times, portions of the incubation atmosphere will be analyzed by gas chromatography to detect the release or uptake of CO2 and several additional gases. The Label Release Module will incubate surface samples with a dilute aqueous solution of simple radioactive organic substrates in Martian atmosphere, and the gas phase will be monitored continuously for the release of labeled CO2.
A simple algorithm for identifying periods of snow accumulation on a radiometer
NASA Astrophysics Data System (ADS)
Lapo, Karl E.; Hinkelman, Laura M.; Landry, Christopher C.; Massmann, Adam K.; Lundquist, Jessica D.
2015-09-01
Downwelling solar, Qsi, and longwave, Qli, irradiances at the earth's surface are the primary energy inputs for many hydrologic processes, and uncertainties in measurements of these two terms confound evaluations of estimated irradiances and negatively impact hydrologic modeling. Observations of Qsi and Qli in cold environments are subject to conditions that create additional uncertainties not encountered in other climates, specifically the accumulation of snow on uplooking radiometers. To address this issue, we present an automated method for estimating these periods of snow accumulation. Our method is based on forest interception of snow and uses common meteorological observations. In this algorithm, snow accumulation must exceed a threshold to obscure the sensor and is only removed through scouring by wind or melting. The algorithm is evaluated at two sites representing different mountain climates: (1) Snoqualmie Pass, Washington (maritime) and (2) the Senator Beck Basin Study Area, Colorado (continental). The algorithm agrees well with time-lapse camera observations at the Washington site and with multiple measurements at the Colorado site, with 70-80% of observed snow accumulation events correctly identified. We suggest using the method for quality controlling irradiance observations in snow-dominated climates where regular, daily maintenance is not possible.
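The accumulation logic described above (snow must exceed a threshold to obscure the sensor, and is removed only by wind scour or melt) can be sketched as a simple state machine over the meteorological time series. The threshold values below are illustrative assumptions, not the published calibration for either site:

```python
def flag_snow_on_sensor(precip, temp, wind,
                        accum_threshold=1.0,  # mm needed to obscure dome (assumed)
                        melt_temp=0.5,        # deg C melt cutoff (assumed)
                        scour_wind=6.0):      # m/s wind-scour cutoff (assumed)
    """Return one boolean per time step: True while the uplooking
    radiometer is likely obscured by accumulated snow."""
    accum, flags = 0.0, []
    for p, t, w in zip(precip, temp, wind):
        if t <= melt_temp and p > 0:
            accum += p              # cold precipitation accumulates on the dome
        if t > melt_temp or w >= scour_wind:
            accum = 0.0             # melting or wind scour clears the sensor
        flags.append(accum >= accum_threshold)
    return flags
```

Flagged periods would then be excluded (or down-weighted) when quality-controlling Qsi and Qli observations.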
On automating domain connectivity for overset grids
NASA Technical Reports Server (NTRS)
Chiu, Ing-Tsau
1994-01-01
An alternative method for domain connectivity among systems of overset grids is presented. Reference uniform Cartesian systems of points are used to achieve highly efficient domain connectivity, and form the basis for a future fully automated system. The Cartesian systems are used to approximate body surfaces and to map the computational space of component grids. By exploiting the characteristics of Cartesian systems, Chimera-type hole-cutting and identification of donor elements for intergrid boundary points can be carried out very efficiently. The method is tested for a range of geometrically complex multiple-body overset grid systems.
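The efficiency argument above rests on a standard idea: hashing scattered grid points into a uniform Cartesian bin structure turns donor-element searches into constant-time bin lookups instead of global searches. The 2-D sketch below illustrates that idea only; the helper names are hypothetical and this is not the paper's implementation:

```python
def build_cartesian_map(points, h):
    """Bin scattered grid points into uniform Cartesian cells of size h.
    Returns {(i, j): [point indices]} for O(1) spatial lookup."""
    cart = {}
    for idx, (x, y) in enumerate(points):
        cart.setdefault((int(x // h), int(y // h)), []).append(idx)
    return cart

def donor_candidates(cart, h, x, y):
    """Candidate donor points for a query location: points in the query's
    Cartesian cell and its eight neighbors."""
    i, j = int(x // h), int(y // h)
    out = []
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            out.extend(cart.get((i + di, j + dj), []))
    return out
```

A real overset-grid implementation works in 3-D and still must test the candidates geometrically (containment, hole status), but the candidate set stays small regardless of total grid size.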
Optimization of the Automated Spray Layer-by-Layer Technique for Thin Film Deposition
2010-06-01
pieces. All silicon was cleaned with ethanol and Milli-Q water to hydroxylate the surface. Quartz Crystal Microbalance SiO2 coated sensors (Q-sense...was deposited onto a SiO2 coated QCM crystal using the automated dipping process described earlier. Once the film was deposited, it was dried over...night, and then placed in the QCM-D device. An additional layer of PAH was deposited onto the crystal in the QCM-D chamber at a flow rate of 1 pL/minute
Orbán, Levente L.; Plowright, Catherine M.S.
2014-01-01
We present two methods for observing bumblebee choice behavior in an enclosed testing space. The first method consists of Radio Frequency Identification (RFID) readers built into artificial flowers that display various visual cues, and RFID tags (i.e., passive transponders) glued to the thorax of bumblebee workers. The novelty in our implementation is that RFID readers are built directly into artificial flowers that are capable of displaying several distinct visual properties such as color, pattern type, spatial frequency (i.e., “busyness” of the pattern), and symmetry (spatial frequency and symmetry were not manipulated in this experiment). Additionally, these visual displays in conjunction with the automated systems are capable of recording unrewarded and untrained choice behavior. The second method consists of recording choice behavior at artificial flowers using motion-sensitive high-definition camcorders. Bumblebees have number tags glued to their thoraces for unique identification. The advantage in this implementation over RFID is that in addition to observing landing behavior, alternate measures of preference such as hovering and antennation may also be observed. Both automation methods increase experimental control, and internal validity by allowing larger scale studies that take into account individual differences. External validity is also improved because bees can freely enter and exit the testing environment without constraints such as the availability of a research assistant on-site. Compared to human observation in real time, the automated methods are more cost-effective and possibly less error-prone. PMID:25489677
Orbán, Levente L; Plowright, Catherine M S
2014-11-15
We present two methods for observing bumblebee choice behavior in an enclosed testing space. The first method consists of Radio Frequency Identification (RFID) readers built into artificial flowers that display various visual cues, and RFID tags (i.e., passive transponders) glued to the thorax of bumblebee workers. The novelty in our implementation is that RFID readers are built directly into artificial flowers that are capable of displaying several distinct visual properties such as color, pattern type, spatial frequency (i.e., "busyness" of the pattern), and symmetry (spatial frequency and symmetry were not manipulated in this experiment). Additionally, these visual displays in conjunction with the automated systems are capable of recording unrewarded and untrained choice behavior. The second method consists of recording choice behavior at artificial flowers using motion-sensitive high-definition camcorders. Bumblebees have number tags glued to their thoraces for unique identification. The advantage in this implementation over RFID is that in addition to observing landing behavior, alternate measures of preference such as hovering and antennation may also be observed. Both automation methods increase experimental control, and internal validity by allowing larger scale studies that take into account individual differences. External validity is also improved because bees can freely enter and exit the testing environment without constraints such as the availability of a research assistant on-site. Compared to human observation in real time, the automated methods are more cost-effective and possibly less error-prone.
NASA Astrophysics Data System (ADS)
Chilson, P. B.; Fiebrich, C. A.; Huck, R.; Grimsley, J.; Salazar-Cerreno, J.; Carson, K.; Jacob, J.
2017-12-01
Fixed monitoring sites, such as those in the US National Weather Service Automated Surface Observing System (ASOS) and the Oklahoma Mesonet, provide valuable, high temporal resolution information about the atmosphere to forecasters and the general public. The Oklahoma Mesonet comprises a network of 120 surface sites providing a wide array of atmospheric measurements up to a height of 10 m with an update time of five minutes. The deployment of small unmanned aircraft to collect in-situ vertical measurements of the atmospheric state in conjunction with surface conditions has potential to significantly expand weather observation capabilities. This concept can enhance the safety of individuals and support commerce through improved observations and short-term forecasts of the weather and other environmental variables in the lower atmosphere. We report on a concept of adding the capability of collecting vertical atmospheric measurements (profiles) through the use of unmanned aerial systems (UAS) at remote Oklahoma sites deemed suitable for this application. While there are a number of other technologies currently available that can provide measurements of one or a few variables, the proposed UAS concept will be expandable and modular to accommodate several different sensor packages and provide accurate in-situ measurements in virtually all weather conditions. Such a system would facilitate off-site maintenance and calibration and would provide the ability to add new sensors as they are developed or as new requirements are identified. The small UAS must be capable of accommodating the weight of all sensor packages and have lighting, communication, and aircraft avoidance systems necessary to meet existing or future FAA regulations. The system must be able to operate unattended, which necessitates the inclusion of risk mitigation measures such as a detect and avoid radar and the ability to transmit and receive transponder signals. Moreover, the system should be able to assess local weather conditions (visibility, surface winds, and cloud height) and the integrity of the vehicle (system diagnostics, fuel level) before takeoff. We provide a notional concept of operations for a 3D Mesonet being considered, describe the technical configuration for one station in the network, and discuss plans for future development.
NASA Astrophysics Data System (ADS)
Keeler, D. G.; Rupper, S.; Forster, R. R.; Miège, C.; Brewer, S.; Koenig, L.
2017-12-01
The West Antarctic Ice Sheet (WAIS) could be a substantial source of future sea level rise, with 3+ meters of potential increase stored in the ice sheet. Adequate predictions of WAIS contributions, however, depend on well-constrained surface mass balance estimates for the region. Given the sparsity of available data, such estimates are tenuous. Although new data are periodically added, further research (both to collect more data and better utilize existing data) is critical to addressing these issues. Here we present accumulation data from 9 shallow firn cores and 600 km of Ku band radar traces collected as part of the Satellite Era Antarctic Traverse (SEAT) 2011/2012 field season. Using these data, combined with similar data collected during the SEAT 2010/2011 field season, we investigate the spatial variability in accumulation across the WAIS Divide and surrounding regions. We utilize seismic interpretation and 3D visualization tools to investigate the extent and variations of laterally continuous internal horizons in the radar profiles, and compare the results to nearby firn cores. Previous results show that clearly visible, laterally continuous horizons in radar returns in this area do not always represent annual accumulation isochrones, but can instead represent multi-year or sub-annual events. The automated application of Bayesian inference techniques to averaged estimates of multiple adjacent radar traces, however, can estimate annually-resolved independent age-depth scales for these radar data. We use these same automated techniques on firn core isotopic records to infer past snow accumulation rates, allowing a direct comparison with the radar-derived results. Age-depth scales based on manual annual-layer counting of geochemical and isotopic species from these same cores provide validation for the automated approaches. Such techniques could theoretically be applied to additional radar/core data sets in polar regions (e.g. 
Operation IceBridge), thereby increasing the number of high resolution accumulation records available in these data-sparse regions. An increased understanding of the variability in magnitude and past rates of surface mass balance can provide better constraints on sea level projections and more precise context for present-day and future observations in these regions.
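The annual-layer counting used to validate these automated age-depth scales can be illustrated with a deliberately crude sketch: counting seasonal peaks in a synthetic isotope-like series. This is illustrative only, not the Bayesian inference technique the abstract describes, and the series is invented.

```python
import numpy as np

def count_annual_layers(series):
    """Count annual layers in a depth-ordered series by counting strict
    local maxima (one seasonal peak per year). A crude stand-in for
    Bayesian layer counting; real records are noisier than this."""
    s = np.asarray(series, dtype=float)
    peaks = (s[1:-1] > s[:-2]) & (s[1:-1] > s[2:])
    return int(peaks.sum())

# Synthetic record: 5 annual cycles sampled 12 times per "year"
depth = np.arange(60)
series = np.sin(2 * np.pi * depth / 12.0)
n_years = count_annual_layers(series)
```

Real firn-core records require smoothing and uncertainty handling before any such counting is meaningful, which is what motivates the probabilistic approach.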
Automated video-microscopic imaging and data acquisition system for colloid deposition measurements
Abdel-Fattah, Amr I.; Reimus, Paul W.
2004-12-28
A video microscopic visualization system and image processing and data extraction and processing method for in situ detailed quantification of the deposition of sub-micrometer particles onto an arbitrary surface and determination of their concentration across the bulk suspension. The extracted data includes (a) surface concentration and flux of deposited, attached and detached colloids, (b) surface concentration and flux of arriving and departing colloids, (c) distribution of colloids in the bulk suspension in the direction perpendicular to the deposition surface, and (d) spatial and temporal distributions of deposited colloids.
Vrooman, Henri A; Cocosco, Chris A; van der Lijn, Fedde; Stokking, Rik; Ikram, M Arfan; Vernooij, Meike W; Breteler, Monique M B; Niessen, Wiro J
2007-08-01
Conventional k-Nearest-Neighbor (kNN) classification, which has been successfully applied to classify brain tissue in MR data, requires training on manually labeled subjects. This manual labeling is a laborious and time-consuming procedure. In this work, a new fully automated brain tissue classification procedure is presented, in which kNN training is automated. This is achieved by non-rigidly registering the MR data with a tissue probability atlas to automatically select training samples, followed by a post-processing step to keep the most reliable samples. The accuracy of the new method was compared to rigid registration-based training and to conventional kNN-based segmentation using training on manually labeled subjects for segmenting gray matter (GM), white matter (WM) and cerebrospinal fluid (CSF) in 12 data sets. Furthermore, for all classification methods, the performance was assessed when varying the free parameters. Finally, the robustness of the fully automated procedure was evaluated on 59 subjects. The automated training method using non-rigid registration with a tissue probability atlas was significantly more accurate than rigid registration. For both automated training using non-rigid registration and for the manually trained kNN classifier, the difference with the manual labeling by observers was not significantly larger than inter-observer variability for all tissue types. A similarity index was used for comparison with the manually trained kNN classifier; the indices were 0.93, 0.92, and 0.92 for CSF, GM, and WM, respectively. From the robustness study, it was clear that, given an appropriate brain atlas and optimal parameters, our new fully automated, non-rigid registration-based method gives accurate and robust segmentation results. It can be concluded that our fully automated method using non-rigid registration may replace manual segmentation, and thus that automated brain tissue segmentation without laborious manual training is feasible.
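The conventional kNN classification underlying this work can be sketched in a few lines; the one-dimensional "intensity" features and tissue labels below are invented for illustration, not real MR data.

```python
import numpy as np

def knn_classify(train_X, train_y, query, k=5):
    """Classify one feature vector by majority vote among its k nearest
    training samples (Euclidean distance), as in conventional kNN
    tissue classification."""
    d = np.linalg.norm(train_X - query, axis=1)
    nearest = np.argsort(d)[:k]
    labels, counts = np.unique(train_y[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Toy 1-D intensity features for three tissue classes (0=CSF, 1=GM, 2=WM)
train_X = np.array([[0.1], [0.15], [0.5], [0.55], [0.9], [0.95]])
train_y = np.array([0, 0, 1, 1, 2, 2])
label = knn_classify(train_X, train_y, np.array([0.52]), k=3)
```

The paper's contribution is how the training pairs (train_X, train_y) are selected automatically via atlas registration rather than manual labeling; the classifier itself stays the same.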
ERIC Educational Resources Information Center
Gilchrist, Kristin H.; Hegarty-Craver, Meghan; Christian, Robert B.; Grego, Sonia; Kies, Ashley C.; Wheeler, Anne C.
2018-01-01
Repetitive sensory motor behaviors are a direct target for clinical treatment and a potential treatment endpoint for individuals with intellectual or developmental disabilities. By removing the burden associated with video annotation or direct observation, automated detection of stereotypy would allow for longer term monitoring in ecologic…
JWST Associations overview: automated generation of combined products
NASA Astrophysics Data System (ADS)
Alexov, Anastasia; Swade, Daryl; Bushouse, Howard; Diaz, Rosa; Eisenhamer, Jonathan; Hack, Warren; Kyprianou, Mark; Levay, Karen; Rahmani, Christopher; Swam, Mike; Valenti, Jeff
2018-01-01
We present the design of the James Webb Space Telescope (JWST) Data Management System (DMS) automated processing of Associations. An Association captures the relationship between exposures and higher-level data products, such as combined mosaics created from dithered and tiled observations. The astronomer's intent is captured within the Proposal Planning System (PPS) and provided to DMS as candidate associations. These candidates are converted into Association Pools and Association Generator Tables that serve as input to automated processing, which creates the combined data products. Association Pools are generated to capture a list of exposures that could potentially form associations and provide relevant information about those exposures. Using grouping definitions, the Association Generator creates one or more Association Tables from a single input Association Pool. Each Association Table defines a set of exposures to be combined and the ruleset of the combination to be performed; the calibration software creates Associated data products based on these input tables. The initial design produces automated Associations within a proposal. Additionally, this overall design is conducive to eventually producing Associations for observations from multiple proposals, similar to the Hubble Legacy Archive (HLA).
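The pool-to-table grouping step can be sketched as follows. The record fields, key names, and grouping rule are hypothetical stand-ins, not the actual JWST pool schema or generator ruleset.

```python
from collections import defaultdict

def build_associations(pool, group_keys):
    """Group an 'association pool' of exposure records into candidate
    association tables, one per unique combination of grouping keys.
    A simplified sketch of rule-based grouping; real generators apply
    richer rulesets than a plain key match."""
    tables = defaultdict(list)
    for exposure in pool:
        key = tuple(exposure[k] for k in group_keys)
        tables[key].append(exposure["exposure_id"])
    return dict(tables)

# Hypothetical pool: three exposures of one target in two filters
pool = [
    {"exposure_id": "e1", "target": "NGC-1", "filter": "F200W"},
    {"exposure_id": "e2", "target": "NGC-1", "filter": "F200W"},
    {"exposure_id": "e3", "target": "NGC-1", "filter": "F444W"},
]
tables = build_associations(pool, ["target", "filter"])
```

Each resulting group would then be handed to the calibration software as one combination task.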
Handheld Automated Microsurgical Instrumentation for Intraocular Laser Surgery
Yang, Sungwook; Lobes, Louis A.; Martel, Joseph N.; Riviere, Cameron N.
2016-01-01
Background and Objective Laser photocoagulation is a mainstay or adjuvant treatment for a variety of common retinal diseases. Automated laser photocoagulation during intraocular surgery has not yet been established. The authors introduce an automated laser photocoagulation system for intraocular surgery, based on a novel handheld instrument. The goals of the system are to enhance accuracy and efficiency and improve safety. Materials and Methods Triple-ring patterns are introduced as a typical arrangement for the treatment of proliferative retinopathy and registered to a preoperative fundus image. In total, 32 target locations are specified along the circumferences of three rings having diameters of 1, 2, and 3 mm, with a burn spacing of 600 μm. Given the initial system calibration, the retinal surface is reconstructed using stereo vision, and the targets specified on the preoperative image are registered with the control system. During automated operation, the laser probe attached to the manipulator of the active handheld instrument is deflected as needed via visual servoing in order to correct the error between the aiming beam and a specified target, regardless of any erroneous handle motion by the surgeon. A constant distance of the laser probe from the retinal surface is maintained in order to yield consistent burn size and ensure safety during operation. Real-time tracking of anatomical features enables compensation for any movement of the eye. A graphical overlay system within the operating microscope provides the surgeon with guidance cues for automated operation. Two retinal surgeons performed automated and manual trials in an artificial model of the eye, with each trial repeated three times. For the automated trials, various targeting thresholds (50–200 μm) were used to automatically trigger laser firing. In manual operation, fixed repetition rates were used, with frequencies of 1.0–2.5 Hz.
The power of the 532 nm laser was set at 3.0 W with a duration of 20 ms. After completion of each trial, the speed of operation and placement error of burns were measured. The performance of the automated laser photocoagulation was compared with manual operation, using interpolated data for equivalent firing rates from 1.0 to 1.75 Hz. Results In automated trials, average error increased from 45 ± 27 to 60 ± 37 μm as the targeting threshold varied from 50 to 200 μm, while average firing rate significantly increased from 0.69 to 1.71 Hz. The average error in the manual trials increased from 102 ± 67 to 174 ± 98 μm as firing rate increased from 1.0 to 2.5 Hz. Compared to the manual trials, the average error in the automated trials was reduced by 53.0–56.4%, resulting in statistically significant differences (P ≤ 10⁻²⁰) for all equivalent frequencies (1.0–1.75 Hz). The depth of the laser tip in the automated trials was consistently maintained within 18 ± 2 μm root-mean-square (RMS) of its initial position, whereas it varied significantly in the manual trials, yielding an error of 296 ± 30 μm RMS. At high firing rates in the manual trials, such as 2.5 Hz, laser photocoagulation was only marginally attained, yielding 30% failed burns over the entire pattern, whereas no failed burns were found in the automated trials. Relatively regular burn sizes were attained in the automated trials by the depth servoing of the laser tip, while burn sizes in the manual trials varied considerably. Automated avoidance of blood vessels was also successfully demonstrated, utilizing the retina-tracking feature to identify avoidance zones. Conclusion Automated intraocular laser surgery can improve the accuracy of photocoagulation while ensuring safety during operation. This paper provides an initial demonstration of the technique under reasonably realistic laboratory conditions; development of a clinically applicable system requires further work. PMID:26287813
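The visual-servoing correction described above, deflecting the probe to null the error between the aiming beam and a target, can be sketched as a simple proportional controller. The gain, coordinates, and convergence behavior here are illustrative assumptions, not the paper's control law.

```python
def visual_servo_step(aim_xy, target_xy, gain=0.5):
    """One proportional visual-servoing update: command a tip deflection
    proportional to the observed error between the aiming beam and the
    target position (image coordinates, e.g. micrometers)."""
    ex = target_xy[0] - aim_xy[0]
    ey = target_xy[1] - aim_xy[1]
    return (aim_xy[0] + gain * ex, aim_xy[1] + gain * ey)

# With gain 0.5 the error halves each step; in the real system, firing
# is triggered once the error falls below a targeting threshold.
aim = (0.0, 0.0)
target = (200.0, 100.0)
for _ in range(10):
    aim = visual_servo_step(aim, target)
```

The targeting-threshold tradeoff reported in the Results (looser threshold, faster firing, larger error) corresponds to stopping this loop earlier.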
NASA Technical Reports Server (NTRS)
1996-01-01
Open Sesame! is the first commercial software product that learns a user's behavior and offers automation and coaching suggestions to the user. The neural learning module looks for repetitive patterns that have not been automated; when it finds one, it creates an observation and, upon approval, automates the task. The manufacturer, Charles River Analytics, credits Langley Research Center and Johnson Space Center Small Business Innovation Research grants, and the time its president and vice president spent at the two centers in the 1970s, as essential to the development of its product line.
Sarter, Nadine B; Mumaw, Randall J; Wickens, Christopher D
2007-06-01
The objective of the study was to examine pilots' automation monitoring strategies and performance on highly automated commercial flight decks. A considerable body of research and operational experience has documented breakdowns in pilot-automation coordination on modern flight decks. These breakdowns are often considered symptoms of monitoring failures even though, to date, only limited and mostly anecdotal data exist concerning pilots' monitoring strategies and performance. Twenty experienced B-747-400 airline pilots flew a 1-hr scenario involving challenging automation-related events on a full-mission simulator. Behavioral, mental model, and eye-tracking data were collected. The findings from this study confirm that pilots monitor basic flight parameters to a much greater extent than visual indications of the automation configuration. More specifically, they frequently fail to verify manual mode selections or notice automatic mode changes. In other cases, they do not process mode annunciations in sufficient depth to understand their implications for aircraft behavior. Low system observability and gaps in pilots' understanding of complex automation modes were shown to contribute to these problems. Our findings describe and explain shortcomings in pilots' automation monitoring strategies and performance based on converging behavioral, eye-tracking, and mental model data. They confirm that monitoring failures are one major contributor to breakdowns in pilot-automation interaction. The findings from this research can inform the design of improved training programs and automation interfaces that support more effective system monitoring.
Community pharmacies automation: any impact on counselling duration and job satisfaction?
Cavaco, Afonso Miguel; Krookas, Anette Aaland
2014-04-01
One key indicator of the quality of practitioner-patient interaction is the duration of encounters. Automation has been presented as beneficial to pharmacy staff work with patients and thus as having a potential impact on pharmacists' and technicians' job satisfaction. To compare the interaction length between pharmacy staff and patients, as well as their job satisfaction, in community pharmacies with and without automation. Portuguese community pharmacies with and without automation. This cross-sectional study followed a quasi-experimental design, divided into two phases. In the first, paired community pharmacies with and without automation were purposively selected for non-participant overt observation. The second phase comprised a job satisfaction questionnaire administered to both pharmacists and technical staff. Practitioner and patient demographic and interactional data, as well as job satisfaction, were statistically compared across automation status. Interaction length and job satisfaction. Sixty-eight practitioners from 10 automated and non-automated pharmacies produced 721 registered interaction episodes. Automation had no significant influence on interaction duration, controlling for gender and professional category; interactions were significantly longer with older patients (p = 0.017). On average, staff working at the pharmacy counter had 45% of their time free from direct patient contact. The mean overall satisfaction in this sample was 5.52 (SD = 0.98) out of a maximum score of seven, with no significant differences across automation status or between professional categories; only younger pharmacists showed significantly lower job satisfaction. As in previous studies in other settings, the duration of interactions was not influenced by pharmacy automation, nor was practitioners' job satisfaction, while practitioners' perceived time constraints appear to be subjective.
The effect of JPEG compression on automated detection of microaneurysms in retinal images
NASA Astrophysics Data System (ADS)
Cree, M. J.; Jelinek, H. F.
2008-02-01
As JPEG compression at source is ubiquitous in retinal imaging, and the block artefacts introduced are known to be of similar size to microaneurysms (an important indicator of diabetic retinopathy), it is prudent to evaluate the effect of JPEG compression on automated detection of retinal pathology. Retinal images were acquired at high quality and then compressed to various lower qualities. An automated microaneurysm detector was run on the retinal images of various qualities of JPEG compression, and the ability to predict the presence of diabetic retinopathy based on the detected presence of microaneurysms was evaluated with receiver operating characteristic (ROC) methodology. The negative effect of JPEG compression on automated detection was observed even at levels of compression sometimes used in retinal eye-screening programmes, and this may have important clinical implications for deciding on acceptable levels of compression for a fully automated eye-screening programme.
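The ROC methodology used for this evaluation can be illustrated with a minimal AUC computation via the rank-sum identity: the AUC equals the probability that a randomly chosen positive case scores higher than a randomly chosen negative one. The detector scores below are invented, not the study's data.

```python
def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve via pairwise comparison (the
    Mann-Whitney identity). Ties count as half a win."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical detector outputs (e.g. microaneurysm counts) for eyes
# with retinopathy versus healthy eyes
auc = roc_auc([5, 7, 3, 9], [1, 2, 3, 0])
```

Comparing AUC across compression levels is then a compact way to quantify how much the block artefacts degrade detection.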
Automated Measurement of Facial Expression in Infant-Mother Interaction: A Pilot Study
Messinger, Daniel S.; Mahoor, Mohammad H.; Chow, Sy-Miin; Cohn, Jeffrey F.
2009-01-01
Automated facial measurement using computer vision has the potential to objectively document continuous changes in behavior. To examine emotional expression and communication, we used automated measurements to quantify smile strength, eye constriction, and mouth opening in two six-month-old/mother dyads who each engaged in a face-to-face interaction. Automated measurements showed high associations with anatomically based manual coding (concurrent validity); measurements of smiling showed high associations with mean ratings of positive emotion made by naive observers (construct validity). For both infants and mothers, smile strength and eye constriction (the Duchenne marker) were correlated over time, creating a continuous index of smile intensity. Infant and mother smile activity exhibited changing (nonstationary) local patterns of association, suggesting the dyadic repair and dissolution of states of affective synchrony. The study provides insights into the potential and limitations of automated measurement of facial action. PMID:19885384
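The changing (nonstationary) local association between infant and mother smile activity can be probed with a windowed correlation; a minimal sketch on synthetic, perfectly in-phase series. The series and window size are illustrative, not the study's measurements or analysis method.

```python
import numpy as np

def windowed_correlation(x, y, window):
    """Pearson correlation of two aligned time series within sliding
    windows, tracking how local association changes over time."""
    out = []
    for i in range(len(x) - window + 1):
        xs, ys = x[i:i + window], y[i:i + window]
        out.append(float(np.corrcoef(xs, ys)[0, 1]))
    return out

# Toy in-phase "smile intensity" traces: correlation stays near 1
t = np.linspace(0, 4 * np.pi, 200)
infant = np.sin(t)
mother = np.sin(t)
r = windowed_correlation(infant, mother, window=50)
```

Periods of repair and dissolution of synchrony would appear as the windowed correlation rising and falling over the course of an interaction.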
Butler, Kenneth R; Minor, Deborah S; Benghuzzi, Hamed A; Tucci, Michelle
2010-01-01
The objective of this study was to evaluate terminal digit preference in blood pressure (BP) measurements taken from a sample of clinics at a large academic health sciences center. We hypothesized that terminal digit preference would occur more frequently in BP measurements taken with manual mercury sphygmomanometry compared to those obtained with semi-automated instruments. A total of 1,393 BP measures were obtained in 16 ambulatory and inpatient sites by personnel using both mercury (n=1,286) and semi-automated (n=107) devices. For the semi-automated devices, a trained observer repeated the patient's BP following American Heart Association recommendations using a similar device with a known calibration history. At least two recorded systolic and diastolic blood pressures (average of two or more readings for each) were obtained for all manual mercury readings. Data were evaluated using descriptive statistics and chi-square tests as appropriate (SPSS software, 17.0). Overall, zero and other terminal digit preference was observed more frequently in systolic (χ² = 883.21, df = 9, p < 0.001) and diastolic readings (χ² = 1076.77, df = 9, p < 0.001) from manual instruments, while all end digits obtained by clinic staff using semi-automated devices were more evenly distributed (χ² = 8.23, df = 9, p = 0.511 for systolic and χ² = 10.48, df = 9, p = 0.313 for diastolic). In addition to zero digit bias in mercury readings, even numbers were reported with significantly higher frequency than odd numbers. There was no detectable digit preference observed when examining semi-automated measurements by clinic staff or device type for either systolic or diastolic BP measures. These findings demonstrate that terminal digit preference was more likely to occur with manual mercury sphygmomanometry. This phenomenon was most likely the result of mercury column graduation in 2 mm Hg increments producing a higher than expected frequency of even digits.
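The chi-square test for terminal digit preference can be sketched as a goodness-of-fit against a uniform distribution over the ten end digits (df = 9). The digit counts below are invented to mimic a zero/even-digit bias, not the study's data.

```python
import numpy as np

def chi_square_uniform(digit_counts):
    """Chi-square goodness-of-fit statistic against a uniform
    distribution over the terminal digits 0-9."""
    counts = np.asarray(digit_counts, dtype=float)
    expected = counts.sum() / len(counts)
    return float(((counts - expected) ** 2 / expected).sum())

# Invented counts for terminal digits 0..9: strong excess of 0 and evens,
# as one might see with a mercury column graduated in 2 mm Hg steps
biased = [400, 20, 120, 20, 110, 25, 115, 20, 120, 50]
uniform_ish = [98, 102, 100, 99, 101, 100, 97, 103, 100, 100]
chi_biased = chi_square_uniform(biased)
chi_uniform = chi_square_uniform(uniform_ish)
```

With df = 9, a statistic far above the ~16.9 critical value (p = 0.05) indicates digit preference, while values near zero are consistent with evenly distributed end digits.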
NASA Astrophysics Data System (ADS)
Ivanov, Anton; Muller, Jan-Peter; Tao, Yu; Kim, Jung-Rack; Gwinner, Klaus; Van Gasselt, Stephan; Morley, Jeremy; Houghton, Robert; Bamford, Steven; Sidiropoulos, Panagiotis; Fanara, Lida; Waenlish, Marita; Walter, Sebastian; Steinkert, Ralf; Schreiner, Bjorn; Cantini, Federico; Wardlaw, Jessica; Sprinks, James; Giordano, Michele; Marsh, Stuart
2016-07-01
Understanding planetary atmosphere-surface and extra-terrestrial-surface formation processes within our Solar System is one of the fundamental goals of planetary science research. There has been a revolution in planetary surface observations over the last 15 years, especially in 3D imaging of surface shape. This has led to the ability to be able to overlay different epochs back in time to the mid 1970s, to examine time-varying changes, such as the recent discovery of mass movement, tracking inter-year seasonal changes and looking for occurrences of fresh craters. Within the EU FP-7 iMars project, UCL have developed a fully automated multi-resolution DTM processing chain, called the Co-registration ASP-Gotcha Optimised (CASP-GO), based on the open source NASA Ames Stereo Pipeline (ASP), which is being applied to the production of planetwide DTMs and ORIs (OrthoRectified Images) from CTX and HiRISE. Alongside the production of individual strip CTX & HiRISE DTMs & ORIs, DLR have processed HRSC mosaics of ORIs and DTMs for complete areas in a consistent manner using photogrammetric bundle block adjustment techniques. A novel automated co-registration and orthorectification chain has been developed and is being applied to level-1 EDR images taken by the 4 NASA orbital cameras since 1976 using the HRSC map products (both mosaics and orbital strips) as a map-base. The project has also included Mars Radar profiles from Mars Express and Mars Reconnaissance Orbiter missions. A webGIS has been developed for displaying this time sequence of imagery and a demonstration will be shown applied to one of the map-sheets. Automated quality control techniques are applied to screen for suitable images and these are extended to detect temporal changes in features on the surface such as mass movements, streaks, spiders, impact craters, CO2 geysers and Swiss Cheese terrain. 
These data mining techniques are then being employed within a citizen science project within the Zooniverse family to verify the results of these data mining techniques. Examples of data mining and its verification will be presented. We will present a software tool to ease access to co-registered MARSIS and SHARAD radargrams and geometry data such as probing point latitude and longitude and spacecraft altitude. Data are extracted from official ESA and NASA released data using self-developed python classes. Geometrical data and metadata are exposed as WFS layers using a QGIS server, which can be further integrated with other data. Radar geometry data will be available as a part of the iMars WebGIS framework and images will be available via PDS and PSA archives. Acknowledgements The research leading to these results has received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under iMars grant agreement n˚ 607379 as well as partial funding from the STFC "MSSL Consolidated Grant" ST/K000977/1.
Cognitive consequences of clumsy automation on high workload, high consequence human performance
NASA Technical Reports Server (NTRS)
Cook, Richard I.; Woods, David D.; Mccolligan, Elizabeth; Howie, Michael B.
1991-01-01
The growth of computational power has fueled attempts to automate more of the human role in complex problem solving domains, especially those where system faults have high consequences and where periods of high workload may saturate the performance capacity of human operators. Examples of these domains include flightdecks, space stations, air traffic control, nuclear power operation, ground satellite control rooms, and surgical operating rooms. Automation efforts may have unanticipated effects on human performance, particularly if they increase the workload at peak workload times or change the practitioners' strategies for coping with workload. Smooth and effective changes in automation require detailed understanding of the cognitive tasks confronting the user; this has been called user-centered automation. The introduction of a new computerized technology in a group of hospital operating rooms used for heart surgery was observed. The study revealed how automation, especially 'clumsy automation', affects practitioner work patterns and suggests that clumsy automation constrains users in specific and significant ways. Users tailor both the new system and their tasks in order to accommodate the needs of process and production. The study of this tailoring may prove a powerful tool for exposing previously hidden patterns of user data processing, integration, and decision making which may, in turn, be useful in the design of more effective human-machine systems.
High-throughput methods for characterizing the mechanical properties of coatings
NASA Astrophysics Data System (ADS)
Siripirom, Chavanin
The characterization of mechanical properties in a combinatorial and high-throughput workflow has been a bottleneck that reduced the speed of the materials development process. High-throughput characterization of the mechanical properties was applied in this research in order to reduce the amount of sample handling and to accelerate the output. A puncture tester was designed and built to evaluate the toughness of materials using an innovative template design coupled with automation. The test is in the form of a circular free-film indentation. A single template contains 12 samples which are tested in a rapid serial approach. Next, the operational principles of a novel parallel dynamic mechanical-thermal analysis instrument were analyzed in detail for potential sources of errors. The test uses a model of a circular bilayer fixed-edge plate deformation. A total of 96 samples can be analyzed simultaneously which provides a tremendous increase in efficiency compared with a conventional dynamic test. The modulus values determined by the system had considerable variation. The errors were observed and improvements to the system were made. A finite element analysis was used to analyze the accuracy given by the closed-form solution with respect to testing geometries, such as thicknesses of the samples. A good control of the thickness of the sample was proven to be crucial to the accuracy and precision of the output. Then, the attempt to correlate the high-throughput experiments and conventional coating testing methods was made. Automated nanoindentation in dynamic mode was found to provide information on the near-surface modulus and could potentially correlate with the pendulum hardness test using the loss tangent component. Lastly, surface characterization of stratified siloxane-polyurethane coatings was carried out with X-ray photoelectron spectroscopy, Rutherford backscattering spectroscopy, transmission electron microscopy, and nanoindentation. 
The siloxane component segregates to the surface during curing. The distribution of siloxane as a function of thickness into the sample showed differences depending on the formulation parameters. The coatings which had higher siloxane content near the surface were those coatings found to perform well in field tests.
Non-destructive forensic latent fingerprint acquisition with chromatic white light sensors
NASA Astrophysics Data System (ADS)
Leich, Marcus; Kiltz, Stefan; Dittmann, Jana; Vielhauer, Claus
2011-02-01
Non-destructive latent fingerprint acquisition is an emerging field of research, which, unlike traditional methods, makes latent fingerprints available for additional verification or further analysis like tests for substance abuse or age estimation. In this paper a series of tests is performed to investigate the overall suitability of a high resolution off-the-shelf chromatic white light sensor for the contact-less and non-destructive latent fingerprint acquisition. Our paper focuses on scanning previously determined regions with exemplary acquisition parameter settings. 3D height field and reflection data of five different latent fingerprints on six different types of surfaces (HDD platter, brushed metal, painted car body (metallic and non-metallic finish), blued metal, veneered plywood) are experimentally studied. Pre-processing is performed by removing low-frequency gradients. The quality of the results is assessed subjectively; no automated feature extraction is performed. Additionally, the degradation of the fingerprint during the acquisition period is observed. While the quality of the acquired data is highly dependent on surface structure, the sensor is capable of detecting the fingerprint on all sample surfaces. On blued metal the residual material is detected; however, the ridge line structure dissolves within minutes after fingerprint placement.
Validation of automated white matter hyperintensity segmentation.
Smart, Sean D; Firbank, Michael J; O'Brien, John T
2011-01-01
Introduction. White matter hyperintensities (WMHs) are a common finding on MRI scans of older people and are associated with vascular disease. We compared three methods for automatically segmenting WMHs from MRI scans. Method. An operator manually segmented WMHs on MRI images from a 3T scanner. The scans were also segmented in a fully automated fashion by three different programmes. The voxel overlap between manual and automated segmentation was compared. Results. The between-observer overlap ratio was 63%. Using our previously described in-house software, we obtained an overlap of 62.2%. We investigated the use of a modified version of SPM segmentation; however, this was not successful, with only 14% overlap. Discussion. Using our previously reported software, we demonstrated good segmentation of WMHs in a fully automated fashion.
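A common choice for such a voxel-overlap measure is the Dice similarity index, 2|A∩B| / (|A| + |B|); the abstract does not specify its exact formula, so treating it as Dice is an assumption, and the toy masks below are invented.

```python
import numpy as np

def similarity_index(mask_a, mask_b):
    """Dice similarity index between two binary segmentations:
    twice the intersection size over the sum of the mask sizes."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: define as perfect agreement
    return float(2.0 * np.logical_and(a, b).sum() / denom)

# Toy 2x4 "voxel" masks standing in for manual vs automated WMH maps
manual = np.array([[0, 1, 1, 0], [0, 1, 1, 0]])
auto = np.array([[0, 1, 1, 1], [0, 0, 1, 0]])
si = similarity_index(manual, auto)
```

An overlap near the inter-observer value (here, 63%) is about the best an automated method can meaningfully be asked to achieve.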
Automating the Processing of Earth Observation Data
NASA Technical Reports Server (NTRS)
Golden, Keith; Pang, Wan-Lin; Nemani, Ramakrishna; Votava, Petr
2003-01-01
NASA's vision for Earth science is to build a "sensor web": an adaptive array of heterogeneous satellites and other sensors that will track important events, such as storms, and provide real-time information about the state of the Earth to a wide variety of customers. Achieving this vision will require automation not only in the scheduling of the observations but also in the processing of the resulting data. To address this need, we are developing a planner-based agent to automatically generate and execute data-flow programs to produce the requested data products.
Characterizing the Frequency and Elevation of Rapid Drainage Events in West Greenland
NASA Astrophysics Data System (ADS)
Cooley, S.; Christoffersen, P.
2016-12-01
Rapid drainage of supraglacial lakes on the Greenland Ice Sheet is critical for the establishment of surface-to-bed hydrologic connections and the subsequent transfer of water from surface to bed. Yet, estimates of the number and spatial distribution of rapidly draining lakes vary widely due to limitations in the temporal frequency of image collection and obscuration by cloud cover. So far, no study has assessed the impact of these observation biases. In this study, we examine the frequency and elevation of rapidly draining lakes in central West Greenland, from 68°N to 72.6°N, and we perform a robust statistical analysis to more accurately estimate the likelihood of lakes draining rapidly. Using MODIS imagery and a fully automated lake detection method, we map more than 500 supraglacial lakes per year over a 63,000 km2 study area from 2000 to 2015. Through testing four different definitions of rapidly draining lakes from previously published studies, we find that the proportion of rapidly draining lakes varies from 3% to 38%. Logistic regression between rapid drainage events and image sampling frequency demonstrates that the number of rapid drainage events is strongly dependent on cloud-free observation percentage. We then develop three new drainage criteria and apply an observation bias correction that suggests a true rapid drainage probability between 36% and 45%, considerably higher than previous studies without bias assessment have reported. We find rapid-draining lakes are on average larger and disappear earlier than slow-draining lakes, and we observe no elevation differences for lakes detected as rapidly draining. We conclude a) that methodological problems in rapid drainage research caused by observation bias and varying detection methods have obscured large-scale rapid drainage characteristics and b) that the lack of evidence for an elevation limit on rapid drainage suggests surface-to-bed hydrologic connections may continue to propagate inland as climate warms.
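The observation-bias correction can be illustrated in a deliberately simplified form: scaling the observed drainage fraction by an assumed detection probability. This is a stand-in for the study's logistic-regression approach, and the numbers are invented.

```python
def corrected_drainage_fraction(n_rapid_observed, n_lakes, detection_prob):
    """Correct an observed rapid-drainage fraction for imperfect
    detection: if only `detection_prob` of true rapid drainages are
    captured (e.g. due to cloud gaps between images), the true fraction
    is approximately observed / detection_prob, capped at 1."""
    observed = n_rapid_observed / n_lakes
    return min(1.0, observed / detection_prob)

# Hypothetical: 90 of 500 lakes seen to drain rapidly, but only ~60% of
# true events expected to be caught under typical cloud cover
p_true = corrected_drainage_fraction(90, 500, 0.6)
```

The key qualitative point survives the simplification: any detection probability below 1 means the raw observed fraction understates the true rapid-drainage probability.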
NASA Astrophysics Data System (ADS)
Ahrends, H. E.; Oberbauer, S. F.; Tweedie, C.; Hollister, R. D.
2010-12-01
Knowledge of changing tundra vegetation and its response to climate variability is critical for understanding land-atmosphere interactions for the Arctic and the global system. However, vegetation characteristics, such as phenology, structure and species composition, are characterized by extreme heterogeneity at small scales. Manual observations of these variables are highly time-consuming, labor intensive, subjective, and disturbing to the vegetation. In contrast, recently developed robotic systems (networked infomechanical systems, NIMS) allow for performing non-intrusive, spatially integrated measurements of vegetation communities. Within the ITEX (International Tundra Experiment) AON (Arctic Observation Network) project we installed a cable-based sensor system, running over a transect of approximately 50 m length and 2 m width, at two long-term arctic research sites in Alaska. The trolley was initially equipped with instruments recording the distance to the vegetation canopy, up- and downwelling short- and longwave radiation, air and surface temperature, and spectral reflection. We aim to study the thermal and spectral response of the vegetation communities over a wide range of ecosystem types. We expect that automated observations, covering the spatial heterogeneity of vegetation and surface characteristics, can give a deeper insight into ecosystem functioning and vegetation response to climate. The data can be used for scaling up vegetation characteristics derived from manual measurements and for linking them to aircraft and satellite data and to carbon, water and surface energy budgets measured at the ecosystem scale. Sampling errors due to cable sag are correctable and effects of wind-driven movements can be offset by repeat measurements. First hand-pulled test measurements during summer 2010 show strong heterogeneity of the observation parameters and a variable spectral and thermal response of the plants within the transects.
Differences support the importance of our approach for upscaling purposes and for a comprehensive understanding of the arctic biome.
Wu, Zht Cheng; de Keyzer, Jeanine; Kusters, Ilja; Driessen, Arnold J M
2013-01-01
The interaction between membrane proteins and their (protein) ligands is conventionally investigated by nonequilibrium methods such as co-sedimentation or pull-down assays. Surface plasmon resonance can instead be used to monitor such binding events in real time, using isolated membranes immobilized on a surface, providing insight into the kinetics of binding under equilibrium conditions. This application provides a fast, automated way to detect interacting species and to determine the kinetics and affinity (Kd) of the interaction.
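For a simple 1:1 interaction, the affinity such an SPR experiment reports follows directly from the fitted kinetic rate constants, Kd = koff/kon. A minimal worked example, using illustrative rate values rather than data from this work:

```python
# Equilibrium dissociation constant from kinetic rate constants for a
# 1:1 binding model. The numbers below are illustrative, not measured values.
kon = 1.0e5    # association rate constant, M^-1 s^-1
koff = 1.0e-3  # dissociation rate constant, s^-1

Kd = koff / kon   # dissociation constant in M; smaller Kd means tighter binding
Kd_nM = Kd * 1e9  # same value expressed in nM
```

These illustrative rates correspond to a Kd of 10 nM, typical of a moderately tight protein-protein interaction.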
Automated Array Assembly Task In-depth Study of Silicon Wafer Surface Texturizing
NASA Technical Reports Server (NTRS)
Jones, G. T.; Chitre, S.; Rhee, S. S.; Allison, K. L.
1979-01-01
A low cost wafer surface texturizing process was studied. An investigation of low cost cleaning operations to clean residual wax and organics from the surface of silicon wafers was made. The feasibility of replacing dry nitrogen with clean dry air for drying silicon wafers was examined. The two stage texturizing process was studied for the purpose of characterizing relevant parameters in large volume applications. The effect of gettering solar cells on photovoltaic energy conversion efficiency is described.
Wang, Yang; Ruan, Qingyu; Lei, Zhi-Chao; Lin, Shui-Chao; Zhu, Zhi; Zhou, Leiji; Yang, Chaoyong
2018-04-17
Digital microfluidics (DMF) is a powerful platform for a broad range of applications, especially multistep immunoassays, due to the advantages of low reagent consumption and a high degree of automation. Surface-enhanced Raman scattering (SERS) has proven to be an attractive method for highly sensitive and multiplexed detection because of its remarkable signal amplification and excellent spatial resolution. Here we propose a SERS-based immunoassay with DMF for rapid, automated, and sensitive detection of disease biomarkers. SERS tags labeled with the Raman reporter 4-mercaptobenzoic acid (4-MBA) were synthesized with a core@shell nanostructure and showed strong signals, good uniformity, and high stability. A sandwich immunoassay was designed in which magnetic beads coated with antibodies were used as a solid support to capture antigens from samples, forming a bead-antibody-antigen immunocomplex. By labeling the immunocomplex with a detection antibody-functionalized SERS tag, antigen can be sensitively detected through the strong SERS signal. The automation capability of DMF greatly simplifies the assay procedure while reducing the risk of exposure to hazardous samples. Quantitative detection of avian influenza virus H5N1 in buffer and human serum was implemented to demonstrate the utility of the DMF-SERS method. The DMF-SERS method shows excellent sensitivity (LOD of 74 pg/mL) and selectivity for H5N1 detection with a shorter assay time (<1 h) and lower reagent consumption (∼30 μL) compared to the standard ELISA method. Therefore, this DMF-SERS method holds great potential for the automated and sensitive detection of a variety of infectious diseases.
NASA Astrophysics Data System (ADS)
Savant, Vaibhav; Smith, Niall
2016-07-01
We report on the current status in the development of a pilot automated data acquisition and reduction pipeline based around the operation of two nodes of remotely operated robotic telescopes based in California, USA and Cork, Ireland. The observatories are primarily used as a testbed for automation and instrumentation and as a tool to facilitate STEM (Science Technology Engineering Mathematics) promotion. The Ireland node is situated at Blackrock Castle Observatory (operated by Cork Institute of Technology) and consists of two optical telescopes - 6" and 16" OTAs housed in two separate domes - while the node in California is its 6" replica. Together they form a pilot Telescope ARrAy known as TARA. QuickPhot is an automated data reduction pipeline designed primarily to shed more light on the microvariability of blazars, employing precision optical photometry and using data from the TARA telescopes as they constantly monitor predefined targets whenever observing conditions are favourable. After carrying out aperture photometry, if any variability above a given threshold is observed, the reporting telescope will communicate the source concerned and the other nodes will follow up with multi-band observations, taking advantage of their location in strategically separated time zones. Ultimately we wish to investigate the applicability of shock-in-jet and geometric models. These try to explain the processes at work in AGNs which result in the formation of jets, by looking for temporal and spectral variability in TARA multi-band observations. We are also experimenting with a Two-channel Optical PHotometric Imaging CAMera (TOΦCAM) that we have developed and which has been optimised for simultaneous two-band photometry on our 16" OTA.
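The thresholded variability check described above can be sketched as a differential-photometry test: compare the scatter of the target's light curve (measured against a comparison star) with the comparison star's own scatter. The function name, the specific criterion, and the 3-sigma default are illustrative assumptions; the abstract does not describe QuickPhot's actual detection logic.

```python
import numpy as np

def variability_alert(target_mag, comparison_mag, threshold=3.0):
    """Flag a target as variable when the scatter of its differential
    light curve exceeds `threshold` times the comparison-star scatter.

    This is an illustrative sketch, not the QuickPhot implementation."""
    diff = np.asarray(target_mag) - np.asarray(comparison_mag)  # differential photometry
    noise = np.std(np.asarray(comparison_mag))                  # proxy for photometric error
    return bool(np.std(diff) > threshold * noise)

# A flagged source would trigger follow-up multi-band observations
# at the other TARA node, in a different time zone.
```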
NASA Astrophysics Data System (ADS)
Forster, Linda; Seefeldner, Meinhard; Wiegner, Matthias; Mayer, Bernhard
2017-07-01
Halo displays in the sky contain valuable information about ice crystal shape and orientation: e.g., the 22° halo is produced by randomly oriented hexagonal prisms, while parhelia (sundogs) indicate oriented plates. HaloCam, a novel sun-tracking camera system for the automated observation of halo displays, is presented. An initial visual evaluation of the frequency of halo displays for the ACCEPT (Analysis of the Composition of Clouds with Extended Polarization Techniques) field campaign from October to mid-November 2014 showed that sundogs were observed more often than 22° halos. Thus, the majority of halo displays were produced by oriented ice crystals. During the campaign about 27 % of the cirrus clouds produced 22° halos, sundogs or upper tangent arcs. To evaluate the HaloCam observations collected from regular measurements in Munich between January 2014 and June 2016, an automated detection algorithm for 22° halos was developed, which can be extended to other halo types as well. This algorithm detected 22° halos about 2 % of the time for this dataset. The frequency of cirrus clouds during this time period was estimated by co-located ceilometer measurements using temperature thresholds of the cloud base. About 25 % of the detected cirrus clouds occurred together with a 22° halo, which implies that these clouds contained a certain fraction of smooth, hexagonal ice crystals. HaloCam observations complemented by radiative transfer simulations and measurements of aerosol and cirrus cloud optical thickness (AOT and COT) provide a possibility to retrieve more detailed information about ice crystal roughness. This paper demonstrates the feasibility of a completely automated method to collect and evaluate a long-term database of halo observations and shows the potential to characterize ice crystal properties.
Automated management for pavement inspection system (AMPIS)
NASA Astrophysics Data System (ADS)
Chung, Hung Chi; Girardello, Roberto; Soeller, Tony; Shinozuka, Masanobu
2003-08-01
An automated in-situ road surface distress surveying and management system, AMPIS, has been developed on the basis of video images within the framework of GIS software. Video image processing techniques are introduced to acquire, process, and analyze road surface images obtained from a moving vehicle. The ArcGIS platform is used to integrate the routines of image processing and spatial analysis in handling full-scale metropolitan highway surface distress detection and data fusion/management. This makes it possible to present user-friendly GIS interfaces and efficient visualizations of survey results, not only for transportation engineers managing road survey documentation, data acquisition, analysis, and management, but also for financial officials planning maintenance and repair programs and evaluating the socio-economic impacts of highway degradation and deterioration. A review performed in this study of the fundamental principles of Pavement Management Systems (PMS) and their implementation indicates that the proposed approach of using GIS concepts and tools for PMS applications will reshape PMS into a new information-technology-based system providing convenient and efficient pavement inspection and management.
NASA Technical Reports Server (NTRS)
Thompson, David S.; Soni, Bharat K.
2000-01-01
An integrated software package, ICEG2D, was developed to automate computational fluid dynamics (CFD) simulations for single-element airfoils with ice accretion. ICEG2D is designed to automatically perform three primary functions: (1) generating a grid-ready surface definition based on the geometrical characteristics of the iced airfoil surface, (2) generating a high-quality grid using the generated surface point distribution, and (3) generating the input and restart files needed to run the general purpose CFD solver NPARC. ICEG2D can be executed in batch mode using a script file or in an interactive mode by entering directives from a command line. This report summarizes activities completed in the first year of a three-year research and development program to address issues related to CFD simulations for aircraft components with ice accretion. Specifically, this document describes the technology employed in the software, the installation procedure, and the operation of the software package. Validation of the geometry and grid generation modules of ICEG2D is also discussed.
Blacker, Teddy D.
1994-01-01
An automatic quadrilateral surface discretization method and apparatus is provided for automatically discretizing a geometric region without decomposing the region. The automated quadrilateral surface discretization method and apparatus automatically generates a mesh of all quadrilateral elements which is particularly useful in finite element analysis. The generated mesh of all quadrilateral elements is boundary sensitive, orientation insensitive and has few irregular nodes on the boundary. A permanent boundary of the geometric region is input and rows are iteratively layered toward the interior of the geometric region. Alternatively, an exterior permanent boundary and an interior permanent boundary for a geometric region may be input, and the rows are iteratively layered inward from the exterior boundary in a counterclockwise direction while the rows are iteratively layered from the interior permanent boundary toward the exterior of the region in a clockwise direction. As a result, a high quality mesh for an arbitrary geometry may be generated with a technique that is robust and fast for complex geometric regions and extreme mesh gradations.
Triangulation methods for automated docking
NASA Technical Reports Server (NTRS)
Bales, John W.
1996-01-01
An automated docking system must have a reliable method for determining range and orientation of the passive (target) vehicle with respect to the active vehicle. This method must also provide accurate information on the rates of change of range to and orientation of the passive vehicle. The method must be accurate within required tolerances and capable of operating in real time. The method being developed at Marshall Space Flight Center employs a single TV camera, a laser illumination system and a target consisting, in its minimal configuration, of three retro-reflectors. Two of the retro-reflectors are mounted flush to the same surface, with the third retro-reflector mounted to a post fixed midway between the other two and jutting at a right angle from the surface. For redundancy, two additional retro-reflectors are mounted on the surface on a line at right angles to the line containing the first two retro-reflectors, and equally spaced on either side of the post. The target vehicle will contain a large target for initial acquisition and several smaller targets for close range.
Surface thermohardening by a fast-moving electric arc
NASA Astrophysics Data System (ADS)
Gabdrakhmanov, Az T.; Shafigullin, L. N.; Galimov, E. R.; Ibragimov, A. R.
2017-01-01
This paper describes the technology of modern plasma surface hardening of steels and the prospects for its application. The method makes it possible to control the process without cooling media, vacuum, or special coatings to improve the absorptive capacity of hardened surfaces; it offers simplicity, low cost, maneuverability, and compact process equipment, and it lends itself to automation and robotization of the technological process.
Remote monitoring of primates using automated GPS technology in open habitats.
Markham, A Catherine; Altmann, Jeanne
2008-05-01
Automated tracking using a satellite global positioning system (GPS) has major potential as a research tool in studies of primate ecology. However, implementation has been limited, at least partly because of technological difficulties associated with the dense forest habitat of many primates. In contrast, primates inhabiting relatively open environments may provide ideal subjects for use of GPS collars, yet no empirical tests have evaluated this proposition. Here, we used an automated GPS collar to record the locations, approximate body surface temperature, and activity of an adult female baboon during 90 days in the savannah habitat of Amboseli, Kenya. Given the GPS collar's impressive reliability, high spatial accuracy, other associated measurements, and low impact on the study animal, our results indicate the great potential of applying GPS technology to research on wild primates.
Automation of the electron-beam welding process
NASA Astrophysics Data System (ADS)
Koleva, E.; Dzharov, V.; Kardjiev, M.; Mladenov, G.
2016-03-01
In this work, automatic control of the vacuum and cooling systems of the electron-beam welding, evaporation, and surface modification equipment located at the IE-BAS is considered. A project was elaborated for control and management based on the development of an engineering support system using existing and additional technical means of automation. Optimization of the indicators critical for the time needed to reach the working regime and to shut down the installation can be performed using experimentally obtained transient characteristics. The automation of the available equipment, aimed at improving its efficiency and the repeatability of the obtained results as well as at stabilizing the process parameters, should be integrated into an engineering support system which, besides operator supervision, consists of several subsystems for equipment control, data acquisition, information analysis, system management, and decision-making support.
Automation Technology in Elementary Technology Education.
ERIC Educational Resources Information Center
Hiltunen, Jukka; Jarvinen, Esa-Matti
2000-01-01
Finnish fifth-graders (n=20) and sixth-graders (n=23) worked in teams in a Lego/Logo-Control Lab to complete Lego design activities. Observations showed that they became familiar with automation technology but their skills were not always up to their ideas. Activities based on real-life situations gave them ownership and engaged them in learning.…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-22
...''); Amistar Automation, Inc. (``Amistar'') of San Marcos, California; Techno Soft Systemnics, Inc. (``Techno..., the ALJ's construction of the claim terms ``test,'' ``match score surface,'' and ``gradient direction...
Managing complexity in simulations of land surface and near-surface processes
Coon, Ethan T.; Moulton, J. David; Painter, Scott L.
2016-01-12
Increasing computing power and the growing role of simulation in Earth systems science have led to an increase in the number and complexity of processes in modern simulators. We present a multiphysics framework that specifies interfaces for coupled processes and automates weak and strong coupling strategies to manage this complexity. Process management is enabled by viewing the system of equations as a tree, where individual equations are associated with leaf nodes and coupling strategies with internal nodes. A dynamically generated dependency graph connects a variable to its dependencies, streamlining and automating model evaluation, easing model development, and ensuring models are modular and flexible. Additionally, the dependency graph is used to ensure that data requirements are consistent between all processes in a given simulation. Here we discuss the design and implementation of these concepts within the Arcos framework, and demonstrate their use for verification testing and hypothesis evaluation in numerical experiments.
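The tree-of-equations idea can be sketched in a few lines: leaf nodes advance individual equations, and internal nodes apply a coupling strategy to their children. The class names and the toy "surface/subsurface" processes below are illustrative assumptions, not the actual Arcos API.

```python
# Minimal sketch of weak (operator-split) coupling in an equation tree.
# Names are illustrative; this is not the Arcos implementation.

class Leaf:
    """Leaf node: advances one equation on a shared state."""
    def __init__(self, name, advance):
        self.name, self.advance = name, advance

    def step(self, state, dt):
        self.advance(state, dt)

class WeakCoupler:
    """Internal node: sequentially advances its children (weak coupling)."""
    def __init__(self, children):
        self.children = children

    def step(self, state, dt):
        for child in self.children:
            child.step(state, dt)

# Toy processes acting on a shared state dictionary: a warming surface
# and a draining subsurface.
state = {"T_surface": 270.0, "theta": 0.3}
surface = Leaf("surface",
               lambda s, dt: s.__setitem__("T_surface", s["T_surface"] + 0.1 * dt))
subsurface = Leaf("subsurface",
                  lambda s, dt: s.__setitem__("theta", s["theta"] - 0.01 * dt))

root = WeakCoupler([surface, subsurface])
root.step(state, dt=1.0)  # one weakly coupled time step
```

A strong coupler would instead assemble its children into one implicit system; the tree structure lets either strategy sit at any internal node.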
NASA Astrophysics Data System (ADS)
Jones, Louis Chin
This thesis entails the synthesis, automated catalytic testing, and in situ molecular characterization of supported Pt and Pt-alloy nanoparticle (NP) catalysts, with emphasis on how to assess the molecular distributions of Pt environments that are affecting overall catalytic activity and selectivity. We have taken the approach of (a) manipulating nucleation and growth of NPs using oxide supports, surfactants, and inorganic complexes to create Pt NPs with uniform size, shape, and composition, (b) automating batch and continuous flow catalytic reaction tests, and (c) characterizing the molecular environments of Pt surfaces using in situ infrared (IR) spectroscopy and solid-state 195Pt NMR. The following will highlight the synthesis and characterization of Ag-doped Pt NPs and their influence on C2H2 hydrogenation selectivity, and the implementation of advanced solid-state 195Pt NMR techniques to distinguish how distributions of molecular Pt environments vary with nanoparticle size, support, and surface composition.
NASA Technical Reports Server (NTRS)
Weaver, W. L.; Norton, H. N.; Darnell, W. L.
1975-01-01
Mission concepts were investigated for automated return to Earth of a Mars surface sample adequate for detailed analyses in scientific laboratories. The minimum sample mass sufficient to meet scientific requirements was determined. Types of materials and supporting measurements for essential analyses are reported. A baseline trajectory profile was selected for its low energy requirements and relatively simple implementation, and trajectory profile design data were developed for 1979 and 1981 launch opportunities. Efficient spacecraft systems were conceived by utilizing existing technology where possible. Systems concepts emphasized the 1979 launch opportunity, and the applicability of results to other opportunities was assessed. It was shown that the baseline missions (return through Mars parking orbit) and some comparison missions (return after sample transfer in Mars orbit) can be accomplished by using a single Titan III E/Centaur as the launch vehicle. All missions investigated can be accomplished by use of Space Shuttle/Centaur vehicles.
Fast and accurate determination of the detergent efficiency by optical fiber sensors
NASA Astrophysics Data System (ADS)
Patitsa, Maria; Pfeiffer, Helge; Wevers, Martine
2011-06-01
An optical fiber sensor was developed to monitor the cleaning efficiency of surfactants. Prior to the measurements, the sensing part of the probe is covered with a uniform standardized soil layer (a lipid multilayer), and a gold mirror is deposited at the end of the optical fiber. The lipid multilayer was deposited on the fiber by the Langmuir-Blodgett technique, and the progress of deposition was followed online by ultraviolet spectroscopy. The invention provides a miniaturized surface plasmon resonance dip-sensor for automated on-line testing that can replace costly and time-consuming existing methods, representing a breakthrough in detergent testing by combining optical sensing, surface chemistry, and automated data acquisition. The sensor is to be used to evaluate the detergency of different cleaning products and to indicate how formulation, concentration, lipid nature, and temperature affect the cleaning behavior of a surfactant.
Mapping Cortical Laminar Structure in the 3D BigBrain.
Wagstyl, Konrad; Lepage, Claude; Bludau, Sebastian; Zilles, Karl; Fletcher, Paul C; Amunts, Katrin; Evans, Alan C
2018-07-01
Histological sections offer high spatial resolution to examine laminar architecture of the human cerebral cortex; however, they are restricted by being 2D, hence only regions with sufficiently optimal cutting planes can be analyzed. Conversely, noninvasive neuroimaging approaches are whole brain but have relatively low resolution. Consequently, correct 3D cross-cortical patterns of laminar architecture have never been mapped in histological sections. We developed an automated technique to identify and analyze laminar structure within the high-resolution 3D histological BigBrain. We extracted white matter and pial surfaces, from which we derived histologically verified surfaces at the layer I/II boundary and within layer IV. Layer IV depth was strongly predicted by cortical curvature but varied between areas. This fully automated 3D laminar analysis is an important requirement for bridging high-resolution 2D cytoarchitecture and in vivo 3D neuroimaging. It lays the foundation for in-depth, whole-brain analyses of cortical layering.
Ging, Patricia B.
1999-01-01
Surface-water sampling protocols of the U.S. Geological Survey National Water-Quality Assessment (NAWQA) Program specify samples for most properties and constituents to be collected manually in equal-width increments across a stream channel and composited for analysis. Single-point sampling with an automated sampler (autosampler) during storms was proposed in the upper part of the South-Central Texas NAWQA study unit, raising the question of whether property and constituent concentrations from automatically collected samples differ significantly from those in samples collected manually. Statistical (Wilcoxon signed-rank test) analyses of 3 to 16 paired concentrations for each of 26 properties and constituents from water samples collected using both methods at eight sites in the upper part of the study unit indicated that there were no significant differences in concentrations for dissolved constituents, other than calcium and organic carbon.
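The Wilcoxon signed-rank comparison described above can be reproduced with a few lines of SciPy. The paired concentrations below are illustrative placeholders, not NAWQA data; a large p-value indicates no significant difference between the manual and automated collection methods.

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical paired concentrations (mg/L) for one constituent at one site:
# manual equal-width-increment composites vs. single-point autosampler grabs.
manual = np.array([1.00, 2.00, 3.10, 4.00, 5.20, 6.10, 7.00, 8.20])
auto = manual + np.array([0.10, -0.10, 0.05, -0.05, 0.12, -0.12, 0.02, -0.02])

# Wilcoxon signed-rank test on the paired differences (two-sided).
stat, pvalue = wilcoxon(manual, auto)
```

Because the test ranks the paired differences rather than assuming normality, it suits the small sample sizes (3 to 16 pairs) reported in the study.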
Video Guidance Sensor for Surface Mobility Operations
NASA Technical Reports Server (NTRS)
Fernandez, Kenneth R.; Fischer, Richard; Bryan, Thomas; Howell, Joe; Howard, Ricky; Peters, Bruce
2008-01-01
Robotic systems and surface mobility will play an increased role in future exploration missions. Unlike the LRV of the Apollo era, which was an astronaut-piloted vehicle, future systems will include teleoperated and semi-autonomous operations. The tasks given to these vehicles will range from infrastructure maintenance and ISRU to construction, to name a few. A common task would be the retrieval and deployment of trailer-mounted equipment. Operational scenarios may require these operations to be performed remotely via a teleoperated mode, or semi-autonomously. This presentation describes the on-going project to adapt the Automated Rendezvous and Capture (AR&C) sensor developed at the Marshall Space Flight Center for use in an automated trailer pick-up and deployment operation. The sensor, which has been successfully demonstrated on-orbit, has been mounted on an iRobot/John Deere RGATOR autonomous vehicle for this demonstration, which will be completed in the March 2008 time-frame.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, Andrew; Lawrence, Earl
The Response Surface Modeling (RSM) Tool Suite is a collection of three codes used to generate an empirical interpolation function for a collection of drag coefficient calculations computed with Test Particle Monte Carlo (TPMC) simulations. The first code, "Automated RSM", automates the generation of a drag coefficient RSM for a particular object to a single command. "Automated RSM" first creates a Latin Hypercube Sample (LHS) of 1,000 ensemble members to explore the global parameter space. For each ensemble member, a TPMC simulation is performed and the object drag coefficient is computed. In the next step of the "Automated RSM" code, a Gaussian process is used to fit the TPMC simulations. In the final step, Markov Chain Monte Carlo (MCMC) is used to evaluate the non-analytic probability distribution function from the Gaussian process. The second code, "RSM Area", creates a look-up table for the projected area of the object based on input limits on the minimum and maximum allowed pitch and yaw angles and pitch and yaw angle intervals. The projected area from the look-up table is used to compute the ballistic coefficient of the object based on its pitch and yaw angle. An accurate ballistic coefficient is crucial in accurately computing the drag on an object. The third code, "RSM Cd", uses the RSM generated by the "Automated RSM" code and the projected area look-up table generated by the "RSM Area" code to accurately compute the drag coefficient and ballistic coefficient of the object. The user can modify the object velocity, object surface temperature, the translational temperature of the gas, the species concentrations of the gas, and the pitch and yaw angles of the object. Together, these codes allow for the accurate derivation of an object's drag coefficient and ballistic coefficient under any conditions with only knowledge of the object's geometry and mass.
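The first two steps of the "Automated RSM" workflow (Latin hypercube sampling of the parameter space, then a Gaussian-process fit to the sampled simulations) can be sketched as follows. A one-dimensional analytic function stands in for a TPMC drag-coefficient simulation, and the sample size, kernel, and length scale are illustrative choices, not values from the tool suite.

```python
import numpy as np
from scipy.stats import qmc

# Step 1: Latin Hypercube Sample of the (here 1-D, illustrative) parameter space.
sampler = qmc.LatinHypercube(d=1, seed=0)
X = qmc.scale(sampler.random(n=40), l_bounds=[0.0], u_bounds=[10.0])
y = 2.2 + 0.3 * np.sin(X[:, 0])  # stand-in for TPMC-computed drag coefficients

# Step 2: fit a Gaussian-process (RBF-kernel) interpolant to the samples.
def gp_predict(X_train, y_train, X_query, length=1.0, noise=1e-8):
    """Posterior-mean GP prediction with a squared-exponential kernel."""
    def k(a, b):
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length**2)
    K = k(X_train, X_train) + noise * np.eye(len(X_train))  # nugget for stability
    weights = np.linalg.solve(K, y_train)
    return k(X_query, X_train) @ weights

# The response surface now predicts the "drag coefficient" anywhere in the domain.
cd_hat = gp_predict(X[:, 0], y, np.array([3.0]))
```

The real suite adds an MCMC evaluation of the GP's non-analytic distribution and the pitch/yaw projected-area look-up; those steps are omitted here.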
Cislunar space infrastructure: Lunar technologies
NASA Technical Reports Server (NTRS)
Faller, W.; Hoehn, A.; Johnson, S.; Moos, P.; Wiltberger, N.
1989-01-01
Continuing its emphasis on the creation of a cislunar infrastructure as an appropriate and cost-effective method of space exploration and development, the University of Colorado explores the technologies necessary for the creation of such an infrastructure, namely (1) automation and robotics; (2) life support systems; (3) fluid management; (4) propulsion; and (5) rotating technologies. The technological focal point is the development of automated and robotic systems for the implementation of a Lunar Oasis produced by automation and robotics (LOARS). Under direction from the NASA Office of Exploration, automation and robotics have been extensively utilized as an initiating stage in the return to the Moon. A pair of autonomous rovers, modular in design and built from interchangeable and specialized components, is proposed. Utilizing a 'buddy system', these rovers will be able to support each other and to enhance their individual capabilities. One rover primarily explores and maps while the second rover tests the feasibility of various materials-processing techniques. The automated missions emphasize the availability and potential uses of lunar resources and the deployment and operations of the LOARS program. An experimental bio-volume is put into place as the precursor to a Lunar Environmentally Controlled Life Support System. The bio-volume will determine the reproduction, growth, and production characteristics of various life forms housed on the lunar surface. Physiochemical regenerative technologies and stored resources will be used to buffer biological disturbances of the bio-volume environment. The in situ lunar resources will be both tested and used within this bio-volume. Second-phase development on the lunar surface calls for manned operations. Repairs and reconfiguration of the initial framework will ensue. An autonomously initiated, manned Lunar Oasis can become an essential component of the United States space program.
The Lunar Oasis will provide support to science, technology, and commerce. It will enable more cost-effective space exploration to the planets and beyond.
Raith, Stefan; Vogel, Eric Per; Anees, Naeema; Keul, Christine; Güth, Jan-Frederik; Edelhoff, Daniel; Fischer, Horst
2017-01-01
Chairside manufacturing based on digital image acquisition is gaining increasing importance in dentistry. For the standardized application of these methods, it is paramount to have highly automated digital workflows that can process acquired 3D image data of dental surfaces. Artificial Neural Networks (ANNs) are numerical methods primarily used to mimic the complex networks of neural connections in the natural brain. Our hypothesis is that an ANN can be developed that is capable of classifying dental cusps with sufficient accuracy. This bears enormous potential for an application in chairside manufacturing workflows in the dental field, as it closes the gap between digital acquisition of dental geometries and modern computer-aided manufacturing techniques. Three-dimensional surface scans of dental casts representing natural full dental arches were transformed to range image data. These data were processed using an automated algorithm to detect candidates for tooth cusps according to salient geometrical features. These candidates were classified following common dental terminology and used as training data for a tailored ANN. For the actual cusp feature description, two different approaches were developed and applied to the available data: the first uses the relative location of the detected cusps as input data, and the second directly takes the image information given in the range images. In addition, a combination of both was implemented and investigated. Both approaches showed high performance, with correct classifications of 93.3% and 93.5%, respectively, with improvements from the combination shown to be minor. This article presents for the first time a fully automated method for the classification of teeth that was confirmed to work with sufficient precision to exhibit the potential for use in clinical practice, which is a prerequisite for automated computer-aided planning of prosthetic treatments with subsequent automated chairside manufacturing.
NASA Astrophysics Data System (ADS)
Meneguz, Elena; Turp, Debi; Wells, Helen
2015-04-01
It is well known that encounters with moderate or severe turbulence can lead to passenger injuries and incur high costs for airlines from compensation and litigation. As one of two World Area Forecast Centres (WAFCs), the Met Office has responsibility for forecasting en-route weather hazards worldwide for aviation above a height of 10,000 ft. Observations from commercial aircraft provide a basis for gaining a better understanding of turbulence and for improving turbulence forecasts through verification. However, there is currently a lack of information regarding the possible cause of observed turbulence, or whether the turbulence occurred within cloud. Such information would be invaluable for the development of forecasting techniques for particular types of turbulence and for forecast verification. Of all the possible sources, convective activity is believed to be a major cause of turbulence. Its relative importance over the Europe and North Atlantic area has not yet been quantified in a systematic way: in this study, a new approach is developed to automate identification of turbulent encounters in the proximity of convective clouds. Observations of convection are provided from two independent sources: a surface-based lightning network and satellite imagery. Lightning observations are taken from the Met Office Arrival Time Detection network (ATDnet). ATDnet has been designed to identify cloud-to-ground flashes over Europe but also detects (a smaller fraction of) strikes over the North Atlantic. Meteosat Second Generation (MSG) satellite products are used to identify convective clouds by applying a brightness-temperature filtering technique. The morphological features of cold cloud tops are also investigated. The system is run for all in situ turbulence reports received from airlines for a total of 12 months during summer 2013 and 2014 for the domain of interest.
Results of this preliminary short-term climatological study show significant intra-seasonal variability; on average, 15% of all aircraft encounters with turbulence are found in the proximity of convective clouds.
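The brightness temperature filtering step described above can be sketched as a simple threshold test followed by a proximity check. This is an illustrative sketch only: the 235 K cutoff, the list-of-lists grid, and the one-cell neighbourhood radius are assumptions, not values from the study.

```python
# Hedged sketch of convective-cloud identification by brightness temperature
# filtering, plus a proximity test for locating turbulence reports near
# flagged pixels. Threshold and radius are illustrative assumptions.

def flag_convective(bt_grid, threshold_k=235.0):
    """Return a boolean mask: True where the pixel is colder than the cutoff."""
    return [[bt < threshold_k for bt in row] for row in bt_grid]

def near_convection(mask, i, j, radius=1):
    """True if any flagged pixel lies within `radius` grid cells of (i, j)."""
    rows, cols = len(mask), len(mask[0])
    for di in range(-radius, radius + 1):
        for dj in range(-radius, radius + 1):
            r, c = i + di, j + dj
            if 0 <= r < rows and 0 <= c < cols and mask[r][c]:
                return True
    return False
```

A real implementation would operate on calibrated MSG infrared channels and map each aircraft turbulence report onto the satellite grid before applying the proximity test.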
A real-time automated quality control of rain gauge data based on multiple sensors
NASA Astrophysics Data System (ADS)
qi, Y.; Zhang, J.
2013-12-01
Precipitation is one of the most important meteorological and hydrological variables. Automated rain gauge networks provide direct measurements of precipitation and have been used for numerous applications such as generating regional and national precipitation maps, calibrating remote sensing data, and validating hydrological and meteorological model predictions. Automated gauge observations are prone to a variety of error sources (instrument malfunction, transmission errors, format changes) and require careful quality control (QC). Many previous gauge QC techniques were based on neighborhood checks within the gauge network itself, and their effectiveness depends on gauge density and precipitation regime. The current study takes advantage of the multi-sensor data sources in the National Mosaic and Multi-Sensor QPE (NMQ/Q2) system and develops an automated gauge QC scheme based on the consistency of radar hourly QPEs and gauge observations. Error characteristics of radar and gauges as a function of the radar sampling geometry, precipitation regimes, and the freezing level height are considered. The new scheme was evaluated by comparing an NMQ national gauge-based precipitation product with independent manual gauge observations. Twelve heavy rainfall events from different seasons and areas of the United States were selected for the evaluation, and the results show that the new NMQ product with QC'ed gauges has a more physically realistic spatial distribution than the old product. The new product also agrees much better statistically with the independent gauges.
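A radar-gauge consistency check of the general kind described can be sketched as follows. This is not the NMQ/Q2 implementation; the absolute and relative tolerance values are assumptions for demonstration, and a real scheme would additionally condition them on radar sampling geometry, precipitation regime, and freezing level height as the abstract describes.

```python
# Hedged sketch: accept or reject a gauge's hourly total based on its
# consistency with the collocated radar hourly QPE. Tolerances are
# illustrative assumptions, not values from the study.

def qc_gauge(gauge_mm, radar_mm, abs_tol=2.0, rel_tol=0.5):
    """Return True if the gauge observation passes the radar-consistency check."""
    diff = abs(gauge_mm - radar_mm)
    if diff <= abs_tol:             # small absolute disagreement: accept
        return True
    denom = max(gauge_mm, radar_mm)
    return diff / denom <= rel_tol  # otherwise require relative agreement
```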
The automated data processing architecture for the GPI Exoplanet Survey
NASA Astrophysics Data System (ADS)
Wang, Jason J.; Perrin, Marshall D.; Savransky, Dmitry; Arriaga, Pauline; Chilcote, Jeffrey K.; De Rosa, Robert J.; Millar-Blanchaer, Maxwell A.; Marois, Christian; Rameau, Julien; Wolff, Schuyler G.; Shapiro, Jacob; Ruffio, Jean-Baptiste; Graham, James R.; Macintosh, Bruce
2017-09-01
The Gemini Planet Imager Exoplanet Survey (GPIES) is a multi-year direct imaging survey of 600 stars to discover and characterize young Jovian exoplanets and their environments. We have developed an automated data architecture to process and index all data related to the survey uniformly. An automated and flexible data processing framework, which we term the GPIES Data Cruncher, combines multiple data reduction pipelines together to intelligently process all spectroscopic, polarimetric, and calibration data taken with GPIES. With no human intervention, fully reduced and calibrated data products are available less than an hour after the data are taken to expedite follow-up on potential objects of interest. The Data Cruncher can run on a supercomputer to reprocess all GPIES data in a single day as improvements are made to our data reduction pipelines. A backend MySQL database indexes all files, which are synced to the cloud, and a front-end web server allows for easy browsing of all files associated with GPIES. To help observers, quicklook displays show reduced data as they are processed in real-time, and chatbots on Slack post observing information as well as reduced data products. Together, the GPIES automated data processing architecture reduces our workload, provides real-time data reduction, optimizes our observing strategy, and maintains a homogeneously reduced dataset to study planet occurrence and instrument performance.
The Underestimation of Isoprene in Houston during the Texas 2013 DISCOVER-AQ Campaign
NASA Astrophysics Data System (ADS)
Choi, Y.; Diao, L.; Czader, B.; Li, X.; Estes, M. J.
2014-12-01
This study applies principal component analysis to aircraft data from the Texas 2013 DISCOVER-AQ (Deriving Information on Surface Conditions from Column and Vertically Resolved Observations Relevant to Air Quality) field campaign to characterize isoprene sources over Houston during September 2013. The biogenic isoprene signature appears in the third principal component and anthropogenic signals in the following two. Evaluations of the Community Multiscale Air Quality (CMAQ) model simulations of isoprene against airborne measurements are more accurate for suburban areas than for industrial areas. This study also compares model outputs to eight surface automated gas chromatograph (Auto-GC) measurements near the Houston ship channel industrial area during the nighttime and shows that modeled anthropogenic isoprene is underestimated by a factor of 10.60. This study employs a new simulation with a modified anthropogenic emissions inventory (constrained using ratios of observed to simulated values) that yields closer isoprene predictions at night, with a reduction in the mean bias of 56.93%, implying that model-estimated isoprene emissions from the 2008 National Emission Inventory are underestimated in the city of Houston and that other climate models or chemistry and transport models using the same emissions inventory might also underestimate isoprene in other Houston-like areas in the United States.
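The inventory constraint described, scaling emissions by the ratio of observed to simulated concentrations, can be sketched as below. The source categories and numbers are hypothetical; the study itself constrains anthropogenic isoprene emissions using Auto-GC observations against CMAQ output.

```python
# Hedged sketch of a ratio-based emissions constraint: each source's
# inventory emission is scaled by the observed/simulated concentration
# ratio. Source names and values here are hypothetical.

def constrain_emissions(inventory, observed, simulated):
    """Scale each source's emission by its obs/sim ratio (guarding zero sim)."""
    scaled = {}
    for src, emission in inventory.items():
        obs, sim = observed[src], simulated[src]
        scaled[src] = emission * (obs / sim) if sim > 0 else emission
    return scaled
```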
Dry deposition of large, airborne particles onto a surrogate surface
NASA Astrophysics Data System (ADS)
Kim, Eugene; Kalman, David; Larson, Timothy
Simultaneous measurements of particle dry deposition flux and airborne number concentration in the open atmosphere were made using three different types of artificially generated particles in the size range 10-100 μm: perlite, diatomaceous earth, and glass beads. A combination of gravimetric analysis, automated microscopy, and sonic anemometry provided size-resolved estimates of both the inertial and gravitational components of the quasi-laminar layer particle deposition velocity, (Vd)b, as a function of size. Eddy inertial deposition efficiency (ηdI) was determined as a function of dimensionless eddy Stokes number (Stke). In the range 3
Kline, Timothy L; Korfiatis, Panagiotis; Edwards, Marie E; Blais, Jaime D; Czerwiec, Frank S; Harris, Peter C; King, Bernard F; Torres, Vicente E; Erickson, Bradley J
2017-08-01
Deep learning techniques are being rapidly applied to medical imaging tasks, from organ and lesion segmentation to tissue and tumor classification. These techniques are becoming the leading algorithmic approaches to solve inherently difficult image processing tasks. Currently, the most critical requirement for successful implementation lies in the need for relatively large datasets that can be used for training the deep learning networks. Based on our initial studies of MR imaging examinations of the kidneys of patients affected by polycystic kidney disease (PKD), we have generated a unique database of imaging data and corresponding reference standard segmentations of polycystic kidneys. In the study of PKD, segmentation of the kidneys is needed in order to measure total kidney volume (TKV). Automated methods to segment the kidneys and measure TKV are needed to increase measurement throughput and alleviate the inherent variability of human-derived measurements. We hypothesize that deep learning techniques can be leveraged to perform fast, accurate, reproducible, and fully automated segmentation of polycystic kidneys. Here, we describe a fully automated approach for segmenting PKD kidneys within MR images that simulates a multi-observer approach in order to create an accurate and robust method for the task of segmentation and computation of TKV for PKD patients. A total of 2000 cases were used for training and validation, and 400 cases were used for testing. The multi-observer ensemble method had mean ± SD percent volume difference of 0.68 ± 2.2% compared with the reference standard segmentations. The complete framework performs fully automated segmentation at a level comparable with interobserver variability and could be considered as a replacement for the task of segmentation of PKD kidneys by a human.
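The evaluation metric reported above, percent volume difference against the reference standard segmentation, can be sketched in a few lines. The volume values below are hypothetical; in practice TKV would be computed from the segmented voxel count times voxel volume.

```python
# Hedged sketch of the reported evaluation metric: signed percent volume
# difference of an automated TKV measurement versus the reference standard,
# plus the mean +/- SD summary over a set of test cases.

def percent_volume_difference(auto_ml, reference_ml):
    """Signed percent difference of automated TKV relative to reference TKV."""
    return 100.0 * (auto_ml - reference_ml) / reference_ml

def mean_and_sd(values):
    """Mean and sample standard deviation (n - 1 denominator)."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    return mean, var ** 0.5
```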
Comparison of Inoculation with the InoqulA and WASP Automated Systems with Manual Inoculation
Croxatto, Antony; Dijkstra, Klaas; Prod'hom, Guy
2015-01-01
The quality of sample inoculation is critical for achieving an optimal yield of discrete colonies in both monomicrobial and polymicrobial samples to perform identification and antibiotic susceptibility testing. Consequently, we compared the performance between the InoqulA (BD Kiestra), the WASP (Copan), and manual inoculation methods. Defined mono- and polymicrobial samples of 4 bacterial species and cloudy urine specimens were inoculated on chromogenic agar by the InoqulA, the WASP, and manual methods. Images taken with ImagA (BD Kiestra) were analyzed with the VisionLab version 3.43 image analysis software to assess the quality of growth and to prevent subjective interpretation of the data. A 3- to 10-fold higher yield of discrete colonies was observed following automated inoculation with both the InoqulA and WASP systems than that with manual inoculation. The difference in performance between automated and manual inoculation was mainly observed at concentrations of >10^6 bacteria/ml. Inoculation with the InoqulA system allowed us to obtain significantly more discrete colonies than the WASP system at concentrations of >10^7 bacteria/ml. However, the level of difference observed was bacterial species dependent. Discrete colonies of bacteria present in 100- to 1,000-fold lower concentrations than the most concentrated populations in defined polymicrobial samples were not reproducibly recovered, even with the automated systems. The analysis of cloudy urine specimens showed that InoqulA inoculation provided a statistically significantly higher number of discrete colonies than that with WASP and manual inoculation. Consequently, the automated InoqulA inoculation greatly decreased the requirement for bacterial subculture and thus resulted in a significant reduction in the time to results, laboratory workload, and laboratory costs. PMID:25972424
Scales of variability of bio-optical properties as observed from near-surface drifters
NASA Technical Reports Server (NTRS)
Abbott, Mark R.; Brink, Kenneth H.; Booth, C. R.; Blasco, Dolors; Swenson, Mark S.; Davis, Curtiss O.; Codispoti, L. A.
1995-01-01
A drifter equipped with bio-optical sensors and an automated water sampler was deployed in the California Current as part of the coastal transition zone program to study the biological, chemical, and physical dynamics of the meandering filaments. During deployments in 1987 and 1988, measurements were made of fluorescence, downwelling irradiance, upwelling radiance, and beam attenuation using several bio-optical sensors. Samples were collected by an automated sampler for later analysis of nutrients and phytoplankton species composition. Large-scale spatial and temporal changes in the bio-optical and biological properties of the region were driven by changes in phytoplankton species composition which, in turn, were associated with the meandering circulation. Variance spectra of the bio-optical parameters revealed fluctuations on both diel and semidiurnal scales, perhaps associated with solar variations and internal tides, respectively. Offshore, inertial-scale fluctuations were apparent in the variance spectra of temperature, fluorescence, and beam attenuation. Although calibration samples can help remove some of these variations, these results suggest that bio-optical data from unattended platforms such as moorings and drifters must be analyzed carefully. Characterization of the scales of phytoplankton variability must account for the scales of variability in the algorithms used to convert bio-optical measurements into biological quantities.
NASA Astrophysics Data System (ADS)
Muller, Jan-Peter; Sidiropoulos, Panagiotis; Yershov, Vladimir; Gwinner, Klaus; van Gasselt, Stephan; Walter, Sebastian; Ivanov, Anton; Morley, Jeremy; Sprinks, James; Houghton, Robert; Bamford, Stephen; Kim, Jung-Rack
2015-04-01
Understanding the role of different planetary surface formation processes within our Solar System is one of the fundamental goals of planetary science research. There has been a revolution in planetary surface observations over the last 8 years, especially in 3D imaging of surface shape (down to resolutions of 10 cm) and subsequent terrain correction of imagery from orbiting spacecraft. This has led to the ability to overlay different epochs back to the mid-1970s and examine time-varying changes (such as impact craters, RSLs, CO2 geysers, gullies, boulder movements, and a host of ice-related phenomena). Consequently, we are seeing a dramatic improvement in our understanding of surface formation processes. Since January 2004 the ESA Mars Express has been acquiring global data, especially HRSC stereo (12.5-25 m nadir images) with 98% coverage with images ≤100 m and more than 70% useful for stereo mapping (e.g. atmosphere sufficiently clear). It has been demonstrated [Gwinner et al., 2010] that HRSC has the highest possible planimetric accuracy of ≤25 m and is well co-registered with MOLA, which represents the global 3D reference frame. HRSC 3D and terrain-corrected image products therefore represent the best available 3D reference data for Mars. Recently, Gwinner et al. [2015] have shown the ability to generate mosaiced DTM and BRDF-corrected surface reflectance maps. NASA began imaging the surface of Mars, initially from flybys in the 1960s, with the first orbiter with images ≤100 m in the late 1970s from Viking Orbiter. The most recent orbiter to begin imaging, in November 2006, is the NASA MRO, which has acquired surface imagery of around 1% of the Martian surface from HiRISE (at ≈25 cm) and ≈5% from CTX (≈6 m) in stereo. Unfortunately, for most of these NASA images, especially MGS, MO, VO, and HiRISE, the georeferencing accuracy is often worse than the quality of the Mars reference data from HRSC. This reduces their value for analysing changes in time series.
Within the iMars project (http://i-Mars.eu), a fully automated large-scale processing ("Big Data") solution has been developed to generate the best possible multi-resolution DTM of Mars, co-registering the DLR HRSC (50-100 m grid) products with those from CTX (6-20 m grid, loc. cit.) and HiRISE (1-3 m grids) on a large-scale Linux cluster based at MSSL with 224 cores and 0.25 PB of storage. The HRSC products are employed to provide a geographic reference for all current, future, and historical NASA products using automated co-registration based on feature points. Results of this automated co-registration and of the subsequent automated DTM generation will be shown. The metadata already available for all orbital imagery acquired to date, despite its poor georeferencing information, has been employed to determine the "sweet spots" that have long time series of measurements at different spatial resolution ranges over the last ≈50 years of observations; these will be shown. Starting in July 2015, as much as possible of the entire NASA and ESA record of orbital images will be co-registered, and the updated georeferencing information will be employed to generate a time series of terrain relief corrected orthorectified images (ORIs) back to 1977. Web-GIS using OGC protocols will be employed to allow visual exploration of changes of the surface. An example of this will be shown for the latest DLR HRSC DTMs at 100 m and BRDF-corrected surface reflectance at 1 km. Data mining processing algorithms are being developed to search for changes in the Martian surface from 1971-2015, and the output of this data mining will be compared against the results of citizen scientists' measurements in a specialised Zooniverse implementation. The results of an analysis of existing citizen science projects and lessons learnt for iMars will be shown. Final co-registered data sets will be distributed through both European and US channels in a manner to be decided towards the end of the project.
The resultant co-registered image datasets will represent the best possible capture of changes and evolution of the Martian surface. A workshop is planned during the EPSC time period to demonstrate the first science results on these different types of changes based on initial results. Acknowledgements: The research leading to these results has received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under iMars grant agreement no. 607379. Partial support is also provided by the STFC "MSSL Consolidated Grant" ST/K000977/1. References: Gwinner, K., et al. (2010) Topography of Mars from global mapping by HRSC high-resolution digital terrain models and orthoimages: characteristics and performance. Earth and Planetary Science Letters 294, 506-519, doi:10.1016/j.epsl.2009.11.007; Gwinner, K., et al. (2015) Mars Express High Resolution Stereo Camera (HRSC) Multi-orbit Data Products: Methodology, Mapping Concepts and Performance for the first Quadrangle (MC-11E). This conference.
Solving for the Surface: An Automated Approach to THEMIS Atmospheric Correction
NASA Astrophysics Data System (ADS)
Ryan, A. J.; Salvatore, M. R.; Smith, R.; Edwards, C. S.; Christensen, P. R.
2013-12-01
Here we present the initial results of an automated atmospheric correction algorithm for the Thermal Emission Imaging System (THEMIS) instrument, whereby high spectral resolution Thermal Emission Spectrometer (TES) data are queried to generate numerous atmospheric opacity values for each THEMIS infrared image. While the pioneering methods of Bandfield et al. [2004] also used TES spectra to atmospherically correct THEMIS data, the algorithm presented here is a significant improvement because of the reduced dependency on user-defined inputs for individual images. Additionally, this technique is particularly useful for correcting THEMIS images that have captured a range of atmospheric conditions and/or surface elevations, issues that have been difficult to correct for using previous techniques. Thermal infrared observations of the Martian surface can be used to determine the spatial distribution and relative abundance of many common rock-forming minerals. This information is essential to understanding the planet's geologic and climatic history. However, the Martian atmosphere also has absorptions in the thermal infrared which complicate the interpretation of infrared measurements obtained from orbit. TES has sufficient spectral resolution (143 bands at 10 cm^-1 sampling) to linearly unmix and remove atmospheric spectral end-members from the acquired spectra. THEMIS has the benefit of higher spatial resolution (~100 m/pixel vs. 3x5 km/TES-pixel) but has lower spectral resolution (8 surface-sensitive spectral bands). As such, it is not possible to isolate the surface component by unmixing the atmospheric contribution from the THEMIS spectra, as is done with TES. Bandfield et al. [2004] developed a technique using atmospherically corrected TES spectra as tie-points for constant radiance offset correction and surface emissivity retrieval.
This technique is the primary method used to correct THEMIS but is highly susceptible to inconsistent results if great care is not exercised in the selection of TES spectra. Our algorithm implements a newly populated TES database that was created using a PostgreSQL/PostGIS geospatial database. TES pixels that meet user-defined quality criteria and that intersect a THEMIS observation of interest may be quickly retrieved using this new database. The THEMIS correction process [Bandfield et al. 2004] is then run using all TES pixels that pass an additional set of TES-THEMIS relational quality checks. The result is a spatially correlated set of atmospheric opacity values, determined from the difference between each atmospherically corrected TES pixel and the overlapping portion of the THEMIS image. The dust and ice contributions to the atmospheric opacity are estimated using known dust and ice spectral dependencies [Smith et al. 2003]. These opacity values may be used to determine atmospheric variation across the scene, from which the topography- and temperature-scaled atmospheric contribution may be calculated and removed. References: Bandfield, JL et al. [2004], JGR 109, E10008. Smith, MD et al. [2003], JGR 108, E11, 5115.
Friedman, S N; Bambrough, P J; Kotsarini, C; Khandanpour, N; Hoggard, N
2012-12-01
Despite the established role of MRI in the diagnosis of brain tumours, histopathological assessment remains the clinically used technique, especially for the glioma group. Relative cerebral blood volume (rCBV) is a dynamic susceptibility-weighted contrast-enhanced perfusion MRI parameter that has been shown to correlate to tumour grade, but assessment requires a specialist and is time consuming. We developed analysis software to determine glioma gradings from perfusion rCBV scans in a manner that is quick, easy and does not require a specialist operator. MRI perfusion data from 47 patients with different histopathological grades of glioma were analysed with custom-designed software. Semi-automated analysis was performed with a specialist and non-specialist operator separately determining the maximum rCBV value corresponding to the tumour. Automated histogram analysis was performed by calculating the mean, standard deviation, median, mode, skewness and kurtosis of rCBV values. All values were compared with the histopathologically assessed tumour grade. A strong correlation between specialist and non-specialist observer measurements was found. Significantly different values were obtained between tumour grades using both semi-automated and automated techniques, consistent with previous results. The raw (unnormalised) data single-pixel maximum rCBV semi-automated analysis value had the strongest correlation with glioma grade. Standard deviation of the raw data had the strongest correlation of the automated analysis. Semi-automated calculation of raw maximum rCBV value was the best indicator of tumour grade and does not require a specialist operator. Both semi-automated and automated MRI perfusion techniques provide viable non-invasive alternatives to biopsy for glioma tumour grading.
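The automated histogram analysis described above (mean, standard deviation, median, mode, skewness, and kurtosis of rCBV values) can be sketched in plain Python. The moment formulas are standard; the use of the Pearson (non-excess) kurtosis convention and sample standard deviation here are assumptions, since the abstract does not state which conventions were used.

```python
# Hedged sketch of automated histogram statistics over tumour rCBV values.
# Skewness and kurtosis are computed from central moments; conventions are
# assumptions (see lead-in), not taken from the study.
import statistics

def rcbv_histogram_stats(values):
    n = len(values)
    mean = statistics.fmean(values)
    m2 = sum((v - mean) ** 2 for v in values) / n  # 2nd central moment
    m3 = sum((v - mean) ** 3 for v in values) / n  # 3rd central moment
    m4 = sum((v - mean) ** 4 for v in values) / n  # 4th central moment
    return {
        "mean": mean,
        "sd": statistics.stdev(values),        # sample standard deviation
        "median": statistics.median(values),
        "mode": statistics.mode(values),
        "skewness": m3 / m2 ** 1.5,
        "kurtosis": m4 / m2 ** 2,              # Pearson (non-excess) kurtosis
    }
```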
NASA Astrophysics Data System (ADS)
Schuetze, C.; Sauer, U.; Dietrich, P.
2015-12-01
Reliable detection and assessment of near-surface CO2 emissions from natural or anthropogenic sources require the application of various monitoring tools at different spatial scales. In particular, optical remote sensing tools for atmospheric monitoring have the potential to integrally measure CO2 emissions over larger scales (>10,000 m2). Within the framework of the MONACO project ("Monitoring approach for geological CO2 storage sites using a hierarchical observation concept"), an integrative hierarchical monitoring concept was developed and validated at different field sites with the aim of establishing a modular observation strategy that includes investigations in the shallow subsurface, at ground surface level, and in the lower atmospheric boundary layer. The main aims of the atmospheric monitoring using optical remote sensing were the observation of gas dispersion into the near-surface atmosphere, the determination of maximum concentration values, and the identification of the main challenges associated with monitoring extended emission sources with the proposed methodological setup under typical environmental conditions. The presentation will give an overview of several case studies using the integrative approach of Open-Path Fourier Transform Infrared spectroscopy (OP FTIR) in combination with in situ measurements. As a main result, the method was validated as a possible approach for continuous monitoring of the atmospheric composition, both for integral determination of GHG concentrations and for identifying target areas that need to be investigated in more detail. In particular, data interpretation should closely consider the micrometeorological conditions. Technical aspects concerning robust equipment, experimental setup, and fast data processing algorithms have to be taken into account for the enhanced automation of atmospheric monitoring.
Computer-Assisted Automated Scoring of Polysomnograms Using the Somnolyzer System.
Punjabi, Naresh M; Shifa, Naima; Dorffner, Georg; Patil, Susheel; Pien, Grace; Aurora, Rashmi N
2015-10-01
Manual scoring of polysomnograms is a time-consuming and tedious process. To expedite the scoring of polysomnograms, several computerized algorithms for automated scoring have been developed. The overarching goal of this study was to determine the validity of the Somnolyzer system, an automated system for scoring polysomnograms. The analysis sample comprised 97 sleep studies. Each polysomnogram was manually scored by certified technologists from four sleep laboratories and concurrently subjected to automated scoring by the Somnolyzer system. Agreement between manual and automated scoring was examined. Sleep staging and scoring of disordered breathing events were conducted using the 2007 American Academy of Sleep Medicine criteria. Clinical sleep laboratories. A high degree of agreement was noted between manual and automated scoring of the apnea-hypopnea index (AHI). The average correlation between the manually scored AHI values across the four clinical sites was 0.92 (95% confidence interval: 0.90-0.93). Similarly, the average correlation between the manual and Somnolyzer-scored AHI values was 0.93 (95% confidence interval: 0.91-0.96). Thus, the interscorer correlation between the manually scored results was no different from that derived from manual and automated scoring. Substantial concordance in the arousal index, total sleep time, and sleep efficiency between manual and automated scoring was also observed. In contrast, differences were noted between manually and automatically scored percentages of sleep stages N1, N2, and N3. Automated analysis of polysomnograms using the Somnolyzer system provides results that are comparable to manual scoring for commonly used metrics in sleep medicine. Although differences exist between manual and automated scoring for specific sleep stages, the level of agreement between manual and automated scoring is not significantly different from that between any two human scorers.
In light of the burden associated with manual scoring, automated scoring platforms provide a viable complement of tools in the diagnostic armamentarium of sleep medicine. © 2015 Associated Professional Sleep Societies, LLC.
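The agreement figures quoted above are correlations between manually and automatically scored AHI values. A self-contained Pearson correlation, of the kind such an analysis would compute over paired per-study AHI scores, can be sketched as follows (illustrative only; the study's statistical details are not specified in the abstract).

```python
# Hedged sketch: Pearson correlation between two paired score series,
# e.g. manually scored vs. Somnolyzer-scored AHI values per sleep study.

def pearson_r(x, y):
    """Pearson correlation coefficient of equal-length sequences x and y."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)
```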
Collaborative real-time motion video analysis by human observer and image exploitation algorithms
NASA Astrophysics Data System (ADS)
Hild, Jutta; Krüger, Wolfgang; Brüstle, Stefan; Trantelle, Patrick; Unmüßig, Gabriel; Heinze, Norbert; Peinsipp-Byma, Elisabeth; Beyerer, Jürgen
2015-05-01
Motion video analysis is a challenging task, especially in real-time applications. In most safety- and security-critical applications, a human observer is an obligatory part of the overall analysis system. In recent years, substantial progress has been made in the development of automated image exploitation algorithms. Hence, we investigate how the benefits of automated video analysis can be integrated suitably into current video exploitation systems. In this paper, a system design is introduced which strives to combine the qualities of the human observer's perception and the automated algorithms, thus aiming to improve the overall performance of a real-time video analysis system. The system design builds on prior work in which we showed the benefits for the human observer of a user interface that utilizes the human visual focus of attention, revealed by the eye gaze direction, for interaction with the image exploitation system; eye tracker-based interaction allows much faster, more convenient, and equally precise moving target acquisition in video images than traditional computer mouse selection. The system design also builds on our prior work on automated target detection, segmentation, and tracking algorithms. Besides the system design, a first pilot study is presented in which we investigated how the participants (all non-experts in video analysis) performed in initializing an object tracking subsystem by selecting a target for tracking. Preliminary results show that the gaze + key press technique is an effective, efficient, and easy-to-use interaction technique for performing selection operations on moving targets in videos in order to initialize an object tracking function.
Dawson, Carolyn H; Mackrill, Jamie B; Cain, Rebecca
2017-12-01
Hand hygiene (HH) prevents harmful contaminants from spreading in settings including domestic, health care, and food handling. Strategies to improve HH range from behavioural techniques through to automated sinks that ensure hand surface cleaning. This study aimed to assess user experience and acceptance of a new automated sink, compared to a normal sink. An adapted version of the technology acceptance model (TAM) assessed each mode of handwashing. A within-subjects design enabled N = 46 participants to evaluate both sinks. Perceived Ease of Use and Satisfaction of Use were significantly lower for the automated sink compared to the conventional sink (p < 0.005). Across the remaining TAM factors, there was no significant difference. Participants suggested design features including jet strength, water temperature, and device affordance may improve HH technology. We provide recommendations for future HH technology development that contribute to a positive user experience, relevant to technology developers, ergonomists, and those involved in HH across all sectors. Practitioner Summary: The need to facilitate timely, effective hand hygiene to prevent illness has led to a rise in automated handwashing systems across different contexts. User acceptance is a key factor in system uptake. This paper applies the technology acceptance model as a means to explore and optimise the design of such systems.
Drilling Automation Demonstrations in Subsurface Exploration for Astrobiology
NASA Technical Reports Server (NTRS)
Glass, Brian; Cannon, H.; Lee, P.; Hanagud, S.; Davis, K.
2006-01-01
This project proposes to study subsurface permafrost microbial habitats at a relevant Arctic Mars-analog site (Haughton Crater, Devon Island, Canada) while developing and maturing the subsurface drilling and drilling automation technologies that will be required by post-2010 missions. It builds on earlier drilling technology projects to add permafrost and ice-drilling capabilities to 5 m with a lightweight drill that will be automatically monitored and controlled in situ. Frozen cores obtained with this drill under sterilized protocols will be used in testing three hypotheses pertaining to near-surface physical geology and ground H2O ice distribution, viewed as a habitat for microbial life in subsurface ice and ice-consolidated sediments. Automation technologies employed will demonstrate hands-off diagnostics and drill control, using novel vibrational dynamical analysis methods and model-based reasoning to monitor and identify drilling fault states before and during faults. Three field deployments, to a Mars-analog site with frozen impact crater fallback breccia, will support science goals, provide a rigorous test of drilling automation and lightweight permafrost drilling, and leverage past experience with the field site's particular logistics.
Applications of Automation Methods for Nonlinear Fracture Test Analysis
NASA Technical Reports Server (NTRS)
Allen, Phillip A.; Wells, Douglas N.
2013-01-01
As fracture mechanics material testing evolves, the governing test standards continue to be refined to better reflect the latest understanding of the physics of the fracture processes involved. The traditional format of ASTM fracture testing standards, utilizing equations expressed directly in the text of the standard to assess the experimental result, is self-limiting in the complexity that can be reasonably captured. The use of automated analysis techniques to draw upon a rich, detailed solution database for assessing fracture mechanics tests provides a foundation for a new approach to testing standards that enables routine users to obtain highly reliable assessments of tests involving complex, non-linear fracture behavior. Herein, the case for automating the analysis of tests of surface cracks in tension in the elastic-plastic regime is utilized as an example of how such a database can be generated and implemented for use in the ASTM standards framework. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.
Automated Cognitive Health Assessment Using Smart Home Monitoring of Complex Tasks
Dawadi, Prafulla N.; Cook, Diane J.; Schmitter-Edgecombe, Maureen
2014-01-01
One of the many services that intelligent systems can provide is the automated assessment of resident well-being. We hypothesize that the functional health of individuals, or ability of individuals to perform activities independently without assistance, can be estimated by tracking their activities using smart home technologies. In this paper, we introduce a machine learning-based method for assessing activity quality in smart homes. To validate our approach we quantify activity quality for 179 volunteer participants who performed a complex, interweaved set of activities in our smart home apartment. We observed a statistically significant correlation (r=0.79) between automated assessment of task quality and direct observation scores. Using machine learning techniques to predict the cognitive health of the participants based on task quality is accomplished with an AUC value of 0.64. We believe that this capability is an important step in understanding everyday functional health of individuals in their home environments. PMID:25530925
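The reported AUC of 0.64 is a ranking statistic; a minimal sketch (synthetic scores and labels, not the study's data) shows how it is computed from predicted task-quality scores and binary health labels via the Mann-Whitney rank identity.

```python
import numpy as np

def auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    pos, neg = scores[labels], scores[~labels]
    # Fraction of (positive, negative) pairs ranked correctly; ties count half.
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Synthetic task-quality scores: healthy participants tend to score higher.
labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.7, 0.3, 0.2]
print(auc(scores, labels))  # 8 of 9 pairs correctly ordered
```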
Automation of NLO processes and decays and POWHEG matching in WHIZARD
NASA Astrophysics Data System (ADS)
Reuter, Jürgen; Chokoufé, Bijan; Hoang, André; Kilian, Wolfgang; Stahlhofen, Maximilian; Teubner, Thomas; Weiss, Christian
2016-10-01
We give a status report on the automation of next-to-leading order processes within the Monte Carlo event generator WHIZARD, using GoSam and OpenLoops as providers of one-loop matrix elements. To deal with divergences, WHIZARD uses automated FKS subtraction, and the phase space for singular regions is generated automatically. NLO examples for both scattering and decay processes, with a focus on e+ e- processes, are shown. First NLO studies of observables for collisions of polarized lepton beams, e.g. at the ILC, are also presented. Furthermore, the automatic matching of fixed-order NLO amplitudes with emissions from the parton shower within the POWHEG formalism inside WHIZARD is discussed. We also present results for top pairs at threshold in lepton collisions, including matching between a resummed threshold calculation and fixed-order NLO, which allows the investigation of more exclusive differential observables.
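The subtraction idea behind FKS can be shown in one dimension (a textbook toy, not WHIZARD's machinery): subtracting the integrand's value at the singular endpoint leaves a finite integral that can be evaluated numerically, while the subtracted pole term is handled analytically.

```python
def subtracted_integral(f, n=100000):
    """Numerically integrate (f(x) - f(0)) / x on (0, 1] (midpoint rule).

    The raw integrand f(x)/x diverges at x = 0; the endpoint subtraction
    renders the singularity integrable, mimicking NLO subtraction schemes.
    """
    f0 = f(0.0)
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        total += (f(x) - f0) / x * h
    return total

# For f(x) = 1 + x the subtracted integrand is exactly 1, so the result is 1.
print(subtracted_integral(lambda x: 1.0 + x))
```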
Input-output identification of controlled discrete manufacturing systems
NASA Astrophysics Data System (ADS)
Estrada-Vargas, Ana Paula; López-Mellado, Ernesto; Lesage, Jean-Jacques
2014-03-01
The automated construction of discrete event models from observations of a system's external behaviour is addressed. This problem, often referred to as system identification, allows obtaining models of ill-known (or even unknown) systems. In this article, an identification method for discrete event systems (DESs) controlled by a programmable logic controller is presented. The method allows processing a large quantity of observed long sequences of input/output signals generated by the controller and yields an interpreted Petri net model describing the closed-loop behaviour of the automated DES. The proposed technique allows the identification of actual complex systems because it is sufficiently efficient and well adapted to cope with both the technological characteristics of industrial controllers and data collection requirements. Based on polynomial-time algorithms, the method is implemented as an efficient software tool which constructs and draws the model automatically; an overview of this tool is given through a case study dealing with an automated manufacturing system.
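The identification step can be caricatured in a few lines (a toy state-transition model, not the paper's interpreted-Petri-net construction): observed I/O sequences are folded into a relation recording which I/O snapshots follow which.

```python
from collections import defaultdict

def identify(sequences):
    """Build a transition relation from observed I/O sequences.

    Each sequence is a list of hashable I/O snapshots; every consecutive
    pair of distinct snapshots contributes one observed transition.
    """
    transitions = defaultdict(set)
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            if a != b:
                transitions[a].add(b)
    return dict(transitions)

# Toy controller cycle (hypothetical states): idle -> clamp -> drill -> idle.
obs = [["idle", "clamp", "drill", "idle"],
       ["idle", "clamp", "drill", "idle", "clamp"]]
model = identify(obs)
print(model["idle"])   # {'clamp'}
print(model["drill"])  # {'idle'}
```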
Discrete Surface Evolution and Mesh Deformation for Aircraft Icing Applications
NASA Technical Reports Server (NTRS)
Thompson, David; Tong, Xiaoling; Arnoldus, Qiuhan; Collins, Eric; McLaurin, David; Luke, Edward; Bidwell, Colin S.
2013-01-01
Robust, automated mesh generation for problems with deforming geometries, such as ice accreting on aerodynamic surfaces, remains a challenging problem. Here we describe a technique to deform a discrete surface as it evolves due to the accretion of ice. The surface evolution algorithm is based on a smoothed, face-offsetting approach. We also describe a fast algebraic technique to propagate the computed surface deformations into the surrounding volume mesh while maintaining geometric mesh quality. Preliminary results presented here demonstrate the efficacy of the approach for a sphere with a prescribed accretion rate, a rime ice accretion, and a more complex glaze ice accretion.
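The offsetting step can be illustrated in 2-D (a hedged sketch under simplified assumptions, not the authors' smoothed face-offsetting algorithm): each vertex of a closed polyline moves along the average of its two adjacent edge normals by an accretion increment.

```python
import math

def offset_polygon(points, dist):
    """Offset a closed CCW polygon outward by moving each vertex a distance
    `dist` along the average of its adjacent edge normals."""
    n = len(points)
    out = []
    for i in range(n):
        # Outward normals (for CCW orientation) of the two edges at vertex i.
        normals = []
        for a, b in ((points[i - 1], points[i]), (points[i], points[(i + 1) % n])):
            ex, ey = b[0] - a[0], b[1] - a[1]
            length = math.hypot(ex, ey)
            normals.append((ey / length, -ex / length))
        nx = normals[0][0] + normals[1][0]
        ny = normals[0][1] + normals[1][1]
        norm = math.hypot(nx, ny)
        out.append((points[i][0] + dist * nx / norm,
                    points[i][1] + dist * ny / norm))
    return out

# Each vertex of a unit square moves 0.1 along its averaged outward normal,
# so corners step diagonally by 0.1 / sqrt(2) in each coordinate.
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(offset_polygon(square, 0.1))
```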
High-touch surfaces: microbial neighbours at hand.
Cobrado, L; Silva-Dias, A; Azevedo, M M; Rodrigues, A G
2017-11-01
Despite considerable efforts, healthcare-associated infections (HAIs) continue to be globally responsible for serious morbidity, increased costs and prolonged length of stay. Among potentially preventable sources of microbial pathogens causing HAIs, patient care items and environmental surfaces frequently touched play an important role in the chain of transmission. Microorganisms contaminating such high-touch surfaces include Gram-positive and Gram-negative bacteria, viruses, yeasts and parasites, with improved cleaning and disinfection effectively decreasing the rate of HAIs. Manual and automated surface cleaning strategies used in the control of infectious outbreaks are discussed, and current trends in preventing contamination through the use of antimicrobial surfaces are considered.
NASA Astrophysics Data System (ADS)
Chavis, Christopher
Using commercial digital cameras in conjunction with Unmanned Aerial Systems (UAS) to generate 3-D Digital Surface Models (DSMs) and orthomosaics is emerging as a cost-effective alternative to Light Detection and Ranging (LiDAR). Powerful software applications such as Pix4D and APS can automate the generation of DSM and orthomosaic products from a handful of inputs. However, the accuracy of these models is relatively untested. The objectives of this study were to generate multiple DSM and orthomosaic pairs of the same area using Pix4D and APS from flights of imagery collected with a lightweight UAS. The accuracy of each individual DSM was assessed in addition to the consistency of the method to model one location over a period of time. Finally, this study determined if the DSMs automatically generated using lightweight UAS and commercial digital cameras could be used for detecting changes in elevation and at what scale. Accuracy was determined by comparing DSMs to a series of reference points collected with survey grade GPS. Other GPS points were also used as control points to georeference the products within Pix4D and APS. The effectiveness of the products for change detection was assessed through image differencing and observation of known, artificially induced elevation changes. The vertical accuracy with the optimal data and model is ≈ 25 cm and the highest consistency over repeat flights is a standard deviation of ≈ 5 cm. Elevation change detection based on such UAS imagery and DSM models should be viable for detecting infrastructure change in urban or suburban environments with little dense canopy vegetation.
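The accuracy and change-detection assessments reduce to simple computations; a sketch with synthetic numbers (the study's own figures are ≈25 cm accuracy and ≈5 cm repeatability) shows vertical RMSE against GPS checkpoints and DSM differencing against a threshold.

```python
import numpy as np

def vertical_rmse(dsm_elev, gps_elev):
    """Root-mean-square vertical error of DSM samples against GPS checkpoints."""
    diff = np.asarray(dsm_elev, dtype=float) - np.asarray(gps_elev, dtype=float)
    return float(np.sqrt(np.mean(diff ** 2)))

def change_mask(dsm_before, dsm_after, threshold):
    """Cells whose elevation changed by more than `threshold` between flights."""
    return np.abs(np.asarray(dsm_after) - np.asarray(dsm_before)) > threshold

# Synthetic 3x3 DSMs: one cell gains 0.5 m between flights.
before = np.zeros((3, 3))
after = before.copy()
after[1, 1] = 0.5
print(int(change_mask(before, after, 0.25).sum()))  # one changed cell
print(vertical_rmse([10.1, 9.8, 10.3], [10.0, 10.0, 10.0]))
```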
Large volume serial section tomography by Xe Plasma FIB dual beam microscopy.
Burnett, T L; Kelley, R; Winiarski, B; Contreras, L; Daly, M; Gholinia, A; Burke, M G; Withers, P J
2016-02-01
Ga(+) Focused Ion Beam-Scanning Electron Microscopes (FIB-SEM) have revolutionised the level of microstructural information that can be recovered in 3D by block face serial section tomography (SST), as well as enabling the site-specific removal of smaller regions for subsequent transmission electron microscope (TEM) examination. However, Ga(+) FIB material removal rates limit the volumes and depths that can be probed to dimensions in the tens of microns range. Emerging Xe(+) Plasma Focused Ion Beam-Scanning Electron Microscope (PFIB-SEM) systems promise faster removal rates. Here we examine the potential of the method for large volume serial section tomography as applied to bainitic steel and WC-Co hard metals. Our studies demonstrate that with careful control of milling parameters precise automated serial sectioning can be achieved with low levels of milling artefacts at removal rates some 60× faster. Volumes that are hundreds of microns in dimension have been collected using fully automated SST routines in feasible timescales (<24h) showing good grain orientation contrast and capturing microstructural features at the tens of nanometres to the tens of microns scale. Accompanying electron back scattered diffraction (EBSD) maps show high indexing rates suggesting low levels of surface damage. Further, under high current Ga(+) FIB milling WC-Co is prone to amorphisation of WC surface layers and phase transformation of the Co phase, neither of which have been observed at PFIB currents as high as 60nA at 30kV. Xe(+) PFIB dual beam microscopes promise to radically extend our capability for 3D tomography, 3D EDX, 3D EBSD as well as correlative tomography.
Automated inspection of hot steel slabs
Martin, R.J.
1985-12-24
The disclosure relates to a real time digital image enhancement system for performing the image enhancement segmentation processing required for a real time automated system for detecting and classifying surface imperfections in hot steel slabs. The system provides for simultaneous execution of edge detection processing and intensity threshold processing in parallel on the same image data produced by a sensor device such as a scanning camera. The results of each process are utilized to validate the results of the other process and a resulting image is generated that contains only corresponding segmentation that is produced by both processes. 5 figs.
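The cross-validation of the two processes can be sketched with a toy image (crude stand-in operators, not the patent's hardware pipeline): only pixels flagged by both the edge detector and the intensity threshold survive into the final segmentation.

```python
import numpy as np

def edge_map(img, edge_thresh):
    """Crude gradient-magnitude edge detector (forward differences)."""
    gx = np.abs(np.diff(img.astype(float), axis=1, prepend=img[:, :1]))
    gy = np.abs(np.diff(img.astype(float), axis=0, prepend=img[:1, :]))
    return (gx + gy) > edge_thresh

def defect_map(img, edge_thresh, intensity_thresh):
    """Keep only pixels that BOTH processes mark, mirroring the mutual
    validation of edge detection and intensity thresholding."""
    return edge_map(img, edge_thresh) & (img > intensity_thresh)

# Toy 'slab' image: a single bright defect pixel on a dark background.
img = np.zeros((5, 5), dtype=float)
img[2, 2] = 10.0
mask = defect_map(img, edge_thresh=5.0, intensity_thresh=5.0)
print(int(mask.sum()), bool(mask[2, 2]))  # only the defect pixel survives
```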
High pressure water jet cutting and stripping
NASA Technical Reports Server (NTRS)
Hoppe, David T.; Babai, Majid K.
1991-01-01
High pressure water cutting techniques have a wide range of applications to the American space effort. Hydroblasting techniques are commonly used during the refurbishment of the reusable solid rocket motors. By varying process parameters, the process can be controlled to strip a thermal protective ablator without incurring any damage to the painted surface underneath. Hydroblasting is a technique which is easily automated. Automation removes personnel from the hostile environment of the high pressure water. Computer controlled robots can perform the same task in a fraction of the time that would be required by manual operation.
Development of automated optical verification technologies for control systems
NASA Astrophysics Data System (ADS)
Volegov, Peter L.; Podgornov, Vladimir A.
1999-08-01
The report considers optical techniques for the automated verification of an object's identity, designed for control systems at nuclear facilities. Results of experimental research and of pattern recognition techniques developed under ISTC project number 772 are presented, aimed at identifying unique features of a controlled object's surface structure and the effects of its random treatment. Possibilities for industrial introduction of the developed technologies within the framework of USA-Russia lab-to-lab cooperation, including the development of up-to-date systems for nuclear material control and accounting, are examined.
Automated recognition and characterization of solar active regions based on the SOHO/MDI images
NASA Technical Reports Server (NTRS)
Pap, J. M.; Turmon, M.; Mukhtar, S.; Bogart, R.; Ulrich, R.; Froehlich, C.; Wehrli, C.
1997-01-01
The first results of a new method to identify and characterize the various surface structures on the sun, which may contribute to the changes in solar total and spectral irradiance, are shown. The full disk magnetograms (1024 x 1024 pixels) of the Michelson Doppler Imager (MDI) experiment onboard SOHO are analyzed. Use of a Bayesian inference scheme allows objective, uniform, automated processing of a long sequence of images. The main goal is to identify the solar magnetic features causing irradiance changes. The results presented are based on a pilot time interval of August 1996.
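Per-pixel Bayesian labeling can be sketched with made-up Gaussian class models (illustrative priors, means and widths; the MDI pipeline's actual models are richer): each pixel's magnetic flux is assigned the class with the largest posterior.

```python
import math

# Hypothetical class models: (prior, mean flux, std) per surface structure.
CLASSES = {
    "quiet": (0.95, 0.0, 10.0),
    "active": (0.05, 200.0, 80.0),
}

def posterior_label(flux):
    """Pick the class maximizing prior x Gaussian likelihood of the flux."""
    def log_post(params):
        prior, mu, sigma = params
        return (math.log(prior)
                - math.log(sigma)
                - 0.5 * ((flux - mu) / sigma) ** 2)
    return max(CLASSES, key=lambda name: log_post(CLASSES[name]))

print(posterior_label(3.0))    # quiet
print(posterior_label(250.0))  # active
```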
Automated food microbiology: potential for the hydrophobic grid-membrane filter.
Sharpe, A N; Diotte, M P; Dudas, I; Michaud, G L
1978-01-01
Bacterial counts obtained on hydrophobic grid-membrane filters were comparable to conventional plate counts for Pseudomonas aeruginosa, Escherichia coli, and Staphylococcus aureus in homogenates from a range of foods. The wide numerical operating range of the hydrophobic grid-membrane filters allowed sequential diluting to be reduced or even eliminated, making them attractive as components in automated systems of analysis. Food debris could be rinsed completely from the unincubated hydrophobic grid-membrane filter surface without affecting the subsequent count, thus eliminating the possibility of counting food particles, a common source of error in electronic counting systems. PMID:100054
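The wide numerical operating range follows from most-probable-number statistics over the grid squares; a sketch using the standard HGMF MPN formula (assuming Poisson deposition across squares; the 1600-square count is typical of such filters, not taken from this paper):

```python
import math

def hgmf_mpn(positive_squares, total_squares=1600):
    """Most probable number of organisms on a hydrophobic grid-membrane
    filter with `total_squares` growth compartments, given the number of
    squares showing growth (assumes Poisson deposition across squares)."""
    n, x = total_squares, positive_squares
    if not 0 <= x < n:
        raise ValueError("positive squares must be in [0, total_squares)")
    return n * math.log(n / (n - x))

# Even with most squares positive, the count remains resolvable,
# which is why sequential dilutions can be reduced or eliminated.
print(round(hgmf_mpn(100)))
print(round(hgmf_mpn(1500)))
```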
Compliant Task Execution and Learning for Safe Mixed-Initiative Human-Robot Operations
NASA Technical Reports Server (NTRS)
Dong, Shuonan; Conrad, Patrick R.; Shah, Julie A.; Williams, Brian C.; Mittman, David S.; Ingham, Michel D.; Verma, Vandana
2011-01-01
We introduce a novel task execution capability that enhances the ability of in-situ crew members to function independently from Earth by enabling safe and efficient interaction with automated systems. This task execution capability provides the ability to (1) map goal-directed commands from humans into safe, compliant, automated actions, (2) quickly and safely respond to human commands and actions during task execution, and (3) specify complex motions through teaching by demonstration. Our results are applicable to future surface robotic systems, and we have demonstrated these capabilities on JPL's All-Terrain Hex-Limbed Extra-Terrestrial Explorer (ATHLETE) robot.
Lee, Robert C.; Kang, Hobin; Darling, Cynthia L.; Fried, Daniel
2014-01-01
Accurate measurement of the highly mineralized transparent surface layer that forms on caries lesions is important for diagnosis of the lesion activity because chemical intervention can slow or reverse the caries process via remineralization. Previous in-vitro and in-vivo studies have demonstrated that polarization-sensitive optical coherence tomography (PS-OCT) can nondestructively image the subsurface lesion structure and the highly mineralized transparent surface zone of caries lesions. The purpose of this study was to develop an approach to automatically process 3-dimensional PS-OCT images and to accurately assess the remineralization process in simulated enamel lesions. Artificial enamel lesions were prepared on twenty bovine enamel blocks using two models to produce varying degrees of demineralization and remineralization. The thickness of the transparent surface layer and the integrated reflectivity of the subsurface lesion were measured using PS-OCT. The automated transparent surface layer detection algorithm was able to successfully detect the transparent surface layers with high sensitivity (0.92) and high specificity (0.97). The estimated thickness of the transparent surface layer showed a strong correlation with polarized light microscopy (PLM) measurements of all regions (R² = 0.90). The integrated reflectivity, ΔR, and the integrated mineral loss, ΔZ, showed a moderate correlation (R² = 0.32). This study demonstrates that PS-OCT can automatically measure the changes in artificial enamel lesion structure and severity upon exposure to remineralization solutions. PMID:25401009
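Layer detection from a single depth profile can be sketched as follows (synthetic A-scan and hypothetical depth step; the study's 3-D algorithm is more involved): the transparent surface zone is the low-reflectivity run between the surface peak and the lesion's rise.

```python
def surface_layer_thickness(a_scan, depth_step, low_frac=0.2):
    """Estimate transparent-layer thickness from one reflectivity A-scan.

    The layer is taken as the run of samples below `low_frac` of the
    surface peak, between that peak and the subsurface lesion signal.
    """
    peak = max(range(len(a_scan)), key=lambda i: a_scan[i])
    cutoff = low_frac * a_scan[peak]
    thickness = 0
    for value in a_scan[peak + 1:]:
        if value < cutoff:
            thickness += 1
        elif thickness:
            break  # reflectivity rose again: lesion body reached
    return thickness * depth_step

# Synthetic A-scan: surface peak, four quiet samples, then lesion scattering.
scan = [0.1, 1.0, 0.05, 0.04, 0.06, 0.05, 0.6, 0.7, 0.5]
print(surface_layer_thickness(scan, depth_step=5.0))  # 20.0 (hypothetical units)
```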
Automated Detection of Craters in Martian Satellite Imagery Using Convolutional Neural Networks
NASA Astrophysics Data System (ADS)
Norman, C. J.; Paxman, J.; Benedix, G. K.; Tan, T.; Bland, P. A.; Towner, M.
2018-04-01
Crater counting is used in determining the surface age of planets. We propose improvements to Martian Crater Detection Algorithms by implementing an end-to-end detection approach with the possibility of scaling the algorithm planet-wide.
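The convolutional principle in miniature (a toy fixed ring-template correlation, not the authors' trained network): a crater rim appears as a bright circular edge, so correlating the image with a ring kernel and thresholding yields candidate detections.

```python
import numpy as np

def detect(image, kernel, thresh):
    """Valid-mode cross-correlation followed by thresholding: the simplest
    'convolutional' detector (one fixed filter instead of a learned CNN)."""
    kh, kw = kernel.shape
    h, w = image.shape
    hits = []
    for i in range(h - kh + 1):
        for j in range(w - kw + 1):
            score = float((image[i:i + kh, j:j + kw] * kernel).sum())
            if score > thresh:
                hits.append((i + kh // 2, j + kw // 2))
    return hits

# 3x3 ring kernel (bright rim, dark centre) and a toy image with one 'crater'.
ring = np.array([[1, 1, 1], [1, -8, 1], [1, 1, 1]], dtype=float)
img = np.zeros((7, 7))
img[2:5, 2:5] = 1.0   # rim
img[3, 3] = 0.0       # floor
print(detect(img, ring, thresh=4.0))  # single hit at the crater centre
```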